
Sr. Teradata Developer Resume


O'Fallon, MO

SUMMARY

  • Enterprising individual with six years of experience in information technology, with expertise in Teradata/ETL development in a data warehousing environment.
  • Experience in the full Software Development Life Cycle (SDLC): analysis, application design, development, testing, implementation, and maintenance in a data warehousing environment.
  • Experience implementing data warehouse and database applications with Informatica ETL, along with data modeling and reporting tools, on Teradata and Oracle RDBMS.
  • Proficient with the concepts of Data Warehousing, Data Marts, ER Modeling, Dimensional Modeling, and Fact and Dimension tables, using the data modeling tool ERwin.
  • Very good understanding of Teradata's MPP architecture: Shared Nothing, Nodes, AMPs, BYNET, Partitioning, and Indexes (UPI, NUPI, USI, NUSI, Join, Aggregate).
  • Extensively created and used Teradata SET tables, MULTISET tables, global temporary tables, volatile tables, and temporary tables.
  • Experienced in both Waterfall and Agile methodologies.
  • Extensively used Teradata utilities such as BTEQ, FastLoad, MultiLoad, and SQL Assistant, along with DDL and DML commands.
  • Automated the BTEQ report generation using UNIX scheduling tools on weekly and monthly basis.
  • Knowledge of query performance tuning using EXPLAIN, COLLECT STATISTICS, compression, NUSI, and Join Indexes, including Sparse Join Indexes.
  • Extensively worked with PMON/Viewpoint for Teradata performance monitoring and tuning.
  • Well versed with Teradata Analyst Pack including Statistics Wizard, Index Wizard and Visual Explain.
  • Very good experience in Oracle database application development using Oracle 10g/9i/8i, SQL, PL/SQL, and SQL*Loader.
  • Strong SQL experience in Teradata from developing the ETL with Complex tuned queries including analytical functions and BTEQ scripts.
  • Extensive work experience in designing and building SQL and PL/SQL stored procedures, functions, and triggers.
  • Responsible for all activities related to the development, implementation and administration and support of ETL processes for large-scale data warehouses using Informatica Power Center.
  • Expertise in tuning the performance of Mappings and sessions in Informatica and determining the performance Bottlenecks.
  • Expertise in RDBMS, Data Warehouse Architecture and Modeling. Thorough understanding of and experience in data warehouse and data mart design, Star schema, Snowflake schema, Slowly Changing Dimensions (SCD Type 1, Type 2, and Type 3), and Normalization and Denormalization concepts and principles.
  • Proficient in converting logical data models to physical database designs in Data warehousing Environment and in-depth understanding in Database Hierarchy, Data Integrity concepts.
  • Extensive experience with Data Extraction Transformation and Loading (ETL) from multiple data sources such as DB2 UDB, Oracle, Teradata, SQL Server, XML files, Flat Files, CSV files.
  • Provided Production support, flexible in managing multiple projects and off-shore-onsite model experience.
  • Extensive experience in UNIX shell scripting to automate and schedule the jobs.
  • Involved in Unit Testing, Integration Testing and preparing test cases.
  • Self-initiative, excellent analytical and Communication skills, Ability to work independently and good team player.
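The SET/MULTISET and volatile table work noted above can be illustrated with a minimal Teradata DDL sketch (all object and column names here are hypothetical):

```sql
-- MULTISET permits duplicate rows; a SET table silently rejects them
CREATE MULTISET TABLE stg.customer_stg (
    cust_id   INTEGER,
    cust_name VARCHAR(100),
    load_dt   DATE
)
PRIMARY INDEX (cust_id);

-- Volatile table: session-scoped, spool-backed, dropped automatically at logoff
CREATE VOLATILE TABLE vt_cust_delta AS (
    SELECT cust_id, cust_name
    FROM   stg.customer_stg
    WHERE  load_dt = CURRENT_DATE
) WITH DATA
ON COMMIT PRESERVE ROWS;
```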

TECHNICAL SKILLS

ETL Tools: Informatica Power Center 9.0/8.5/8.1/7.1.

RDBMS: Teradata 15/14.10/14/13/12, Teradata V2R6/V2R5, Oracle 10g/9i/8i/8.0, MS SQL Server.

Teradata Tools & Utilities: BTEQ, MultiLoad, FastLoad, FastExport, TPump, Teradata Manager, SQL Assistant, Teradata Administrator, Index Wizard, Statistics Wizard.

Data Modeling: ERwin

Languages: C, SQL, PL/SQL, Unix Shell Scripting and Korn Shell Scripting.

Operating Systems: Windows, UNIX, Linux.

Scheduling and Other: UC4/Automic, Control M.

BigData: HDFS, Hive, Pig, and Sqoop.

PROFESSIONAL EXPERIENCE

Confidential, O’Fallon, MO

Sr. Teradata Developer

Responsibilities:

  • Extensively worked on data extraction, transformation, and loading from source to target systems using Informatica and Teradata utilities such as FastExport, FastLoad, MultiLoad, and TPT.
  • Developed complex mappings to load data from the source system (Oracle) and flat files to Teradata.
  • Created scripts using FastLoad and MultiLoad to load data into the Teradata data warehouse.
  • Wrote SQL scripts, macros, and stored procedures in Teradata to implement business rules.
  • Updated numerous BTEQ/SQL scripts, making appropriate DDL changes, and completed unit testing.
  • Worked in Teradata SQL Assistant, querying the target tables to validate the BTEQ scripts.
  • Used Informatica Power Center Workflow manager to create sessions and workflows to run the logic embedded in the mappings.
  • Maintained warehouse metadata, naming standards and warehouse standards for future application development.
  • Wrote UNIX scripts to run and schedule workflows for the daily runs.
  • Created different transformations like Source Qualifier, Expression, Filter, Lookup transformations for loading the data into the targets.
  • Wrote complex SQL scripts to avoid Informatica Joiners and Lookups, improving performance on heavy data volumes.
  • Automated the Informatica process to update a status table in Teradata when mappings run successfully; a dependent view is then refreshed.
  • Involved in the analysis and optimization of long running jobs.
  • Timely escalation of issues to avoid delay in deliverables.
  • Development of ETL Mappings, Workflows and prepare them for deployment into test and production environments.
  • Participate in code reviews and ensure that all solutions are aligned to pre-defined architectural specifications.
  • Involved in UAT and Production support and implementation.
  • Analyzed UNIX and SQL scripts to rate them in terms of complexity.
  • Worked with Teradata Viewpoint to monitor system health.
  • Created customized MultiLoad scripts on the UNIX platform for Teradata loads.
  • Used session management tools such as Query Monitor and Query Spotlight to check the performance of individual queries before promoting them to production, anticipating future issues.
  • Tuned SQL to optimize performance, spool space usage, and CPU usage.
  • Prepared and maintained TPT scripts in Teradata.
  • Wrote BTEQ and MultiLoad scripts to load data from Oracle to Teradata.
  • Performed tuning for highly skewed queries.
  • Handled spool space requests from end users.
  • Created Join Indexes per architectural strategies and standards, and created PPIs on large tables to improve query performance.
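The FastLoad scripting described above generally follows a standard pattern; a minimal sketch, assuming placeholder server credentials, file paths, and table names:

```sql
-- FastLoad: bulk-loads an empty target table from a delimited flat file
LOGON tdprod/etl_user,etl_password;
DATABASE stg;

BEGIN LOADING stg.sales_stg
    ERRORFILES stg.sales_err1, stg.sales_err2
    CHECKPOINT 100000;

SET RECORD VARTEXT "|";
DEFINE in_sale_id (VARCHAR(18)),
       in_amount  (VARCHAR(18)),
       in_sale_dt (VARCHAR(10))
FILE = /data/inbound/sales.dat;

INSERT INTO stg.sales_stg
VALUES (:in_sale_id, :in_amount, :in_sale_dt);

END LOADING;
LOGOFF;
```

FastLoad requires an empty target with no secondary indexes, which is why it is paired here with a staging table rather than the base warehouse tables.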

Environment: Teradata, Informatica Power Center 9.1, Oracle 11g, Unix, UC4, Hadoop, Hive, Pig.

Confidential

Sr.Teradata Developer

Responsibilities:

  • Developed Informatica mappings, enabling the extract, transform and loading of the data into target tables.
  • Created workflows with the event wait task to specify when the workflow should load the tables. Loaded data from various sources (Flat files, Oracle, Teradata) using different Transformations like Source Qualifier, Joiner, Router, Sorter, Aggregator, Connected and Unconnected Lookup, Expression, Sequence Generator, Union and Update Strategy to load the data into the target.
  • Used Informatica Power Center Workflow manager to create sessions and workflows to run the logic embedded in the mappings.
  • Fixed issues with the existing FastLoad/MultiLoad scripts for smoother, more effective loading of data into the warehouse.
  • Worked on loading data from several flat-file sources to staging using MLOAD and FLOAD.
  • Created BTEQ scripts with data transformations for loading the base tables.
  • Generated reports using Teradata BTEQ.
  • Optimized and tuned Teradata SQL to improve batch performance and data response time for users.
  • Used Fast Export utility to extract large volume of data and send files to downstream applications.
  • Provided performance tuning and physical and logical database design support in projects for Teradata systems, and managed user-level access rights through roles.
  • Developed TPump scripts to load low-volume data into the Teradata RDBMS in near real time.
  • Created BTEQ scripts to load data from the Teradata staging area to the base tables.
  • Performed tuning of Queries for optimum performance.
  • Preparation of Test data for Unit testing and data validation tests to confirm the transformation logic.
  • Tuned various queries by COLLECTING STATISTICS on columns in the WHERE and JOIN expressions.
  • Tuned Teradata SQL statements using Teradata EXPLAIN.
  • Collected statistics periodically on tables to improve system performance.
  • Instrumental in Team discussions, Mentoring and Knowledge Transfer.
  • Responsible for Implementation & Post Implementation support.
  • Documentation of scripts, specifications and other processes.
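The COLLECT STATISTICS and EXPLAIN-driven tuning mentioned above typically looks like the following sketch (table and column names are hypothetical):

```sql
-- Collect stats on join/WHERE columns so the optimizer costs plans accurately
COLLECT STATISTICS COLUMN (cust_id) ON edw.sales_fact;
COLLECT STATISTICS COLUMN (sale_dt) ON edw.sales_fact;

-- Inspect the plan: watch for product joins, full-table scans, and high spool
EXPLAIN
SELECT c.cust_name, SUM(f.amount)
FROM   edw.sales_fact   f
JOIN   edw.customer_dim c
       ON c.cust_id = f.cust_id
WHERE  f.sale_dt BETWEEN DATE '2015-01-01' AND DATE '2015-12-31'
GROUP BY c.cust_name;
```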

Environment: Teradata V2R12, Teradata SQL Assistant, MLOAD, FASTLOAD, BTEQ, TPUMP, Informatica Power Center 9.1, Oracle 11g, Flat files, XML, DB2, Oracle SQL Developer, Erwin, Unix, WinSCP, Putty.

Confidential

Teradata/ETL Consultant

Responsibilities:

  • Designed and developed anonymous blocks, views, materialized views, stored procedures, functions, ref and traditional cursors, and dynamic SQL as part of project/application requirements.
  • Experienced in developing mappings, sessions and workflows in Informatica Power Center.
  • Prepared various mappings to load the data into different stages like Landing, Staging and Target tables.
  • Used various transformations such as Source Qualifier, Expression, Aggregator, Joiner, Filter, Lookup, and Update Strategy while designing and optimizing mappings.
  • Developed Workflows using task developer, worklet designer, and workflow designer in Workflow manager and monitored the results using workflow monitor.
  • Applied Slowly Changing Dimensions like Type 1 and 2 effectively to handle the delta Loads.
  • Took the initiative to automate ad-hoc processes per business requirements.
  • Created and maintained ETL processes using BTEQ routines for extraction, transformation, and loading.
  • Automated file-feed retrieval, loading, and transformation using UNIX scripts and Autosys jobs.
  • Interact with technical and business analyst, operation analyst to resolve data issues.
  • Creation of BTEQ, Fast export, Multi Load and Fast load scripts for extracting/loading data from various production systems.
  • Experience in partitioning tables to improve performance and scalability.
  • Experience in developing analytical functions like Rank, Dense Rank, Row Number and Partition Over.
  • Tuned database SQL statements and procedures by examining explain plans, monitoring run times and system statistics. Applied database hints, fine-tuned indexes and rewrote code to improve performance.
  • Designed mappings to remove duplicate source records.
  • Responsible for Unit level testing and Application testing.
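The duplicate-removal and analytical-function work above can be expressed natively in Teradata with the QUALIFY clause; a minimal sketch with hypothetical names:

```sql
-- Keep the most recent record per business key, dropping earlier duplicates
SELECT src.*
FROM   stg.customer_stg src
QUALIFY ROW_NUMBER() OVER (
            PARTITION BY cust_id
            ORDER BY load_dt DESC
        ) = 1;
```

Pushing deduplication into a QUALIFY step like this avoids a separate Sorter/Aggregator pass in the ETL tool.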

Environment: Informatica power center 9.6.1, Teradata, Oracle 12c/11g (SQL, PL/SQL), SQL plus, UNIX Shell Scripting, Teradata SQL Assistant, Toad and Windows.

Confidential

ETL/ Teradata Developer

Responsibilities:

  • Involved in gathering and discussing business requirements with users and team members.
  • Participated in code reviews of work prepared by the teams.
  • Fulfilled ad-hoc requests from users.
  • Managed and monitored production runs and fixed failures.
  • Writing SQL, BTEQ scripts, UNIX scripts.
  • Worked on Teradata utilities such as Fast Load, Multi Load, Fast Export, Teradata Administrator, etc.
  • Developing mappings and workflows using Informatica client tools. Used various transformations like SQ, Expression, Aggregator, Look Up, Router, Filter, Sorter and Update Strategy
  • Did error handling and performance tuning in Teradata queries and utilities.
  • Performing unit testing & system testing. Checking that all the system parameters are properly defined before starting the production runs.
  • Enhanced solutions for the client by developing complex ad-hoc queries that meet business requirements.
  • Optimized code for better execution plans and faster availability of data.
  • Performed regression between two databases/tables after transforming and loading data; developed a reusable asset for this, 'Database Replication & Regression', which is widely used by other project members.
  • Created/Enhanced Teradata Stored Procedures to generate automated testing SQLs.
  • Managed a team of 5 members and delivered productive results at client place.
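The BTEQ error handling noted above is conventionally done with return-code checks so the scheduler sees a failure; a minimal sketch, assuming placeholder logon details and table names:

```sql
.LOGON tdprod/etl_user,etl_password;

-- Abort the batch on failure, returning a nonzero code to the scheduler
INSERT INTO edw.sales_fact
SELECT * FROM stg.sales_stg;
.IF ERRORCODE <> 0 THEN .QUIT 8;

COLLECT STATISTICS COLUMN (cust_id) ON edw.sales_fact;
.IF ERRORCODE <> 0 THEN .QUIT 8;

.LOGOFF;
.QUIT 0;
```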

Environment: Teradata 14, Informatica 9.5, BTEQ, MultiLoad, FastLoad, FastExport, MicroStrategy, UNIX, Windows, Tidal Scheduler.

Confidential

Teradata Consultant

Responsibilities:

  • Performed the systems analysis on Teradata DW in preparation for the complex data loads
  • Participated in Joint Analysis Design with user community and Business users
  • Extracted data from several sources, including flat files and RDBMS databases; used BTEQ for batch processing jobs to automate daily reports for transactions with multiple access levels. Developed FastExport, FastLoad, and MultiLoad scripts to simulate the environment in the development database.
  • Executing change requests for DDL & DML statements from the users.
  • Managed Teradata databases and space issues (MAXPERM, MAXSPOOL, etc.), created user accounts, fixed performance issues, and used tools such as Teradata Administrator, Teradata Manager, PMON, and DBS Control.
  • Coded and tested complex SQL using Queryman/SQL Assistant.
  • Responsible for moving the data from operational data store (ODS) to staging warehouse and from the warehouse to the data mart for an application
  • Created tables, access views and update views in the development database for unit and system test.
  • Built the operational data store using BTEQ, Teradata, Oracle, SQL, PL/SQL, and UNIX.
  • Working with the users and testing teams to implement the business logic as expected.
  • Performed unit and system test for the modified code and loaded shadow data marts for testing prior to production implementation
  • Worked with DBA's for transition from Development to Testing and Testing to Production.
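The space management (MAXPERM/MAXSPOOL) and access-view work above corresponds to DDL along these lines (database names and sizes are hypothetical):

```sql
-- Allocate PERM and SPOOL when creating a development database
CREATE DATABASE dev_edw FROM dba_root
    AS PERM  = 50e9   -- persistent table storage (MAXPERM)
     , SPOOL = 100e9; -- intermediate query results (MAXSPOOL)

-- Access view: projection over the base table that avoids blocking writers
REPLACE VIEW dev_edw_v.customer AS
LOCKING ROW FOR ACCESS
SELECT cust_id, cust_name
FROM   dev_edw.customer;
```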

Environment: Teradata V2R5, Teradata Manager, Teradata Administrator, Queryman, BTEQ, FastLoad, FastExport, Informatica Power Center 8, SQL, PL/SQL, and UNIX.
