
Sr. ETL-Teradata Developer Resume


Rancho Cordova, CA

SUMMARY:

  • 6+ years of IT consulting experience in the analysis, design, coding, development, testing, and maintenance of data warehouse systems. Hands-on experience includes developing Data Warehouses/Data Marts/ODS in the Insurance and Energy & Utilities domains.
  • Strong experience with the Teradata database in data warehousing environments.
  • Expert-level experience in Data Integration and Data Warehousing using the ETL tool Informatica PowerCenter 7.x/8.x/9.x. Knowledge of the Informatica tools PowerExchange, PowerCenter, Data Analyzer, and Metadata Manager.
  • Expert-level mastery in designing and developing complex mappings to extract data from diverse sources including flat files, RDBMS tables, legacy system files, XML files, applications, COBOL sources, and Teradata.
  • Expertise in using Teradata SQL Assistant and data load/export utilities such as BTEQ, FastLoad, MultiLoad, FastExport, and TPump in UNIX environments.
  • Proficient in Teradata V2R6.2/12/13.10/14 database design (conceptual and physical), query optimization, and performance tuning.
  • Familiarity with Teradata's MPP architecture, including its shared-nothing design, Nodes, AMPs, the BYNET, partitioning, primary indexes, secondary indexes, and Teradata Explain.
  • Strong experience with data analysis, data modeling, extraction, loading, creation of tables, views, query optimization and performance tuning.
  • Technical and functional experience in data warehouse implementations and ETL methodology using Informatica PowerCenter 9.0.1/8.6/8.1/7.1.
  • Extensively worked on the Informatica Designer, Repository Manager, Repository Server, Workflow Manager/Server Manager and Workflow Monitor.
  • Worked with Informatica Mapping Variables, Mapping Parameters and Parameter Files.
  • Proficient in implementing Complex business rules by creating re-usable transformations, workflows/worklets and Mappings/Mapplets.
  • Hands on experience in identifying and resolving performance bottlenecks in various levels like sources, targets, mappings, transformations and sessions.
  • Knowledge of pushdown optimization concepts and of tuning Informatica objects for optimal execution times.
  • Developed slowly changing dimension mappings of Type 1, Type 2, and Type 3 (version, flag, and timestamp variants); see the SCD sketch after this list.
  • Experience in developing reusable components and partitioning sessions across projects.
  • Experience in developing incremental aggregation mappings to update values in flat tables.
  • Sound knowledge of Data Warehouse/Data Mart design and data modeling techniques, with a very good understanding of dimensional modeling.
  • Self-motivated, with excellent communication and interpersonal skills, and a strong ability to work as part of a team as well as handle independent responsibilities.
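
A minimal Teradata SQL sketch of the Type 2 (flag and timestamp) pattern mentioned above. All table and column names (cust_dim, cust_stg, curr_flag, eff_ts, end_ts) are hypothetical, shown only to illustrate the expire-then-insert logic an equivalent Informatica mapping would implement:

```sql
-- Step 1: close out the current dimension row when a tracked attribute changes.
UPDATE d
FROM cust_dim AS d, cust_stg AS s
SET curr_flag = 'N',
    end_ts    = CURRENT_TIMESTAMP
WHERE d.cust_id   = s.cust_id
  AND d.curr_flag = 'Y'
  AND d.cust_addr <> s.cust_addr;

-- Step 2: insert a fresh current version for new and changed customers.
-- (Changed rows were expired in step 1, so they no longer have a 'Y' row.)
INSERT INTO cust_dim (cust_id, cust_addr, curr_flag, eff_ts, end_ts)
SELECT s.cust_id, s.cust_addr, 'Y', CURRENT_TIMESTAMP, NULL
FROM cust_stg AS s
LEFT JOIN cust_dim AS d
  ON d.cust_id = s.cust_id
 AND d.curr_flag = 'Y'
WHERE d.cust_id IS NULL;
```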

TECHNICAL SKILLS:

Primary Tools: Informatica PowerCenter 9.5/9.0.1/8.6/8.1, Ab Initio (Co>Op 3.0.3.9/2.15/2.14, GDE 3.0.4/1.15/1.14), Teradata SQL, Teradata Tools and Utilities

Languages: Teradata SQL, Unix Shell Scripting

Teradata Utilities: BTEQ, FastLoad, MultiLoad, TPump, SQL Assistant, Teradata Manager

Databases: Teradata 14/13.10/13/12/V2R6.2, Oracle 10g/9i, DB2, MS SQL Server 6.5/7.0/2000

Operating Systems: Windows 95/98/NT/2000/XP, UNIX, Linux

Scheduling tools: Control-M, Autosys, UC4

Data Modeling: Erwin 7.3, ER Studio

PROFESSIONAL EXPERIENCE:

Confidential, Rancho Cordova, CA

Sr. ETL-Teradata Developer

Responsibilities:

  • Involved in scrum (agile) implementation for this project.
  • Designed and developed data solutions to help the product and business teams make data-driven decisions.
  • Worked closely with Business Systems Analysts (BSAs) to gather requirements, translate them into technical requirements, and deliver the solution to end users.
  • Led end-to-end efforts, including design, development, and implementation, of the data integration process. Adhered to best-practice naming conventions and coding standards to keep the data model consistent.
  • Responsible for performance tuning of high-CPU queries and of tables with high skew in daily batch jobs.
  • Interacted closely with data infrastructure and engineering teams to build and extend ETL processes.
  • Provided consultation to business partners such as analysts, management, end users, and developers to clarify objectives, determine scope, drive consensus, identify problems and recommend solutions.
  • Supported end users on ad hoc data usage and served as a subject matter expert on the functional side of the business.
  • Designed and developed Informatica mappings and sessions, based on business user requirements and business rules, to load data from Oracle GoldenGate to flat files and then to target Teradata tables.
  • Created reusable Mapplets and transformations, started concurrent batch processes on the server, and performed backup, recovery, and tuning of sessions.
  • Designed and developed complex mappings to move data from multiple sources into common target areas such as Data Marts and the Data Warehouse, using Lookup, Source Qualifier, Router, Filter, Expression, Aggregator, Joiner, Normalizer, Sequence Generator, and Update Strategy transformations in Informatica.
  • Performed development using Teradata utilities such as BTEQ, FastLoad, MultiLoad, and TPT to populate the data into the BI DW.
  • Wrote complex SQL using joins, subqueries, and correlated subqueries, and used SQL queries for cross-verification of data.
  • Developed Teradata BTEQ scripts to load data into incremental/staging tables and then move it from staging into base tables (see the BTEQ sketch after this list).
  • Prepared test cases for various functionalities and created test scripts used in testing Oracle stored procedures.
  • Executed test scripts to verify actual results against expected results, using Oracle GoldenGate for source validation and Teradata for target validation.
  • Developed the FTP process from the UNIX servers to the file services location for vendor delivery.
  • Loaded customer service data into the EDW for the BI team to generate reports for end users.
  • Executed unit tests and validated expected results, iterating until all test conditions passed.
  • Used Teradata DBQL to monitor queries running in production and modified them for better SLAs (see the DBQL sketch after this list).
  • Used the Teradata SQL Assistant import and export utility to move data from production to development to refresh staging tables.
  • Developed several jobs to improve performance by reducing runtime using different partitioning techniques.
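
A minimal BTEQ sketch of the staging-to-base pattern described in the BTEQ bullet above; the TDPID, credentials, and table names (stg_db.claim_stg, edw_db.claim) are placeholders:

```sql
.LOGON tdprod/etl_user,*****

-- Move the day's delta from staging into the base table; object names are placeholders.
INSERT INTO edw_db.claim (claim_id, member_id, claim_amt, load_dt)
SELECT s.claim_id, s.member_id, s.claim_amt, CURRENT_DATE
FROM   stg_db.claim_stg AS s
WHERE  NOT EXISTS
       (SELECT 1
        FROM   edw_db.claim AS b
        WHERE  b.claim_id = s.claim_id);

-- Fail the batch with a non-zero return code so the scheduler (e.g. UC4) can alert.
.IF ERRORCODE <> 0 THEN .QUIT 8

.LOGOFF
.QUIT 0
```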
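A sketch of the kind of DBQL check behind the monitoring bullet above, assuming query logging is enabled. DBC.QryLogV is the standard DBQL view; the 1,000-second AMPCPUTime cutoff is an arbitrary illustrative threshold:

```sql
-- Find yesterday's heaviest CPU consumers from DBQL.
SELECT  UserName,
        QueryID,
        AMPCPUTime,
        TotalIOCount,
        (FirstRespTime - StartTime) DAY(4) TO SECOND AS ElapsedTime
FROM    DBC.QryLogV
WHERE   CAST(StartTime AS DATE) = CURRENT_DATE - 1
  AND   AMPCPUTime > 1000        -- illustrative threshold
ORDER BY AMPCPUTime DESC;
```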

Environment: Teradata 14.10/14/13.10, Teradata SQL Assistant, Teradata Utilities (TPT, BTEQ, MultiLoad, FastLoad), Informatica PowerCenter 9.6/9.1, Workflow Manager, Workflow Monitor, Transformation Developer, Mapplet Designer, Mapping Designer, Repository Manager, UC4, Teradata Viewpoint, UNIX, PuTTY, PowerDesigner, Oracle GoldenGate, Oracle SQL Developer.

Confidential, Framingham, MA

Sr. ETL-Informatica/Teradata Developer

Responsibilities:

  • Involved in Requirement gathering, business Analysis, Design and Development, testing and implementation of business rules.
  • Worked on developing mappings/Reusable Objects/Transformation/Mapplet by using mapping designer, transformation developer and Mapplet designer in Informatica Power Center 9.6.
  • Worked with partitioned primary index (PPI) tables, utilizing RANGE_N and CASE_N for partitioning (see the DDL sketch after this list).
  • Created reusable sessions, Workflows, Worklets, and Assignment, Decision, Event Wait, Event Raise, and Email tasks, and scheduled tasks per client requirements.
  • Used Informatica debugger to test the data flow and fix the mappings.
  • Developed applications that move data from legacy systems and external data sources into a staging area, transform it, and load it into the Teradata enterprise data warehouse.
  • Applied broad, in-depth business and technical knowledge to resolve production support and sustainment activities.
  • Performed Unit testing on the module designed.
  • Implemented various Teradata-specific features, such as selection of PI, USI/NUSI, PPI, and compression, based on requirements (see the DDL sketch after this list).
  • Developed processes using utilities such as TPT, MultiLoad, FastLoad, FastExport, and BTEQ (see the FastLoad sketch after this list).
  • Designed and developed a new data warehouse (Analytical Data Warehouse) for better reporting and analysis.
  • Performed tuning and optimization of complex SQL queries using Teradata Explain.
  • Expert level knowledge of complex SQL using Teradata functions, macros and stored procedures.
  • Analyzed the source data, made decisions on appropriate extraction, transformation, and loading strategies.
  • Wrote FastLoad jobs to load data from various data sources and legacy systems into Teradata staging.
  • Modified existing BTEQ scripts to enhance performance by using volatile tables, incorporating parallelism, collecting statistics when needed, and applying index techniques (see the volatile-table sketch after this list).
  • Developed new mappings, and modified existing ones for new business requirements, to load staging tables and then target tables in the EDW; also created Mapplets for reuse across mappings.
  • Used Informatica debugging techniques to debug mappings, and used session log files and bad files to trace errors that occurred while loading.
  • Involved in unit testing and user acceptance testing to verify that data extracted from the different source systems loaded into the targets according to user requirements.
  • Documented ETL test plans, test cases, test scripts, test procedures, assumptions, and validations based on design specifications for unit testing, system testing, expected results.
  • Registered every new table in the ETL-CTRL table.
  • Assisted developers and DBAs with design, architecture, development, and query tuning for the project, including query modification, index selection, and refreshing statistics collection.
  • Proactively monitored and aborted bad queries using Viewpoint, watched for blocked sessions, and worked with development teams to resolve them.
  • Created UNIX shell scripts for Informatica pre- and post-session operations, database administration, and day-to-day activities such as monitoring network connections and database ping utilities.
  • Developed UNIX shell scripts to run batch jobs in Autosys and load into production.
  • Successfully integrated data across multiple high-volume data sources and target applications.
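
An illustrative DDL sketch tying together the PI, PPI (RANGE_N/CASE_N), and compression bullets above; the table, columns, and chosen values are hypothetical:

```sql
-- Hypothetical fact table with a NUPI, monthly RANGE_N partitioning,
-- and multi-value compression on a low-cardinality column.
CREATE MULTISET TABLE edw_db.sales_fact
( sale_id    INTEGER      NOT NULL,
  store_id   INTEGER      NOT NULL,
  sale_dt    DATE         NOT NULL,
  sale_amt   DECIMAL(12,2),
  channel_cd CHAR(2) COMPRESS ('WB', 'ST', 'CC')   -- MVC on frequent values
)
PRIMARY INDEX (store_id)   -- NUPI chosen for join access and even distribution
PARTITION BY RANGE_N (sale_dt BETWEEN DATE '2012-01-01'
                              AND     DATE '2015-12-31'
                              EACH INTERVAL '1' MONTH,
                      NO RANGE, UNKNOWN);

-- A CASE_N variant partitions on discrete business rules instead of ranges:
-- PARTITION BY CASE_N (sale_amt <  100,
--                      sale_amt < 1000,
--                      NO CASE, UNKNOWN);
```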
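A minimal FastLoad sketch for the utility bullets above, loading a pipe-delimited flat file into an empty staging table; the TDPID, credentials, file path, and object names are placeholders:

```sql
/* FastLoad requires an empty target table. */
LOGON tdprod/etl_user,*****;
DATABASE stg_db;

BEGIN LOADING stg_db.policy_stg
      ERRORFILES stg_db.policy_err1, stg_db.policy_err2;

SET RECORD VARTEXT "|";

DEFINE policy_id (VARCHAR(18)),
       member_id (VARCHAR(18)),
       prem_amt  (VARCHAR(14))
FILE = /data/inbound/policy.dat;

INSERT INTO stg_db.policy_stg
VALUES (:policy_id, :member_id, :prem_amt);

END LOADING;
LOGOFF;
```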
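A sketch of the volatile-table tuning pattern from the BTEQ bullet above. The account tables and filter are hypothetical, but the shape (materialize a small set with a sensible PI, collect stats on the join column, then join to the large table) is the technique described:

```sql
-- Stage the small, filtered set in a volatile table with its own PI.
CREATE VOLATILE TABLE vt_active_acct AS
( SELECT acct_id, acct_type
  FROM   edw_db.account
  WHERE  status_cd = 'A'
) WITH DATA
PRIMARY INDEX (acct_id)
ON COMMIT PRESERVE ROWS;

-- Give the optimizer demographics on the join column.
COLLECT STATISTICS COLUMN (acct_id) ON vt_active_acct;

-- Join the staged set to the large transaction table.
SELECT   v.acct_type, SUM(t.txn_amt)
FROM     edw_db.txn AS t
JOIN     vt_active_acct AS v
  ON     v.acct_id = t.acct_id
GROUP BY 1;
```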

Environment: Informatica PowerCenter 9.6/9.5/9.1, Teradata 13.10/14, DB2, Oracle 11g, SQL Server, UNIX Shell Scripting, Windows XP, Erwin, Fixed-width files, Flat Files, Control-M, Autosys.

Confidential, Bloomfield, CT

Sr. Informatica/Teradata Developer

Responsibilities:

  • Prepared requirements document in order to achieve business goals and to meet end user expectations.
  • Involved in Unit and QA testing of multiple DW projects using SQL and Testing tools.
  • Created Mapping document from Source to stage and Stage to target mapping.
  • Involved in creating data models using Erwin.
  • Worked with Designer tools such as Source Analyzer, Target Designer, Mapping Designer, Mapplet Designer, and Transformation Developer.
  • Designed mappings to include restart logic.
  • Created Source and Target Definitions, Reusable transformations, Mapplets and Worklets.
  • Created Mappings and used transformations like Source Qualifier, Filter, Update Strategy, Lookup, Router, Joiner, Normalizer, Aggregator, Sequence Generator and Address validator.
  • Involved in tuning the mappings, sessions and the Source Qualifier query.
  • Identified performance issues in existing sources, targets and mappings by analyzing the data flow, evaluating transformations and tuned accordingly for better performance.
  • Developed mappings to load Fact and Dimension tables, SCD Type 1 and SCD Type 2 dimensions and Incremental loading and unit tested the mappings.
  • Performed Unit testing and created Unix Shell Scripts and provided on call support.
  • Created sessions and workflows to run with the logic embedded in the mappings.
  • Actively participated in Scrum Meetings.
  • Created and modified several UNIX shell Scripts according to the changing needs of the project and client requirements.
  • Wrote Unix Shell Scripts to process the data received from source system on daily basis.
  • Involved in the continuous enhancements and fixing of production problems.
  • Worked with systems analysts to understand source system data to develop accurate ETL programs.
  • Loaded data into the Enterprise Data Warehouse using Teradata utilities such as BTEQ, FastLoad, MultiLoad, and FastExport in both mainframe and UNIX environments.
  • Utilized BTEQ for report generation as well as for running batch jobs.
  • Reviewed SQL for missing joins and join constraints, data format issues, mismatched aliases, and casting errors.
  • Extensively used the Teradata Analyst Pack: Teradata Visual Explain, Teradata Index Wizard, and Teradata Statistics Wizard.
  • Tuned Teradata SQL statements using Explain, analyzing data distribution among AMPs and index usage, collecting statistics, defining indexes, revising correlated subqueries, using hash functions, etc.
  • Developed UNIX shell scripts and used the BTEQ, FastLoad, MultiLoad, and FastExport utilities extensively to load the target database.
  • Created several BTEQ scripts involving derived tables and volatile/global temporary tables to extract data for business users on a scheduled and ad hoc basis (see the temporary-table sketch after this list).
  • Performance-tuned user queries by analyzing explain plans, recreating user driver tables with the right primary index, scheduling statistics collection, and adding secondary or join indexes.
  • Participated in discussions with team members on production issues and resolved them.
  • Managed Scheduling of Tasks to run any time without any operator intervention.
  • Worked with pre- and post-session SQL commands to drop and recreate data warehouse indexes via the Source Qualifier transformation in Informatica PowerCenter (see the index sketch after this list).
  • Created UNIX shell scripts to automate sessions and cleanse the source data.
  • Provided Production Support.
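
A sketch of the temporary-table extract pattern from the BTEQ bullet above. Unlike a volatile table, a global temporary table's DDL is created once; each session then gets its own private, automatically emptied copy. Object names and the 30-day window are hypothetical:

```sql
-- One-time DDL: the definition persists, the rows are per-session.
CREATE GLOBAL TEMPORARY TABLE edw_db.gtt_mbr_snapshot
( member_id INTEGER,
  region_cd CHAR(3),
  claim_cnt INTEGER
)
PRIMARY INDEX (member_id)
ON COMMIT PRESERVE ROWS;

-- Per-run extract for business users.
INSERT INTO edw_db.gtt_mbr_snapshot
SELECT member_id, region_cd, COUNT(*)
FROM   edw_db.claim
WHERE  svc_dt BETWEEN CURRENT_DATE - 30 AND CURRENT_DATE
GROUP BY 1, 2;

SELECT   region_cd, SUM(claim_cnt) AS total_claims
FROM     edw_db.gtt_mbr_snapshot
GROUP BY 1
ORDER BY 2 DESC;
```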
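A sketch of the pre-/post-session SQL pair from the index bullet above: drop a secondary index before the bulk load, then rebuild it and refresh stats afterwards. The index and table names are hypothetical:

```sql
-- Pre-session SQL (runs before the Informatica session loads the table):
DROP INDEX idx_claim_mbr ON edw_db.claim;

-- Post-session SQL (runs after the load completes):
CREATE INDEX idx_claim_mbr (member_id) ON edw_db.claim;
COLLECT STATISTICS COLUMN (member_id) ON edw_db.claim;
```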

Environment: Informatica PowerCenter 9.5.1, Informatica Developer 9.5.1, Teradata 13.10/14, UNIX, Fixed-width files, Oracle, SQL Server, Harvest (SCM), Windows XP, and MS Office Suite.

Confidential

ETL-Teradata/Informatica Developer

Responsibilities:

  • Analyzed business requirements and prepared the physical design, high-level designs, and technical specifications.
  • Tuned batch BTEQ queries for performance.
  • Worked with Designer tools such as Source Analyzer, Target Designer, Mapping Designer, Mapplet Designer, and Transformation Developer.
  • Designed mappings to include restart logic.
  • Created Source and Target Definitions, Reusable transformations, Mapplets and Worklets.
  • Created Mappings and used transformations such as Source Qualifier, Filter, Update Strategy, Lookup, Router, Joiner, Normalizer, Aggregator, Sequence Generator, and Address Validator.
  • Enhanced queries in the other application to run faster and more efficiently.
  • Extensively used the Teradata utilities BTEQ, FastLoad, MultiLoad, and TPump, along with DDL and DML commands (SQL).
  • Created .dat files using FastExport and developed a common FTP script to port them onto the client's server (see the FastExport sketch after this list).
  • Used the BTEQ and SQL Assistant (Queryman) front-end tools to issue SQL commands matching the business requirements to the Teradata RDBMS.
  • Developed complex mappings using Informatica PowerCenter Designer to transform and load data from various source systems, such as Oracle, SQL Server, and flat files, into the target EDW.
  • Created proper Teradata Primary Indexes (PIs), taking into consideration both planned access of data and even distribution of data across all available AMPs; weighing the business requirements against those factors, created appropriate Teradata NUSIs for smooth (fast and easy) access of data.
  • Designed the mappings between sources (external files and databases) to Operational staging targets.
  • Performance-tuned Teradata SQL statements using the Teradata Explain command.
  • Extracted data from Teradata, processed/transformed it using ksh programs, and loaded it into the Data Mart.
  • Used various Teradata Index techniques to improve the query performance.
  • Arranged meetings on a regular basis to go over open issues.
  • Monitored database space, identified tables with high skew, and worked with the data modeling team to change the Primary Index on high-skew tables (see the skew-check sketch after this list).
  • Involved in building tables, views and Indexes.
  • Involved in ad hoc querying, quick deployment, and rapid customization, making it even easier for users to make business decisions.
  • Created pre-session and post-session shell scripts and mail-notifications.
  • Extensively worked on the Informatica Designer, Repository Manager, Repository Server, Workflow Manager/Server Manager and Workflow Monitor.
  • Worked on Cognos report design, suggesting best practices to improve performance.
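
A minimal FastExport sketch for the .dat-file bullet above; the logtable, credentials, file path, and query are placeholders:

```sql
/* Write a pipe-delimited .dat extract for FTP delivery. */
.LOGTABLE work_db.fexp_acct_log;
.LOGON tdprod/etl_user,*****;

.BEGIN EXPORT SESSIONS 8;
.EXPORT OUTFILE /data/outbound/acct_extract.dat
        MODE RECORD FORMAT TEXT;

SELECT  TRIM(acct_id) || '|' ||
        TRIM(acct_nm) || '|' ||
        TRIM(CAST(balance AS VARCHAR(20)))
FROM    edw_db.account
WHERE   status_cd = 'A';

.END EXPORT;
.LOGOFF;
```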
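A sketch of the skew check behind the space-monitoring bullet above, using the standard DBC.TableSizeV view; the database filter and 50% threshold are illustrative:

```sql
-- Compare each table's largest-AMP share against its average across AMPs;
-- a high SkewPct means one AMP holds far more than its share of the rows.
SELECT  DatabaseName,
        TableName,
        SUM(CurrentPerm) AS TotalPerm,
        MAX(CurrentPerm) AS MaxAmpPerm,
        CAST((1 - AVG(CurrentPerm) / NULLIFZERO(MAX(CurrentPerm))) * 100
             AS DECIMAL(5,2)) AS SkewPct
FROM    DBC.TableSizeV
WHERE   DatabaseName = 'edw_db'   -- illustrative database
GROUP BY 1, 2
HAVING  SkewPct > 50              -- illustrative threshold
ORDER BY SkewPct DESC;
```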

Environment: Teradata 12, Informatica Power Center 8.6.1, Oracle 11g, Erwin, Unix, Oracle Applications 11i, Sun Solaris
