
Sr. ETL Developer Resume


Philadelphia, PA

SUMMARY:

  • Over 9.5 years of experience as a Sr ETL Developer, specializing in building Data Marts and Data Warehouses, with expertise in the ETL tool Informatica Power Center 9.5/9.1/8.6/8.5/8.1/7.1
  • Strong functional experience in the domains of Consumer Healthcare, Retail and Banking.
  • Strong working experience in the Data Analysis, Design, Development, Implementation and Testing of Data Warehousing using ETL
  • Extensively involved in creating Complex Mappings and reusable components like Reusable Transformations, Mapplets, Worklets and control tasks to implement reusable business logic
  • Developed Slowly Changing Dimension Mappings of Type I and Type II (a Type 2 SQL sketch follows this summary)
  • Assisted in Data modeling by creating Star schemas using MS Visio and Erwin tools.
  • Prepared a detailed Technical Design Document after analyzing the Functional Spec and the Architectural Diagram; it captured the functional and technical requirements of each interface and guided the entire ETL design and build.
  • Worked with various RDBMS sources (Oracle, SQL Server) and flat files.
  • Performed Data Exploration, Data Profiling and Data Cleansing before staging data into the staging tables.
  • Release management, Code Migration from Development to QA and Production, Scheduling the Jobs in Development, QA and Production
  • Installing and configuring the Informatica Power Center domain, repositories and services, and administering Informatica Power Exchange services.
  • Member of Code review team in multiple projects.
  • Working experience in using Oracle 9i/10g/11g and SQL Server 2005/2008.
  • Extensively used PL/SQL in writing Stored Procedures, Functions, Packages and Triggers
  • Involved in Teradata SQL development, unit testing and performance tuning, and ensured testing issues were resolved using Defect Reports
  • Extensively worked with Teradata utilities like BTEQ, Fast Export, Fast Load, Multi Load to Export and Load data to/from different source systems including flat files.
  • Creation of customized Mload scripts on UNIX platform for Teradata loads
  • Developed source to Target mapping documents to support ETL design.
  • Possess extensive knowledge of PL/SQL Database Triggers, Stored Procedures, Functions, and Database Constraints.
  • Extensively used ETL methodology for supporting Data Extraction, Transformation and Loading processes in a corporate-wide ETL solution using different versions of Informatica.
  • Involved in Testing/Debugging SQL for performance issues, used different scenarios, and fixed different test cases.
  • In-depth understanding and working knowledge of SQL.
  • Prepared the documents for the Administration/maintenance of Informatica Power Center including Installation, upgrading and patching.
  • Maintaining the Production environment and working on change requests.
  • Performance tuning to resolve bottlenecks.
  • Creating UNIX shell scripts for file processing, validation and archival.
  • Creating automation jobs in UNIX Shell for ETL.
  • Performing Impact Analysis by creating Data Lineage and a Business Glossary for Business users, and performing Audit and Reconciliation in Data Migration.
  • Excellent communication skills, quick grasping abilities and a completely dedicated Team player
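
The Type 2 Slowly Changing Dimension work noted above is built in Power Center with Update Strategy and Lookup logic; the SQL below is only a minimal sketch of that pattern, using hypothetical CUSTOMER_STG and CUSTOMER_DIM tables and column names that are not taken from any of the projects listed here.

    -- Illustrative SCD Type 2 logic (hypothetical staging and dimension tables).
    -- Step 1: expire the current dimension row when a tracked attribute changed.
    UPDATE customer_dim d
       SET d.eff_end_dt   = TRUNC(SYSDATE) - 1,
           d.current_flag = 'N'
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM customer_stg s
                    WHERE s.customer_id = d.customer_id
                      AND (s.cust_name <> d.cust_name OR s.cust_city <> d.cust_city));

    -- Step 2: insert a new current row for changed and brand-new customers.
    INSERT INTO customer_dim
           (customer_key, customer_id, cust_name, cust_city,
            eff_start_dt, eff_end_dt, current_flag)
    SELECT customer_dim_seq.NEXTVAL, s.customer_id, s.cust_name, s.cust_city,
           TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
      FROM customer_stg s
     WHERE NOT EXISTS (SELECT 1
                         FROM customer_dim d
                        WHERE d.customer_id  = s.customer_id
                          AND d.current_flag = 'Y');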

TECHNICAL SKILLS:

ETL: Informatica Power Center 9.5/9.1/8.6/8.5/8.1/7.1

Operating Systems: UNIX, Windows NT/2000/XP

Reporting tools: QlikView 11

DBMS: Oracle 11g/10g/9i, SQL Server 2005/2008, Teradata 11/12/13

Data modeling tool: Erwin, Oracle Designer 10g, MS Visio 2010

Database Tools: SQL*Loader, TOAD, PL/SQL Developer, SQL Developer

Languages: PL/SQL, UNIX Shell Scripting

ITIL Tools: ServiceNow

WORK EXPERIENCE:

Sr ETL Developer

Confidential, Philadelphia, PA

Responsibilities:

  • Interacted with Business Analysts to clearly understand business requirements.
  • As an active member of the Warehouse Design Team, assisted in creating Fact and Dimension tables as per requirements.
  • Design of the Data Warehouse was performed using Star Schema methodology.
  • Extensively used Informatica Power Center to create data mappings for extracting the data from various Relational systems, applying appropriate Transformations and Loading.
  • Performed Data cleansing using external tools like Name Parser and Dataflow.
  • Extensively used Informatica client tools: Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformation Developer.
  • Expertise in working in Teradata systems and used utilities like Multiload, Fastload, Fastexport, BTEQ, TPump, Teradata SQL
  • Wrote Teradata SQL queries for joins and table modifications.
  • Creation of customized MultiLoad (MLoad) scripts on the UNIX platform for Teradata loads (a sample control script sketch follows this list).
  • Implemented various integrity constraints for data integrity like Referential Integrity, using Primary key and foreign keys relationships.
  • Developed numerous Complex Informatica Mapplets and Reusable Transformations as needed.
  • Designed and created complex source to target mapping using various transformations inclusive of but not limited to Lookup, Aggregator, Joiner, Filter, Source Qualifier, Expression and Router Transformations.
  • Expertise in using different tasks (Session, Assignment, Command, Decision, Email, Event - Raise, Event- Wait, Control).
  • Optimized Query Performance, Mapping Performance, Session Performance and Reliability.
  • Configured the mappings to handle the updates to preserve the existing records using Update Strategy Transformation (Slowly Changing Dimensions SCD Type-2).
  • Implemented Stored Procedures, Functions, views, Triggers, Packages in PL/SQL.
  • Implemented Source Pre-Load, Source Post-Load, Target Pre-Load and Target Post-Load functionalities.
  • Extensive Performance tuning of Sources, Targets, Mappings and Sessions.
  • Used Debugger and breakpoints to view transformations output and debug mappings.
  • Implemented Pipeline Partitioning to improve performance.
  • Created UNIX shell scripts and cron jobs for batch processing; extensive experience with the Tivoli job scheduler.
  • Used Test Director to log the defects and coordinated with Test team for a timely resolution.
  • Provided Production Support at the end of every release.
  • Documented Technical specifications, business requirements and functional specifications for the all Informatica Extraction, Transformation and Loading (ETL) mappings.
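
The customized MultiLoad scripts referenced in this list follow the standard MLoad control-script structure; the sketch below is illustrative only, with a placeholder logon string and hypothetical work/error tables, layout and SALES_FACT target rather than the actual project objects.

    .LOGTABLE etl_wrk.sales_fact_ml_log;
    .LOGON tdprod/etl_user,password;          /* placeholder TDPID and credentials */

    .BEGIN IMPORT MLOAD
           TABLES      edw.sales_fact
           WORKTABLES  etl_wrk.sales_fact_wt
           ERRORTABLES etl_wrk.sales_fact_et etl_wrk.sales_fact_uv;

    .LAYOUT sales_layout;
    .FIELD store_id * VARCHAR(10);
    .FIELD sale_dt  * VARCHAR(10);
    .FIELD sale_amt * VARCHAR(18);

    .DML LABEL ins_sales;
    INSERT INTO edw.sales_fact (store_id, sale_dt, sale_amt)
    VALUES (:store_id, :sale_dt, :sale_amt);

    .IMPORT INFILE /data/in/sales_extract.dat
            FORMAT VARTEXT '|'
            LAYOUT sales_layout
            APPLY ins_sales;

    .END MLOAD;
    .LOGOFF;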

Environment: Informatica Power Center 9.1/8.6/8.1, Oracle 11g/10g, QlikView 11, Teradata 13.0/12.0, PL/SQL, SQL*Loader, UNIX, Erwin.

ETL Developer

Confidential, Atlanta, GA

Responsibilities:

  • Extraction and Transformation of data from various sources such as Oracle and flat files and loading them into the Oracle target database using Informatica Power Center.
  • Worked on Power Center client tools like Source Analyzer, Warehouse Designer, Mapping Designer, Mapplets and Transformations Developer.
  • Developed different types of transformations like Source qualifier, Expression, Filter, Aggregator, Lookup, Stored procedure and update strategies.
  • Created Mapplets for reusable business rules.
  • Ran workflows and sessions during production support and monitored workflow and session logs for errors.
  • Used Debugger and breakpoints to view and edit transformations output and debug mappings.
  • Expertise in working in Teradata systems and used utilities like Multiload, Fastload, Fastexport, BTEQ, TPump, Teradata SQL.
  • Wrote Teradata SQL queries for joins and table modifications.
  • Creation of customized Mload scripts on UNIX platform for Teradata loads.
  • Worked on the ETL strategy to store data validation rules and error handling methods for both expected and unexpected errors, and documented it carefully.
  • Used Update Strategies for cleansing, updating and adding to the data in the warehouse.
  • Designed and developed UNIX shell scripts as part of the ETL process to compare control totals, automate the process of loading, pulling and pushing data from & to different servers.
  • Used PL/SQL for database coding.
  • Developed pre- and post-session Stored Procedures to drop and recreate the indexes and keys of source and target tables (a PL/SQL sketch follows this list).
  • Extensively involved in unit and integration testing. Worked closely with QA team during the Testing phase and fixed bugs that were reported.
  • Optimized and performance-tuned mappings to achieve faster response times.
  • Carried out unit and Integration testing for Informatica mappings, sessions and workflows.
  • Analyzed Source Data to resolve post - production issues. Used MS Access to analyze source data from flat files.
  • Coordinated with end users and reporting teams to correlate business requirements.
  • Worked with change control requests to fix the problems in production data.
  • Prepared documentation describing the ETL processes.
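
The pre- and post-session stored procedures mentioned in this list typically drop a target index before a bulk load and rebuild it afterwards; the PL/SQL below is a minimal sketch of that idea with a hypothetical TGT_OWNER schema, ORDERS_FACT table and IDX_ORDERS_FACT_DT index, not the actual project code.

    -- Hypothetical post-session procedure: rebuild the index dropped before the load.
    CREATE OR REPLACE PROCEDURE rebuild_orders_fact_idx IS
    BEGIN
        -- Drop the index if it still exists (ignore ORA-01418, "index does not exist").
        BEGIN
            EXECUTE IMMEDIATE 'DROP INDEX tgt_owner.idx_orders_fact_dt';
        EXCEPTION
            WHEN OTHERS THEN
                IF SQLCODE != -1418 THEN
                    RAISE;
                END IF;
        END;

        -- Recreate the index on the freshly loaded fact table.
        EXECUTE IMMEDIATE
            'CREATE INDEX tgt_owner.idx_orders_fact_dt
                 ON tgt_owner.orders_fact (order_dt)';
    END rebuild_orders_fact_idx;
    /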

Environment: Informatica Power Center 8.6/8.1, Oracle 11g/10g, QlikView 11, Teradata 13.0/12.0, PL/SQL, SQL*Loader, UNIX, Erwin

ETL Developer

Confidential, Phoenix, AZ

Responsibilities:

  • Created Informatica mappings, sessions and workflows, including tasks such as Command, Event-Wait, Event-Raise, Timer and Assignment, based on business requirements
  • Involved in design, development and implementation of the Enterprise Data Warehouse (EDW) and Data Mart.
  • Used external tools like Address for cleansing the data in the source systems.
  • Designed mappings using Source qualifier, Joiner, Aggregator, Expression, Lookup, Router, Filter, and Update Strategy transformations and Mapplets to load data into the target involving slowly changing dimensions.
  • Used Workflow Manager for creating and maintaining the Sessions and Workflow Monitor to monitor workflows.
  • Enhanced existing UNIX shell scripts as part of the ETL process to schedule tasks/sessions.
  • Coordinated with end users and reporting teams to correlate Business requirements
  • Extraction, transformation and loading of data were carried out from different sources like Flat files, Sql Server and Power Exchange.
  • Expertise in working in Teradata systems and used utilities like Multiload, Fastload, Fastexport, BTEQ, TPump, Teradata SQL.
  • Wrote Teradata SQL queries for joins and table modifications.
  • Creation of customized Mload scripts on UNIX platform for Teradata loads.
  • Involved in creating and designing mappings and Mapplets using Expression, Filter, Router, Joiner, Lookup, Update Strategy, Stored Procedure, Union and other transformations
  • Used Unconnected Lookup and Stored Procedure transformations in the mapping.
  • Created and used complex aggregator transformation in various mappings.
  • Involved in designing and development of pre and post session routines.
  • Used Debugger and breakpoints to view transformations output and debug mappings.
  • Applied business rules using complex SQL and procedures.
  • Implemented validation rules to check data consistency (a sample check follows this list).
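
A common form of the data-consistency validation noted above is an orphan-key check between a fact table and its dimension; the query below is a sketch only, using hypothetical SALES_FACT and CUSTOMER_DIM tables rather than the project's actual schema.

    -- Hypothetical validation rule: flag fact rows whose customer key has no
    -- matching row in the customer dimension (orphan keys break consistency).
    SELECT f.customer_key,
           COUNT(*) AS orphan_rows
      FROM edw.sales_fact f
      LEFT JOIN edw.customer_dim d
        ON f.customer_key = d.customer_key
     WHERE d.customer_key IS NULL
     GROUP BY f.customer_key;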

Environment: Informatica Power Center 8.6/8.1, Oracle 11g/10g, QlikView 11, Teradata 11.0/12.0, Autosys, PL/SQL, SQL*Loader, UNIX, Erwin.

Sr ETL Developer

Confidential, Chicago, IL

Responsibilities:

  • Created Technical Specification documents based on high level requirement documents
  • Reviewed technical specification documents with the functional owners.
  • Built the data movement process that loads data from DB2 into Teradata by developing Korn Shell scripts using Teradata SQL and utilities.
  • Worked with the Teradata utilities BTEQ, FastLoad, FastExport and MultiLoad, and improved the design of the BTEQ and MultiLoad scripts (a BTEQ sketch follows this list).
  • Extraction, Transformation and Loading of data were carried out from different sources like flat files and SQL Server.
  • Involved in creating and designing mappings and Mapplets using Expression, Filter, Router, Joiner, Lookup, Update Strategy, Stored Procedure, Union and other transformations
  • Worked on XML transformations to send input to Permits as per specifications.
  • Reworked any discrepancies in the flat file extracts.
  • Moving the code between development and System testing environments
  • Fixed bugs raised during System and Integration testing.
  • Performed Audit and Reconciliation of the data during SIT.
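
The BTEQ scripts referred to above generally run SQL steps in sequence and abort on error; the script below is only a sketch with a placeholder logon string and hypothetical ETL_STG and EDW tables, not code from this engagement.

    .LOGON tdprod/etl_user,password;          /* placeholder TDPID and credentials */

    /* Step 1: remove any rows already loaded for the business date (rerun safety). */
    DELETE FROM edw.accounts WHERE load_dt = CURRENT_DATE;
    .IF ERRORCODE <> 0 THEN .QUIT 8;

    /* Step 2: load cleansed rows from staging into the target table. */
    INSERT INTO edw.accounts (account_id, account_name, open_dt, load_dt)
    SELECT account_id, account_name, open_dt, CURRENT_DATE
      FROM etl_stg.stg_accounts
     WHERE account_id IS NOT NULL;
    .IF ERRORCODE <> 0 THEN .QUIT 8;

    .LOGOFF;
    .QUIT 0;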

Environment: Informatica Power Center 9.1/8.6/8.1, Oracle 11g/10g, Teradata 13.0/12.0, DB2 UDB, PL/SQL, SQL*Loader, UNIX, Erwin.

ETL Informatica Developer

Confidential

Responsibilities:

  • Created the ETL design documentation, Mapping document, Migration document, Test cases.
  • Extracted data from source systems, applied complex transformations and loaded the results into the target tables.
  • Created Technical Specification documents based on high level requirement documents.
  • Extensively worked in the performance tuning of programs, ETL procedures and processes.
  • Error checking & testing of the ETL procedures & programs using Informatica session log.
  • Worked with the Teradata utilities FastLoad, BTEQ and MultiLoad (a FastLoad sketch follows this list).
  • Performance-tuned Informatica targets, sources, mappings and sessions for large data files by increasing the data cache size, sequence buffer length and target-based commit interval.
  • Reviewed Technical specification documents with the functional owners.
  • Developed parallel jobs using technical specification documents.
  • Tested the jobs and data in Oracle.
  • Reworked any discrepancies in the flat file extracts.
  • Fixing the Bugs raised during System and integration testing.
  • Created Sequence jobs and scheduled them.
  • Performed Audit and Reconciliation of the data during SIT.
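
The FastLoad usage noted above follows the utility's standard script layout for loading an empty staging table from a flat file; the sketch below is illustrative only, with a placeholder logon string and a hypothetical STG_SALES table and input file.

    SESSIONS 4;
    ERRLIMIT 25;
    LOGON tdprod/etl_user,password;           /* placeholder TDPID and credentials */

    SET RECORD VARTEXT "|";
    DEFINE store_id (VARCHAR(10)),
           sale_dt  (VARCHAR(10)),
           sale_amt (VARCHAR(18))
           FILE = /data/in/sales_extract.dat;

    /* FastLoad requires an empty target table; rejected rows go to the error tables. */
    BEGIN LOADING etl_stg.stg_sales
          ERRORFILES etl_stg.stg_sales_e1, etl_stg.stg_sales_e2
          CHECKPOINT 100000;

    INSERT INTO etl_stg.stg_sales VALUES (:store_id, :sale_dt, :sale_amt);

    END LOADING;
    LOGOFF;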

Environment: Informatica Power Center 9.1/8.6/8.1, Oracle 11g/10g, Teradata 10, PL/SQL, SQL*Loader, UNIX, Erwin.

Sr ETL Developer

Confidential, Atlanta

Responsibilities:

  • Involved in all phases of SDLC from requirement, design, development, testing, training and rollout to the field user and warranty support for production environment.
  • Involved in preparing Plan and effort estimations required to execute the project.
  • Designing and building the Informatica solution, using pushdown optimization (PDO) where required
  • Designing and building Teradata SQL, TPT, BTEQ and UNIX shell scripts
  • Performance tuning for Data warehouse Database (Teradata) and Data warehouse operations.
  • Developed Informatica Mappings and Reusable Transformations to facilitate timely Loading of Data of a star schema.
  • Reviewed Technical specification documents with the functional owners.
  • Developed parallel jobs using technical specification documents.
  • Tested the jobs and data in Oracle.
  • Reworked any discrepancies in the flat file extracts.
  • Fixing the Bugs raised during System and integration testing.
  • Created Sequence jobs and scheduled them.
  • Performed Audit and Reconciliation of the data during SIT (a reconciliation query sketch follows this list).
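
The audit and reconciliation performed during SIT usually comes down to comparing control totals captured from the source extract with what actually landed in the warehouse; the query below is a sketch only, using a hypothetical SRC_CONTROL_TOTALS audit table and TXN_FACT target.

    -- Hypothetical reconciliation query: compare source control totals with the
    -- row counts and amount totals actually loaded into the warehouse per batch.
    SELECT src.batch_id,
           src.src_row_count,
           tgt.tgt_row_count,
           src.src_amount_total,
           tgt.tgt_amount_total,
           CASE WHEN src.src_row_count    = tgt.tgt_row_count
                 AND src.src_amount_total = tgt.tgt_amount_total
                THEN 'RECONCILED' ELSE 'MISMATCH' END AS recon_status
      FROM audit_db.src_control_totals src
      LEFT JOIN (SELECT batch_id,
                        COUNT(*)        AS tgt_row_count,
                        SUM(txn_amount) AS tgt_amount_total
                   FROM edw.txn_fact
                  GROUP BY batch_id) tgt
        ON src.batch_id = tgt.batch_id;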

Environment: Informatica Power Center 9.1/8.6/8.1, Oracle 11g/10g, Teradata 10, PL/SQL, SQL*Loader, UNIX, Erwin.
