
ETL Developer Resume

Omaha, NE


  • 7+ years of experience in data warehousing technology, spanning requirement analysis, data analysis, application design, and application development, including requirement gathering, customization, maintenance, and testing of various business applications in data warehouse environments; also experienced as a training specialist.
  • Extensive hands-on expertise with Informatica 8.x/7.x/6.x/5.x and Erwin Data Modeler, with strong business understanding of the banking, insurance, finance, pharmaceuticals/healthcare, and telecom sectors.
  • Good understanding of relational database management systems such as Oracle, Teradata, DB2, and SQL Server; extensively worked on data integration using Informatica for the extraction, transformation, and loading of data from various database source systems and from mainframe COBOL and VSAM files.
  • Familiar with the Ralph Kimball and Bill Inmon methodologies, automation of ETL processes with scheduling tools, and exception-handling procedures.
  • Good knowledge of installing and configuring PowerCenter 7/8.
  • Experience in writing shell scripts for various ETL needs and in integrating data from mainframe flat files.
  • Development experience on Windows NT/2000/XP, UNIX (Solaris, HP UX, AIX) platforms.
  • Experience in Informatica mapping specification documentation and in tuning mappings to increase performance; proficient in using Informatica Server Manager to create and schedule workflows, with expertise in AutoSys and ESP scheduling.
  • Experience leading teams; excellent communication and interpersonal skills, with the ability to quickly grasp new technical and business concepts and apply them as needed.
  • Strong skills in Requirement gathering and study, Gap Analysis, Scope Definition, Recommendations to Business Process Improvements, Development of Procedures, Forms & Documentation, Effort Estimation, Resource Planning, Project Tracking, UAT.

Educational Background

  • BE (Bachelor of Engineering)


  • Teradata V2R5 Certified Professional
  • Informatica PowerCenter 7 Certified Designer (Level N Certification)
Professional Experience

Confidential, Omaha, NE Sep’08 – present
ETL Developer

EDW, Medicare, Capsule and Recon: The Blue Cross enterprise data warehouse deals with different kinds of health claims, categorized as Facility, Professional, and FEP. Data comes from various sources such as Oracle, SQL Server, and mainframe, and is loaded into the EDW at different frequencies as required. The entire ETL process consists of source systems, a staging area, an ODS layer, the data warehouse, and data marts.

Roles & Responsibilities:

  • Involved in gathering requirements and created design documents and mapping documents.
  • Involved in design and development of complex ETL mappings and stored procedures in an optimized manner; used PowerExchange for mainframe sources.
  • Created metadata queries to support workflow restarts and to identify missing link conditions or disabled tasks in a workflow.
  • Fine-tuned existing Informatica maps for performance optimization.
  • Debugged mappings by creating logic that assigns a severity level to each error and routes error rows to an error table, so they can be corrected and reloaded into the target system.
  • Involved in the Unit testing and System testing.
  • Worked on bug fixes on existing Informatica Mappings to produce correct output.
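The error-handling pattern described above can be sketched as follows. This is a minimal illustration in plain Python; in the project the logic lived in Informatica mappings, and the severity rules, column names, and helper functions here are illustrative assumptions rather than the actual implementation.

```python
# Sketch of severity-based error routing: each failing row gets a severity
# level and is sent to an error table for correction and reload, while clean
# rows continue to the target. Field names and rules are assumptions.

def classify_severity(row):
    """Assign a severity level to a row based on which checks fail."""
    if row.get("claim_id") is None:
        return "FATAL"      # no key: cannot be matched for reload
    if row.get("amount", 0) < 0:
        return "ERROR"      # business-rule violation, needs correction
    return "WARNING"        # acceptable; loadable as-is

def route_rows(rows):
    """Split incoming rows into target-bound rows and error-table rows."""
    target, errors = [], []
    for row in rows:
        severity = classify_severity(row)
        if severity == "WARNING":
            target.append(row)
        else:
            errors.append({**row, "severity": severity})
    return target, errors

target, errors = route_rows([
    {"claim_id": 1, "amount": 120.0},
    {"claim_id": 2, "amount": -5.0},
    {"claim_id": None, "amount": 30.0},
])
```

Rows landed in the error table keep their original columns plus the assigned severity, so a correction process can fix and resubmit them.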

Environment: Informatica PowerCenter 8.6, Informatica PowerExchange, Oracle 10g, MS SQL Server 2005, Windows, Mainframe.

Confidential, IA Jan’08 – Aug’08
Senior ETL Programmer

AdminServer Reporting, ProductionServer Reporting & 403B Reporting: AvivaUSA receives source data for insurance policy owners, agents, beneficiaries, payees, etc. from various source systems such as AS400, Oracle, SQL Server, DB2, and mainframe. Data from these sources is acquired into the stage database, where it is cleansed and aggregated, then populated into the ODS layer and on to the data warehouse for maintaining history, with data marts serving reporting requirements. Data integration is done using Informatica.

Roles & Responsibilities:

  • Involved in gathering of business scope and technical requirements and created technical specifications.
  • Developed complex mappings and SCD Type I, Type II, and Type III mappings in Informatica to load data from various sources, using transformations such as Source Qualifier, Lookup (connected and unconnected), Expression, Aggregator, Update Strategy, Sequence Generator, Joiner, Filter, Rank, Router, and SQL. Created complex mapplets for reuse.
  • Deployed reusable transformation objects such as mapplets to avoid duplication of metadata, reducing the development time.
  • Created synonyms for copies of time dimensions, used the sequence generator transformation type to create sequences for generalized dimension keys, stored procedure transformation type for encoding and decoding functions and Lookup transformation to identify slowly changing dimensions.
  • Fine-tuned existing Informatica maps for performance optimization.
  • Worked on the Informatica Designer tools (Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformation Developer) and Server Manager to create and monitor sessions and batches.
  • Involved in the development of Informatica mappings and tuned them for better performance.
  • Debugged mappings by creating logic that assigns a severity level to each error and routes error rows to an error table, so they can be corrected and reloaded into the target system.
  • Involved in the Unit testing, Event & Thread testing and System testing.
  • Analyzed existing system and developed business documentation on changes required.
  • Made adjustments in Data Model and SQL scripts to create and alter tables.
  • Extensively involved in testing the system end to end to ensure the quality of the adjustments made to accommodate the source system upgrade.
  • Worked on various issues on existing Informatica Mappings to produce correct output.
  • Documented all phases: analysis, design, development, testing, and maintenance.
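The slowly changing dimension work listed above can be sketched as follows. This is a minimal Type II illustration in plain Python; in the project the logic was implemented in Informatica mappings with Lookup and Update Strategy transformations, and the dimension layout here is an illustrative assumption.

```python
from datetime import date

# Minimal SCD Type II sketch: when a tracked attribute changes, the current
# dimension row is expired (end-dated) and a new version is inserted, so
# history is preserved. Column names are assumptions for illustration.

OPEN_END = date(9999, 12, 31)   # conventional "open" end date

def apply_scd2(dimension, incoming, today):
    """Merge incoming source rows into the dimension, Type II style."""
    current = {r["natural_key"]: r for r in dimension if r["current"]}
    for row in incoming:
        key, attr = row["natural_key"], row["attr"]
        existing = current.get(key)
        if existing is None or existing["attr"] != attr:
            if existing is not None:        # expire the old version
                existing["current"] = False
                existing["end_date"] = today
            dimension.append({"natural_key": key, "attr": attr,
                              "eff_date": today, "end_date": OPEN_END,
                              "current": True})
    return dimension

dim = [{"natural_key": "P1", "attr": "Gold",
        "eff_date": date(2007, 1, 1), "end_date": OPEN_END, "current": True}]
dim = apply_scd2(dim, [{"natural_key": "P1", "attr": "Platinum"}],
                 today=date(2008, 3, 1))
```

A Type I mapping would instead overwrite `attr` in place, and Type III would keep the prior value in a dedicated "previous" column rather than a new row.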

Environment: Informatica PowerCenter 8.5, Oracle 10g, MS SQL Server 2005/2000, Linux/UNIX Shell Scripting, DB2, AS400, Mainframe, TOAD, SQL Plus, ESP Scheduler.

Confidential, CT Oct’06 – Dec’07
ETL Lead Programmer

Pfizer DIF QDAM (Development Information Factory Quality Data Acquisition Management): The goal of this project was to create a Development Information Factory consisting of an ODS, a data warehouse, and data marts. Data from the different source systems is integrated into the staging area; the QDAM rules, exceptions log, and tracking log are acquired from the QDAM application database into staging, and then the ODS, DW, and DM are built.

  • ODS: Operational Data Store containing the cleaned, transformed, and integrated data from the 4 different sources.
  • DW: The data warehouse maintains historical data from the source systems as well as the history of the quality rules, exceptions log, and tracking log.
  • DM: Data marts provide a consolidated reporting environment for WWD, meeting the decision-support information needs of the organization.

The stages of development for DIF were:

  • Stage 1 – ETL to acquire the data from the 4 sources and the QDAM application data into the stage DB (about 150 mappings).
  • Stage 2 – ETL to load tracking, exception, and QDAM rules history data from staging into the DIF data warehouse (about 10-15 mappings).
  • Stage 3 – ETL to apply the QDAM rules and load the data passing the rules into the DW (about 10-15 mapplets).
  • Stage 4 – Modify existing ETL in the Acquisition and Integration groups to include a join/filter to the QDAM exception tables so this data can be reprocessed (113 Integration mappings, 23 Acquisition mappings, 208 Acquisition/Integration mappings).
  • Stage 5 – ETL to load the Exceptions data mart (10-15 mappings).
Roles & Responsibilities:

  • Participated in all phases including Requirement Analysis; Client Interaction; Design, Coding, Testing and Documentation.
  • Developed Informatica SCD Type I, Type II, and Type III mappings and tuned them for better performance. Extensively used nearly all Informatica transformations, including complex Lookups, Stored Procedures, Update Strategy, and mapplets.
  • Debugged mappings by creating logic that assigns a severity level to each error and routes error rows to an error table, so they can be corrected and reloaded into the target system.
  • Database design and development.
  • Involved in writing test scripts, unit testing, system testing and documentation.
  • Designed and developed UNIX shell scripts to handle pre- and post-session processes.
  • Extracted data from legacy systems and loaded it into the Oracle data warehouse.
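The pre- and post-session pattern above can be sketched as follows. It is shown in Python for illustration (the project used UNIX shell scripts), and the file names and checks are assumptions, not the actual scripts.

```python
import os
import tempfile

# Sketch of a pre-session check and a post-session completion flag:
# before the session runs, verify the expected source file exists and is
# non-empty; after it finishes, drop a flag file a scheduler can poll.
# File names and the exact checks are illustrative assumptions.

def pre_session(source_file):
    """Fail fast if the expected source file is missing or empty."""
    if not os.path.exists(source_file):
        raise FileNotFoundError(f"source file missing: {source_file}")
    if os.path.getsize(source_file) == 0:
        raise ValueError(f"source file empty: {source_file}")
    return True

def post_session(done_flag):
    """Write a completion flag for downstream jobs to detect."""
    with open(done_flag, "w") as fh:
        fh.write("OK\n")
    return done_flag

# Hypothetical usage with a temporary working directory:
workdir = tempfile.mkdtemp()
src = os.path.join(workdir, "policies.dat")
with open(src, "w") as fh:
    fh.write("P1|Gold\n")
pre_session(src)
flag = post_session(os.path.join(workdir, "load.done"))
```

Failing the pre-session check aborts the session before any partial load, which is the main point of running it as a separate script rather than inside the mapping.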

Environment: Informatica PowerCenter 8.1.1, Oracle 10g, MS SQL Server 2005, Linux/UNIX Shell Scripting, ERWIN 4.x, TOAD, SQL Plus, Cognos.

Confidential, SF Jan’05 – Sep’06
Sr. ETL Programmer

Integrated Data Warehouse (IDW): Charles Schwab has embarked upon a strategic initiative to build an Integrated Data Warehouse (IDW) that will enable fact-based decision-making. The data warehouse being built will subsequently serve as a firm foundation for business intelligence and data analytics. The subject areas covered in this phase are CUSTOMER, TRADES, ORDER, ITEM, BPT, ACCOUNT, PROFITABILITY, ASSET TRACKING, and SECURITY. The process loads the bank customers and account details, and household-wise/category-wise Exclusive, Qualified, and Practice revenue amounts, which are used by the decision support system for analysis.

Roles & Responsibilities:

  • Gathered and analyzed requirements by meeting business stakeholders and other technical team members prior to the design of ETL.

  • Interpreted high-level ETL requirement specifications to develop low-level design (LLD) for SCD Type I, Type II, and Type III mappings, and performed rigorous testing across various test cases.
  • Prepared QA and PROD migration documents after development and testing in the development environment, and executed quality assurance testing.
  • Developed complex mappings as per the requirement using almost all transformations and effectively used Debugger for identifying errors and resolving them.
  • Have done Informatica performance tuning and query tuning.
  • Deployed reusable transformation objects such as mapplets to avoid duplication of metadata, reducing the development time.
  • Team lead responsible for assigning tasks to team members; created metrics reports and weekly status reports, and prepared the project plan, quality-related documents, and migration documents. Used SharePoint for document management and tracking.

Environment: Informatica PowerCenter 7.1.2, Oracle 9i, Teradata V2R5, Linux/UNIX Sun Solaris 5.8, UNIX Shell Scripting, TOAD, Mainframe, Teradata SQL Assistant, SQL Plus.
