
Senior ETL Informatica Developer Resume


  • 7+ years of experience in the IT industry in design, development and implementation as an ETL Informatica Developer.
  • Experienced in analysis of different systems for system study and data warehouse implementations.
  • Experienced in Data Analysis, Data modeling, ETL, Data Warehousing, Reporting, Development, Maintenance, Testing and Documentation.
  • Experienced in OLTP/OLAP Systems, Analysis, and Data model Schemas like Star schema, Snowflake schema and multidimensional modeling.
  • Well versed in Normalization (1NF, 2NF and 3NF) and Denormalization techniques for optimum performance in relational and dimensional database environments.
  • Extensive experience in implementation of Informatica Power Center components like Source Analyzer, Target Designer, Mapping/Mapplet/Transformation Designer, Workflow Manager/Workflow Monitor.
  • Solid expertise in Data Extraction, Data Migration, Data Transformation and Data Loading using ETL process in Informatica Power Center 9.x/8.x/7.x.
  • Extensively worked on Informatica PowerCenter Transformations such as Source Qualifier, Lookup, Filter, Expression, SQL transformation, Data masking, Router, Normalizer, Joiner, Update Strategy, Rank, Aggregator, Stored Procedure, Sorter, Sequence Generator and Transaction Control transformations.
  • Experience in using data sources Oracle, Teradata, DB2, Ms SQL server, flat files, and XML files.
  • Experience in RDBMS such as Oracle 12c/11g/10g/9i/8i, MS SQL Server 2008/2005/2000, Teradata V12/V14/V15, DB2, SQL, T-SQL and PL/SQL.
  • Good understanding and working experience in logical and physical data models that capture current/future-state data elements and data flows using Erwin.
  • Experience in writing complex SQL queries, stored procedures and functions using PL/SQL programming.
  • Extensive experience with Teradata load utilities such as FastLoad, TPump, MultiLoad, FastExport and TPT (Teradata Parallel Transporter); used Pushdown Optimization (PDO) and wrote BTEQ scripts.
  • Experience in UNIX shell scripting and using PMCMD commands to run Informatica workflows.
  • Experienced in handling SCDs (Slowly Changing Dimensions) and change data capture (CDC) using Informatica PowerCenter.
  • Experience in test deployment of Informatica objects using Informatica deployment groups, and of non-Informatica objects using Eclipse and the UNIX deployment process.
  • Exposure to AWK scripts; worked with Autosys, crontab and Control-M to schedule jobs.
  • Experience in understanding the requirements and preparing data mapping documents.
  • Implemented SCD Type 1 through Type 6, incremental and CDC logic according to business requirements.
  • Experience in debugging mappings; identified bugs in existing mappings by analyzing the data flow and evaluating transformations.
  • Hands-on experience in performance tuning of sources, targets, transformations and sessions.
  • Experience in UNIX shell scripting, FTP and file management in various UNIX environments.
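The UNIX shell scripting and PMCMD usage above can be sketched as a small wrapper that starts an Informatica workflow and checks its outcome. This is an illustrative sketch only: the service, domain, folder and credential defaults are placeholders, not values from the resume.

```shell
#!/bin/sh
# Hypothetical wrapper around pmcmd startworkflow; all names below
# (INT_SVC, Domain_Dev, DW_FOLDER, credentials) are placeholders.
run_workflow() {
  wf_name="$1"
  # -wait blocks until the workflow finishes, so the exit status
  # reflects the workflow's outcome rather than just submission
  pmcmd startworkflow -sv "${INFA_SERVICE:-INT_SVC}" -d "${INFA_DOMAIN:-Domain_Dev}" \
        -u "${INFA_USER:-etl_user}" -p "${INFA_PASS:-secret}" \
        -f "${INFA_FOLDER:-DW_FOLDER}" -wait "$wf_name"
  rc=$?
  if [ "$rc" -ne 0 ]; then
    echo "workflow $wf_name failed (rc=$rc)" >&2
    return "$rc"
  fi
  echo "workflow $wf_name completed"
}
```

A scheduler (cron, Autosys, Control-M) would typically call such a wrapper so that the job's exit status mirrors the workflow's success or failure.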


Data Warehousing Tools: Informatica Power Center 10.x/9.x/8.x/7.x, Informatica IDQ, SSIS

Databases/RDBMS/Others: MS SQL Server 2008/2005/2000, Oracle 8i/9i/10g/11g/12c, DB2, Teradata V12/V14/V15, XML, Netezza, Siebel Database, flat files and Excel files.

Data Modeling: Erwin 4.x/3.x, MS Visio, Ralph Kimball Methodology, Bill Inmon Methodology, Star Schema, Snowflake Schema, Physical and Logical Modeling, Dimensional Data Modeling, Fact Tables, Dimension Tables.

Programming Languages: Java, SQL, PL/SQL, T-SQL, MySQL

Job Control and Other Tools: Informatica Scheduler, PuTTY, SQL*Plus, TOAD, PL/SQL Developer, Autosys, crontab and Control-M

Reporting Tools: OBIEE, Cognos, MicroStrategy, Tableau, SAP BI/BW and SSRS.



Senior ETL Informatica Developer


  • Designed and developed ETL mappings using transformation logic for extracting data from various source systems.
  • Worked on converting technical specifications into Informatica Mappings.
  • Loaded data from different source systems such as flat files, Excel, Oracle and .txt files.
  • Worked with the Reporting team to agree on which complexity would be handled on the ETL side versus the reporting side.
  • Worked on Informatica tuning at mapping and sessions level.
  • Extensively used Informatica Power Center 8.6/9.5 and created mappings using transformations like Source Qualifier, Joiner, Normalizer, Aggregator, Expression, Filter, Router, Lookup, Update Strategy, and Sequence Generator.
  • Designed packages in SSIS 2008 with various complex transformations such as derived column, conditional split, lookup, aggregation, execute SQL task.
  • Created the report for the Informatica CDC workflow’s performance.
  • Worked on the Informatica MDM Hub Console: data mappings from landing through staging to base objects, trust and validation rules, match path, match column, match rules, merge properties and batch group creation; involved in customizing and configuring IDD applications.
  • Implemented various SSIS packages with different tasks and transformations and scheduled them.
  • Used the Normalizer transformation heavily to convert flat-file data into RDBMS structures.
  • Used Mapplets, Parameters and Variables to facilitate the reusability of code.
  • Used the Informatica MDM 10.1 (Siperian) tool to manage master data in the EDW.
  • Strong experience in administration and support of Informatica PowerCenter, Web services, Data Quality and PowerExchange-CDC in a Unix or Linux environment.
  • Extracted consolidated golden records from MDM base objects and loaded into downstream applications.
  • Worked on pre- and post-SQL queries on the target.
  • Experience in end-to-end data quality testing and support in an enterprise warehouse environment.
  • Experience in maintaining data quality, data consistency and data accuracy for Data Quality projects.
  • Involved in performance tuning and optimization of Informatica mappings and sessions using features like partitions and data/index cache to manage very large volume of data.
  • Knowledge on ODI Procedures and packages.
  • Extracted data from different databases like Oracle and external source systems using ETL tool.
  • Involved in debugging Informatica mappings, testing of Stored Procedures and Functions, Performance and Unit testing of Informatica Sessions, Batches and Target Data.
  • Developed Mapplets, Reusable Transformations, Source and Target definitions, mappings using Informatica.
  • Created the Data Validation document, Unit Test Case document, Technical Design document, Informatica Migration Request document and Knowledge Transfer document.
  • Responsible for configuring workflow tasks such as email and command tasks and making them reusable for other team members.
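A reusable command task like the ones mentioned above often amounts to a small shell script run after a successful session. The following is a minimal sketch; the archive directory and file names are assumptions for illustration, not details from the resume.

```shell
#!/bin/sh
# Sketch of a post-session command task: archive a processed source
# file with a load timestamp. /data/archive is a placeholder path.
archive_file() {
  src="$1"
  arch_dir="${2:-/data/archive}"
  stamp=$(date +%Y%m%d%H%M%S)     # e.g. 20240101120000
  base=$(basename "$src")
  mkdir -p "$arch_dir" || return 1
  # keep the original name and append the timestamp so reruns never collide
  mv "$src" "$arch_dir/${base}.${stamp}" && echo "archived ${base}.${stamp}"
}
```

Because the script takes the file and directory as arguments, the same command task can be reused across sessions by changing only its parameters.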

Environment: Informatica Power Center 9.6/10.2, MS SQL Server 2008 R2, T-SQL, Cognos, DataStage, MS SQL Server Integration Services 2008, JIRA, UNIX, PuTTY, Oracle, HP lifecycle management tool.


ETL Developer


  • Designed and developed complex mappings using Lookup, Expression, Update Strategy, Sequence Generator, Aggregator, Router, Stored Procedure and other transformations to implement complex logic.
  • Prepared design documents based on requirements; built mappings to perform quality checks such as record count, record validation and hash validation between source and target data.
  • Provided various design solutions to ease the data loading and Analysis process.
  • Interacted with business IT teams and users to understand business requirements and convert business rules into Informatica mappings.
  • Involved in SIF integration, developing XML code for external applications to perform Search/Match API calls against MDM Hub data.
  • Worked on ODI integration development; analyzed source-system attributes, performed gap analysis and mapped them to target-system attributes.
  • Worked with SAP source systems to extract data and load the warehouse.
  • Identified bottlenecks in sources, targets, mappings and sessions and resolved performance issues.
  • Implemented the Pushdown Optimization process in Informatica effectively.
  • Created deployment groups for promoting code to higher environments.
  • Developed mappings using a wide range of transformations such as Joiner, Java, Aggregator, Sorter, Union, Router and SQL transformations.
  • Loaded data into interface tables from multiple data sources such as SQL Server, text files and Excel spreadsheets using SQL*Loader, Informatica and ODBC connections.
  • Extracted data from different databases like Oracle and external source systems using ETL tool.
  • Involved in debugging Informatica mappings, testing of Stored Procedures and Functions, Performance and Unit testing of Informatica Sessions, Batches and Target Data.
  • Developed Mapplets, Reusable Transformations, Source and Target definitions, mappings using Informatica.
  • Responsible for delivery of the design, code, testing and documentation, based on system and technical requirements
  • Worked with solution architects to choose a data integration style to meet the needs of each project, for example replication, ETL or views.
  • Engaged with business SMEs to understand the use cases the integration solution must support.
  • Assisted junior ETL resources with the proper usage of ETL tools and techniques per defined ETL standards, development guidelines and best practices.
  • Provided regular feedback to the architecture team on existing design patterns for enhancements.
  • Presented ETL designs in architectural review meetings comprising solution architects, ETL architects, data architects, DBAs, engineers and the testing team.
  • Worked with the project data modeler to understand entity relationships and handed over ETL architecture design documents to the development teams.
  • Interacted with database administrators to meet volume and performance requirements.
  • During each phase's execution, assumed an active project role and supported the ETL team through development, testing and implementation rollout.
  • Worked with data stewards to ensure the integration solution met data delivery and quality SLAs.
  • Produced ETL architecture design artifacts as part of the Application Design Document (ADD).
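Loading flat files into interface tables via SQL*Loader, as mentioned in this role, is usually scripted from the shell. The sketch below assumes a connect string and file names that are placeholders; a real control file would define the target table and column layout.

```shell
#!/bin/sh
# Hedged sketch: invoke SQL*Loader to load a delimited flat file into
# an Oracle interface table. ORA_CONN default is a placeholder.
load_flat_file() {
  ctl_file="$1"   # SQL*Loader control file (table/column layout)
  data_file="$2"  # delimited flat file to load
  sqlldr userid="${ORA_CONN:-etl_user/etl_pass@DEV}" control="$ctl_file" \
         data="$data_file" log="${ctl_file%.ctl}.log" errors=50
  if [ $? -eq 0 ]; then
    echo "load ok: $data_file"
  else
    echo "load failed: $data_file" >&2
    return 1
  fi
}
```

Keeping the control file separate from the wrapper lets the same script load any interface table by passing a different control/data file pair.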

Environment: Informatica Power Center 10.1, Informatica IDQ 10.1, Teradata V15, Oracle 12c, SQL Developer, Teradata SQL Assistant, UNIX Shell Scripting, Flat Files, XML Files, Agile Methodology.

Confidential, CA

ETL Developer


  • Responsible for Business Analysis and Requirements Collection.
  • Efficiently worked in all phases of the system development life cycle (SDLC) using different methodologies like Agile and Waterfall.
  • Worked on Informatica Power Center tools - Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
  • Performed requirement gathering, analysis, design, development, testing, implementation, support and maintenance phases of both MDM and Data Integration projects.
  • Parsed high-level design specification to simple ETL coding and mapping standards.
  • Designed and customized data models for a data warehouse supporting data from multiple sources in real time.
  • Created mapping documents to outline data flow from sources to targets.
  • Involved in Dimensional modeling (Star Schema) of the Data warehouse and used Erwin to design the business process, dimensions and measured facts.
  • Extracted the data from the flat files and other RDBMS databases into staging area and populated onto Data warehouse.
  • Designed and built Informatica solutions with PDO (Pushdown Optimization) where required; worked extensively with Informatica Power Center.
  • Used Oracle Data Integrator Designer (ODI) to develop processes for extracting, cleansing, transforming, integrating, and loading data into data warehouse database.
  • Used Oracle Data Integrator (ODI) to reduce dependency on Excel and other proprietary tools for data entry and reporting, and to provide property- and department-level budgeting/forecasting to produce a consolidated budget and forecast.
  • Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.
  • Analyzed source data files and gathered requirements from the business users.
  • Designed and built Teradata SQL, TPT and BTEQ scripts and UNIX shell scripts.
  • Developed mappings to load staging tables and then dimension and fact tables.
  • Worked on different workflow tasks such as sessions, event raise, event wait, decision, email, command, worklets, assignment, timer and scheduling of the workflow.
  • Created sessions and configured workflows to extract data from various sources, transform the data and load it into the data warehouse.
  • Utilized Informatica Data Quality (IDQ) for data profiling and matching/removing duplicate data, fixing the bad data, fixing NULL values.
  • Used Type 1 and Type 2 SCD mappings to update Slowly Changing Dimension tables.
  • Extensively used SQL*Loader to load data from flat files into database tables in Oracle.
  • Modified existing mappings for enhancements of new business requirements.
  • Used Debugger to test the mappings and fixed the bugs.
  • Wrote UNIX shell scripts and PMCMD commands for FTP of files from remote servers and for backup of the repository and folders.
  • Involved in Performance tuning at source, target, mappings, sessions, and system levels.
  • Prepared migration document to move the mappings from development to testing and then to production repositories.
  • Performed Performance testing with different sets of node configuration, different queue and different volumes.
  • Prepared DML for maintenance tables and reviewed, tested and executed it.
  • Used GitHub as the version control tool for versioning and moving code to higher environments such as SIT, UAT, pre-production and production.
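The BTEQ-in-shell pattern from this role typically embeds the BTEQ script as a here-document so the Teradata steps live alongside the job's shell logic. The TDPID, credentials and table name below are placeholders, not values from the resume.

```shell
#!/bin/sh
# Sketch of running a BTEQ script from a UNIX shell script.
# tdprod, etl_user/etl_pass and stg_db.daily_sales are placeholders.
run_bteq_delete() {
  bteq <<'EOF'
.LOGON tdprod/etl_user,etl_pass;
DELETE FROM stg_db.daily_sales;
.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF
}
```

The `.QUIT` codes become the function's exit status, so a scheduler can distinguish a clean run from a failed DELETE.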

Environment: Informatica Power Center 10.1, Informatica IDQ 10.1, Teradata V14, Oracle 12c, SQL Developer, Teradata SQL Assistant, UNIX Shell Scripting, Flat Files, XML Files, Agile Methodology.
