
Sr. Etl Lead Developer Resume

Raritan, NJ


  • 10+ years of IT experience in Data Warehousing and Data Modeling, with emphasis on Business Requirements Analysis, Application Design, Development, Testing, Implementation and Maintenance of systems.
  • Good exposure to Data Analysis, Data Modeling, Data Cleansing, Transformation, Integration, Data import and Data export, and use of ETL tools including Informatica Power Center (9.x/8.x/7.x).
  • Expertise in Dimensional data modeling, Star schema and Snowflake modeling.
  • Extensive experience in ETL processes, extracting data from operational and legacy systems into the data warehouse using Informatica.
  • Expertise in working with complex mappings, transformation rules and various data formats; efficient in performance tuning of mappings, targets and sessions.
  • Experience with Power Exchange to connect to and import sources from external systems like SAP R/3, DB2, Salesforce, Mainframes, and AS/400.
  • Designed and developed mappings using varied transformation logic such as Unconnected and Connected Lookups, Router, Filter, Expression, Aggregator, Joiner, Stored Procedure, Update Strategy, and SCD Types 1, 2 and 3.
  • Knowledge of Informatica administration in Windows and Linux environments.
  • Proven experience at building, motivating and leading professional teams, managing projects and supporting client relationships.
  • Knowledge of tools such as Erwin and MS Visio Professional.
  • Experienced with multiple databases including Oracle 11g/10g/9i, Teradata 13/12, MS SQL Server 2008/2000 and DB2.
  • Experience with the Informatica DAC tool for customizing data warehouses and creating tasks and task groups.
  • Worked on Oracle Procedures, Functions, Packages and Triggers.
  • Worked on Teradata utilities BTEQ, FLOAD, FEXPORT, MLOAD and TPUMP.
  • Experience using the Debugger to validate mappings and gain troubleshooting information about data and error conditions.
  • Expertise in performance tuning of the overall data warehouse, database, ETL and OLAP, coordinating with DBAs and UNIX Administrators for better performance.
  • Confident self-starter, independent worker and team player.


ETL Tools: Informatica Power Center 9.5.1/9.1/8.6/8.1/7.1, Informatica Power Exchange 9.1/8.6

Databases: Sybase 12.5, Oracle 11g/10g/9i, DB2 8.0/7.0/6.0, Teradata 13/12, MS SQL Server 2000/7.0/6.5, MS Access 7.0/97/2000

Other Tools: DB Artisan 9.0, Oracle Report Writer, MLOAD, TPUMP, BTEQ, FLOAD, SAS, SQR 3.0, Erwin 4.x/3.5/3.x, SQL, XML, XSL, PL/SQL, SQL*Plus, SQL*Loader and Developer

Languages: PL/SQL, T-SQL, UNIX Shell Scripting, UNIX Commands, XPath, XSLT

Operating Systems: Windows Server 2008/2003, UNIX, Sun Solaris 2.x.

Scheduler & Reporting Tools: Tidal, CA Scheduling Tool, ESP, Autosys, OBIEE 11g, Cognos 7


Sr. ETL Lead Developer

Confidential, Raritan, NJ


  • Worked with business users to gather requirements and prepared TSD and ETL Specification documents based on BSD and Mapping document.
  • Involved in Code reviews and determining Informatica standards for Mappings /Sessions / Workflows.
  • Translated business requirements into Informatica mappings/workflows.
  • Designed and developed end-to-end ETL process from various source Systems (TrackWise, Metricstream, Curve) to Staging, from staging to DWH for loading Audit, Observations, CAPA data.
  • Created various transformations such as Normalizer, Union, Aggregator, Update Strategy, Look Up, Joiner, JAVA, Filter and Router Transformations.
  • Extensively worked with various Passive transformations like Expression, Sequence Generator, Mapplet Input and Mapplet Output transformations.
  • Involved in Performance tuning. Identified and eliminated bottlenecks (source, target, and mapping).
  • Developed mappings to extract data from upstream systems (Audits, Observations, Responses, CAPA transactional data) and load it into the data warehouse.
  • Performed validation of migrated data from legacy systems before performing transformations using Informatica.
  • Involved in Production support for the data warehouse Informatica jobs.
  • Extensively worked on tuning mappings and reducing processing times.
  • Performed required database activities including SQL development, data modeling and database security using Informatica and Oracle tool suites
  • Created Database Triggers, Stored Procedure, Functions and Packages.
  • Extensively used debugger to trace the errors.
  • Extensively used PL/SQL programming in procedures to implement business rules.
  • Extensively tested the code and documented the Unit Test Cases.
  • Assisted the testers in system and integration testing and preparing the test cases and test plan.
  • Worked with DBA to identify source and target bottle necks.
  • Involved in bug fixing. Debugged the issues during QA level.
  • Worked on production support, monitoring and resolving issues after post-production migration releases.
  • Worked on transitioning existing Informatica processes and migrating databases into a cloud environment.
  • Documented ETL solutions, including source/target mapping and data dictionary information as well as workflow and schedules for automated data processing and loading.
  • Worked with Tidal scheduling tool to schedule ETL jobs.
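
Schedulers like Tidal typically launch Informatica workflows through a small pmcmd wrapper script. A minimal sketch of such a wrapper follows; every service, domain, folder, and workflow name here is a hypothetical placeholder, not taken from the actual project:

```shell
#!/bin/sh
# Hedged sketch of a scheduler-callable wrapper around pmcmd.
# INFA_SVC / INFA_DOMAIN / INFA_USER defaults below are illustrative only.
run_workflow() {
    folder="$1"
    workflow="$2"
    svc="${INFA_SVC:-INT_SVC_DEV}"      # integration service (assumption)
    dom="${INFA_DOMAIN:-DOM_DEV}"       # Informatica domain (assumption)
    usr="${INFA_USER:-etl_user}"
    # -pv names an environment variable holding the password,
    # so no credential appears on the command line
    cmd="pmcmd startworkflow -sv $svc -d $dom -u $usr -pv INFA_PASSWD -f $folder -wait $workflow"
    if [ -n "$DRY_RUN" ]; then
        # Print the command instead of running it (handy where pmcmd is absent)
        echo "$cmd"
        return 0
    fi
    $cmd
}

# Dry-run example; no Informatica installation required:
DRY_RUN=1 run_workflow QMS_DWH wf_load_audit
```

The `-wait` flag keeps the wrapper running until the workflow finishes, so the scheduler can react to the exit code.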
Sr. ETL/Informatica Consultant



  • Participated in a Confidential initiative as the Informatica Subject Matter Expert (SME) to provide key inputs in laying out the data integration architecture.
  • Provided assessments and helped the business make smart decisions about data integration optimizations, reducing the risk of offloading the wrong work onto the wrong platform while reducing labor costs and delivering faster performance.
  • Worked on mapping specifications, identified existing requirements, and consolidated reports for effective and efficient reporting.
  • Analyzed QAAD data to be extracted from Confidential source systems in the enterprise data lake and created mapping documents to map the data to enterprise data warehouse fields.
  • Analyzed and identified the data warehouse data fields required for enterprise business reports.

Environment: Informatica Power Center 9.6, Oracle 11g, Oracle Toad 10.6, TIDAL Scheduler, Tortoise SVN, HP ALM Quality Center, Teradata 12/13.

Sr. ETL/Informatica Consultant

Confidential, Columbus, OH


  • Extensively used PowerExchange to extract data from Teradata databases and other legacy sources.
  • Worked extensively on Source Analyzer, Mapping Designer, Target Designer, Workflow Manager and Workflow Monitor
  • Used various Transformations like Joiner, Aggregate, Expression, Lookup, Filter, Union, Update Strategy, Stored Procedures, and Router etc. to implement the business logic.
  • Created Complex mappings using Connected and Unconnected Lookups, Aggregate, Update Strategy, Stored Procedure and Router transformations for populating target table in efficient manner.
  • Extensively worked with flat file sources to read data and write the data into the flat files.
  • Created Informatica mappings with PL/SQL procedures and functions to build business rules to load data.
  • Worked on Unix shell scripts used in scheduling Informatica pre/post session operations
  • Implemented different Tasks in workflows which included Session, Command, E-mail, Event-Wait etc.
  • Migrated the Code from Informatica Power center 8.6.1 to 9.5.1.
  • Involved in Folder Migrations from one environment to the other environments.
  • Uploaded/downloaded documents to/from Subversion.
  • Extensively worked with various Active transformation like Filter, Sorter, Aggregator, Router, Lookup, Joiner and Update Strategy transformations
  • Extensively worked with various Passive transformations like Expression, Sequence Generator, Mapplet Input and Mapplet Output transformations.
  • Developed different mappings and workflows for code rewrites from Ab Initio to Informatica, such as the Client rewrite, Phone Address Load rewrite, Email Address Load rewrite, Deloitte Extract Loads and B1B.
  • Responsible for migrating the jobs from Informatica 8.6 to 9.5 and testing them
  • Developed generic mappings and workflows for loading different types of client data by passing the parameters
  • Developed Parameter files for passing values to the mappings for each type of client
  • Worked with Connected look up and unconnected lookup and configured the same for implementing complex logic
  • Developed mappings and workflows by reading the data from one database and loading the data to another database
  • Responsible for developing the Jobsets, installing the jobsets in CA scheduling tool
  • Responsible for running the jobs from CA Scheduling tool in Dev
  • Responsible for checking in the code to Tortoise SVN
  • Responsible for monitoring the jobs while deploying to other testing environments like SIT, UAT, PERF
  • Responsible for updating the cycle scope documents before start of a test cycle
  • Worked with flat files, both fixed-width and delimited.
  • Worked extensively with update strategy transformation for implementing inserts, updates and deletes
  • Developed test cases, conducted unit tests, monitoring the jobs and running the validations in testing environments
  • Involved in PROD release and running the validations in PROD
  • Hands-on experience with memory management and pipeline partitioning of workflows for better performance.
  • Created Mapplets and used them in different Mappings.
  • Created various Job Sets in the CA Scheduling tool and defined dependencies at the Job Set and Job levels.
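
The parameter files mentioned above drive generic mappings by supplying per-client values at run time. A sketch of such a file follows; the folder, workflow, and parameter names are illustrative only, not the actual project objects:

```
[CLIENT_DWH.WF:wf_load_client]
$$CLIENT_TYPE=RETAIL
$$LOAD_DATE=2014-06-30
$DBConnection_SRC=SYBASE_SRC_DEV
$InputFile_Client=/data/inbound/client_retail.dat
```

Each `[folder.WF:workflow]` section scopes its parameters to one workflow, so a single generic workflow can be run once per client type with a different file.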

Environment: Informatica Power Center 9.5.1, Power Exchange, Sybase 12.5, UNIX, Tortoise SVN 1.6.15, Windows XP, DB Artisan 9.0, Shell scripting, CA Scheduling Tool

Sr. ETL Developer

Confidential, Middletown, NJ


  • Participated in Business Requirement meetings and then involved in writing Functional Specification document.
  • Worked on Informatica Power Center tools, which include Designer, Repository Manager, Workflow Manager and Workflow Monitor.
  • Translated high-level design specifications into simple ETL coding and mapping standards.
  • Designed and customized data model from multiple sources for DWH support
  • Laid out the ETL architecture and Source to Target mapping to load data into DWH.
  • Worked on complex data structures, dashboards & ad hoc reporting using OBIEE.
  • Worked extensively on OBIEE Answers to create the reports as per the client requirements and integrated them into the Dashboards.
  • Created mapping documents to outline data flow from sources to targets.
  • Involved in Dimensional modeling (Star Schema) of the Data warehouse and used Erwin to design the business process, dimensions and measured facts.
  • Extracted the data from the flat files and other RDBMS databases into staging area and populated onto Data warehouse.
  • Maintained source definitions, transformation rules and targets definitions using Informatica Repository Manager.
  • Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings.
  • Developed mapping parameters and variables to support SQL override.
  • Created mapplets to use them in different mappings.
  • Developed mappings to load into staging tables and then to Dimensions and Facts.
  • Used existing ETL standards to develop these mappings.
  • Worked on different tasks like sessions, events raise, event wait, decision, e-mail, command, worklets, Assignment, Timer and scheduling of the workflow.
  • Created sessions and configured workflows to extract data from various sources, transform it, and load it into the data warehouse.
  • Used SCD Type 1 and Type 2 mappings to update Slowly Changing Dimension tables.
  • Extensively used SQL* loader to load data from flat files to the database tables in Oracle and created Stored Procedures, functions, views, packages and triggers.
  • Modified existing mappings for enhancements of new business requirements.
  • Used Debugger to test the mappings and fixed the bugs.
  • Wrote UNIX shell scripts & PMCMD commands for FTP of files from remote server and backup of repository and folder.
  • Involved in performance tuning at the source, target, mapping, session and system levels.
  • Prepared migration document to move the mappings from development to testing and then to production repositories.
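
The SQL*Loader work described above centers on control files that map flat-file fields to Oracle table columns. A hedged sketch of generating one and, where an Oracle client actually exists, running it; the table, column, file, and connection names are placeholders:

```shell
#!/bin/sh
# Sketch of a SQL*Loader flat-file load. stg_customer, customer.dat and
# ORA_CONN are all assumed names, not the actual project objects.
CTL=stg_customer.ctl
cat > "$CTL" <<'EOF'
LOAD DATA
INFILE 'customer.dat'
APPEND
INTO TABLE stg_customer
FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(customer_id, customer_name, load_dt SYSDATE)
EOF

if command -v sqlldr >/dev/null 2>&1; then
    # ORA_CONN would hold user/password@tns (assumption)
    sqlldr userid="$ORA_CONN" control="$CTL" log=stg_customer.log
else
    echo "sqlldr not found; control file written to $CTL"
fi
```

`TRAILING NULLCOLS` lets short records load with NULLs in the missing columns, and the `SYSDATE` keyword stamps each row with the load time.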

Environment: Informatica Power Center 9.1, Informatica Power Connect/Power Exchange 9.1, Workflow Manager, Workflow Monitor, Data Analyzer 8.1, PL/SQL, Oracle 11g, Erwin, Autosys, SQL Server 2008, Sybase, UNIX AIX, Toad 9.0, Cognos, OBIEE 11g

Confidential, Denver, CO

Sr. ETL Developer


  • Performed logical and physical data modeling using Erwin for the data warehouse database in a star schema.
  • Using Informatica Power Center Designer, analyzed the source data to extract and transform from various source systems (Oracle 10g, DB2, SQL Server and flat files), incorporating business rules using the different objects and functions that the tool supports.
  • Using Informatica PowerCenter created mappings and mapplets to transform the data according to the business rules.
  • Used various transformations like Source Qualifier, Joiner, Lookup, SQL, Router, Filter, Expression and Update Strategy.
  • Implemented slowly changing dimensions (SCD) for some of the Tables as per user requirement.
  • Developed Stored Procedures and used them in Stored Procedure transformation for data processing and have used data migration tools.
  • Documented Informatica mappings in Excel spread sheet.
  • Tuned the Informatica mappings for optimal load performance.
  • Wrote BTEQ, FEXP, FLOAD and MLOAD Teradata utility scripts to export and load data to/from flat files.
  • Created and Configured real-time workflows and Sessions to transport the data to target warehouse Oracle tables using Informatica Power Exchange and Informatica PowerCenter.
  • Generated reports using OBIEE for the future business utilities.
  • Carried primary responsibility for problem determination and resolution for each SAP application system database server and application server.
  • Worked along with UNIX team for writing UNIX shell scripts to customize the server scheduling jobs.
  • Constantly interacted with business users to discuss requirements.
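
The Teradata utility work mentioned above is usually driven by small BTEQ scripts that shell jobs generate and submit. A hedged sketch follows; the logon string, database, and table names are placeholders, and the password reference is left as an unexpanded variable on purpose:

```shell
#!/bin/sh
# Sketch of generating a BTEQ export script. tdprod, etl_user, dwh.policy_dim
# and TD_PASSWD are all assumed names, not the actual environment.
BTQ=export_policy.btq
cat > "$BTQ" <<'EOF'
.LOGON tdprod/etl_user,${TD_PASSWD};
.EXPORT REPORT FILE = policy_extract.dat;
SELECT policy_id, policy_status, effective_dt
FROM dwh.policy_dim
WHERE load_dt = CURRENT_DATE;
.EXPORT RESET;
.LOGOFF;
EOF

# Submit only where the Teradata utilities are installed
if command -v bteq >/dev/null 2>&1; then
    bteq < "$BTQ"
else
    echo "bteq not found; script written to $BTQ"
fi
```

Keeping the BTEQ script as a generated file makes it easy to log, review, and rerun a failed extract by hand.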

Environment: Informatica PowerCenter Designer 9.1, Informatica Power Exchange 9.1, Oracle 11g, DB2 6.1, MS Visio, TOAD, SAP 3.1.H, UNIX (SunOS), PL/SQL, SQL Developer, OBIEE 10.2.1g, PuTTY, SCM, Wynsure 5.2, IBM Mainframes

ETL Developer

Confidential, Washington, DC


  • Analyzed the requirements and framed the business logic for the ETL process.
  • Extracted data from Oracle as one of the source databases.
  • Involved in JAD sessions for the requirements gathering and understanding.
  • Involved in the ETL design and its documentation.
  • Interpreted logical and physical data models for business users to determine common data definitions and establish referential integrity of the system using ER-STUDIO.
  • Followed Star Schema to design dimension and fact tables.
  • Collected and linked metadata from diverse sources, including relational databases (Oracle), XML and flat files. Worked on XML sources and targets, and the XML Parser and Generator transformations.
  • Responsible for the development, implementation and support of the databases.
  • Developed mappings in Informatica to load the data including facts and dimensions from various sources into the Data Warehouse, using different transformations like Source Qualifier, JAVA, Expression, Lookup, Aggregate, Update Strategy and Joiner.
  • Developed reusable Mapplets and Transformations.
  • Used data integrator tool to support batch and for real-time integration and worked on staging and integration layer.
  • Optimized the performance of the mappings through various tests on sources, targets and transformations.
  • Designed and developed Informatica mappings and workflows; identified and removed bottlenecks to improve the performance of mappings and workflows.
  • Reviewed existing code and led efforts to tweak and tune the performance of existing Informatica processes.
  • Scheduled sessions to extract, transform and load flat-file data into the staging area and on to the warehouse database, per business requirements, using Informatica.
  • Scheduled the tasks using Autosys.
  • Created shell scripts for generic use.
  • Developed unit/assembly test cases and UNIX shell scripts to run along with daily/weekly/monthly batches to reduce or eliminate manual testing effort.

Environment: Informatica PowerCenter 8.1/8.6, UNIX, Teradata 12, Oracle 11g, Oracle Data Integrator, SQL, PL/SQL, SQL Server 2008, Erwin 4.5, Oracle Designer, MS VISIO, Autosys, Korn Shell, XML

ETL Developer

Confidential, Wheeling, IL


  • Interacted with the business community and database administrators to identify the business requirements and data realities.
  • Responsible for dimensional modeling of the data warehouse to design the business process.
  • Parsed high-level design specifications into simple ETL coding and mapping standards.
  • Designed new database tables to meet business information needs. Designed Mapping document, which is a guideline to ETL Coding.
  • Documented standards handbook for Informatica code development.
  • Developed a number of Informatica Mappings, Mapplets and Transformations to load data from relational and flat file sources into the data warehouse.
  • Used SQL queries and database programming using PL/SQL (writing Packages, Stored Procedures/Functions and Database Triggers).
  • Used various transformations like Source Qualifier, Lookup, Update Strategy, Router, Filter, Sequence Generator and Joiner on the extracted source data according to the business rules and technical specifications.
  • Loaded data from IMS files into Oracle tables using Informatica, with any data cleansing required.
  • Worked on SQL tools like TOAD to run SQL queries and validate the data.
  • Worked on database connections, SQL Joins, views in Database level.
  • Extensively used SQL*Loader to load Data from flat files to Database tables in Oracle.
  • Used Power Center server manager/Workflow manager for session management, database connection management and scheduling of jobs to be run in the batch process.
  • Used UNIX shell scripting for Scheduling Informatica Workflows.
  • Optimized the mappings using various optimization techniques and also debugged some existing mappings using the Debugger to test and fix the mappings.
  • Implemented Slowly Changing Dimensions to update the dimensional schema.
  • Implemented procedures/functions in PL/SQL for Stored Procedure Transformations.
  • Monitored workflows and collected performance data to maximize the session performance.
  • Resolved memory related issues like DTM buffer size, cache size to optimize session runs.
  • Documented the mapping process and methodology used to facilitate future development.

Environment: Informatica Power Center 8.1, Informatica Power Connect (Repository Manager, Designer, Server Manager, Workflow Monitor, Workflow Manager), Erwin 4.5, Flat files, Oracle 10g, MS SQL Server 2000, PL/SQL, Business Objects 6.5, Shell Programming, SQL*Loader, IBM DB2 8.0, Toad, Excel, Unix scripting, Sun Solaris, Windows NT

Informatica Consultant

Confidential, Sacramento, CA


  • Interacted with Business Owners for day-to-day ETL progress monitoring.
  • Researched sources and identified necessary reusable components for Mapplets and Transformations
  • Coordinated with Source System Owners for proper data feeds and Defaults for the Incoming Data.
  • Populated data into staging tables from flat files, XML sources, Cobol Copybooks and Relational Sources.
  • Installed Informatica PowerCenter on client and server machines, and configured and registered the Informatica Server.
  • Upgraded Repository from Informatica Power Center Version 5.1 to Version 6.2
  • Worked on Informatica Power Center 6.2 tool - Source Analyzer, Warehouse Designer, Mapping Designer, Mapplets, and Reusable Transformations.
  • Used Informatica Designer to design mappings that populated data into the dimensions of the DWH.
  • Used Workflow Manager and Server Manager for creating, validating, testing and running batches and sessions, and scheduling them to run at specified times.
  • Bench Mark testing of Hardware and Informatica Power Center mappings to calculate the load times for batch processing and tuning the mappings.
  • Developed logical and physical data models that capture current and future data elements and data flows, using Erwin.
  • Designed and customized data models for a data warehouse supporting data from multiple sources.
  • Developed several mappings, and tuned existing mappings for better performance.
  • Re-used Mapplets & reusable transformations during development life cycle.
  • Scheduled and monitored processes using Informatica Server Manager.
  • Extensively used PL/SQL programming in procedures to implement business rules.
  • Designed and Developed Oracle PL/SQL and Shell Scripts, Data Import/Export, Data Conversions and Data Cleansing.
  • Responsible for performance tuning of SQL Queries, Sources, Targets and sessions.
  • Promoted mappings, sessions and workflows from Development to Test and Production environments.
  • Created Pre- and Post-session UNIX scripts and scripts to drop and recreate Indexes.
  • Created and scheduled UNIX scripts to control the ETL processes.
  • Used Business Objects 5.0 to create complex reports from universes using drill down, Slice and Dice capabilities.
  • Used Business Objects, BO Query Designer and Report Designer for Reporting and Data Mining.
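
Pre- and post-session scripts that drop indexes before a bulk load and recreate them afterwards, as described above, are often just generated SQL files. A hedged sketch; the table and index names are illustrative, and the commented sqlplus calls assume an `ORA_CONN` connect string:

```shell
#!/bin/sh
# Sketch of generating pre/post-session index scripts for a bulk load.
# fact_sales and its index/column names are placeholders.
TABLE=fact_sales
cat > drop_idx.sql <<EOF
DROP INDEX ${TABLE}_idx1;
DROP INDEX ${TABLE}_idx2;
EOF

cat > create_idx.sql <<EOF
CREATE INDEX ${TABLE}_idx1 ON ${TABLE} (sale_dt);
CREATE INDEX ${TABLE}_idx2 ON ${TABLE} (store_id);
EOF

# A session would then run these around the load, e.g.:
#   sqlplus -s "$ORA_CONN" @drop_idx.sql     (pre-session)
#   sqlplus -s "$ORA_CONN" @create_idx.sql   (post-session)
echo "generated drop_idx.sql and create_idx.sql for $TABLE"
```

Dropping indexes before a large insert avoids per-row index maintenance; rebuilding them once afterwards is usually far cheaper.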

Environment: Informatica Power Center 7.1/6.1, Informatica Power Connect (Repository Manager, Designer, Server Manager), Business Objects 5.0, Flat files, MS Access, Oracle 9i, UNIX (Solaris), Erwin 4.0/3.5.x, MS SQL Server 2000, PL/SQL, Autosys, MS Access 2000, Shell Programming, SQL*Loader, Visual Basic, Toad, SQL Navigator, Excel, Unix scripting, Windows NT

Informatica Consultant

Confidential, New York, NY


  • Involved with the business analyst and the ETL team to gather Business requirements.
  • Identified the workflows, worklets and Mappings to be enhanced and the new mappings to be created.
  • Created and modified the mappings, worklets and workflows which are affected by the new source modifications.
  • Supported the offshore team.
  • Generated the XML files and Performed Power Center Administration and migrated ETL code from Development to Test, and to Production using both Import/Export XML and Repository Manager's copy/paste utility.
  • Created mappings for full loads as well as incremental loads of the data.
  • Developed new forms, XML Publisher reports and Discoverer reports.
  • Handled variable-length delimited flat files (VLDF), taking a business case.
  • Extensively used Expression, Joiner, Lookup, Aggregator, and Update strategy, filter transformations etc. in various mappings.
  • Mostly utilized the SQL override and Lookup SQL override for filtering the data according to the requirement.
  • Used IDE for data analysis, Data Migration and understanding the different patterns of the data.
  • Enhanced the production environment by modifying the existing mappings by check in and check out, and then moved to the production.
  • Incorporated SQL tuning recommendations for data retrieval by using indexing strategy and using hints.
  • Unit testing was done on each mapping separately for validating the performance and verifying the data.
  • Used Informatica dataflow partitioning to load large files of data.
  • Improved the performance of the mappings, which were having data loading issues.
  • Worked on development and testing environment and then migrated my workflows to the production environment.
  • Created labels, wrote queries in the repository manager for the deployment folder.
  • Provided test queries for the QA team for the validation of the data.
  • Designed restart/recovery logic and defined commit points for high-volume loads.
  • Involved in writing UNIX shell scripts, PL/SQL procedures, Pre-session and Post- Session Scripts.

Environment: Informatica Power Center 6.1.1, Oracle 9i, SQL*Loader, TOAD, PL/SQL, SQL Developer, UNIX (server), PuTTY
