
Sr. ETL/Data Integration Developer Resume

North Bergen, NJ


  • 10+ years of experience in Information Technology in the field of Data Management, Enterprise Data Warehouse (EDW) and Data Integration.
  • Experience in Automotive, Retail, Healthcare, Banking and Finance Industries.
  • ETL architecture, development, enhancement, maintenance and production support: analyzed, designed and developed Extraction, Transformation and Load (ETL) processes for data warehousing using Informatica Power Center 10.x/9.x/8.x, including Designer, Repository Manager, Server Manager, Workflow Manager and Workflow Monitor.
  • Hands on experience in building Master Data Management (MDM) solution for large/complex customer base data using Informatica MDM.
  • Worked with the Informatica Data Quality 9.6.1 (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ 9.6.1.
  • Experience in creating various transformations using Aggregator, Look Up, Update Strategy, Joiner, Filter, Sequence Generator, Normalizer, Sorter, Router, XML, Stored Procedure in Informatica Power Center Designer.
  • Worked in a development role creating IDQ rules/mapplets and IDQ mappings using transformations such as Standardizer, Match, Parser, Labeler and SQL, along with profiles and scorecards, in IDQ 9.6.1 Developer and Informatica Analyst.
  • Strong experience in Planning, Designing, Developing and Deploying Business Intelligence solutions in Data Warehouse and Data Lakes/ Decision Support Systems using ETL and Business Intelligence tools.
  • Used Informatica to load Oracle data into HDFS (Hadoop Distributed File System).
  • Good understanding of data extraction, transformation and load in Hive, Pig and HBase, with experience transforming data from HDFS, Hive, Pig, HBase and Oracle.
  • Worked with different flat files, Excel files, XMLs and databases such as Oracle, DB2, Microsoft SQL Server 2011/2008/2005, Netezza, Teradata, SAP HANA, and SAP S4.
  • Data modeling experience using Star Schema/Snowflake modeling, OLAP tools, Fact and Dimensions Tables, Data Modeling, ERWIN and Oracle Designer.
  • Hands on experience in tuning ETL, identifying and resolving performance bottlenecks in various levels like sources, targets, mappings/jobs, and sessions.
  • Experience in extracting data from multiple operational sources and loading staging areas and data marts using SCD (Type 1/Type 2/Type 3) loads.
  • Familiar with other reporting tools like SAP Business Objects Reporting (BO) 4.1/4.0/3.1, OBIEE 11g and MS SQL Server Reporting Services 2008.
  • Experienced in interacting with business users, business analysts, IT leads and developers to analyze business requirements and translate them into functional and technical design specifications.
  • Excellent communication, presentation and interpersonal skills and ability to prioritize and coordinate work across different geographic locations.
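The SCD Type 1/2/3 loads mentioned above follow a standard versioning pattern. As a minimal illustrative sketch of Type 2 logic in Python (the actual implementations used Informatica mappings; the table layout and column names here are hypothetical):

```python
from datetime import date

def scd2_merge(dimension, incoming, today):
    """Apply SCD Type 2 logic: expire changed rows and insert new versions.

    dimension: list of dicts with keys id, attr, valid_from, valid_to, current
    incoming:  list of dicts with keys id, attr (today's source extract)
    """
    current_by_id = {row["id"]: row for row in dimension if row["current"]}
    for src in incoming:
        cur = current_by_id.get(src["id"])
        if cur is None:
            # brand-new business key: open a fresh version
            dimension.append({"id": src["id"], "attr": src["attr"],
                              "valid_from": today, "valid_to": None,
                              "current": True})
        elif cur["attr"] != src["attr"]:
            # attribute changed: close the old version, open a new one
            cur["valid_to"] = today
            cur["current"] = False
            dimension.append({"id": src["id"], "attr": src["attr"],
                              "valid_from": today, "valid_to": None,
                              "current": True})
        # unchanged rows are left alone
    return dimension

dim = [{"id": 1, "attr": "NJ", "valid_from": date(2015, 1, 1),
        "valid_to": None, "current": True}]
scd2_merge(dim, [{"id": 1, "attr": "NY"}, {"id": 2, "attr": "TX"}],
           today=date(2016, 6, 1))
```

A Type 1 load would instead overwrite `attr` in place, and Type 3 would keep the prior value in a separate column.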


Databases: Oracle 12c/11g/10g/9i, MS SQL Server 2008/2005/2000, MS Access, DB2 UDB 8.0/7.0, Teradata, AWS Redshift, HIVE, SAP HANA, SAP S4.

Tools: Erwin, Toad, TIDAL, Autosys, Visio, MS Office Suite.

Data Warehousing / ETL Tools: Informatica Power Center 10.2/9.6.1/9.5/9.1/8.0, Informatica Data Quality (IDQ), Informatica BDE 10.1, Informatica PowerExchange 9.6.1, AWS Glue.

MDM Packages: Informatica MDM Multi Domain Edition 10.1, Informatica Data Director (IDD).

Environment: Windows 7/XP/2000, Windows server 2003/2000, Unix Shell Scripting.

Languages: C, C++, SQL, PL/SQL, T-SQL.

Reporting Tools: Business Objects 4.1/4.0/3.1/XI R2, Crystal Reports 2011/2008, OBIEE 11g, SQL Server Reporting Services (SSRS) 2008, Cognos 11/10.3.


Confidential, North Bergen, NJ

Sr. ETL/Data Integration Developer


  • Performed data profiling and data analysis; constantly coordinated with other teams and business owners to obtain business approval and gather new requirements.
  • Responsible for new development and maintenance of ETL (Extract, Transform and Load) processes in data warehouse, data mart and data integration projects.
  • Created ETL processes to extract data from SQL Server, Oracle, Netezza and HANA tables for loading into various relational and non-relational staging areas, and developed complex mappings in Informatica to load the data from various sources into the data warehouse.
  • Understood requirements, prepared/reviewed the mapping document and designed the logic to implement it.
  • Extensively used Informatica Power Center tools to design and develop ETL jobs for extracting, transforming and loading data.
  • Created new workflows, sessions, mappings to extract information from different XML sources and load data into flat files.
  • Involved in implementing a Master Data solution built on AWS cloud servers for a large customer base: configured match/merge rules for the standardization and consolidation process within the MDM Hub store to generate master records, enabled batch and real-time integration processes, and maintained daily operations.
  • Used the Hierarchies tool to configure entity base objects, entity types, relationship base objects, relationship types, profiles, and put and display packages, and used the entity types as subject areas in IDD (Informatica Data Director, the MDM UI).
  • Used IDQ to complete initial data profiling and remove duplicate data.
  • Provide technical expertise in ETL, Business Intelligence, and Database Technologies, Leadership and offer technical and business solution to the problems.
  • Collaborate and provide critical thought and best practice discipline in the planning, design, development, and deployment of new applications and enhancements to existing applications.
  • Worked on Performance Tuning of ETL mappings and sessions.
  • Provided data to the Micros POS systems from the S4 HANA system.
  • Created Hive queries to join multiple tables of a source system and load them into Elasticsearch tables, and used HiveQL scripts to perform the incremental loads.
  • Involved in handling different XML structures and XSDs during development.
  • Involved in Unit and System Testing of ETL Code (Mappings and Workflows).
  • Migrated code from one environment to another as part of release management.
  • Monitoring of application batch processes to ensure successful completion of jobs.
  • Worked on Tidal to run parallel Informatica ETL jobs.
  • Used Change Data Capture (CDC) to simplify ETL in data warehouse applications.
  • Elicit stakeholder requirements using interviews, document analysis, workshops, and workflow analysis.
  • Capture, define, analyze and develop basic conceptual data and process models, including workflow, wire frames, use cases, domain models, and activity diagrams in Erwin.
  • Act as a liaison between business stakeholders and the IT team to ensure requirements are accurately captured and fully understood by involved parties.
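The HiveQL incremental loads described above typically key off a high-water mark: each run extracts only rows modified since the previous load's maximum timestamp. A minimal Python sketch of that logic (field names are illustrative; in HiveQL this is a `WHERE modified_ts > ${last_watermark}` predicate):

```python
def incremental_extract(rows, last_watermark):
    """Return rows modified since the previous load plus the new
    high-water mark, mimicking a HiveQL incremental-load predicate."""
    delta = [r for r in rows if r["modified_ts"] > last_watermark]
    new_watermark = max((r["modified_ts"] for r in delta),
                        default=last_watermark)
    return delta, new_watermark

source = [
    {"id": 1, "modified_ts": 100},
    {"id": 2, "modified_ts": 205},
    {"id": 3, "modified_ts": 310},
]
# only rows changed after watermark 200 are picked up
delta, wm = incremental_extract(source, last_watermark=200)
```

The returned watermark is persisted between runs so the next load starts where this one stopped.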

Environment: Informatica Power Center 10.2/9.6.1, Informatica Data Quality (IDQ 9.6.1), Informatica Master Data Management (MDM), Oracle 11g, Microsoft SQL Server 2008, Hive, XML, SAP HANA, SAP S4, Netezza, Putty, UNIX (Solaris), Erwin 4.0, TIDAL.

Confidential, Auburn Hills, MI



  • Involved in gathering of business scope and technical requirements and created technical specifications. Extensively involved in Enterprise Data Warehouse.
  • Worked closely with business users to understand the requirements and convert them into project-level technical capabilities.
  • Worked with business analysts to identify the appropriate data elements for required capabilities.
  • Configured Trust and Validation rules, match rules.
  • Guided ETL developers in creating external batches to execute mappings and mapplets using the Informatica Workflow Designer, integrating Shire’s data from varied sources such as Oracle, DB2, flat files and SQL databases and loading it into landing tables of the Informatica MDM Hub.
  • Fine-tuned the match rules using Accept Limit adjustment.
  • Configured IDD application using IDD Configuration manager.
  • Updated status and planned releases through scrum meetings.
  • Coordinated with the offshore team and provided inputs.
  • Worked with source teams to identify source-side changes.
  • Deployed reusable transformation objects such as mapplets to avoid duplication of metadata, reducing development time.
  • Responsible for migration of mapping from Development to Testing and to Production.
  • Created synonyms for copies of time dimensions, used the sequence generator transformation type to create sequences for generalized dimension keys, stored procedure transformation type for encoding and decoding functions and Lookup transformation to identify slowly changing dimension.
  • Creating mappings, profiles and scorecards in IDQ Developer and Analyst as per requirements.
  • Extensively worked on Informatica IDE/IDQ.
  • Designed various approaches like CDC (Change data capture), Auditing control, Recovery strategy, Scheduling jobs etc.
  • Used Teradata utilities FastLoad, MultiLoad and TPump to load data.
  • Wrote, tested and implemented Teradata FastLoad, MultiLoad and BTEQ scripts, DML and DDL.
  • Involved in massive data profiling using IDQ (Analyst Tool) prior to data staging.
  • Excellent hands-on experience with Informatica (Designer and Workflow Manager) and Data Quality (IDQ) (mappings, profiles and scorecards).
  • Used IDQ’s standardized plans for addresses and names clean ups.
  • Worked on IDQ file configuration at user’s machines and resolved the issues.
  • Involved in the Unit testing, Event & Thread testing and System testing.
  • Created WebI reports with multiple data providers and synchronized the data using Merge Dimensions.
  • Created the WebI reports using Business Objects functionalities like Queries, Drill down, @Functions, Cross Tab.
  • Provided the on-call and production support for Informatica, Business Objects, SQL and UNIX jobs.
  • Worked on various issues on existing Informatica Mappings to produce correct output.
  • Efficient Documentation was done for all phases like Analysis, design, development, testing and maintenance.
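Utilities like FastLoad and MultiLoad stream rows into Teradata in large batches rather than one row at a time. The same batching idea can be sketched in Python (sqlite3 stands in for Teradata purely for illustration; the staging table name is hypothetical):

```python
import sqlite3

def bulk_load(conn, rows, batch_size=2):
    """Load rows in fixed-size batches via executemany -- the batching
    idea FastLoad/MultiLoad use to avoid per-row round trips."""
    cur = conn.cursor()
    cur.execute("CREATE TABLE IF NOT EXISTS stg_customer (id INTEGER, name TEXT)")
    for i in range(0, len(rows), batch_size):
        # each executemany call ships one batch to the database
        cur.executemany("INSERT INTO stg_customer VALUES (?, ?)",
                        rows[i:i + batch_size])
    conn.commit()

conn = sqlite3.connect(":memory:")
bulk_load(conn, [(1, "A"), (2, "B"), (3, "C")])
count = conn.execute("SELECT COUNT(*) FROM stg_customer").fetchone()[0]
```

In the real utilities, error tables and checkpointing handle rejected batches; that bookkeeping is omitted here.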

Environment: Informatica Power Center 9.6.1, Informatica Data Quality (IDQ 9.6.1), Informatica PowerExchange 9.6.1, Informatica Master Data Management (MDM), Oracle 11g, PL/SQL, JCL, Teradata 15, Business Objects 4.1/3.1, UNIX (Solaris), Toad 12.10.

Confidential, Erlanger, KY



  • Convert historical data into the new streamlined standards. Create new mappings for different business report needs.
  • Using Informatica Power Center Designer, analyzed the source data and extracted and transformed it from various source systems (Oracle, DB2, SQL Server, mainframe and flat files), incorporating business rules using the different objects and functions the tool supports.
  • Extracted data from SAP system to Staging Area and loaded the data to the target database by ETL process using Informatica Power Center.
  • Converted the PL/SQL Procedures to Informatica mappings and at the same time created procedures in the database level for optimum performance of the mappings.
  • Worked with the Informatica Data Quality 9.6.1 (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ 9.6.1.
  • Identified and eliminated duplicates in datasets through the IDQ 9.6.1 Edit Distance, Jaro Distance and Mixed Field matcher components.
  • Used BTEQ and SQL Assistant (Query man) front-end tools to issue SQL commands matching the business requirements to Teradata RDBMS.
  • Involved in loading of data into Teradata from legacy systems and flat files using complex MLOAD scripts and FASTLOAD scripts.
  • Compiled and documented all data quality issues pertinent to each source in a web-based Agile sprint tracking system (Trello), as user stories driving the creation of data quality rules to address those issues within sprint-based time frames.
  • Created and embedded source-specific data quality rules in their respective business drivers (IDQ mappings) to generate test results and rectify data quality issues in the source, broadly categorized under the Completeness, Conformance, Accuracy and Validity data quality dimensions.
  • Created ETL processes to extract data from SQL Server and Oracle tables for loading into various SQL Server and Oracle staging tables.
  • Develop complex mappings in Informatica to load the data from various sources into the Data Warehouse, using different transformations like Joiner, Aggregator, Update Strategy, Router, Lookup - Connected & Unconnected, Sequence Generator, Filter, Sorter, Source Qualifier, Stored Procedure transformation etc.
  • Used Change Data Capture (CDC) to simplify ETL in data warehouse applications.
  • Involved in preparing detailed ETL design documents. Captured, defined, analyzed and developed basic conceptual data and process models, including workflows, wireframes, use cases, domain models and activity diagrams.
  • Used Informatica Data Quality (IDQ 9.6.1) as the data quality measurement tool.
  • Worked on different tasks in workflows such as sessions, Event Raise, Event Wait, Decision, Email, Command, worklets, Assignment and Timer, and scheduled workflows in Informatica Power Center.
  • Performed data profiling/data analysis for new clients. Continuously interacted with other project team members and interfaced with user departments to clarify requirements, identify and resolve problems, and ensure user requirements were met.
  • Performed user acceptance testing to verify that data extracted from the different source systems loaded into the target according to user requirements.
  • Act as a liaison between business stakeholders and the IT team to ensure requirements are accurately captured and fully understood by involved parties.
  • Analyze complex data relationships to determine data requirements and model data.
  • Work closely with the technical team to ensure functional requirements are accurate and complete.
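The Edit Distance matching used above for de-duplication rests on the Levenshtein distance: the number of single-character edits separating two strings. A hedged Python sketch of the idea (the threshold and sample names are illustrative; IDQ's components add weighting and field mixing on top of this):

```python
def edit_distance(a, b):
    """Classic Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

def likely_duplicates(names, max_dist=2):
    """Flag name pairs whose edit distance is within the threshold."""
    pairs = []
    for i in range(len(names)):
        for j in range(i + 1, len(names)):
            if edit_distance(names[i], names[j]) <= max_dist:
                pairs.append((names[i], names[j]))
    return pairs
```

Flagged pairs would then feed a match/merge or survivorship step rather than being deleted outright.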

Environment: Informatica Power Center 9.6.1, Informatica Data Quality (IDQ 9.6.1), Informatica PowerExchange 9.6.1, Oracle 12c/11g, SQL Server 2012, Mainframe, DB2, Teradata 14, OBIEE 11g, UNIX (Solaris), Erwin 4.0, PL/SQL.

Confidential, Houston, TX



  • Analyzed the business requirements and functional specifications.
  • Extracted data from the Oracle database and spreadsheets, staged it in a single place and applied business logic to load it into the central Oracle database.
  • Used Informatica Power Center 8.6 for extraction, transformation and load (ETL) of data in the data warehouse.
  • Extensively used Transformations like Router, Aggregator, Normalizer, Joiner, Expression and Lookup, Update strategy and Sequence generator and Stored Procedure.
  • Developed complex mappings in Informatica to load the data from various sources.
  • Implemented performance tuning logic on targets, sources, mappings, sessions to provide maximum efficiency and performance.
  • Responsible for creating batches and scripts implementing the logical design in T-SQL.
  • Used Informatica Power Center Workflow manager to create sessions, workflows and batches to run with the logic embedded in the mappings.
  • Worked on profile development and rules/mapplets creation, and built mappings using transformations such as Standardizer, Match, Parser, Labeler and SQL, along with profiles and scorecards, in IDQ 9.x Developer and Analyst.
  • Created mappings, profiles and scorecards in IDQ Developer and performed analysis as per the requirements.
  • Involved in fixing invalid mappings, testing of Stored Procedures and Functions, Testing of Informatica Sessions, Worklets and Workflows.
  • Wrote stored procedures in PL/SQL and UNIX shell scripts for automated execution of jobs. Used a version control system (ClearCase) to manage code in different code streams.
  • Performed data-oriented tasks on Master Data projects especially Customer/Party, like standardizing, cleansing, merging, de-duping, determining survivorship rules.
  • Responsible for the design, development, testing and documentation of the Informatica mappings, PL/SQL, Transformation, jobs based on Dean Standards.
  • Provided production support and Conceived unit, system, integration, functional, and performance test plans.
  • Developed and deployed various validation reports in Crystal Reports 2011 and SSRS to perform data checks and balances for all the user communities.
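The survivorship step in the Master Data work above picks a "golden" record from a group of matched duplicates. A minimal sketch with illustrative rules (most recently updated record wins, and missing fields are coalesced from older sources; real survivorship rules are configured per attribute):

```python
def survive(records):
    """Build a golden record from matched duplicates: the newest record
    wins, and empty fields are back-filled from older sources
    (illustrative survivorship rules)."""
    ranked = sorted(records, key=lambda r: r["updated"], reverse=True)
    golden = {}
    for rec in ranked:
        for field, value in rec.items():
            if field == "updated":
                continue
            # first non-empty value in recency order survives
            if golden.get(field) in (None, "") and value not in (None, ""):
                golden[field] = value
    return golden

dupes = [
    {"name": "J. Smith", "phone": None, "email": "js@x.com", "updated": 2},
    {"name": "John Smith", "phone": "555-0100", "email": None, "updated": 1},
]
golden = survive(dupes)
```

Other common rules (longest value, most trusted source system) slot into the same coalescing loop.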

Environment: Informatica Power Center 9.6.1/9.1, Oracle 11g, SQL Server 2008, T-SQL, Autosys, DI, JCL, UNIX (Solaris), Sybase, Erwin 4.0, PL/SQL, Shell Programming, SQL*Loader, Toad.

Confidential, Sacramento, CA



  • Interacted with End Users for gathering requirements.
  • Developed standards for ETL framework for the ease of reusing similar logic across the board.
  • Used Repository manager to create Repository, User groups, Users and managed users by setting up their privileges and profile.
  • Developed mappings to extract data from different sources such as AMS Advantage, Oracle and XML files and load it into the data mart.
  • Created complex mappings by using different transformations like Filter, Router, Connected and Unconnected lookups, and Stored Procedure, Joiner, Update Strategy, Expressions and Aggregator transformations to pipeline data to Data Mart.
  • Used the Stored Procedure transformation type for encoding and decoding functions and the Lookup transformation to identify slowly changing dimensions. Also provided end-to-end support for the EDW.
  • Fine-tuned existing Informatica maps for performance optimization. Created and managed the global and local repositories and permissions using Repository Manager.
  • Worked on Informatica Designer tools (Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformation Developer) and Server Manager to create and monitor sessions and batches, and to build the enterprise data warehouse.
  • Involved in developing Informatica mappings and tuning them for better performance.
  • Extensively used Stored Procedures, Triggers, Functions and Packages using PL/SQL for creating Connected and Unconnected Stored Procedure Transformations.
  • Developed Slowly Changing Dimension for type I and type II (flag, version and date).
  • Involved in designing logical/physical data models and reverse engineering the entire subject area across schemas using Erwin.
  • Scheduled the workflows using Shell script.
  • Comprehensive admin review: created multiple models to meet the need for ad hoc and canned reports. Modified the existing Phase 1 model and addressed all business needs for Phase 2.
  • Developed FM model for report studio developers
  • Created CPF (Project files) and Published projects from the FM Models
  • Created Query subjects, query items, Filters, Namespaces
  • Handled Complex Queries as separate Query Subjects.
  • Set up report bursting and scheduled it to run on the 11th of every month.
  • Troubleshot databases, workflows, mappings, sources and targets to find bottlenecks and improve performance.
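Locating a bottleneck starts with knowing how long each stage of the pipeline takes. A minimal, hypothetical sketch of stage-level timing in Python (the real work used Informatica session statistics; stage names and logic here are illustrative):

```python
import time

def profile_stages(stages):
    """Run each pipeline stage in order and record its wall-clock time,
    to see whether the bottleneck is the source read, the
    transformation, or the target load."""
    timings = {}
    data = None
    for name, fn in stages:
        start = time.perf_counter()
        data = fn(data)
        timings[name] = time.perf_counter() - start
    return data, timings

stages = [
    ("source",    lambda _: list(range(1000))),     # stand-in extract
    ("transform", lambda rows: [r * 2 for r in rows]),
    ("load",      lambda rows: len(rows)),          # stand-in load
]
result, timings = profile_stages(stages)
```

Whichever stage dominates the timings dictionary is where tuning effort (indexes, partitioning, cache sizes) pays off first.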

Environment: Informatica Power Center 9.1/8.6, Cognos 11/10.3, XML files, Mainframe JCL, AMS Advantage, Info Advantage, Toad, Oracle 10i, Teradata, Tidal, PL/SQL, Flat Files, Erwin, WINCVS, HP-UNIX and Power Exchange.

Confidential, CT



  • Documented user requirements, translated requirements into system solutions and developed implementation plan and schedule.
  • Extracted data from relational databases DB2, Oracle and Flat Files.
  • Developed Complex transformations, Mapplets using Informatica to Extract, Transform and load data into Datamarts, Enterprise Data warehouse (EDW) and Operational data store (ODS).
  • Designed various approaches like CDC (Change data capture), Auditing control, Recovery strategy, Scheduling jobs etc.
  • Used IDQ to complete initial data profiling and removing duplicate data.
  • Used IDQ to profile the project source data, define or confirm the definition of the metadata, cleanse and accuracy check the project data, check for duplicate or redundant records, and provide information on how to proceed with ETL processes.
  • Profiled customer data and identified various patterns of the phone numbers to be included in IDQ plans.
  • Developed data mappings and transformations between source systems and the warehouse.
  • Performed Type1 and Type2 mappings.
  • Managed writing test cases and test scenarios from requirements for newly added features and executed test scripts.
  • Added requirements into Quality Center and mapped requirements to test cases.
  • Documented defects in Quality Center and uploaded all necessary information to SharePoint.
  • Implemented Aggregate, Filter, Join, Expression, Lookup and Update Strategy transformations.
  • Used Power Center Data Masking Options (Random Masking, Blurring, SSN, and Credit Card) in mappings.
  • Tuned performance of Informatica session for large data files by increasing block size, data cache size, sequence buffer length and target based commit interval.
  • Responsible to design, develop and test the software (Informatica, PL/SQL, UNIX shell scripts) to maintain the data marts (Extract data, Transform data, Load data).
  • Documentation for ETL development process. Installation and troubleshooting guide.
  • Responsible for daily verification that all scripts, downloads, and file copies were executed as planned, troubleshooting any steps that failed, and providing both immediate and long-term problem resolution.
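Change Data Capture, used above to simplify the warehouse loads, boils down to comparing the prior snapshot with the current extract and loading only the deltas. An illustrative Python sketch (keyed snapshots stand in for the log-based capture a real CDC tool performs):

```python
def capture_changes(previous, current):
    """Classify rows as inserts, updates or deletes by diffing the
    prior snapshot against the current extract -- the essence of CDC."""
    inserts = [k for k in current if k not in previous]
    deletes = [k for k in previous if k not in current]
    updates = [k for k in current
               if k in previous and current[k] != previous[k]]
    return {"insert": sorted(inserts),
            "update": sorted(updates),
            "delete": sorted(deletes)}

prev = {1: "NJ", 2: "NY", 3: "CT"}   # keyed by business key
curr = {1: "NJ", 2: "PA", 4: "TX"}
changes = capture_changes(prev, curr)
```

Only the three classified sets need to flow through the ETL, instead of a full-table refresh.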

Environment: Informatica Power Center 8.6.1, Oracle, DB2, Teradata, UNIX Shell Programming, SQL*Loader, Toad and Erwin.

Confidential, NYC, NY



  • Analyzed business process and gathered core business requirements, interacted with business analysts and end users.
  • Analyzed business requirements and worked closely with the various application teams and business teams and developed ETL procedures that are consistent across all application and systems.
  • Worked on designing of complex data models.
  • Experienced in PX file stages that include Complex Flat File stage, DataSet stage, LookUp File Stage, Sequential file stage.
  • Implemented Shared container for multiple jobs and Local containers for same job as per requirements.
  • Adept knowledge and experience in mapping source to target data using IBM Data Stage 8.x.
  • Experienced in developing parallel jobs using various development/debug stages (Peek stage, Head & Tail stage, Row Generator stage, Column Generator stage, Sample stage) and processing stages (Aggregator, Change Capture, Change Apply, Filter, Sort & Merge, Funnel, Remove Duplicate stage).
  • Debug, test and fix the transformation logic applied in the parallel jobs.
  • Extract and deploy various types of data in the target database using ODI.
  • Development of pre-session, post-session routines and batch execution routines using server to run Informatica sessions.
  • Developed data mappings between source systems and warehouse components.
  • Created and defined DDL for the tables at the staging area and documented the ETL development process, including an installation and troubleshooting guide.
  • Created different Parameter files and changed Session parameters, Mapping parameters and Variables during run time.
  • Extensively used Source Qualifier Transformation and created different transformations such as Source Qualifier, Joiner, Expression, Aggregator, Rank, Lookups, Filters, Normalizer, Stored Procedure, Update Strategy and Sequence Generator.
  • Partitioned Sessions for concurrent loading of data into the Target tables.
  • Tuned the Workflows and Mappings.
  • Involved in identifying the bottlenecks and tuning to improve the Performance.
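Session partitioning, used above to load target tables concurrently, typically splits rows by a hash of a key column so each partition can be written in parallel without overlap. A minimal Python sketch of the partitioning step (the key column and partition count are illustrative):

```python
def partition_rows(rows, key, n_partitions):
    """Split rows into n hash partitions by a key column -- the same
    idea session partitioning uses for concurrent target loads."""
    partitions = [[] for _ in range(n_partitions)]
    for row in rows:
        # rows with the same key always land in the same partition
        partitions[hash(row[key]) % n_partitions].append(row)
    return partitions

rows = [{"id": i, "val": i * 10} for i in range(10)]
parts = partition_rows(rows, "id", 3)
```

Each partition would then be handed to its own writer thread or session; hash partitioning keeps the load balanced while guaranteeing no key spans two writers.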

Environment: Informatica Power Center 8.5/8.1, IBM Info sphere DataStage 8.5, Informatica Power Exchange Tool, Oracle 10g, Quest Central, SQLplus, PL/SQL (Stored Procedure, Trigger, Packages), ODI, Teradata, DB2, Erwin, MS Visio, Windows 2000, UNIX HP-UX.
