
Sr. Data Specialist ETL Resume

Newport, NJ

SUMMARY:

  • Over 10 years of experience with Informatica ETL on Data Warehouse projects, including analysis, design, development, implementation, production support and documentation.
  • Experience in performance tuning of mappings by identifying and resolving different types of bottlenecks: source, target, mapping and system bottlenecks.
  • Analyzed data from various source systems pertaining to CCAR / FINReg / Fin Remediation reports to identify data anomalies, loaded exception records into the DQ Mart database and provided exception reports to the DG team business users.
  • Experience in debugging mappings; identified bugs in existing mappings by analyzing the data flow and evaluating transformations.
  • Prepared migration documents to move mappings from development to testing and then to production repositories.
  • Identified the data flows for the interface between the SAP ECC/BW instances and the Teradata database.
  • Designed and developed mappings, transformations, sessions, workflows and ETL batch jobs to load data.
  • Performed data transformation and verification for the WAM inbound data stream.
  • Generated and installed ABAP programs / SAP R/3 code using Informatica Power Center.
  • Participated in testing and performance tuning by identifying and resolving bottlenecks in mapping logic, setting cache values and creating partitions for parallel processing of data.
  • Provided appropriate application support for production issues and maintained the system.
  • Performed and documented unit testing to validate mappings against the mapping specification documents, and tested the ETL process both before and after data validation.
  • Monitored daily ETL loads using the Informatica Power Center Workflow Monitor, resolved and reported load failures, and followed up with feed providers on data issues.
  • Created and scheduled sessions and batch processes (run on demand, on time, or only once) using Informatica Workflow Manager, and monitored data loads using the Workflow Monitor.
  • Performed data analysis and validation of the business rules provided by the Data Governance Team.
  • Worked extensively with mappings using transformations such as Source Qualifier, Expression, Filter, Joiner, Router, Connected/Unconnected Lookup, Aggregator and Union.
  • Excellent communication, documentation, team problem-solving, analytical and programming skills in a high-speed, quality-conscious, multitasking environment.

TECHNICAL SKILLS:

ETL Tools: Informatica Power Center 9.x/8.x

Databases: Oracle, MS SQL Server, Teradata, Netezza, Sybase.

Reporting Tool: Cognos, Tableau, SSRS, Business Objects.

Operating Systems: Linux, Windows, UNIX.

PROFESSIONAL EXPERIENCE:

Confidential, Newport, NJ

Sr. Data Specialist ETL

Responsibilities:

  • Analyzed data from various source systems pertaining to CCAR / FINReg / Fin Remediation reports to identify data anomalies, loaded exception records into the DQ Mart database and provided exception reports to the DG team business users.
  • Worked with Informatica Power Center to develop mappings and workflows for BU Enrichment.
  • Provided appropriate application support for production issues and maintained the system.
  • Performed data analysis and validation of the business rules provided by the Data Governance Team.
  • Created Informatica workflows, worklets, sessions, mappings and mapplets for loading the data.
  • Extensively used workflow variables, mapping parameters and mapping variables.
  • Created sessions and batches for incremental loads into staging tables and scheduled them to run daily.
  • Prepared migration documents to move mappings from development to testing and then to production repositories.
  • Worked on loading data from different sources such as Oracle and flat files.
  • Participated in testing and performance tuning by identifying and resolving bottlenecks in mapping logic, setting cache values and creating partitions for parallel processing of data.

Environment: Informatica Power Center 9.6.1, SQL, PLSQL, Oracle.

Confidential, Minneapolis, MN

Data Specialist ETL

Responsibilities:

  • Identified the data flows for the interface between the SAP ECC/BW instances and the Teradata database.
  • Designed and developed mappings, transformations, sessions, workflows and ETL batch jobs to load data.
  • Performed data transformation and verification for the WAM inbound data stream.
  • Generated and installed ABAP programs / SAP R/3 code using Informatica Power Center.
  • Developed various mappings with the collection of all sources, targets and transformations using Informatica Power Center Designer.
  • Developed mappings using transformations such as Expression, Filter, Joiner and Lookup for better data massaging and to migrate clean, consistent data.
  • Extracted data from various sources across the organization (Oracle, Teradata and flat files) and loaded it into the staging area.
  • Created and scheduled sessions and batch processes (run on demand, on time, or only once) using Informatica Workflow Manager, and monitored data loads using the Workflow Monitor.
  • Migrated ETL objects from the development server to the test server, and from the test server to production.
  • Used the pushdown optimization option to increase system performance.

Environment: Informatica Power Center 9.5.1, SQL, Oracle, Teradata, Flat Files.

Confidential, El Segundo, CA

Data Specialist ETL

Responsibilities:

  • Created Informatica ETL mappings using transformations such as Source Qualifier, Filter, Aggregator, Expression, Connected and Unconnected Lookup, Sequence Generator, Router and Update Strategy.
  • Developed Informatica mappings to load data from flat files into the staging tables for the new item feeds, and from the staging tables into the base target tables.
  • Created sessions and batches for incremental loads into staging tables and scheduled them to run daily.
  • Implemented performance tuning logic on targets, sources, mappings and sessions to provide maximum efficiency and performance.
  • Modified existing mappings for enhancements of new business requirements.
  • Tested the ETL process both before and after data validation.
  • Completed Confidential-specific trainings and skill-specific trainings mandatory for the project.
  • Developed mappings per Informatica naming standards for mappings and transformations, and used reusable sessions, command tasks and email tasks.
  • Performed Hadoop archival and restore processes.
  • Worked on loading data from different sources such as Teradata, Oracle and flat files.
  • Performed production support activities in the Data Warehouse (Informatica), including monitoring and resolving production issues, bug fixes and end-user support.
  • Performed and documented unit testing to validate mappings against the mapping specification documents.

Environment: Informatica Power Center 9.5.1, Teradata, SQL, Oracle, Flat Files, Hadoop, Unix.

Confidential, Indianapolis, IN

Lead Informatica Developer

Responsibilities:

  • Developed various mappings with the collection of all sources, targets and transformations.
  • Involved in detailed design and development of mappings using Informatica.
  • Extensively involved in data extraction, transformation and loading (ETL) from source to target systems using Informatica Power Center; debugged the mappings.
  • Created Informatica ETL mappings using transformations such as Source Qualifier, Filter, Aggregator, Expression, Connected and Unconnected Lookup, Sequence Generator, Router and Update Strategy.
  • Developed Informatica mappings to load data from flat files into the staging tables for the new shipper item feeds, and from the staging tables into the base target tables.
  • Tested the ETL process both before and after data validation.
  • Designed and developed Informatica jobs to extract data from heterogeneous sources, applied transformation logic to the extracted data and loaded it into data warehouse databases.
  • Performed unit testing against the mapping specification documents.
  • Worked in an offshore-onshore coordination environment, delegating work to and managing an offshore group.
  • Investigated and fixed bugs.

Environment: Informatica Power Center 9.1, SQL, Oracle.

Confidential, Richmond, VA

Lead Informatica Developer

Responsibilities:

  • Played a major role in understanding the business requirements and in designing and loading data into the data warehouse (ETL).
  • Created optimized WLM schedules to schedule jobs and monitor their status.
  • Worked with Teradata on various transformations performed on the data before it was loaded into or extracted from the Data Warehouse.
  • Developed Informatica mappings to load data from flat files into the staging tables for the new shipper item feeds, and from the staging tables into the base target tables.
  • Applied ETL methodology supporting data extraction, transformation and load processing in a complex EDW.
  • Tested the ETL process both before and after data validation.
  • Created and executed test cases for ETL jobs to upload master data to the repository.
  • Designed and developed Informatica jobs to extract data from heterogeneous sources, applied transformation logic to the extracted data and loaded it into data warehouse databases.
  • Worked in an offshore-onshore coordination environment, delegating work to and managing an offshore group.
  • Claim amount fix in the enterprise data warehouse: the challenge in this project was to build a history interface from scratch in order to remove all the existing EDW data and reload it with corrected data.
  • Implemented performance tuning logic on targets, sources, mappings and sessions to provide maximum efficiency and performance.

Environment: Informatica Power Center 9.1, Teradata, Netezza, UNIX.

Confidential, Fort Wayne, IN

Sr. Informatica Developer

Responsibilities:

  • Developed mappings per Informatica naming standards for mappings and transformations, and used reusable sessions, command tasks and email tasks.
  • Worked on loading data from different sources such as Oracle and flat files.
  • Created Informatica ETL mappings using transformations such as Source Qualifier, Expression, Joiner, Filter, Router, Aggregator, Sequence Generator and Connected/Unconnected Lookup.
  • Created data flow diagrams and ETL process documentation, including mapping docs, transformations and testing scenarios.
  • Performed production support activities in the Data Warehouse (Informatica), including monitoring and resolving production issues, bug fixes and end-user support.
  • Extensively involved in data extraction, transformation and loading (ETL) from source to target systems using Informatica Power Center; debugged the mappings.
  • Investigated and fixed bugs.
  • Performed and documented unit testing to validate mappings against the mapping specification documents.
  • Monitored daily ETL loads using the Informatica Power Center Workflow Monitor, resolved and reported load failures, and followed up with feed providers on data issues.

Environment: Informatica Power Center 9.1, Oracle, Autosys, SQL, Flat Files

Confidential, Schenectady, NY

ETL Informatica Developer

Responsibilities:

  • Designed and developed ETL mappings, workflows and transformation code to process input files based on custom logic using Informatica 9.1.
  • Developed a master parameter file that was called in the workflow and session configuration.
  • Set up session and workflow configurations based on technical design documents.
  • Performed requirements analysis and wrote technical design documents (TDDs).
  • Used session log files and the debugger to diagnose issues when workflows and sessions failed.
  • Performed unit testing to compare source and target (e.g. data count match and column-by-column match) using SQL queries on SQL Server 2008, Sybase and flat files.
  • Actively involved in performance improvements of mappings and sessions, and fine-tuned all transformations.
  • Performed performance tuning of the Informatica mappings using components such as parameter files and variables.
  • Involved in writing ETL specifications and unit test plans for the mappings.
  • Designed the ETL processes using Informatica to load data from SQL Server, Sybase, flat files, XML files and Excel files into the target database.

Environment: Informatica Power Center 9.1, Sagent, MKS, SQL Server, Sybase.
