
Sr. ETL Informatica Developer Resume

Miami, FL


  • 7+ years of experience in all phases of the SDLC, including Analysis, Design, Development, and Implementation of operational database systems (OLTP) and data warehouse systems (OLAP). The past 7 years were spent primarily on data warehouse projects using Informatica PowerCenter, Oracle, SQL Server, Business Objects, and Cognos on UNIX and Windows platforms.
  • Experience in database modeling, analysis, design, and development; data conversion using ETL/Informatica; and performance tuning and implementation of several database and data warehousing applications.
  • Extensive ETL experience using Informatica 10.2.1/9.6.1/9.1 (PowerCenter/PowerMart) client tools (Designer, Workflow Manager, Workflow Monitor, and Server Manager) and PowerExchange.
  • Extensive experience implementing data cleanup procedures, transformation scripts, triggers, stored procedures, and Metadata Manager, and executing test plans for loading data successfully into the targets.
  • Extensively used B2B Data Transformation Studio to convert data from unstructured to structured forms using Parsers, Mappers, and Streamers.
  • Experience in data modeling (logical and physical design for distributed databases, reverse-engineering and forward-engineering using Erwin).
  • Expertise in working with Oracle stored programs, packages, cursors, triggers, database links, snapshots, roles, privileges, tables, constraints, views, indexes, sequences, and synonyms, as well as dynamic SQL and SQL*Loader in a distributed environment.
  • Worked on Power Exchange for change data capture (CDC).
  • Extracted source data from SAP, DB2, flat files, and VSAM files on different systems and populated a target data warehouse on NCR Teradata.
  • Defined Target Load Order Plan for loading data into different Target Tables.
  • Utilized Teradata FastLoad and MultiLoad options for loading data into the Teradata database.
  • Used Teradata Tools and Utilities like BTEQ/ODBC/JDBC/OLE.
  • Worked as a QlikView Technical Consultant for a wide variety of business applications.
  • Developed SQL code for data validation and data computation on the source DB2 transaction system and on the target Teradata warehouse.
  • Implemented performance tuning techniques at application, database and system levels.
  • Expertise in Designing Fact Tables, Dimension Tables, Summary Tables.
  • Strong knowledge in OLAP Systems, Kimball and Inmon methodology and models, Dimensional modeling using Star schema and Snowflake schema.
  • Experience in OLAP/BI tools such as Business Objects, Tableau, and SSRS.
  • Experience in UNIX Shell scripting for automating the Jobs.
  • Significant testing experience, including designing test cases and test data and performing UAT. Effective in cross-functional and global environments, managing multiple tasks and assignments concurrently.
  • Highly organized and detail-oriented professional with strong business and technical skills.


ETL Tools: Informatica 10.2.1/9.6.1 (PowerMart/PowerCenter), Informatica PowerExchange, Informatica PowerConnect, IDQ, IDE, DataStage, B2B Data Transformation Studio.

Reporting Tools: BusinessObjects, Tableau, Cognos 8, SSRS (SQL Server Reporting Services), SAS, QlikView

Databases: Oracle 12c/11g/10g, IBM DB2 UDB 8.0/7.0, Teradata V2R5/V2R6, MS SQL Server 2012/2014, Sybase, MS Access.

Database utilities: SQL*Plus, stored procedures, functions, exception handling.

Data Modeling tool/Methodology: MS Visio, Erwin 4.x/3.x, Ralph Kimball methodology, Bill Inmon methodology, Star schema, Snowflake schema, Extended Star schema, physical and logical modeling.

Languages: C, C++, Java, Visual Basic, T-SQL, PL/SQL, XML

Operating Systems: HP-UX, Unix, Linux, Windows


Confidential, Miami, FL

Sr. ETL Informatica Developer


  • Responsible for the technical implementation of business use cases: interacted with the business, created technical designs for the business requirements, and designed and loaded the data into the data warehouse (ETL).
  • Used Informatica client tools (Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer) to define source and target definitions and to code the data flow from source systems to the data warehouse.
  • Developed mappings in Informatica to load the data from various sources into the Data Warehouse, using different transformations like Joiner, Aggregator, Update Strategy, Rank, Router, Lookup, Sequence Generator, Filter, Sorter and Source Qualifier.
  • Used Informatica Repository Manager to back up and migrate metadata across development, test, and production environments. Worked with the Informatica admin team to set up connections from Informatica to various databases and applications, and monitored and troubleshot Informatica production issues.
  • Worked with mapping wizards for slowly changing dimensions and slowly growing dimensions by using Informatica mapping designer.
  • Used Informatica Workflow Manager and Workflow Monitor to schedule and monitor session status.
  • Created complex Informatica mappings, reusable objects of Mapplets depending on client requirements.
  • Worked on migrating existing Teradata ETL flows to Hadoop without missing business logic.
  • Extensive hands-on experience with Hadoop file system commands for file handling operations.
  • Prepared Test Plan from the Business Requirements and Functional Specification.
  • Developed Test Cases for Deployment Verification, ETL Data Validation, Cube Testing and Report testing.
  • Performed validation tests to ensure that the developed functionality meets the specifications prior to UAT testing.
  • Created test data for all ETL mapping rules to test the functionality of the Informatica Jobs.
  • Implemented performance tuning logic on Targets, Sources, mappings, sessions to provide maximum efficiency and performance.
  • Used techniques like source query tuning, single pass reading and caching lookups to achieve optimized performance in the existing sessions.
  • Used CDC for moving data from Source to Target.
  • Identified the data in the source system that has changed since the last extraction with the help of CDC.
  • Used UNIX Shell scripting for automation of the process, invoking PL/SQL procedures, and Informatica sessions.
  • Designed and Developed Oracle PL/SQL procedures, performed Data Import/Export, Data Conversions and Data Cleansing operations.
  • Involved in writing shell scripts on Unix for Informatica ETL tool to run the Sessions.
  • Involved in generating reports from DM layer using Business Objects.
  • Involved in tracking, reviewing and analysing defects.
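
The shell automation described in the bullets above can be sketched as a pmcmd wrapper. This is a minimal sketch, not the actual client script: the integration service, domain, folder, and workflow names are hypothetical, and a stub stands in for the real pmcmd binary so the sketch is self-contained.

```shell
#!/bin/sh
# Sketch: UNIX wrapper that starts an Informatica workflow and fails fast.
# INT_SVC, DOM_DEV, DWH_SALES, and wf_load_sales are illustrative names.

# Stub for the real pmcmd binary (assumption: not available here).
pmcmd() { echo "stub: pmcmd $*"; return 0; }

run_workflow() {
    folder="$1"
    workflow="$2"
    # -wait blocks until the workflow finishes, so the exit code reflects it
    pmcmd startworkflow -sv INT_SVC -d DOM_DEV -u "$INFA_USER" -p "$INFA_PWD" \
        -f "$folder" -wait "$workflow"
    rc=$?
    if [ "$rc" -ne 0 ]; then
        echo "workflow $workflow failed (rc=$rc)" >&2
        return "$rc"
    fi
    echo "workflow $workflow completed"
}

run_workflow DWH_SALES wf_load_sales
```

The same wrapper pattern extends naturally to invoking PL/SQL procedures via sqlplus before or after the workflow call.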

Environment: Informatica 10.2/9.6.1, Business Objects XI 3.1, Hadoop, Tableau, Oracle 12c/11g, MS SQL Server 2014, Windows, Unix, PL/SQL, SVN.

Confidential, Little Rock, AR

ETL Informatica Developer


  • Coordinated with Business Analysts to understand the business requirements and implemented the same into a functional Data warehouse design.
  • Interpreted logical and physical data models for Business users to determine common data definitions.
  • Developed ETL technical specs, Visio for ETL process flow and ETL load plan, ETL execution plan, Test cases, Test scripts etc.
  • Worked with various Informatica client tools like Source Analyzer, Warehouse designer, Mapping designer, Mapplet Designer, Transformation Developer, Informatica Repository Manager and Workflow Manager.
  • Created mappings using different transformations like Source Qualifier, Joiner, Aggregator, Expression, Filter, Router, Lookup, Update Strategy, and Sequence Generator
  • Implemented the Informatica Web Services Consumer transformation to transfer data from one server to other servers.
  • Used PowerExchange to import copybook definitions and then row-test the data from data files.
  • Created procedures, functions, and database triggers to enforce business rules using PL/SQL.
  • Implemented Performance tuning in Mappings and Sessions by identifying the bottlenecks and Implemented effective transformation Logic.
  • Created, optimized, reviewed, and executed Teradata SQL test queries to validate transformation rules used in source to target mappings/source views, and to verify data in target tables.
  • Involved in tuning the database for better performance by analyzing the table, adding Hints and by Query Tuning methods.
  • Utilized Teradata FastLoad and MultiLoad options for loading data into the Teradata database.
  • Used Teradata Tools and Utilities such as BTEQ/ODBC/JDBC/OLE.
  • Identified and defined changes and enhancements to existing applications.
  • Developed dashboards pertaining to KPI monitoring using QlikView 10.x.
  • Developed complex QlikView scripts to build the data model and to support various KPIs and time comparisons.
  • Developed UNIX shell scripts for scheduling workflows.
  • Handled ETL-related activities by interacting with other teams on a regular basis.
  • Managed the Metadata associated with the ETL processes used to populate the Data Warehouse.
  • Developed and tested all the backend programs, Error Handling Strategies and update processes.
  • Involved in Unit Testing and User Acceptance Testing to verify that data extracted from different source systems loaded into the targets accurately, according to user requirements.
  • Involved in production support activities including monitoring loads, resolving exceptions and correcting data problems.
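
A source-to-target validation query of the kind described above typically reconciles row counts between staging and target. The sketch below only generates a BTEQ script, since no Teradata system is assumed; STG_DB, EDW_DB, the table names, and the logon string are all hypothetical.

```shell
#!/bin/sh
# Sketch: generate a BTEQ script that compares row counts between a staging
# table and the target fact table. All object names are illustrative.
cat > /tmp/validate_counts.bteq <<'EOF'
.LOGON tdprod/etl_user,password;
SELECT 'stg' AS side, COUNT(*) FROM STG_DB.SALES_STG
UNION ALL
SELECT 'tgt' AS side, COUNT(*) FROM EDW_DB.SALES_FACT;
.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF
echo "wrote /tmp/validate_counts.bteq"
```

On a real system the script would be submitted with `bteq < /tmp/validate_counts.bteq`, and a mismatch between the two counts would flag a transformation-rule defect.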

Environment: Informatica PowerCenter 9.6.1/9.1.0, UNIX, Informatica B2B Data Exchange, Oracle 11g, Unix Shell Scripts, Tidal, SQL, PL/SQL, Flat files, DB2, Teradata, QlikView.


ETL Informatica Developer


  • Coordinated with users/analysts to understand the business of the application, gathered requirements, and put them into technical design documents.
  • Interpreted logical and physical data models for Business users to determine common data definitions and establish referential integrity of the system.
  • Involved in developing the technical documents from functional specifications.
  • Identified and tracked the slowly changing dimensions, heterogeneous Sources and determined the hierarchies in dimensions.
  • Designed complex Informatica mappings with transformations such as Aggregator, Expression, Joiner, Filter, Source Qualifier, XML, Union, connected/unconnected Lookup, Sequence Generator, Update Strategy, Stored Procedure, and Router to transform and load data from Mainframe files, JMS queues, flat files, and Oracle sources to the Oracle warehouse.
  • Created and Configured Workflows, Worklets, and Sessions to transport the data to target using Informatica Workflow Manager. Also created various tasks such as command task, email task, timer tasks, Event wait and Event Raise tasks in the workflow manager.
  • Involved in creating Reusable Transformations, Mapping Parameters and Mapping Variables.
  • Worked on Teradata Insert, Update, Upsert, and dynamic SQL to transform and load summary tables built on the core data warehouse using BTEQ.
  • Worked on error handling in Teradata SQL using ERRORCODE and ACTIVITYCOUNT.
  • Developed UNIX shell scripts for scheduling workflows by using pmcmd.
  • Involved in tuning the database for better performance by analyzing the table, adding Hints and by Query Tuning methods.
  • Tuned performance of Informatica sessions for large data files by increasing block size, data cache size, and sequence buffer length.
  • Developed Scripts to automate the Data Load processes to target Data warehouse.
  • Implemented Informatica IDQ for Data Cleansing and Data Standardization.
  • Implemented the schedules for historical load runs and incremental load runs.
  • Tested, gathered statistics, and monitored run times to identify and fix performance issues.
  • Migrated mappings, sessions, and workflows from the Development to the Test environment.
  • Involved in Performance testing the mappings and workflows in production staging before installing in the Production environment.
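
The ERRORCODE/ACTIVITYCOUNT error-handling pattern mentioned above can be sketched as a BTEQ fragment like the following. Again this only writes the script (no Teradata system is assumed), and SUMMARY_DB/EDW_DB and the table names are hypothetical.

```shell
#!/bin/sh
# Sketch: BTEQ fragment branching on ERRORCODE and ACTIVITYCOUNT around a
# summary-table load. Object names are illustrative, not the client's.
cat > /tmp/load_summary.bteq <<'EOF'
INSERT INTO SUMMARY_DB.SALES_SUM (sale_dt, total_amt)
SELECT sale_dt, SUM(amt) FROM EDW_DB.SALES_FACT GROUP BY sale_dt;
.IF ERRORCODE <> 0 THEN .GOTO LOAD_FAILED;
.IF ACTIVITYCOUNT = 0 THEN .GOTO NO_ROWS;
.QUIT 0;
.LABEL NO_ROWS
.QUIT 4;
.LABEL LOAD_FAILED
.QUIT 8;
EOF
echo "wrote /tmp/load_summary.bteq"
```

The distinct `.QUIT` return codes let a calling shell script or scheduler distinguish a hard SQL failure from an empty (zero-row) load.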

Environment: Informatica PowerCenter 9.1/9.6.1, Business Objects XI 3.1, Crystal Reports 2008, Tidal, Oracle 11g, MS SQL Server 2008/2012, Erwin 7.3, Windows, Unix.


Data Warehouse Developer


  • Used Informatica Power Center for (ETL) extraction, transformation and loading data from heterogeneous source systems into target database.
  • Created mappings using Designer and extracted data from various sources, transformed data according to the requirement.
  • Analyzed flat-file relationships and the systems to be extracted; met with end users and business units to define requirements.
  • Documented user requirements, translated requirements into system solutions, and developed implementation plans and schedules.
  • Developed data Mappings between source systems and warehouse components.
  • Developed ETL procedures to ensure conformity, compliance with standards, and lack of redundancy; translated business rules and functionality requirements into ETL procedures.
  • Developed and tested all the backend programs, Informatica mappings and update processes.
  • Created and Monitored Batches and Sessions using Informatica PowerCenter Server.
  • Responsible for tuning ETL procedures and star schemas to optimize load and query performance.
  • Extensively worked on performance tuning of programs, ETL procedures, and processes.
  • Wrote documentation describing program development, logic, coding, testing, changes, and corrections.
  • Performed analysis and resolution of Help Desk tickets and maintenance for assigned applications.

Environment: Informatica PowerMart 7.1, Oracle, PL/SQL, Windows, Remedy, Synergy.
