
ETL/Informatica Developer Resume

Tampa, FL


  • 9+ years of professional experience in the analysis, design, development, implementation, and testing (troubleshooting) of Data Mart/Data Warehouse applications using ETL tools such as Informatica PowerCenter 9.x/8.x, covering OLTP and OLAP environments and data extraction, transformation, loading, and analysis.
  • Experience in data modeling, data warehousing, and ETL design and development, including Ralph Kimball dimensional models with Star/Snowflake schema designs, analysis definition, database design, testing, implementation, and quality processes.
  • Expertise in Installing and managing Informatica Power Center, Metadata Manager, Data Explorer and Data Quality.
  • Strong knowledge of Informatica administration, including installation, upgrades, configuration, maintenance, and troubleshooting.
  • Strong experience installing Informatica on UNIX and migrating and upgrading repositories.
  • Extensively worked with the Informatica tools: Administration Console, Repository Manager, Designer, Workflow Manager, and Workflow Monitor.
  • Strong Experience in writing queries to access Informatica Repository DB.
  • Strong experience as Support for UAT, production, deployment, Informatica object Migration using Repository Manager and monitoring workflows using workflow monitor.
  • Worked extensively with complex mappings using transformations such as Source Qualifier, Expression, Filter, Joiner, Router, Union, Unconnected/Connected Lookup, Java, XML, and Aggregator, and implemented Type-I, Type-II, and Type-III slowly changing dimensions (SCD).
  • Experience integrating various data sources such as Oracle, SQL Server, fixed-width and delimited flat files, SAP, COBOL files, and XML files.
  • Strong experience in extracting, transforming, and loading (ETL) data from various sources into Data Warehouses and Data Marts using Informatica PowerCenter (Repository Manager, Designer, Workflow Manager, Workflow Monitor, Metadata Manager), PowerExchange, and PowerConnect.
  • Extensively worked with the Informatica Designer components: Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, and Mapping Designer.
  • Strong Experience on Workflow Manager Tools - Task Developer, Workflow & Worklet Designer.
  • Design, development, and coding with Oracle 11g/10g/9i/8i and SQL Server 2000/2005; experienced in writing complex queries, stored procedures, functions, cursors, and packages using PL/SQL programming.
  • Strong in UNIX Shell scripting and scheduled ETL load using utilities like Control-M, Crontab and Autosys.
  • Extensively worked with Oracle PL/SQL Stored Procedures, Functions and Triggers, involved in query optimization.
  • Good experience with IDQ (Informatica Data Quality) and with installation and configuration of Informatica server and client environments; good understanding of the Erwin tool and MDM concepts.
  • Created and monitored Database maintenance plans for checking database integrity, data optimization, rebuilding indexes and updating statistics. Built data quality Rules and generated dashboard using services.
  • Developed, modified, and executed multiple Windows PowerShell scripts during the migration from older SharePoint versions to Office 365 SharePoint.
  • Developed unique repository queries in Toad/Oracle to retrieve metadata for Informatica objects such as mappings, sessions, workflows, sources, and targets stored in the Informatica repository database.
  • Wrote various UNIX shell scripts to automate scheduled queue processes and pmcmd commands.
  • Tuned SQL/PL SQL Queries to improve the performance using Explain plan and creating indexes.
  • Good experience in performing and supporting Unit testing, System Integration testing, UAT and production support for issues raised by application users.
  • Excellent technical and professional client-interaction skills; interacted with technical, functional, and business audiences across different phases of the project life cycle.
  • A well-organized, goal-oriented, highly motivated, and effective team member with excellent analytical, troubleshooting, and problem-solving skills.
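The repository metadata queries mentioned above can be sketched in shell. This is a minimal sketch, not the exact queries used: the connection details are placeholders, and the OPB_SUBJECT/OPB_MAPPING tables and their join columns come from the PowerCenter repository schema, which varies between versions.

```shell
#!/bin/sh
# Sketch: list mappings per folder by querying the PowerCenter repository DB.
# OPB_SUBJECT (folders) and OPB_MAPPING (mappings) are repository tables;
# column names and joins can differ between PowerCenter versions.
SQL_FILE=/tmp/list_mappings.sql

cat > "$SQL_FILE" <<'EOF'
SELECT s.SUBJ_NAME   AS folder_name,
       m.MAPPING_NAME
FROM   OPB_SUBJECT s
JOIN   OPB_MAPPING m ON m.SUBJECT_ID = s.SUBJ_ID
ORDER  BY s.SUBJ_NAME, m.MAPPING_NAME;
EOF

# An actual run would pass the file to sqlplus, e.g.:
#   sqlplus -s "$REPO_USER/$REPO_PWD@$REPO_DB" @"$SQL_FILE"
echo "Repository query written to $SQL_FILE"
```

Querying the repository directly like this is read-only analysis; changes to objects should still go through the Informatica client tools.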


ETL Tools: Informatica PowerCenter 7.x/8.x/9.x, Informatica Data Quality (IDQ), Oracle Warehouse Builder (OWB), DataStage 8.7, PowerExchange, SyncSort, Data Cleansing, OLTP, SQL*Plus, SQL*Loader, B2B, SSIS

Languages: C, C++, C#, PL/SQL, SQL, SQL*Plus 3.3/8.0, Dynamic SQL, Java

Data Modeling: Dimensional Data Modeling, Star Join Schema Modeling, Snowflake Modeling, Fact and Dimension Tables, Physical and Logical Data Modeling, Erwin 7.2/7.0, Visio 2003, Oracle Designer, TOAD.

Methodology: Agile, RUP, Scrum, Waterfall

Databases: DB2, Oracle 11g/10g, MS SQL Server 2012/2008, Teradata V2R6/V2R5, MS Access.

Operating Systems: Windows, UNIX, Linux

IDEs: AQT, Eclipse, PL/SQL Developer, TOAD, Teradata SQL Assistant, SQL*Loader, Erwin 3.5

BI Reporting & Scheduling: IBM Tivoli Maestro, Crystal Reports, Business Objects, OBIEE 10g, Control-M, Autosys, Tidal, Cognos 8, Power BI, PuTTY, WinSCP, Remedy, MS Office, Jira, Rally

Big Data Technologies: Hadoop, HDFS, Map Reduce, Hive, Pig, HBase, Sqoop, Oozie, Spark

Tracking Tool: JIRA, VersionOne.


Confidential, Tampa, FL

ETL/Informatica Developer


  • Review business and technical requirements and ensure the data integration platform meets requirements.
  • Apply industry best practices for ETL design and development.
  • Lead design, development and implementation of end-to-end complex ETL system using Informatica tools.
  • Plan and lead ETL development efforts for team utilizing Informatica tools.
  • Produce written deliverables for technical design, system testing and implementation activities.
  • Implement ETL systems that are operationally stable, efficient and automated.
  • Implement technical solutions that are scalable, aligned with the enterprise architecture and can adapt to changes in business.
  • Implement ETL systems that maximize re-usable components/services, collect/share metadata, incorporate audit, reconciliation and exception handling.
  • Designing and developing Informatica mappings, including Type-I, Type-II, and Type-III slowly changing dimensions (SCD).
  • Implement parallel processing, load balancing, near real-time and real-time ETL processes.
  • Implement entity resolution applications to build master indexes for members, providers, payers, etc.
  • Help establish a master data management program.
  • Implement monitoring and measurement of data quality as defined by business rules.
  • Ensure adherence to architectural governance standards and practices.
  • Develop best practices, standards of excellence, and guidelines for programming teams.
  • Conduct system testing: execute job flows, investigate system defects, resolve defects, and document results.
  • Work with DBAs, application specialists and technical services to tune performance of the system to meet performance standards and SLAs
  • Assist in the development, documentation and application of best practices and procedures that govern DW implementations and operations.
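The audit and reconciliation handling described in the bullets above can be illustrated with a small shell sketch; the file names and the way the target row count is obtained are hypothetical stand-ins (a real job would pull the target count from the database or from the Informatica session log).

```shell
#!/bin/sh
# Sketch of a post-load reconciliation check: compare the source-extract row
# count against the count reported by the load. Paths and the count source
# are illustrative stand-ins, not production names.
SRC_FILE=/tmp/src_extract.dat
TGT_COUNT_FILE=/tmp/tgt_rowcount.txt

# Stand-in data so the sketch is self-contained.
printf 'row1\nrow2\nrow3\n' > "$SRC_FILE"
echo 3 > "$TGT_COUNT_FILE"

src_count=$(wc -l < "$SRC_FILE")
tgt_count=$(cat "$TGT_COUNT_FILE")

if [ "$src_count" -eq "$tgt_count" ]; then
    echo "RECONCILED: $src_count rows"
else
    echo "MISMATCH: source=$src_count target=$tgt_count" >&2
    exit 1
fi
```

Returning a nonzero exit status on mismatch lets a scheduler such as Autosys mark the job failed and hold downstream dependencies.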

Environment: Informatica PowerCenter 10.2.0 HF2 client, Informatica Developer, Informatica Data Quality, UNIX, Windows, Greenplum, AQT, Mainframes, SQL Server 2014, Oracle 11g R2, TOAD, SQL*Plus, PL/SQL, WinSCP, PuTTY, Autosys, Visual Studio 2015, TFS (Team Foundation Server).

Confidential, Philadelphia, PA

Informatica ETL Developer


  • Involved in development, maintenance, and enhancement of Informatica Mappings, Workflows, Processes and writing complex Oracle SQL to integrate on-premise Oracle-based source data with Salesforce.
  • Analyzed transaction errors, troubleshot software issues, developed bug fixes, and contributed to performance-tuning efforts.
  • Worked closely with Business Analysts and client throughout the project to understand, analyze and implement the quality code.
  • Built and implemented scalable, reliable, high-performance ETL architectures; analyzed and retrieved metadata from the Informatica repository; proficient with RDBMS systems and SQL and familiar with database analysis and design.
  • Responsible for integration of various data sources like Oracle, SQL Server, Flat Files, XML files and various other data sources.
  • Loaded data from different legacy systems into Salesforce and mastered data using Informatica CC360.
  • Developed mappings in Informatica to load data from various sources into the Data Warehouse using transformations such as Source Qualifier, Expression, Lookup (source and target lookups), Filter, Aggregator, Update Strategy, Joiner, Normalizer, Router, and Union.
  • Designed mappings related to complex business logic provided by data modeler, which includes Dimensions and Fact tables.
  • Extensively involved with backend testing by writing complex SQL queries
  • Implemented inserts/updates/deletes through SCD TYPE 1, TYPE 2 and other approaches as per the requirements.
  • Created workflows, deployed them as applications, and executed them via the pmcmd command on the UNIX server.
  • Developed parameter files for all the mapping and Workflows objects.
  • Scheduled all the Workflows using Autosys tool.
  • Excellent interpersonal, oral and written communication skills to work with business users and team members.
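The parameter files mentioned above follow PowerCenter's section-header syntax. A minimal generation sketch, in which the folder, workflow, session, parameter, and connection names are hypothetical:

```shell
#!/bin/sh
# Sketch: generate a PowerCenter parameter file for a nightly workflow run.
# The [Folder.WF:workflow.ST:session] section syntax is PowerCenter's;
# all object and connection names below are made-up examples.
PARAM_FILE=/tmp/wf_daily_load.par
RUN_DATE=$(date +%Y-%m-%d)

cat > "$PARAM_FILE" <<EOF
[SALES_DM.WF:wf_daily_load.ST:s_m_load_orders]
\$\$RUN_DATE=$RUN_DATE
\$DBConnection_Src=ORA_SRC_CONN
\$DBConnection_Tgt=ORA_DW_CONN
EOF

echo "Parameter file written to $PARAM_FILE"
```

The generated file would then be referenced in the session or workflow properties (or passed to pmcmd with -paramfile), so each run picks up the current date without editing the mapping.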

Environment: Informatica PowerCenter Designer 10.1.1 HotFix1, Mapping Designer, Workflow Manager, Workflow Monitor, Repository Manager, PL/SQL Developer, Oracle, Toad, SQL, PL/SQL, Shell Script, Autosys, WinSCP, Confluence, JIRA, Informatica CC360, Informatica Data Quality, Salesforce, Microsoft Visio.


Informatica/ETL Developer


  • Confidential has a Corporate Data Warehouse model following the Inmon approach, with staging layers, followed by the Data Warehouse (Institutional), and then Data Marts.
  • The main objective of the project is to integrate the Small Market Digital (SMD) data coming in from IBM-Majesco into the existing Warehouse and Data Marts. From IBM-Majesco, the data is fed by batch into the Confidential Data Hub (MDH) and then to the IDW and the Data Marts.
  • Involved in the end-to-end development process and extensive analysis; although this is an integration project, a few workstreams are completely new, and new data is extracted, transformed, and loaded into new and existing applications.
  • Coordinating with Onsite Team for Requirements gathering and analysis.
  • As part of the analysis, developed multiple unique queries in Toad/Oracle against the Informatica repository DB for faster identification of Informatica objects such as mappings, sessions, workflows, sources, and targets.
  • Used Informatica Repository Manager, Monitor to verify the active mapping/workflow.
  • Verified the Unix Shell Scripts and Jobs to identify the existing order of execution of Informatica Workflows, Sessions and Mappings.
  • Worked with the Production Team/ Source Systems in verifying the active/in-active Feeds.
  • Created DDL requests for the New and changed Tables according to the requirement.
  • Developed LLDs by understanding the Business Requirement and company’s coding standards.
  • Created Sample LLD templates for the staging and Persistent staging.
  • Actively involved in peer reviews.
  • Implemented Slowly Changing Dimensions (SCD) using a Dynamic Lookup and the Cyclic Redundancy Check (CRC32) function in an Expression transformation.
  • Used various transformations such as Joiner, Filter, Expression, Update Strategy, Router, Rank, and Connected/Unconnected Lookup.
  • Used debugger to test the logic implemented in the mappings.
  • Created test cases for unit testing and captured data to aid the team's understanding during peer review.
  • Fixed transformation values (e.g., Rank) in the Dev environment when differences from the Prod environment were noticed.
  • Collected entire current active jobs related information for Regression Testing.
  • Worked extensively with BSAs from different workstreams to identify the targeted/affected areas for integration, as well as during backward analysis, forward analysis, and data analysis.
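The CRC32-based change detection used in the SCD work above can be illustrated conceptually in shell. Here cksum stands in for the CRC32 function that would run inside an Expression transformation, and the sample rows are made-up data:

```shell
#!/bin/sh
# Concept sketch of CRC32-based change detection for SCD loads: hash the
# concatenated non-key columns and compare against the stored hash, instead
# of comparing every column. cksum is a stand-in for the Expression-level
# CRC32 function; the pipe-delimited rows are sample values only.
row_crc() {
    printf '%s' "$1" | cksum | awk '{print $1}'
}

old_row="John|Smith|Tampa"      # version currently in the dimension
new_row="John|Smith|Orlando"    # incoming source version

old_crc=$(row_crc "$old_row")
new_crc=$(row_crc "$new_row")

if [ "$old_crc" != "$new_crc" ]; then
    echo "CHANGED: expire current row, insert new version (SCD Type 2)"
else
    echo "UNCHANGED: skip row"
fi
```

Comparing one checksum per row keeps the lookup cache small, at the cost of a theoretical (negligible in practice, but nonzero) chance of hash collisions.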

Environment: Informatica PowerCenter Designer 9.6.1 HF3, Workflow Manager, Workflow Monitor, Repository Manager, Advanced Query Tool (AQT v10), DB2, Toad 12.6, Oracle 11.x, SQL, PL/SQL, Shell Script, Autosys, WinSCP, PuTTY, Maestro Job Scheduler (IBM Tivoli), SharePoint (x/O365), IDQ, B2B, Microsoft Visio.


Sr. Informatica ETL Developer


  • Coordinating with Onsite Team and client for Requirements gathering and analysis.
  • Understanding and developing the ETL framework for Informatica objects as per coding standards.
  • Performed the data profiling and analysis making use of Informatica Data Quality (IDQ).
  • Worked with the Informatica Data Quality (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, and exception handling.
  • Used reference tables and rules created in Analyst tool.
  • Used various IDQ transformations such as Standardizer, Match, Association, Parser, Weighted Average, Comparison, Consolidation, Decision, and Expression.
  • Implement Data Quality Rules using IDQ to check correctness of the source files and perform the data cleansing/enrichment.
  • Loading data into operational data store.
  • Designing and developing Informatica mappings including Type-I, Type-II, Type-III slowly changing dimensions (SCD).
  • Coding & testing the Informatica Objects & Reusable Objects as per BI standards.
  • Participating in peer review of Informatica objects.
  • Developed Slowly Changing Dimension Mappings for Type 1 SCD and Type 2 SCD.
  • Estimating the volume of work and deriving delivery plans that fit into overall planning.
  • Prepared ETL build peer-review checklists and unit test case templates for different work packages.
  • Involved in Unit Testing, Integration Testing and System Testing.

Environment: Informatica 9.6.1 (PC & IDQ), DataStage, Teradata 13, SQL Server 2014, Teradata SQL Assistant, SQL Developer, UNIX, Citrix, Business Objects, JIRA.


Informatica ETL Developer


  • Understanding and developing the ETL framework for Informatica objects as per coding standards.
  • Analyzed functional specifications and the business requirements of the project.
  • Used Informatica 9.5 PowerCenter Designer with transformations such as Source Qualifier, Expression, Filter, Router, Joiner, Sequence Generator, Update Strategy, Lookup, Sorter, Aggregator, Normalizer, XML Source Qualifier, and Stored Procedure for the extraction, transformation, and loading of data.
  • Created new Pre-stage tables and staging tables for the data loads.
  • Developed complex mappings in Informatica to load the data from various sources.
  • Created stored procedures to use Oracle-generated sequence numbers in mappings instead of using the Informatica Sequence Generator.
  • Created Rollback Stored Procedure to roll back data loaded.
  • Created complex Mappings and implemented Slowly Changing Dimensions (Type 1, Type 2 and Type 3) for data loads.
  • Created complex Mappings to implement data cleansing on the source data.
  • Created Mappings to implement one time as well as Incremental loads.
  • Created Mappings to load 150 million historical Transactions.
  • Extensively used Informatica debugger to eliminate the bottlenecks for optimum performance. Also involved in troubleshooting existing ETL bugs.
  • Used Mapping Variables, Mapping Parameters and Session Parameters to increase the re-usability of the Mappings.
  • Implemented performance tuning logic on targets, sources, mappings, sessions to provide maximum efficiency and performance.
  • Used Informatica Update Else Insert session property and Update Override extensively to improve the performance of the data loads.
  • Developed Workflows using task developer, worklet designer, and workflow designer in Workflow manager and monitored the results using workflow monitor.
  • Used Informatica Power Center Workflow manager to create sessions, workflows and batches to run with the logic embedded in the mappings.
  • Created various batch Scripts for scheduling various data cleansing scripts and loading process.
  • Created Technical Documentation for all the developed mappings for presentation with Business.
  • Extensively used Erwin for Logical and Physical data modeling and designed Star Schemas.
  • Created deployment groups, migrated the code into different environments.
  • Worked closely with reporting team to generate various reports.
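The incremental loads driven by mapping variables, described above, rely on persisting a high-water mark between runs. A minimal shell sketch of that bookkeeping, with an illustrative file path and timestamp format:

```shell
#!/bin/sh
# Sketch of incremental-load bookkeeping: persist a high-water mark between
# runs, analogous to what a persisted mapping variable does inside
# PowerCenter. The file path and timestamp format are illustrative only.
WATERMARK_FILE=/tmp/last_extract_ts.txt

# Seed a watermark on the very first run (forces a full historical load).
[ -f "$WATERMARK_FILE" ] || echo '1970-01-01 00:00:00' > "$WATERMARK_FILE"

last_ts=$(cat "$WATERMARK_FILE")
echo "Extracting rows modified after: $last_ts"

# A real job would launch the session with this value (e.g. via a parameter
# file entry such as $$LAST_EXTRACT_TS), then advance the watermark only
# after the load succeeds:
now_ts=$(date '+%Y-%m-%d %H:%M:%S')
echo "$now_ts" > "$WATERMARK_FILE"
```

Advancing the watermark only after a successful load means a failed run is simply re-extracted from the old mark on the next attempt.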

Environment: Informatica 9.5 (Power Center Repository Manager, Designer, Workflow Manager, and Workflow Monitor), SAP Business Objects 4.0, Oracle 11g, SQL Developer 3.2.1, Toad, PLSQL, SQL, Active Batch 7, Talend 5.3.1, ERWIN.


Informatica Developer


  • Understanding and developing the ETL framework for Informatica objects as per coding standards.
  • Created transformations like Expression, Lookup, Joiner, Rank, Update strategy and Source Qualifier, Filter, Router using the mapping designer.
  • Developed complex SQL queries to develop the Interfaces to extract the data in regular intervals to meet the business requirements.
  • Extracted Data from FACETS Database.
  • Created complex mappings using Unconnected Lookup, Sorter, and Aggregator and Router transformations for populating target files in efficient manner.
  • Extensively involved in performance tuning of the Informatica ETL mappings by using the caches and overriding the SQL queries and also by using various components like, Parameter files & Variable.
  • Created various UNIX Shell Scripts for scheduling various data cleansing scripts and loading process. Maintained the batch processes using Unix Shell Scripts.
  • Worked with different data sources such as FACETS and Medco; tasks included cleansing the data and delivering the claims data coming from FACETS into a flat file in the desired format.
  • Scheduled the tasks using ESP scheduler.
  • Written Unit test scripts to test the developed mappings.
  • Developed mapplets and rules using Expression, Labeler, and Standardizer transformations in IDQ.
  • Deployed code from IDQ to PowerCenter.
  • Performed proof of concept of data using Informatica data quality (IDQ) along with Oracle Exadata.
  • All non-production Mainframe environments and other data warehousing databases where customer data is stored (Mainframe files, Teradata, SQL Server, DB2, and Sybase) are scanned for Personally Identifiable Information (PII) and masked.
  • Created complex mappings using Unconnected and Connected lookup Transformations.
  • Extensively used the tasks like email task to deliver the generated reports to the mailboxes and command tasks to write post session and pre-session commands.
  • Extensively used debugger to test the logic implemented in the mappings.
  • Performed error handling using session logs.
  • Implemented Slowly changing dimensions Type 1 and Type 2 for change data capture.
  • Extensively worked with the Teradata FastLoad utility to load huge tables for initial and truncate-and-load runs.
  • Worked with the Teradata MultiLoad utility for insert, update, and incremental loads.
  • Used Teradata BTEQ for writing scripts in Teradata database.
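The BTEQ scripts mentioned above can be sketched as follows; the logon credentials and table name are placeholders, while the dot-command syntax (.LOGON, .IF ERRORCODE, .QUIT) is standard BTEQ.

```shell
#!/bin/sh
# Sketch: generate a Teradata BTEQ script for a simple load check.
# The tdpid/user/password and the table name are placeholders.
BTEQ_FILE=/tmp/load_check.bteq

cat > "$BTEQ_FILE" <<'EOF'
.LOGON tdprod/etl_user,etl_password;

SELECT COUNT(*) FROM edw.orders_stg;

.IF ERRORCODE <> 0 THEN .QUIT 8;
.QUIT 0;
EOF

# An actual run would be:  bteq < "$BTEQ_FILE"
echo "BTEQ script written to $BTEQ_FILE"
```

Quitting with a nonzero return code on error lets the surrounding shell script or scheduler detect the failure, the same pattern used for the pmcmd-driven jobs.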

Environment: Informatica PowerCenter 9.x, IDQ 9.x, Oracle 11g Exadata, SQL, Teradata 14, FACETS, Autosys, UNIX/Linux.
