
Informatica Developer Resume


  • Eight years of experience in Data Warehousing using Informatica PowerCenter 9.6/9.5.1/9.1/8.6/7.1 (Designer, Repository Manager, Workflow Manager, Workflow Monitor and PowerCenter Administration Console)
  • Worked on the development of various Data Warehousing projects using Informatica PowerCenter.
  • Strong experience in the Analysis, design, development, testing and Implementation of Business Intelligence solutions using Data Warehouse/Data Mart Design, ETL, OLAP, BI, Client/Server applications
  • Experience with the Hadoop ecosystem.
  • Knowledge/exposure to Big Data technologies (HDFS, Hive, HBase, Impala, Sqoop, Oozie and Kafka).
  • Skilled in performing the complete process of extraction, transformation and loading of data from single or multiple sources into a Data Warehouse or Data Marts
  • Expertise in working with flat files and relational databases such as Oracle 11g/10g/9i.
  • Strong in Oracle PL/SQL development in addition to SQL scripting.
  • Worked extensively with slowly changing dimensions.
  • Created UNIX shell scripts to run Informatica workflows and control the ETL flow.
  • Profound knowledge of unit testing and system testing of Informatica PowerCenter code at various levels of the ETL process
  • Understood business rules from high-level design specifications and implemented the corresponding data transformation logic.
  • Good at identifying and resolving functional and performance bottlenecks in Oracle PL/SQL objects.
  • Very good at SQL scripting, particularly for module-based functionality.
  • Proven knowledge of writing PL/SQL procedures, functions, cursors, collections and packages.
  • Expertise in database (Oracle PL/SQL) performance tuning.
  • Experience in using Oracle-related tools such as SQL*Loader, Toad, SQL Navigator and SQL Developer.
  • Independently perform complex troubleshooting, root-cause analysis and solution development.
  • Experience with dimensional modelling using star schema and snowflake models.
  • Experience in Creating and Maintaining Database objects like Tables, Indexes, Views, Synonyms, Object types and Collections.
  • Experience in system study, design, development, and post-implementation maintenance and support of Oracle EBS and other ERP systems.
  • Good understanding of Data warehousing concepts, ETL Processes and Data modelling concepts such as Design of the Dataflow, ER Diagrams, Normalization and De-normalization of Tables.
  • Strong technical expertise in the UTL_FILE, DBMS_SQL, DBMS_OUTPUT, DBMS_UTILITY and DBMS_JOB packages.
  • Good at working with production support and application admin teams.
  • Team player, Motivated, able to grasp things quickly with analytical and problem-solving skills.
  • Eager to contribute in a team-oriented environment, with a strong desire to gel with the team and work for business growth
  • Strong experience in offshore-onshore model projects work execution.
  • Excellent analytical skills, logical reasoning, interpersonal skills, problem solving skills and communication skills.
  • Excellent verbal and written communication skills with technical documentation skills
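The UNIX shell scripts mentioned above for running Informatica workflows typically wrap the pmcmd command-line utility. A minimal sketch of such a wrapper — the integration service, domain, folder and workflow names are all invented placeholders, and pmcmd is swapped for echo at the end so the script can be dry-run anywhere:

```shell
#!/bin/sh
# Minimal sketch of a shell wrapper that starts an Informatica workflow
# via pmcmd. PMCMD normally points at the real pmcmd binary; every name
# below (service, domain, folder, workflow) is a placeholder.

PMCMD="${PMCMD:-pmcmd}"

start_workflow() {
    folder="$1"; workflow="$2"
    # -wait blocks until the workflow finishes, so the exit status
    # of pmcmd reflects the workflow outcome.
    $PMCMD startworkflow -sv IS_DEV -d DOM_DEV \
        -f "$folder" -wait "$workflow"
}

# Dry run: substitute echo for pmcmd to print the command instead.
PMCMD=echo
start_workflow DWH_LOADS wf_load_sales
```

In a real scheduler job the wrapper's exit status would drive failure handling (alerts, restarts) in the ETL control flow.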


Data Warehousing: Informatica PowerCenter 9.5/9.1/8.6.1/8.5/8.1.1, Source Analyzer, Warehouse Designer, Transformation Developer, Mapping Designer, Workflow Manager, Workflow Monitor.

Dimensional Data Modeling: Dimensional Data Modeling, Star Schema Modeling, Snowflake Modeling, FACT and Dimensions Tables, Physical and Logical Data Modeling

Databases: Oracle 11g/10g/9i, Microsoft SQL Server, IBM DB2, Hadoop Ecosystems

Programming Languages: C, C++, SQL, PL/SQL, Java, SQL Plus, XML.

Scripting: Unix shell scripting, Linux

Other Tools: SQL*Plus, TOAD, PL/SQL Developer, Autosys and CA ERwin Data Modeler

Operating Systems: Windows NT/2000/XP Professional, MS-DOS, UNIX, Sun Solaris, HP-UX and AIX.



Informatica Developer


  • Responsible for Business Analysis, Requirements Collection and HLD creation
  • Worked on Informatica Power Center tools- Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
  • Parsed high-level design specification to simple ETL coding and mapping standards.
  • Designed and customized data models for the Data Warehouse, supporting data from multiple sources in real time
  • Involved in building the ETL architecture and Source to Target mapping to load data into Data warehouse.
  • Created mapping documents to outline data flow from sources to targets.
  • Worked on complex mappings, mapplets and workflows to meet business needs, ensuring transformations were reusable to avoid duplication.
  • Extracted data from flat files and other RDBMS databases into a staging area and populated it into target locations (tables/files).
  • Responsibilities included creating and scheduling sessions
  • Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer
  • Developed mapping parameters and variables to support SQL override.
  • Created mapplets to use them in different mappings.
  • Worked with the HDFS file system and developed ETL mappings to load the data into an Oracle database
  • Worked with Hadoop data stores - Hive and HBase
  • Worked with Sqoop to transfer data between Hadoop and relational databases.
  • Developed mappings to load into staging tables and then to flat files
  • Used existing ETL standards to develop these mappings.
  • Developed Oracle PL/SQL packages, triggers and views, and created new schemas for new modules.
  • Provided knowledge transfer on new rollouts to interface module team members as well as to the client.
  • Performed performance tuning at the Oracle PL/SQL level and the ETL component level.
  • Worked with UNIX and UNIX shell scripts.
  • Removed performance bottlenecks in Oracle PL/SQL scripts and Informatica workflows.
  • Participated in all stakeholder meetings and organized technical standards meetings for reviews.
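Sqoop work of the kind listed above usually comes down to assembling an import command for each source table. A dry-run sketch — the JDBC connection string, HDFS paths and table name are placeholders, and SQOOP is swapped for echo so the assembled command line is printed rather than executed:

```shell
#!/bin/sh
# Dry-run sketch of a Sqoop import used to land a relational table in
# HDFS ahead of an ETL load. All connection details are placeholders.

SQOOP="${SQOOP:-sqoop}"

import_table() {
    # --num-mappers controls how many parallel map tasks split the table.
    $SQOOP import \
        --connect jdbc:oracle:thin:@dbhost:1521:ORCL \
        --username etl_user --password-file /user/etl/.pw \
        --table "$1" \
        --target-dir "/stage/$1" \
        --num-mappers 4
}

# Dry run: print the command that would be issued for one table.
SQOOP=echo
import_table CUSTOMERS
```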

Environment: Informatica 9.6/9.5, Oracle 10g/11g, TOAD, PL/SQL, SQL, SQL*Loader, UNIX Shell Scripting, Hadoop ecosystem, Autosys.

Confidential, Chicago, IL

Informatica Developer


  • Designed, developed, implemented and maintained Informatica PowerCenter and IDQ 8.6.1 applications for the matching and merging process.
  • Integrated Informatica applications as mapplets within PowerCenter mappings.
  • Completed system testing using the Mercury Quality Center toolset.
  • Used Informatica IDQ 8.6.1 to complete initial data profiling and matching/removal of duplicate data.
  • Worked with Systems Analysts to prepare detailed specs from which programs were written.
  • Designed, coded, tested, debugged and documented those programs.
  • Planned, organized and developed technical support protocols; maintained code, responded to system users, and evaluated software and system problems and potential solutions against application requirements.
  • Used Informatica Data Explorer (IDE) and Informatica Data Quality (IDQ 8.6.1): IDE for data profiling over metadata and IDQ 8.6.1 for data quality measurement
  • Created complex mappings that implemented business logic to load data into staging and the ODS.
  • Developed modules to extract, process & transfer the customer data using Teradata utilities.
  • Created Teradata Fast Export scripts for extracting and formatting customer data from data warehouse in a mainframe file.
  • Worked on Informatica Data Quality (IDQ) in driving better business outcomes with trusted enterprise data and empowering the data quality and data governance.
  • Extracted data from different external vendor source systems (Oracle, mainframes, flat files, .csv files, XML files and DB2), then transformed the data and loaded it into EIM tables, from which it was loaded into Siebel base tables using IFB files
  • Worked with the Power Exchange Navigator to import source tables from mainframes, perform masking and update the respective source tables.
  • Developed the SQL scripts and Procedures for the business rules using Unix Shell and NZSQL for Netezza.
  • Worked on the Informatica Session partitioning and Database table partitioning.
  • Worked on development of Linux/Unix shell scripting.
  • Involved in development of Stored Procedures, Packages and Triggers using PL/SQL.
  • Involved in the Unit testing and System testing.
  • Debug the Informatica mappings and validate the data in the target tables once it was loaded with mappings.
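Business-rule scripts of the kind described above (Unix shell driving NZSQL against Netezza) often follow a simple pattern: the shell wrapper feeds SQL to the client on stdin and checks the exit status. A sketch with an invented table and rule — NZSQL is stubbed with cat so the SQL is only printed, not executed:

```shell
#!/bin/sh
# Sketch of a shell wrapper that applies one data-quality business rule
# via nzsql. Table, column and rule are illustrative; a real run would
# point NZSQL at an actual host/database and check $? afterwards.

NZSQL="${NZSQL:-nzsql -host nzhost -d dwh}"

flag_invalid_emails() {
    # Quoted heredoc delimiter ('SQL') keeps the shell from expanding
    # anything inside the SQL text.
    $NZSQL <<'SQL'
UPDATE stage_customers
   SET status = 'INVALID'
 WHERE email NOT LIKE '%@%';
SQL
}

# Dry run: substitute cat for nzsql to show the SQL that would run.
NZSQL=cat
flag_invalid_emails
```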

Environment: Informatica PowerCenter 9.0, Informatica Power Exchange 9.x, XML, Informatica Data Quality (IDQ) 8.x, FTP, Business Objects, Oracle 11g, Control-M, Teradata R12, AutoSys, TOAD, Windows XP.

Confidential, Chicago, IL

Data warehouse Developer


  • As a member of the warehouse design team, assisted in creating fact and dimension tables based on specifications provided by managers.
  • Loaded operational data from Oracle, SQL Server, flat files and Excel worksheets into various data marts such as PMS and DEA.
  • Designed and created complex source to target mappings using various transformations inclusive of but not limited to Aggregator, Look Up, Joiner, Source Qualifier, Expression, Sequence Generator, and Router Transformations.
  • Implemented effective-date-range mapping (Slowly Changing Dimension Type 2) methodology for accessing the full history of accounts and transaction information.
  • Designed complex mappings involving constraint-based loading and target load order.
  • Used the Debugger to identify bugs in existing mappings by analyzing data flow and evaluating transformations, and created mapplets to provide reusability in mappings.
  • Involved in enhancement and maintenance activities of the data warehouse, including performance tuning and rewriting stored procedures for code enhancements.
  • Designed workflows with many sessions using decision, assignment, event-wait and event-raise tasks, and used the Informatica scheduler to schedule jobs.
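The effective-date-range (SCD Type 2) pattern above amounts to two steps: close out the current dimension row for changed keys, then insert the new version with an open-ended date range. A simplified sketch expressed as SQL driven from a shell script — all table and column names are invented, SQLPLUS is stubbed with cat so the SQL is printed rather than executed, and a real version would restrict the insert to changed rows only:

```shell
#!/bin/sh
# Simplified SCD Type 2 sketch: expire the current row, then insert the
# new version with current_flag = 'Y' and an open-ended end date.

SQLPLUS="${SQLPLUS:-sqlplus -s etl/secret@dwh}"

apply_scd2() {
    $SQLPLUS <<'SQL'
-- Step 1: close the current row for accounts whose attributes changed.
UPDATE dim_account d
   SET d.eff_end_date = SYSDATE,
       d.current_flag = 'N'
 WHERE d.current_flag = 'Y'
   AND EXISTS (SELECT 1
                 FROM stg_account s
                WHERE s.account_id = d.account_id
                  AND s.status <> d.status);

-- Step 2: insert the new version with an open-ended date range.
INSERT INTO dim_account
    (account_id, status, eff_start_date, eff_end_date, current_flag)
SELECT s.account_id, s.status, SYSDATE, DATE '9999-12-31', 'Y'
  FROM stg_account s;
SQL
}

# Dry run: substitute cat for sqlplus to show the SQL that would run.
SQLPLUS=cat
apply_scd2
```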

Environment: Informatica PowerCenter 6.2, Oracle, Business Objects 6.x, Windows 2000, SQL Server 2000, Microsoft Excel, SQL*Plus

Confidential, NY

ETL Consultant


  • Used Informatica Power Center to create mappings, sessions and workflows for populating the data into dimension, fact, and lookup tables simultaneously from different source systems (SQL server, Oracle, XML).
  • Assisted data modeler in designing Conceptual, Logical and Physical data models making use of ERwin for relational OLTP systems.
  • Used Informatica IDQ 8.6.1 to complete initial data profiling and matching/removal of duplicate data.
  • Designed, developed, maintained and tested universes for supporting ad hoc queries and canned reports according to the business to business (B2B) requirements.
  • Used Agile methodology for the SDLC and scrum meetings for creative and productive work.
  • Experienced in integrating various data sources with multiple relational databases such as Oracle, SQL Server, Teradata and IBM DB2, along with WSDL and XML files, and worked on integrating data from fixed-width and delimited flat files.
  • Designed, developed, implemented and maintained Informatica PowerCenter and IDQ 8.6.1 applications for the matching and merging process.
  • Created mappings using various Transformations like Source Qualifier, Aggregator, Expression, Filter, Router, Joiner, Stored Procedure, Lookup, Update Strategy, Sequence Generator and Normalizer.
  • Expertise in implementing performance tuning techniques at both the ETL and database level.
  • Extracted data from multiple operational sources and loaded it into the staging area, Data Warehouse and Data Marts using SCD (Type 1/Type 2) loads.
  • Used Debugger to test the mappings and fixed the bugs.
  • Extensively worked with aggregate functions like Avg, Min, Max, First, Last in the Aggregator Transformation.
  • Extensively used SQL Override function in Source Qualifier Transformation.
  • Extensively used Normal Join, Full Outer Join, Detail Outer Join, and Master Outer Join in the Joiner Transformation.
  • Extensively worked with Incremental Loading using Parameter Files, Mapping Variables, and Mapping Parameters.
  • Wrote queries and procedures, created indexes and primary keys, and performed database testing.
  • Defects were analyzed, fixed, tested, tracked and reviewed.
  • Implemented various Performance Tuning techniques on Sources, Targets, Mappings, and Workflow.
  • Tuned performance of mapping and sessions by optimizing source, target bottlenecks and implemented pipeline partitioning.
  • Worked with DBA for partitioning and creating indexes on tables used in source qualifier queries.
  • Scheduled various daily and monthly ETL loads using Control-M
  • Involved in writing UNIX shell scripts to run and schedule jobs.
  • Involved in unit testing.
  • Involved in Production Support in resolving issues and bugs.
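Incremental loading with parameter files, as described above, is usually driven by a small amount of shell bookkeeping: the last successful extract time is kept in a state file and handed to the mapping through a generated parameter file. A sketch — the folder, workflow, file and parameter names are all invented:

```shell
#!/bin/sh
# Sketch of incremental-load bookkeeping: read the watermark, write it
# into an Informatica parameter file, then advance it. All names are
# illustrative placeholders.

STATE="${STATE:-last_extract.txt}"
PARAM="${PARAM:-wf_incr_load.param}"

# Seed the watermark on the very first run.
[ -f "$STATE" ] || echo "1970-01-01 00:00:00" > "$STATE"
LAST_EXTRACT=$(cat "$STATE")

# Generate the parameter file the workflow session will read.
cat > "$PARAM" <<EOF
[DWH_LOADS.WF:wf_incr_load]
\$\$LAST_EXTRACT_DATE=$LAST_EXTRACT
EOF

# A real script would only advance the watermark after the workflow
# reports success.
date '+%Y-%m-%d %H:%M:%S' > "$STATE"
echo "extract window starts at: $LAST_EXTRACT"
```

The mapping's source-qualifier SQL override would then filter on a timestamp column greater than `$$LAST_EXTRACT_DATE`.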

Environment: Informatica PowerCenter 8.6, Informatica Power Exchange 8.6, IDQ 8.6.1, PL/SQL, UNIX shell scripts, Linux, Windows, Teradata, Oracle 10g, Control-M, SQL Server 2012, FTP, Toad 8.0


QA Consultant


  • Mainly involved in ETL development.
  • Analyzed the sources and targets, transformed the data, and loaded the data into the target database using Informatica.
  • Mainly used transformations such as Expression, Router, Joiner, Lookup, Update Strategy and Sequence Generator.
  • Designed, developed and tested Informatica mappings, workflows, worklets, reusable objects, SQL queries and shell scripts to implement complex business rules.
  • Designed and developed PL/SQL packages and stored procedures, implementing best practices to maintain optimal performance.
  • Created reusable mapplets using the Mapplet Designer and used those mapplets in mappings.
  • Extracted data from Oracle, DB2, flat file and XML sources.
  • Good knowledge of Data Warehouse concepts and principles (Kimball/Inmon) - Star Schema, Snowflake, SCD
  • Designed and developed UNIX shell scripts as part of the ETL process.
  • Documented flowcharts for the ETL (Extract, Transform and Load) flow of data using Microsoft Visio, created metadata documents for the reports and mappings developed, and wrote unit test scenario documentation for them.
  • Performed performance tuning on sources, targets and mappings, as well as SQL (optimization) tuning.
  • Supported and Fixed ETL bugs, reporting bugs identified by QA Team using Agile Methodology.
  • Performed the unit testing on the mappings developed according to the Business Scenarios.
  • Prepared the Documentation for the mappings according to the Business Requirements.
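Unit testing a mapping against a business scenario, as described above, commonly ends with a source-versus-target row-count reconciliation. A sketch of such a check — in practice the two counts would come from SQL*Plus/DB2 queries, but here they are passed in as arguments so the logic stands alone:

```shell
#!/bin/sh
# Sketch of a post-load sanity check: compare source-side and
# target-side row counts and fail the job if they differ.

check_counts() {
    src="$1"; tgt="$2"
    if [ "$src" -eq "$tgt" ]; then
        echo "PASS: $tgt rows loaded"
    else
        echo "FAIL: source=$src target=$tgt" >&2
        return 1   # nonzero status lets a scheduler flag the load
    fi
}

check_counts 1250 1250
```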

Environment: Informatica PowerCenter 7.1, Oracle 9i, Windows, flat files, UNIX, SQL Developer, PL/SQL, DB2.
