Informatica Developer Resume

Pittsburgh, PA


  • Extensive experience as a technology professional specializing in Informatica PowerCenter, covering all phases of analysis, design, development, implementation and support of data warehousing applications using SQL programming and scripting.
  • Proficient in data analysis, data mart design, and development and implementation of ETL processes against high-volume data sources.
  • Thorough understanding of business intelligence and data warehousing concepts with emphasis on ETL.
  • Extensive experience in all phases of the Data Warehouse Lifecycle involving Analysis, Design, Development and Testing of Data Warehouses using ETL, Data Modeling and Reporting tools.
  • Extensive experience in designing, developing, and implementing Extraction, Transformation, and Load (ETL) techniques on multiple database platforms and operating system environments.
  • Proficient in various operating system environments like Windows 2000/NT/XP, IBM AIX, LINUX and UNIX.
  • Solid experience in Informatica PowerCenter (9.x/8.x/7.x) including Designer, Workflow Manager, Workflow Monitor and Repository Manager.
  • Expert in Creating, Configuring and Fine-tuning ETL workflows designed in DTS and MS SQL Server Integration Services (SSIS).
  • Expertise in Creating, Managing and Deploying SSRS Reports on the portal for user accessibility.
  • Strong Experience in creating Drill down, Drill through, parameterized, Cascade parameterized, Sub, Linked, Charts and Snapshots using SSRS based on Relational and Multidimensional databases.
  • Experience with Teradata utilities such as TPT, FastLoad, FastExport, MultiLoad and BTEQ.
  • Experience in integration of various data sources like Oracle, Teradata, DB2, SQL Server, Flat Files and XML Files. Experience in the complete life cycle comprising of design, development, documentation and maintenance of the data marts and data warehouse.
  • Strong knowledge in RDBMS concepts and extensive experience in creation and maintenance of tables, views, materialized views, stored procedures, synonyms, triggers and complex SQL programming.
  • Strong Database Experience in Oracle, MS SQL Server and Teradata databases.
  • Highly Proficient in Informatica Designer Components like Source Analyzer, Target Designer, Mapping Designer, Mapplet Designer and Transformation Developer.
  • Extensive experience with various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, Union, HTTP and Web Services.
  • Experience using the Pushdown Optimization (PDO) feature in Informatica.
  • Sound knowledge and experience in performance tuning of data warehouse bottlenecks.
  • Good Experience in UNIX Shell scripting and Windows Batch scripting for automation of batch and ETL jobs.
  • Sound understanding and experience in entity-relationship and dimensional data modeling, including Star Schema and Snowflake Schema modeling.
  • Working knowledge and experience with BI tools such as Cognos and Microstrategy.
  • Working knowledge of PowerExchange CDC.
  • Experience with scheduling tools such as TWS and Tidal.
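The shell-scripting automation mentioned above can be illustrated with a minimal batch-wrapper sketch: run a load step, capture its exit status, and append a timestamped line to a run log. The log name and LOAD_CMD placeholder are illustrative; a real job would substitute the actual ETL invocation (for example, a pmcmd startworkflow call).

```shell
#!/bin/sh
# Batch-wrapper sketch: run a load step, record its status in a run log.
# LOAD_CMD is a stand-in for the real ETL command.

LOG=etl_run.log
LOAD_CMD=true   # placeholder for the actual job invocation

START=$(date '+%Y-%m-%d %H:%M:%S')
if $LOAD_CMD; then
    STATUS=SUCCESS
else
    STATUS=FAILURE
fi
echo "$START|$LOAD_CMD|$STATUS" >> "$LOG"
```

A scheduler such as Tidal or TWS would call a wrapper like this once per batch window and alert on a FAILURE status line.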


ETL Tools: Informatica PowerCenter (Designer, Workflow Manager, Workflow Monitor, Repository Manager, PowerExchange) 9.5.x/9.1/8.6/8.5/7.1/6.x

Databases: Oracle 11g/9i/8i, Teradata, SQL Server 2008/2005/2000, DB2 LUW, MS Access, MySQL

Environment: UNIX (Solaris, AIX), Linux, Windows

Languages: C, C++, SQL, PL/SQL, UNIX Shell Scripting


Confidential, Pittsburgh, PA

Informatica Developer


  • Analyzed business requirements, technical specifications, source repositories and physical data models for ETL mapping and process flow.
  • Created detailed Technical specifications for Data Warehouse and ETL processes.
  • Involved in design phase of logical and physical data model using Erwin 4.0.
  • Involved in developing dependent Data Marts to house data to support the front end BI tool.
  • Involved in designing and developing ETL solutions to meet complex business requirements.
  • Worked closely with the Data Modelers to design data structures best suited for the requirements.
  • Displayed great multitasking capability by working on multiple projects simultaneously while delivering timely results.
  • Worked closely with the Data Analysts, SQA’s and BI Developers to ensure accurate and timely delivery of solutions.
  • Utilized Informatica Push Down Optimization (PDO) wherever necessary to ensure efficient performance of mappings.
  • Utilized Volatile Temporary tables to aid in data load processing.
  • Used TPT (Teradata Parallel Transporter) configured for Fastload and Mload to load data.
  • Utilized pre and post session command tasks to invoke BTEQ scripts to insert/update Audit tables.
  • Created SSIS packages to extract, transform and load data from flat file to flat file, and from flat files into the data warehouse and report data mart, using Lookup, Fuzzy Lookup, Derived Column, Conditional Split, Aggregate, Pivot and Slowly Changing Dimension transformations.
  • Installed and configured SQL Server Reporting Services (SSRS) on the QA and production environments.
  • Worked on import and export of data from sources to staging and target using Teradata MultiLoad, FastExport, TPump and BTEQ.
  • Addressed many performance issues on ETL jobs, semantic views, stored procedures, Reporting and Ad-hoc SQL.
  • Analyzed and designed USI and NUSI indexes based on the columns used in joins during data retrieval.
  • Worked in the Performance Tuning of SQL, ETL and other processes to optimize session performance.
  • Created Reusable transformations, Mapplets, Worklets using Transformation Developer, Mapplet Designer and Worklet Designer.
  • Utilized Tortoise SVN tool to maintain Enterprise-Wide versioning.
  • Created Run Books and Turnover documents for the scheduling team to schedule jobs on Tidal.
  • Created Technical Design documents and Implementation Plans for migration of code to various environments.
  • Used the SDE tool to implement and adhere to Change Management practices.
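The pre/post-session audit pattern described above can be sketched as a shell step that builds the BTEQ script stamping an audit table after a load. The logon alias, table and column names here are illustrative only, not from a real schema.

```shell
#!/bin/sh
# Post-session audit sketch: generate a BTEQ script that records a load run.
# etl_audit and the logon details are illustrative placeholders.

WORKFLOW=wf_load_claims
ROWS_LOADED=12345
RUN_DATE=$(date '+%Y-%m-%d')

cat > audit_update.btq <<EOF
.LOGON tdprod/etl_user,dummy_password;
INSERT INTO etl_audit (workflow_name, run_date, rows_loaded)
VALUES ('$WORKFLOW', DATE '$RUN_DATE', $ROWS_LOADED);
.LOGOFF;
.QUIT;
EOF

# The real post-session command would then run: bteq < audit_update.btq
```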

Environment: Informatica PowerCenter 9.5, Informatica DVO, ERWin, Informatica Developer, Teradata 13, SSIS/SSRS, SQL Server 2008, Windows Batch Scripting, Tortoise SVN, Microstrategy, Tidal, SDE

Confidential, Hartford, CT

ETL Informatica Developer


  • Prepared documentation for requirements, design, installation, unit testing and system integration.
  • Worked extensively on complex mappings using Source Qualifier, Joiner, Expression, Aggregator, Filter, Lookup and Update Strategy transformations.
  • Compared actual results to expected results and suggested changes to mappings owned by others.
  • Developed PL/SQL procedures in Oracle 11g to process business logic in the database, and invoked them through the Stored Procedure transformation.
  • Designed the mappings between sources and targets and tuned them for better performance.
  • Created and Configured Workflows and Sessions to transport the data to target warehouse Oracle tables using Informatica Workflow Manager.
  • Worked on performance tuning by identifying bottlenecks in sources, targets, mappings and transformations.
  • Enhanced performance for Informatica sessions using large data files by using partitions and increasing block size, data cache size, sequence buffer length and target-based commit interval.
  • Developed a number of complex Informatica mappings, mapplets and reusable transformations for other mappings.
  • Migrated mappings, sessions and common objects from Development to Test and to Production.
  • Created reusable transformations and mapplets and reused them across mappings.
  • Extensively used SQL overrides at Source Qualifier and Lookup transformations while extracting data from multiple tables.
  • Monitoring day-to-day Process for different Data Loads and resolving Issues.
  • Used Shell scripts to call procedures and load the data into the table.
  • Developed Perl scripts to automate pre-session and post-session tasks.
  • Provided post-production support and performance tuning.
  • Created parameter files for workflows and sessions.
  • Involved in healthcare claims processing using EDI 837 and 835 transactions.
  • Worked with HIPAA 5010 transactions for reduced risk, greater flexibility and complete bi-directional transaction crosswalks.
  • Extensively used Microsoft Team Foundation Server for source control and code check-ins.
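The parameter files mentioned above are often generated by a small script just before the run. This sketch follows PowerCenter's standard parameter-file layout ([Folder.WF:workflow.ST:session] header, $$ mapping parameters); the folder, workflow and parameter names are made up for illustration.

```shell
#!/bin/sh
# Sketch of generating a session parameter file before a workflow run.
# Claims/wf_claims_load/s_m_load_claims are illustrative names.

PARAM_FILE=wf_claims_load.par
LAST_EXTRACT=$(cat last_extract_date.txt 2>/dev/null || echo '1900-01-01')

cat > "$PARAM_FILE" <<EOF
[Claims.WF:wf_claims_load.ST:s_m_load_claims]
\$\$LastExtractDate=$LAST_EXTRACT
\$\$TargetSchema=EDW
EOF
```

The workflow then points at this file, so the incremental date and target schema can change run to run without touching the mapping.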

Environment: Informatica Power Center 9.1.0, Power Exchange 9.1, Oracle 11g, SQL Server 2008, PL/SQL, SQL, DB2, UNIX, Shell scripts.

Confidential, Des Moines, IA

Informatica Developer


  • Analyzed business requirements to design, develop, and implement highly efficient, highly scalable Informatica ETL processes. Designed Informatica ETL solutions.
  • Worked closely with architects and data analysts to ensure the ETL solution meets business requirements.
  • Interacted with key users and assisted them with various data issues, understood data needs and assisted them with data analysis.
  • Involved in Documentation, including source-to-target mappings and business-driven transformation rules.
  • Designed mappings that loaded data from flat-files to the staging tables.
  • Involved in designing the end-to-end data flow in a mapping.
  • Designed, developed and implemented scalable processes for capturing incremental load.
  • Used a wide range of transformations such as the Source qualifier, Aggregator, Expression, lookup, Router, Filter, Sequence Generator, Update Strategy and Union Transformations.
  • Used FTP connection to store, stage and archive Flat Files.
  • Developed structures to support the front end Business Objects reports.
  • Extensively worked with Repository Manager, Designer, Workflow Manager and Workflow Monitor. Developed Informatica mappings, sessions and workflows.
  • Developed and executed test plans to ensure that the ETL process fulfills data requirements.
  • Supported and mentored the off shore team for production support.
  • Involved in tuning Informatica Mappings and Sessions as well as tuning at the database level.
  • Participated in peer-to-peer code review meetings.
  • Wrote UNIX shell scripts and used them as Pre/Post Session commands in the Sessions.
  • Involved in formulating load strategies and identifying dependencies.
  • Designed the Migration Plan document to migrate the Mappings and work-flows to the testing and production environments.
  • Created error tables to capture data anomalies and rejects.
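One common way to implement the incremental-load capture described above is a watermark file holding the last successful load time; each run extracts only rows modified after it. This is a minimal sketch with illustrative file and column names.

```shell
#!/bin/sh
# Incremental-capture sketch: watermark file drives the extract filter.
# last_load_ts.txt and update_ts are illustrative names.

WATERMARK=last_load_ts.txt
PREV=$(cat "$WATERMARK" 2>/dev/null || echo '1900-01-01 00:00:00')
echo "extracting rows where update_ts > '$PREV'"

# ... extraction and load would run here ...

# Advance the watermark only after the load succeeds.
date '+%Y-%m-%d %H:%M:%S' > "$WATERMARK"
```

On the first run the default epoch forces a full extract; every later run picks up only the delta.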

Environment: Informatica PowerCenter 8.6.1, Oracle 10g, TOAD, ER-Studio, AIX, UNIX Shell Scripting, MS Visio.

Confidential, Deerfield, IL

ETL Informatica Developer


  • Involved in Dimensional modeling of the Data warehouse and used Erwin to design the business process, grain, dimensions and measured facts.
  • Extracted the data from the flat files and other RDBMS databases into staging area and populated the Data warehouse.
  • Generated load-ready files (LRFs) to load into the Teradata database.
  • Used Teradata utilities such as FastLoad, MultiLoad and BTEQ to load data.
  • Developed BTEQ scripts for running batch jobs and scheduling them.
  • Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.
  • Extensively migrated data from different relational and file source system to ODS, Data marts and Data warehouse.
  • Designed and developed ETL load strategy to populate the Data Warehouse.
  • Standardized parameter files to define session parameters such as database connection for sources, targets, last updated dates for incremental loads and many default values of fact tables.
  • Created sessions, configured workflows to extract data from various sources, transformed data, and loaded into data warehouse.
  • Performed tuning of Informatica Mappings for optimum performance. Involved in Unit, Integration, system, and performance testing levels.
  • Monitored and troubleshot batches and sessions for weekly and monthly extracts from various data sources across all platforms to the target database.
  • Involved in Performance tuning at source, target, mappings, sessions, and system levels.
  • Tested the data and data integrity among various sources and targets. Associated with Production support team in various performances related issues.
  • Worked on Informatica Power Center tools- Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
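Producing a load-ready file as mentioned above usually means reshaping an extract into the fixed delimiter layout a FastLoad job expects. A minimal sketch, with an illustrative three-column layout and made-up file names:

```shell
#!/bin/sh
# Load-ready-file (LRF) sketch: turn a comma-delimited extract into the
# pipe-delimited layout the FastLoad control script expects.

cat > extract.csv <<EOF
1001,WIDGET,25.50
1002,GADGET,10.00
EOF

awk -F',' 'BEGIN { OFS = "|" } { print $1, $2, $3 }' extract.csv > products.lrf
```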

Environment: Informatica Power Center 8.1, Oracle 9i/10g, Teradata V12, PL/SQL, Erwin 3.5, Teradata SQL Assistant, MS-Office, Windows XP.


Informatica Developer


  • Heavily involved in creating and tuning mappings for maximum efficiency and performance, with complete knowledge of session properties in the Workflow Monitor.
  • Designed and created complex source to target mappings using various transformations inclusive of but not limited to Aggregator, Look Up, Joiner, Source Qualifier, Expression, Sequence Generator and Router Transformations.
  • Created update strategy and stored procedure transformations to populate targets based on business requirements.
  • Implemented Slowly Changing dimension type2 methodology for accessing the full history of accounts and transaction information.
  • Built mapping variables/parameters and created parameter files to allow flexible runs of sessions and mappings based on changing variable values.
  • Used Informatica to create and organize various Metadata types, including legacy Source and Data warehouse Target data definitions and ETL jobs.
  • Mapplets and Reusable Transformations were used to prevent redundancy of transformation usage and maintainability.
  • Loaded operational data from Oracle, SQL Server, flat files and Excel worksheets into various data marts.
  • Created tasks and workflows using the Task Developer and Workflow Designer.
  • Also developed numerous Stored Procedures, Triggers and common functions using PL/SQL.
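The slowly changing dimension type-2 pattern mentioned above boils down to an expire-and-insert pair per changed key. This sketch emits those two statements; dim_customer and its columns are illustrative, not from a real model.

```shell
#!/bin/sh
# SCD type-2 sketch: close the current row, insert the new version.
# dim_customer, current_flag etc. are illustrative names.

CUST_ID=42
NEW_ADDR='12 Elm St'

cat > scd2.sql <<EOF
UPDATE dim_customer
   SET current_flag = 'N', end_date = CURRENT_DATE
 WHERE customer_id = $CUST_ID AND current_flag = 'Y';

INSERT INTO dim_customer (customer_id, address, start_date, end_date, current_flag)
VALUES ($CUST_ID, '$NEW_ADDR', CURRENT_DATE, NULL, 'Y');
EOF
```

Keeping the old row with an end date is what preserves the full account and transaction history the bullet above refers to.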

Environment: Informatica Power Center 7.1, Workflow Manager, Oracle 10g, SQL Server 2005, DB2, Erwin, TOAD, UNIX Shell script.
