
Sr. Informatica Developer Resume

Irving, TX


  • 7+ years of experience in Informatica spanning data analysis, design, and development for various software applications in a client-server environment, providing Data Warehousing solutions for decision-support systems.
  • Worked across the complete Software Development Life Cycle (SDLC), from requirements gathering and analysis through data modeling, design, testing, debugging, and implementation.
  • Strong Data Warehousing ETL experience using Informatica PowerCenter 9.6/9.1 client tools (Mapping Designer, Repository Manager, Workflow Manager/Monitor) along with Informatica Cloud, Informatica MDM 10.1, and Informatica IDQ.
  • Experience in the full ETL methodology (Extraction, Transformation, and Loading), designing data mappings from a wide variety of source systems including Oracle, SQL Server, and Teradata, and from non-relational sources such as flat files, XML, and mainframe files.
  • Experience in Big Data technologies such as Hadoop/MapReduce, Pig, HBase, Hive, Sqoop, and Spark SQL.
  • Experienced in writing Hive queries to load data into HDFS.
  • Experience in writing complex T-SQL queries and sub-queries tuned for performance.
  • Expertise in transformations such as Joiner, Expression, Lookup, Filter, Aggregator, Rank, Update Strategy, Router, Sequence Generator, Union, Sorter, Source Qualifier, Normalizer, Transaction Control, and SQL Transformation.
  • Strong experience with Informatica real-time Change Data Capture (CDC) and MD5-based change detection.
  • Designed and developed Informatica mappings implementing Type 1 and Type 2 Slowly Changing Dimensions (SCD).
  • Expertise in Data Warehouse/Data Mart, ODS, OLTP, and OLAP implementations, spanning project scoping, analysis, requirements gathering, data modeling, and ETL design and development.
  • Strong experience in dimensional modeling using Star and Snowflake schemas, identifying facts and dimensions with ERwin and ER/Studio.
  • Loaded data from various data sources and legacy systems into Teradata production and development warehouses using Teradata utilities such as BTEQ, FastLoad, MultiLoad, and TPT.
  • Experienced in using advanced Informatica features such as pushdown optimization (PDO).
  • Experienced in resolving ongoing maintenance issues and bug fixes, monitoring Informatica sessions, and tuning the performance of mappings and sessions.
  • Experience in writing UNIX shell scripts to process Data Warehouse jobs, and with automation/scheduling tools such as Autosys and Control-M.
  • Good technical skills in performance tuning, debugging, and troubleshooting within PowerCenter.
  • Team player and self-starter with good communication skills and ability to work independently and as part of a team.
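To illustrate the kind of Hive load described in the bullets above, the sketch below stages a delimited extract as an external table and inserts it into a partitioned warehouse table. All table, column, and path names (and the partition date) are hypothetical.

```sql
-- Stage a pipe-delimited extract in HDFS as an external Hive table,
-- then load it into a partitioned managed table.
-- All names below are hypothetical placeholders.
CREATE EXTERNAL TABLE stg_orders (
  order_id BIGINT,
  cust_id  BIGINT,
  amount   DECIMAL(12,2),
  order_dt STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '|'
LOCATION '/data/landing/orders';

INSERT OVERWRITE TABLE dw_orders PARTITION (load_dt = '2016-01-01')
SELECT order_id, cust_id, amount, order_dt
FROM stg_orders;
```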


ETL Tools: Informatica PowerCenter 9.6/9.1 (Repository Manager, Mapping Designer, Workflow Manager, Workflow Monitor).

BI Tools: Business Objects, Cognos 10.1

Databases: Oracle 11g/10g/9i, SQL Server 2008/2005, Teradata

Data Modeling: Star schema and Snowflake schema, Erwin, Toad

Languages: C, C++, SQL

Operating Systems: Windows 2000/07/NT/XP, UNIX, LINUX

Scripting languages: Windows scripting, UNIX Scripting

Packages: SQL, Toad, MS SQL Developer

Scheduling Tools: Autosys, Control M

Web Technologies: HTML

Other Applications: MS Office (MS Access, MS Excel, MS PowerPoint, MS Word), MS Outlook.


Confidential, Irving, TX

Sr. Informatica developer


  • Worked on Informatica PowerCenter/Cloud tools- Mapping Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
  • Parsed high-level design specifications into simple ETL coding and mapping standards.
  • Developed mappings to extract data from SQL Server, Oracle, flat files, and mainframes and load it into the data warehouse using Informatica Cloud, PowerCenter, and MDM.
  • Used the Informatica MDM 10.1 (Siperian) tool to manage master data for the EDW.
  • Performed MDM Hub configuration - data modeling, data mappings, data validation, match and merge rules, and Hierarchy Manager customization/configuration.
  • Defined requirements for data matching and merging rules and for data stewardship workflows to be deployed in the MDM implementation.
  • Collaborated with the Enterprise Architecture teams (data integration, data architecture, business intelligence) to develop and deliver the MDM implementation.
  • Developed MDM Hub match and merge rules, batch jobs, and batch groups.
  • Created connections, task flows, and schedules for Informatica Cloud Data Synchronization/Replication tasks.
  • Designed and developed data integration approach to replicate and synchronize Oracle standard and custom objects by using Informatica cloud.
  • Developed mappings in Informatica Cloud/PowerCenter by using the transformations like Unconnected and Connected lookups, Source Qualifier, Expression, Router, Filter, Aggregator, Joiner, Update Strategy, Union, Sequence Generator, Rank, Sorter, Normalizer, etc.
  • Developed Mappings, Mapplets and Reusable transformations by using Informatica Cloud.
  • Used Type 1 SCD and Type 2 SCD mappings to update Slowly Changing Dimension Tables.
  • Implemented various loads - daily, weekly, quarterly, and on-demand - using an incremental loading strategy and Change Data Capture (CDC) concepts.
  • Involved in performance tuning at the source, target, and transformation levels in the DTM.
  • Extensively used mapping Parameters, Variables, User defined functions in Informatica PowerCenter.
  • Developed Workflows, Worklets and Tasks by using Power Center Workflow Designer.
  • Implemented error handling by routing invalid records to error tables, reloading them to target tables and did performance tuning and Optimization of mappings.
  • Used stored procedures to create a standard Time dimension, drop and create indexes before and after loading data into the targets.
  • Responsible for error handling using session logs and reject files in the Workflow Monitor.
  • Created indexes on database tables and tuned SQL queries to improve performance; performed unit testing and documented the results.
  • Worked on Data Conversion and Data Analysis and Data Warehousing to meet EDW requirements.
  • Scheduled and monitored automated weekly jobs under UNIX environment.
  • Wrote UNIX shell scripts using PMCMD commands for executing and scheduling workflows.
  • Attended technical meetings for code review walkthroughs with team members.
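A minimal sketch of the kind of PMCMD wrapper script described above, assuming hypothetical service, domain, folder, and workflow names; the pmcmd command line is only assembled and printed here, not executed.

```shell
#!/bin/sh
# Assemble a pmcmd startworkflow invocation, as a scheduled wrapper
# script might. Service, domain, folder, and workflow names are
# hypothetical placeholders.
INFA_SERVICE="IS_EDW"
INFA_DOMAIN="Domain_EDW"
FOLDER="EDW_LOADS"
WORKFLOW="wf_daily_load"

build_pmcmd_cmd() {
    # -uv/-pv make pmcmd read the user and password from environment
    # variables, so no credentials are hard-coded in the script.
    echo "pmcmd startworkflow -sv $INFA_SERVICE -d $INFA_DOMAIN" \
         "-uv INFA_USER -pv INFA_PASS -f $FOLDER -wait $WORKFLOW"
}

build_pmcmd_cmd
```

In a real wrapper the assembled command would be executed and its exit status checked, so that Autosys or Control-M can flag a failed load.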

Environment: Informatica Cloud/ PowerCenter 9.6, MDM 10.1, Oracle 11g, Agile, Erwin, Scheduler Autosys, SQL Server 2005/2008/2012, UNIX, Toad.

Confidential, Rocky hill, CT

Sr. Informatica Developer


  • Analyzed business requirements, technical specifications, source repositories, and physical data models for ETL mapping and process flow.
  • Coordinated and developed all documents related to ETL design and development.
  • Involved in designing the Data Mart models with ERwin using Star schema methodology.
  • Designed the ETL processes using Informatica to load data from Oracle, Flat Files (Fixed Width and Delimited) to staging database to the target Warehouse database.
  • Performed the data profiling and analysis making use of Informatica Data Quality (IDQ).
  • Performed profiling, matching, cleansing, parsing and redacting data using Informatica IDQ and implementing standards and guidelines.
  • Created data quality mappings in the Informatica Data Quality (IDQ) tool and imported them into Informatica PowerCenter as mappings and mapplets.
  • Used IDQ for data cleansing, data profiling, data quality measurement, data validation, and match/merge/de-duplication processing.
  • Worked extensively with mappings using expression, aggregator, filter, lookup, joiner, update strategy and Union, Sorter, Router, Sequence Generator, Rank transformations.
  • Developed mapping to load Fact and Dimension tables, for type 1 and type 2 dimensions (SCD) and incremental loading and unit tested the mappings.
  • Removed bottlenecks at source level, transformation level, and target level for the optimum usage of sources, transformations and target loads.
  • Responsible for debugging and performance tuning of targets, sources, mappings and sessions.
  • Extensively used Pre-SQL and Post-SQL scripts for loading the data into the targets as per requirement.
  • Developed a number of complex Informatica mappings, mapplets, and reusable transformations for different requirements involving customer information and monthly and yearly data loads.
  • Extensively worked on Mapping Variables, Mapping Parameters, Workflow Variables and Session Parameters.
  • Developed scripts to load data into EDW base tables and to move data from source through staging to target tables in the Teradata production and development warehouses using BTEQ, FastLoad, MultiLoad, and TPT.
  • Worked extensively with Informatica partitioning when dealing with huge data volumes, and partitioned Teradata tables for optimal performance.
  • Analyzed large volumes of data using appropriate tools and techniques, presenting results to meet both internal and external business needs.
  • Used Informatica workflow manager for creating, running the Batches and Sessions and scheduling them to run at specified time.
  • Developed complex SQL queries to develop the Interfaces to extract the data in regular intervals to meet the business requirements.
  • Scheduled jobs for running daily, weekly and monthly loads through control-M for each workflow in a sequence with command and event tasks.
  • Created UNIX shell scripts for automation of ETL processes.
  • Checked workflows and configuration files in and out of ClearCase.
  • Automated ETL workflows using the Control-M scheduler.
  • Conducted code review walkthroughs with team members.
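A minimal BTEQ sketch of the staging-to-EDW load step referenced above. The TDPID, credentials placeholder, and table names are hypothetical; in practice the password would be injected by the calling shell script rather than typed in the file.

```sql
.LOGON tdprod/etl_user,<password>;

/* Move the day's staged rows into the EDW base table
   (hypothetical database and table names). */
INSERT INTO edw.orders
SELECT * FROM stg.orders_daily;

/* Propagate a nonzero return code to the scheduler on failure. */
.IF ERRORCODE <> 0 THEN .QUIT 8;

.LOGOFF;
.QUIT 0;
```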

Environment: Informatica Power Center 9.6, Informatica IDQ, Oracle 9i/10g, Teradata 13, SQL developer, Toad, Control-M.

Confidential, Dallas, TX

Informatica developer


  • Gathered requirements and developed mappings, maintaining metadata in the repository, to load data from Oracle, SQL Server, and flat files into Oracle targets.
  • Extracted data from flat files (delimited and fixed-width, e.g., CSV and text files), XML files, Oracle 11g, and SQL Server, and loaded it into the Oracle data warehouse.
  • Interacted with subject matter experts and the data management team to gather the business rules for data cleansing.
  • Analyzed data against requirements, wrote techno-functional documents, and developed complex Informatica Data Quality (IDQ) mappings to remove noise from data using standardization, merge, match, case conversion, consolidation, and lookup transformations; performed unit testing of data accuracy.
  • Performed the data profiling and analysis making use of Informatica Data Quality (IDQ).
  • Used Informatica data quality (IDQ) in cleaning and formatting customer master data.
  • Integrated Informatica Data Quality (IDQ) with Informatica PowerCenter: created data quality mappings in the IDQ tool and imported them into PowerCenter as mappings and mapplets.
  • Developed standard and re-usable Transformation and Mapplet using various transformations like expression, aggregator, joiner, source qualifier, router, lookup Connected/Unconnected, Rank, union and filter.
  • Extracted the data from the flat files and other RDBMS databases into staging area and populated onto Data Warehouse.
  • Created connected and unconnected Lookup transformations to look up data from the source to ETL target tables.
  • Used Update Strategy and target load plans to load data into Type 2 dimensions (SCD) with Change Data Capture and MD5-based change detection.
  • Created multiple Type 2 mappings in the Customer mart for both Dimension as well as Fact tables, implementing both date based and flag based versioning logic.
  • Worked using Parameter Files, Mapping Variables, and Mapping Parameters for Incremental loading.
  • Identified mapping bottlenecks and improved session performance through better error handling.
  • Scheduled sessions to extract, transform, and load data into the warehouse database per business requirements.
  • Created pre-/post-session and SQL commands in sessions and mappings on the target instance.
  • Developed a number of mappings, mapplets, and reusable transformations to implement business logic and load data incrementally.
  • Tuning Informatica Mappings and Sessions for optimum performance.
  • Developed mappings by usage of Aggregator, SQL Overrides in Lookups, Source filter in Source Qualifier and data flow management into multiple targets using Router transformations.
  • Monitored and troubleshot batches and sessions for weekly and monthly extracts from various data sources across all platforms to the target database.
  • Tested the data and data integrity among various sources and targets. Associated with Production support team in various performances related issues.
  • Developed UNIX shell scripts to move source files to archive directory.
  • Involved in unit, integration, system, and performance testing.
  • Debugged, validated, and executed mappings; involved in technical reviews and error rectification.
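The date-based Type 2 versioning with MD5 change detection described above can be sketched as two plain SQL statements: expire the current version of any changed key, then insert a new current version for changed or brand-new keys. Table and column names are hypothetical, and the MD5 hash is assumed to be computed upstream in the mapping.

```sql
-- Expire the current version of rows whose incoming MD5 hash differs.
UPDATE dim_customer d
SET    d.eff_end_dt  = CURRENT_DATE,
       d.current_flg = 'N'
WHERE  d.current_flg = 'Y'
AND EXISTS (SELECT 1
            FROM   stg_customer s
            WHERE  s.cust_id  = d.cust_id
            AND    s.md5_hash <> d.md5_hash);

-- Insert a new current version for changed or brand-new keys
-- (changed keys no longer have a 'Y' row after the update above).
INSERT INTO dim_customer (cust_id, name, city, md5_hash,
                          eff_start_dt, eff_end_dt, current_flg)
SELECT s.cust_id, s.name, s.city, s.md5_hash,
       CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM   stg_customer s
LEFT JOIN dim_customer d
       ON d.cust_id = s.cust_id AND d.current_flg = 'Y'
WHERE  d.cust_id IS NULL;
```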

Environment: Informatica Power Center 9.1, Informatica IDQ, Oracle 10g, SQL Server, UNIX.

Confidential, Irving TX

Informatica Developer


  • Involved in full life cycle development including design, ETL strategy, troubleshooting, reporting, and identifying facts and dimensions.
  • Reviewed requirements with the business, performed regular follow-ups, and obtained sign-offs.
  • Worked on different Workflow tasks - sessions, Event Raise, Event Wait, Decision, Email, and Command - and on scheduling of the workflow.
  • Development of Informatica (ETL) mappings to load data into various target tables and defining ETL standards.
  • Created sessions and configured workflows to extract data from various sources, transform it, and load it into the data warehouse.
  • Moved data from source systems into different schemas based on the dimension and fact tables, using Slowly Changing Dimensions (SCD) Type 1 and Type 2.
  • Used Debugger to test the mappings and fixed the bugs.
  • Used various transformations like Filter, Expression, Sequence Generator, Source Qualifier, Lookup, Router, Rank, Update Strategy, Joiner, Stored Procedure and Union in Informatica mapping Designer.
  • Analyzed sources, requirements, and the existing OLTP system to identify the required dimensions and facts from the database.
  • Tuning Informatica Mappings and Sessions for optimum performance.
  • Developed various mappings using reusable transformations.
  • Prepared the required application design documents based on functionality required.
  • Designed the ETL processes using Informatica to load data from Oracle, Flat Files (Fixed Width and Delimited) to staging database and from staging to the target Warehouse database.
  • Responsible for monitoring all running, scheduled, completed, and failed sessions; debugged the mapping when a session failed.
  • Involved in unit and integration testing of Informatica sessions and batches, and fixed invalid mappings.
  • Developed and executed scripts to schedule loads, for calling Informatica workflows using PMCMD command.
  • Worked on Dimensional Data Modeling using Data modeling tool Erwin.
  • Populated Data Marts and did System Testing of the Application.
  • Built the Informatica workflows to load table as part of data load.
  • Wrote queries, procedures, and functions used in different application modules.
  • Implemented the best practices for the creation of mappings, sessions and workflows and performance optimization.
  • Created Informatica technical and mapping specification documents per company standards.
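Pre- and post-load stored procedures of the kind mentioned above often drop a fact-table index before a bulk load and rebuild it afterwards. A hedged Oracle PL/SQL sketch; the procedure, index, table, and column names are hypothetical.

```sql
-- Hypothetical pre/post-load helpers, called from session pre-/post-SQL
-- or via Stored Procedure transformations.
CREATE OR REPLACE PROCEDURE pre_load_fact IS
BEGIN
  -- Drop the index so the bulk insert does not maintain it row by row.
  EXECUTE IMMEDIATE 'DROP INDEX idx_fact_sales_dt';
END;
/

CREATE OR REPLACE PROCEDURE post_load_fact IS
BEGIN
  -- Rebuild the index once the load completes.
  EXECUTE IMMEDIATE
    'CREATE INDEX idx_fact_sales_dt ON fact_sales (sale_dt)';
END;
/
```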

Environment: Informatica PowerCenter 9.0, Oracle 10g, Toad, SQL Developer, UNIX.
