
Sr. Informatica Developer Resume

Eagan, MN


  • An accomplished IT Professional with 9+ years of experience in Data Warehousing and ETL processes.
  • Strong experience in designing, developing, and deploying highly complex Informatica applications.
  • Expertise in performing ETL operations with Informatica Power Center 10.0.x, 9.0.x and 8.0.x.
  • Good exposure to different databases such as Oracle, SQL Server, DB2, and Teradata.
  • Solid experience on Teradata utilities like BTEQ, FASTLOAD, MULTILOAD, TPUMP and FASTEXPORT.
  • Excellent knowledge of data warehouse design using Star Schema and Snowflake Schema architectures, Fact and Dimension tables, and OLTP and OLAP applications.
  • Experience in troubleshooting complex SQL queries and addressing production issues and performance tuning.
  • Solid understanding of Partitioning, Slowly Changing Dimensions, and Normalization/De-normalization concepts in DWH.
  • Experience with various data sources such as DB2, SQL Server, Oracle, XML, and fixed-width and delimited flat files.
  • Strong experience in writing UNIX Shell scripts, SQL Scripts for Development, Automation of ETL process, error handling and reporting purposes.
  • Experience with IDE/IDQ tools for Data Analysis / Data Profiling and Data Governance.
  • Experience working in agile methodology and ability to manage change effectively.
  • Excellent team player with leadership, analytical and interpersonal skills with ability to quickly adapt to new environments and technology needs.


Operating systems: Windows 7/8/XP/2000, UNIX and Linux

Data Warehousing: Informatica PowerCenter 10.0.x,9.0.x & 8.0.x, Informatica Power Exchange 8.6, Informatica Data Quality (IDQ)

Languages: SQL, PL/SQL, UNIX Shell Scripting

Dimensional Data Modeling: Data Modeling, Star Schema Modeling, Snowflake Modeling, FACT and Dimensions Tables

Programming GUI: UNIX Shell Scripting

Databases: Oracle, Microsoft SQL Server, T-SQL, Teradata and DB2

TOOLS: Toad, Control-M, GitHub, AutoSys, Jira, SharePoint, Mantissa


Confidential - Eagan, MN

Sr. Informatica Developer


  • Involved in understanding the existing business model, gathering requirements, and documenting business needs and process flow.
  • Involved in Dimensional Modeling (Star Schema) of the Data Warehouse; designed the business process, grain, dimensions, and measured facts.
  • Developed Prime member-centric claims mappings using Source Analyzer, Target Designer, Mapping Designer, Mapplet Designer, Transformations, Workflow Manager, and Workflow Monitor.
  • Developed claims-processing data mappings to make data more accessible to end users of downstream applications.
  • Developed the Prime member claims mappings for all steps using transformations such as Source Qualifier, Expression, Router, Filter, Aggregator, Lookup (Connected & Unconnected), and Sequence Generator, per business requirements.
  • Developed mappings to implement Slowly Changing Dimensions (Type 1 & 2).
  • Designed and developed UNIX Shell scripts to handle pre- and post-session processes and to validate incoming files.
  • Created Databases, Tables, Stored Procedures, DDL/DML Triggers, Views, User defined data types, functions, Cursors and Indexes using T-SQL.
  • Created UNIX scripts to execute SQL scripts.
  • Debugged mappings by creating logic that assigns a severity level to each error and sends error rows to an error table, where they can be corrected and reloaded into the target system.
  • Involved in performance tuning of Informatica mappings, workflows, and SQL queries/procedures.
  • Developed high-performance T-SQL queries using complex joins and advanced indexing techniques to optimize database operations.
  • Involved in Unit Testing and User Acceptance Testing (UAT) to verify that data extracted from the different source systems loaded accurately into the targets per user requirements.
  • Documented the purpose of each mapping to help personnel understand the process and incorporate changes as needed.
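The pre-session file validation described above can be sketched roughly as follows; the feed file name, delimiter, and column count are hypothetical placeholders, not the project's actual layout.

```shell
#!/bin/sh
# Hypothetical pre-session validation: confirm the incoming feed exists,
# is non-empty, and that every row has the expected delimited field count.
validate_feed() {
    feed="$1"
    expected_cols="$2"
    if [ ! -s "$feed" ]; then
        echo "FAIL: $feed missing or empty"
        return 1
    fi
    # Count rows whose pipe-delimited field count differs from the layout.
    bad=$(awk -F'|' -v n="$expected_cols" 'NF != n {c++} END {print c+0}' "$feed")
    if [ "$bad" -gt 0 ]; then
        echo "FAIL: $bad row(s) with wrong column count"
        return 1
    fi
    echo "OK: $feed validated"
}

# Demo against a small sample file (illustrative data only).
printf 'M001|A|2020-01-01|100.00|P\n' > /tmp/claims_feed.dat
validate_feed /tmp/claims_feed.dat 5
```

A script like this would run as the session's pre-session command, so a bad feed fails the workflow before any load begins.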

Environment: Informatica Power Center 9.6, Power Exchange 9.1, Oracle11g, SQL Server, T-SQL, Flat Files, Toad, SQL Worksheet, Erwin 4.0, AutoSys, Windows XP, UNIX, Quality Center

Confidential - Alpharetta, GA

ETL Informatica Developer


  • Involved in analyzing the requirements and created Design Documents and Data Mapping Documents.
  • Worked in an Agile Scrum methodology.
  • Prepared source to target data mapping and business rules for the ETL processes.
  • Wrote stored procedures, functions, and packages for moving data from the staging area to the Data Mart.
  • Developed Complex database objects like Functions, Stored Procedures, Triggers and Packages using PL/SQL and SQL.
  • Designed and developed ETL Mappings using Informatica to extract data from flat files and XML, and to load the data into the target database.
  • Created Procedures and Packages to automatically drop table indexes and create indexes for the tables.
  • Used the Aggregator transformation to load the summarized data for Finance departments.
  • Used Mapping Parameters and Variables to facilitate code reusability.
  • Involved in performance tuning by identifying the bottlenecks at source, target, mapping, session, and database level.
  • Wrote UNIX Shell scripts to handle pre- and post-session processes and to validate incoming files.
  • Optimized T-SQL queries and converted PL/SQL code to T-SQL.
  • Developed scripts for loading data into the base tables in the EDW using the FastLoad, MultiLoad, and BTEQ utilities of Teradata.
  • Used Teradata SQL Assistant to run SQL queries and validate the data.
  • Worked extensively with the Teradata FastLoad utility to load large tables for initial and truncate-and-load processes.
  • Imported Mapplets and mappings from Informatica developer (IDQ) to Power Center.
  • Used Metadata manager for validating, promoting, importing and exporting repositories from development environment to testing environment.
  • Maintained warehouse metadata, naming standards and warehouse standards for future application development.
  • Identified bugs in existing mappings using the Informatica Debugger.
  • Debugged mappings by creating logic that assigns a severity level to each error and sends error rows to an error table, where they can be corrected and reloaded into the target system.
  • Created data models for the staging area as well as the Data Warehouse (Star Schema and Snowflake Schema) database.
  • Performance-tuned ETL programs and maintained and enhanced ETL code.
  • Migrated development mappings to production environment.
  • Responsible for preparing test cases and technical unit testing for the developed reports.
  • Analyzed and applied new requirements to existing reports.
  • Continuously monitored the accuracy of the data and the content of the delivered reports.
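As an illustration of the BTEQ scripting mentioned above, a shell wrapper along these lines could generate and submit a simple validation script; the table name, TDPID, and logon are invented placeholders, and the actual bteq invocation is left commented out since it requires a Teradata client and credentials.

```shell
#!/bin/sh
# Sketch: generate a BTEQ script that counts rows in a staging table.
# EDW_STG.CLAIMS and the logon string are hypothetical placeholders.
TBL="EDW_STG.CLAIMS"
BTEQ_SCRIPT=/tmp/row_check.bteq

cat > "$BTEQ_SCRIPT" <<EOF
.LOGON tdprod/etl_user,etl_password;
SELECT COUNT(*) FROM ${TBL};
.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF

# bteq < "$BTEQ_SCRIPT"    # would submit the script to Teradata
echo "generated $BTEQ_SCRIPT"
```

Generating the control file from a template like this keeps table names and return codes consistent across the many load scripts an EDW typically accumulates.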

Environment: Informatica Power Center 9.6,9.5, Informatica Power Exchange, Data Mart, Windows 8, UNIX, Oracle, XML, Microsoft SQL Server, T-SQL, PL/SQL, Teradata14, TOAD, AutoSys, Control-M, SharePoint, Mantissa and GitHub

Confidential - Southfield, MI

Sr. Informatica Developer


  • Gathered both functional and technical requirements and documented them.
  • Involved in Data Analysis, Design, Development of ETL jobs and Unit Testing phases of the Project.
  • Worked with health payer-related data such as customers, policies, policy transactions, and claims.
  • Configured the ETL packages to load Monthly aggregate tables and running Month-end data load Process.
  • Monitored and implemented payer-related testing for medical claims, enabling a rapid one-week turnaround to production.
  • Created ETL packages to move data from source systems to Data Marts.
  • Implemented metadata for various technologies like Oracle, SQL Server and Informatica.
  • Involved in creation of mapping specification documents.
  • Involved in the development of PL/SQL stored procedures, functions and packages to process business data in OLTP system.
  • Captured the DQ metrics using the Profiles and Created scorecards to review data quality using IDQ.
  • Assisted in troubleshooting IDQ-related production support problems.
  • Set up Metadata driven utility design for the ETL processes using Informatica.
  • Used Informatica as the ETL tool to pull data from source systems/files, then cleanse, transform, and load it into Teradata using Teradata utilities.
  • Worked extensively on Informatica partitioning when dealing with huge data volumes, and partitioned tables in Teradata for optimal performance.
  • Worked on tuning of Teradata Queries using Explain plan.
  • Developed Informatica Mappings, Mapplets and Sessions for data loads and data cleansing.
  • Developed ETL (Extract, Transform, Load) mappings across the different stages.
  • Used designer debugger to test the data flow and fix the mappings.
  • Extracted data from Excel, flat files, and SQL Server, and applied business logic to load it into the central Microsoft SQL Server database.
  • Tuned Informatica Mappings and Sessions for optimum performance.
  • Designed workflows with many sessions with decision, assignment task, event wait, and event raise tasks.
  • Coordinated daily with the onsite-offshore team to develop source-to-target mappings and identify transformation rules.
  • Involved in Unit Testing and prepared documents for Test Results and Review Logs.
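Workflows like the monthly aggregate load above are typically kicked off from a scheduler via Informatica's pmcmd command-line utility; a minimal wrapper might look like the following, where the service, domain, folder, and workflow names are hypothetical placeholders.

```shell
#!/bin/sh
# Sketch: assemble a pmcmd startworkflow command for a scheduler to run.
# All names below are hypothetical placeholders, not the actual project's.
INT_SVC="IS_PROD"
DOMAIN="Domain_ETL"
FOLDER="CLAIMS_DM"
WORKFLOW="wf_monthly_aggregate_load"

# -uv/-pv tell pmcmd to read the user and password from environment
# variables rather than hard-coding credentials in the script.
CMD="pmcmd startworkflow -sv $INT_SVC -d $DOMAIN -uv PM_USER -pv PM_PASS -f $FOLDER -wait $WORKFLOW"
echo "$CMD"
# $CMD    # would actually start the workflow (requires the Informatica client)
```

Using -wait makes the scheduler job's exit status reflect the workflow outcome, so downstream jobs only run after a successful load.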

Environment: Informatica Power Center 9.5, DB2, Teradata, MS Visio, Windows XP, UNIX, XML, Flat Files, Oracle, Microsoft SQL Server, Informatica Data Quality(IDQ), PL/SQL, AutoSys, Jira, TOAD and GIT

Confidential - San Francisco, CA

ETL Developer


  • Analyzed the functional specs provided by the data architect and created technical specs documents for the mappings.
  • Created new Informatica Mappings with Source qualifier, Union, Aggregator, Connected & unconnected lookups, Filter, Update Strategy, Rank, Stored Procedure, Expression and Sequence Generator transformations while transforming the data.
  • Created Reusable transformations and Mapplets for use in Multiple Mappings.
  • Used session partitions, Dynamic cache memory and Index caches for improving performance of Informatica server.
  • Created numerous scripts with Teradata utilities BTEQ, MLOAD and FLOAD.
  • Worked with Teradata utilities like Teradata Parallel Transporter (TPT), BTEQ, Fast Load, Multi Load, TPump and Worked with work tables and Teradata Stored procedures.
  • Created multiple universes and resolved loops by creating table aliases and contexts.
  • Coordinated with customer in finding the sources and targets for data conversion.
  • Involved in the preparation of documentation for ETL standards, Procedures and Naming conventions as per ETL standards.
  • Used IDQ to profile the project source data, define or confirm the definition of the metadata, cleanse and accuracy check the project data, check for duplicate or redundant records, and provide information on how to proceed with ETL processes.
  • Used IDQ to complete initial data profiling and removing duplicate data.
  • Analyzed the source data with business users, developed critical mappings using Informatica Power Center to load the data from DB2 to Oracle.
  • Scheduled sessions and batches on the Informatica server.
  • Extracted data from SQL server Source Systems and loaded into Oracle Target tables.
  • Involved in the migration of existing ETL process to Informatica Power center.
  • Worked on troubleshooting complex SQL queries and addressing production issues and performance tuning.
  • Developed mappings to implement Slowly Changing Dimensions (Type 1 & 2).
  • Used Control-M as the scheduling tool for on-demand, hourly, cyclic, file-watcher, weekly, monthly, and annual jobs.
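The duplicate-record check that IDQ profiling performed can be approximated in shell for a quick sanity pass before a full profile; the member file and its key column here are assumptions for illustration.

```shell
#!/bin/sh
# Sketch: flag key values that occur more than once in a delimited feed.
# The member file and its layout are hypothetical sample data.
SRC=/tmp/members.dat
printf 'M001|SMITH\nM002|JONES\nM001|SMYTH\n' > "$SRC"

# Extract the key column, sort it, and keep only repeated values.
DUPS=$(cut -d'|' -f1 "$SRC" | sort | uniq -d)
echo "duplicate keys: ${DUPS:-none}"
```

A check like this catches gross duplication cheaply; the IDQ profile then handles fuzzier matches (name variants, address standardization) that plain key comparison misses.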

Environment: Informatica 9.5,9.1, SQL Server, Informatica Power Exchange, Informatica Data Quality(IDQ), Oracle, COBOL, XML, Workflow Manager, Workflow Monitor, Business Objects, Teradata, UNIX, Control-M, JIRA

Confidential - San Ramon, CA

Informatica Developer


  • Effectively interacted with various department heads for requirements gathering.
  • Maintained source and target mappings, transformation logic and processes to reflect the changing business environment over time.
  • Created the staging area according to the input file layout proposed in the design.
  • The project involved developing mappings for moving data from Flat Files to Staging (STG).
  • Involved in code review and prepared review checklist docs.
  • Developed standard and reusable mappings using various transformations such as Expression, Aggregator, Joiner, Source Qualifier, Router, and Lookup.
  • Created, optimized, reviewed, and executed SQL test queries to validate transformation rules used in source to target mappings/source views, and to verify data in target tables.
  • Extensively worked with Incremental Loading using Parameter Files, Mapping Variables and Mapping Parameters.
  • Actively participated in daily status calls with the client for onsite-offshore coordination, covering development status, issue logs, and defect tracking.
  • Loaded sample data in source system for unit testing, customizing the mappings and code review.
  • Worked with multiple sources such as Relational Databases, Flat files.
  • Involved in Performance documentation.
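Incremental loading via parameter files, as mentioned above, usually means regenerating a workflow parameter file before each run so the mapping can filter on the last-run date; a sketch follows, with the folder, workflow, and variable names invented for illustration.

```shell
#!/bin/sh
# Sketch: write an Informatica parameter file carrying the last-run date
# so the mapping can select only new/changed rows. Names are hypothetical.
PARAM_FILE=/tmp/wf_incr_load.prm
LAST_RUN=$(date '+%Y-%m-%d')

cat > "$PARAM_FILE" <<EOF
[CLAIMS_DM.WF:wf_incr_load]
\$\$LAST_RUN_DATE=$LAST_RUN
\$\$LOAD_TYPE=INCR
EOF

echo "wrote $PARAM_FILE"
```

The session then points at this file, and the Source Qualifier filter compares the change-date column against $$LAST_RUN_DATE.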

Environment: Informatica 8.6, DB2, Oracle, Windows XP, PL/SQL, UNIX, AutoSys and Perforce


ETL Developer


  • Extensively worked with Informatica tools - Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, Mapping Designer, Repository Manager, Workflow Manager, Workflow Monitor, Repository Server, and Informatica Server - to load data from flat files and legacy sources.
  • Performance tuning of SQL queries and stored procedures using SQL Profiler and Index Tuning Wizard.
  • Scheduled sessions and batches using Workflow Manager to load data into target tables.
  • Created Sessions, Tasks, Workflows, and Worklets using Workflow Manager.
  • Data modeling experience using Dimensional Data Modeling, Star Schema Modeling, Snow-Flake Modeling, FACT and Dimensions Tables, Physical and Logical Data Modeling.
  • Used SQL tools like TOAD to run SQL queries to view and validate the data loaded into the warehouse and troubleshoot data issues.
  • Developed various documentation, including onboarding, troubleshooting, and migration guides.
  • Involved in creating, monitoring, modifying, & communicating the project plan with other team members.
  • Responsible for providing production support and resolving issues.
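The TOAD-based load validation described above often reduces to a source-versus-target row-count reconciliation; a shell sketch of that check follows, with the file names and log format assumed for illustration.

```shell
#!/bin/sh
# Sketch: compare the source feed's record count against the row count
# reported by the load log. Paths and log format are hypothetical.
SRC=/tmp/src_feed.dat
LOG=/tmp/load.log
printf 'r1\nr2\nr3\n' > "$SRC"      # sample source feed
echo "rows_loaded=3" > "$LOG"       # sample load-log entry

SRC_CNT=$(wc -l < "$SRC" | tr -d ' ')
TGT_CNT=$(sed -n 's/^rows_loaded=//p' "$LOG")

if [ "$SRC_CNT" -eq "$TGT_CNT" ]; then
    echo "RECONCILED: $SRC_CNT rows"
else
    echo "MISMATCH: source=$SRC_CNT target=$TGT_CNT"
fi
```

In practice the target count would come from a SQL COUNT(*) against the warehouse table rather than a log line, but the reconciliation logic is the same.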

Environment: Informatica Power Center 8.5, Erwin 4.0, Oracle 9i, SQL, Crontab, MS PowerPoint, TOAD 7.5 and KUBERA
