
Informatica Developer Resume

Dover, NH


  • 8+ years of strong IT experience in analysis, design, development, and testing in client/server environments, with a focus on Data Warehousing applications using tools such as Informatica Power Center 9.x/8.x/7.x/6.x, Business Objects, and IDQ with Oracle and SQL Server databases.
  • Involved in Full Software Development Life Cycle (SDLC) involving Application Development, Data Modeling, Business Data Analysis and ETL/OLAP Processes.
  • Strong Data Warehousing ETL experience using Informatica 9.6.1/9.1/8.6.1/8.5/8.1/7.1 Power Center client tools - Mapping Designer, Repository Manager, Workflow Manager/Monitor - and server tools - Informatica Server, Repository Server Manager.
  • Experience in Microsoft Business Intelligence technologies like SQL Server Integration Services (SSIS), SQL Server Reporting Services (SSRS) and SQL Server Analysis Services (SSAS).
  • Experience in using ETL methodologies to support data extraction, data migration, data transformation, and loading using Informatica Power Center 9.x/8.x/7.x/6.x, IDQ, and Trillium.
  • Performed data profiling and analysis using Informatica Data Quality (IDQ).
  • Experience working on Informatica transformations like normalizer, Lookup, Source Qualifier, Expression, Aggregator, Sorter, Rank and Joiner.
  • Worked on the PROVIDER, CLAIMS, and MEMBER subject areas.
  • Experienced in designing the Conceptual, Logical and Physical data modeling using Erwin and ER Studio Data modeling tools.
  • Experienced in developing meta-data repositories.
  • Practical understanding of the Data modeling (Dimensional & Relational) concepts like Star-Schema Modeling, Snowflake Schema Modeling, Fact and Dimension tables.
  • Strong experience in Extraction, Transformation and Loading (ETL) of data from various sources into Data Warehouses and Data Marts using Informatica Power Center (Repository Manager, Designer, Workflow Manager, Workflow Monitor, Metadata Manager), Power Exchange, and Power Connect as ETL tools on Oracle, DB2, and SQL Server databases.
  • Closely worked with business users in order to develop reports according to user requirements.
  • Strong experience in Informatica Data Quality (IDQ), creating data profiles, custom filters, data cleansing and developing score cards.
  • Expertise in working with relational databases such as Oracle 11g/10g/9i/8i, SQL Server, DB2 8.0/7.0, UDB, MS Access, and Teradata, as well as NoSQL databases such as MongoDB.
  • Performed extensive Data profiling and analysis for detecting and correcting inaccurate data from the databases and to track data quality.
  • Strong experience in writing complex SQL Queries, Stored Procedures and Triggers.
  • Experience in Agile/Scrum, TDD, and BDD methodologies.
  • Experience in resolving on-going maintenance issues and bug fixes, monitoring Informatica sessions as well as performance tuning of mappings and sessions.
  • Hands-on experience across all stages of Software Development Life Cycle (SDLC) including business requirement analysis, data mapping, build, unit testing, systems integration and user acceptance testing.
  • Excellent problem solving and analytical skills, committed team player with multitasking capabilities.
  • Strong communication skills, both verbal and written, with an ability to express complex business concepts in technical terms.


Operating Systems: Linux, Unix, Windows 10/8/7/XP/NT, Windows Server 2008.

ETL Tools: Informatica Power Center 10.1/9.x/8.x/7.x (Designer, Workflow Manager, Workflow Monitor, Repository manager and Informatica Server), IDQ.

Databases: Oracle 11g/10g/9i/8i, MS SQL Server 2008/2005, DB2/UDB, MongoDB, Teradata.

Data Modeling Tools: Erwin, MS Visio, ER/Studio, Excel, SAS, R.

Reporting Tools: SQL Server Reporting Services (SSRS), SSIS, Tableau, and MicroStrategy.

Languages: Java, JavaScript, HTML, Perl, SQL, PL/SQL, UNIX shell scripting, C++, R.

Scheduling Tools: Autosys, Control-M.

Tools: Selenium, QTP, Win Runner, Load Runner, Quality Center, Test Director, TOAD


Confidential, Dover, NH

Informatica Developer


  • Loaded data from different sources to profile the data for the business.
  • Participated in meetings with the business team to prepare the Business Requirements Document (BRD).
  • Experienced in understanding Business Requirements Documents (BRDs), developing test plans, and defining test cases.
  • Prepared the design documents from the BRD; good knowledge of the Data Mapping Document (DMD).
  • Experienced in designing and developing complex mappings using varied transformation logic such as Unconnected and Connected Lookup, Router, Filter, Expression, Aggregator, Joiner, Update Strategy, and Normalizer transformations.
  • Well versed in writing the complex SQL queries, unions and multiple table joins and experience with Materialized Views.
  • Used Unix Shell Scripts to automate pre-session and post-session processes.
  • Performed performance tuning to improve data extraction, data processing, and load times.
  • Wrote complex SQL Queries involving multiple tables with joins.
  • Implemented best practices per standards while developing Informatica mappings based on the technical design documents.
  • Extensively used SQL scripts/queries for data verification at the backend after ETL Jobs load in development.
  • Implemented SCD Type I/II/III slowly changing dimensions.
  • Developed and updated Test plans, Test Scenarios and Test Cases for all products.
  • Involved in preparing the Technical design documents.
  • Fixed failed records through the UI by correcting the policy data in live tables.
  • Wrote stored procedures to load data into tables displayed in the UI.
  • Extracted data from various source systems like flat files, database tables.
  • Monitored the ETL workflows in production and enhanced existing production ETL scripts.
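As an illustration of the SCD Type 2 pattern mentioned above (a sketch only; the table and column names are invented, not taken from any engagement listed here), an expire-and-insert load can be written against SQLite as:

```python
import sqlite3

# Hypothetical SCD Type 2 dimension: each member keeps a history of address
# versions, with exactly one row flagged current.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""
    CREATE TABLE dim_member (
        member_id  TEXT,
        address    TEXT,
        eff_date   TEXT,
        end_date   TEXT,
        is_current INTEGER
    )""")
cur.execute("INSERT INTO dim_member VALUES ('M1', 'Dover, NH', '2015-01-01', '9999-12-31', 1)")

def scd2_upsert(cur, member_id, new_address, load_date):
    """Close out the current row if the address changed, then insert the new version."""
    cur.execute(
        "SELECT address FROM dim_member WHERE member_id=? AND is_current=1",
        (member_id,))
    row = cur.fetchone()
    if row and row[0] == new_address:
        return  # no change: nothing to do
    if row:
        # expire the current version
        cur.execute(
            "UPDATE dim_member SET end_date=?, is_current=0 "
            "WHERE member_id=? AND is_current=1",
            (load_date, member_id))
    # insert the new current version
    cur.execute(
        "INSERT INTO dim_member VALUES (?, ?, ?, '9999-12-31', 1)",
        (member_id, new_address, load_date))

scd2_upsert(cur, "M1", "Austin, TX", "2016-06-01")
history = cur.execute(
    "SELECT address, is_current FROM dim_member ORDER BY eff_date").fetchall()
# history holds both versions, with only the latest flagged current
```

By contrast, a Type 1 load would simply overwrite the address in place, and a Type 3 load would keep the prior value in an extra column.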

Environment: Informatica PowerCenter (Repository Manager, Designer, Workflow Monitor, Workflow Manager), Oracle 10g, SQL Server, Flat files, SQL, PL/SQL, UNIX Scripting, SOA Services, Web Service Transformation, XML Source and Target transformations.

Confidential, Austin, Texas

Informatica/ IDQ Developer


  • Worked in an Agile development environment and interacted with users and Business Analysts to collect and understand the business requirements.
  • Worked on building the ETL architecture and Source to Target mapping to load data into Data warehouse.
  • Involved in the installation and configuration of Informatica Power Center 9.6 and evaluated partitioning concepts in Power Center 9.6.
  • Developed various mappings using Mapping Designer and worked with Aggregator, Lookup, Filter, Router, Joiner, Source Qualifier, Expression, Stored Procedure, Sorter and Sequence Generator transformations.
  • Created stored procedures, views, user defined functions and common table expressions.
  • Generated underlying data for the reports through SSIS; exported cleaned data from Excel spreadsheets, text files, MS Access, and CSV files to the data warehouse.
  • Managed the Metadata associated with the ETL processes used to populate the Data Warehouse.
  • Involved in IDS services such as building business logic, analyzing structure and data quality, and creating a single view of the data.
  • Worked on Informatica cloud for creating source and target objects, developed source to target mappings.
  • Involved in importing the existing Power center workflows as Informatica Cloud Service tasks by utilizing Informatica Cloud Integration.
  • Involved in Data integration, monitoring, auditing using Informatica Cloud Designer.
  • Worked on Data Synchronization and Data Replication in Informatica cloud.
  • Written PL/SQL scripts, created stored procedures and functions and debugged them.
  • Created Mapplets and reusable transformations and used them in different mappings. Used Type 2 SCD mappings to update Slowly Changing Dimension tables.
  • Involved in Production Support by performing Normal Loads, Bulk Loads, Initial Loads, Incremental Loads, Daily loads and Monthly loads and Developed reports based on issues related to the data warehouse.
  • Used different Informatica Data Quality transformations in Developer and configured match properties: match paths, fuzzy match keys, and fuzzy and exact match columns.
  • Created profiles, rules, scorecards for data profiling and quality using IDQ.
  • Used Informatica Data Quality for address and name clean-up and developed error handling and data quality checks to pull out the right data.
  • Used IDQ to cleanse and accuracy-check the project data and to check for duplicate or redundant records.
  • Used the Debugger to test mappings and fix bugs, identified bottlenecks at all levels to tune performance, and resolved production support tickets using Remedy.
  • Developed monitoring scripts in UNIX and moved Data Files to another server by using SCP on UNIX platform.
  • Extensively used Teradata utilities such as FastLoad, MultiLoad, BTEQ, and FastExport.
  • Created Teradata external loader connections such as MLoad, Upsert and Update, and FastLoad while loading data into the target tables in the Teradata database.
  • Involved in creating tables in Teradata and setting up the various environments: DEV, SIT, UAT, and PROD.
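The fuzzy and exact match-column configuration described above can be sketched outside IDQ in plain Python; this is an illustration only, and the record values and 0.85 threshold are hypothetical, not taken from the project:

```python
from difflib import SequenceMatcher

# A match rule in the spirit of the IDQ configuration above: a fuzzy match
# on the name column combined with an exact match on the ZIP column.
def is_match(rec_a, rec_b, threshold=0.85):
    """Return True when names are similar enough and ZIP codes agree exactly."""
    name_score = SequenceMatcher(
        None, rec_a["name"].lower(), rec_b["name"].lower()).ratio()
    return name_score >= threshold and rec_a["zip"] == rec_b["zip"]

# Invented sample records: b is a near-duplicate of a; c shares the name
# but fails the exact ZIP match.
a = {"name": "Jonathan Smith", "zip": "03820"}
b = {"name": "Jonathon Smith", "zip": "03820"}
c = {"name": "Jonathan Smith", "zip": "78701"}
```

In IDQ itself the equivalent behavior comes from configuring fuzzy match keys and match-column weights in the Developer tool rather than writing code.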

Environment: Informatica Power Center 9.6, Oracle 12c, Informatica Cloud, IDS 9.6.1, IDQ 9.6.1, Teradata 14.0, SQL Server 2014, Teradata Data Mover, Autosys Scheduler Tool, Netezza, UNIX, Toad, PL/SQL, SSIS, Power Connect, DB2, Business Objects XI3.5.

Confidential, Jersey City, New Jersey

Informatica Developer


  • Designed and developed Informatica mappings in Informatica 9.5.1 environment, workflows to load data into Oracle ODS. Extensively used Informatica to load data from Oracle, XML and flat files to Oracle.
  • Extensively used Informatica functions LTRIM, RTRIM, IIF, DECODE, ISNULL, TO_DATE, and DATE_COMPARE in transformations.
  • Although the project followed the Waterfall methodology, we held daily scrum meetings, which I took the lead in conducting.
  • Extensively used Informatica data Quality, Informatica Power Center to create and manipulate source definitions, target definitions, mappings, Mapplets, transformations, re-usable transformations, etc.
  • Prepared all the design documents, consisting of the data flow from source to target, by mapping all the columns required for reporting purposes.
  • Handled complex mappings by modifying some of the core tables holding PetSmart customer data, as well as the sales tables involved in the batch load.
  • Created detailed and high-level data model diagrams to explain the flow of the data using the data modeling tool Erwin.
  • Worked extensively on understanding the business requirements and designed the logic to populate the data as expected.
  • Created DDL and DML scripts that have structure of new tables, and modifications of existing tables.
  • Built mappings, worklets, and workflows to load the data into the staging area and then into the DW tables.
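The staging-to-warehouse DDL/DML pattern described above can be sketched as follows; this is a minimal illustration with invented table names, using SQLite to stand in for the actual databases:

```python
import sqlite3

# DDL creates the staging and target tables; DML then moves only rows not
# yet present in the warehouse (a simple incremental load).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE stg_sales (sale_id INTEGER, amount REAL);
    CREATE TABLE dw_sales  (sale_id INTEGER PRIMARY KEY, amount REAL);
    INSERT INTO stg_sales VALUES (1, 19.99), (2, 5.00);
""")
cur.execute("""
    INSERT INTO dw_sales (sale_id, amount)
    SELECT s.sale_id, s.amount
    FROM stg_sales s
    WHERE s.sale_id NOT IN (SELECT sale_id FROM dw_sales)
""")
loaded = cur.execute("SELECT COUNT(*) FROM dw_sales").fetchone()[0]
```

In the Informatica builds above, the same movement is expressed as mappings and workflows rather than hand-written insert-selects, but the staging-then-warehouse sequencing is the same.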

Environment: Informatica 9.1/9.5, SQL Server, Oracle, Netezza, Tidal and JIRA.

Confidential, New York

Informatica/ MDM Developer


  • Upgraded Address Doctor in Power Center and Developer.
  • Configured ActiveVOS with the Hub and ensured that the default workflows were integrated.
  • Configured BE workflows with IDD.
  • Installed MDM in the Windows development environment.
  • Handled Address Cleansing before populating the data into landing tables.
  • Worked on ETL process for bringing in the data into MDM landing tables.
  • Worked on profiling the data using Developer tool/Analyst Tool for identifying the data integrity from different sources.
  • Most of the data was cleansed in Power Center/Developer before the data is placed in MDM landing tables.
  • Defined trusted scores for the source data based on the data quality analysis and discussions with stakeholders.
  • Developed validation rules based on the profiled data quality and data analysis.
  • Reached conclusions on key fields after discussions with those knowledgeable about the data.
  • Defined Match rules in Match and Merge settings of the base tables by creating Match Path Components, Match Columns and Rule sets.
  • Configured match rule set filters for meeting the different data scenarios.
  • Performed match/merge and ran match rules to check the effectiveness of MDM on data and fine-tuned the match rules.
  • Developed ad hoc queries, such as Execute Batch Delete SOAP requests, for deleting specific data from the underlying tables concerned.
  • Developed an Unmerge user exit for reprocessing some of the records that needed to be processed differently.
  • Closely worked with Data Steward Team for designing, documenting and configuring Informatica Data Director.
  • Used ActiveVOS for configuring workflows like One step approval, merge and unmerge tasks.
  • Configured static lookups, dynamic lookups, bulk uploads, extended search and Smart search in IDD.
  • Configured JMS Message Queues and appropriate triggers for passing the data to the contributing systems.
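The trust-score survivorship behind the match/merge work above can be sketched as follows; this is an invented illustration (the source system names and scores are made up), not the actual MDM configuration:

```python
# Hypothetical trust scores per contributing source system: when merged
# records disagree on a field, the value from the most-trusted source wins.
TRUST = {"CRM": 80, "BILLING": 60, "LEGACY": 40}

def survive(values):
    """values: list of (source, value) pairs; return the most-trusted value."""
    return max(values, key=lambda sv: TRUST.get(sv[0], 0))[1]

# Two sources disagree on an address; the CRM value survives the merge.
best = survive([("LEGACY", "123 Old Rd"), ("CRM", "456 New Ave")])
```

In Informatica MDM the same decision is driven by the trust settings on base-object columns rather than code, with decay rules layered on top.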

Environment: Multi-Domain MDM 10.0, IDD, Oracle 11g, Oracle PL/SQL, Windows Application Server, Active VOS, Informatica PowerCenter 10.1, Informatica Developer, Address Doctor 5.1, PowerShell.


Informatica/ ETL Developer


  • Actively involved in gathering business requirements and performing GAP analysis.
  • Translated Functional requirements into Technical specification documents.
  • Developed ETL mapping spreadsheet and Fact-Dimension matrix prior to performing the ETL process.
  • Involved in data modeling for the GL Data Mart to bring various sets of books, assets, and balances into the warehouse.
  • Used complex transformations such as Connected and Unconnected Lookup, Joiner, Router, Filter, and Stored Procedure transformations for data loads.
  • Worked with memory management for the best throughput of sessions containing Rank, Lookup, Joiner, Sorter, and Aggregator transformations, and was involved in pipeline partitioning.
  • Debugged and implemented best practices in mappings, sessions, and workflows for data extraction and loading into Slowly Changing Dimensions Type 1 and Type 2.
  • Created test plan cases and performed Unit, Volume and Performance tests.
  • Created and scheduled Informatica workflows using crontab.
  • Imported physical tables and created the required aliases, views, joins in the Physical layer of Siebel Analytics Admin tool.
  • Developed the facts, dimensions, necessary hierarchies and the joins between the logical tables, added aggregates in the BMM layer.
  • Implemented various performance tuning mechanisms by performing Query Optimization, Hints, running advice plans and explain queries on indexes and creating Materialized Views to execute the prompts on the Dashboards faster.

Environment: Informatica power center, IDQ, Microsoft Visio 2003, Oracle 8i/9i, PL/SQL, UNIX, Windows NT/2000 and Siebel.


SQL - ETL Developer


  • Gathered functional and business requirements and wrote the technical specifications for building the data model.
  • Created new database objects like Procedures, Functions, Packages, Triggers, Indexes and Views using T-SQL in SQL Server 2005/2008.
  • Created and executed several SSIS packages to perform ETL operations of the data from source server to destinations server and OLTP to OLAP.
  • Experience in creating complex SSIS packages using proper control and data flow elements with Error Handling.
  • Experience in providing Logging, Debugging and Error handling by using Event Handlers, and Custom Logging, break point, data viewers, check points for SSIS Packages.
  • Involved in complete SSIS life cycle in creating SSIS packages, building, deploying and executing the packages in both the environments (Development and Production).
  • Transferred data from the database to the data warehouse using SSIS ETL packages with transformations such as Lookup, Union All, and Multicast.
  • Involved in monitoring and tuning report performance by analyzing the execution plan of the reports.
  • Developed back-end PL/SQL packages and built UNIX shell scripts for data migration and batch processing.
  • Built, published, and scheduled SSRS reports for both the Dev and Test environments.
  • Performed database backups and troubleshot production database, cube sync processing, and SSAS issues.
  • Created a Star Schema for OLAP cube generation and worked extensively on dimensional modeling for Analysis Services using SSAS.
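The SSIS error-handling pattern above, redirecting bad rows to an error output rather than failing the whole load, can be sketched in plain Python; the field names and sample rows are invented for illustration:

```python
# Mimics an SSIS data flow with an error output: rows that fail type
# conversion are captured with their error instead of aborting the package.
def run_etl(rows):
    good, errors = [], []
    for row in rows:
        try:
            good.append({"id": int(row["id"]), "amount": float(row["amount"])})
        except (KeyError, ValueError) as exc:
            errors.append({"row": row, "error": str(exc)})
    return good, errors

good, errors = run_etl([
    {"id": "1", "amount": "10.50"},   # clean row -> main output
    {"id": "x", "amount": "oops"},    # bad row   -> error output
])
```

In SSIS itself this corresponds to setting a component's error-row disposition to "Redirect row" and logging via event handlers, as the bullets above describe.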

Environment: SQL Server 2008/2005, SSIS, SSRS, T-SQL, Crystal Reports 7/8, Windows 2008 Advance Server, VB.NET, MS Excel, and MS Office.


Junior SQL Developer


  • Completed analysis, requirement gathering, and functional design document creation.
  • Collaborated with the application developers in data modeling and E-R design of the systems.
  • Created and managed schema objects such as tables, views, indexes, and referential integrity constraints depending on user requirements.
  • Used DDL and DML commands for writing triggers, stored procedures and data manipulation.
  • Involved in tuning system stored procedures for better performance.
  • Involved in data modeling and the physical and logical design of the database.
  • Configured Exchange Server to send automatic mails to the responsible people when a job failed or succeeded.
  • Worked with the application developers and provided the necessary SQL scripts using SQL and PL/SQL.
  • Responsible for managing database performance, backups, replication, capacity, and security.
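The trigger work above can be illustrated with a minimal audit trigger; this is a sketch only, using SQLite, and the table and column names are hypothetical:

```python
import sqlite3

# An AFTER UPDATE trigger that records every balance change in an audit
# table, a common use of the DDL/DML trigger writing described above.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL);
    CREATE TABLE audit_log (account_id INTEGER, old_balance REAL, new_balance REAL);
    CREATE TRIGGER trg_accounts_update AFTER UPDATE ON accounts
    BEGIN
        INSERT INTO audit_log VALUES (OLD.id, OLD.balance, NEW.balance);
    END;
    INSERT INTO accounts VALUES (1, 100.0);
    UPDATE accounts SET balance = 150.0 WHERE id = 1;
""")
logged = cur.execute("SELECT * FROM audit_log").fetchone()
```

On SQL Server the equivalent T-SQL trigger would read the `inserted` and `deleted` pseudo-tables instead of `OLD`/`NEW` row references.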

Environment: MS SQL Server 2008, T-SQL, SQL Server Management Studio (SSMS), SQL Profiler, Visual Source Safe (VSS), Windows XP.
