
Sr. Informatica Developer Resume


  • 8+ years of experience in the IT industry across data analysis and data modeling, using ETL tools such as Informatica PowerCenter 10.2/9.x/8.x (Source Analyzer, Mapping Designer, Mapplet Designer, Transformation Developer, Warehouse Designer, Repository Manager, and Workflow Manager) and Informatica Power Exchange 9.6.2/9.5.1.
  • Expert in all phases of Software development life cycle (SDLC) - Project Analysis, Requirements, Design Documentation, Development, Unit Testing, User Acceptance Testing, Implementation, Post Implementation Support and Maintenance.
  • Worked with non-relational sources such as flat files, XML files, and mainframe files, as well as relational sources such as Confidential, SQL Server, Azure SQL Data Warehouse, Confidential, Salesforce, Snowflake, and DB2.
  • Expertise in using methodologies for data extraction, transformations and loading processing in corporate-wide ETL solutions using Informatica PowerCenter 10.x/9.x/8.x/7.x on UNIX and Windows platform.
  • Worked extensively on data migration, data cleansing, and the extraction, transformation, and loading of data from multiple sources into the data warehouse, effectively using PowerCenter and Power Exchange.
  • Experience working with Azure SQL Data Warehouse and Snowflake integration using IICS.
  • Experience setting up the Enterprise Data Catalog (EDC v10.2x) tool on the Linux platform, capturing detailed PowerCenter mapping transformation information in the catalog, and implementing data governance in Informatica EDC based on data validation performed in IDQ.
  • Instrumental in establishing standard ETL naming conventions and best practices throughout the ETL process (transformations, sessions, mappings, workflow names, log files, bad files, and input, variable, and output ports).
  • Complete understanding of exact and fuzzy matching logic and deduplication limitations in the IDQ suite.
  • Resolved Informatica performance tuning issues at the source, target, mapping, transformation, and session levels, and fine-tuned transformations to improve session performance.
  • Experience in implementing the complex business rules by creating re-usable transformations, developing complex Mapplets and Mappings, PL/SQL Stored Procedure and Triggers.
  • Database experience using Confidential advanced concepts such as stored procedures, functions, packages, and complex SQL relating to data integration.
  • Experience in Creating ETL Design Documents, strong experience in complex PL/SQL packages, functions, cursors, indexes, views, materialized views.
  • Excellent communication, presentation, project management skills, a very good team player and self-starter with ability to work independently and as part of a team.
  • Extensive experience in UNIX Shell Scripting, AWK and file manipulation techniques.
  • Demonstrated ability in defining project goals and objectives, prioritizing tasks, developing project plans and providing framework for effective communication while maximizing responsiveness to change.
  • Possess leadership, problem solving abilities, good analytical skills, excellent verbal and written communication and good interpersonal skills with an ability to work as a committed individual and motivated team player.
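The fuzzy-matching and deduplication work mentioned above can be sketched outside IDQ with Python's difflib; the similarity threshold, sample values, and first-wins survivorship rule are illustrative assumptions, not IDQ's actual matching algorithm:

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Similarity ratio in [0, 1] between two normalized strings."""
    return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()

def dedupe(records, threshold=0.9):
    """Keep the first record of each group of near-duplicates (assumed rule)."""
    kept = []
    for rec in records:
        if not any(similarity(rec, k) >= threshold for k in kept):
            kept.append(rec)
    return kept

rows = ["Acme Corp", "ACME Corp.", "Globex Inc", "Acme  Corp"]
print(dedupe(rows))  # near-duplicate Acme variants collapse to the first one
```

Production matching in IDQ also weighs multiple fields and match strategies; this sketch shows only the single-field fuzzy-threshold idea.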


Data Warehousing: Star Schema & Snowflake Schema, Data Warehouse & Data Marts

Informatica Tool Set: Informatica PowerCenter 10.2, 9.5.1, 9.1.1, 8.6.1, Informatica Data Quality 9.6.1, 9.5.1, Informatica Power Exchange 9.6.2, 9.5.1

Databases: Confidential 12c, 11i, 10g, 9i, 8g, Microsoft SQL Server 2008/2012/2016, Confidential V13, Azure SQL Data Warehouse, Salesforce, Snowflake

Operating Systems: Windows, UNIX

Job Schedulers: Tivoli (TWS), Tidal Enterprise Scheduler, Control M, Informatica Scheduler

Scripting: Unix, Perl, Batch

Software Development Methodologies: Agile & Waterfall



Sr. Informatica Developer


  • Collaborated with the Business Analyst team to gather and analyze requirements, providing design solutions per end-user requirements.
  • Developed a standard ETL framework to enable the reusability of similar logic across the board. Involved in System Documentation of Dataflow and methodology.
  • Built application software, prepared unit test plans, created test case scenarios, and documented test results for all development tasks.
  • Good performance tuning skills, such as identifying poorly performing SQLs, functions and procedures, making recommendations for improvements to the stored procedures and upon approval, making modifications, testing results, and validating performance improvements.
  • Worked with Dimensional Data warehouses in Star and Snowflake Schemas, Slowly changing dimensions and created slowly growing target mappings, Type1/2/3 dimension mappings.
  • Worked on complex Confidential SQL involving DDL, DML, Confidential functions, PL/SQL blocks, and more.
  • Experience and knowledge in data archiving and data masking using the Informatica ILM tool with PowerCenter.
  • Extracted existing sales data from Salesforce objects and integrated the data into Confidential warehouse using Powercenter.
  • Integrated warehouse data from Confidential semantic layer into downstream system in form of extracts/files and directly integrated into Salesforce to update information from warehouse into SFDC objects.
  • Created reusable transformations and Mapplets in the Designer using the Transformation Developer and Mapplet Designer tools. Built a reusable Expression transformation to cleanse the mainframe files sourced through the Informatica PowerExchange process into the ODS layer.
  • Used slowly changing dimensions SCD II and SCD III as per the requirement analysis.
  • Extensively used Router, Aggregator, Lookup, Source Qualifier, Joiner, Expression, and Sequence Generator transformations to extract data in compliance with the business logic developed.
  • Solid experience in debugging and troubleshooting Sessions using the Debugger and Monitor.
  • Implemented various loads such as daily, weekly, and quarterly loads using an incremental loading strategy.
  • Fixed project defects, provided solutions for developers to execute, and created migration documents for releases.
  • Documented handbook of standards for Informatica code development.
  • Reviewed Informatica ETL mappings/workflows and SQL that populates EDW and data mart Dimension and Fact tables to ensure accuracy of business requirements.
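As a rough sketch of the SCD Type 2 logic referenced above (expire the current version, insert the new one), implemented here in plain Python; the column names, the fixed "today" date, and the open-ended 9999-12-31 end date are illustrative assumptions:

```python
from datetime import date

HIGH_DATE = date(9999, 12, 31)  # assumed open-ended end date for current rows

def apply_scd2(dim_rows, incoming, today):
    """Type 2 merge: expire the current version of a changed key, append a new one.

    dim_rows: list of dicts with keys: key, attr, start_date, end_date, current
    incoming: dict with keys: key, attr
    """
    for row in dim_rows:
        if row["key"] == incoming["key"] and row["current"]:
            if row["attr"] == incoming["attr"]:
                return dim_rows          # unchanged: nothing to do
            row["end_date"] = today      # expire the old version
            row["current"] = False
    dim_rows.append({"key": incoming["key"], "attr": incoming["attr"],
                     "start_date": today, "end_date": HIGH_DATE, "current": True})
    return dim_rows

dim = [{"key": 1, "attr": "Livonia", "start_date": date(2020, 1, 1),
        "end_date": HIGH_DATE, "current": True}]
dim = apply_scd2(dim, {"key": 1, "attr": "Chicago"}, today=date(2024, 1, 1))
print([(r["attr"], r["current"]) for r in dim])
```

In PowerCenter the same pattern is typically built with a Lookup on the dimension plus an Update Strategy; the sketch only shows the row-versioning rule itself.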

Environment: Informatica PowerCenter 10.1/9.6.1 (PowerCenter Repository Manager, Designer, Workflow Manager, and Workflow Monitor), Informatica TDM, Power Exchange, T-SQL, Confidential, DB2 8.1, XML, Snowflake, Autosys, Confidential 11g, PL/SQL.

Confidential, Livonia, MI

Sr. Informatica Developer


  • Developed a file loading process for the Claim Detail file that sources data from a flat file and runs it through ETL validation (a critical data element check ensuring that specified fields contain valid values). Once the file passes the check, it is loaded into the database and a series of notification emails informs users of the file load status.
  • Developed Informatica mappings, re-usable transformations, re-usable mappings and Mapplets for data load to data warehouse.
  • Worked on 835 and 837 file formats for business processing.
  • Created sessions, database connections and batches using Informatica Workflow Manager.
  • Extracted data from flat files, Excel files, and SQL Server, and applied complex Joiner, Expression, Aggregate, Lookup, Stored Procedure, Filter, Router, and Update Strategy transformations to load data into target systems.
  • Migrated Informatica objects and database objects to the integration environment and scheduled them using Tidal Enterprise Scheduler.
  • Monitored ETL jobs/schedules and fixed bugs.
  • Hands-on experience with the JIRA ticketing system and TortoiseSVN for maintaining code under Subversion version control.
  • Set up batches for large volumes of data and created sessions to schedule loads at the required frequency using the PowerCenter Workflow Manager.
  • Handled Type 2 slowly changing dimensions to populate current and historical data into dimension and fact tables in the data warehouse. Based on the logic, used Source Qualifier, Normalizer, Expression, Filter, Router, Update Strategy, Sorter, Lookup, Aggregator, Joiner, and Stored Procedure transformations in the mappings.
  • Extensively used transformations such as Source Qualifier, Expression, Lookup, Sequence Generator, Aggregator, Update Strategy, and Joiner while migrating data from heterogeneous sources such as Confidential, OWB, DB2, XML, and flat files to Confidential.
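A minimal sketch of the critical-data-element check described above for the Claim Detail feed; the required field names, CSV layout, and sample data are illustrative assumptions:

```python
import csv
import io

REQUIRED = ["claim_id", "member_id", "service_date"]  # assumed critical fields

def validate_claim_file(text: str):
    """Split a delimited claim file into loadable rows and per-row errors.

    A row fails the critical data element check if any required field is
    missing or blank; only passing rows would be loaded to the database.
    """
    good, errors = [], []
    for i, row in enumerate(csv.DictReader(io.StringIO(text)), start=1):
        missing = [f for f in REQUIRED if not (row.get(f) or "").strip()]
        if missing:
            errors.append(f"row {i}: missing {', '.join(missing)}")
        else:
            good.append(row)
    return good, errors

sample = "claim_id,member_id,service_date\nC1,M1,2023-01-05\nC2,,2023-01-06\n"
good, errors = validate_claim_file(sample)
print(len(good), errors)  # one valid row; row 2 fails on member_id
```

The notification-email step would then report `errors` (or a clean-load message) to the file's consumers.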

Environment: Informatica PowerCenter 9.6/8.6.1/8.5, SQL Server 2005, IDQ, T-SQL, XML.

Confidential, California

Sr. Informatica Developer


  • Developed a standard ETL framework to enable the reusability of similar logic across the board.
  • Involved in System Documentation of Dataflow and methodology.
  • Extensively developed Low level Designs (Mapping Documents) by understanding different source systems.
  • Designed complex mappings, sessions and workflows in Informatica PowerCenter to interact with MDM and EDW.
  • Design and develop mappings to implement full/incremental loads from source system.
  • Design and develop mappings to implement type1/type2 loads.
  • Responsible for ETL requirement gathering and development with end to end support.
  • Responsible for coordinating the database changes required for ETL code development.
  • Responsible for migrating ETL code, database code changes, and scripting changes to higher environments.
  • Responsible for supporting the code in the production and QA environments.
  • Developed complex IDQ rules usable in both batch and online modes.
  • Extensively used transformations like router, lookup, source qualifier, joiner, expression, sorter, XML, Update strategy, union, aggregator, normalizer and sequence generator.
  • Created reusable mapplets, reusable transformations and performed Unit tests over Informatica code.
  • Responsible for providing daily status report for all Informatica applications to customer, Monitoring and tracking the critical daily applications & code migration during deployments.
  • Responsible for reloads of Informatica applications data in production and closing user tickets and incidents.
  • Identify performance issues and bottlenecks.
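The full/incremental load pattern mentioned above can be sketched as a watermark-driven delta filter; the `updated_at` column name and sample timestamps are illustrative assumptions:

```python
from datetime import datetime

def incremental_extract(rows, last_watermark):
    """Select only rows modified after the previous run's watermark.

    Returns the delta plus the new watermark to persist for the next run;
    if nothing changed, the old watermark is carried forward.
    """
    delta = [r for r in rows if r["updated_at"] > last_watermark]
    new_watermark = max((r["updated_at"] for r in delta), default=last_watermark)
    return delta, new_watermark

source = [
    {"id": 1, "updated_at": datetime(2023, 5, 1)},
    {"id": 2, "updated_at": datetime(2023, 5, 3)},
    {"id": 3, "updated_at": datetime(2023, 5, 6)},
]
delta, wm = incremental_extract(source, last_watermark=datetime(2023, 5, 2))
print([r["id"] for r in delta], wm)  # only rows 2 and 3 are re-extracted
```

In PowerCenter this is commonly done with a mapping variable holding the watermark and a source-qualifier filter; a full load is simply the same extract with the watermark reset to the epoch.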

Environment: Informatica PowerCenter 8.6/8.1.1, Informatica DT Studio 8.6, IDQ, ERwin 3.5, Oracle 10g, SQL*Loader.

Confidential, Chicago, IL

Informatica Developer


  • Lead and/or participate in gathering and evaluating requirements, working with application / Data Warehouse team and project managers to provide solutions to end users.
  • Developed technical design and reporting solutions to influence business results, and oversaw project performance throughout the life cycle, from initiation to completion.
  • Proficient in translating users’ statements of needed system behavior and functionality into business and functional requirements.
  • Involved in data modeling through the use of ER, star schema and dimensional modeling.
  • Excellent understanding of OLTP/OLAP System Study, Analysis and developing Database Schemas like Star and Snowflake schema. Exposure to Reporting tools OBIEE, BI Publisher.
  • Developing ETL mappings from the given requirements and unit testing them accordingly.
  • Creating Technical Design Documents from Business Requirements.
  • Created batch scripts for various project requirements, such as file validation, moving files from SharePoint to the Informatica server, and archiving files with date-time stamps.
  • Performance-tuned mappings, sources, targets, and transformations by optimizing caches for Lookup, Joiner, Rank, Aggregator, and Sorter transformations. Tuned Informatica session performance for data files by increasing buffer block size, data cache size, and sequence buffer length, and used an optimized target-based commit interval and pipeline partitioning to speed up mapping execution.
  • Reviewing Informatica ETL mappings/workflows and SQL that populates data warehouse and data mart Dimension and Fact tables to ensure accuracy of business requirements.
  • Created Informatica source and target instances and maintained shared folders so that shortcuts could be used across the project.
  • Responsible for Unit Testing and Integration testing of mappings and workflows.
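The archive-with-timestamp step from the batch scripts above can be sketched as follows; the file names, directory layout, and timestamp format are illustrative assumptions:

```python
import shutil
import tempfile
from datetime import datetime
from pathlib import Path

def archive_file(src: Path, archive_dir: Path, now=None) -> Path:
    """Move src into archive_dir, appending a date-time stamp to its name."""
    now = now or datetime.now()
    stamp = now.strftime("%Y%m%d_%H%M%S")
    archive_dir.mkdir(parents=True, exist_ok=True)
    dest = archive_dir / f"{src.stem}_{stamp}{src.suffix}"
    shutil.move(str(src), str(dest))
    return dest

# Demo in a throwaway directory with a fixed timestamp for reproducibility.
work = Path(tempfile.mkdtemp())
f = work / "claims.csv"
f.write_text("claim_id\nC1\n")
dest = archive_file(f, work / "archive", now=datetime(2023, 5, 1, 12, 0, 0))
print(dest.name)
```

The equivalent Windows batch or UNIX shell version would build the same `name_YYYYMMDD_HHMMSS.ext` target and `move`/`mv` the processed file into the archive folder.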

Environment: Informatica PowerCenter 9.6.1, Confidential 11g, SQL Server 2012, flat files, XML files, Informatica MFT, Informatica Address Doctor.
