
Sr. ETL Informatica PowerCenter Engineer Resume

Titusville, NJ


  • 8+ years of experience in Information Technology as an Informatica Developer, with a strong background in ETL and data warehousing using Informatica PowerCenter 10/9.5x/9.x/8.x/7.x.
  • Good experience in Informatica Installation, Migration and Upgrade Process.
  • Experience using Informatica PowerCenter client tools: Source Analyzer, Transformation Developer, Mapplet Designer, Mapping Designer, Workflow Manager, Workflow Monitor and Repository Manager.
  • Experience in integration of various data sources like SQL Server, Oracle, Flat Files, and XML files.
  • Experience in developing XML/XSD/XSLT as part of source XML files for Informatica, as well as input XML for web service calls.
  • Proficient knowledge and hands - on experience in building Data Warehouses, Data Marts, Data Integration, Operational Data Stores and ETL processes.
  • Good exposure to Teradata DBA utilities: Teradata Manager, Workload Manager, Index Wizard, Stats Wizard and Visual Explain.
  • Good knowledge of Dimensional Data Modeling, ER Modeling, Star Schema/Snowflake Schema, FACT and Dimensions Tables, Physical and Logical Data Modeling.
  • Expertise in design and implementation of Slowly Changing Dimensions (SCD) type1, type2, type3.
  • Experienced in loading data, troubleshooting, debugging mappings, and performance tuning of Informatica objects (sources, targets, mappings and sessions); fine-tuned transformations to improve session performance.
  • Expertise in Master Data Management concepts, Methodologies and ability to apply this knowledge in building MDM solutions.
  • Experience in several facets of MDM implementations, including data profiling, data extraction, data validation, data cleansing, data matching, data loading, data migration and trust score validation.
  • Experienced in installing, managing and configuring Informatica MDM core components such as Hub Server, Hub Store, Hub Cleanse, Hub Console, Cleanse Adapters and Hub Resource Kit.
  • Database experience using … Teradata, MS SQL Server … and MS Access.
  • Experience in UNIX Operating System and Shell scripting.
  • Working knowledge of data warehouse techniques and practices, including the ETL process, dimensional data modeling (Star Schema, Snowflake Schema, Fact & Dimension Tables), OLTP and OLAP.
  • Worked on data profiling using IDE (Informatica Data Explorer) and IDQ (Informatica Data Quality) to examine different patterns of source data. Proficient in developing Informatica IDQ transformations like Parser, Classifier, Standardizer and Decision.
  • Worked on MDM Hub configurations - Data modeling & Mappings, Data validation, Match and Merge rules, Hierarchy Manager, customizing/configuring Informatica Data Director (IDD)
  • Experience in data mart life cycle development, performed ETL procedure to load data from different sources into Data marts and Data warehouse using Informatica Power Center.
  • Used the Debugger in Informatica PowerCenter Designer to check for errors in mappings.
  • Experience with creating profiles, rules, scorecards for data profiling and quality using IDQ.
  • Defined data cleansing rules for MDM projects.
  • Experience includes working with healthcare and pharmaceutical organizations.
  • Experience in writing complex SQL queries. Experience in performance tuning the HiveQL and Pig scripts.
  • Experience in working with Oracle, Netezza databases. Experience working on Hadoop using Hive database (HUE). Experience in integration of data sources like Oracle 11G and Flat Files.
  • Created Views in Hive database to load into Hive and Netezza databases.
  • Highly proficient in using T-SQL for developing complex Stored Procedures, Triggers, Tables, Views, User defined Functions, User profiles, Relational Database Models and Data Integrity, SQL joins, indexing and Query Writing
  • Experience using Informatica utilities like pushdown optimization and partitioning; implemented Slowly Changing Dimensions Type 1 and Type 2 methodology to retain the full history of accounts and transaction information.
  • Excellent skills in fine tuning the ETL mappings in Informatica.
  • Extensive experience using database tools such as SQL*Plus, SQL Developer and TOAD, as well as the Autosys job scheduler.
  • Built effective working relationships with client teams to understand support requirements and manage client expectations.
  • Excellent communication, presentation, project management skills, a very good team player and self-starter with ability to work independently and as part of a team.
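The Slowly Changing Dimension Type 2 work mentioned above follows a standard pattern: expire the current dimension row and insert a new version, so the full history stays queryable. A minimal sketch of that logic in Python with SQLite (the `dim_customer` table, its columns and the sample values are hypothetical illustrations, not from any project described here):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
# Hypothetical customer dimension with SCD Type 2 tracking columns.
cur.execute("""CREATE TABLE dim_customer (
    customer_id INTEGER, city TEXT,
    eff_date TEXT, end_date TEXT, is_current INTEGER)""")
cur.execute("INSERT INTO dim_customer VALUES (1, 'Trenton', '2020-01-01', '9999-12-31', 1)")

def apply_scd2(cur, customer_id, new_city, load_date):
    """Expire the current row, then insert a new current version."""
    cur.execute("""UPDATE dim_customer
                   SET end_date = ?, is_current = 0
                   WHERE customer_id = ? AND is_current = 1""",
                (load_date, customer_id))
    cur.execute("INSERT INTO dim_customer VALUES (?, ?, ?, '9999-12-31', 1)",
                (customer_id, new_city, load_date))

apply_scd2(cur, 1, "Titusville", "2021-06-15")
rows = cur.execute(
    "SELECT city, is_current FROM dim_customer ORDER BY eff_date").fetchall()
# Both versions survive: the expired row and the new current row.
```

In PowerCenter the same pattern is typically built with a Lookup on the dimension, a Router to split new vs. changed rows, and an Update Strategy transformation.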


Sr. ETL Informatica PowerCenter Engineer

Confidential, Titusville, NJ

Roles & Responsibilities

  • Coordinated with Business Users for requirement gathering, business analysis to understand the business requirement and to prepare Technical Specification documents (TSD) to code ETL Mappings for new requirement changes.
  • Develop strategy for implementing data profiling, data quality, data cleansing and ETL metadata.
  • Analysis of Source, Requirement, existing OLTP system and Identification of required dimensions and facts from the Database.
  • Responsible for the development, support and maintenance of ETL (Extract, Transform and Load) processes using Informatica PowerCenter 9.6.
  • Set up batches and sessions to schedule the loads at required frequency using Power Center Workflow manager and accessing Mainframe DB2 and AS400 systems.
  • Developed various T-SQL stored procedures, functions and packages.
  • Developed database objects such as SSIS Packages, Tables, Triggers, and Indexes using T-SQL, SQL Analyzer and Enterprise Manager.
  • Developed various Mappings, Mapplets, and Transformations for data marts and Data warehouse.
  • Worked on MDM Hub configurations - Data modeling & Mappings, Data validation, Match and Merge rules, Hierarchy Manager, customizing/configuring Informatica Data Director (IDD)
  • Performed the requirement gathering, analysis, design, development, testing, implementation, support and maintenance phases of both MDM and Data Integration projects.
  • Performed T-SQL tuning and optimized long-running report queries on SQL Server 2008.
  • Solved T-SQL performance issues using Query Analyzer.
  • Applied Master Data Management (MDM) and Data Integration concepts in large-scale implementation environments.
  • Involved in creation of Data Warehouse database (Physical Model, Logical Model) using Erwin data modeling tool.
  • Worked closely with other IT team members, business partners, data stewards, stakeholders, steering committee members and executive sponsors on all MDM and data governance related activities.
  • Optimized the T-SQL queries and converted PL-SQL code to T-SQL.
  • Standardized the T-SQL stored procedures per the organizations standards.
  • Applied try/catch blocks to the T-SQL procedures.
  • Used the MERGE statement in T-SQL for upserts into the target tables.
  • Worked on OBIEE Answers to create the reports as per the client requirements and integrated them into the Dashboards.
  • Extensively worked on Autosys to schedule the jobs for loading data.
  • Worked on Power Exchange for change data capture (CDC).
  • Executing DML, DDL and DCL commands as SQL queries.
  • Developed shell scripts, PL/SQL procedures, for creating/dropping of table and indexes of performance for pre and post session management.
  • MDM development included creating base object tables, staging tables and landing tables as specified in the LLD.
  • Designed and Developed Oracle PL/SQL and UNIX Shell Scripts, Data Import/Export.
  • Analysis and code development using Agile methodology.
  • Used mapping parameters and variables for pulling incremental loads from source
  • Identified and fixed bottlenecks and tuned the mappings and sessions for improved performance; tuned both the ETL processes and the databases.
  • Defined the program specifications for the data migration programs, as well as the necessary test plans used to ensure the successful execution of the data loading processes.
  • Involved in the design, development and implementation of the Enterprise Data Warehousing (EDW) process.
  • Provided data warehouse expertise including data modeling, Extract, Transform and Load (ETL) analysis, design and development.
  • Created User exit features extending the functionality and features of MDM HUB.
  • Hands-on Experience in working with Source Analyzer, Warehouse Designer, Mapping Designer, Mapplets to extract, transform and load data.
  • Created Mapping Parameters, Session parameters, Mapping Variables and Session Variables.
  • Involved in extensive performance tuning by determining bottlenecks at various points like targets, sources, mappings, sessions or system. This led to better session performance.
  • Worked with various Active and Passive transformations like Source Qualifier, Sorter, Aggregator, Filter, Union, and Router Transformations, Sequence Generator and Update Strategy Transformations.
  • Handled versioning and dependencies in Informatica.
  • Developed schedules to automate the update processes and Informatica sessions and batches.
  • Resolving technical and design issues.
  • Developed data transformation processes, maintain and update loading processes.
  • Developed and implemented the UNIX shell scripts for the start and stop procedures of the sessions.
  • Used UNIX shell scripts to run the batches.
  • Developed standards and procedures to support quality development and testing of data warehouse processes.
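The T-SQL MERGE upserts described above combine an insert and an update in one statement keyed on the target's primary key. T-SQL MERGE itself is SQL Server specific, so here is a sketch of the equivalent upsert logic using SQLite's `INSERT ... ON CONFLICT` from Python (table and column names are hypothetical examples):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE target (id INTEGER PRIMARY KEY, amount REAL)")
cur.execute("INSERT INTO target VALUES (1, 100.0)")

# Staged rows: id 1 already exists (becomes an update), id 2 is new (insert).
staged = [(1, 150.0), (2, 75.0)]
cur.executemany("""INSERT INTO target (id, amount) VALUES (?, ?)
                   ON CONFLICT(id) DO UPDATE SET amount = excluded.amount""",
                staged)
result = cur.execute("SELECT id, amount FROM target ORDER BY id").fetchall()
# Existing row updated in place, new row inserted.
```

In T-SQL the same outcome is written as `MERGE target USING staged ON ... WHEN MATCHED THEN UPDATE ... WHEN NOT MATCHED THEN INSERT ...`.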

Environment: Informatica PowerCenter 9.6, Oracle 10g, T-SQL, Informatica MDM 10.1/10.2, Informatica MDM Data Director 10.1/10.2, MS SQL Server 2008, UNIX (Sun Solaris 5.8/AIX), Data Marts, Erwin Data Modeler 4.1, Agile Methodology, Teradata 13, FTP, MS Excel, MS Access, UNIX Shell Scripting, Data Modeling, PL/SQL, Autosys.

Sr. ETL Informatica Developer

Confidential - MEMPHIS, TN

Roles & Responsibilities

  • Developed the source definitions, target definitions to extract data from flat files, relational sources and Salesforce Objects.
  • Created different transformations for applying the key business rules and functionalities on the source data.
  • Involved in performance tuning of sources, targets, mappings, and sessions.
  • Manage coordination between onsite/offshore teams and BA.
  • Developing test plan and scripts, conducting testing, and dealing with business partners to conduct end-user acceptance testing.
  • Analyzed the business systems, gathered requirements from the users and documented business needs for decision supporting data.
  • Defined measurable metrics and required attributes for the subject area to support a robust and successful deployment of the existing Informatica MDM 9.5 platform.
  • Planned Informatica MDM 9.5 requirement analysis sessions with business users.
  • Created Informatica MDM 9.5 Hub Console Mappings.
  • Designed and Developed Oracle PL/SQL and UNIX Shell Scripts, Data Import/Export.
  • Understanding the Requirement Specifications, preparing the Functional and technical design documents.
  • Written documentation to describe program development, logic, coding, testing, changes and corrections, optimized the mappings by changing the logic and reduced running time.
  • Included error-handling and exception handling by logging in the error tables and sending an alert message via e-mail to the concerned distributions list.
  • Responsible for retrofitting the code to QA environment, and extending the support for the QA and UAT for fixing the bugs.
  • Responsible for Escalation management during the Production support along with fixing the bugs.
  • Worked on integration testing of all the modules and preparation of test plans
  • Ensuring that all production changes are processed according to release management policies and procedures.
  • Ensuring that appropriate levels of quality assurance have been met for all new and existing applications / CRs.
  • Ensuring that application changes are fully documented, supportable.
  • Interpreted logical and physical data models for Business users to determine common data definitions and establish referential integrity of the system.
  • Created the (ER) Entity Relationship diagrams& maintained corresponding documentation for corporate data dictionary with all attributes, table names and constraints.
  • Prepared technical documentation to map source to target.
  • Designed, Developed, Deployed and implemented ETL mappings using Informatica
  • Migrated Workflows, Mappings, and other repository objects from Development to QA and then to production.
  • Responsible for performance tuning at all levels of the Data warehouse.
  • Created Informatica sessions in workflow manager to load the data from staging to Target database.
  • Prepared technical design/specifications for data Extraction, Transformation and Loading.
  • Using the Aggregator transformation, calculated the SUM and AVG of monthly sales for different products.
  • Created different target schemas for Staging and Data Mart.
  • Designed the Mapping Design documents and the Deployment Documents.
  • Designed and Developed several mappings to Load the Dimensions and the fact tables using transformations like XML, Union, Expression, Filter, Aggregator, Lookup and Router etc.
  • Worked with Flat files and XML files generation through ETL process
  • Generated XML files as target to load into the vendor customized application to generate the reports.
  • Involved in XML and XSLT coding to create the front-end screen insert activities.
  • Implemented Slowly Changing Dimensions Type1, Type 2.
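The error- and exception-handling pattern described above — log the failure to an error table and alert a distribution list — can be sketched as follows. This is an illustrative Python/SQLite sketch only; the `etl_errors` table, the `run_step` helper and the in-memory `alerts` list (standing in for the e-mail alert) are hypothetical names, not artifacts from the project:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE etl_errors (step TEXT, message TEXT)")

alerts = []  # stand-in for the e-mail alert sent to the distribution list

def run_step(name, fn):
    """Run one ETL step; on failure, log to the error table and queue an alert."""
    try:
        fn()
    except Exception as exc:
        cur.execute("INSERT INTO etl_errors VALUES (?, ?)", (name, str(exc)))
        alerts.append(f"ETL step {name} failed: {exc}")

run_step("load_claims", lambda: 1 / 0)  # deliberately failing step
errors = cur.execute("SELECT step FROM etl_errors").fetchall()
```

The failed step ends up both in the error table (for auditing) and in the alert queue (for notification), while the overall load can continue with the remaining steps.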

Environment: Informatica PowerCenter 9.1, Informatica MDM 9.5, Informatica IDQ 8.6, Informatica 9.5.1, Oracle 11g, Informatica PowerCenter 8.6.1, Oracle 10g, TOAD, Force.com Explorer, Salesforce, Flat Files, Informatica Scheduler, UNIX Shell Scripting, Windows XP/7, SQL Server 2008, SQL Navigator, Windows Server 2008, UNIX.

ETL Informatica Developer

Confidential, Atlanta, GA

Roles & Responsibilities

  • Gathered user Requirements and designed Source to Target data load specifications based on business rules.
  • Used Informatica PowerCenter 9.0.1 for extraction, transformation and loading (ETL) of data in the data mart.
  • Participated in the review meetings with functional team to signoff the Technical Design document.
  • Involved in Design, Analysis, Implementation, Testing and support of ETL processes
  • Worked with the Informatica Data Quality 9.6 (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ 9.6.
  • Validated HIPAA EDI transactions such as 837 (Health Care Claims or Encounters), 835 (Health Care Claim Payment/Remittance), 270/271 (Eligibility Request/Response) and 834 (Enrollment/Disenrollment in a health plan) by developing mappings.
  • Developed IDQ mappings using various transformations like Labeler, Standardization, Case Converter, Match and Address validation Transformation.
  • Designed, Developed and Supported Extraction, Transformation and Load Process (ETL) for data migration with Informatica Power Center.
  • Developed various mappings using Mapping Designer and worked with Aggregator, Lookup, Filter, Router, Joiner, Source Qualifier, Expression, Stored Procedure, Sorter and Sequence Generator transformations.
  • Created complex mappings which involved Slowly Changing Dimensions, implementation of Business Logic and capturing the deleted records in the source systems.
  • Worked extensively with the connected lookup Transformations using dynamic cache.
  • Worked with complex mappings having an average of 15 transformations.
  • Coded PL/SQL stored procedures and successfully used them in the mappings.
  • Coded UNIX scripts to capture data from different relational systems into flat files for use as ETL source files, and to schedule the automatic execution of workflows.
  • Scheduled jobs using the Informatica Scheduler & Jobtrac.
  • Created and scheduled sessions and jobs to run on demand, on time, or only once.
  • Monitored Workflows and Sessions using Workflow Monitor.
  • Performed Unit testing, Integration testing and System testing of Informatica mappings.
  • Involved in enhancements and maintenance activities of the data warehouse including tuning, modifying of stored procedures for code enhancements.
  • Responsible for determining the bottlenecks and fixing the bottlenecks with performance tuning at various levels like mapping level, session level, and database level.
  • Provided production support by monitoring the processes running daily.
  • Participated in weekly status meetings, and conducting internal and external reviews as well as formal walk through among various teams and documenting the proceedings.
  • Coordinating with the Offshore team and directly interacting with the client for clarifications & resolutions
  • Introduced and created many project related documents for future use/reference.
  • Designed and developed ETL Mappings to extract data from Flat files and Oracle to load the data into the target database.
  • Developed several complex mappings in Informatica using a variety of PowerCenter transformations, mapping parameters, mapping variables, mapplets and parameter files in Mapping Designer.
  • Built complex reports using SQL scripts.
  • Created complex calculations, various prompts, conditional formatting and conditional blocking etc., accordingly.
  • Created and monitored complex mappings to load the data mart, involving extensive use of Aggregator, Filter, Router, Expression, Joiner, Union, Normalizer and Sequence Generator transformations.
  • Ran the workflows on a daily and weekly basis using workflow monitor.
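The Aggregator work mentioned above (SUM and AVG of monthly sales per product) maps directly to a grouped aggregate. A minimal Python/SQLite sketch, with a hypothetical `sales` table and made-up sample data:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE sales (product TEXT, month TEXT, amount REAL)")
cur.executemany("INSERT INTO sales VALUES (?, ?, ?)", [
    ("widget", "2020-01", 100.0),
    ("widget", "2020-01", 300.0),
    ("gadget", "2020-01", 50.0),
])
# Group by product and month, exactly what an Aggregator transformation
# does with its group-by ports and SUM()/AVG() output port expressions.
rows = cur.execute("""SELECT product, month, SUM(amount), AVG(amount)
                      FROM sales
                      GROUP BY product, month
                      ORDER BY product""").fetchall()
```

In PowerCenter the group-by columns become the Aggregator's group-by ports, and SUM/AVG become expressions on output ports.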

Environment: Informatica 9.0.1, PL/SQL, Informatica Data Quality (IDQ) 9.6, Informatica 8.6.1/9.5, Oracle 9i/11g, UNIX, SQL, Informatica Scheduler, SQL*Loader, SQL Developer, Framework Manager, Transformer, Teradata, TOAD, Windows Server 2008.

Informatica Developer

Confidential - Oxnard, CA

Roles & Responsibilities:

  • Involved in Business analysis and requirements gathering.
  • Creating the design and technical specifications for the ETL process of the project.
  • Assisted in creating fact and dimension table implementation in Star Schema model based on requirements.
  • Worked on Informatica Power Center tool - Source Analyzer, Data warehousing designer, Mapping & Mapplet Designer and Transformation Developer.
  • Developed complex mappings using Informatica Power Center Designer to transform and load data.
  • Extensively used various types of transformations such as Expression, Joiner, Update strategy, Aggregator, Filter, and Lookup.
  • Developed several Mappings and Mapplets using corresponding Sources, Targets and Transformations.
  • Implemented performance tuning logic on targets, sources, mappings, sessions to provide maximum efficiency and performance.
  • Implemented complex mappings such as Slowly Changing Dimensions.
  • Created various PL/SQL stored procedures for dropping and recreating indexes on target tables.
  • Built mapping variables/parameters and created parameter files to enable flexible workflow runs based on changing variable values.
  • Designed and Developed pre-session, post-session routines for Informatica sessions to drop and recreate indexes and key constraints for bulk loading.
  • Maintained Development, Test and Production mapping migrations using Repository Manager.
  • Created sessions and workflows to run with the logic embedded in the mappings using Power center Designer.
  • Used workflow manager for session management, database connection management and scheduling of jobs.
  • Responsible for monitoring all the sessions that are running, scheduled, completed and failed.
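The pre-session/post-session routines above (drop indexes before a bulk load, recreate them afterward) avoid paying index-maintenance cost on every inserted row. A sketch of that sequence in Python with SQLite; the `fact_orders` table and `ix_orders` index are hypothetical names used only for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE fact_orders (order_id INTEGER, amount REAL)")
cur.execute("CREATE INDEX ix_orders ON fact_orders (order_id)")

def index_exists(cur, name):
    """Check the catalog for an index, analogous to querying sys.indexes."""
    return cur.execute(
        "SELECT COUNT(*) FROM sqlite_master WHERE type='index' AND name=?",
        (name,)).fetchone()[0] == 1

# Pre-session: drop the index so the bulk load avoids per-row index updates.
cur.execute("DROP INDEX ix_orders")
cur.executemany("INSERT INTO fact_orders VALUES (?, ?)",
                [(i, float(i)) for i in range(1000)])
# Post-session: recreate the index so downstream queries stay fast.
cur.execute("CREATE INDEX ix_orders ON fact_orders (order_id)")
restored = index_exists(cur, "ix_orders")
row_count = cur.execute("SELECT COUNT(*) FROM fact_orders").fetchone()[0]
```

In practice these drop/recreate statements live in PL/SQL stored procedures or shell scripts wired into the session's pre- and post-session commands.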

Environment: Informatica PowerCenter 9.5.1/9.1.0, Oracle 11g, Erwin 4.0, TOAD 9.x, Shell Scripting, Teradata, Oracle SQL*Loader, OBIEE, PL/SQL, Salesforce.com (SFDC), SSIS, Sun Solaris UNIX, Windows XP, DAC.

Informatica Developer

Confidential - Camas, WA

Roles & Responsibilities:

  • Based on the requirements created Functional design documents and Technical design specification documents for ETL Process.
  • Developed mappings and mapplets using Informatica Designer to load data into ODS from various transactional source systems.
  • Used Informatica Designer to import the sources, targets, create various transformations and mappings for extracting, transforming and loading operational data into the EDW from ODS.
  • Used various transformations such as expression, filter, rank, source qualifier, joiner, aggregator and Normalizer in the mappings and applied surrogate keys on target table.
  • Used the Informatica Server Manager to register and monitor the server, create and run the sessions/batches for loading the data using the earlier created mappings.
  • Created mapplets and reusable transformations.
  • Created Workflow, Worklets and Tasks to schedule the loads at required frequency using Workflow Manager.
  • Created connection pools, physical tables, defined joins and implemented authorizations in the physical layer of the repository.
  • Migrated mappings from Development to Testing and performed Unit Testing and Integration Testing.

Environment: Informatica Power Center 8.x, Repository Manager, Designer, Oracle 8i, SQL, UNIX, Win 2000/NT.



Data Warehousing/ETL Tools: Informatica PowerCenter … (Source Analyzer, Data Warehousing Designer, Mapping Designer, Mapplet, Transformation, Sessions, Informatica MDM, Workflow Manager (Workflow, Task, Commands, Worklet), IDQ, Transactional Control, Constraint Based Loading, SCD I/II), Data Flux, Datamart, OLAP, ROLAP, MOLAP, OLTP.

Data Modeling: Physical Modeling, Logical Modeling, Relational Modeling, Dimensional Modeling (Star Schema, Snow-Flake, Fact, Dimensions), Entities, Attributes, Cardinality, ER Diagrams.

Databases: Oracle … SQL Server … MS Access and DB2

Languages: SQL, PL/SQL, C, C++, Data Structures, T-SQL, UNIX Shell Script, Visual Basic

Web Technologies: XML, HTML, Java Script

Tools: TOAD, SQL Developer, Autosys, Erwin

Operating Systems: Windows Server, … UNIX, MS-DOS and Linux
