
Sr. ETL Informatica/MDM Developer Resume

Oriskany, NY


  • 7+ years of IT experience in Data Warehousing, Database Design and ETL processes across business domains including finance, telecom, manufacturing and health care.
  • Highly proficient in Development, Implementation, Administration and Support of ETL processes for Large - scale Data warehouses using Informatica Power Center.
  • Worked extensively on ETL processes using Informatica Power Center 10.x/9.x/8.x/7.x, IDQ 10.2.1/9.0.1/8.6, MDM 10.x, B2B, PCF, Teradata, SQL Server and Oracle databases.
  • Extensively used ETL methodologies for supporting Data Extraction, Transformation and Loading process, in a corporate-wide-ETL solution using Informatica Power Center.
  • Worked extensively with Informatica Designer, Workflow Manager and Workflow Monitor for data loads.
  • Experience with special emphasis on system analysis, design, development and implementation of ETL methodologies in all phases of the Data Warehousing life cycle and relational databases using IBM InfoSphere DataStage 11.5/11.3/9.1/8.5/8.1.
  • Extensive experience as an ETL developer working with IBM InfoSphere Information Server DataStage (versions 11.5, 11.3, 9.1, 8.5, 8.1, 8.0.1) and Ascential DataStage 7.5.2 Enterprise Edition (Parallel Extender) and Server Edition.
  • Hands-on experience with foundation tools such as FastTrack, Metadata Workbench, DataStage and QualityStage, and Information Analyzer.
  • Extensive experience with the IBM InfoSphere DataStage ETL tool.
  • Expertise in analyzing and understanding data dependencies in database tables using the corresponding metadata stored in the DataStage repository.
  • Extensive experience in using various Informatica Designer Tools such as Source Analyzer, Transformation Developer, Informatica MDM Work Flow Manager, Mapping Designer and Mapplet Designer.
  • Extensive experience in Design, Development, Implementation, Production Support and Maintenance of Data Warehouse Business Applications in E-commerce software, Oil & Gas, Health Care, Insurance, Financial industries.
  • Experience in development and maintenance of Oracle 11g/10g/9i/8i, PL/SQL, SQL*Plus, TOAD, SQL*Loader, stored procedures, functions, analytic functions, constraints, indexes and triggers.
  • Experienced in IDQ (9.x, 9.5.1), handling LDOs and PDOs and transformations such as Standardizer, Labeler, Parser and Address Validator to cleanse and profile incoming data.
  • 8 years of experience using different versions of the Oracle database (11g/10g/9i/8i).
  • Excellent working knowledge of C shell scripting and job scheduling on multiple platforms; experience with the UNIX command line and Linux.
  • Experience in designing MDM architecture solutions, complex match rules, and complex custom survivorship rules that cannot be handled by the out-of-the-box MDM tool, to determine the best version of truth (BVT), aka the Golden Record, and a reusable framework.
  • Experience in creating batch scripts in DOS and Perl Scripting.
  • Experience in ETL development processes using Informatica for Data Warehousing, Data migration and Production support.
  • Sound knowledge of Relational and Dimensional modeling techniques of Data warehouse (EDS/Data marts) concepts and principles (Kimball/Inmon) - Star/Snowflake schema, SCD, Surrogate keys and Normalization/De-normalization.
  • Data modeling experience in creating Conceptual, Logical and Physical Data Models using Erwin Data Modeler.
  • Experience with TOAD, SQL Developer database tools to query, test, modify, analyze data, create indexes, and compare data from different schemas.
  • Knowledge in Data Analyzer tools like Informatica Power Exchange (Power Connect) to capture the changed data.
  • Proficiency in data warehousing techniques for data cleansing, surrogate key assignment and Change data capture (CDC).
  • Experience in validating data quality & business rules by using Mapping document and FSD to maintain the data integrity.
  • Experience in writing SQL test cases for Data quality validation.
  • Experience in integration of various data sources like Oracle, DB2, Flat Files and XML Files into ODS and good knowledge on Teradata 12.0/13.0, SQL Server 2000/2005/2008 and MS Access 2003/2007.
  • Expertise in implementing complex business rules by creating re-usable transformations, Mapplets and Mappings.
  • Experienced in Extracting, Transforming and Loading data from various data sources like Excel, flat files and Oracle into SQL Server databases using SQL Server Integration Services (SSIS).
  • Hands on experience in MDM development.
  • Experience in end to end Data quality testing and support in enterprise warehouse environment.
  • Experience in maintaining Data Quality, Data consistency and Data accuracy for Data Quality projects.
  • Extensively used Autosys and Tidal for scheduling the UNIX shell scripts and Informatica workflows.
  • Extensive knowledge in all areas of Project Life Cycle Development.
  • Experience in both Waterfall and Agile SDLC methodologies.
  • Strong analytical, verbal, written and interpersonal skills.
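The slowly changing dimension and surrogate-key techniques listed above can be sketched in plain Python. This is a tool-agnostic illustration, not the actual project code; the table layout and column names (natural_key, city, is_current) are hypothetical:

```python
from datetime import date

def apply_scd2(dimension, incoming, next_key):
    """SCD Type 2: when a tracked attribute changes, expire the current
    row and insert a new version under a fresh surrogate key."""
    for row in incoming:
        current = next((d for d in dimension
                        if d["natural_key"] == row["natural_key"]
                        and d["is_current"]), None)
        if current is None or current["city"] != row["city"]:
            if current is not None:
                current["is_current"] = False      # expire the old version
                current["end_date"] = date.today()
            dimension.append({                      # insert the new version
                "surrogate_key": next_key,
                "natural_key": row["natural_key"],
                "city": row["city"],
                "start_date": date.today(),
                "end_date": None,
                "is_current": True,
            })
            next_key += 1
    return dimension, next_key

dim = [{"surrogate_key": 1, "natural_key": "C100", "city": "Albany",
        "start_date": date(2020, 1, 1), "end_date": None, "is_current": True}]
dim, nk = apply_scd2(dim, [{"natural_key": "C100", "city": "Utica"}], next_key=2)
# the customer now has two rows: the expired Albany version and a current Utica one
```

In a real warehouse the same logic is expressed as an Update Strategy/Lookup mapping in Informatica or a MERGE statement, but the expire-and-insert pattern is identical.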


Databases: Oracle 10g/9i/11i/R12, DB2, MS SQL Server 7.0/2000/2005/2008, MS Access 2000/2005, Teradata, MySQL

Languages: Transact-SQL, PL/SQL, HTML, CSS, JavaScript, C, C#, Perl, Java

Operating Systems: Windows, Linux, Unix, MS-DOS, Sun Solaris.

OLAP/Reporting Tools: SQL Server Analysis Services (SSAS), SQL Server Reporting Services (SSRS), SharePoint MOSS 2007, Business Objects 6.x, Cognos Framework Manager, Tableau

ETL Tools: Informatica PowerCenter 10.x/9.x/8.x/7.x, Informatica Power Exchange, Informatica Data Quality Suite 9.6, Informatica MDM, Informatica Data Director (IDD), Informatica B2B DT, Pentaho Data Integration (PDI), SQL Server Integration Services (SSIS), IBM InfoSphere DataStage 11.5/11.3/9.1/8.7/8.5/8.0.

Data Modeling Tools: Microsoft Visio, Erwin r9.0/8x/7x, ER/Studio 9.7/9.0/8.0/7. x

SQL Server Tools: SQL server Management Studio, SQL server Query Analyzer, SQL server mail service, DBCC, BCP, SQL server profiler

Web Technologies: MS FrontPage, MS Outlook Express, FTP, TCP/IP

Other Tools: Microsoft Office, Visual Basic 6

Scheduling Tools: Informatica Scheduler, CA Scheduler (Autosys), ESP, Control-M

Data Quality Tools: Informatica Analyst, Informatica Data Quality, Informatica Developer

MDM Tools: Nextgate, Informatica MDM


Confidential, Oriskany, NY

Sr. ETL Informatica/MDM Developer


  • Involved in Business Analysis and Requirements collection.
  • Developed complex mappings by efficiently using various transformations, Mapplets, Mapping Parameters/Variables and Mapplet Parameters in Designer. The mappings involved extensive use of Aggregator, Filter, Router, Expression, Joiner, Union, Normalizer, Sequence Generator, SQL and Web Service transformations.
  • Scheduled Informatica workflows using the Control-M and Tivoli scheduling tools and troubleshot the Informatica workflows.
  • Involved in extracting addresses from multiple heterogeneous sources like flat files, Oracle, SAS and SQL Server.
  • Extensively used all PowerCenter/PowerMart capabilities such as target override and connected, unconnected and persistent lookups.
  • Performed the data profiling and analysis making use of Informatica Data Explorer (IDE) and Informatica Data Quality (IDQ).
  • Worked on Slowly Changing Dimensions (SCD's) Types -1, 2 and 3 to keep track of historical data.
  • Used Informatica MDM 10.1 tool to manage Master data of EDW.
  • Extracted consolidated golden records from MDM base objects and loaded into downstream applications.
  • Represented PowerCenter mapping detailed transformation information in the Enterprise Data Catalog (EDC).
  • Set up the Enterprise Data Catalog (EDC v10.2x) tool on the Linux platform and implemented Data Governance in Informatica EDC based on data validation performed in IDQ.
  • Built business-term and technical-term lineages in EDC and Axon for Data Stewards' better understanding. Performed end-to-end installation and configuration of the Informatica 9.x/10.x product suite (PowerCenter (PC), Enterprise Data Catalog (EDC), Informatica Data Quality (IDQ)) and prepared unit test cases.
  • Developed Tableau visualizations and dashboards with interactive views, trends and drill downs along with user level security using Tableau Desktop.
  • Extensively worked on UNIX shell scripts for server Health Check monitoring such as Repository Backup, CPU/Disk space utilization, Informatica Server monitoring, UNIX file system maintenance/cleanup and scripts using Informatica Command line utilities.
  • Worked with team to convert Trillium process into Informatica IDQ objects.
  • Extensively involved in ETL testing, Created Unit test plan and Integration test plan to test the mappings, created test data. Use of debugging tools to resolve problems & created reference tables to standardize data.
  • Used a number of transformations in Pentaho, including Row Normalizer, Row Denormalizer, Database Lookup, Database Join, Calculator, Add Sequence and Add Constants, and various types of inputs and outputs for data sources including tables, Access, text files, Excel and CSV files.
  • Participated in design of Staging Databases and Data Warehouse/Data mart database using Star Schema/Snowflakes schema in data modeling.
  • Worked very closely with Project Manager to understand the requirement of reporting solutions to be built.
  • Used Pentaho Import Export utility to Migrate Pentaho Transformations and Job from one environment to others.
  • Optimized the Solution using various performance-tuning methods (SQL tuning, ETL tuning (i.e. optimal configuration of transformations, Targets, Sources, Mappings and Sessions), Database tuning using Indexes, partitioning, Materialized Views, Procedures and functions).
  • Adept in formulating Test plans, Test cases, Test Scenarios, Test Approaches, Setting up Test Environments in conjunction with the Testing team and expertise in formulating test approaches with Load Testing, Integration Testing, Functional Testing, and User Acceptance Testing (UAT).
  • Implemented Logic with Database lookup table to maintain Parent- Child relationship and maintain hierarchy.
  • Use Pentaho Import Export utility to Migrate Pentaho Transformations and Job from one environment to others (DEV/QA/PREPROD/PROD).
  • Created Transformations/Jobs to take daily back up Enterprise repository for DEV/QA/PREPROD/PROD.
  • Utilized SSIS transformations such as Conditional Split, Derived Column, Fuzzy Grouping, Fuzzy Lookup to validate data during staging before loading into data mart.
  • Utilized SSIS to implement the Slowly Changing Transformation along with checksum technology, to maintain historical data in data marts.
  • Performed all SDLC phases related to extract, transform, and load (ETL) processes using SQL Server Integration Services (SSIS) and SQL Server T-SQL stored procedures within a SQL Server 2012 environment.
  • Resolved many issues by applying EBFs and patches for issues related to Informatica load failures.
  • Migration of Informatica Mappings/Sessions/Workflows and Unix scripts to QA, UAT and Prod environments using Harvest and SVN tool.
  • Configured Connection Manager files for SSIS packages to dynamically execute on Quality Analysis server and Production server. Performed Full load & Incremental load with several Data flow tasks and Control Flow Tasks using SSIS.
  • Utilized XML and SQL Server table configuration for the management and migration of SSIS packages in staging and pre-production environments.
  • Deployed packages from test environment to production environment by maintaining multiple package configurations in SSIS utilizing package and project deployment models in SSIS 2012.
  • Implemented advanced features in SSIS such as error handling, transactions, checkpoints, logging and package configurations utilizing the package deployment utility.
  • Worked on Master Data Management (MDM), Hub Configurations (SIF), Data Director, extract, transform, cleansing, loading the data onto the tables.
  • Worked with Informatica Data Quality (IDQ) toolkit, Analysis, data cleansing, data matching, data conversion, exception handling, and reporting and monitoring capabilities of IDQ.
  • Assisted with automation and deployment of SQL Scripts and SSIS Packages, SSRS reports and maintained daily jobs in different environments using SQL Server Agent & Tivoli Scheduler.
  • Used Tivoli Scheduler to schedule the ETL batch jobs to load the data into EDW.
  • Provided production support to schedule and execute production batch jobs and analyzed log files on Informatica 10.1 Integration servers.
  • Involved in daily status call with onsite Project Managers, DQ developers to update the test status and defects.
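The checksum technique used above with the SSIS Slowly Changing Transformation (hashing a row's tracked attributes so one comparison detects any change) can be sketched as follows. This is an illustrative sketch, not project code; the id/name/city row layout is made up:

```python
import hashlib

def row_checksum(row, columns):
    """Hash the tracked columns so a change in any of them is caught
    with a single string comparison instead of column-by-column checks."""
    payload = "|".join(str(row[c]) for c in columns)
    return hashlib.md5(payload.encode("utf-8")).hexdigest()

def detect_changes(target, source, key, columns):
    """Classify incoming source rows as inserts or updates against the target."""
    existing = {r[key]: row_checksum(r, columns) for r in target}
    inserts, updates = [], []
    for row in source:
        if row[key] not in existing:
            inserts.append(row)
        elif row_checksum(row, columns) != existing[row[key]]:
            updates.append(row)
    return inserts, updates

target = [{"id": 1, "name": "Acme", "city": "Houston"}]
source = [{"id": 1, "name": "Acme", "city": "Dallas"},
          {"id": 2, "name": "Globex", "city": "Austin"}]
ins, upd = detect_changes(target, source, "id", ["name", "city"])
# ins holds the new id=2 row; upd holds the changed id=1 row
```

Storing the checksum on the target row avoids recomputing it on every load; the same idea underlies CDC-style incremental loads in SSIS and Informatica alike.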

Environment: Informatica Power Center 10.1/9.6, Pentaho Data Integration 8.0.0 (PDI/Kettle), Power Exchange, Informatica Data Quality 9.6.1, Informatica MDM 10.1, UNIX, Windows NT/2000/XP, SQL Server 2008, SSIS, OBIEE, DB2, Control tool, SVN, Windows Server, CA ESP Workstation Scheduler, IDQ (IDE).

Confidential, Houston, TX

Sr. Informatica Developer/IDQ


  • Worked with Informatica Data Quality 9.6.1 (IDQ) toolkit, Analysis, data cleansing, data matching, data conversion, exception handling, and reporting and monitoring capabilities of IDQ 9.6.1.
  • Worked on requirements gathering, architecting the ETL lifecycle and creating design specifications, ETL design documents.
  • Identified and eliminated duplicates in datasets through IDQ components such as Edit Distance, Jaro Distance and Mixed Field Matcher. This enabled the creation of a single view of customers and helped control costs associated with mailing lists by preventing multiple pieces of mail.
  • Responsible for Unit and Integration testing of Informatica Sessions, Batches and the Target Data.
  • Schedule the workflows to pull data from the source databases at weekly intervals, to maintain most current and consolidated data.
  • Developed Mapplets, Reusable Transformations, Source and Target definitions, mappings using Informatica 10.0.
  • Worked on IBM InfoSphere DataStage 11.5 to develop processes for extracting, cleansing, transforming, integrating, and loading data into the data warehouse database.
  • Implemented industry ETL standards, best practices and performance tuning while designing DataStage jobs.
  • Used IBM InfoSphere DataStage to extract data from DB2, Oracle and flat files and load it into target tables.
  • Worked on DataStage v11.3 to develop ETL jobs that load data from staging to target tables on a Teradata server.
  • Developed, tested and implemented DataStage Jobs, JIL Jobs, Ksh scripts for several projects in an Operational Data Store.
  • Configured integration between the ActiveVOS and MDM 10.1 for creating the custom workflow process and designing the orchestration project.
  • Resolving issues related to Enterprise data warehouse (EDW), Stored procedures in OLTP system and analyzed, design and develop ETL strategies.
  • Worked with Services and Portal teams on various occasion for data issues in OLTP system.
  • Extensively worked on the ETL mappings, analysis and documentation of OLAP reports requirements. Solid understanding of OLAP concepts and challenges, especially with large data sets.
  • Carried out changes into Architecture and Design of Oracle Schemas for both OLAP and OLTP systems.
  • Involved in creating IDD tasks and assigning roles to the tasks in ActiveVOS to trigger the workflows.
  • Designed and developed transformation rules (business rules) to generate consolidated (fact/summary) data using Informatica ETL tool.
  • Deployed reusable transformation objects such as mapplets to avoid duplication of metadata, reducing the development time.
  • Developed and maintained ETL (Extract, Transformation and Loading) mappings to extract the data from multiple source systems like Oracle, XML, SQL server and Flat files and loaded into Oracle.
  • Performed Informatica MDM design and implementation; Informatica Data Quality (IDQ 9.6.1) was the tool used for data quality measurement in this master data management effort.
  • Exposure to Informatica B2B Data Exchange, which supports the expanding diversity of customers and partners and their data with capabilities that surpass the usual B2B solutions.
  • Worked on design and development of Informatica mappings and workflows to load data into the staging area, data warehouse and data marts in Oracle.
  • Created ETL mappings, sessions, workflows by extracting data from MDM system, FLASH DC & REM source systems.
  • Developed PL/SQL triggers and master tables for automatic creation of primary keys.
  • Created PL/SQL stored procedures, scripts, functions and packages to extract the data from the operational database into simple flat text files using UTL FILE package.
  • Design the Source - Target mappings and involved in designing the Selection Criteria document.
  • Wrote BTEQ scripts to transform data. Used the Teradata utilities FastLoad, MultiLoad and TPump to load data.
  • Responsible for manually starting and monitoring production jobs based on business users' requests.
  • Looked into production issues and resolved them in a timely manner.
  • Developed Informatica process to replace stored procedure functionalities and provide a time effective and high data quality application to the client.
  • Formulated a comprehensive data migration plan with different conversion strategies, detailed object and field mappings including its transformations and business rules for converting legacy Nationwide data into Oracle.
  • Analyzed business requirements and created ETL logic to extract data from flat files coming from Manufacturing at different geographic regions and load the data into the data warehouse.
  • Prepared ETL Specifications and design documents to help develop mappings.
  • Created Mappings for Historical and Incremental loads.
  • Worked on staging the data into work tables, cleanse, and load it further downstream into dimensions using Type 1 and Type 2 logic and fact tables which constitute the data warehouse.
  • Worked with PMCMD to interact with the Informatica Server from command mode and execute shell scripts.
  • Project based on Agile SDLC methodology with 2 weeks of software product release to the business users.
  • Took part in daily standup and scrum meetings to discuss the project lifecycle and progress and plan accordingly, which is the crux of the Agile SDLC.
  • Provided post-release/production support.
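The duplicate elimination described above relies on fuzzy-match measures such as edit distance. A minimal sketch of the idea in Python (the names, threshold, and pairwise scan are illustrative; IDQ's match transformations add weighting, blocking and scoring on top of this):

```python
def edit_distance(a, b):
    """Classic Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]

def find_duplicates(names, threshold=2):
    """Flag pairs of names whose edit distance is within the threshold."""
    pairs = []
    for i in range(len(names)):
        for j in range(i + 1, len(names)):
            if edit_distance(names[i].lower(), names[j].lower()) <= threshold:
                pairs.append((names[i], names[j]))
    return pairs

dupes = find_duplicates(["Jon Smith", "John Smith", "Mary Jones"])
# ("Jon Smith", "John Smith") is flagged as a likely duplicate
```

Production matchers avoid the O(n^2) pairwise scan by first grouping candidates on a blocking key (e.g. ZIP code or name initial) and only comparing within groups.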

Environment: IBM InfoSphere DataStage 11.5 (Designer, Administrator, Director), QualityStage, Informatica Power Center 10.0, IDQ 9.6.1, Informatica MDM, Informatica B2B DT, Oracle Database 11g, SQL Server, SQL*Plus, TOAD, SQL*Loader, Tableau, UNIX shell scripts, Teradata.

Confidential, San Francisco, CA

ETL-Informatica Developer/IDQ/DataStage


  • Developed mappings, reusable objects, transformations and Mapplets using the Mapping Designer, Transformation Developer and Mapplet Designer in Informatica Power Center 9.6.
  • Worked with the Informatica Data Quality 9.5.1 (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, exception handling, scorecards, and the reporting and monitoring capabilities of IDQ 9.5.1.
  • Extensively used DQ transformations like Address Validator, Exception, Parser and Standardizer. Solid experience in debugging and troubleshooting sessions using the Debugger and Workflow Monitor.
  • Designed and developed Complex mappings like Slowly Changing Dimensions Type 2 (Time Stamping) in the mapping designer to maintain full history of transactions.
  • Involved in Data Loading Sequence and Populated Data into Staging Area and Warehouse with Business Rules.
  • Integrated IDQ mappings through IDQ web service applications as cleanse functions in Informatica MDM using IDQ cleanse Adapters. Extracted the source definitions from various relational sources like Oracle, XML and Flat Files.
  • Extensively used ETL to load Flat files, XML files, Oracle and legacy data as sources and Oracle, Flat files as targets.
  • Created Sessions and managed the Workflows using various tasks like Command, Decision, Event wait, counter, Event raise, Email using Workflow Manager.
  • Extensively used the Informatica Debugger for debugging the mappings.
  • Created Mappings to load data using various transformations like Source Qualifier, Sorter, Lookup, Expression, Router, Joiner, Filter, Update Strategy and Aggregator transformations.
  • Worked specifically with the Normalizer transformation, parsing incoming fixed-width files against COBOL copybooks and using the Normalizer transformation to normalize the data.
  • Worked with Lookup Dynamic caches and Sequence Generator cache.
  • Created Reusable Transformations and Mapplets to use in Multiple Mappings and worked with shortcuts for various informatica repository objects.
  • Designed, developed and tested the DataStage jobs using Designer and Director based on business requirements and business rules to load data from source to target tables.
  • Used several stages like Sequential file, Hash file, Aggregator, Funnel, Change Capture, Change Apply, Row Generator, Peek, Remove Duplicates, Copy, Lookup, Join, Merge, Filter, Datasets during the development process of the DataStage jobs.
  • Development of XSLTs and RESTful and SOAP-based web services. Developed various ETL jobs, including data extractions and transformation rules based on business requirements, using IBM InfoSphere DataStage 9.1.
  • Established best practices for DataStage jobs to ensure optimal performance, reusability, and restartability.
  • Used Autosys to schedule, run and monitor Datastage jobs.
  • Identified and eliminated duplicates in datasets through IDQ components.
  • Used Teradata utilities like FastLoad, MultiLoad and Teradata SQL Assistant.
  • Design and configuration of landing tables, staging tables, base objects, hierarchies, foreign-key relationships, lookups, query groups, queries/custom queries and packages in MDM. Integrated IDQ process in MDM.
  • Loaded data files coming from external vendors onto the Teradata EDW using the MultiLoad (mload) and FastLoad (fload) utilities.
  • Worked in implementation of Profiling, Score Card, Classifier models, Probabilistic models, Human task and Exception record management as part of IDQ process.
  • Worked with Informatica Power Exchange to pull changed data in the form of condense files and load it into Teradata tables using TPump import.
  • Created parameter files with Global variables.
  • Integrated the Angular framework with REST services to facilitate login over Java interfaces.
  • Extensively worked with Korn-Shell scripts for parsing and moving files and even for re-creating parameter files in post-session command tasks.
  • Profile files and shell scripts were used for recreation of dynamic parameter files.
  • Scheduling of Informatica workflows using Tidal Scheduler.
  • Migration of Informatica code from DEV to TEST environments in Informatica by creating deployment groups, folders, applying labels, creating queries in the Informatica Repository Manager.
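The dynamic parameter-file recreation mentioned above can be sketched as a small generator. PowerCenter parameter files are plain text with a bracketed [folder.WF:workflow.ST:session] header followed by $$NAME=value lines; the folder, workflow and parameter names below are hypothetical:

```python
from datetime import date

def build_param_file(folder, workflow, session, params):
    """Render an Informatica-style parameter file: a bracketed section
    header followed by $$NAME=value mapping-parameter lines."""
    lines = [f"[{folder}.WF:{workflow}.ST:{session}]"]
    for name, value in params.items():
        lines.append(f"$${name}={value}")
    return "\n".join(lines) + "\n"

content = build_param_file(
    "DW_FOLDER", "wf_daily_load", "s_m_load_customers",
    {"RUN_DATE": date(2024, 1, 15).isoformat(), "SRC_SCHEMA": "STG"},
)
print(content)
```

A shell wrapper run as a pre-session command can regenerate the file each night with the current run date before pmcmd starts the workflow, which is the pattern the Korn-shell scripts above implement.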

Environment: IBM DataStage 9.1/8.5, QualityStage, Informatica PowerCenter 9.6.1/9.5.1, Informatica IDQ (9.5.1), Informatica MDM, IDD, UNIX, Teradata 13.0, Shell Scripts, SQL Server.
