
Sr. ETL/IDQ Developer Resume

Jersey City, NJ


  • 8+ years of experience in Business Requirements Analysis, Designing, Coding and Testing of Data Warehousing implementations across the Financial, Insurance, Banking and Educational industries.
  • Experience in building and managing various data warehouses/data marts using Informatica products such as PowerCenter, PowerMart and PowerExchange.
  • Strong experience in performing ETL operations such as data extraction, data transformation and data loading with Informatica PowerCenter and PowerMart (Repository Manager, Designer, Server Manager, Workflow Manager, and Workflow Monitor).
  • Involved in all aspects of the project and its technologies (Informatica, UNIX scripts, and Business Objects reports) and worked with a team of onsite and offshore build and test resources.
  • Strong knowledge of IDQ Mapping Designer, Mapplet Designer, Transformation Developer, Workflow Manager and Repository.
  • Experience in designing and developing complex mappings using varied transformation logic such as Unconnected and Connected Lookups, Source Qualifier, Router, Filter, Expression, Aggregator, Joiner, Update Strategy, etc.
  • Knowledge of OLTP/OLAP system study and E-R modeling, developing database schemas such as Star schema and Snowflake schema used in relational, dimensional and multidimensional modeling.
  • Enhanced performance for Informatica sessions by using physical pipeline partitions, DTM Session performance and Parameter Files.
  • Have experience with IDQ and MDM, with knowledge of Big Data Edition integration with Hadoop and HDFS.
  • Extensive experience in using tools like SQL*Plus, TOAD, SQL Developer and SQL*Loader.
  • Experience in data modeling using the design tool Erwin 4.1/4.0, and worked with Oracle Enterprise Manager and TOAD.
  • Knowledge in extracting data from various sources like Oracle, DB2, flat files, SQL Server, XML files and Teradata, and loading it into Teradata and Oracle databases.
  • Strong understanding of Performance tuning in Informatica and Databases with Oracle Stored Procedures, Triggers, Index and experienced in loading data into Data Warehouse/Data Marts using Informatica.
  • Hands on experience in optimizing the Mappings and implementing the complex business rules by creating re-usable transformations, Mapplets and PL/SQL stored procedures.
  • Experience on data profiling & various data quality rules development using Informatica Data Quality (IDQ).
  • Developed and used ETL methodologies for supporting data extraction, data migration, data transformation and loading using Informatica PowerCenter 10.
  • Played a significant role in Extraction, Transformation, Load (ETL) data from various sources such as Oracle, flat files etc. into Data Warehouses and Data Marts using Informatica Power Center (Designer, Workflow Manager, Workflow Monitor and Repository Manager).
  • Experienced in developing applications in Oracle and writing Stored Procedures, Triggers, functions, Views and creating Partitions for better performance.
  • Worked with business SMEs on developing the business rules for cleansing. Applied business rules using Informatica Data Quality (IDQ) tool to cleanse data.
  • Presented Data Cleansing Results and IDQ plans results to the OpCos SMEs.
  • Extensive experience in Oracle SQL and PL/SQL programming; experienced in writing and designing SQL queries.
  • Worked on IDQ tools for data profiling, data enrichment and standardization.
  • Experience in development of mappings in IDQ to load the cleansed data into the target table using various IDQ transformations. Experience in data profiling and analyzing the scorecards to design the data model.
  • Worked with SQL*Loader and SQL*Plus; developed PL/SQL; performed tuning using explain plan, etc.
  • Set up ODBC connections, batches and sessions to schedule the loads at the required frequency using PowerCenter.
  • Experience in UNIX shell scripting; used Autosys and Control-M for scheduling the workflows.
  • Familiar with Agile development and waterfall methodologies.
  • Ability to work in teams as well as individually; quick learner and able to meet deadlines.
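The data profiling and scorecard work listed above boils down to per-column statistics such as null counts, distinct counts and fill rates. As an illustrative sketch only (not IDQ itself; the sample rows and column names are hypothetical), a minimal profiling pass in Python might look like:

```python
def profile(rows, columns):
    """Minimal column-level profile: null count, distinct count, fill rate."""
    stats = {c: {"nulls": 0, "distinct": set()} for c in columns}
    total = 0
    for row in rows:
        total += 1
        for c in columns:
            value = (row.get(c) or "").strip()
            if not value:
                stats[c]["nulls"] += 1      # empty/missing counts as a null
            else:
                stats[c]["distinct"].add(value)
    return {
        c: {
            "nulls": s["nulls"],
            "distinct": len(s["distinct"]),
            "fill_rate": (total - s["nulls"]) / total if total else 0.0,
        }
        for c, s in stats.items()
    }

# Hypothetical sample extract
rows = [
    {"cust_id": "1", "email": "a@x.com"},
    {"cust_id": "2", "email": ""},
    {"cust_id": "3", "email": "a@x.com"},
]
report = profile(rows, ["cust_id", "email"])
```

A real profiling tool adds pattern analysis and type inference on top, but this captures the core scorecard numbers.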


Operating Systems: Windows 7 Professional, Windows NT 4.0, Windows 2000 Server, Windows 2000 Advanced Server, Windows 2003 Server, Windows XP, Windows Vista, UNIX and Mac OS

Software: C, Java, SQL, HTML, XML, Oracle 11g/10g, MS SQL 2008, Teradata 13, MS Access, MS Office 2010/2007

RDBMS: Oracle, MS SQL Server 7.0/2000/2008, DB2.

ETL Tools: Informatica PowerCenter 10.1.1/9.6.1/9.5.1/9.0.1/8.6.1/8.0.1/7.5/7.1, Control-M, IDQ, MDM, Autosys, SharePoint, Erwin

Reporting Tools: SQL Server Reporting Services (SSRS), Tableau, Power View, SharePoint 2007

Data Modeling Tools: Erwin (Star schema/Snowflake)

Markup Languages: XML, HTML, DHTML.

Database Query Tools: SQL Server Execution Plan, MS SQL Server Query Analyzer, SQL Profiler, Red Gate SQL Data Compare, Red Gate SQL Data Generator, Red Gate SQL Search.

Version Control Tools: SVN, Team Foundation Server, VSS

Atlassian Tools: JIRA, Confluence


Confidential, Jersey City, NJ

Sr. ETL /IDQ Developer


  • Interacted with Subject Matter Experts (SME) to gather Business Requirements for the application, through one-on-one interviews, and worked on developing the business rules for cleansing. Applied business rules using Informatica Data Quality (IDQ) tool to cleanse data.
  • Developed mappings, reusable objects, transformations and mapplets using Mapping Designer, Transformation Developer and Mapplet Designer in Informatica PowerCenter 10.1.1, 9.6.1 and 9.1.0.
  • Created and analyzed Process Work Flows, Scope & Functional Specifications, and was responsible for preparing Software Requirement Specifications (SRS) and Functional Specification Documents (FSD).
  • Exclusively worked on TFS (Team Foundation Server) to migrate the code from different environments and stacks.
  • Created Teradata table DDL scripts using Erwin for Stage and Target environments.
  • Used Informatica 10.1.1 PowerCenter Designer with transformations such as Source Qualifier, Expression, Filter, Router, Joiner, Sequence Generator, Update Strategy, Lookup, Sorter, Aggregator, Normalizer, XML Source Qualifier, Stored Procedure, etc. for extraction, transformation and loading of data.
  • Designed and developed HP Vertica anchor tables and projections; analyzed query logs and made corrections to projections.
  • Worked with Informatica Data Quality (IDQ) 10 for data cleansing, data matching and data conversion.
  • Extensively used Informatica Power Center 10.1.1/9.6 to create and manipulate source definitions, target definitions, mappings, mapplets, transformations, re-usable transformations, etc.
  • Designed/Developed IDQ reusable mappings to match accounting data based on demographic information.
  • Worked with IDQ on data quality: cleansing data, removing unwanted data, and verifying the correctness of data.
  • Created tasks in the Workflow Manager, and exported and executed IDQ mappings.
  • Worked on IDQ parsing, IDQ Standardization, matching, IDQ web services.
  • Imported the mappings developed in data quality (IDQ) to Informatica designer.
  • Worked on Informatica Analyst Tool IDQ, to get score cards report for data issues. Extensively worked on Performance Tuning of ETL Procedures and processes.
  • Developed HP Vertica vsql scripts for bulk loading and delta loading of stage and target tables.
  • Developed scripts to copy data between various Vertica environments
  • Created end-to-end ETL data lineage documents covering source, target, interface and transformation details at the table level.
  • Used IDQ to complete initial data profiling and removing duplicate data.
  • Design reference data and data quality rules using IDQ and involved in cleaning the data using IDQ in Informatica Data Quality environment.
  • Developed interface design specifications for sourcing data as well as sending extracts.
  • Sourced data from Oracle and flat files and loaded it into target tables as SCD Type 1, SCD Type 2 and full-refresh loads.
  • Worked exclusively on Autosys (scheduling tool) to create and schedule Informatica jobs as demanded.
  • Extracted data from wide variety of data Sources like Flat files and Relational Databases (Oracle& SQL Server).
  • Involved in massive data profiling using IDQ (Analyst tool) prior to data staging.
  • Created profiles and score cards for the users using IDQ.
  • Used shortcuts to reuse objects without creating multiple objects in the repository and inherit changes made to the source automatically.
  • Created sessions, batches for incremental load into staging tables, and schedule them to run daily/weekly/monthly.
  • Extensively worked on Performance tuning to increase the throughput of the data load (like read the data from flat file & write the data into target flat files to identify the bottlenecks).
  • Created UNIX shell and Wrapper Scripts to run the INFA Workflows
  • Used Autosys and cronjobs in UNIX environment for scheduling routine tasks.
  • Used Control M scheduling tool to schedule the Jobs.
  • Used IDQ to profile the project source data, define or confirm the definition of the metadata, cleanse and accuracy check the project data, check for duplicate or redundant records, and provide information on how to proceed with ETL processes.
  • Extensively worked on SQL tuning process to increase the source qualifier throughput by analyzing the queries with explain plan, creating new indexes, partitions and Materialized views.
  • Developed, executed Test Plans and Test Cases to validate and check Referential integrity of data extracts before loading it to Data Warehouse.
  • Worked closely with the development team during the development phase of the project to ensure that standards and processes were followed, unit/integration test plans were written and tested, and the migration checklist was prepared.
  • Developed PowerShell scripts to automate file transfers.
  • Responsible for migration of code from Dev to QA, UAT and PROD for ETL environments.
  • Provided expertise in design changes and strategy.
  • Performed ETL job tuning.
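The SCD Type 1/Type 2 loading mentioned above follows a well-known pattern: when a dimension record's attributes change, expire the current row and insert a new version with fresh effective dates. A minimal Python sketch of the Type 2 branch (the column names and the 9999-12-31 high-date sentinel are illustrative assumptions, not the project's actual schema):

```python
from datetime import date

HIGH_DATE = date(9999, 12, 31)  # conventional "open-ended" sentinel

def scd2_merge(dim, incoming, today):
    """Apply SCD Type 2: expire changed rows and insert new versions.

    dim:      list of dicts with keys key, attrs, eff_from, eff_to, current
    incoming: dict mapping natural key -> latest attribute dict
    """
    by_key = {r["key"]: r for r in dim if r["current"]}
    for key, attrs in incoming.items():
        cur = by_key.get(key)
        if cur is None or cur["attrs"] != attrs:
            if cur is not None:            # expire the previous version
                cur["eff_to"] = today
                cur["current"] = False
            dim.append({"key": key, "attrs": attrs,
                        "eff_from": today, "eff_to": HIGH_DATE,
                        "current": True})
    return dim

# Hypothetical dimension with one current row, then a changed attribute arrives
dim = [{"key": "C1", "attrs": {"city": "NY"},
        "eff_from": date(2020, 1, 1), "eff_to": HIGH_DATE, "current": True}]
scd2_merge(dim, {"C1": {"city": "LA"}}, date(2021, 6, 1))
```

In PowerCenter the same branch logic is typically expressed with a Lookup plus an Update Strategy transformation.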

Environment: Informatica PowerCenter 10.1.1/9.6.1/9.5.1, PowerExchange, IDQ, Oracle 10g, SQL Server 2005/2008, Autosys, PowerShell, TFS, UNIX, Windows XP/Windows 7

Confidential, St Louis, MO

Sr. ETL/IDQ Developer


  • Integrated ActiveVOS into hub and made sure that default workflows are integrated. Deployed the default workflows manually if required.
  • Created RESTful Web Services on IDD using provisioning tool which is used by third party applications for accessing the MDM Hub.
  • Used 7+ Address Validators with different scopes, exceeding the default limit of 3 Address Validators in one mapping; changed the default behavior of Address Doctor.
  • Used ETL process for cleansing, delta detection for minimizing the processing load on MDM before bringing in the data into landing tables.
  • Worked on profiling the data using Developer/Analyst Tool for identifying the data integrity from different sources.
  • Created Java user exits using SIF APIs to customize MDM Hub functionality.
  • Used validation rules extensively (50+), above and beyond Informatica's advised limit of around 27, so that MDM picks up the right column.
  • Developed mappings for loading the data from landing to the stage tables.
  • Created the custom cleanse functions and configured custom as well as default cleanse functions into mappings.
  • Defined Match rules in Match and Merge settings of the base tables by creating Match Path Components, Match Columns and Rule sets.
  • Used SIF APIs (GET, SearchMatch, PUT, CleansePut, ExecuteBatchDelete, etc.) to test search, update, cleanse, insert and delete of data from SoapUI.
  • Configured match rule set filters for meeting the different data scenarios using SQL filter, segment matching/Segment all matching and non-equal matching.
  • Performed match/merge, ran match rules to check the effectiveness of MDM on data, and fine-tuned the match rules.
  • Involved in analyzing different modules of facets system and EDI interfaces to understand the source system and source data.
  • Accepted inbound transactions from multiple sources using FACETS.
  • Supported integrated EDI batch processing and real-time EDI using FACETS.
  • Created custom indexes using Register Custom Index SIF API for improving the performance of the load, match, and merge process.
  • Installed and configured Informatica MDM Hub Console, Hub Store, Cleanse and Match Server, Address Doctor and Informatica PowerCenter applications.
  • Implemented the pre-land and land process of loading the dataset into Informatica MDM Hub
  • Configured and documented the Informatica MDM Hub to perform loading, cleansing, matching, merging and publication of MDM data
  • Worked on Real Time Integration between MDM Hub and External Applications using SIF API for JMS
  • Closely worked with Data Steward Team for designing, documenting and configuring Informatica Data Director and worked on IDD user exits.
  • Worked independently on different claims systems - FACETS, NASCO, WGS.
  • Worked on scheduling the load, stage, match, merge jobs in appropriate sequence.
  • Used ActiveVOS for configuring workflows like one step approval, merge and unmerge tasks.
  • Configured static lookups, dynamic lookups, bulk uploads and Smart search in IDD.
  • Worked with the Provisioning tool for custom configuring various views as part of the Entity 360 view for different roles.
  • Configured JMS Message Queues and appropriate triggers for passing the data to the contributing systems/receiving downstream systems.
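The ETL-side delta detection used above to minimize MDM processing load is commonly implemented by hashing the columns that matter for each record and comparing against the hashes saved from the prior run. A minimal Python sketch under that assumption (key and column names are hypothetical):

```python
import hashlib

def row_hash(row, cols):
    """Stable digest of the columns relevant for change detection."""
    payload = "|".join(str(row.get(c, "")) for c in cols)
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()

def detect_deltas(previous, current, key, cols):
    """Compare the current extract against prior-run hashes.

    previous: dict of key value -> hash from the last run
    current:  iterable of row dicts from this run
    Returns (inserts, updates); unchanged rows are skipped entirely.
    """
    inserts, updates = [], []
    for row in current:
        h = row_hash(row, cols)
        k = row[key]
        if k not in previous:
            inserts.append(row)        # never seen: load to landing
        elif previous[k] != h:
            updates.append(row)        # changed: reload to landing
    return inserts, updates

# Hypothetical prior run and current extract
prev = {"1": row_hash({"id": "1", "name": "Ann"}, ["name"])}
current = [{"id": "1", "name": "Anne"}, {"id": "2", "name": "Bob"}]
ins, upd = detect_deltas(prev, current, "id", ["name"])
```

Only `ins` and `upd` go on to the landing tables, which is what keeps the downstream stage/load/match/merge batches small.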

Environment: Multi-Domain MDM 10.1, IDD, Oracle 11g, Oracle PL/SQL, Windows Application Server, JBoss Application Server, ActiveVOS, SIF API, Informatica Power Center 10.1, Provisioning tool, Address Doctor 5.1, PowerShell/Bat scripts.

Confidential, Chicago, IL

Sr. Informatica/MDM Developer


  • Involved in leading and monitoring the team, assigning tasks, reviewing development activity and holding status calls.
  • Produced detailed MDM design specifications consistent with the high-level MDM design specifications.
  • Coordinated with ETL team for performing batch process to populate data from external source systems to landing tables in hub.
  • Analyzed the source systems data for identifying integrity issues using Analyst Tool for profiling the Data.
  • Provided feedback to the stakeholders based on profiling results, which helped determine the trust scores and validation rules.
  • Configured Landing, staging tables and Base Object tables.
  • Configured trust scores and validation rules in the hub.
  • Worked on integration of external application with MDM Hub using SIF APIs.
  • Designed, documented and configured the Informatica MDM Hub to support initial data loads, incremental loads and cleansing.
  • Worked on Address Doctor for cleansing addresses using Developer tool before feeding into landing tables.
  • Analyzed and implemented the existing claim adjudication process in FACETS.
  • Used SOAP UI to perform SIF API calls like clean tables etc.
  • Worked on data cleansing and standardization using the cleanse functions in Informatica MDM.
  • Defined Match rules in Match and Merge settings of the base tables by creating Match Path Components, Match Columns and Rule sets.
  • Used filters, segment/segment all matching and non-equal matching.
  • Performed match /merge and ran match rules to check the effectiveness of MDM on data and fine-tuned the match rules.
  • Customized user exits for different scenarios.
  • Used Hierarchy tool for configuring entity base objects, entity types, relationship base objects, relationship types, profiles.
  • Performed data integration with the claim processing engine (FACETS).
  • Closely worked with Data Steward Team for designing, documenting and configuring Informatica Data Director.
  • Used Native BPM for configuring workflows like One-step approval, merge and unmerge.
  • Used Repository Manager/Change List for migrating incremental as well as bulk meta-data.
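Match-and-merge rule sets like those configured above typically combine exact matches on strong identifiers with fuzzy matches on names. A toy Python sketch of a weighted composite score follows; the fields, weights and threshold are illustrative assumptions, not Informatica MDM's actual matching algorithm:

```python
from difflib import SequenceMatcher

def match_score(a, b):
    """Composite score: exact on a tax-id key, fuzzy on name, exact on zip.

    Weights (0.5 / 0.3 / 0.2) are hypothetical tuning values.
    """
    score = 0.0
    if a.get("tax_id") and a["tax_id"] == b.get("tax_id"):
        score += 0.5
    name_sim = SequenceMatcher(None, a.get("name", "").lower(),
                               b.get("name", "").lower()).ratio()
    score += 0.3 * name_sim
    if a.get("zip") == b.get("zip"):
        score += 0.2
    return score

def should_merge(a, b, threshold=0.75):
    """Auto-merge above the threshold; borderline pairs go to a steward."""
    return match_score(a, b) >= threshold

# Hypothetical records
a = {"tax_id": "111", "name": "John Smith", "zip": "10001"}
b = {"tax_id": "111", "name": "JOHN SMITH", "zip": "10001"}
c = {"tax_id": "222", "name": "Mary Jones", "zip": "60601"}
```

Fine-tuning the match rules, as described above, amounts to adjusting these weights and the threshold until false merges and missed matches are both acceptable.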

Environment: Multi-Domain MDM 9.7, IDD, Address Doctor, Oracle 11g, Oracle PL/SQL, SIF API, Windows Application Server, Native BPM.

Confidential, Atlanta, GA

Informatica/IDQ Developer


  • Worked closely with Development managers to evaluate the overall project timeline.
  • Interacted with the users and made changes to Informatica mappings according to the business requirements.
  • Developed Informatica mappings using Aggregators, SQL overrides in Lookups, source filters in Source Qualifiers, and data flow management into multiple targets using Routers.
  • Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Lookup, Router and Aggregator to create robust mappings in the Informatica Power Center Designer.
  • Responsible for best practices like naming conventions, Performance tuning, and Error Handling
  • Responsible for Performance Tuning at the Source level, Target level, Mapping Level and Session Level.
  • Involved in standardization of data, such as changing a reference data set to a new standard.
  • Ensured that data validated by a third party was checked for accuracy (DQ) before being provided to the internal transformations.
  • Used Address validator transformation in IDQ.
  • Involved in massive data profiling using IDQ (Analyst tool) prior to data staging.
  • Created partitioned tables, partitioned indexes for manageability and scalability of the application. Made use of Post-Session success and Post-Session failure commands in the session task to execute scripts needed for cleanup and update purposes.
  • Extensively worked in ETL and data integration, developing ETL mappings and scripts using SSIS; worked on data transfer from a text file to SQL Server using the Bulk Insert task in SSIS.
  • Extensively used the Business Objects functionality such as Master-Detail, Slice and Dice, Drill Down and Hierarchies for creating reports.
  • Implemented slowly changing dimensions Type 2 using ETL Informatica tool.
  • Designed best practices on Process Sequence, Dictionaries, Data Quality Lifecycles, Naming Convention, and Version Control.
  • Created Tableau worksheets involving schema import and implementing the business logic through customization.
  • Created Use-Case Documents to explain and outline data behavior.
  • Worked with the Informatica Developer (IDQ) tool to ensure data quality for consumers.
  • Used the Address Validator transformation for validating customer addresses from various countries using the SOAP interface.
  • Created PL/SQL programs such as procedures, functions, packages and cursors to extract data from the target system.
  • Involved in deployment of IDQ mappings to application and to different environments.
  • Logged defects and submitted change requests using the Defects module of Test Director.
  • Worked with different Informatica tuning issues and fine-tuned the transformations to make them more efficient in terms of performance.
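The reference-data standardization described above (mapping legacy values onto a new standard and flagging whatever cannot be mapped for DQ review) can be sketched as a simple crosswalk lookup. The state-code table and column names here are hypothetical:

```python
# Hypothetical crosswalk from legacy state values to the new standard codes.
STATE_CROSSWALK = {
    "N.Y.": "NY", "NY": "NY", "New York": "NY",
    "Ill.": "IL", "IL": "IL", "Illinois": "IL",
}

def standardize(value, crosswalk, default=None):
    """Map a raw reference value to its standard code, if known."""
    key = value.strip()
    return crosswalk.get(key, default)

def standardize_rows(rows, column, crosswalk):
    """Rewrite one column in place; unmapped values are returned for review."""
    unmapped = []
    for row in rows:
        std = standardize(row[column], crosswalk)
        if std is None:
            unmapped.append(row[column])   # leave untouched for DQ follow-up
        else:
            row[column] = std
    return unmapped

# Hypothetical incoming rows
rows = [{"state": "N.Y."}, {"state": "Illinois"}, {"state": "TX"}]
unmapped = standardize_rows(rows, "state", STATE_CROSSWALK)
```

In IDQ the same idea is expressed with a Standardizer transformation backed by a reference table, with unmapped values routed to an exception table.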

Environment: Informatica PowerCenter 9.5/9.1, IDQ, SAS, Business Objects 3.1, Oracle 11g, UNIX, PL/SQL, SQL*Plus, SQL Server 2008 R2, TOAD, MS Excel 2007.
