
Sr IDQ/MDM Developer Resume


Minneapolis, MN

SUMMARY:

  • Over 8 years of IT experience in analysis, design, development, and implementation of software solutions in data warehousing using tools such as Informatica PowerCenter 9.5, Informatica Data Quality (IDQ) 9.6.1, Informatica Master Data Management (MDM) 10.0, Informatica PowerExchange 9.1, and Informatica Data Validation Option (DVO) 9.5.
  • Extensive database experience using Oracle 11g/10g/9i, Teradata, DB2, MS SQL Server 2010/2005, SQL, and PL/SQL.
  • Strong data warehousing/ETL experience using the Informatica PowerCenter client tools - Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
  • Experience using Informatica PowerCenter for ETL extraction, transformation, and loading into the target data warehouse.
  • Strong in data warehousing concepts, dimensional Star Schema and Snowflake Schema methodologies, and Slowly Changing Dimensions (SCD Type 1/Type 2); a sketch of the Type 2 pattern follows this list.
  • Extensive experience in creating complex ETL mappings, mapplets, and workflows using Informatica PowerCenter to move data from multiple sources into the target area.
  • Worked on non-relational sources like Flat Files and used secure FTP.
  • Proven experience translating business problems into actionable data quality initiatives.
  • Experience in data profiling & data quality rules development with business and IT end users.
  • Strong experience with the Informatica Data Quality (IDQ) 9.6.1 tool, including complex quality rule development and implementation patterns covering cleansing, parsing, standardization, validation, scorecards, exception handling, notification, and reporting, with both ETL and real-time considerations.
  • Worked with most of the IDQ transformations (Standardizer, Parser, Exception, Address Validator, Merge, etc.) within mapplets.
  • Experience with Informatica Analyst 9.6.1 to analyze the Source data quality issues by creating the Profiles and applying the data quality rules.
  • Worked on exception handling mappings for data quality, data profiling, data cleansing, and data validation using IDQ.
  • Experience in exporting IDQ objects as mappings or mapplets from IDQ Developer/Analyst to PowerCenter and reusing them there.
  • Experience in implementing data quality rules on Hadoop big data sources (HDFS and Hive tables) using Informatica Big Data Edition with IDQ 9.6.1.
  • Experience in implementing the Change Data Capture Process using Informatica Power Exchange.
  • Expertise in implementing Customer, Product and Supplier Master Data Management domains.
  • Experience in working with Informatica DVO tool and creating SQL Views, Lookup Views and Table pairs for the Automation Testing.
  • Developed UNIX shell scripts to automate applications, schedule jobs, and develop interfaces.
  • Experience in developing stored procedures, functions, triggers, cursors, joins, and SQL queries with T-SQL and PL/SQL, working with TOAD, SQL Developer, and SQL*Plus.
  • Extensive knowledge in Business Intelligence and Data Warehousing Concepts with emphasis on ETL and System Development Life Cycle (SDLC).
  • Improved the performance of mappings using various optimization techniques.
  • Experience in Agile and Waterfall methodology projects.
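
A minimal SQL sketch of the SCD Type 2 pattern referenced above: expire the current dimension row when a tracked attribute changes, then insert a new current version. All table, column, and sequence names (dim_customer, stg_customer, dim_customer_seq) are hypothetical stand-ins for the actual Informatica mapping logic.

    -- Step 1: close out current rows whose tracked attributes changed in staging.
    -- (Names are illustrative only.)
    UPDATE dim_customer d
       SET d.eff_end_date = TRUNC(SYSDATE) - 1,
           d.current_flag = 'N'
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM stg_customer s
                    WHERE s.customer_id = d.customer_id
                      AND (s.address <> d.address OR s.segment <> d.segment));

    -- Step 2: insert a new current version for changed and brand-new customers.
    INSERT INTO dim_customer
           (customer_key, customer_id, address, segment,
            eff_start_date, eff_end_date, current_flag)
    SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.address, s.segment,
           TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
      FROM stg_customer s
     WHERE NOT EXISTS (SELECT 1
                         FROM dim_customer d
                        WHERE d.customer_id = s.customer_id
                          AND d.current_flag = 'Y');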

TECHNICAL SKILLS:

ETL Tools: Informatica PowerCenter 9.5/9.0.1/8.6/8.1/7.1, Informatica IDQ 9.6.1, Informatica Analyst 9.6.1/9.1/8.6, Siperian/Informatica MDM 10.0/9.1, Informatica PowerExchange 9.1, Informatica DVO 9.5, Informatica Big Data Edition 9.6.1

Reporting Tools: Cognos BI - Framework Manager, Query Studio, Report Studio

Programming Languages: SQL, PL/SQL, HTML, basic UNIX shell scripting

Databases: Oracle 9i/10g/11g, Teradata, DB2, MS SQL Server 2010/2005

Operating System: Windows 98/NT/2000/XP/7, UNIX

Programming Tools: SQL*Plus, SQL*Loader, TOAD 10.0, Putty

Office Tools: MS PowerPoint, MS Word, MS Excel

Scheduling tools: Control M, Skybot, Robot, Crontab, UC4

Defect Management: HP Quality Center, HP ALM, JIRA, Assembla, ServiceNow

Internet Tools: HTML, XML, XSLT

PROFESSIONAL EXPERIENCE:

Confidential, Minneapolis, MN

Sr IDQ/MDM Developer

Responsibilities:

  • Created source-to-target mapping documents for the various mappings and rules developed for the business line.
  • Worked with the Informatica Data Quality (IDQ) 9.6.1 toolkit for analysis, data cleansing, data matching, data conversion, exception handling, scorecards, reporting, and monitoring.
  • Extensively used DQ transformations such as Address Validator, Exception, Parser, Standardizer, Match, Merge, Comparison, and Case Converter.
  • Worked on Discovery, Design and Score card modules in Informatica Analyst 9.6.1.
  • Developed Mapping specification and rule specification in Informatica Analyst 9.6.1.
  • Created reference tables with customer-specific data and used them in the standardization process (see the SQL sketch after this list).
  • Configured MDM landing, staging and base object tables in MDM Hub.
  • Created complex mappings including cleanse functions and transformation rules, applying trust settings and validation rules for the MDM Hub.
  • Created IDQ mapplets, exported them as web services, and used them as cleanse functions in the MDM Hub.
  • Used the built-in SIF (MVC model) architecture to build the action classes.
  • Involved in the configuration of the target-actions.xml file based on the SIF framework.
  • Created match rules, including match paths and columns for the base objects, using options such as Key Widths, Key Types, Purposes, Match Levels, and Match Filters.
  • Configured application security via roles and users in the SAM/Admin console.
  • Created queries and packages for Data Stewards and the IDD application.
  • Designed and created IDD tasks, workflows, data security, and masking.
  • Configured IDD applications for Data Stewards to access data for data governance.
  • Designed and configured hierarchies in the Hub using the Hierarchies Workbench.
  • Analyzed the source feeds for various systems to design the ETL Process flow.
  • Imported source and target tables from the respective databases, created reusable transformations (Joiner, Router, Lookup, Filter, Expression, Aggregator), and built new mappings using the Designer module of Informatica.
  • Involved in the Unit Testing and Integration testing of the workflows developed.
  • Implemented reusable logic by creating mapplets, which are reused in several mappings.
  • Tuned Informatica mappings and sessions to remove bottlenecks for better performance.
  • Worked extensively with SVN for version control of various scripts and code.
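
As referenced above, a hedged sketch of reference-table standardization in plain SQL, mirroring what the IDQ Standardizer and Exception transformations do: raw values are mapped to the customer-approved standard form, and unmatched rows are routed to an exception table for data-steward review. All object names (stg_party, ref_state_codes, stg_party_exceptions) are hypothetical.

    -- Map raw state values to the approved standard form via a reference table.
    -- (Names are illustrative only.)
    MERGE INTO stg_party p
    USING ref_state_codes r
       ON (UPPER(TRIM(p.state_raw)) = r.raw_value)
     WHEN MATCHED THEN
       UPDATE SET p.state_std = r.standard_value;

    -- Rows with no reference match go to an exception table for steward review,
    -- analogous to the IDQ Exception transformation.
    INSERT INTO stg_party_exceptions (party_id, state_raw, load_date)
    SELECT p.party_id, p.state_raw, SYSDATE
      FROM stg_party p
     WHERE p.state_std IS NULL;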

Environment: Informatica IDQ 9.6.1, Informatica Analyst 9.6.1, Informatica Big Data Edition 9.6.1 (IDQ), Informatica MDM 10.0, Informatica Data Director (IDD) 10.0, Oracle 11g, TOAD 12.7, Teradata, HP ALM, UC4, WinSCP

Confidential, Detroit, Michigan

ETL Informatica Developer

Responsibilities:

  • Interacted with the Product Owners to gain an understanding of the business logic.
  • Participated in Agile team activities including daily standups, sprint planning, and product demos. Worked with the Product Owner to understand requirements and translate them into the design, development, and implementation of ETL processes using Informatica 9.x.
  • Worked with business and technical resources to address business information and application needs.
  • Developed SQL and DB2 procedures, packages and functions to process the data for CGR Project (Complete Goods Reporting).
  • Involved in data validation, data integrity, database-related performance, field size validation, check constraints, and data manipulation and updates using SQL.
  • Extensively worked on the Informatica data integration platform.
  • Extracted data from flat files, DB2, and SAP, and loaded it into the respective tables.
  • Implemented various loads, such as daily, weekly, and quarterly loads, using an incremental loading strategy for CGR.
  • Developed Slowly Changing Dimension Mappings for Type 1 SCD, Type 2 SCD and fact implementation.
  • Extensively used mapping variables, mapping parameters, and parameter files for capturing delta loads (see the SQL sketch after this list).
  • Extensively used Normal Join, Full Outer Join, Detail Outer Join, and Master Outer Join in the Joiner transformation.
  • Used CSV files and tables as sources and loaded the data into relational tables.
  • Created and Configured Workflows, Worklets, and Sessions to load the data to Netezza tables using Informatica PowerCenter.
  • Worked with the business analyst team to analyze the final data and fix any issues.
  • Used pushdown optimization to achieve good performance in loading data into Netezza.
  • Responsible for determining the bottlenecks and fixing the bottlenecks with performance tuning.
  • Extensively worked on Informatica metadata tables and created reusable transformations for error handling and exception reprocessing.
  • Developed DVO table to table validation scripts and automated in production.
  • Worked with the Informatica Data Quality (IDQ) 8.6.1 toolkit for analysis, data cleansing, data matching, data conversion, exception handling, reporting, and monitoring.
  • Worked on Java code for testing ETL frame work.
  • Conduct code walkthroughs and review peer code and documentation.
  • Wrote UNIX scripts for handling the source data and mappings.
  • Validated the ongoing data synchronization process using validation tools to ensure the data in the source and target systems stays synchronized.
  • Extensively used ESP tool for scheduling Informatica batch jobs and provided production support on rotation.
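
As noted in the delta-load bullet above, a minimal sketch of the incremental pattern in SQL, equivalent to driving the Source Qualifier filter from a mapping variable such as $$LAST_RUN_TS. The control table, job name, and column names below are hypothetical.

    -- Pull only rows changed since the previous successful run.
    -- (Names are illustrative only.)
    SELECT o.order_id,
           o.customer_id,
           o.order_amount,
           o.last_update_ts
      FROM src_orders o
     WHERE o.last_update_ts > (SELECT c.last_run_ts
                                 FROM etl_load_control c
                                WHERE c.job_name = 'CGR_DAILY_LOAD');

    -- After a successful load, advance the control row so the next run
    -- starts where this one left off.
    UPDATE etl_load_control
       SET last_run_ts = SYSTIMESTAMP
     WHERE job_name = 'CGR_DAILY_LOAD';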

Environment: Informatica PowerCenter 9.5, Informatica IDQ 9.5, Informatica DVO 9.5, Oracle 11g, SQL Server 2010, PL/SQL, Cognos Report Studio, Cognos Framework Manager, Linux server, UNIX shell scripting, Assembla, ServiceNow, Skybot, Putty

Confidential

Sr MDM/ETL Developer

Responsibilities:

  • Worked on building a new data warehouse for customer data.
  • Worked with team leads and interfaced with business analysts and end users.
  • Worked with data modelers to understand the financial data model and provided suggestions for the logical and physical data models.
  • Document unit test cases and provide QA support to make testers understand the business rules and code implemented in Informatica.
  • Defined and configured the schema, landing tables, staging tables, base objects, foreign-key relationships, lookup systems and tables, packages, query groups, and queries/custom queries.
  • Used cleanse functions to cleanse and standardize data while loading it into the stage tables.
  • Enabled delta detection to extract the incremental data.
  • Defined the Systems, Trust Scores and Validation rules.
  • Created the Match/Merge rule sets to get the right master records.
  • Performed data steward operations by editing the records using data manager and merge manager.
  • Used the Hierarchy Manager tool to configure entity base objects, entity types, relationship base objects, relationship types, profiles, and put and display packages, and used the entity types as subject areas in IDD.
  • Implemented IDD applications and created subject area groups, subject areas, subject area children, and IDD display packages in the Hub.
  • Imported and exported the ORS using the Metadata Manager.
  • Configured Address Doctor to cleanse customer address data.
  • Created mappings using various transformations such as Joiner, Filter, Aggregator, Lookup, Router, Sorter, Expression, Normalizer, Sequence Generator, Update Strategy, data standardization, and Address Validator.
  • Implemented SCD Type 1 and SCD Type 2 for loading data into data warehouse dimension tables.
  • Implemented error handling for invalid and rejected rows by loading them into error tables (see the SQL sketch after this list).
  • Implemented the Change Data Capture process using Informatica PowerExchange.
  • Involved in Performance Tuning of ETL code by addressing various issues during extraction, transformation and loading of data.
  • Worked with Support team to migrate the ETL code to Production servers from development servers by creating a mapping document and Request for Change (RFC) of the process.
  • Involved in Knowledge Transfer sessions to Support team after completion of UAT signoff.
  • Handled Production issues and monitored Informatica workflows in production.
  • Extensively worked on the batch framework used to schedule all Informatica jobs.
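
A hedged sketch of the error-handling approach mentioned above: rows failing basic validation are copied to an error table with a reason code instead of being loaded into the warehouse. All table, column, and reason names are hypothetical.

    -- Route invalid rows to an error table with a reason code.
    -- (Names and rules are illustrative only.)
    INSERT INTO err_customer (customer_id, source_row_id, error_reason, load_date)
    SELECT s.customer_id,
           s.row_id,
           CASE
             WHEN s.customer_id IS NULL   THEN 'MISSING_CUSTOMER_ID'
             WHEN s.birth_date > SYSDATE  THEN 'FUTURE_BIRTH_DATE'
             ELSE 'INVALID_COUNTRY_CODE'
           END,
           SYSDATE
      FROM stg_customer_feed s
     WHERE s.customer_id IS NULL
        OR s.birth_date > SYSDATE
        OR s.country_code NOT IN (SELECT c.country_code FROM ref_country c);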

Environment: Informatica PowerCenter 9.1, Informatica MDM 9.1, Oracle, PL/SQL, DB2, Rally, HP Quality Center, Informatica PowerExchange 9.1

Confidential

Informatica Developer

Responsibilities:

  • Designed, developed, and managed the workflow processes to reflect business requirements with several adapters, exceptions, and rules.
  • Worked on CDC (Change Data Capture) to implement SCD (Slowly Changing Dimensions).
  • Tuned the SQL queries used in the SQL override and lookup override of the Source Qualifier and Lookup transformations (see the SQL sketch after this list).
  • Worked with data modelers to understand the financial data model and provided suggestions for the logical and physical data models.
  • Involved in creation of initial data set up in the Production environment and involved in code migration activities to Production.
  • Provided administrative functions like creating repositories, backing up repositories, setting up users, assigning permissions and setting up folders in Repository manager.
  • Wrote shell script utilities to detect error conditions in production loads and take corrective actions, and wrote scripts to back up and restore repositories and log files.
  • Migrated Informatica objects from development to QA and production using deployment tool.
  • Provided production support and involved with root cause analysis, bug fixing and promptly updating the business users on day-to-day production issues.
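
A minimal illustration of the lookup-override tuning mentioned above: restricting the lookup query to the needed columns and rows, and ordering on the lookup condition column, rather than caching the entire table. The accounts table and its columns are hypothetical.

    -- An untuned lookup would cache every column of every row:
    --   SELECT * FROM accounts;
    -- Tuned override (illustrative): only the condition and return columns,
    -- only the rows that can match, ordered on the lookup condition column.
    SELECT a.account_id,
           a.account_status,
           a.branch_code
      FROM accounts a
     WHERE a.account_status = 'ACTIVE'
     ORDER BY a.account_id;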

Environment: Informatica PowerCenter 7.1, Oracle 9i, PL/SQL, SQL*Plus, Windows, UNIX, AutoSys.
