
Sr. Mdm Engineer Resume


Milwaukee, WI

SUMMARY

  • 9+ years of IT experience in design, analysis, development, documentation, coding, and implementation, including databases, data warehousing, ETL design, Oracle, PL/SQL, SQL Server databases, SSIS, Informatica MDM (versions 10.3, 10.2), Informatica PowerCenter 9.x/8.x/7.x, Informatica Data Quality, etc.
  • Expertise in Master Data Management concepts and methodologies, with the ability to apply this knowledge in building MDM solutions
  • Experience in installation and configuration of core Informatica MDM Hub components such as Hub Console, Hub Store, Hub Server, Cleanse Match Server, and Cleanse Adapter in Windows.
  • Configured administrative activities related to the MDM platform, including but not limited to MDM Hub, Process Server, ActiveVOS, Provisioning, and the IDD/C360 UI.
  • Understood and translated requirements pertaining to Informatica's C360 application implementation for a Customer MDM model.
  • Designed Informatica master data C360 processes, C360 model objects, entities, etc.
  • Implemented the C360 product functionality to support MDM application design within SFDC
  • Configured C360 application configuration for Customer objects
  • Worked with offshore teams to design and develop/configure C360 functionality
  • Deep experience and expertise in designing Informatica's C360 Master Data Management solution
  • Configured Informatica Data Director (IDD); designed workflow customizations for Data Stewards
  • Expertise in the ETL Tool Informatica and have extensive experience in Power Center Client tools including Designer, Repository Manager, Workflow Manager/ Workflow Monitor
  • Extensively worked with complex mappings using various transformations like Filter, Joiner, Router, Source Qualifier, Expression, Union, Unconnected/Connected Lookup, Aggregator, Stored Procedure, XML Parser, Normalizer, Sequence Generator, Update Strategy, Reusable Transformations, User Defined Functions, etc.
  • Experienced in using Informatica Data Quality (IDQ) tools for data analysis and data profiling in the IDQ Developer client, applying rules and developing mappings to move data from source to target systems
  • Experience in creating Column Profiling, Rule Profiling, Mid-stream Profiling, Scorecards, Decision tasks, Reference Tables, DQ Tables, and Notification tasks as part of workflows, and deploying them.
  • Expert in data extraction, transformation, and loading from data sources like Teradata, Oracle, SQL Server, XML files, flat files, COBOL, and VSAM files.
  • Production automation: designed and scheduled ETL/ELT processes for periodic automated updates, including FTP/SFTP/SCP file transfers, with built-in failover strategies to ensure data consistency and correctness
  • Managed Collibra Confidential across the enterprise, driving governance activities for all participating business units and ensuring all work activity was completed on time and to standards, while mitigating risks as needed.
  • Configured Collibra Communities, Domains, Types, Attributes, Statuses, Articulation, and Workflows, and customized attribution and solutions, including custom dashboards with data quality metrics, status, workflow initiation, and issue management for each domain's specific requirements.
  • Experience in creating Transformations and Mappings using Informatica Designer and processing tasks using Workflow Manager to move data from multiple sources into targets and vice versa
  • Extensive experience with the Address Validator transformation for North American address validation.
  • Developed, implemented, and optimized stored procedures and functions using T-SQL.
  • Analyzed existing T-SQL queries for performance improvements.
  • Good knowledge of creating and scheduling daily T-SQL jobs.
  • Extensively worked with relational database systems like Oracle 11g/10g/9i/8i, MS SQL Server, and Teradata, and source files like flat files, XML files, and COBOL files
  • Experience in ActiveVOS workflow design and creation of Human Tasks; extensive experience in MDM 10.x with IDQ and ActiveVOS 9.2.
  • Experience in managing ActiveVOS Central and setting up the Identity Service on the ActiveVOS Console
  • Proficient in data warehouse ETL activities using SQL, PL/SQL, Pro*C, SQL*Loader, C, data structures in C, UNIX scripting, Python scripting, and Perl scripting.
  • Experienced in mapping techniques for Type 1, Type 2, and Type 3 Slowly Changing Dimensions
  • Strong experience in SQL, PL/SQL, Tables, Database Links, Materialized Views, Synonyms, Sequences, Stored Procedures, Functions, Packages, Triggers, Joins, Unions, Cursors, Collections, and Indexes in Oracle
  • Sound knowledge of Linux/UNIX and shell scripting; experience with command-line utilities like pmcmd to execute workflows in non-Windows environments
  • Proficiency in working with Teradata utilities (BTEQ, FastLoad, FastExport, MultiLoad, Teradata Administrator, SQL Assistant, PMON, Visual Explain).
  • Experience in using automation and scheduling tools like Autosys, Tidal, Control-M, and Tivoli Maestro scripts.
  • Excellent Interpersonal and Communication skills coupled with strong technical and problem-solving capabilities.
  • Excellent analytical, problem solving, technical, project management, training, and presentation skills.
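The Slowly Changing Dimension work mentioned above can be illustrated with a minimal Python sketch of Type 2 handling: expire the current dimension row and insert a new version when a tracked attribute changes. The field names (`customer_id`, `address`) and date handling are illustrative, not taken from any actual mapping:

```python
from datetime import date

def apply_scd_type2(dim_rows, incoming, key="customer_id", tracked=("address",),
                    today=None):
    """Minimal SCD Type 2 sketch: expire the current row and insert a new
    version when a tracked attribute changes. Field names are illustrative."""
    today = today or date.today().isoformat()
    out = list(dim_rows)
    for rec in incoming:
        current = next((r for r in out
                        if r[key] == rec[key] and r["end_date"] is None), None)
        if current is None:
            # brand-new key: insert the first version
            out.append({**rec, "start_date": today, "end_date": None})
        elif any(current[c] != rec[c] for c in tracked):
            current["end_date"] = today            # expire the old version
            out.append({**rec, "start_date": today, "end_date": None})
        # rows with no tracked change are left untouched (Type 2 semantics)
    return out
```

A Type 1 mapping would instead overwrite the tracked columns in place; Type 3 would keep a single "previous value" column.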

TECHNICAL SKILLS

Operating System: UNIX, Linux, Windows

Programming and Scripting: C, C++, Java, Python, .NET, Perl Scripting, Shell Scripting, XSLT, PL/SQL, T-SQL.

Specialist Applications & Software: Informatica MDM (10.3, 10.2, 10.1), Informatica PowerCenter 10.2/10.1/10/9.6.1/9.5/9.1/8.6/8.1/7.1, Informatica Power Exchange, Metadata Manager, Informatica Data Quality (IDQ), Informatica Data Director (IDD), Customer C360, MDM, SSIS, Salesforce, DataStage, Profisee, etc.

Data Modeling (working knowledge): Relational Modeling, Dimensional Modeling (Star Schema, Snow-Flake, FACT, Dimensions), Physical, Logical Data Modeling, and ER Diagrams.

Databases tools: SQL Server Management Studio (2008), Oracle SQL Developer (3.0), Toad 11.6 (Oracle), Teradata, AQT v9 (Advanced Query Tool) (Oracle/Netezza), DB Artisan 9.0 (Sybase), SQL Browser (Oracle/Sybase), Visio, ERWIN

Scheduling tools: Informatica Scheduler, CA Scheduler (Autosys), ESP, Maestro, Control-M, BMC, Microsoft Task Scheduler

Conversion/Transformation tools: Informatica Data Transformation, Oxygen XML Editor (ver.9, ver.10)

Software Development Methodology: Agile, Waterfall.

Domain Expertise: Publishing, Insurance/Finance, HealthCare, Education, Industrial Automation

Others (working knowledge on some): OBIEE RPD creation, OBIEE Reports, ECM, Informatica Data Transformation XMAP, DAC, Rational Clear Case, WS-FTP Pro, DTD, Microsoft .NET/C#, Visual Studio

RDBMS: SQL, SQL*Plus, SQL*Loader, Oracle11g/10g/9i/8i, Teradata, IBM DB2, UDB 8.1/7.0, Sybase 12.5, Netezza v9, MS SQL Server 2000/2005/2008, MS Access 7.0/2000.

Data Governance Tool: Collibra

PROFESSIONAL EXPERIENCE

Confidential, Milwaukee WI

Sr. MDM Engineer

Responsibilities:

  • Designed, coded, and tested the migration from Informatica MDM to the new Profisee Master Data Management platform, including supporting applications and interfaces
  • Supported cross-functional development activity in conjunction with other integration technology resources like Microsoft .NET/C#, SQL Server, Visual Studio, and DevOps
  • Developed and tested infrastructure components in Cloud and Edge-level environments
  • Proactively monitored industry trends (BMC Control-M, Microsoft Task Scheduler) and identified opportunities to implement new technologies
  • Managed the CI/CD DevOps pipeline deployment model (Azure DevOps)
  • Enabled automated testing procedures
  • Implemented software across all environments using API and ETL tools (Informatica PowerCenter 10.2 HF1, Informatica IDQ 10.2 HF1, MuleSoft, SAP Data Services, Microsoft Azure Data Factory)
  • Modified mappings according to the business requirements for incremental fixes, using the Transformation Developer and Mapping Designer.
  • Extensively used Informatica Data Explorer (IDE) and Informatica Data Quality (IDQ) profiling capabilities to profile various sources, generate scorecards, create and validate rules, and provide data to business analysts for creating rules.
  • Created Informatica workflows and IDQ mappings for batch and real time.
  • Extracted data from various heterogeneous sources like DB2, Salesforce, mainframes, Teradata, and flat files using Informatica PowerCenter and loaded the data into the target database.
  • Created and Configured Landing Tables, Staging Tables, Base Objects, Foreign key relationships, Queries, Query Groups etc. in MDM.
  • Defined the Match and Merge Rules in MDM Hub by creating Match Strategy, Match columns and rules for maintaining Data Quality
  • Extensively worked with different transformations such as Expression, Aggregator, Sorter, Joiner, Router, Filter, and Union in developing the mappings to migrate the data from source to target.
  • Created and ran pre-existing and debug sessions in the Debugger to monitor and test sessions prior to their normal run in the Workflow Manager
  • Extensively worked in migrating the mappings, worklets and workflows within the repository from one folder to another folder as well as among the different repositories.
  • Created mapping parameters and variables and wrote parameter files.
  • Used SQL queries and database programming using PL/SQL (writing Packages, Stored Procedures/Functions, and Database Triggers).
  • Leveraged containerization models and worked with other engineers and architects to keep the architecture current
  • Assisted in the support and enhancement of applications
  • Wrote high-quality code compliant with regulations
  • Collaborated with business systems analysts and product owners to define requirements
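The mapping parameters and parameter files mentioned above follow a simple sectioned text format (a `[Folder.WF:workflow]` header followed by `$$VAR=value` lines). As a rough illustration, here is a small Python sketch of parsing such a file; the section and variable names are made up for the example:

```python
def parse_param_file(text):
    """Sketch of a parser for an Informatica-style parameter file.
    Sections look like [Folder.WF:wf_name]; entries look like $$VAR=value.
    Section and variable names used in tests are illustrative."""
    sections, current = {}, None
    for raw in text.splitlines():
        line = raw.strip()
        if not line or line.startswith("#"):
            continue                      # skip blanks and comments
        if line.startswith("[") and line.endswith("]"):
            current = line[1:-1]          # new section header
            sections.setdefault(current, {})
        elif "=" in line and current is not None:
            name, _, value = line.partition("=")
            sections[current][name.strip()] = value.strip()
    return sections
```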

Environment: Informatica MDM 10.3, Informatica IDQ 10.2 HF1, Profisee 2019 R1.2, Informatica PowerCenter 10.2 HF1, Erwin, Teradata, Tidal, SQL Assistant, DB2, XML, Microsoft .NET/C#, Visual Studio, Microsoft SQL Server Management Studio 18, Oracle 9i/10g/11g, MQ Series, OBIEE 10.1.3.2, Toad, and UNIX Shell Scripts.

Confidential

Sr. Informatica Developer/ MDM Developer/IDQ

Responsibilities:

  • Developed in the Informatica MDM tool (version 10.3) per project requirements
  • Designed data models within the MDM Hub and derived the MDM Hub architecture; experienced in conceptual and physical data model definition.
  • Designed and implemented mappings for bringing data into MDM Hub from the source systems.
  • Configured Informatica MDM Hub Match and Merge Rules
  • Configured Informatica Data Director (IDD); designed workflow customizations for Data Stewards
  • Configured administrative activities related to the MDM platform, including but not limited to MDM Hub, Process Server, ActiveVOS, Provisioning, and the IDD/C360 UI.
  • Configured the C360 view to display data for the root node of composite objects
  • Configured a unique layout for a role using C360
  • Configured IDD and used C360 to display the records that match the entity.
  • Designed Informatica master data C360 processes, C360 model objects, entities, etc.
  • Implemented the C360 product functionality to support MDM application design within SFDC
  • Configured C360 application configuration for Customer objects
  • Worked with offshore teams to design and develop/configure C360 functionality
  • Deep experience /expertise in designing Informatica's C360 Master Data Management solution
  • Experienced in one or more master data domains (Person, Customer, Product)
  • Used Informatica PowerCenter to load data from different data sources like XML files, flat files, Oracle, Teradata, and Salesforce.
  • Managed Collibra Confidential across the enterprise, driving governance activities for all participating business units and ensuring all work activity was completed on time and to standards, while mitigating risks as needed.
  • Expert in data extraction, transformation, and loading from data sources like Teradata, Oracle, SQL Server, XML files, flat files, COBOL, and VSAM files.
  • Production automation: designed and scheduled ETL/ELT processes for periodic automated updates, including FTP/SFTP/SCP file transfers, with built-in failover strategies to ensure data consistency and correctness.
  • Configured Collibra Communities, Domains, Types, Attributes, Statuses, Articulation, and Workflows, and customized attribution and solutions, including custom dashboards with data quality metrics, status, workflow initiation, and issue management for each domain's specific requirements.
  • Identified and validated the Critical Data Elements in IDQ.
  • Built several reusable components on IDQ using Standardizers and Reference tables which can be applied directly to standardize and enrich Address information.
  • Effectively worked in Informatica version based environment and used deployment groups to migrate the objects.
  • Used debugger in identifying bugs in existing mappings by analyzing data flow, evaluating transformations.
  • Extensively worked on Labeler, Parser, Key Generator, Match, Merge, and Consolidation transformations to identify the duplicate records.
  • Worked with Informatica Data Quality (IDQ) toolkit, Analysis, data cleansing, data matching, data conversion, exception handling, and reporting and monitoring capabilities of IDQ
  • Performed TDD for all the MDM components in each sprint across all iterations when following Agile methodology.
  • Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads, and process flow of the mappings
  • Provided 24x7 production supports for business users and documented problems and solutions for running the workflows.
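The match/merge and duplicate-identification work described above can be approximated in a few lines of Python. This is only a conceptual stand-in for MDM Hub fuzzy matching: `difflib` replaces the Hub's match engine, and the match columns (`name`, `city`) and threshold are assumptions for the example:

```python
from difflib import SequenceMatcher

def match_score(a, b, columns=("name", "city")):
    """Average fuzzy similarity over the match columns (illustrative
    stand-in for an MDM Hub fuzzy match rule)."""
    scores = [SequenceMatcher(None, str(a[c]).lower(), str(b[c]).lower()).ratio()
              for c in columns]
    return sum(scores) / len(scores)

def find_match_pairs(records, threshold=0.85):
    """Return index pairs whose score meets the merge threshold."""
    pairs = []
    for i in range(len(records)):
        for j in range(i + 1, len(records)):
            if match_score(records[i], records[j]) >= threshold:
                pairs.append((i, j))
    return pairs
```

In a real hub, the flagged pairs would feed the merge/consolidation step rather than being returned directly.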

Environment: Informatica MDM 10.3 HF1, Informatica IDQ 10.2 HF1, Informatica IDD, C360, SQL, UNIX, IDE, Oracle 12c, CDC, MDM, Java, Linux, Perl, AWS, WinSCP, Shell, PL/SQL, Netezza, Teradata, Collibra, Microsoft SQL Server 2008, and Microsoft Visual Studio

Confidential, Pittsburgh, PA

Sr. Informatica Developer/MDM Consultant

Responsibilities:

  • Involved in gathering and analyzing business requirements, writing requirement specification documents, and identifying data sources and targets.
  • Communicated with business customers to discuss the issues and requirements.
  • Designed, documented and configured the Informatica MDM Hub to support loading, cleansing of data.
  • Involved in implementing the Land Process of loading the customer/product Data Set into Informatica MDM from various source systems.
  • Worked on data cleansing and standardization using the cleanse functions in Informatica MDM.
  • Developed ETL programs using Informatica to implement the business requirements.
  • Developed several complex IDQ mappings using a variety of PowerCenter transformations, mapping parameters, mapping variables, mapplets, and parameter files in the Mapping Designer.
  • Experience in ActiveVOS workflow design and creation of Human Tasks; extensive experience in MDM 10.x with IDQ and ActiveVOS 9.2.
  • Experience in managing ActiveVOS Central and setting up the Identity Service on the ActiveVOS Console
  • Good knowledge of ActiveVOS error and fault handling, event handling, gateways, and control flow
  • Understanding of monitoring services (Process, Task, Server) and scheduling ActiveVOS automatic processes
  • Imported the IDQ address-standardization mappings into Informatica Designer as mapplets.
  • Utilized Informatica IDQ to complete initial data profiling and to match and remove duplicate data
  • Used relational SQL wherever possible to minimize the data transfer over the network.
  • Identified and validated the Critical Data Elements in IDQ.
  • Managed Collibra Confidential across the enterprise, driving governance activities for all participating business units.
  • Expert in data extraction, transformation, and loading from data sources like Teradata, Oracle, SQL Server, XML files, flat files, COBOL, and VSAM files.
  • Production automation: designed and scheduled ETL/ELT processes for periodic automated updates, including FTP/SFTP/SCP file transfers.
  • Provided support and quality validation through test cases for all stages of Unit and integration testing
  • Created, Deployed & Scheduled jobs in Tidal scheduler for integration, User acceptance testing and Production region.
  • Designed workflows with many sessions with decision, assignment task, event wait, and event raise tasks, used informatica scheduler to schedule jobs.
  • Used the Teradata FastLoad utility to load data into tables
  • Used SQL tools like TOAD to run SQL queries and validate the data.
  • Converted all the jobs scheduled in Maestro to the Autosys scheduler as per the requirements
  • Worked on maintaining the master data using Informatica MDM
  • Wrote UNIX Shell Scripts for Informatica Pre-Session, Post-Session and Autosys scripts for scheduling the jobs (work flows)
  • Performed tuning of queries, targets, sources, mappings, and sessions.
  • Used Linux scripts and necessary Test Plans to ensure the successful execution of the data loading process
  • Worked with the Quality Assurance team to build the test cases to perform unit, Integration, functional and performance Testing.
  • Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads, and process flow of the mappings
  • Provided 24x7 production supports for business users and documented problems and solutions for running the workflows.

Environment: Informatica PowerCenter 10.1, UNIX, SQL, IDQ, IDE, CDC, MDM, Java, Linux, Perl, AWS, WinSCP, Shell, PL/SQL, Netezza, Teradata, Collibra, Microsoft SQL Server 2008, and Microsoft Visual Studio.

Confidential, Charlotte, NC

Sr. Informatica Developer / Analyst

Responsibilities:

  • Involved in all phases of SDLC from requirement gathering, design, development, testing, Production, and support for production environment.
  • Defined, configured, and optimized various MDM processes, including landing, staging, base objects, foreign-key relationships, lookups, query groups, queries/custom queries, cleanse functions, batch groups, and packages using the Informatica MDM Hub Console.
  • Used Informatica PowerCenter for ETL: extracting, transforming, and loading data from heterogeneous source systems into the target database.
  • Involved in extracting the data from the Flat Files and Relational databases into staging area.
  • Created Stored Procedures for data transformation purpose.
  • Involved in the Dimensional Data Modeling and populating the business rules using mappings into the Repository for Data management
  • Worked on Informatica PowerCenter 9.x tools: Source Analyzer, Warehouse Designer, Mapping Designer, Mapplets, and Transformations.
  • Used Informatica PowerCenter and Data Quality transformations: Source Qualifier, Expression, Joiner, Filter, Router, Update Strategy, Union, Sorter, Aggregator, Normalizer, Standardizer, Labeler, Parser, Address Validator (Address Doctor), Match, Merge, and Consolidation
  • Experience in ActiveVOS workflow design and creation of Human Tasks; extensive experience in MDM 10.x with IDQ and ActiveVOS 9.2.
  • Experience in managing ActiveVOS Central and setting up the Identity Service on the ActiveVOS Console
  • Good knowledge of ActiveVOS error and fault handling, event handling, gateways, and control flow
  • Used Teradata Utilities (BTEQ, Multi-Load, and Fast-Load) to maintain the database.
  • Worked with different scheduling tools like Tidal, Tivoli, Control M, Autosys.
  • Created Tivoli Maestro jobs to schedule Informatica Workflows.
  • Built a re-usable staging area in Teradata for loading data from multiple source systems using template tables for profiling and cleansing in IDQ
  • Created profile and scorecards to review data quality.
  • Actively involved in data validations and unit testing to make sure data was clean and standardized before loading into MDM landing tables.
  • Actively involved in the exception-handling process, using the IDQ Exception transformation after loading the data into MDM and notifying the Data Stewards of all exceptions.
  • Worked with Autosys as the job scheduler, running the created applications and respective workflows at selected recurring times.
  • Generated PL/SQL and Shell scripts for scheduling periodic load processes.
  • Involved in Production support activities like batch monitoring process in UNIX
  • Prepared Unit test case documents
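The profiles and scorecards used above to review data quality boil down to per-column statistics such as null counts, distinct counts, and completeness. A minimal Python sketch, similar in spirit to an IDQ scorecard (the row structure is illustrative):

```python
def profile_columns(rows):
    """Minimal column-profiling sketch: null count, distinct count, and
    completeness percentage per column, over a list of dict records."""
    if not rows:
        return {}
    profile, total = {}, len(rows)
    for col in rows[0]:
        values = [r.get(col) for r in rows]
        non_null = [v for v in values if v not in (None, "")]
        profile[col] = {
            "nulls": total - len(non_null),
            "distinct": len(set(non_null)),
            "completeness_pct": round(100.0 * len(non_null) / total, 1),
        }
    return profile
```

Columns falling below a completeness or validity threshold would then be flagged on the scorecard for data-steward review.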

Environment: Informatica PowerCenter 9.6.1, UNIX, Linux, Perl, Shell, MDM, IDQ, PL/SQL, Tivoli, Oracle 11g/10g, Teradata 14.0.

Confidential, Southampton, PA

Sr. ETL Developer / Analyst

Responsibilities:

  • Worked with Business Analysts (BAs) to analyze data quality issues and find the root cause of each problem, with a proper solution to fix the issue.
  • Documented the processes that resolved the issues, involving analysis, design, construction, and testing for data quality issues
  • Made data model changes and other changes to the transformation logic in existing mappings according to the business requirements for incremental fixes
  • Worked extensively with Informatica tools like Source Analyzer, Warehouse Designer, Transformation Developer, and Mapping Designer.
  • Extensively used Informatica Data Explorer (IDE) and Informatica Data Quality (IDQ) profiling capabilities to profile various sources, generate scorecards, create and validate rules, and provide data to business analysts for creating rules.
  • Defined the Match and Merge Rules in MDM Hub by creating Match Strategy, Match columns and rules for maintaining Data Quality
  • Worked with relational sources and flat files; used the Update Strategy transformation extensively with DD_INSERT, DD_UPDATE, DD_REJECT, and DD_DELETE.
  • Extensively implemented SCD Type 2 mappings for CDC (change data capture) in the EDW.
  • Used SQL queries and database programming using PL/SQL (writing Packages, Stored Procedures/Functions, and Database Triggers).
  • Worked on Scheduling Jobs and monitoring them through Control M and CA scheduler tool (Autosys).
  • Used SQL tools like TOAD to run SQL queries and validate the data in warehouse.
  • Worked with the SCM code management tool to move the code to Production
  • Extensively worked with session logs and workflow logs in doing Error Handling and trouble shooting.
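The Update Strategy usage above routes each row to one of Informatica's data-driven flags. A tiny Python sketch of that routing decision, using the documented flag values (the key column name `order_id` is an assumption for the example):

```python
# Informatica's documented Update Strategy flag constants
DD_INSERT, DD_UPDATE, DD_DELETE, DD_REJECT = 0, 1, 2, 3

def route_row(row, existing_keys, key="order_id"):
    """Sketch of update-strategy routing: reject rows with a missing key,
    update rows whose key already exists in the target, insert the rest."""
    if row.get(key) is None:
        return DD_REJECT          # invalid row: no natural key
    return DD_UPDATE if row[key] in existing_keys else DD_INSERT
```

In an actual mapping this expression lives inside the Update Strategy transformation, usually as an `IIF(...)` over a lookup result.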

Environment: Informatica PowerCenter 9.0.1, Erwin, Teradata, Tidal, SQL Assistant, DB2, XML, Oracle 9i/10g/11g, MQ Series, OBIEE 10.1.3.2, IDQ, MDM, Toad, and UNIX Shell Scripts.

Confidential, Schenectady, NY

Sr. Informatica Developer/IDQ/MDM

Responsibilities:

  • Designing the dimensional model and data load process using SCD Type 2 for the quarterly membership reporting purposes.
  • Derived the dimensions and facts for the given data and loaded them on a regular interval as per the business requirement.
  • Generating the data feeds from analytical warehouse using required ETL logic to handle data transformations and business constraints while loading from source to target layout.
  • Worked on Master Data Management (MDM), Hub Development, extract, transform, cleansing, loading the data onto the staging and base object tables
  • Worked with Informatica Data Quality (IDQ) toolkit, Analysis, data cleansing, data matching, data conversion, exception handling, and reporting and monitoring capabilities of IDQ
  • Hands-on experience with HIPAA transactions like 270, 271, 272, 273, 274, 275, 276, 277, 834, 835, 837, etc.
  • Worked on data cleansing and standardization using the cleanse functions in Informatica MDM.
  • Created ETL mappings, sessions, workflows by extracting data from MDM system, FLASH DC & REM source systems.
  • Designed and created complex source to target mappings using various transformations inclusive of but not limited to Aggregator, Look Up, Joiner, Source Qualifier, Expression, Sequence Generator, and Router Transformations.
  • Mapped client processes/databases/data sources/reporting software to HPE's XIX X12 processing systems (BizTalk/Visual Studio/Oracle SQL/MS SQL/C#/.NET/WSDL/SOAP/REST/API/XML/XSLT).
  • Source control using SCM.
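The HIPAA X12 transactions above (834, 837, etc.) are delimiter-separated: segments end with a terminator (often `~`) and elements are separated by `*`. As a rough illustration only, here is a Python sketch of splitting a message into segments and elements; real transactions require a full parser honoring the ISA envelope, and the sample content is made up:

```python
def parse_x12_segments(message, seg_term="~", elem_sep="*"):
    """Very small sketch of splitting an X12 message (e.g. an 834 or 837)
    into segments and elements. Real-world use needs the ISA envelope
    rules, since the delimiters are declared there, not fixed."""
    segments = []
    for raw in message.split(seg_term):
        raw = raw.strip()
        if raw:
            segments.append(raw.split(elem_sep))
    return segments
```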

Environment: Informatica Power Center 9.0.1, Erwin 7.2/4.5, Business Objects XI, Unix Shell Scripting, XML, Oracle 11g/10g, DB2 8.0, IDQ, MDM, TOAD, MS Excel, Flat Files, SQL Server 2008/2005, PL/SQL, Windows NT 4.0.

Confidential

ETL Developer / Analyst

Responsibilities:

 
  • Involved in the requirement definition and analysis support for Data warehouse efforts.
  • Documented and translated user requirements into system solutions; developed implementation plan and schedule.
  • Designed fact and dimension tables for Star Schema to develop the Data warehouse.
  • Extracted the data from Teradata, SQL Server, Oracle, Files, and Access into Data warehouse.
  • Created dimensions and facts in physical data model using ERWIN tool.
  • Used Informatica Designer to create complex mappings using different transformations to move data to a Data Warehouse
  • Developed mappings in Informatica to load the data from various sources into the Data Warehouse, using different transformations like Source Qualifier, Look up, Aggregator, Stored Procedure, Update Strategy, Joiner, Filter.
  • Scheduled the sessions to extract, transform, and load data into the warehouse database per business requirements.
  • Handled common data warehousing problems like tracking dimension change using SCD type2 mapping.
  • Used e-mail task for on success and on-failure notification.
  • Used decision task for running different tasks in the same workflow.
  • Assisted team members with their various Informatica needs.
  • Developed and maintained technical documentation regarding the extract, transformation, and load process.
  • Responsible for the development of system test plans, test case creation, monitoring progress of specific testing activities against plan, and successfully completing testing activities within the requisite project timeframes.
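The on-success and on-failure e-mail tasks used above follow a simple wrap-and-notify pattern. A Python sketch, where `notify` stands in for the mail step and the workflow name is illustrative:

```python
def run_with_notification(task, notify, name="wf_load"):
    """Sketch of the on-success / on-failure notification pattern:
    run the task, send a message either way, and re-raise failures
    so the scheduler still sees a failed job."""
    try:
        result = task()
    except Exception as exc:
        notify(f"{name} FAILED: {exc}")
        raise
    notify(f"{name} SUCCEEDED")
    return result
```

Re-raising after notification matters: swallowing the exception would mark the workflow green in the scheduler despite the failure.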

Environment: Informatica PowerCenter 8.1, Erwin, Oracle 9i, UNIX, Sybase, MS SQL Server, Windows 2000.
