Informatica MDM/PIM Consultant Resume
Piscataway, NJ
SUMMARY
- MDM Consultant with 10+ years of experience in designing, developing, and implementing Master Data Management, Data Warehousing, Business Intelligence, and other enterprise-level solutions. Data integration experience includes the design and development of ETL to and from OLTP databases, enterprise data warehouses, data marts, operational data stores (ODS), and MDM databases.
- 7+ years of experience in delivering large-scale MDM solutions on Customer, Product, Employee, and Party models using Informatica MDM 10.x and 9.x editions, Informatica Data Director (IDD), Entity 360, Provisioning Tool, Business Entity Services (BES), Services Integration Framework (SIF), ActiveVOS, and Java/J2EE.
- 2+ years of experience in architecting, designing, and configuring PIM (Product Information Management) solutions with Informatica PIM Product 360, Media Manager, Supplier Portal, Audit Trail, Control Center, PIM Web, PIM Rich Client, and PIM SDK.
- 4+ years of experience in designing and developing ETL (Extract, Transform, Load), DW, and BI solutions using Informatica Power Center, Informatica Data Quality (IDQ), and databases such as Oracle, SQL Server, Teradata, and AWS Redshift, along with SQL, PL/SQL, Java, and Linux/Unix shell scripting.
- Expert developer experience in Informatica MDM 10.x, including creating base objects, staging tables, foreign key relationships, static and dynamic lookups, queries, packages, query groups, custom cleanse functions, match and merge properties, and match rule configurations; analyzing server log files to investigate issues in the MDM data model configuration, infrastructure tables, repository tables, and associated tables; applying delta and delete detection concepts; configuring Hierarchy Manager; and setting up manual workflows for match/merge rules.
- Experience in delivering full-lifecycle MDM projects for clients, including data modeling, metadata management, data quality, data profiling, design and configuration of matching and merging rules, and design and configuration of standardization, cleansing, and deduplication rules.
- Experience in design and development of IDD applications using Informatica MDM Hub, Hierarchy Manager, Entity 360, Provisioning Tool, Smart Search, Business Entity Services (BES), Informatica ActiveVOS, the SIF framework, Java/J2EE, and Informatica Customer 360.
- Experience in designing and architecting integrations between Informatica MDM and other systems using standard connectors such as SIF, ETL (Informatica Power Center, Power Exchange), EAI (webMethods), MFT (Managed File Transfer), JMS (Java Message Service), message queues, message triggers, Java web services (EJB), SOAP, REST API, WSDL, XML, XSD, HTTP, JSON, and the MDM data publish mechanism in batch, real-time, and near-real-time modes.
- Extensive hands-on experience in installing, upgrading, applying HF/EBF to, and managing Informatica MDM Hub Server, Informatica MDM Hub Cleanse, cleanse adapters (IDQ, Address Doctor, Trillium Software), Informatica MDM Hub Resource Kit, and ActiveVOS, and in importing ORS databases into the Hub Console, in clustered, non-clustered, cloud, and on-premises environments.
- Experience in architecting, installing, upgrading, and configuring the MDM/PIM environment for Informatica PIM (P360), including Control Center, PIM Web, Rich Client, and Audit Trail, and in implementing solutions to integrate the PIM environment with interfacing applications, data quality tools (IDQ), and business process management tools (ActiveVOS).
- Experience in fine-tuning and optimizing MDM Hub performance, and in investigating and making recommendations on optimization opportunities.
- Well versed in building Proofs of Concept (POC) to demonstrate business solutions using Informatica MDM and the latest technology stack in the market, and in providing a Point of View (POV) on business problems and requirements.
- Experience in production application support, IT service management, ITIL guidelines, incident management, change management, and SLA compliance. Worked as an L2 team member resolving incidents and service requests within SLA.
- Experience in creating test strategies, coordinating with teams for SIT, performance testing, and user acceptance testing (UAT), and assisting business users in conducting UAT.
- Ability to understand functional designs and explain or write problem definitions for customers, colleagues, and users from both a technical and a business-functional point of view.
- Strong analytical, problem-solving, presentation, and interpersonal skills, with the ability to work independently and as part of a team.
- Successfully led teams to meet stringent project deadlines, with the ability to lead a team of less experienced professionals on semi-routine tasks. Proven success in contributing to a team-oriented environment, leading a medium-sized team, and delivering MDM and PIM solutions.
TECHNICAL SKILLS
MDM Tools: Informatica Multidomain MDM
PIM Tools: Informatica PIM Product 360
Data Governance: IDD, E360, Provisioning Tool, BES, Customer360
BPM: Informatica ActiveVOS
Data Quality: IDQ, Address Doctor, Trillium Software
MDM SDK: SIF Framework, Java, Webservices
Messaging: JMS, Message Triggers, Message Queues
Programming: Java, J2EE, JDBC, JSP, Servlet
Java Tooling: JDK, JRE, Eclipse IDE
ETL Tools: Informatica Power Center
Databases: Oracle DBMS, SQL Server, Teradata, Redshift
DB Programming: SQL, PL/SQL, Packages, Stored Procedures, Cursors, Triggers
DB Client Tools: Toad for Oracle, Oracle SQL Developer, Teradata SQL Assistant
Data Replicators: Attunity Replicate
Application Servers: Oracle WebLogic, JBOSS, Tomcat
Cloud Tools: AWS VPC, EC2, S3, Redshift, EMR
Reporting Tools: Cognos BI, Tableau
Big Data Tools: Hadoop, HDFS, Hive, Pig
OS Platforms: HP-UX, Red Hat Linux v6.x
CASE Tools: Erwin Data Modeler, MS-Visio
Source Code Control: PVCS, Enterprise Bitbucket
Change Management Tools: JIRA, HP-ALM
Scheduling Tools: Tidal, Control-M, Crontab
Domain Knowledge: Insurance, Manufacturing, Retail, Pharmaceutical, Healthcare
PROFESSIONAL EXPERIENCE
Informatica MDM/PIM Consultant
Confidential, Piscataway, NJ
Responsibilities:
- Participated in data discovery, data profiling and requirement analysis around master data management activities for product domain.
- Participated in project planning sessions with clients, business analysts, and team members to analyze and estimate development requirements/efforts and make appropriate recommendations.
- Extensively worked on installation, upgrade, and configuration of core Informatica MDM Hub components such as Hub Console, Hub Store, Hub Server, and Cleanse/Match Server, as well as the Informatica PIM (P360) core server and Control Center, and configured the Message Queue and Audit Trail setup for the PIM implementation.
- Defined, configured, and optimized various MDM processes including landing, staging, base objects, foreign-key relationships, lookups, query groups, queries, custom queries, packages, cleanse functions, mappings, and batch groups using the Informatica MDM Hub Console.
- Designed, developed and translated business requirements and processes for data matching and merging rules, survivorship criteria, and data stewardship workflows.
- Created and configured IDD applications with subject area groups, subject areas, relationships with subject areas, sibling references, packages, cleanse functions.
- Used Repository Manager to import and export the metadata and promote incremental changes from development to SIT, UAT and Production environments.
- Performed root cause analysis for data quality, code, MDM and PIM P360 related issues and worked with different teams to bring the defects to closure.
- Provided continuing enhancement and review of MDM matching rules, data quality and validation processes.
- Fine-tuned and optimized MDM Hub performance, and investigated and made recommendations on optimization opportunities.
- Identified defects in MDM and Product 360 and collaborated with Informatica Support to resolve tickets.
- Worked with the PIM Desktop and Web Client tools on a daily basis as part of build work and debugging code issues.
- Created import mappings in the PIM Rich Client for inbound files to import data from upstream systems, and export mappings and profiles for downstream systems and internal data enrichment.
- Configured and setup automated imports using Hotfolder and Hotfolder groups in PIM Rich client.
- Worked on data model change using Repository Editor and customization using PIM SDK.
- Created several data quality rules required for different channels using Data Quality perspective in PIM Desktop/Rich Client.
- Worked on troubleshooting and root cause analysis for issues during initial and full loads in the PIM system, and solved the problems by tuning database- and application-server-level parameters.
- Worked on user group management by creating users and user groups and assigning action rights, interface visibility, and field visibility to user groups per the authorization profile document.
- Created product hierarchies using Structures perspective in PIM Desktop client for Brand, Franchise etc.
- Performed code migrations/deployments of PIM objects to SIT, UAT, and Production environments, including user groups, import mappings, export mappings, data quality rules, and repository changes.
- Provided inputs to the SIT, UAT teams in preparing test plans, test scripts, and test cases by explaining MDM and PIM functionality, behavior, data model, and architecture.
- Collaborated with various technical teams and business users for Development, SIT, UAT and Product Support.
- Authored and maintained technical documentation such as HLD documents, technical LLD documents, and release notes for migration.
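For illustration, the channel-specific data quality rules described above can be sketched in plain Python. This is not P360 rule configuration; the attribute names (`item_no`, `gtin`, `long_desc`) and the rule set are hypothetical examples of the kind of checks involved:

```python
# Illustrative sketch only (not Informatica P360 code): simple product data
# quality rules of the kind configured in the Data Quality perspective.
# All field names are hypothetical.

def check_required(record, fields):
    """Flag any required attribute that is missing or blank."""
    return [f for f in fields if not str(record.get(f, "")).strip()]

def check_gtin(record):
    """GTIN-13 must be 13 digits with a valid mod-10 check digit."""
    gtin = str(record.get("gtin", ""))
    if len(gtin) != 13 or not gtin.isdigit():
        return False
    digits = [int(d) for d in gtin]
    # Weights alternate 1, 3, 1, 3, ... across the first 12 digits.
    checksum = sum(d * (3 if i % 2 else 1) for i, d in enumerate(digits[:-1]))
    return (10 - checksum % 10) % 10 == digits[-1]

def run_rules(record, required=("item_no", "gtin", "long_desc")):
    """Return a list of rule violations for one product record."""
    issues = [f"missing:{f}" for f in check_required(record, required)]
    if record.get("gtin") and not check_gtin(record):
        issues.append("invalid:gtin")
    return issues
```

A clean record returns an empty issue list; rejected records carry reason codes that can be routed back to the supplying channel.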
Environment: Informatica PIM (P360) v8.1.1, v8.1.0; Informatica Multidomain MDM v10.3, v10.2, v10.1; IDQ v10.1; Informatica PowerCenter v10.2; Oracle Database 12c; Oracle WebLogic v12.1; Red Hat Linux v7.4; MFT (MBOX); Toad for Oracle v10.1; Oracle SQL Developer v4.1; WinSCP v5.1; Enterprise Bitbucket; HP-ALM; Control-M; Jira.
Informatica MDM Consultant
Confidential, Piscataway, NJ
Responsibilities:
- Participated in data discovery, data profiling, and requirement analysis around master data management activities for the Customer, Product, and Sales Rep (Employee) domains.
- Profiled source data to gauge data quality and prepared detailed specifications for validation rules, trust, match rules, cleanse functions etc.
- Setup schema definitions that included creating and configuring landing tables, staging tables, base tables.
- Built cleanse lists as per the requirements, wrote medium-to-complex cleanse functions, and classified them under web services, Java web application, and general cleanse functions.
- Built match rules that included a combination of fuzzy and exact match rules. Some of the match rules were used by web services to identify a potential match in the Hub database when a new customer/product was entered or updated in the system through the front-end applications.
- Configured and performed data standardization of addresses, account names using Trillium Cleanse Adapter as per business and application requirement.
- Configured the JMS server in the Hub Console and WebLogic, and set up queues to enable real-time master data publishing to downstream subscribing systems. Built the alerts/publish process by setting up message queues and message triggers for inserts and updates, and supported the EAI webMethods layer in consuming the messages/XMLs generated from the Hub, processing them, and sending them to downstream web service consumers.
- Built the ETL/MDM layer to process the one-time initial data conversion from multiple sources. Match rules were built to identify potential duplicates, including the use of external match tables, to logically merge legacy data before loading it into the Hub.
- Built mappings between landing and staging tables for different source systems, including cleanse functions to cleanse data before loading it into the staging tables.
- Built Queries and Packages that were used by the Web Services and IDD to fetch and write data from and to the Hub.
- Created users, privileges, and roles, and assigned privileges to roles as per the requirements using the SAM framework. Roles included Data Steward and Data Governance roles such as Basic Requestor, Advanced Requestor, Customer Governance, technical users, and interface users.
- Built the framework to execute/invoke MDM batch groups from Linux shell scripts, and scheduled them using Control-M.
- Installed, administered, upgraded, and managed Informatica MDM 10.2/10.1 HF1 EBF3, Oracle WebLogic, Trillium Software, Oracle Database 12c, and HP-UX/Red Hat Linux environments.
- Set up the new MDM platform and supported the existing platform, including application servers, for clustering and high-availability needs.
- Developed UNIX shell scripts to automate tasks like restarts, clearing historic data/space on application servers.
- Worked on process tuning of the initial data load, subsequent data loads, and match rules for better performance.
- Built subject areas in IDD and added cleanse functions to them to cleanse/validate data entered from the IDD console.
- Maintained the Java code used to build the custom web services, which included create, update, match, search, detailed search, merge/unmerge, multi-client search, and program participation, and wrote IDD user exits in Java for merge/unmerge.
- Represented the team in internal meetings, coordinated with other teams such as DBA group, QA team, PMO team, Web Methods team, WebLogic team etc.
- Wrote technical documentation that included design documents for the Hub/IDD configuration and the custom web services design, and coordinated with the business to streamline the documents before submitting them to QCSV for review.
- Performed data analysis with the intent of identifying root causes, escalated to appropriate levels based on the criticality, provided inputs to the Business Analysts that were used to communicate with their Business counterparts.
- Performed troubleshooting and code fixes, resolving/closing tickets opened by project, business, and QA teams as part of system integration testing and user testing.
- Performed administrative tasks such as MDM Hub installation, upgrades, and deployments of releases from Development to QA and Production environments.
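The combination of exact and fuzzy match rules described above can be illustrated with a minimal Python sketch. This is not MDM Hub match configuration; the columns (`tax_id`, `name`, `zip`), the 0.85 threshold, and the disposition labels are hypothetical assumptions:

```python
# Illustrative sketch only (not Informatica MDM Hub configuration): classifying
# candidate record pairs with one exact rule and one fuzzy rule.
from difflib import SequenceMatcher

def exact_rule(a, b, columns):
    """Exact rule: every listed column must match after normalization."""
    norm = lambda r, c: str(r.get(c, "")).strip().lower()
    return all(norm(a, c) == norm(b, c) for c in columns)

def fuzzy_rule(a, b, column, threshold=0.85):
    """Fuzzy rule: similarity ratio on one column must clear a threshold."""
    ratio = SequenceMatcher(None, str(a.get(column, "")).lower(),
                            str(b.get(column, "")).lower()).ratio()
    return ratio >= threshold

def match(a, b):
    """Return 'auto-merge', 'manual-review', or 'no-match' for a pair."""
    if exact_rule(a, b, ["tax_id"]):                 # strong identifier match
        return "auto-merge"
    if fuzzy_rule(a, b, "name") and exact_rule(a, b, ["zip"]):
        return "manual-review"                       # queue for a data steward
    return "no-match"
```

Pairs that clear only the fuzzy rule land in a manual queue, mirroring the data steward merge workflow mentioned above.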
Environment: Informatica MDM v10.1; Informatica Power Center v9.6, v10.1; Trillium Data Director v11, v15; JSP (Java Server Pages); Oracle WebLogic v12; Oracle Database 12c; MFT (MBOX); Java/J2EE; Red Hat Linux; HP-UX.
Informatica MDM Consultant
Confidential, Skillman, NJ
Responsibilities:
- Participated in data discovery, data profiling, and requirement analysis around master data management activities for the Customer domain, and analyzed the customer normalization process for source systems.
- Gathered and analyzed business requirements, prepared high-level design documents, detail-level design documents, and technical design specifications, and sent them for client review.
- Collaborated with managers, data architects to understand the business processes and functional requirements.
- Defined, configured and optimized various MDM processes including landing, staging, base objects, foreign- key relationships, lookups, query groups, queries, custom queries, packages, cleanse functions, mappings, batch groups using Informatica MDM Hub console.
- Developed code in Informatica Data Quality to remove duplicates coming from the different source systems, and was involved in match rule tuning.
- Performed the land process to load data into the landing tables of the MDM Hub using external batch processing for the initial data load into the Hub Store, configured new sources in MDM, and streamlined the environment.
- Performed root cause analysis for records rejected in the load process and sent them back to the source team; analyzed manually matched records in the DS queue and merged them manually.
- Executed the external match job on ad-hoc basis and sent the results to end users.
- Provided inputs to the SIT, UAT teams in preparing test plans, test scripts, and test cases by explaining MDM functionality, behavior, data model, and architecture.
- Collaborated with various technical teams and business users for Development, SIT, UAT and Product Support.
- Worked on technical documentation such as HLD documents, Technical LLD documents, release notes for migration.
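The load-and-reject pattern described above can be sketched as follows. This is plain Python rather than MDM Hub internals, and the field names (`src_pkey`, `src_system`) and reason codes are hypothetical:

```python
# Illustrative sketch only (not MDM Hub internals): during a landing load,
# rows that fail validation are set aside with a reason code for root cause
# analysis instead of being loaded.

def load_landing(rows, valid_sources):
    """Split incoming rows into loadable records and rejects with reasons."""
    loaded, rejects = [], []
    for row in rows:
        if not row.get("src_pkey"):
            rejects.append((row, "NULL_PKEY"))       # no primary key from source
        elif row.get("src_system") not in valid_sources:
            rejects.append((row, "UNKNOWN_SOURCE"))  # source not configured
        else:
            loaded.append(row)
    return loaded, rejects
```

The reject list, with its reason codes, is what gets sent back to the source team for correction.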
Environment: Informatica Multidomain MDM v10.1, Oracle WebLogic v12, Oracle Database 12c, Informatica Power Center v10.1, Informatica Power Exchange v10.1, IDQ v10.1, HP-UX.
Informatica MDM Consultant
Confidential
Responsibilities:
- Worked on data profiling and analyzing the data of various internal and external sources.
- Worked with functional architects to determine the viability of integrating the sources with customer hub.
- Drove data analysis efforts to define common data formats.
- Designed and implemented the ‘Subscribe’ layer using a common layout format through which the Hub subscribed to data from interfacing applications. This subscribe layer was built using Informatica Power Center.
- Defined the base tables, staging tables, and landing tables based on the data model and the high-level Architecture document.
- Defined relationships in the base objects and lookups in the staging tables to enforce referential integrity in the Hub.
- Designed and implemented the ‘Publishing’ layer of the Hub, which fed the downstream applications including the Sales and Marketing data warehouse and the report generation process. This layer was built using Oracle materialized views containing a de-normalized view of the Hub, together with JMS (Java Message Service), message queues, and message triggers for real-time data publishing.
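For illustration, the real-time publishing side can be sketched as the construction of an XML change message of the kind a message trigger would place on a JMS queue. This is a plain-Python sketch, not the Hub's actual publish mechanism, and the element and attribute names are hypothetical:

```python
# Illustrative sketch only (not the actual Hub publish mechanism): serialize
# one insert/update event as an XML payload for downstream consumers.
import xml.etree.ElementTree as ET

def build_change_message(event, rowid, fields):
    """Serialize one change event as an XML string."""
    root = ET.Element("customerEvent", {"type": event, "rowid": rowid})
    for name, value in fields.items():
        ET.SubElement(root, name).text = str(value)
    return ET.tostring(root, encoding="unicode")
```

Downstream subscribers parse these payloads off the queue instead of polling the Hub directly.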
Environment: Informatica MDM Multidomain Edition v9.7.1, Informatica Power Center v9.1.0, Oracle Database 11g, Toad for Oracle, Oracle SQL Developer.
ETL Informatica/SQL Developer
Confidential
Responsibilities:
- Worked on gathering requirements from business users and participated in the detailed requirement analysis for the design of data marts.
- Actively interfaced with other teams to gather requirements, design, code, debug, document, implement and maintain various DB projects.
- Worked according to the specifications and restrictions of the company by strictly maintaining data privacy.
- Worked on translating high-level design specifications into simple ETL coding and mapping standards.
- Built the ETL Source to Target mapping to load data into Data warehouse.
- Created mapping documents to outline data flow from sources to targets.
- Created ETL to extract data from flat files and other RDBMS databases into the staging area and populate the data warehouse, using transformations such as Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Power Center Designer.
- Created mapplets for reuse in different mappings.
- Designed and developed mappings to load staging tables and then dimensions and facts using existing ETL standards.
- Worked on different workflow tasks such as sessions, event raise, event wait, decision, email, command, worklets, assignment, and timer, as well as workflow scheduling.
- Created sessions and configured workflows to extract data from various sources, transform it, and load it into the data warehouse, using Type 1 and Type 2 SCD mappings to update slowly changing dimension tables.
- Extensively used SQL*Loader to load data from flat files into database tables in Oracle.
- Modified existing mappings for enhancements of new business requirements.
- Worked on performing performance tuning at source, target, mappings, sessions, and system levels.
- Prepared migration documents to move the mappings from development to testing and then to production repositories.
- Migrated code from Dev to Test and from Test to Prod, and prepared detailed migration documentation covering system compatibility, objects, and parameter files for a smooth transfer of code across environments.
- Designed the automation of sessions and workflows, scheduled the workflows, and created worklets (command, email, assignment, control, event wait/raise, conditional flow, etc.), configuring them according to business logic and requirements to load data from different sources to targets.
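The Type 2 SCD rule mentioned above (expire the current dimension row and insert a new version when a tracked attribute changes) can be sketched in plain Python. This is not Informatica mapping logic; the column names (`cust_id`, `address`, `eff_date`, `end_date`, `is_current`) are hypothetical:

```python
# Illustrative sketch only (not an Informatica mapping): Type 2 slowly
# changing dimension maintenance over a list of dimension-row dicts.

def scd2_apply(dim_rows, incoming, load_date):
    """Apply one incoming record to the dimension using Type 2 SCD rules."""
    current = next((r for r in dim_rows
                    if r["cust_id"] == incoming["cust_id"] and r["is_current"]),
                   None)
    if current is None:
        # Brand-new key: insert the first version.
        dim_rows.append({**incoming, "eff_date": load_date,
                         "end_date": None, "is_current": True})
    elif current["address"] != incoming["address"]:   # tracked attribute changed
        current["end_date"] = load_date               # expire the old version
        current["is_current"] = False
        dim_rows.append({**incoming, "eff_date": load_date,
                         "end_date": None, "is_current": True})
    return dim_rows
```

Unchanged records fall through untouched, so history accumulates only when a tracked attribute actually changes.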
Environment: Informatica Power Center v9.1.0, Oracle Database 10g, Toad for Oracle