MDM Architect Resume
SUMMARY:
- Over 18 years of IT experience with Data Warehouse/Data Mart/MDM/DQ Design & Development Life Cycle.
- Extensive Experience in Master Data Management (MDM) implementations
- Extensive experience in the design, build and maintenance of DWH/BI infrastructure
- Trained in Big Data Hadoop technology and its components - HDFS, MapReduce, YARN, Pig, Hive, Sqoop, Flume etc.
- An excellent leader & team player with strong interpersonal skills; communicates effectively and professionally with clients, customers and project teams
- Experienced in all phases of the project life cycle including Requirement Analysis, Design, Coding and Testing
- Well experienced in translating business users’ needs into Technical Design Documents, Process Documents and Requirement Specification documents. Participated in requirements gathering, sizing, design, development, testing and deployment
- Worked in technical architecture design for data warehouse systems on cloud & on-premise platforms
- Specialized expertise in the use of all modules in the Informatica Data Management & Data Integration suite. Very good experience in Informatica Address Doctor, IDQ, RDM and MDM
- In-depth, hands-on knowledge of data warehousing methodologies and design of complex multi-dimensional schemas, enterprise-wide data warehouses and data marts. End-to-end experience in large and complex data warehouses, from conceptualization to production roll-out. Expert in designing data load processes end to end
- Proficient knowledge and experience in working with Relational and Non-Relational Source and Target systems.
- In-depth understanding of DB2, Oracle and SQL Server and corresponding ETL/ELT components like Data Stage, ODI & SSIS toolsets, Talend ETL, Informatica Power Center
- Proficient in PL/SQL. Expert in creating stored procedures and packages to define ETL/ELT processes.
- Experienced in release management, build and code migration across environments, and version control techniques as applicable to DW/BI. Have worked in ISO 9000 environments and have developed and fine-tuned procedures and methodologies in a DW/BI context to achieve optimal results
- Developed processes and methodologies to ensure quality and consistency in project/milestone delivery
- Proficient in designing and implementing Unix/AIX /MKS Toolkit Shell Scripts
- Volunteer Software Quality Ambassador who reviews the processes followed by different projects within the organization, suggests improvements and points out best practices for each project reviewed
- Extensive experience in dissemination of work and coordinating with offshore resources to produce quality work within aggressive deadlines. Strong leadership, planning, execution, people management and mentoring skills
- Excellent analytical, problem-solving and communication/interpersonal skills, with the ability to act as architect, support analyst, business analyst, data analyst, data modeler, developer, tester or administrator as needed. Emphasis on QA initiatives to ensure quality delivery of solutions.
TECHNICAL SKILLS:
DB/DWH: Oracle, Confidential, MS SQL Server, Teradata, Amazon Redshift, HP Vertica, Amazon Aurora, Amazon RDS
BI Reporting/Analytics: QlikView, Tableau, MS Power BI, Power Pivot, Confidential Cognos, MS SSAS, SSRS
ETL/ELT: Informatica PC, Confidential Datastage, Microsoft SSIS, Oracle ODI, Informatica cloud integration, Talend ETL
DQ: Ataccama, DataFlux, Informatica DQ (predominantly), Oracle DQ, Microsoft DQS, Talend DQ
MDM: Informatica MDM - Customer, Asset & Product; Oracle MDM - Customer, Supplier & Product, Semarchy MDM, Talend MDM
Misc./Basics: Star & Snowflake schemas, Dimensional Modeling, Erwin Data Modeler, Embarcadero ER Studio DA & SA, Ralph Kimball and Bill Inmon methodologies, Waterfall, Agile, C, C++, SQL, PL/SQL, Unix/AIX/MKS shell scripting and Perl scripts, HTML, XML, SQL*Plus, SQL*Loader, AIX, Sun Solaris, Jira, LANDesk, HP Quality Center, TFS, Assembla, Control-M, Autosys, AWS EC2 provisioning & configuration, data transports, AWS cost estimation for migrating DWH/BI & other applications
Big Data: Hadoop platform - HDFS, YARN, Pig, Hive, Sqoop, Flume etc
PROFESSIONAL EXPERIENCE:
Confidential
MDM Architect
Responsibilities:
- Design the solution & data architecture for the Vendor Master initiative at PCYC and present it to stakeholders
- Provide pros & cons on effective and optimized options for integrating the vendor data into the existing Customer hub
- Provide insight to the development team into best practices and processes on implementing a new MDM solution using the latest 10.2 version of Informatica MDM
- Creation of Design documents, Architecture documents
- As part of the Customer MDM optimization initiative, and to adhere to the ‘One HCO-One Active Address’ guidelines set forth by the Sales Ops team, performed comprehensive impact analysis on the different source systems and processes and suggested solutions to accomplish the same
- Performed analysis & created an implementation plan to introduce Address Doctor for validating addresses, replacing the older custom-written standardizer
- Developed a reusable address validator mapplet which resolves multiple address-related issues - Org Name in AddressLine1, City Mismatch, Zip Mismatch, PO Box addresses etc.
- Designed and introduced “reject table” usage using Analyst tool for invalid addresses/accounts
- Proposed and developed changes to existing HCO/HCP match rule sets by identifying incorrect merges
- Documented the Customer MDM processes - architecture, match rules, trust rules and the match-merge process, with examples
Confidential
Solution Architect
Responsibilities:
- Technical architecture design for systems to be migrated to AWS from on-premise
- Informatica Cloud task flow design & creation - integration with Salesforce.com & DWH
- Estimate costs with the AWS calculator to enable decision-making and ROI analysis
- ETL, MDM, DB Vendor Software Evaluation and POCs for Platform/Environment Overhaul
- Informatica MDM, Oracle MDM, Semarchy MDM
- Talend ETL, Informatica ETL, Oracle Data Integrator
- Oracle 12c, MS SQL Server 2016, Amazon Redshift, HP Vertica
- Ensure specifications and configurations are designed appropriately for optimized data in/out transfers
- Help acquire appropriate per-core/per-processor software licenses across Dev, Test, OQ and Prod environments
- Install Oracle 12c, Informatica 10 suite of applications on each environment - for DWH environment
- Creation of Design documents, Test case documents for Sales & Finance DWH
- Profiling source systems for Reference & Master data and create DQ mapplets to be used in PowerCenter ETL mappings
- Design & Configure Informatica MDM - Landing, Stage & Base objects
Confidential
Solution Architect, Project Lead
Responsibilities:
- Design, analyze and implement change requests for the Confidential data warehouse via Informatica ETL, QlikView & Cognos reports.
- Business requirement gathering, translation to technical design, impact analysis and development, testing, UAT and delivery to business, adhering to agreed upon SLAs.
- Enhanced SSIS mappings and tuned them for better performance.
- Frequent Hierarchy changes (Product & Region) to SSAS Sales & Inventory MOLAP cubes.
- Creation of Design documents, Test case documents and UAT/Production support.
- Ensure documented & approved biweekly production releases in consultation with the CAB
- Represent DWH/BI team at ARB meetings in case of major/highly visible enhancements
Environment: QlikView, Microsoft SQL Server, SSIS, SSAS, SSRS, Oracle 10g - PL/SQL, Confidential Cognos 10, Informatica 8.6.
Confidential
Solution Designer/ETL& MDM Lead
Responsibilities:
- Gather requirements for Reference Data management and Hierarchy Management for GoMaster project.
- Install and setup environment for DQ, ETL and MDM
- Profile and generate reports for different source data systems
- Design & create Data quality rules
- Design reject tables for Data Stewards analysis
Environment: Oracle 11g, Informatica 9.6.1, IDQ and MDM, PL/SQL, Unix Shell scripts
Confidential
Solution Design/ETL & MDM Lead
Responsibilities:
- Gather requirements for Reference Data management and Hierarchy Management for Voyager project.
- Install and setup environment for DQ, ETL and MDM
- Responsible for the Customer MDM implementation, integrating a variety of data systems and geographically fragmented customer information
- Profile and generate reports for different source data systems
- Design & create Data quality rules, design reject tables for Data Stewards analysis
- Design and implement Data quality rules, Merge and Survivorship rules for golden records creation
- MDM Architecture & Implementation - involved in the data flow design from source to staging to the CR layer (data warehouse) to the MDM landing zone to MDM tables, automatic consolidation, and back to the CR layer and then to the DMs.
- Involved in the logical and physical data model creation for each of the data layers
- Creation of Design documents, Test case documents and UAT/Production support.
- Solution design for Integration with Customer, Property, Product, Reservation and Allocations systems.
- Led a team of 6 ETL and DQ developers in an onsite-offshore model
Environment: Oracle 11g, Informatica 9.5.1, IDQ and MDM, PL/SQL, Unix Shell scripts
Confidential
ETL Developer/Team Lead
Responsibilities:
- Design, analyze and implement change requests for the SBUE2E data warehouse and Cognos reports.
- Manage ETL and Cognos dashboard work requests by coordinating with the offshore team.
- Enhanced SSIS mappings and tuned them for better performance.
- Creation of Design documents, Test case documents and UAT/Production support.
- Understand new change requests and provide solution within acceptable time limits.
Environment: Microsoft SQL Server 2005 - SSIS Package, Oracle 10g/9i - PL/SQL, Confidential DB2, Confidential Cognos 8.4, Informatica 8.6.
Confidential
Informatica Developer
Responsibilities:
- Worked on the interfaces for external vendors for CMA (Covance, Pharmagistics, Accredo/McKesson etc.). Brought down execution time by an average of 20% across all external interfaces.
- Worked on Informatica 7.1/8.6 - Source Analyzer, Warehouse Designer, Mapping Designer, Mapplets and Transformations and Session Manager.
- Created UNIX/MKS Toolkit shell scripts to define and control ETL processes and to interlink the Windows and UNIX machines, since the Informatica 8.6 server ran purely on Windows.
- Enhanced Informatica mappings and tuned them for better performance, improving CMA process performance by 35%.
- Created Design documents and Test case documents and provided UAT/Production support under tight time constraints.
- Study data cleansing needs and interact with users for cleansing of data before it is loaded into the Data Warehouse.
- Performance tuning of Mappings/Sessions and work flows.
Environment: Informatica Power Center v7.1/8.6, Oracle 10g/9i - PL/SQL, MS-SQL Server, DB2 v8.1, Perl, MKS Toolkit.
Confidential
Project lead/ ETL Developer
Responsibilities:
- At project startup, created Functional and Application Information Documents for all related applications and brought the process under the GWA and GNA architectures, where no documentation had previously existed.
- Involved in four applications (RoHS, CRS, ASource and CBB Metrics) and all associated database/data warehouse development.
- Involved in the “IPD Divestiture” in 2005, a complex migration and integration of all applications identified by Lenovo users when Lenovo took over the PC division from Confidential. This divestiture was implemented as the 2Q 2005 release; efficiently handled the assignment to deploy it within tight timelines.
- Actively participated in the requirements gathering, sizing, designing, developing, unit testing and deploying of business needs. Supported Integration/UAT testing.
- Data integration and ETL from different source systems including flat files, Excel files, relational databases, XML sources and MQ Series data. Created stored procedures, packages and custom functions to define ETL processes in Oracle 9i and DB2.
- In association with the extended team, designed and implemented end-to-end solution from source definitions, target definitions, transformations, mappings, mapplets, sessions, workflows and worklets. Created re-usable mapplets, worklets wherever possible.
- Handled frequent requests to change data-load cron jobs from Lotus Notes to DB2, as well as from different legacy systems to DB2, to maintain a central repository (data mart). Worked on PowerConnect to handle real-time MQ Series data.
- Led and managed the team by coordinating with the extended team. Responsible for maintaining all documentation for technical as well as process quality purposes.
- Generated Brio reports based on the centralized repository as and when requested by users. Also used Brio for testing, keeping in mind that a Brio query is much simpler than the equivalent SQL query.
- Provided Training, Production support and Knowledge transfer to the clients and end-users.
- Surveyed and resolved customer queries on an “on-demand” basis.
Environment: Confidential Datastage, Informatica Power Center v6.0/7.1.2/RT, Oracle 10g/9i - PL/SQL, Brio v6.1, Teradata, DB2 v8.1, Lotus Notes/Domino v6.0, Perl, AIX v5.0, Shell scripts.