Informatica MDM Consultant / Admin / Support Lead Resume

Piscataway, NJ

SUMMARY

  • MDM Consultant with 14+ years of experience in designing, developing, and implementing Master Data Management, Data Warehousing, Business Intelligence, and other enterprise level solutions.
  • Data Integration experience includes the Design and Development of ETL to and from OLTP Databases, Enterprise Data Warehouses, Data Marts, and MDM databases.
  • Led large-scale data conversions and guided teams to meet stringent project deadlines.
  • Have extensive technical and business experience with strong analytical, problem solving, communication and interpersonal skills.

TECHNICAL SKILLS

MDM Tools: Informatica MDM 10.x, 9.x

ETL Tools: Informatica 9.x/8.x/7.x

Data Quality: Informatica Data Quality 9.x, Trillium

Databases: Oracle 12c/11g/10g/9i, SQL Server 2000/7.0/6.5, Sybase 10, IBM DB2

Application Servers: Weblogic, JBoss, WebSphere

OS platforms: Linux, UNIX (Solaris 2.8/2.6), Windows 2000, NT, XP

Business Intelligence Tools: Business Objects XI

Core Programming Languages: Java, C++

ERP / CRM: PeopleSoft Financials 8.x

CASE Tools: Erwin 4.5/4.0

Source Code Control: VSS, Rational Clear Case, PVCS

Change Management Tools: Remedy, Clear Quest, JIRA, RTC

Scheduling Tools: AutoSys, Tidal, Control-M, CA Unicenter

Vertical Industry / Domain Knowledge: Life Sciences, Food Services, Public Sector, Healthcare, Finance

PROFESSIONAL EXPERIENCE

Confidential, Piscataway, NJ

Informatica MDM Consultant / Admin / Support lead

Responsibilities:

  • Performed Installation and configuration of the MDM Hub and the ORSs.
  • Served as Informatica MDM Admin and WebLogic application server administrator for multiple MDM projects
  • Upgraded MDM from 10.1 to 10.1 HF2 EBF3 in Dev, QA, and Production environments
  • Worked on support tickets such as incidents, service requests, MDBCs, etc. Provided detailed troubleshooting and root cause analysis.
  • Collaborated with Database Administrators and Linux Admins in resolving production issues
  • Provided support to the QA and UAT teams in preparing and validating test cases
  • Designed and developed customized MDM services and necessary batch interfaces to and from the MDM hub
  • Performed code reviews, analyzed execution plans, and re-factored inefficient code
  • Followed data standards, resolved data governance issues and prepared system documentation for MDM processes
  • Maintained the custom Java UI by making required bug fixes and enhancements
  • Performed capacity planning in adding disk space, memory, CPU etc.
  • Participated in scrum calls and provided representation in PMO meetings
  • Involved in the migration from HP-Unix to Linux. Modified and migrated the shell scripts from HP-UX to Linux
  • Performed change management for application enhancements and patch application
  • Reported on application SLAs and issue trending to the application owner.
  • Liaised with the Technical Application Owner to coordinate all platform/application related activities
  • Acted as single point of contact for MDM application support and enhancement
  • Assisted the MDM team (both onsite and offshore) in resolving business and technical issues, coordinated with various stakeholders, and ensured timely closure of tickets
  • Liaised with Informatica support for Product Issues/Problems
  • Ensured maintenance of proper technical documentation (operations manual, configuration documents, technical specifications)
  • Provided solutions and estimations for new enhancements
  • Coordinated with the stakeholders to create, complete, and achieve sign-off on the enhancement artifacts.

Environment: Informatica MDM Multi-domain Edition 10.x, Weblogic 12c, Java 1.7, Oracle 12c, Linux, Tidal

Confidential, Richmond, VA

Informatica MDM Consultant

Responsibilities:

  • Participated in data discovery, data profiling and requirements analysis around master data management activities for the Customer, Vendor, and Product domains.
  • Participated in project planning sessions with clients, business analysts, and team members to analyze / estimate development requirements / efforts and make appropriate recommendations.
  • Defined, configured and optimized various MDM processes including landing, staging, base objects, foreign-key relationships, lookups, query groups, queries/custom queries, cleanse functions, batch groups and packages using Informatica MDM Hub console.
  • Performed Data standardization of addresses using Address Doctor, and other services as defined by the business.
  • Designed, developed and translated business requirements and processes for data matching and merging rules, survivorship criteria, and data stewardship workflows.
  • Configured IDD applications, enabling subject area groups, subject area children, dropdown lookups, dependent lookups, sibling references, cleanse functions, etc.
  • Wrote / maintained IDD user exits to perform data validation as part of the CRUD process.
  • Used external matching to match ad-hoc data provided by non-sync systems and publish the results back to the systems via ETL.
  • Created users, assigned roles (such as Requestor, Data Steward, Credit, Legal, Transportation etc.) and privileges using the SAM framework.
  • Performed root cause analysis for data quality / code / MDM related issues, and worked with different teams to bring the defects to closure.
  • Provided continuing enhancement and review of MDM matching rules, data quality and validation processes
  • Used Repository manager to import and export the metadata and promote Incremental changes from development to SIT and UAT environments.
  • Identified defects in the MDM product and collaborated with Informatica Support for resolving tickets.
  • Provided inputs to the QA team in preparing test plans, test scripts, and test cases by explaining MDM functionality, behavior, data model, and architecture.
  • Collaborated with various technical teams and business users for Development, QA, UAT and Production Support.
  • Authored / Maintained technical documentation such as HLA documents, Technical LLD documents, release notes for migration etc.

Environment: Informatica MDM Multi-domain Edition 10.1, 9.7.1 HF5, IDD, JBoss 6.0, Java 1.7, Informatica Power Center 9.6, SQL Server 2012

Confidential - Atlanta, GA

Informatica MDM Consultant

Responsibilities:

  • Set up schema definitions that included creating / configuring landing tables, staging tables, and base tables.
  • Profiled source data using IDQ to gauge data quality and prepared high level followed by detailed specifications for validation rules, trust, Match rules, cleanse functions etc.
  • Built cleanse lists as per the requirements. Wrote medium to complex Cleanse functions and classified them under Web Services, IDD, and General cleanse functions.
  • Built the Alerts / Publish process by setting up Message queues, Message triggers, etc. for inserts and updates. Built the corresponding Java and ETL layer to consume the messages / XMLs generated from the hub, process them, and send them to the downstream web services consumers.
  • Built the Match rules that included a combination of fuzzy and exact match rules. Some of the match rules were used by the web services to identify a potential match in the Hub database when a new client is entered or updated in the system through the front office applications.
  • Built the ETL / MDM layer to process the one-time initial data conversion from multiple sources. Match rules were built to identify potential duplicates, including the use of external match tables, to logically merge legacy data before loading it into the Hub.
  • Built Mappings between landing and staging tables for different source systems. This included using cleanse functions to cleanse data before loading it into the staging tables.
  • Built Queries and Packages that were used by the Web Services and IDD to fetch and write data from and to the Hub.
  • Created Users, Privileges, and Roles. Assigned privileges to roles as per the requirements. Some of the roles were those of Data Steward, Data Governance, Technical Users, Interface Users, etc.
  • Built the framework to execute / invoke MDM Batch groups from Linux shell scripts, and scheduled them using Autosys.
  • Tuned Initial data load, Subsequent data loads, match rules, for better performance.
  • Built Subject Areas in IDD. Added cleanse functions to the Subject areas to cleanse / validate the data when entered from the IDD console.
  • Maintained the Java code that was used to build the custom web services. Some of the custom web services were Create, Update, Match, Search, Detail Search, Merge / Unmerge, Multi-Client Search, Program Participation, etc.
  • Involved in writing some of the IDD user exits in Java for Merge / Unmerge.
  • Represented the team in internal meetings, and coordinated with other teams such as the DBA group, QA team, PMO team, webMethods team, WebSphere admins, etc.
  • Wrote technical documentation that included design documents for the Hub / IDD configuration and custom web services design. Coordinated with the PMO to streamline documents before submitting them to the State for review.
  • Performed data analysis with the intent of identifying root causes, escalated to appropriate levels based on criticality, and provided inputs to the Business Analysts that were used to communicate with their state counterparts.
  • Performed troubleshooting and code fixes by resolving / closing tickets in RTC, which were opened by QA teams as part of the String and System Integration testing.
  • Performed Administrative tasks such as daily backup of Hub and IDD code. Migrated Hub and IDD code on a regular basis from Development to other environments.
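
The shell-script framework for invoking MDM batch groups and returning a status the scheduler understands can be sketched as follows. This is a minimal illustration, not the actual framework: the invocation command, timeout, and exit-code conventions are all assumptions.

```python
import subprocess
import sys

def run_batch_group(command):
    """Run one batch-group invocation and translate the result into a
    scheduler-friendly exit code (0 = success, non-zero = failure).
    `command` stands in for whatever actually kicks off the group
    (e.g. a SIF call or a stored-procedure wrapper) -- hypothetical here."""
    try:
        result = subprocess.run(command, capture_output=True, text=True,
                                timeout=3600)
    except subprocess.TimeoutExpired:
        return 2  # distinct code so the scheduler can alert on hangs
    return 0 if result.returncode == 0 else 1

if __name__ == "__main__":
    # The scheduler (Autosys in this case) keys success/failure off the
    # process exit code, so we surface it directly.
    sys.exit(run_batch_group(["echo", "batch group started"]))
```

A wrapper of this shape lets the scheduler retry or page on non-zero codes without parsing logs.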

Environment: Informatica MDM Multi-domain Edition 9.7.1, IDD, Informatica Data Quality (IDQ) 9.x, Websphere Application Server, Java 1.7, Informatica Power Center 9.6, Oracle 11g, SQL Developer, Red Hat Linux

Confidential, Long Island, NY

Informatica MDM Consultant

Responsibilities:

  • Analyzed business and functional requirements and translated into technical specifications and data rules required for the implementation of the MDM solution.
  • Provided high level estimates and identified technical risk during the initial stages of the project, in conjunction with the development of overall project plan.
  • Interfaced with Business Analysts and Business SME’s to gather and understand requirements. Partnered with functional owners and Project Managers to validate and document technical requirements.
  • Authored the MDM design documents including detailed steps and Visio diagrams that described the data processes and scenarios.
  • Created an end-to-end roadmap on how the logical design will translate into an efficient and reliable technical solution, and how the data will flow through the successive stages.
  • Worked with Data Architects to define conceptual/logical/physical data models for the MDM hub.
  • Responsible for the introduction of the high level MDM / ETL design to the consulting services vendor.
  • Prototyped the technical solution and presented the hub solution to the Business and IT Stakeholders for review and approval.
  • Analyzed and profiled customer data residing in various source systems such as ERP, Legacy, CRM, Salesforce systems, and defined these source systems in the Hub.
  • Defined the schema / base tables, staging tables, and landing table, configured base objects, foreign-key relationships, query groups, queries and packages.
  • Built ETLs using Informatica Powercenter to load data from different sources into a landing table
  • Configured the Foreign Key Relationships among the Base Objects and defined the lookups for the staging tables
  • Configured Address Doctor by modifying the configuration files on the hub server
  • Developed mappings between landing and staging tables using various cleanse and Address Doctor functions, including graph functions, cleanse lists, etc.
  • Worked closely with the Business SMEs in gathering the Match and Merge rules and defining Trust by using results of the detailed source systems analysis to create Match and Merge rule document and a consolidated Trust framework document.
  • Helped the Business determine the most reliable data representing the Best Version of Truth (BVT) for data consolidation.
  • Developed validation Rules, Match/Merge Setup - the Match paths, Match Columns and Match rules (Exact and Fuzzy) in the Match and Merge Process.
  • Built User exits to perform data validation and notifying business and IT support of errors, if any.
  • Wrote / configured scripts / ETLs to Schedule the MDM batch jobs using a third-party scheduling tool.
  • Used Metadata manager to Import and Export Metadata and Promote Incremental changes between environments from development to QA.
  • Used Hierarchy manager to create hierarchical relationships between design objects using entities and hierarchies
  • Used Security Access Manager (SAM) to create roles and assign users to roles.
  • Worked on the IDD UI to configure subject areas and child relationships that assisted in searching and editing master records, enabling data stewards to monitor and correct data.
  • Identified data discrepancies and data quality issues, and worked with data stewards to ensure data consistency and integrity.
  • Assisted the internal QA team in producing detailed test plans to ensure a rigorous quality assurance process
  • Documented ongoing data quality issues and rules to be monitored post-production.
  • Worked effectively across various support teams and with senior management
  • Provided training and knowledge transfer to the application support group. Provided assistance to production support as required.
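
The exact and fuzzy match rules described above follow a common pattern: an exact rule on a key column, backed by a fuzzy rule on a name column gated by a supporting exact column. A minimal sketch of that idea (the column names, threshold, and similarity measure are illustrative assumptions, not the MDM Hub's internal algorithm):

```python
from difflib import SequenceMatcher

def exact_match(a, b):
    # Exact rule: normalized equality on a column (e.g. an ID or city)
    return a.strip().lower() == b.strip().lower()

def fuzzy_match(a, b, threshold=0.85):
    # Fuzzy rule: similarity ratio above a tuned threshold
    return SequenceMatcher(None, a.lower(), b.lower()).ratio() >= threshold

def is_candidate_match(rec_a, rec_b):
    # A record pair is a candidate if the exact rule fires on the ID,
    # or the fuzzy rule fires on the name AND the exact rule on city.
    return (exact_match(rec_a["id"], rec_b["id"])
            or (fuzzy_match(rec_a["name"], rec_b["name"])
                and exact_match(rec_a["city"], rec_b["city"])))
```

Tuning in practice is mostly about adjusting the threshold and the combination of supporting columns until false positives and misses balance out, which mirrors the iterative rule-tuning effort with the Business SMEs.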

Environment: Informatica MDM Multi-domain Edition 9.x, IDD, Informatica Power Center 9.x, Oracle 11g, Toad, SQL Navigator, Red Hat Linux

Confidential, Plainsboro, NJ

Informatica MDM Consultant

Responsibilities:

  • Profiled and analyzed the data of various BMS internal and external sources using Data Flux. Worked with the functional architects to determine the viability of integrating the sources with the customer hub. The sources consisted of DB2, Oracle and Sybase.
  • Designed the ‘Subscribe’ layer by using a common layout format through which the Hub would subscribe to data from interfacing applications. This subscribe layer was built using Informatica PowerCenter. Some of the sources that were integrated with the Hub were IMS, SAP, Siebel, and SFA (Sales Force Automation).
  • Designed the ‘Publishing’ layer of the Hub, which fed the downstream applications including the Sales and Marketing data warehouse and the report generation process. This layer was built using Oracle materialized views and contained a de-normalized view of the Hub.
  • Collaborated with Global Shared Service Governance teams to institutionalize policy, standards and project related architectural deliverables
  • Developed value-added, strategic solutions and sharing of best practices throughout the organization and the architectural community.
  • Drove data analysis efforts to define common data formats.
  • Worked with the Data Architects to define and tune the data model for the Hub.
  • Used the Hub Console tool to define the base tables, staging tables, and landing tables based on the data model and the high level Architecture document.
  • Defined relationships in the base objects and lookups in the staging tables to enforce referential integrity in the Hub.
  • Set up delta detection wherever applicable in the staging tables. Wrote custom CDC routines using ETL to load incremental data in landing when applicable.
  • Wrote custom cleanse functions to cleanse and standardize data before loading into the stage ensuring the cleanse functions offer maximum re-use potential.
  • Worked with the Business in defining the trust and match and merge rules, including auto match and manual match. This was an ongoing effort in the initial phases until the trust, match, and merge rules were tuned and acceptable to the business.
  • Defined exact and fuzzy match keys, including exact and fuzzy match rules, match paths, and appropriate key widths that would aid in efficient matching.
  • Defined validation rules to downgrade trust when applicable.
  • Scheduled the batch jobs in CA Unicenter by packaging the MDM procedures within an ETL framework.
  • Involved in troubleshooting and identifying data discrepancies and worked with the data stewards to resolve / rectify data issues.
  • Wrote user exits to validate data and notify business and IT support teams if errors exceed a particular threshold value.
  • Set up Subject areas and child subject areas in IDD to enable the data stewards to query and rectify data. Built appropriate queries and packages that would be used in IDD to aid the data stewards.
  • Used Metadata manager to Import and Export Metadata / create Change lists and promote Incremental changes between environments from development to QA.
  • Used Hierarchy manager to create hierarchical relationships between design objects using entities and hierarchies
  • Used Security Access Manager (SAM) to create roles and assign users to roles.
  • Engaged/Assembled/Defined a data governance strategy to resolve data terminology/definition issues.
  • Authored technical documentation in the form of Software Design Documents (SDD), High Level Architecture (HLA) documents and helped create best practices.
  • Served as a guardian of system data integrity and liaison between end users and technical team. Worked with the QA team to develop testing procedures to ensure data quality.
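
The delta-detection setup mentioned above boils down to comparing each incoming record against a stored snapshot and routing rows to insert or update paths. A minimal sketch of that CDC routine, with hypothetical column names (the Hub's built-in delta detection compares configured columns; this illustrates the hash-comparison variant used in the custom ETL routines):

```python
import hashlib

def row_hash(row):
    # Hash the concatenated column values; any changed value flips the hash
    return hashlib.md5("|".join(str(v) for v in row.values()).encode()).hexdigest()

def detect_deltas(previous, incoming, key="id"):
    """Return (inserts, updates): rows new to the feed, and rows whose
    hash differs from the stored snapshot. Unchanged rows are skipped
    so only true deltas land in the landing table."""
    prev_hashes = {r[key]: row_hash(r) for r in previous}
    inserts, updates = [], []
    for r in incoming:
        if r[key] not in prev_hashes:
            inserts.append(r)
        elif prev_hashes[r[key]] != row_hash(r):
            updates.append(r)
    return inserts, updates
```

Filtering unchanged rows this way keeps the stage and load jobs processing only genuine changes, which matters when source tables run to millions of rows.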

Environment: Informatica MDM 9.x, Siperian XU (Informatica MDM), ETL Informatica Power Center 9.x/8.x/7.x, Oracle 11g / 10g, DB2, Toad, SQL Navigator, Red Hat Linux, Remedy, Data Flux 8.1, Dendrite 1.x.x, CA Unicenter 2.4

Confidential, Jersey City, NJ

Architect / Developer

Responsibilities:

  • Supported other ETL developers; provided mentoring, technical assistance, troubleshooting and alternative development solutions
  • Ensured common application architectural framework and patterns across the ETL Platform.
  • Consulted with key individuals across multiple projects regarding the usage and application of ETL architectural standards.
  • Determined whether a variance from standards should be approved, recommended for approval, or rejected. Enforced / reported all variances.
  • Oversaw the design and delivery of proof-of-concept for new and improved ETL technologies
  • Defined and implemented ETL architectural principles such as modularity, restartability, parallelization, and table-driven business rules.
  • Mapped data from source to target and prepared prototype models to support development.
  • Worked with DBAs and Data Architects in planning and implementing appropriate data partitioning strategy
  • Interacted with software, network, database, and Unix administrators to seamlessly implement solutions for complex technical issues.
  • Drove technical decisions, documented potential solution options, and presented them so that releases could continue towards deployment.
  • Conducted data quality assessments to validate solution options.
  • Created, documented, and performed integration testing plans as required based on design documentation
  • Provided oversight to all database development efforts to ensure that future development adheres to the data architecture blueprint.
  • Provided technical recommendations for optimized data access and retention for the data warehouse.
  • Pro-actively monitored and managed the data warehouse platform to ensure SLA adherence
  • Created and Managed ETL Architecture Documents.
  • Supported ETL and Data Warehouse production environments, and Conversion/Migration of data from new systems.
  • Performed baseline and ongoing benchmarks on Informatica production sessions; monitored and optimized both production and development/QA instances of PowerCenter.
  • Provided oversight for the pre-production / implementation phase of the Change Management process
  • Participated in technical design sessions to ensure optimal and appropriate use of data and compliance with data focused development standards.
  • Promoted the delivery of new approaches by driving Technology Proof of Concepts and Architecture Projects
  • Determined ETL Data Architecture and strategies to ensure ETL processes and Data Warehouse data models are optimized for data reuse and scalability.
  • Reviewed database design changes with the team to determine the impact to the physical database. Reviewed work product of others for compliance with development standards.
  • Coordinated with senior management, Business Analysts, DBAs, ETL developers and Testers.
  • Estimated the work effort and assisted the project manager with task planning and level of effort estimates.
  • Led ETL Developers in creating transformation strategy and troubleshooting ETL performance problems with ETL developers.
  • Performance tuned databases and SQL statements in connection to ETL code as required.
  • Reviewed business requirements and technical design documents to develop effective data and database solutions.
  • Communicated changes, enhancements, and modifications, verbally and through written documentation to project managers, sponsors, and other stakeholders so dat issues are escalated and solutions are adopted.
  • Tracked and managed issues, risks, action items and planned deliverables for the project.
  • Wrote and maintained technical documentation, including design documents, describing standard processes for other developers to follow.
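
The restartability principle above is typically implemented as a wrapper that re-runs a failed step only for transient faults (connection drops, timeouts) and fails fast on everything else. A minimal sketch, with the error taxonomy and retry counts as illustrative assumptions:

```python
import time

# Stand-ins for the transient error classes an ETL step might raise
# (database connection loss, network timeouts); real frameworks map
# vendor error codes into this category.
TRANSIENT = (ConnectionError, TimeoutError)

def run_with_restart(step, max_attempts=3, delay=0.01):
    """Re-run a failed ETL step only for transient errors; anything
    else (bad data, logic bugs) propagates immediately so it can be
    investigated rather than silently retried."""
    for attempt in range(1, max_attempts + 1):
        try:
            return step()
        except TRANSIENT:
            if attempt == max_attempts:
                raise
            time.sleep(delay)  # back off before restarting the step
```

Separating restartable faults from genuine failures is what makes automated restarts safe: a retry never masks a data or code defect.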

Environment: ETL Informatica Power Center 8.6, Oracle 11i, Toad, SQL Navigator, Red Hat Linux, Autosys

Confidential

ETL Lead

Responsibilities:

  • Developed various Modules using Informatica ETL as part of SOX compliance to integrate with PeopleSoft General Ledger (GL)
  • The Modules were sourced from a combination of systems ranging from the NTAPS mainframe system (a customized version of Morgan Stanley’s TAPS Equities and Fixed Income system), the Nomura Data Warehouse (Oracle), the Middle Office system (Sybase), and unstructured data.
  • The architecture was a top-down approach, with the legacy OLTP systems / systems of record feeding the Data Warehouse, which in turn fed the data mart that was used to post the highly summarized and refined data to PeopleSoft GL.
  • Designed the core Equity Modules like Stock Loan, Options Reserve and Index Analysis. Used data from external vendors like Bloomberg to get the daily pricing for Stocks and Options
  • Designed the Fixed Income module, Hong Kong Dollar Repo, to process Repo and Reverse Repo trades. This module calculated, among other things, the daily PnL for each roundtrip trade.
  • Maintained (in the form of Change Requests) other Equity modules like NAV, Fair Value, Equity Load pricing.
  • Interfaced with the Database Administration Group to ensure proper configuration of database objects in support of ETL code
  • Designed ETL routines according to specifications provided by the Business Analysts. Made the mappings more robust by replacing hard-wired values with parameters within the maps so that they easily adapt to changes in the business, thereby reducing turnaround time.
  • Implemented Change Data Capture (CDC) by using a combination of the Time Dimension and the Current Business date. The data extraction strategy was based on the source date keys. The source NTAPS database was of size 3 terabytes with the pivotal tables averaging more than 100 million rows on any given day.
  • The ETL was run 5 days a week, Monday to Friday, through Control-M, with the average window of opportunity being 4 hours a day. Helped in setting up the Control-M jobs by writing appropriate documentation forms that specified the sequence of the workflow execution and the dependencies between workflows, including dependencies on trigger / sentinel files and user input files. Most of the jobs were event driven while some were time driven.
  • Tracked and managed Defects using Clear Quest. Worked with development teams to identify root causes and resolve issues including critical and emergency fixes, within the published SLAs. Kept client informed of progress and recorded status progress in the problem management log through to closure.
  • Worked with business users during UAT defect resolution and ensured that all change requests were reviewed against, and implemented according to, the Technical and Data Architecture standards
  • Worked with other colleagues in the technology department to improve upon best engineering practices with regard to data mart testing, and contributed to test process collateral for the Quality Assurance team. Liaised with the Business Analysts to support reconciliation phases, including reconciliation at both the aggregate and granular level.
  • Identified gaps and inefficiencies in the existing modules, and recommended solutions to correct/prevent potential problems. Played an active role in developing the Informatica standards and in further establishing the Informatica centre of excellence
  • Coordinated the configuration management and deployment procedures for releasing code to test and live environments to ensure a reliable and highly available production environment.
  • Maintained PL/SQL procedures and functions as part of the ETL framework. Reviewed existing PL/SQL routines and re-wrote some using ETL code.
  • Worked with data stewards / data owners to produce / maintain data mapping documents and wrote technical documentation required for a transition to the production operations team.
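
The CDC strategy above (Time Dimension plus current business date, extracting on source date keys) amounts to computing the range of date keys between the last successfully loaded business date and the current one. A minimal sketch; the YYYYMMDD integer date-key format is an assumption for illustration:

```python
from datetime import date, timedelta

def extraction_window(last_loaded, current_business_date):
    """Return the list of date keys to extract: every business-date key
    after the last successfully loaded date, up to and including the
    current business date. The extract predicate then becomes
    'WHERE date_key IN (...)' against the source date keys."""
    days, d = [], last_loaded + timedelta(days=1)
    while d <= current_business_date:
        days.append(int(d.strftime("%Y%m%d")))
        d += timedelta(days=1)
    return days
```

Driving the extract off date keys rather than full-table scans is what keeps a multi-terabyte source with 100-million-row tables loadable inside a 4-hour window.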

Environment: ETL Informatica Power Center 8.1.1/7.1.4, Oracle 10g, Red Hat Linux, Business Objects XI, Sybase X, Rational Clear Quest, PL/SQL Navigator

Confidential, NY

Sr. Developer / Architect

Responsibilities:

  • Performed in-depth review of source data quality. Developed specifications for transformation and cleansing.
  • Designed and architected the ETL solution, which included designing mappings and workflows, deciding load strategies, implementing appropriate error handling and error notification processes, scheduling, and designing re-usable ETL pieces through parameterization.
  • Reviewed data models and functional specifications before creating the ETL mappings. Apart from relational databases, these included unstructured data and non-relational data that were received on a daily basis. Automated the process using shell scripts (bash, Perl) and Informatica to load the newly arrived flat files.
  • Designed and implemented Change Data capture processes for Fact and Dimension tables through a combination of Time Stamps, staging (before / after images) and bridge tables.
  • Developed mappings for Hybrid Dimension tables. These mappings updated multiple dimension tables in a single mapping by simulating an update cascade.
  • Strategically used ELT (Extract, Load, Transform) approach to increase performance of poor performing maps by using the features of relational database systems.
  • Wrote SQL queries against the Informatica metadata to determine the execution time of each session over a period of time and identify those that were performing poorly. These mappings / sessions were then analyzed to resolve the performance bottlenecks. This was an ongoing process to improve ETL efficiency.
  • Improved performance of the ETL processes by benchmarking different approaches to reduce the ETL load time. Based on the benchmark figures, implemented changes to the ETL processes that included changing the design of certain mappings, adding new intermediate / staging tables, and using database partitioning / Informatica pipeline partitioning for large loads.
  • Designed and developed the restartability module by writing UNIX scripts. These scripts would automatically re-start ETL jobs that fail due to specified errors, such as database connection errors.
  • Provided administrative and operational support of the Informatica environment, including installation, configuration, monitoring, maintenance, and backups.
  • Provided technical support for the enterprise data mart, including the development of enhancement requests and defect resolution.
  • Extensively used Visual SourceSafe to archive technical documentation and UNIX / PL/SQL code. Used Mantis to track / resolve open defects.
  • Provided input and recommendations on technical issues to the Project Manager.
  • Created and maintained technical documentation related to the ETL processes.
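
Once session run times are pulled from the Informatica metadata, flagging poor performers is a simple comparison of each session's latest run against its own history. A minimal sketch of that analysis step; the record shape and the 2x threshold are illustrative assumptions:

```python
from statistics import mean

def slow_sessions(run_log, factor=2.0):
    """Given chronologically ordered (session_name, duration_seconds)
    run records, flag sessions whose latest run exceeds `factor` times
    their historical average -- candidates for bottleneck analysis."""
    history = {}
    for name, secs in run_log:
        history.setdefault(name, []).append(secs)
    flagged = []
    for name, runs in history.items():
        if len(runs) > 1 and runs[-1] > factor * mean(runs[:-1]):
            flagged.append(name)
    return flagged
```

Comparing each session against its own baseline, rather than a global cutoff, keeps naturally long-running loads from drowning out genuine regressions.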

Environment: ETL Informatica Power Center 7.1.3, Oracle 10g / 9i, SQL Server 2000, RedHat Linux, Perl, MS-Project, Visual Source Safe