Sr. Informatica MDM Developer Resume
Detroit, MI
SUMMARY
- 8+ years of IT experience in analysis, design and development of various software applications in client-server environments, providing Business Intelligence solutions in Data Warehousing for Decision Support Systems and OLAP application development.
- Worked in multiple client-specific environments in the Financial, Telecommunications, Banking and Insurance domains
- Reviewed high-level documents to ensure the requirements were addressed accordingly and discussed the feasibility of the requirements.
- Installed and configured Informatica MDM Hub Console, Hub Store, Cleanse and Match Server, Address Doctor and Informatica Power Center applications
- Experienced in all facets of MDM implementations, including data profiling, metadata acquisition, data migration, validation, reject processing and pre-landing processing.
- Performing requirement gathering, analysis, design, development, testing, implementation, support and maintenance phases of both MDM and Data Integration Projects
- Applied Master Data Management (MDM) and Data Integration concepts in large-scale implementation environments.
- Creating landing tables, Staging tables, Base Tables as per the data model and data sources
- Developing mappings using various cleanse functions and Address doctor functions to move data into Stage tables
- Defining Trust Score, Validation rules, Match rules and Merge settings
- Closely worked with other IT team members, business partners, data stewards, stakeholders, steering committee members and executive sponsors for all MDM and data governance related activities
- Worked on Real Time Integration between MDM Hub and External Applications using Power Center and SIF API for JMS
- Worked extensively on creating a new customer onboarding process with new configuration of external match rules, processing the customer data before loading it into MDM.
- Extensively used ETL methodologies for supporting the data Extraction, Transformation and Loading (ETL) process in a corporate-wide ETL solution using Informatica 9.5.1/8.6
- Experience in using Data sources/targets such as Oracle, SQL Server, Teradata, Netezza, DB2, XML and Flat files.
- Worked extensively on various Informatica Data Integration components - Repository Manager, Designer and Workflow Manager/Monitor
- Good understanding of data warehouse concepts and principles, Kimball & Inmon approaches, Star & Snowflake schemas, fact/dimension tables, normalization/denormalization
- Expertise in Data Analysis, Data Mapping, Data Modeling, Data Profiling and development of Databases for business applications and Data warehouse environments
- Extensively involved in creating Oracle PL/SQL Stored Procedures, Functions, Packages, Triggers, Cursors, and Indexes with Query optimizations as part of ETL Development process
- Proficiency in Data Warehousing techniques for Data Cleansing, Slowly Changing Dimension (SCD) phenomenon, Surrogate Key assignment and Change Data Capture (CDC)
- Strong skills in data analysis, data requirement analysis and data mapping for ETL processes
- Worked on Performance Tuning, identifying and resolving performance bottlenecks in various levels like sources, targets, mappings and sessions
- Tuned and optimized SQL/Teradata queries; handled change request management.
- Wrote extensive Teradata SQL covering multiple statement scenarios in Dev and Prod environments.
- Designed Source to Target mappings, Code Migration, version control, scheduling tools, Auditing, shared folders, data movement, naming in accordance with ETL best practices, Standards and Procedures
- Good communication, decision-making and organizational skills, along with strong analytical and problem-solving skills for undertaking challenging work. Able to work well independently as well as in a team, helping to troubleshoot technology and business-related problems.
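The trust-score and survivorship settings mentioned above decide which source system's value survives into the golden record. A minimal sketch of that idea, with made-up source systems and trust scores (not settings from a real hub):

```python
# Hypothetical trust-based survivorship: for each attribute, the value from
# the most trusted source system that supplied a non-empty value wins.
TRUST_SCORES = {"CRM": 80, "BILLING": 60, "LEGACY": 40}

def consolidate(records):
    """Merge per-source records into one golden record.

    records: list of (source_system, {attribute: value}) pairs.
    """
    golden = {}
    best = {}  # attribute -> trust score of the value currently kept
    for source, attrs in records:
        score = TRUST_SCORES.get(source, 0)
        for attr, value in attrs.items():
            if value and score > best.get(attr, -1):
                golden[attr] = value
                best[attr] = score
    return golden

records = [
    ("LEGACY",  {"name": "J. Smith", "phone": "555-0100"}),
    ("BILLING", {"name": "John Smith", "phone": ""}),
    ("CRM",     {"name": "John Q. Smith"}),
]
print(consolidate(records))
```

Here the CRM name survives because CRM carries the highest trust score, while the phone number survives from the legacy system, the only source that populated it.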
TECHNICAL SKILLS
Data Warehousing: Informatica MDM 9.5.0/9.5.1, Informatica 9.5.1/8.6, Power Exchange, IDQ, IDD, Address Doctor, SIF, JBoss
Languages: C, Java, C++, UNIX Shell Scripts, HTML, XML, Perl, PL/SQL, T-SQL, Visual Basic.
Databases: Oracle, MySQL.
Reporting Tools: Exposure to Cognos Reporting Studio
Testing Tools: QTP, Quality Center.
Job Tools: Autosys, Tivoli, JSS, IBM Rational Clear Case
Source code control: TFS
Operating Systems: UNIX (Sun Solaris), Windows (9X, XP, NT)
PROFESSIONAL EXPERIENCE
Confidential, Detroit, MI
Sr. Informatica MDM Developer
Responsibilities:
- Implement and configure new partner data model and partner data hub.
- Enable Partner data model Front End for business to enable partner management functions.
- Leverage front end to validate and accept cleaned partner data.
- Make partner profile data available to BI / Analytics / DWH layer.
- Installed and configured the MDM Hub on Dev, Test and Prod servers, and the Cleanse Server and Address Doctor in Dev and QA
- Gathered requirements from business users by conducting brainstorming sessions.
- Played a key role in creating the data model according to the requirements.
- Defined the Base objects, Staging tables, foreign key relationships, static lookups, dynamic lookups, queries, packages and query groups.
- Created Mappings to get the data loaded into the Staging tables during the Stage Process.
- Defined Trust and validation rules before loading the data into the base tables.
- Coordinated with Business Leads in making them understand Match & Merge and incorporated their requirements and ideas.
- Created match rule sets for the base objects by defining the match path components, match columns and rules.
- Developed Hierarchies using Hierarchy Manager in the Hub as per the needs.
- Created the IDD application with subject areas and subject area groups; deployed and tested the IDD application and cleanse functions, utilized the timeline feature, and exported/imported master data from flat files.
- Worked extensively on BDD config file to get the changes reflected on IDD.
- Analyzed the data by running the queries and provided the stats after Initial data and incremental Loads.
- Configured JMS message Queue and message triggers with SIF API.
- Configured Web services using SIF API Interface.
- Conducted a high-level review of SAM, covering the use of roles, creation of users and assignment of users to roles.
- Defined Roles and privileges for each environment according to the platform requirements.
- Scheduled MDM Stage Jobs, Load jobs.
- Helped the UAT team test the new platform and author test cases.
- Involved with ETL team in getting the data loaded into the landing tables.
- Worked closely with the Enterprise Data Quality Lead and business users to understand data lineage and the DQ framework.
- Participated in design and development for DQ and data lineage, implementation, and coordination of user testing and business training.
- Executed data quality measurement and monitoring processes, using techniques that provide insight into the quality of data
- Developed data quality initiatives to evaluate new and existing data attributes.
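The match rule sets described above combine exact-match columns with fuzzy-match columns and a threshold. A rough illustration of that pattern, with an assumed threshold and illustrative columns rather than real hub settings:

```python
# Illustrative fuzzy match rule: candidates match only when the exact-match
# column (zip here) agrees and the name similarity clears a threshold.
from difflib import SequenceMatcher

def name_similarity(a, b):
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def is_match(rec_a, rec_b, threshold=0.85):
    # Exact-match column must agree before fuzzy matching is attempted.
    if rec_a["zip"] != rec_b["zip"]:
        return False
    return name_similarity(rec_a["name"], rec_b["name"]) >= threshold

a = {"name": "Jonathan Smith", "zip": "48201"}
b = {"name": "Jonathon Smith", "zip": "48201"}
c = {"name": "Jonathan Smith", "zip": "30301"}
print(is_match(a, b))  # similar names, same zip
print(is_match(a, c))  # same name, different zip
```

Tuning the threshold is the "over and under matching" trade-off discussed in the workshops: a lower threshold merges more near-duplicates but risks merging distinct parties.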
Environment: Informatica Multidomain MDM 9.5.0, Informatica 9.5.1/8.6, IDQ, IDD, JBoss, Oracle 10g, SQL Developer, Address Doctor, SIF, Toad, SQL*Loader, Linux.
Confidential, Atlanta, GA
Informatica MDM Developer
Responsibilities:
- Installed Informatica MDM 9.5 on UNIX (Red Hat) with JBoss 5.1 GA middleware
- Implemented the Informatica MDM workflow, including data profiling, configuration specification and coding, match rule tuning and migration
- Built out best practices regarding data staging, data cleansing and data transformation routines within the Informatica MDM solution
- Defined and built best practices regarding business rules within the Informatica MDM solution
- Created base tables and staging tables based on the data model and the number of source systems
- Developed the mappings using various cleanse functions in the Hub and the address doctor to standardize the data
- Worked with Business users in understanding Match & Merge setup and incorporated their requirements and ideas
- Collaborated with users in defining Match Rules and conducted workshops to discuss over and under matching
- High level review of SAM - discussed use of Roles, creation of users and assignment of user to Role
- Defined roles and privileges for each environment according to the platform requirements
- Defined the security such that schema will be secured with access only granted for specific downstream integration uses, using users created for those specific integrations
- Developed and maintained the master and translation entities in the MDM Hub and published the data to SFDC and ERP systems in support of ongoing business operations
- Created "Customer onboarding" process to load data into landing tables of MDM Hub using external batch processing for initial data load in hub store and define automation process for staging, loading, match and merge.
- Configured, designed and delivered MDM hubs across multiple data domains (Party, Service/Product & Prospect).
- Used the Hierarchies tool to configure entity base objects, entity types, relationship base objects, relationship types, profiles, and put and display packages, and used the entity types as subject areas in IDD.
- Created and developed the IDD application by creating subject area groups, subject areas and entity relationships
- Performed analysis, design, development and maintenance activities associated with the database and data in support of multiple applications (including following standards/procedures to achieve integration of systems through database design).
- Created the Data Validation document, Unit Test Case document, Technical Design document, Informatica Migration Request document and Knowledge Transfer document.
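The cleanse functions and Address Doctor standardization mentioned above normalize raw values before they reach the staging tables. A minimal sketch of one such standardization step; the abbreviation map is illustrative only, not Address Doctor's actual reference data:

```python
# Sketch of an address-cleanse step: trim, uppercase, collapse whitespace,
# and expand common street abbreviations before staging the record.
import re

ABBREVIATIONS = {"ST": "STREET", "AVE": "AVENUE", "RD": "ROAD", "APT": "APARTMENT"}

def cleanse_address(raw):
    tokens = re.split(r"\s+", raw.strip().upper())
    tokens = [ABBREVIATIONS.get(t.rstrip("."), t.rstrip(".")) for t in tokens]
    return " ".join(t for t in tokens if t)

print(cleanse_address("  123  main st. "))
```

Standardizing values this way before the match process runs is what lets exact-match columns agree across source systems that format the same address differently.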
Environment: Informatica MDM 9.5.1, Informatica 9.5.1/8.6, IDQ, Oracle 11g/10g, Tivoli, MS Excel, Unix
Confidential, Newark, NY
Sr. Informatica/ETL Developer
Responsibilities:
- Performed business analysis, requirements gathering and converted them into technical specifications
- Designed the MIS data mart (star schema dimensional modeling) after analyzing various source systems and the final business objects reports
- Designed all the slowly changing dimensions to hold all the history data in the data mart
- Developed all the ETL data loads in Informatica Power Center to load data from the source data base into various dimensions and facts in the MIS data mart
- Implemented Slowly Changing Dimensions (Type 2) while loading data into dimension tables to hold history
- Created reusable transformations, Mapplets and used them in the mappings and workflows
- Developed Informatica Sessions & Workflows using Informatica workflow manager
- Optimized the performance of the Informatica mappings by analyzing the session logs and understanding various bottlenecks (source/target/transformations)
- Used Informatica debugging techniques to debug the mappings and used session log files and bad files to trace errors that occur while loading
- Developed Sybase stored procedures to load data into some of the fact tables
- Involved in PL/SQL query optimization to reduce the overall run time of stored procedures
- Created UNIX shell scripts to invoke the Informatica workflows & Sybase stored procedures
- Created UNIX shell scripts to file move, file archive & FTP data files to other downstream applications.
- Designed CA scheduler jobs to invoke the UNIX shell scripts
- Involved in unit testing of various objects (Informatica workflows/Sybase stored procedures/UNIX scripts)
- Adapted the developed T-SQL scripts into business intelligence datasets for developing the SSRS reports
- Supported various testing cycles during the SIT & UAT phases.
- Involved in creation of initial data set up in the Production environment and involved in code migration activities to Production.
- Supported the daily/weekly ETL batches in the Production environment
- Responded promptly to business user queries and change requests.
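The SCD Type 2 loads described above preserve history by closing the current dimension row and appending a new one. A minimal sketch under an assumed row layout (natural key, tracked value, effective dates, current flag):

```python
# Minimal SCD Type 2 sketch: on a change, close the open row by setting its
# end date and current flag, then append a new current row.
from datetime import date

HIGH_DATE = date(9999, 12, 31)  # conventional "open-ended" end date

def apply_scd2(dim_rows, key, new_value, load_date):
    """dim_rows: list of dicts with keys key, value, eff_from, eff_to, current."""
    for row in dim_rows:
        if row["key"] == key and row["current"]:
            if row["value"] == new_value:
                return dim_rows          # no change, nothing to do
            row["eff_to"] = load_date    # close the old version
            row["current"] = False
            break
    dim_rows.append({"key": key, "value": new_value,
                     "eff_from": load_date, "eff_to": HIGH_DATE,
                     "current": True})
    return dim_rows

dim = [{"key": "C100", "value": "Detroit", "eff_from": date(2012, 1, 1),
        "eff_to": HIGH_DATE, "current": True}]
apply_scd2(dim, "C100", "Atlanta", date(2013, 6, 1))
print([(r["value"], r["current"]) for r in dim])
```

In a real warehouse load this logic runs as an update-then-insert against the dimension table, with a surrogate key assigned to each new version.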
Environment: Informatica, Sybase, UNIX Shell Scripting, SQL, PL/SQL, Tivoli, IBM Rational ClearCase
Confidential, CA
Senior Informatica Developer
Responsibilities:
- Designed the ETL processes using Informatica Power Center to load data from Oracle, MF-VSAM Files and Flat Files to target Oracle Data Warehouse.
- Loaded OLTP data into the OLAP warehouse and built reusable mappings using Informatica Power Center.
- Created Sessions and Workflows to help schedule nightly loads and process data from all source terminal Data Collection points.
- Extensively used pre-session and post-session variable assignment for simulating multithreading scenarios for Load balancing and performance improvement.
- Used Constraint Based loading & Target load ordering to efficiently load tables with PK-FK relation in the same mapping.
- Executed Stored Procedures from different part of the mapping and used the data set in the mapping.
- Worked on Power Exchange to connect to Mainframe and to register the structure of VSAM files as CDC Data Maps using copybooks, performed row test and integrated it in Power Center mappings
- Implemented SCD methodology including Type 1, Type 2 to keep track of historical data.
- Extensively used Parameter file to override Mapping parameter, Mapping Variables, Workflow Variables, Session Parameters, FTP Session Parameters and Source-Target Application Connection parameters.
- Effectively used Static, Dynamic and Persistent Caches on Connected & Unconnected Lookup transformations.
- Wrote Shell scripts and Stored Procedures for regular Maintenance and Production Support to load the warehouse in regular intervals and to perform Pre/Post Session Actions.
- Developed PL/SQL procedures for Data extraction, Transformation and Loading.
- Created various Documents such as Source-To-Target Data mapping Document, Unit Test Cases and Data Migration Document.
- Performed Data Profiles and created Scorecards using Informatica IDQ for analyzing data quality for the business rules in which Pass/Fail percentage is populated in the BO reports
- Worked on Reference Data table generation and to create and run data quality rules in Informatica Data Quality
- Developed Mapping and Mapplets in Informatica IDQ to cleanse the data in the Production and test Environment
- Performed profiling on all PII data and created scorecard with help of Informatica Data Quality tools
- Used HP service manager (change management) for code migration from different environments.
- Created Test Scripts using Quality Center as well as developing test cases and test plans.
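The parameter files used above override mapping parameters and connection settings per workflow or session. A rough sketch of reading that style of file; the section and parameter names are made up for illustration:

```python
# Sketch of parsing an Informatica-style parameter file: bracketed section
# headers scope the $$ parameters to a particular workflow or session.
def parse_param_file(text):
    params, section = {}, None
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue                      # skip blanks and comments
        if line.startswith("[") and line.endswith("]"):
            section = line[1:-1]
            params[section] = {}
        elif "=" in line and section:
            name, _, value = line.partition("=")
            params[section][name.strip()] = value.strip()
    return params

sample = """
[FOLDER.WF:wf_daily_load.ST:s_m_load_cust]
$$LoadDate=2014-01-31
$$SourceConn=ORA_SRC_DEV
"""
print(parse_param_file(sample))
```

Keeping connections and run dates in a file like this is what lets the same workflow run unchanged across Dev, QA and Prod.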
Environment: Informatica Power Center 9.5/8.6.1, Informatica Data Quality 9.5, ERWIN, Oracle 11g, PL/SQL, SQL*Plus, TOAD, OBIEE, COGNOS, Windows, HP-UX, Autosys
Confidential, Sacramento, CA
Informatica Developer
Responsibilities:
- Participated in project meetings with other data analysts, preparing analysis reports and project development reports according to industry regulations and Informatica standards.
- Created ETL mappings to extract data from multiple legacy systems and transform and load it into the Enterprise Data Warehouse using Informatica Power Center 9.5.1.
- Created complex mappings using Unconnected Lookup, Sorter, Aggregator, dynamic Lookup and Router transformations to populate target tables efficiently.
- Worked on performance tuning of the ETL processes. Optimized/tuned mappings for better performance and efficiency.
- Developed and executed SSIS packages to populate data from the various data sources like Oracle, created packages for different data loading operations for many applications
- Defined a target load order plan to load data correctly into the different target tables.
- Understand business challenges and translate them into process/technical solutions. Creation of documents related to the ETL process.
- Successfully migrated BI reporting from various legacy platforms to the enterprise Oracle Business Intelligence platform (OBIEE)
- Used different Data Warehouse techniques like Star-Schema, Snowflake schema.
- Participated in ETL code review meetings to build Informatica workflows, sessions and mappings according to Informatica standards.
- Strong knowledge of data warehousing concepts and dimensional Star schema and Snowflake schema methodologies.
- Worked with many third-party vendors whose requirements ranged from multiple file formats to various destinations
- Used Debugger wizard to remove bottlenecks at source level, transformation level, and target level for the optimum usage of sources, transformations and target loads.
- Worked on UNIX Shell scripting and called several shell scripts using command task in Workflow manager.
- Designed Excel Sheets for each mapping of their Test Scenarios.
- Worked with team members to identify and resolve various Informatica and other database issues.
- Designed ETL mapping documents and various mappings with transformation rules, including complex Slowly Changing Dimension mappings. At BCBSM we had an additional step of code reviews (Informatica, SQL) prior to implementation, which served as a forum for improvements, suggestions and explanations.
- At BCBSM we also had a Community of Practice for the Data Integration teams, where ideas were shared, work across the teams was brainstormed, and new technologies were discussed.
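The target load order plan mentioned above ensures parent tables load before the child tables holding their foreign keys, i.e. a topological sort of the FK dependency graph. A sketch with example table names, not a real schema:

```python
# Illustrative target load order: each table loads only after every parent
# table it references by foreign key has loaded.
def load_order(deps):
    """deps: {table: set of parent tables it references}. Returns load order."""
    order, resolved = [], set()
    pending = dict(deps)
    while pending:
        ready = [t for t, parents in pending.items() if parents <= resolved]
        if not ready:
            raise ValueError("circular foreign-key dependency")
        for t in sorted(ready):      # sort for a deterministic order
            order.append(t)
            resolved.add(t)
            del pending[t]
    return order

deps = {
    "CUSTOMER": set(),
    "ACCOUNT": {"CUSTOMER"},
    "TRANSACTION": {"ACCOUNT", "CUSTOMER"},
}
print(load_order(deps))
```

Constraint-based loading in the session achieves the same effect automatically for tables in one mapping; an explicit order plan like this covers targets spread across mappings.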
Environment: Informatica Power Center 8.6.1, SSRS, Oracle, Microsoft SQL Server Management Studio 2008, Windows Vista/7, Business Intelligence, Change Control, T-SQL, SSIS, Linux, Microsoft Office, BMC Service Desk Express, Agile, Microsoft Visual Studio Professional 2010