
Sr. Informatica MDM Developer Resume



SUMMARY

  • Experience in all aspects of software development and systems management - including project analysis, development, deployment, testing, implementation and documentation - in industries such as Banking, Financial Services, Retail and HR.
  • Extensive knowledge of Business Intelligence and Data Warehousing concepts, with emphasis on ETL and the System Development Life Cycle (SDLC).
  • Good experience in analyzing data sources and performing data profiling and data validation based on business and functional requirements.
  • Working knowledge of Dimensional Data Modeling, Star/Snowflake schema modeling, fact and dimension tables, and Physical & Logical Data Modeling using ERwin and MS Visio.
  • Highly experienced in the data mart development life cycle and in ETL procedures to load data from different sources into data marts and the data warehouse using PowerCenter Repository Manager, Designer, Server Manager, Workflow Manager and Workflow Monitor. Extensively worked with Informatica mappings, sessions and workflows.
  • Working experience with Informatica Metadata Manager.
  • Experienced in designing, installing and configuring core Informatica MDM Hub components: Hub Console, Hub Store, Hub Server, Cleanse Match Server and Cleanse Adapter, along with IDD and data modeling.
  • Wrote web service clients to work with the Services Integration Framework (SIF) in Informatica MDM (a hedged client sketch follows this summary).
  • Worked with the Informatica Data Quality (IDQ) toolkit for analysis, data cleansing, data matching, data conversion, exception handling, reporting and monitoring.
  • Used Address Doctor to profile, validate and enrich address data.
  • Experience in Real Time Data integration using Informatica Power Exchange Change Data Capture (CDC) technique.
  • Created a pilot enterprise Java application using Spring Boot and the Spring Framework.
  • Worked on DataStage tools such as DataStage Designer, DataStage Director and DataStage Administrator.
  • Used BODS Repository Manager, Designer and Management Console. Extensively worked on BODS Dataflows, Workflows and Jobs.
  • Extensively worked on mappings for Financial and banking Data Loads.
  • Have significant experience in XML, XML schema and XML Generator.
  • Significant experience using SAP IDOC Interpreter and Prepare transformations for data extraction.
  • Have experience in writing UNIX Shell scripts.
  • Extensively worked with Oracle, DB2, SQL Server and Sybase.
  • Extensively worked with Oracle PL/SQL stored procedures, functions and triggers, and was involved in query optimization.
  • Extensively worked on Teradata MLoad, FLoad and TPump.
  • Performed SQL tuning across all of these databases.
  • Provided Level 2 and Level 3 production support; implemented migration and change tickets in production.
  • Strong analytical skills and excellent oral and written communication skills.
  • Good at evaluating business needs and architecting solutions, including project time, cost and resource estimation, system design and specifications.
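
The sketch below illustrates the kind of SIF web-service client referenced above: a small Java program that posts a SOAP request (for example, a SearchQuery envelope captured in SOAP UI) to the Hub's SIF endpoint. It is a minimal sketch; the endpoint path /cmx/request, the host and port, and the request file name are placeholder assumptions to be checked against the actual hub installation and the SIF SDK documentation.

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;

/**
 * Minimal SIF web-service client sketch: posts a SOAP request
 * (for example, a SearchQuery envelope captured in SOAP UI) to the
 * MDM Hub's SIF endpoint. The endpoint path /cmx/request, the host,
 * the port and the request file name are placeholder assumptions.
 */
public class SifRequestClient {

    public static void main(String[] args) throws IOException {
        String endpoint = "http://mdm-hub-host:8080/cmx/request";              // placeholder
        byte[] soapRequest = Files.readAllBytes(Paths.get("searchQuery.xml")); // exported from SOAP UI

        HttpURLConnection conn = (HttpURLConnection) new URL(endpoint).openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.setRequestProperty("Content-Type", "text/xml; charset=UTF-8");

        // Send the SOAP envelope as the request body.
        try (OutputStream out = conn.getOutputStream()) {
            out.write(soapRequest);
        }

        // Read the raw SOAP response (or fault) and print it for inspection.
        int status = conn.getResponseCode();
        InputStream in = status < 400 ? conn.getInputStream() : conn.getErrorStream();
        ByteArrayOutputStream buffer = new ByteArrayOutputStream();
        byte[] chunk = new byte[4096];
        for (int read; (read = in.read(chunk)) != -1; ) {
            buffer.write(chunk, 0, read);
        }
        in.close();

        System.out.println("HTTP " + status);
        System.out.println(new String(buffer.toByteArray(), StandardCharsets.UTF_8));
    }
}
```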

TECHNICAL SKILLS

Data Warehousing: Informatica 5.x-10.x (PowerCenter/PowerExchange), Informatica IDQ/MDM, SAP Data Services 3.2, DataStage 7.5, SAS

Databases: Oracle 7.x/8.x/9.x/10g/11g, Sybase, DB2, SQL Server, Teradata 12/13.x, Netezza 6.x/7.x

Database Tools: Toad, Quest Central, SQL*Plus, SQLDbx

Reporting Tools: Business Objects 6x, XI R2.

Utilities: MLoad, FLoad, BTEQ, SQL*Loader

Data Modeling Tools: Erwin ERX 3.5, 4.0, Visio

Languages and Version Controls: SQL, PL/SQL, HTML, C, C++, Unix Shell Scripting and COBOL; XML, XSL and XSD; Microsoft Visual Source Safe 6.0

Operating Systems: Windows 9x/NT/2000/XP, UNIX IBM-AIX 5.1/4.1/3.2, Sun Solaris 2.6, MS-DOS 6.22.

PROFESSIONAL EXPERIENCE

Confidential, OK

Sr. Informatica MDM Developer

Responsibilities:

  • Description: Confidential Corporation is an American petroleum and natural gas exploration and production company headquartered in Oklahoma City. The role supports MDM and helps with implementation projects, specifically the SAP owner implementation. The company is in the release cycle for Phase 3 of its SAP implementation (PRA and core have different concepts of "owners"), and these owners need to be implemented in MDM so that validating owner information takes less time. The work also includes data quality cleanup; the user interface is in IDD.
  • Roles & Responsibilities: Built master data management solutions in an enterprise MDM environment using MDM tools such as Informatica Master Data Management.
  • Provided technical support for the MDM platform, including inbound/outbound data integration (ETL), Data Quality (DQ) and Informatica Data Director (IDD) development, and maintenance and tuning of match rules and expectations.
  • Design data models to support MDM solutions.
  • Design, develop and translate business requirements and processes for data matching and merging rules and survivorship criteria using MDM.
  • Created bulk imports for mass data loads and created formats to upload into different subject areas and child areas.
  • Implemented cleanse rules for data cleansing and formatting of incoming data.
  • Applied data quality techniques such as matching, merging, profiling, scorecarding, data standardization and parsing using IDQ.
  • Consumed application services (URIs) created in IDQ from the MDM Hub for data cleansing (see the cleanse-service sketch after this list).
  • Created reports in IDQ (mapplets and IDQ rules) using profiles; users view the reports through the Analyst tool.
  • Created ETL mappings in Informatica PowerCenter to run the reports; the mappings consume the mapplets created in IDQ.
  • Collaborate with source systems, data strategists and technical staff for data governance and to resolve any data quality or technical issues.
  • Troubleshooting and resolving MDM related issues, providing solid solutions and aligning with master data strategies and best practices.
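
The following is a hedged illustration of consuming a cleanse service published from IDQ as a SOAP URI, using the standard JAX-WS Dispatch API bundled with Java 7/8. The namespace, service and port names, endpoint URL and request payload are hypothetical placeholders; the real values come from the WSDL generated when the IDQ mapping is deployed as a web service.

```java
import java.io.StringReader;
import java.io.StringWriter;
import javax.xml.namespace.QName;
import javax.xml.transform.Source;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;
import javax.xml.ws.Dispatch;
import javax.xml.ws.Service;
import javax.xml.ws.soap.SOAPBinding;

/**
 * Sketch of consuming a cleanse service exposed from IDQ as a SOAP URI,
 * using the JAX-WS Dispatch API (bundled with Java 7/8). The namespace,
 * service/port names, endpoint URL and payload are hypothetical
 * placeholders; real values come from the generated WSDL.
 */
public class CleanseServiceClient {

    public static void main(String[] args) throws Exception {
        String ns = "http://example.com/idq/cleanse";                    // placeholder namespace
        String endpoint = "http://dis-host:7333/AddressCleanseService";  // placeholder URI

        // Register the endpoint against hypothetical service and port names.
        Service service = Service.create(new QName(ns, "AddressCleanseService"));
        QName port = new QName(ns, "AddressCleansePort");
        service.addPort(port, SOAPBinding.SOAP11HTTP_BINDING, endpoint);

        Dispatch<Source> dispatch =
                service.createDispatch(port, Source.class, Service.Mode.PAYLOAD);

        // Hypothetical request payload: one raw address to standardize.
        String request =
                "<cle:cleanseAddress xmlns:cle=\"" + ns + "\">"
              + "<cle:inputAddress>123 main st okc ok</cle:inputAddress>"
              + "</cle:cleanseAddress>";

        Source response = dispatch.invoke(new StreamSource(new StringReader(request)));

        // Serialize and print the response payload for inspection.
        StringWriter out = new StringWriter();
        TransformerFactory.newInstance().newTransformer()
                .transform(response, new StreamResult(out));
        System.out.println(out);
    }
}
```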

Environment: Informatica MDM 10.2, Informatica PowerCenter 10, Informatica IDQ 10, SOAP UI, Web Services, MDM SIF APIs, JBoss, Oracle 10, VSTS for source control

Confidential, GA

Sr. Informatica MDM Developer

Responsibilities:

  • Description: The Coca-Cola Company, a global leader in the beverage industry and the world's biggest beverage company, offers hundreds of beverages including soft drinks, fruit juices and sports drinks. Coca-Cola has three segments: Bottling, Refreshment and Coca-Cola Co. The sales/customer organization hierarchy is defined and stored in Informatica HM, which is the system of record for the sales and pricing (customer) organization.
  • Role & Responsibilities: Analyzed requirements with the architect and created technical documents for building the MDM Hub and hierarchies.
  • Used IDQ to complete initial data profiling and to remove duplicate data.
  • Extensively used IDQ for functional analysis, profiling and development activities, invoking IDQ rules as mapplets. Involved in large-scale data profiling using IDQ (Analyst tool) prior to data staging.
  • Created and deployed IDQ applications and ran the applications, workflows and mappings using Unix scripts (see the launcher sketch after this list).
  • Designed, developed and implemented medium to complex data quality rules covering cleanse, parse, standardization, global address validation, global phone number validation and formatting, Search Match and Match/Merge.
  • Created landing tables, base tables, staging tables according to the data model and number of source systems.
  • Developed various kinds of mappings to load landing, lookup and reference tables.
  • Tested jobs to ensure that data is loaded per the ETL specification, and tested the ETL jobs in the UAT and Test environments.
  • Optimized mappings through SQL query overrides; created complex mappings and tuned them for better performance.
  • Fixed the issues while getting the data loaded in stage and base tables and worked with Informatica Support.
  • Defined the trust scores for the source systems based on an understanding of the business process. Created hierarchies: entities, entity types, hierarchies, entity relationships, relationship types and profiles.
  • Created match rule sets for the base objects by defining the match path components, match columns and rules.
  • Analyzed and profiled the data, came up with the initial match rules, and went through several iterations of match tuning.
  • Successfully implemented IDD using hierarchy configuration and created subject area groups, subject areas, subject area children, IDD display packages in the hub, and search queries.
  • Utilized SOAP UI to call SIF API requests for data cleanup, data put and data retrieval services, and to test the cleanse function services.
  • Develop the web services using SIF APIs.
  • Utilized Repository manager for Validating, promoting, importing and exporting the ORS repositories to different environments.
  • Executed Stage Jobs/ Load/ match & merge jobs using Batch viewer and Batch Group Jobs.
  • Used SOAP UI to test web services deployed on the MDM Hub.
  • Created test cases to test all the IDD UI components and bulk table components.
  • Experience on ActiveVOS, which includes AVOS Console, ActiveVOS Designer, and ActiveVOS Central.
  • Experienced in integrating MDM with ActiveVOS.
  • Reverse-engineered process flow diagrams from the ETL code to support data quality measures.
  • Performance-tuned heavy queries and optimized Informatica MDM jobs.
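
As a sketch of how the Unix wrapper scripts kick off deployed workflows, the snippet below shells out to PowerCenter's pmcmd CLI from Java. The integration service, domain, folder and workflow names are placeholders, and the exact pmcmd flags should be verified against the installed PowerCenter version.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

/**
 * Sketch of launching a deployed workflow by shelling out to
 * PowerCenter's pmcmd CLI, mirroring what the Unix wrapper scripts do.
 * Service, domain, folder and workflow names are placeholders; verify
 * the pmcmd flags against the installed PowerCenter version.
 */
public class WorkflowLauncher {

    public static void main(String[] args) throws IOException, InterruptedException {
        ProcessBuilder pb = new ProcessBuilder(
                "pmcmd", "startworkflow",
                "-sv", "INT_SVC_DEV",               // integration service (placeholder)
                "-d", "Domain_Dev",                 // domain (placeholder)
                "-u", System.getenv("PM_USER"),     // credentials from the environment
                "-p", System.getenv("PM_PASS"),
                "-f", "MDM_LANDING",                // repository folder (placeholder)
                "-wait",                            // block until the workflow finishes
                "wf_Load_Landing_Customer");        // workflow name (placeholder)
        pb.redirectErrorStream(true);

        Process proc = pb.start();
        try (BufferedReader reader =
                     new BufferedReader(new InputStreamReader(proc.getInputStream()))) {
            String line;
            while ((line = reader.readLine()) != null) {
                System.out.println(line);           // echo pmcmd output to the job log
            }
        }
        int exitCode = proc.waitFor();              // non-zero means the launch/run failed
        System.exit(exitCode);
    }
}
```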

Environment: Informatica MDM 10.1, Informatica PowerCenter 9.6 on premise/cloud, Informatica IDQ 9.5, ActiveVOS 9.2, SOAP UI, Java 7, J2EE, Web Services, MDM SIF APIs, JBoss, SQL Server 2012, SVN for source control

Confidential, MN

Sr. Informatica ETL/MDM Developer

Responsibilities:

  • Built complete ETL specifications and logic.
  • Interacted with system analysts to understand the requirements.
  • Designed data loads to the landing tables as either full loads or CDC; the load frequency is either real time, using Informatica PowerExchange, or a set interval.
  • Utilized Informatica IDQ to complete initial data profiling and matching/removing duplicate data.
  • Used Address Doctor to validate and enrich address-related data.
  • Developed various kinds of mappings to load landing, lookup and reference tables.
  • Tested jobs to ensure that data is loaded per the ETL specification, and tested the ETL jobs in the UAT and Test environments.
  • Optimized mappings through SQL query overrides; created complex mappings and tuned them for better performance.
  • Extensively used Mapping Variables, Mapping Parameters, and Parameter Files in the mapping
  • Processed prime member-centric claims as part of Master Data Management initiatives using the Informatica Master Data Management and Data Quality products.
  • Created landing tables, base tables, staging tables according to the data model and number of source systems.
  • Concatenated columns to get unique values loaded into the Pkey source column of the staging tables.
  • Fixed the issues while getting the data loaded in stage and base tables and worked with Informatica Support.
  • Defined the Trust scores for the source systems as per understanding the business process.
  • Created match rule sets for the base objects by defining the match path components, match columns and rules.
  • Analyzed and profiled the data, came up with the initial match rules, and went through several iterations of match tuning.
  • Successfully implemented IDD using hierarchy configuration and created subject area groups, subject areas, subject area children, IDD display packages in the hub, and search queries.
  • Utilized SOAP UI to call SIF API requests for data cleanup, data put and data retrieval services, to test the cleanse function services, and to test web services deployed on the MDM Hub.
  • Utilized Putty to shut down and restart the Admin Server / Hub server to speed up the MDM process.
  • Utilized Repository manager for Validating, promoting, importing and exporting the ORS repositories to different environments.
  • Wrote web service clients to invoke web services using the SIF framework.
  • Created low-level design and ETL technical design documents according to the requirements.
  • Created customized SQL code to enhance the process and SQL scripts to remove redundant data from base tables (see the cleanup sketch after this list).
  • Created Unix scripts to execute the SQL scripts.
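
The sketch below shows a base-table cleanup of the kind described above, driven over JDBC rather than a Unix/SQL*Plus wrapper, assuming the Oracle JDBC driver is on the classpath. It is illustrative only: the connection string and the table and column names (C_CUSTOMER, PKEY_SRC_OBJECT, LAST_UPDATE_DATE) are placeholder assumptions, and in a real hub such deletes would normally go through the MDM delete/cleanse process rather than direct SQL.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;
import java.sql.Statement;

/**
 * Sketch of a redundant-data cleanup against a base table, run over JDBC.
 * The connection string, table and column names (C_CUSTOMER,
 * PKEY_SRC_OBJECT, LAST_UPDATE_DATE) are placeholders for illustration.
 */
public class BaseTableCleanup {

    public static void main(String[] args) throws SQLException {
        String url = "jdbc:oracle:thin:@//db-host:1521/ORS_DEV";   // placeholder ORS schema
        try (Connection conn = DriverManager.getConnection(
                     url, System.getenv("ORS_USER"), System.getenv("ORS_PASS"));
             Statement stmt = conn.createStatement()) {

            conn.setAutoCommit(false);

            // Delete every row that has a newer row for the same source key,
            // keeping only the most recent record per PKEY_SRC_OBJECT.
            int removed = stmt.executeUpdate(
                "DELETE FROM C_CUSTOMER t "
              + " WHERE EXISTS (SELECT 1 FROM C_CUSTOMER d "
              + "                WHERE d.PKEY_SRC_OBJECT = t.PKEY_SRC_OBJECT "
              + "                  AND d.LAST_UPDATE_DATE > t.LAST_UPDATE_DATE)");

            conn.commit();
            System.out.println("Redundant rows removed: " + removed);
        }
    }
}
```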

Environment: Informatica PowerCenter 9.5/MDM 9.7, Informatica IDQ 9.5, Informatica PowerExchange, ActiveVOS, SOAP UI, Java (Core), JBoss, Oracle 11g, ER/Studio 8.0, PuTTY, Linux, Subversion, Shell Scripts, Quest Toad, PL/SQL.
