Sr. Informatica MDM Developer Resume

Oklahoma City, OK

SUMMARY:

  • 8+ years of IT experience in the analysis, design, development, testing, and implementation of business application systems.
  • 7+ years of experience in Informatica MDM, Informatica PowerCenter, IDQ, and Oracle as an Informatica, Informatica MDM, and ETL Developer.
  • Experience designing, developing, testing, reviewing, and optimizing Informatica MDM (Siperian) implementations.
  • Expertise in creating Mappings, Trust and Validation rules, Match Path, Match Column, Match rules, Merge properties, and Batch Groups.
  • Experience in creation and maintenance of entity objects, hierarchies, entity types, relationship objects, and relationship types using the Hierarchy tool to enable Hierarchy Manager (HM) in MDM Hub implementations.
  • Hands-on experience in design and configuration of landing tables, staging tables, base objects, hierarchies, foreign-key relationships, lookups, query groups, queries/custom queries and packages.
  • Designed, Installed, Configured core Informatica/Siperian MDM Hub components such as Informatica MDM Hub Console, Hub Store, Hub Server, Cleanse Match Server, Cleanse Adapter, Data Modeling.
  • Worked with the SIF API to interact with real-time applications.
  • Solid expertise in Data Extraction, Data Migration, Data Transformation and Data Loading using ETL process in Informatica Power Center 9.x/8.x/7.x.
  • Experience in designing Reusable Transformations in Informatica (such as Joiner, Sorter, Aggregator, Expression, Lookup, Router, Filter, Update Strategy, Sequence Generator, Normalizer, and Rank) and Mappings using Informatica Designer, and processing tasks using Workflow Manager to move data from multiple sources into targets.
  • Worked on Data Profiling using IDE-Informatica Data Explorer and IDQ-Informatica Data Quality to examine different patterns of source data. Proficient in developing Informatica IDQ transformations like Parser, Classifier, Standardizer and Decision.
  • Experience in designing and developing Informatica mappings for data loads that include Source Qualifier, Aggregator, Unconnected Lookup, Connected Lookup, Filter, Router, Update Strategy etc.
  • Strong SQL, PL/SQL, and T-SQL programming skills in Oracle 12c/11g/10g, SQL Server 2012/2008/2000, Teradata 14/13/12, and DB2 databases. Proficient in writing Packages, Stored Procedures, Triggers, Views, and Indexes, and in query optimization (a sketch follows this list).
  • Experience in integration of various data sources like Oracle, SQL Server, MS Access, flat files, and XML files.
  • Experience in debugging, error handling and performance tuning of sources, targets, mappings and sessions with the help of error logs generated by Informatica server.
  • Experience in Agile, Waterfall and UML Methodologies.
  • Good Exposure to development, testing, debugging, implementation, documentation and production support.
  • Extensive experience in implementation of Data Cleanup procedures, Transformations, Scripts and Execution of Test Plans for loading the data into the targets.
  • Experience in UNIX shell and batch scripting, Perl, FTP, SFTP and file management in various UNIX/Windows environments.
  • Performed unit testing at various levels of the ETL.
  • Experience in production support, resolving critical issues and coordinating with teams to ensure successful resolution of reported incidents.
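
For illustration, a minimal sketch of the kind of PL/SQL and query-optimization work summarized above, as it might run from SQL*Plus. The CUSTOMER table, column names, and connection variables are hypothetical placeholders, not artifacts from any engagement:

```sh
#!/bin/sh
# Hypothetical example: index a lookup column, wrap the lookup in a stored
# procedure, and confirm the optimizer actually uses the index.
sqlplus -s "$ORA_USER/$ORA_PASS@$ORA_TNS" <<'EOF'
-- Index to support frequent lookups by last name
CREATE INDEX idx_customer_lname ON customer (last_name);

-- Simple stored procedure returning one customer id for a last name
CREATE OR REPLACE PROCEDURE get_customer_id (
    p_last_name IN  customer.last_name%TYPE,
    p_id        OUT customer.customer_id%TYPE
) AS
BEGIN
    SELECT customer_id INTO p_id
      FROM customer
     WHERE last_name = p_last_name
       AND ROWNUM = 1;
END;
/

-- Check the execution plan to verify the index is picked up
EXPLAIN PLAN FOR
SELECT customer_id FROM customer WHERE last_name = 'SMITH';
SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);
EXIT
EOF
```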

TECHNICAL SKILLS:

Operating Systems: Windows 2008/2007/2005/NT/XP, UNIX, MS-DOS

ETL Tools: Talend, TOS, TIS, Informatica Power Center 9.x/8.x/7.x/6.x (Designer, Workflow Manager, Workflow Monitor, Repository Manager and Informatica Server), SSIS, Ab Initio.

Databases: Oracle 12c/11g/10g/9i/8i, MS SQL Server 2005, DB2 v8.1, Netezza, Teradata, HBase.

Data Modeling: Logical / Physical

Dimensional Modeling: Star / Snowflake

Languages: SQL, PL/SQL, UNIX shell scripts, C++, SOAP UI, JSP, Web Services, JavaScript, HTML, Eclipse

Scheduling Tools: Autosys, Control-M

Testing Tools: QTP, WinRunner, LoadRunner, Quality Center, Test Director, Clear Test, ClearCase.

PROFESSIONAL EXPERIENCE:

Sr. Informatica MDM Developer

Confidential, Oklahoma City, OK

Responsibilities:

  • Worked with ETL developers to create external batches that execute mappings and mapplets in the Informatica Workflow Designer, integrating Shire’s data from varied sources (Oracle, DB2, flat files, and SQL databases) and loading it into the landing tables of the Informatica MDM Hub.
  • Designed and created base objects (BO), staging tables, mappings, and transformations per business requirements.
  • Created mappings to perform tasks such as cleansing the data and populating it into staging tables.
  • Used Metadata manager for validating, promoting, importing and exporting repositories from development environment to testing environment.
  • Involved with Data Steward Team for designing, documenting and configuring Informatica Data Director for supporting management of MDM data.
  • Identified Golden Record (BVT) for Customer Data by analyzing the data and duplicate records coming from different source systems.
  • Helped the testing team verify data integrity and consistency (a sketch of such a check follows this list).
  • Defined and configured the schema, staging tables, landing tables, base objects, foreign-key relationships, lookup systems and tables, packages, query groups, and queries/custom queries.
  • Defined the Trust and Validation rules and set up the match/merge rule sets to get the right master records.
  • Used the Hierarchies tool for configuring entity base objects, entity types, relationship base objects, relationship types, profiles, and put and display packages, and used the entity types as subject areas in IDD.
  • Successfully implemented IDD using hierarchy configuration and created subject area groups, subject areas, subject area child, IDD display packages in hub and search queries.
  • Deployed new MDM Hub for portals in conjunction with user interface on IDD application.
  • Configured match rule set property by enabling search by rules in MDM according to Business Rules.
  • Created mappings using Informatica Power Center and loaded data into pre-landing tables.
  • Created Unit Test Case Document, Detailed Design Document, Supplementary Document and Knowledge Transfer Document.
  • Gave a high-level review of SAM (Security Access Manager), discussing the use of Roles, the creation of users, and the assignment of users to Roles.
  • Created the Data Validation, Unit Test Case, Technical Design, Informatica Migration Request, and Knowledge Transfer documents.
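
A hedged sketch of the kind of post-stage integrity check used when helping the testing team: compare landing, staging, and reject row counts from SQL*Plus. The C_CUSTOMER_LND, C_CUSTOMER_STG, and C_CUSTOMER_STG_REJ table names and connection variables are illustrative, not the actual Hub schema:

```sh
#!/bin/sh
# Hypothetical example: after a stage job, compare landing, staging, and
# reject counts so the testing team can spot dropped or rejected records.
# Table names below are placeholders for the real Hub schema objects.
sqlplus -s "$HUB_USER/$HUB_PASS@$HUB_TNS" <<'EOF'
SELECT 'LANDING' AS layer, COUNT(*) AS row_count FROM c_customer_lnd
UNION ALL
SELECT 'STAGING', COUNT(*) FROM c_customer_stg
UNION ALL
SELECT 'REJECTS', COUNT(*) FROM c_customer_stg_rej;
EXIT
EOF
```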

Environment: Informatica MDM, Informatica Power Center 9.6, Oracle 11g/10g, Toad, JBoss 5.1.0, Address Doctor 5, IDD, Informatica Data Quality (IDQ) 9.5.1, Windows Server, UNIX.

Sr. Informatica MDM Developer

Confidential, Alpharetta, GA

Responsibilities:

  • Teamed with business analysts to translate the business requirements into effective technology solutions.
  • Worked with Data Steward Team for designing, documenting and configuring Informatica Data Director for supporting management of MDM data.
  • Conducted requirement-gathering sessions and feasibility studies, and organized the software requirements in a structured way.
  • Involved in implementing the Land Process of loading the customer Data Set into Informatica MDM from various source systems.
  • Collaborated with the data modelers to perform mappings and organize the data using the Erwin tool for data modeling.
  • Created and maintained entity objects, hierarchies, entity types, relationship objects, and relationship types using the Hierarchy tool to enable Hierarchy Manager (HM) in the MDM Hub implementation and Informatica Data Director (IDD).
  • Participated in the development and implementation of the MDM decommissioning project using Informatica PowerCenter that reduced the cost and time of implementation and development.
  • Worked on data cleansing using the cleanse functions in Informatica MDM.
  • Used Informatica ETL Power Center 9.6 for migrating data from various OLTP databases to the data mart.
  • Worked with different sources like Oracle, flat files, XML files, DB2, and MS SQL Server.
  • Extracted data from the Sales department to flat files and loaded the data into the target database.
  • Developed complex mappings using Informatica to load Dimension & Fact tables as per STAR schema techniques.
  • Extracted data from sources like fixed-width and delimited flat files, transformed the data according to the business requirements, and then loaded it into the target data mart.
  • Performed data quality and data integration in real time using SIF APIs.
  • Defined Trust and Validation rules before loading the data into the base tables.
  • Involved in SIF integration, developing XML code for external applications to perform search/match API calls against MDM Hub data.
  • Ran the Stage, Load, and Match and Merge jobs using the Batch Viewer and automation processes.
  • Created tasks in the Workflow Manager, and exported IDQ mappings and executed them using the scheduling tools.
  • Created the IDD application with Subject Areas and Subject Area Groups; deployed and tested the IDD application and cleanse functions, utilized the timeline, and exported and imported master data from flat files.
  • Worked on the Tidal and Autosys scheduling tools, depending on the client’s preference, to schedule tasks.
  • Involved in preparing the required Tech Specification documents from the business requirements, following the organization's standards.
  • Published data using message queues to notify external applications of data changes in MDM Hub base objects.
  • Constructed reusable objects such as mapplets and worklets, combined with user-defined functions, for use across multiple mappings.
  • Executed jobs using the ETL framework by setting up scripts and configuration files.
  • Created deployment groups for production migration (mappings, workflows, parameter files and UNIX scripts) and supported post production support during warranty.
  • Practiced soft proofing on JDF-enabled products that communicate with each other by exchanging JDF files through hot folders.
  • Worked on the UNIX platform, creating shell scripts to update parameter files, send mail, and FTP files (a sketch follows this list).
  • Used scheduling tools to create new jobs, with job dependencies set up across different Autosys cycles.
  • Wrote SQL queries to retrieve information from the database based on the requirements.
  • Created partitions and added dimensions.
  • Successfully implemented IDD using hierarchy configuration and created subject area groups, subject areas, subject area children, IDD display packages in the Hub, and search queries for finding crops, materials, and breeders data in the IDD data tab.
  • Involved in the UAT testing of the Items Data Mart price files tool automation project.
  • Performed testing on connections, scripts, workflows, mappings and other scheduled activities.
  • Worked with the offshore team and supervised their development activity.
  • Reviewed code and confirmed it conformed to standard programming practice.
  • Conducted knowledge transfer sessions about the project for the team and helped manage the project by providing reviews.
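
A minimal sketch of the kind of UNIX wrapper described above, assuming a hypothetical parameter file, FTP host, and mail alias; it stamps a run date into the parameter file, ships an extract via FTP, and mails a status note:

```sh
#!/bin/sh
# Hypothetical example: refresh a workflow parameter file, FTP the extract,
# and mail a status note. Paths, host, and addresses are placeholders.
PARMFILE=/app/infa/parms/wf_customer_load.parm
RUN_DATE=$(date +%Y%m%d)

# Stamp today's date into the $$RUN_DATE mapping parameter
sed "s/^\\\$\\\$RUN_DATE=.*/\\\$\\\$RUN_DATE=${RUN_DATE}/" "$PARMFILE" > "$PARMFILE.tmp" &&
    mv "$PARMFILE.tmp" "$PARMFILE"

# Ship the extract to the downstream host (login handled via ~/.netrc)
ftp -i downstream.example.com <<EOF
put /app/infa/out/customer_${RUN_DATE}.dat
bye
EOF

# Notify the support mailbox
echo "Customer extract for ${RUN_DATE} transferred." |
    mailx -s "customer extract ${RUN_DATE}" etl-support@example.com
```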

Environment: Informatica MDM, IDD, IDQ 9.7, Informatica 9.7/9.6.0/9.5.1 (Repository Manager, Designer, Workflow Monitor, Workflow Manager), Flat files, SQL, UNIX Scripting, Putty, Visio, Tidal, Autosys.

Confidential, Indianapolis, IN

Informatica Developer

Responsibilities:

  • As a consultant, studied the existing data marts to understand and integrate the new source of data.
  • Managed the offshore support group in India for support issues as well as small enhancements to the data warehouse.
  • Prepared the weekly status report and coordinated weekly status calls with the technology lead/business.
  • Designed and created new Informatica jobs to implement new business logic into the existing process.
  • Used the Informatica modules (Repository Manager, Designer, Workflow Manager, and Workflow Monitor) to accomplish the end-to-end ETL process.
  • Performed data profiling on sources to analyze the content, quality, and structure of source data during mapping development.
  • Created required scripts/transformations to extract the source data from various sources such as Oracle, Flat Files etc.
  • Used all the complex functionality of Informatica (Mapplets, Stored Procedures, Normalizer, Update Strategy, Router, Joiner, Java, SQL Transformation, etc.) to interpret the business logic into the ETL mappings.
  • Designed and developed complex aggregate, joiner, lookup transformations to implement the business rules in the ETL mappings to load the target Facts and Dimensions.
  • Defined a Target Load Order Plan for loading data into the target tables.
  • Used Mapplets and Reusable Transformations to prevent redundancy of transformation usage and improve maintainability.
  • Created complex Informatica mappings, as well as simple mappings with complex SQL in them, based on the needs and requirements of the business users.
  • Used Informatica’s features to implement Type 1, 2, and 3 changes in slowly changing dimensions and Change Data Capture (CDC).
  • Created various database triggers, and created and configured workflows, worklets, and sessions to transport the data to target systems using Informatica Workflow Manager.
  • Fine-tuned the session performance using Session partitioning for long running sessions.
  • Implemented performance tuning logic on Targets, Sources, mappings, sessions to provide maximum efficiency and performance.
  • Used Versioning, Labels and Deployment group in the production move process.
  • Automated workflows with UNIX scripts using pmcmd and pmserver commands (a sketch follows this list).
  • Setup Permissions for Groups and Users in all Environments (Dev, UAT and Prod).
  • Created tables, views, primary keys, indexes, constraints, sequences, grants, and synonyms.
  • Developed optimized PL/SQL code for server-related packages, centralizing the application logic in stored procedures that were kept in the database and fired when the contents of the database changed.
  • Used debugger to test the mapping and fixed the bugs.
  • Conducted Design and Code reviews and extensive documentation of standards, best practices and ETL Procedures.
  • Used Workflow Manager for Creating, Validating, Testing and running the sequential and concurrent Sessions and scheduling them to run at specified time.
  • Developed Oracle Stored Procedures, Packages and Functions and utilized in ETL Process.
  • Handled the performance tuning of Informatica mappings at various levels to accomplish the established standard throughput.
  • Analyzed the target data mart for accuracy of data for the pre-defined reporting needs.
  • Wrote complex SQL to interpret the reporting needs into the ETL process, and worked on SQL tuning to achieve the maximum throughput.
  • Assisted in all aspects of the project to meet the scheduled delivery time.
  • Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads and process flow of the mappings.
  • Wrote UNIX shell scripts to work with flat files, to define parameter files and to create pre and post session commands.
  • Used Autosys Tool to schedule shell scripts and Informatica jobs.
  • Performed unit and grid-integration testing and validated results with end users.
  • Worked as part of a team and provided 24x7 production support.
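
A sketch of the pmcmd-based automation mentioned above; pmcmd is the standard Informatica command-line client, but the domain, service, folder, and workflow names here are placeholders:

```sh
#!/bin/sh
# Hypothetical example: start a workflow with pmcmd, wait for completion,
# and log the outcome. Domain, service, folder, and workflow names are
# placeholders; INFA_PASS is expected in the environment.
LOG=/app/infa/logs/etl_runs.log
WORKFLOW=wf_load_sales_facts

pmcmd startworkflow \
    -domain DOM_PROD -service IS_PROD \
    -user etl_ops -password "$INFA_PASS" \
    -folder SALES_DW -wait "$WORKFLOW"

if [ $? -ne 0 ]; then
    echo "$(date): $WORKFLOW failed" >> "$LOG"
    exit 1
fi
echo "$(date): $WORKFLOW completed" >> "$LOG"
```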

Environment: Talend 5.5.1, Informatica Power Center 9.5, Erwin, MS Visio, Oracle 11g, SQL, PL/SQL, Oracle SQL Developer, SQL Server 2008, Flat Files, XML, Mainframe, COBOL Files, Autosys, UNIX Shell Scripting, Subversion.

Confidential, Nashville, TN

Informatica Developer

Responsibilities:

  • Performed extraction, transformation, and loading of data into the database using Informatica. Involved in logical and physical modeling of the drugs database.
  • Designed the ETL processes using Informatica to load data from Oracle, Flat Files to target Oracle Data Warehouse database.
  • Based on the requirements, created functional design documents and technical design specification documents for ETL.
  • Created tables, views, indexes, sequences and constraints.
  • Developed stored procedures, functions and database triggers using PL/SQL according to specific business logic.
  • Transferred data to the database using SQL*Loader (a sketch follows this list).
  • Involved in testing of Stored Procedures and Functions. Designed and developed table structures, stored procedures, and functions to implement business rules.
  • Implemented SCD methodology including Type 1 and Type 2 changes.
  • Used legacy systems, Oracle, and SQL Server sources to extract the data and to load the data.
  • Involved in design and development of data validation, load process and error control routines.
  • Used pmcmd to run workflows and created Cron jobs to automate scheduling of sessions.
  • Involved in ETL process from development to testing and production environments.
  • Analyzed the database for performance issues and conducted detailed tuning activities for improvement.
  • Generated monthly and quarterly drugs inventory/purchase reports.
  • Coordinated database requirements with Oracle programmers and wrote reports for sales data.
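
A hedged sketch of a SQL*Loader transfer like the one mentioned above; the control file, table, columns, and file names are illustrative only:

```sh
#!/bin/sh
# Hypothetical example: generate a control file and bulk-load a delimited
# extract with SQL*Loader. Table, columns, and file names are placeholders.
cat > drugs.ctl <<'EOF'
LOAD DATA
INFILE 'drugs.dat'
APPEND
INTO TABLE drug_inventory
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
( drug_id, drug_name, quantity, purchase_date DATE 'YYYY-MM-DD' )
EOF

sqlldr userid="$ORA_USER/$ORA_PASS@$ORA_TNS" \
       control=drugs.ctl log=drugs.log bad=drugs.bad errors=50
```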

Environment: Informatica Power Center 7.1, Oracle 9, SQL Server 2005, XML, SQL, PL/SQL, UNIX Shell Script.

Confidential

ETL Developer

Responsibilities:

  • Worked with Business Analysts and application users to finalize the data model and the functional and detailed technical requirements.
  • Extracted data from heterogeneous sources like Oracle, SQL Server
  • Created detailed Technical specifications for Data Warehouse and ETL processes.
  • Used transformations like Lookup, Router, Filter, Joiner, Stored Procedure, Source Qualifier, Aggregator, and Update Strategy extensively.
  • Tuned performance of Informatica session for large data files by increasing block size, data cache size, sequence buffer length and target based commit interval.
  • Created Mapplets and used them in different Mappings.
  • Involved in doing Error handling, debugging and troubleshooting Sessions using the Session logs, Debugger and Workflow Monitor.
  • Monitored data quality, generated weekly/monthly/yearly statistics reports on production processes (success/failure rates) for causal analysis as part of maintenance, and enhanced existing production ETL scripts.
  • Worked with SAP and Oracle sources to process the data.
  • Worked on SAP data migration for Human Resources and Finance, converting various objects covering Organizational Structure, Addresses, Time, Basic Pay, Bank Details, Recurring Payments, Tax Assignment, Insurance Plans, Payroll, etc., to generate reports from the SAP BI system.
  • Worked with Pre and Post Session SQL commands to drop and recreate the indexes on data warehouse using Source Qualifier Transformation of Informatica Power center.
  • Created UNIX shell scripts to automate sessions and cleanse the source data (a sketch follows this list).
  • Implemented pipeline partitioning concepts like Round-Robin, Key-Range and Pass Through techniques in mapping transformations.
  • Involved in Debugging and Performance tuning of targets, sources, mappings and sessions.
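
A minimal sketch of the kind of pre-session cleansing script described above, assuming a hypothetical flat source file; it strips carriage returns and trailing blanks and drops empty lines before the session picks the file up:

```sh
#!/bin/sh
# Hypothetical example: cleanse a flat source file before the Informatica
# session reads it -- strip carriage returns and trailing blanks, drop
# empty lines, and report the record count. File paths are placeholders.
SRC=/data/in/orders_raw.dat
CLEAN=/data/in/orders_clean.dat

tr -d '\r' < "$SRC" | sed 's/[[:space:]]*$//' | grep -v '^$' > "$CLEAN"

RECS=$(wc -l < "$CLEAN")
echo "$(date): cleansed file $CLEAN has $RECS records"
```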

Environment: Informatica Power Center 8.6/8.5, Oracle 9i, SAP, SAP BI 7.0, SQL Server, Sun Solaris.

Confidential

ETL DEVELOPER

Responsibilities:

  • Participated in business requirements analysis and worked closely with Data Architects to design tables (TAC tables, AMC tables).
  • Worked with the Data Integration team to perform data and application integration, with the goal of moving data more effectively, efficiently, and with high performance to assist business-critical projects involving huge data extraction.
  • Designed and Developed ETL process using Talend Enterprise DI.
  • In parallel with development, acted as a Talend admin: creating projects, scheduling jobs, migrating to higher environments, and handling version upgrades.
  • Created Talend Jobs to populate the data into dimensions and fact table.
  • Created complex jobs in Talend using tMap, tJoin, tDie, tConvertType, tFlowMeter, tLogCatcher, tReplicate, tParallelize, tJava, tJavaFlex, tAggregateRow, tWarn, etc.
  • Implemented a Talend POC to extract data from the Salesforce API as XML objects and .csv files and load the data into a SQL Server database.
  • Hands-on experience with many of the components in the palette for designing jobs; used context variables/groups to parameterize Talend jobs (a sketch follows this list).
  • Implemented error logging, error recovery, and performance enhancements, and created a generic audit process for various application teams.
  • Used the Repository Manager to migrate source code from lower to higher environments.
  • Created projects in TAC, assigned appropriate roles to developers, and integrated SVN (Subversion).
  • Worked on custom component design embedded in Talend Studio, and provided on-call support when the project was deployed to further phases.
  • Used the Talend Admin Console Job Conductor to schedule ETL jobs on a daily, weekly, monthly, and yearly basis (cron triggers).
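
A hedged sketch of parameterizing and running a Talend job from the shell, assuming the job was exported as a standalone build (Talend builds generate a JobName_run.sh launcher); the job name, context name, and parameters are illustrative:

```sh
#!/bin/sh
# Hypothetical example: run an exported Talend job with context parameters,
# much as a TAC Job Conductor trigger or cron entry would. Job name,
# context name, and parameters are placeholders.
JOB_DIR=/app/talend/jobs/LoadFactSales
cd "$JOB_DIR" || exit 1

./LoadFactSales_run.sh \
    --context=PROD \
    --context_param load_date="$(date +%Y-%m-%d)" \
    --context_param db_schema=DW
```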

Environment: Talend Enterprise Platform for Data Management v5.5.1, UNIX, Oracle, Microsoft SQL Server Management Studio, Windows XP.
