Sr. Informatica MDM ETL Developer Resume

Wakefield, NY

SUMMARY

  • 8.5+ years of IT experience in the development and implementation of EIM services using Informatica Master Data Management (MDM), ETL, Informatica PowerCenter, Informatica PowerExchange and Data Quality across industries including Banking, Insurance, Telecommunications and Pharmaceuticals.
  • Proficient in various databases such as Oracle 11g, DB2, SQL Server, Netezza and Teradata, and in flat files (delimited, fixed-width, XML and CSV files).
  • Experience in B2B projects - DT/DX (Data Transformation Studio), Data Transformation Accelerator, Data Format Transformation Library, Data Transformation Engine and full integration with PowerCenter.
  • Built integration processes for migrating data from Oracle platforms, SQL Server platforms and Excel spreadsheets via DataStage.
  • Experience in Java Design Patterns such as Session, Singleton, Data Access Objects (DAO) and Business Delegate.
  • Experienced in the Analysis, Design, Development of Data warehousing solutions and in developing strategies for Extraction, Transformation and Loading (ETL) mechanism using Ab Initio.
  • Highly experienced in ETL tool Ab Initio and working experience with all the Ab Initio components.
  • Expertise and well versed in various Ab Initio Transform, Partition, Departition, Dataset and Database components.
  • Experience with the Ab Initio Co>Operating System, application tuning and debugging strategies.
  • Extensively used DataStage Designer to develop various parallel jobs to extract, cleanse, transform, integrate and load data into the EDW and then into data marts.
  • Worked with DataStage Designer to import/export jobs, metadata and DataStage components between projects.
  • Extensive experience in developing stored procedures, functions, views, triggers and complex SQL queries using SQL Server T-SQL and Oracle PL/SQL.
  • Experienced in Conducting User Acceptance Testing (UAT).
  • Expertise in Master Data Management concepts, Methodologies and ability to apply this knowledge in building MDM solutions.
  • Experienced with Informatica PowerExchange 10.1 for loading/retrieving data from mainframe systems.
  • Experience in several facets of MDM implementations including data profiling, data extraction, data validation, data cleansing, data match, data load, data migration and trust score validation.
  • Experienced in installing, managing and configuring Informatica MDM core components such as Hub Server, Hub Store, Hub Cleanse, Hub Console, Cleanse Adapters and Hub Resource Kit.
  • Experience in defining and configuring landing tables, staging tables, base objects, lookups, query groups, queries/custom queries, packages, hierarchies and foreign-key relationships.
  • Developed mappings in Informatica to load data from various sources into the data warehouse, using transformations such as Source Qualifier, Expression, Lookup, Aggregator, Update Strategy and Joiner.
  • Experience in configuring Entity Base Objects, Entity Types, Relationship Base Objects, Relationship Types, Profiles using Hierarchy tool.
  • Experience in resolving ongoing maintenance issues and bug fixes, monitoring Informatica sessions, and performance tuning of mappings and sessions.
  • Performed all kinds of MDM Hub jobs - Stage Jobs, Load Jobs, Match and Merge Jobs using the Batch Viewer and Automation Processes.
  • Expertise in Informatica MDM Hub match and merge rules, batch jobs and batch groups, match columns, and match rule sets, defining all suitable properties of fuzzy and exact match concepts.
  • Designed and developed multiple cleanse functions, graph functions and standardization scenarios using Address Doctor.
  • Worked on MDM Hub configurations - Data modeling & Mappings, Data validation, Match and Merge rules, Hierarchy Manager, customizing/configuring Informatica Data Director (IDD).
  • Well acquainted with deploying to multiple application servers such as JBoss, WebLogic and WebSphere.
  • Responsible for creating user groups, privileges and roles to the users using Security Access Manager (SAM).
  • Configured Informatica Data Director (IDD) for data governance, used by business users, IT managers and data stewards.
  • Worked extensively on different types of transformations such as Source Qualifier, Expression, Aggregator, Router, Filter, Update Strategy, Lookup, Sorter, Normalizer, Union, Stored Procedure and Sequence Generator.
  • Extensive experience in the design of multi-dimensional modeling concepts such as Star and Snowflake schemas.
  • Created, launched & scheduled workflows/sessions and extensively involved in the performance tuning of mappings and sessions.
  • Strong experience in designing and developing mappings to extract data from different sources including flat files, RDBMS tables and XML files.
  • Highly Proficient in the use of T-SQL for developing complex Stored Procedures, Triggers, Functions.
  • Experienced in Teradata utilities such as BTEQ, FastLoad, MultiLoad, TPump and FastExport, as well as Teradata SQL.
  • Experienced in data cleansing and data standardization using Informatica PowerCenter, PowerExchange and UNIX shell scripts with reference data/golden copy.
  • Experience in Designing and Building the Dimensions and cubes with star schema using SQL Server Analysis Services (SSAS).
  • Experienced in automating Informatica jobs, and in scheduling tools such as the Informatica scheduler, Autosys and Tivoli.
  • Extensive experience with the Teradata database, analyzing clients' business needs and developing solutions.
  • Experience in various testing activities such as unit testing, system integration testing and user acceptance testing.
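Several of the bullets above reference fuzzy and exact match rules in the MDM Hub. As a rough conceptual illustration only (this is a Python sketch with hypothetical names and threshold, not Informatica's actual matching engine), the difference between the two match types can be shown with a string-similarity threshold:

```python
from difflib import SequenceMatcher

def exact_match(a: str, b: str) -> bool:
    """Exact match: normalized values must be identical."""
    return a.strip().lower() == b.strip().lower()

def fuzzy_match(a: str, b: str, threshold: float = 0.85) -> bool:
    """Fuzzy match: similarity ratio must clear a configurable threshold."""
    ratio = SequenceMatcher(None, a.strip().lower(), b.strip().lower()).ratio()
    return ratio >= threshold

# Two customer records an exact rule misses but a fuzzy rule catches
print(exact_match("John Smith", "Jon Smith"))  # False
print(fuzzy_match("John Smith", "Jon Smith"))  # True (ratio ~0.95)
```

In the Hub itself this tuning is done declaratively through match columns and match rule set properties rather than in code.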

TECHNICAL SKILLS

EIM tools: Informatica MDM Multidomain 9.5.1/9.7.1, IDD, SIF

ETL tools: Ab Initio, Informatica PowerCenter 8.x/9.x/10.x, PowerExchange

Cleanse Adapters: Address Doctor, Trillium

Databases: Oracle 10g/11g, MySQL 5.0/4.1, SQL Server 2008, MS Access

Editors: SQL Navigator, Toad

Data Modeling: ERwin 4.0, MS Visio

Scheduling Tools: Control-M, Autosys, Tivoli

Programming Skills: SQL, PL/SQL, C, C++, UNIX shell scripting, Java

Application Servers: WebLogic 10.x/9.x, JBoss

Environment: Windows OS, UNIX, Windows Server 2003/2008

BI Reporting Tools: MS Excel 2010, SSIS, SSRS

PROFESSIONAL EXPERIENCE

Confidential, Wakefield, NY

Sr. Informatica MDM ETL developer

Responsibilities:

  • Worked with ETL Developers in creating External Batches to execute mappings, Mapplets using Informatica workflow designer to integrate Shire's data from varied sources like Oracle, DB2, flat files and SQL databases and loaded into landing tables of Informatica MDM Hub.
  • Extracted data from flat files, MS Excel and MS Access, transformed it per user requirements using Informatica MDM/PowerCenter/PowerExchange, and loaded it into targets via scheduled sessions.
  • Responsible for development, support and maintenance of the ETL (extract, transform and load) processes using Oracle and Informatica PowerCenter.
  • Responsible for cleansing data from source systems using Ab Initio components such as Join, Denormalize, Normalize, Reformat, Filter-by-Expression and Rollup.
  • Extensively used Oracle ETL process for address data cleansing.
  • Implemented Java and J2EE Design patterns like Business Delegate and Data Transfer Object (DTO), Data Access Object and Server Locator.
  • Effective use of advanced reporting and formatting functions in Business Objects reporting.
  • Developed Single and multiple dashboards and scorecards using Business Objects.
  • Created common reusable objects for the ETL Informatica MDM team, oversaw coding standards, and reviewed high-level design specifications, ETL code and mapping standards.
  • Designed DataStage parallel jobs involving complex business logic, update Strategies, transformations, filters, lookups and necessary source-to-target data mappings to load the target.
  • Developed mappings using various cleanse functions and Address Doctor functions to move data into stage tables.
  • Designed and developed a Big Data analytics platform for processing customer viewing preferences and social media comments using Java and Hadoop.
  • Worked with Informatica PowerExchange 10.1 for loading/retrieving data from mainframe systems.
  • Created SSIS packages to extract data from flat files, Teradata, Oracle and DB2 and transform the data according to the business requirements and load the data in SQL server tables.
  • Performed unit testing, tuned for better performance, and created documents such as the source-to-target data mapping document and the unit test cases document.
  • Developed the IDD application, building hierarchies as per business needs.
  • Involved in UAT of the applications by providing users with test cases and scenarios and guiding them during the testing process.
  • Supported the Business Analyst and Team Lead during the requirements phase, peer reviews and high-level design preparation.
  • Developed code based on the low-level design, which includes detailed information on tables, database structures and data flows.
  • MDM development includes creating Base object tables, staging tables and landing tables as per requirement in LLD.
  • Created individual mappings (i.e. Stage jobs) for moving data from landing table to staging tables.
  • Configuring the trust and validation in base object columns as per business requirement.
  • Ran load jobs to move data from staging tables to base object tables, XREF tables and history tables.
  • Published records to downstream applications through message queues with the help of PowerCenter ETL.
  • Configured IDD for client usage as a web application; clients can access master records and update information directly in the base objects.
  • Set up SIF for Java application communication and for interfacing with IDD as required.
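The stage jobs above move landing rows into staging tables after cleansing. As a loose conceptual sketch only (plain Python with made-up field names, standing in for the Hub's cleanse functions and mappings), a stage step essentially applies standardization to each landing record before it reaches staging:

```python
def cleanse(row: dict) -> dict:
    """Stand-in for Hub cleanse functions: trim and uppercase string fields."""
    return {k: v.strip().upper() if isinstance(v, str) else v
            for k, v in row.items()}

# Landing rows arrive raw from the source system; staging holds them standardized
landing = [{"name": "  acme corp ", "city": "wakefield"}]
staging = [cleanse(r) for r in landing]
print(staging[0]["name"])  # ACME CORP
```

In the actual Hub this logic lives in configured mappings between landing and staging tables, not in application code.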

Environment: Informatica Multidomain MDM 9.5.0, JBoss 5.1, Oracle 11g, Address Doctor 5, IDD, SIF, Informatica Data Quality (IDQ) 9.5.0, Java and J2EE, Teradata R6, Informatica PowerCenter 9.1.0, Business Objects, Toad, SQL*Loader, Windows Server 2008/XP.

Confidential, Columbus, Ohio

Informatica MDM ETL Developer

Responsibilities:

  • Strong understanding of the product life cycle, MDM, data governance, data quality and data security for products across the enterprise from many disparate systems.
  • Involved in Design, analysis, Implementation, Testing and support of MDM Informatica ETL processes for stage, ODS and Mart.
  • Prepared ETL standards and naming conventions, and wrote PowerCenter ETL flow documentation for stage, ODS and Mart.
  • Developed reusable Mapplets and Transformations, and worked on various kinds of transformations like Expression, Aggregator, Stored Procedure, Lookup, Filter, Joiner, Rank, Router and Update Strategy.
  • Worked with the Data Profiling team in analyzing source system data for duplicates and data quality issues.
  • Performed data profiling of reference data for large data sets using the Informatica Data Analyst tool.
  • Worked with the MDM/ETL team in configuring landing table structures according to the client's standardization.
  • Configured Base Object Tables in schema and relationship tables in hierarchies according to data-model.
  • Configured Landing Tables, Staging Tables, Look-ups, Cleanse Lists, Cleanse functions, Mappings, Audit trail/ Delta detection.
  • Performed performance tuning at the source and target levels using indexes, hints and partitioning in DB2, Oracle and Informatica.
  • Used Query Designer, BEx Analyzer, Business Objects InfoView and Crystal Reports for reporting.
  • Designed and formatted Crystal Reports that use SAP BW as a data source via the BW query driver.
  • Created Multi-Dimensional cubes using SSAS for sales, Inventory, Operational, merchandising and financial environment.
  • Created project plans, and designed and developed a data quality model according to business needs in collaboration with the data architect.
  • Configured the three source systems from which approximately 1.2 million records are used in the landing process.
  • Configured Staging tables, foreign key relationships, static lookups, dynamic lookups, queries, query groups, packages and custom cleanse functions.
  • Defined system trust and validation rules for base object columns.
  • Worked in implementation of Profiling, Score Card, Classifier models, Probabilistic models, Human task and Exception record management as part of IDQ process.
  • Experienced in customizing error messages displayed on the IDD screen.
  • Configured Search Strategy, Match rule sets, Match Columns for Match and Merge Process in improving data quality by match analysis.
  • Created Queries, Query Groups and packages in MDM Hub Console.
  • Implemented insert/update strategies using a nightly batch process, SOAP for near-real-time data, and IDD.
  • Configured Informatica Data Director (IDD) in reference to the Data Governance of users, IT Managers & Data Stewards.
  • Created and deployed new applications in Informatica Data Director and bound each application to a specific ORS.
  • Used Metadata Manager for validating, promoting, importing and exporting ORS repositories across environments.
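The trust rules mentioned above decide which source's value survives into the master record. As a minimal conceptual sketch (the source names and trust scores below are hypothetical, and real Hub trust also decays over time and interacts with validation rules), trust-based survivorship picks, per column, the value from the most trusted source:

```python
# Hypothetical per-source trust scores, as one might configure per column
TRUST = {"CRM": 90, "ERP": 70, "WEB": 40}

def surviving_value(candidates):
    """candidates: list of (source, value) pairs for one column.
    Return the value contributed by the highest-trust source."""
    source, value = max(candidates, key=lambda sv: TRUST[sv[0]])
    return value

phone = surviving_value([("WEB", "555-0100"), ("CRM", "555-0199")])
print(phone)  # 555-0199 -- CRM outranks WEB
```

In the Hub this consolidation happens during load and merge jobs, driven by the configured trust and validation settings rather than custom code.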

Environment: Informatica MDM 9.5.0, Informatica Data Quality (IDQ) 9.5.0, Informatica PowerCenter 9.1.0, Informatica PowerExchange 10.1, AQT 9.0, ERwin, JBoss, HP Quality Center 9.2 and Windows XP.

Confidential, Ann Arbor, MI

Informatica MDM Developer

Responsibilities:

  • Performed design and analysis of business systems applications, system interfaces, databases, reporting and business intelligence systems.
  • Worked with Business Analyst and end users to generate Requirement Document according to business needs for Customer Master ORS.
  • Involved in landing data from seven different source systems by running external batch (ETL) processes to load business data for creating customer master data.
  • Developed and managed relationships between data domains for the data models.
  • Involved in loading data from different sources into the staging area, cleansing it using Trillium services and DUNS numbers from Dun & Bradstreet (DNB); after receiving data per business requirements from DNB and Trillium, loaded it into the warehouse.
  • Created Base objects, staging tables, loading tables and foreign key relationships for a defined party model for Customer Master ORS.
  • Created mapping document and mappings in MDM HUB ORS using various cleanse list and cleanse functions.
  • Defined trust and validation rules for base tables as specified by end users and the BA team.
  • Performed Match and Merge Rules as per business requirement on source data.
  • Configured IDD application for Party Communication, Party to Party Rel and Master Guarantee Agreement Subject Area required for data governance used by the Data Services Team.
  • Created queries, procedures and packages in MDM Hub for displaying and updating the data.
  • Established data quality rule and data quality improvement, defined data mining and performance optimization policies for customer master data.
  • Created Roles and privileges for Read Only and Data Entry for Data Entry Team users.
  • Involved in Data Steward Activities like manual Merge/Unmerge and updating data.
  • Used Metadata Manager for importing, exporting and validating data when promoting the schema from the development environment to the testing environment.
  • Configured Entity Base Objects, Entity Types, Relationship Base Objects, Relationship Types, and Profiles using Hierarchies tool.
  • Improved the data quality by Match analysis, analysis of the MDM system performance and tuning the performance by changing the existing cleanse function and match rules.
  • Experienced with Metadata Manager for importing and exporting metadata and promoting incremental changes from the development environment to the testing environment.
  • Created user groups, privileges and roles to the users using Security Access Manager (SAM).

Environment: Oracle 11g, Informatica MDM (Formerly Siperian) 9.1, TOAD, JBoss, Address Doctor 5.
