
Sr. Informatica PowerCenter and MDM Developer Resume


Englewood, CO

PROFESSIONAL SUMMARY

  • 8+ years of extensive experience using ETL methodologies for data extraction, data migration, data transformation and master data development with Informatica PowerCenter/IDQ/MDM and Teradata.
  • Created mappings in Mapping Designer to load data from various sources using transformations such as Transaction Control, Lookup (connected and unconnected), Router, Filter, Expression, Aggregator, Joiner, Update Strategy, SQL, Stored Procedure and more.
  • Extensively worked with Teradata utilities like BTEQ, FastExport, FastLoad and MultiLoad to export and load data to/from different source systems, including flat files.
  • Hands-on experience with query tools like TOAD, SQL Developer, PL/SQL Developer, Teradata SQL Assistant and Queryman.
  • Expertise in writing large/complex queries using SQL.
  • Expertise in SQL Server Integration Services (SSIS) and SQL Server Reporting Services (SSRS), with good knowledge of SQL Server Analysis Services (SSAS).
  • Worked with the Informatica Data Quality 9.1 (IDQ) toolkit, performed data profiling, cleansing and matching, and imported data quality files as reference tables.
  • Excellent knowledge of Slowly Changing Dimensions (SCD Type 1, SCD Type 2, SCD Type 3), Change Data Capture, dimensional data modeling, the Ralph Kimball approach, Star/Snowflake modeling, data marts, OLAP, Fact and Dimension tables, and physical and logical data modeling.
  • Involved in the Analysis, Design, Development, Testing and Implementation of business application systems for Health care, Pharmaceutical, Financial, Telecom and Manufacturing Sectors.
  • Strong understanding of OLAP and OLTP Concepts
  • Excellent at designing ETL procedures and strategies to extract data from heterogeneous source systems such as Oracle 11g/10g, SQL Server 2008/2005, DB2 10, flat files, XML, SAP R/3, etc.
  • Experience in SQL, PL/SQL and UNIX shell scripting.
  • Hands on experience working in LINUX, UNIX and Windows environments.
  • Excellent Verbal and Written Communication Skills. Have proven to be highly effective in interfacing across business and technical groups.
  • Good knowledge of data quality measurement using IDQ and IDE.
  • MDM developer with experience in implementation, development, maintenance and troubleshooting of Informatica MDM solutions, metadata management, data quality and data integration.
  • Extensive ETL experience using Informatica Power Center (Designer, Workflow Manager, Workflow Monitor and Server Manager) Teradata and Business Objects.
  • Experience in designing and developing complex mappings using Informatica PowerCenter with transformations such as Lookup, Filter, Expression, Router, Joiner, Update Strategy, Aggregator, XML Generator, XML Parser, Stored Procedure, Sorter and Sequence Generator.
  • Working experience using Informatica Workflow Manager to create sessions and batches, schedule workflows and worklets, build reusable tasks, and monitor sessions.
  • Experienced in Performance tuning of Informatica and tuning the SQL queries.
  • Proficient with Informatica Data Quality (IDQ) for data cleanup and massaging in the staging area.
  • Designed and developed IDQ mappings for address validation/cleansing, doctor master data matching, data conversion, exception handling, and reporting exception data.
  • Hands-on experience across all stages of Software Development Life Cycle (SDLC) including business requirement analysis, data mapping, build, unit testing, systems integration and user acceptance testing.
  • Experience in handling initial/full and incremental loads.
  • Expertise in scheduling workflows using Windows Scheduler, UNIX cron and scheduling tools like Control-M and Autosys.
  • Experience in support and knowledge transfer to the production team.
  • Worked with Business Managers, Analysts, Development, and end users to correlate Business Logic and Specifications for ETL Development.
  • Experienced in Quality Assurance, Manual and Automated Testing Procedures with active involvement in Database/ Session/ Mapping Level Performance Tuning and Debugging.

TECHNICAL SKILLS

ETL Tools: Informatica PowerCenter 10.x/9.5.x/9.1/8.6.x/8.5.x, Informatica PowerExchange 9.x, DataStage, Informatica Data Explorer (IDE), Informatica MDM 9.5.1, Informatica Data Quality (IDQ), SSIS, SSRS.

Data Modeling: Star Schema, Snowflake Schema, Erwin 4.0, Dimensional Data Modeling.

Databases: Oracle 11g/10g/9i, SQL Server 2008/2005, Teradata 13.1/V2R6/V2R5, Sybase, MS Access.

Scheduling Tools: Control-M, CA-7 Scheduler, Autosys, Informatica Scheduler.

Reporting Tools: Crystal Reports, Business Objects XI R2/XI 3.3, OBIEE 11g R1 (11.1.5).

Programming: SQL, PL/SQL, Transact SQL, HTML, XML, C, C++, Korn Shell, Bash.

Operating Systems: Windows 7/XP/NT/95/98/2000, DOS, UNIX and LINUX

Other Tools: SQL*Plus, Toad, SQL Navigator, Putty, MS-Office, SQL Developer.

PROFESSIONAL EXPERIENCE

Confidential, Englewood, CO

Sr. Informatica PowerCenter and MDM Developer

Responsibilities:

  • Analyzed the business requirements, framed the business logic for the ETL process, and maintained the ETL process using Informatica PowerCenter.
  • Translated high-level design specifications into simple ETL coding and mapping standards.
  • Worked on Agile Methodology, participated in daily/weekly team meetings, guided two groups of seven developers in Informatica PowerCenter/Data Quality (IDQ), peer reviewed their development works and provided the technical solutions. Proposed ETL strategies based on requirements.
  • Coded Teradata BTEQ SQL scripts to load and transform data and to fix defects such as SCD Type 2 date chaining and duplicate records.
  • Worked with team to convert Trillium process into Informatica IDQ objects.
  • Extensively used IDQ transformations such as Address Validator, Exception, Parser and Standardizer; solid experience in debugging and troubleshooting sessions using the Debugger and Workflow Monitor.
  • Extensively worked on UNIX shell scripts for server health-check monitoring, such as repository backup, CPU/disk space utilization, Informatica server monitoring and UNIX file system maintenance/cleanup, and on scripts using Informatica command-line utilities (a minimal sketch appears after this list).
  • Extensively worked on change data capture (CDC) to capture data changes in the sources for delta loads. Used the Debugger to validate mappings and gather troubleshooting information about data and error conditions.
  • Developed workflows with worklets, event waits, assignments, conditional flows, and email and command tasks using Workflow Manager.
  • Proficient in System Study, Data Migration, Data integration, Data profiling, Data Cleansing / Data Scrubbing and Data quality
  • Worked on the Informatica Data Quality (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, address standardization, exception handling, and the reporting and monitoring capabilities of IDQ.
  • Experience in creation of ETL Mappings and Transformations using Informatica Power Center to move data from multiple sources into target area using complex transformations like Expressions, Routers, Lookups, Source Qualifiers, XML generator, XML Parser, Aggregators, Filters, Joiners
  • Performed match/merge and ran match rules to check the effectiveness of MDM process on data
  • Responsible for preparing logical as well as physical data models and documenting them.
  • Performed ETL code reviews and Migration of ETL Objects across repositories.
  • Developed ETL processes to mask data before making it available to the offshore development team.
  • Monitored day-to-day loads, addressed and resolved production issues in a prompt and timely manner, and provided support for ETL jobs running in production to meet SLAs.
  • Integrated IDQ mappings through IDQ web service applications as cleanse functions in Informatica MDM using IDQ cleanse Adapters.
  • Migrated code from Dev to Test to Pre-Prod. Created effective unit and integration tests of data in the different layers to capture data discrepancies/inaccuracies and ensure accurate data loading.
  • Scheduled Informatica workflows using OBIEE.
  • Involved in implementing change data capture (CDC) and Type I, II and III slowly changing dimensions.
  • Developed functions and stored procedures to aid complex mappings
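
A minimal sketch of the kind of health-check script referenced above, combining a pmrep repository backup with a disk-space alert. The repository, domain and directory names are hypothetical placeholders, and credentials are assumed to come from environment variables rather than being hard-coded:

#!/usr/bin/ksh
# Health-check sketch: repository backup plus disk-space alerting.
# REPO, DOMAIN and BACKUP_DIR are illustrative values, not project names.
REPO="DEV_REPO"
DOMAIN="DEV_DOMAIN"
BACKUP_DIR=/infa/backups
THRESHOLD=85                       # warn when a file system exceeds 85% used

# Back up the repository with the pmrep command-line utility.
pmrep connect -r "$REPO" -d "$DOMAIN" -n "$INFA_USER" -x "$INFA_PASS" &&
pmrep backup -o "$BACKUP_DIR/${REPO}_$(date +%Y%m%d).rep"

# Report any file system above the usage threshold.
df -kP | awk -v max="$THRESHOLD" 'NR > 1 { sub("%", "", $5); if ($5 + 0 > max) print "WARNING: " $6 " at " $5 "%" }'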

Environment: Informatica Power Center 10.x/9.6, Informatica Multi Domain MDM 9.5, Informatica Data Quality (IDQ) 9.6, Oracle 11g, Teradata, PL/SQL, SQL Developer, TOAD, Putty, Unix

Confidential, Houston, TX

Informatica PowerCenter and IDQ Developer

Responsibilities:

  • Assisted Business Analyst with drafting the requirements, implementing design and development of various components of ETL for various applications.
  • Worked closely with ETL Architect and QC team for finalizing ETL Specification document and test scenarios.
  • Worked with the Informatica Data Quality 9.6.1 (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, exception handling, scorecards, and the reporting and monitoring capabilities of IDQ 9.6.1.
  • Extracted data from Oracle databases, spreadsheets and CSV files, staged the data in a single place, and applied business logic to load it into the central Oracle database.
  • Designed and developed Informatica Mappings and Sessions based on user requirements and business rules to load data from source flat files and RDBMS tables to target tables.
  • Worked on the implementation of profiling, scorecards, classifier models, probabilistic models, Human Task and exception record management as part of the IDQ process.
  • Integrated IDQ mappings through IDQ web service applications as cleanse functions in Informatica MDM using IDQ cleanse adapters. Implemented a Slowly Changing Dimension (SCD Type II) design for the data warehouse.
  • Migrated code between environments and maintained code backups.
  • Designed and Developed IDQ mappings for address validation / cleansing, doctor master data matching, data conversion, exception handling, and report exception data.
  • Integration of various data sources like Oracle, SQL Server, Fixed Width & Delimited Flat Files, DB2.
  • Implemented data quality rules on Hadoop HDFS and HIVE source tables using Informatica Big Data Edition Adapters in IDQ.
  • Involved in the Unit Testing and Integration testing of the workflows developed.
  • Extensively worked with Korn shell scripts for parsing and moving files and for re-creating parameter files in post-session command tasks (see the sketch after this list).
  • Imported Source/Target Tables from the respective databases and created reusable transformations like Joiner, Routers, Lookups, Filter, Expression and Aggregator and created new mappings using Designer module of Informatica.
  • Used the Address Doctor to validate the address and performed exception handling, reporting and monitoring the system. Created different rules as mapplets, Logical Data Objects (LDO), workflows. Deployed the workflows as an application to run them. Tuned the mappings for better performance.
  • Worked with database connections, SQL joins, cardinalities, loops, aliases, views, aggregate conditions, parsing of objects and hierarchies.
  • Developed shell scripts for running batch jobs and scheduling them.
  • Handled User Acceptance Testing and System Integration Testing in addition to unit testing, using Quality Center as the bug-logging tool. Created and documented the Unit Test Plan (UTP) for the code.
  • Involved in Production Support
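
A minimal sketch of a post-session command task script of the kind described above: it archives the processed flat files and re-creates a parameter file for the next workflow run. All directory, folder and workflow names are hypothetical placeholders:

#!/usr/bin/ksh
# Post-session sketch: archive processed files and rebuild a parameter file.
SRC_DIR=/data/inbound
ARCH_DIR=/data/archive
PARAM_FILE=/infa/params/wf_load_customers.prm   # hypothetical parameter file
RUN_DATE=$(date +%Y%m%d)

# Move processed files out of the landing area, stamped with the run date.
for f in "$SRC_DIR"/*.csv; do
    [ -f "$f" ] && mv "$f" "$ARCH_DIR/$(basename "$f" .csv)_${RUN_DATE}.csv"
done

# Re-create the parameter file consumed by the next workflow run.
{
    echo "[ETL_FOLDER.WF:wf_load_customers]"
    echo "\$\$RUN_DATE=$RUN_DATE"
    echo "\$\$SRC_FILE_DIR=$SRC_DIR"
} > "$PARAM_FILE"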

Environment: Informatica Power Center 9.6, Informatica Multi Domain MDM 9.1, Informatica Data Quality (IDQ) 9.5, Oracle 11g, SQL Server, PL/SQL, Unix and WINSCP

Confidential, Columbus, Ohio

Sr. Informatica ETL Developer

Responsibilities:

  • Developed complex mappings by efficiently using various transformations, mapplets, mapping parameters/variables and mapplet parameters in Designer. The mappings involved extensive use of Aggregator, Filter, Router, Expression, Joiner, Union, Normalizer, Sequence Generator, SQL and Web Service transformations.
  • Demonstrated the ETL process Design with Business Analyst and Data Warehousing Architect.
  • Assisted in building the ETL source-to-target specification documents.
  • Built the physical data objects and developed various mappings and mapplets/rules using Informatica Data Quality (IDQ) based on requirements to profile, validate and cleanse the data. Identified and eliminated duplicate datasets and performed column, primary key and foreign key profiling using IDQ 9.5.1 for MDM.
  • Effectively communicated with business users and stakeholders.
  • Worked on Master Data Management (MDM), Hub Configurations (SIF), Data Director, extract, transform, cleansing, loading the data onto the tables.
  • Worked on Agile Methodology, participated in daily/weekly team meetings, guided two groups of seven developers in Informatica PowerCenter/Data Quality (IDQ), peer reviewed their development works and provided the technical solutions. Proposed ETL strategies based on requirements.
  • Designed, developed, tested, reviewed and optimized Informatica MDM and Informatica IDD applications.
  • Involved in match/merge and match rules to check the effectiveness of the MDM process on data.
  • Worked on SQL overrides for the generated SQL queries in Informatica.
  • Involved in unit testing to validate the data from different data sources.
  • Designed and developed PL/SQL packages, stored procedures, tables, views, indexes and functions. Experienced with partitioned tables and automating the partition drop/create process in the Oracle database.
  • Performed data validation in the target tables using complex SQL to make sure all the modules were integrated correctly (see the reconciliation sketch after this list).
  • Worked with Informatica Data Quality (IDQ) toolkit, Analysis, data cleansing, data matching, data conversion, exception handling, and reporting and monitoring capabilities of IDQ.
  • Performed data conversion/data migration using Informatica PowerCenter.
  • Involved in performance tuning to improve the data migration process.
  • Analyzed session log files to resolve mapping errors, identified bottlenecks and tuned them for optimal performance.
  • Created UNIX shell scripts for Informatica pre/post-session operations.
  • Automated the jobs using CA7 Scheduler.
  • Documented and presented the production/support documents for the components developed when handing over the application to the production support team.
  • Created Data Model for the Data Marts.
  • Used materialized views to create snapshots of the history of the main tables and for reporting purposes.
  • Coordinated with users to migrate code from Informatica 8.6 to Informatica 9.5.
  • Contacted the Informatica technical support group regarding unknown problems.
  • Provided on-call support during weekends.
  • Monitored day-to-day loads, addressed and resolved production issues in a prompt and timely manner, and provided support for ETL jobs running in production to meet SLAs.
  • Used various transformations like Source Qualifier, Expression, Aggregator, Joiner, Filter, Lookup and Update Strategy while designing and optimizing mappings.
  • Prepared SQL Queries to validate the data in both source and target databases.
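
A minimal sketch of the kind of post-load validation referenced above: a shell wrapper that reconciles row counts between a staging table and its warehouse target through SQL*Plus. The connection variables and table names are hypothetical placeholders, not actual project objects:

#!/usr/bin/ksh
# Validation sketch: reconcile row counts between source staging and target.
# ORA_USER/ORA_PASS/ORA_SID come from the environment; table names are examples.
sqlplus -s "$ORA_USER/$ORA_PASS@$ORA_SID" <<'EOF'
SET SERVEROUTPUT ON
DECLARE
  v_src NUMBER;
  v_tgt NUMBER;
BEGIN
  SELECT COUNT(*) INTO v_src FROM stg_customer;       -- hypothetical staging table
  SELECT COUNT(*) INTO v_tgt FROM dw_customer_dim;    -- hypothetical target table
  IF v_src = v_tgt THEN
    DBMS_OUTPUT.PUT_LINE('PASS: row counts match (' || v_src || ')');
  ELSE
    DBMS_OUTPUT.PUT_LINE('FAIL: source=' || v_src || ' target=' || v_tgt);
  END IF;
END;
/
EOF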

Environment: Informatica 9.5/8.6, Informatica Data Quality (IDQ) 9.5, Informatica MDM, Oracle 11g, SQL Server 2008 R2, SQL, T-SQL, PL/SQL, Toad 10.6, SQL*Loader, OBIEE, Unix, Flat files, Teradata

Confidential, Detroit, MI

Informatica ETL Developer

Responsibilities:

  • Logical and Physical data modeling was done using Erwin for data warehouse database in STAR SCHEMA.
  • Worked on Agile Methodology, participated in daily/weekly team meetings, guided two groups of seven developers in Informatica PowerCenter/Data Quality (IDQ), peer reviewed their development works and provided the technical solutions. Proposed ETL strategies based on requirements.
  • Worked with Informatica Data Quality (IDQ) 9.5.1 Developer/Analyst Tools to remove the noise of data using different transformations like Standardization, Merge and Match, Case Conversion, Consolidation, Parser, Labeler, Address Validation, Key Generator, Lookup, Decision etc.
  • Worked with health payer related data such as customers, policy, policy transactions, claims.
  • Generated weekly and monthly report status for the number of incidents handled by the support team.
  • Designed and developed complex mappings by using Lookup, Expression, Update, Sequence generator, Aggregator, Router, Stored Procedure, etc., transformations to implement complex ETL logics
  • Worked with Informatica PowerCenter Designer, Workflow Manager, Workflow Monitor and Repository Manager.
  • Used Source Analyzer and Warehouse designer to import the source and target database schemas, and the Mapping Informatica Designer to create complex mappings from Business requirements.
  • Created various transformations like filter, router, lookups, stored procedure, joiner, update strategy, expressions and aggregator to pipeline data to Data Warehouse/Data Marts and monitored the Daily and Weekly Loads.
  • Designed and developed various complex SCD Type 1/Type 2 mappings in different layers and migrated the code from Dev to Test to Prod environments. Wrote techno-functional documentation along with test cases to smooth project handover and maintain the SDLC.
  • Experience in using Stored Procedures, TOAD, Explain Plan, Ref Cursors, Constraints, Triggers, Indexes-B-tree Index, Bitmap Index, Views, Inline Views, Materialized Views, Database Links, Export/Import Utilities.
  • Developed and maintained ETL (Extract, Transformation and Loading) mappings to extract the data from multiple source systems like Oracle, SQL server and Flat files and loaded into Oracle.
  • Used different algorithms like Bigram, Edit, Jaro, Reverse and Hamming Distance to determine threshold values for identifying and eliminating duplicate datasets and for validating, profiling and cleansing the data. Created/modified reference tables for valid data using the IDQ Analyst tool for MDM data.
  • Developed Informatica Workflows and sessions for mappings using Workflow Manager.
  • Deployed the Informatica code and worked on code merge between two different development teams.
  • Identified the bottlenecks in the sources, targets, mappings, sessions and resolved the problems.
  • Created pre- and post-session UNIX scripts to merge the flat files, create and delete temporary files, change file names to reflect the file generation date, etc. (a minimal sketch follows this list).
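
A minimal sketch of the pre-session merge-and-cleanup script mentioned above; the directory names and file patterns are illustrative placeholders only:

#!/usr/bin/ksh
# Pre-session sketch: merge the day's extract files into one load file and
# stamp it with the generation date, then purge old temporary files.
IN_DIR=/data/extracts
WORK_DIR=/data/work
RUN_DATE=$(date +%Y%m%d)
MERGED="$WORK_DIR/claims_${RUN_DATE}.dat"      # hypothetical merged file name

# Concatenate all of the day's extract files into a single merged file.
cat "$IN_DIR"/claims_*.dat > "$MERGED"

# Remove temporary files older than 7 days so the work area does not grow.
find "$WORK_DIR" -name '*.tmp' -mtime +7 -exec rm -f {} \;

echo "Merged $(ls "$IN_DIR"/claims_*.dat | wc -l) files into $MERGED"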

Environment: Informatica Power Center Designer 9.5/8.6, Informatica Repository Manager, Oracle 10g/9i, DB2 6.1, Erwin, TOAD, Unix (SunOS), PL/SQL, SQL Developer, Teradata

Confidential

Informatica ETL Developer

Responsibilities:

  • Involved in business analysis and technical design sessions with business and technical staff to develop requirements document and ETL specifications.
  • Involved in designing dimensional modeling and data modeling using Erwin tool.
  • Created high-level Technical Design Document and Unit Test Plans.
  • Developed mapping logic using various transformations like Expression, Lookups (Connected and Unconnected), Joiner, Filter, Sorter, Update strategy and Sequence generator.
  • Wrote complex SQL override scripts at source qualifier level to avoid Informatica joiners and Look-ups to improve the performance as the volume of the data was heavy.
  • Responsible for creating workflows. Created Session, Event, Command, Control, Decision and Email tasks in Workflow Manager.
  • Prepared user requirement documentation for mapping and additional functionality.
  • Extensively used ETL to load data with PowerCenter from source systems such as flat files into staging tables and then into the target Oracle database. Analyzed the existing systems and conducted a feasibility study.
  • Analyzed the current system and programs and prepared gap analysis documents.
  • Experience in performance tuning and optimization of SQL statements using SQL trace.
  • Involved in Unit, System integration, User Acceptance Testing of Mapping.
  • Supported the process steps under development, test and production environment
  • Participated in the technical design along with customer team, preparing design specifications, functional specifications and other documents.
  • Used Transformation Developer to create the reusable Transformations.
  • Used Informatica Power Center Workflow manager to create sessions, batches to run with the logic embedded in the mappings.
  • Wrote SQL Scripts for the reporting requirements and to meet the Unit Test Requirements.
  • Validated the Mappings, Sessions & Workflows, Generated & Loaded the Data into the target database.
  • Used Informatica's features to implement Type I and II changes in slowly changing dimension tables and developed complex mappings to facilitate daily, weekly and monthly loading of data.
  • Extensively worked on Oracle SQL's for Data Analysis and debugging.
  • Handled scripts for pre-validating the source file structures before loading into staging by comparing the source file headers against the baselined header.
  • Worked on Teradata utilities (MultiLoad, FastLoad, and Export/Import) to improve performance.
  • Wrote shell scripts and control files to load data into staging tables and then into Oracle base tables using SQL*Loader.
  • Used the pmcmd command to automate PowerCenter sessions and workflows through UNIX (a combined sketch follows this list).
  • Involved in troubleshooting existing ETL bugs.
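
A combined sketch of the file-validation, SQL*Loader and pmcmd steps described above. The file, table, integration service, folder and workflow names are hypothetical placeholders; only the command syntax reflects the standard sqlldr and pmcmd utilities:

#!/usr/bin/ksh
# Load sketch: validate the source header, load staging with SQL*Loader,
# then start the downstream PowerCenter workflow with pmcmd.
SRC_FILE=/data/inbound/orders.csv
EXPECTED_HDR="ORDER_ID,CUST_ID,ORDER_DT,AMOUNT"    # baselined header

# 1. Reject the file if its header has drifted from the baseline.
ACTUAL_HDR=$(head -1 "$SRC_FILE")
if [ "$ACTUAL_HDR" != "$EXPECTED_HDR" ]; then
    echo "Header mismatch - aborting load" >&2
    exit 1
fi

# 2. Load the flat file into the staging table; the control file maps the columns.
sqlldr userid="$ORA_USER/$ORA_PASS@$ORA_SID" control=orders_stg.ctl log=orders_stg.log

# 3. Start the downstream PowerCenter workflow and wait for completion.
pmcmd startworkflow -sv INT_SVC_DEV -d DOMAIN_DEV -u "$INFA_USER" -p "$INFA_PASS" \
     -f ORDERS_FOLDER -wait wf_load_orders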

Environment: Informatica Power Center 8.6, ETL, Flat files, Oracle 10g, MS SQL Server 2008, PL/SQL, Shell Programming, TIBCO, SQL*Loader, Toad, Excel and Unix scripting, Sun Solaris, Windows 2002.

Confidential

Informatica ETL Developer

Responsibilities:

  • Involved in implementation of the Test cases and Test Scripts.
  • Tested the data and data integrity among various sources and targets.
  • Tested to verify that all data were synchronized after troubleshooting, and used SQL to verify/validate the test cases.
  • Wrote test cases for ETL to compare source and target database systems and check all the transformation rules (a sketch follows this list).
  • Defects identified in the testing environment were communicated to the developers using the defect-tracking tool HP Quality Center.
  • Performed Verification, Validation, and Transformations on the Input data
  • Tested the messages published by INFORMATICA and data loaded into various databases
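
A minimal sketch of one such source-versus-target comparison, run through SQL*Plus with MINUS queries; the connection variables, tables and columns are hypothetical placeholders:

#!/usr/bin/ksh
# Test-case sketch: rows that exist on only one side of the load surface as
# MINUS differences; empty results from both queries mean the sets match.
sqlplus -s "$ORA_USER/$ORA_PASS@$ORA_SID" <<'EOF'
-- Source rows that never reached the target.
SELECT cust_id, cust_name FROM src_customer
MINUS
SELECT cust_id, cust_name FROM tgt_customer_dim;

-- Target rows with no matching source row (unexpected inserts).
SELECT cust_id, cust_name FROM tgt_customer_dim
MINUS
SELECT cust_id, cust_name FROM src_customer;
EOF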

Environment: Informatica Power Center 8.6.1, Erwin 4.5, Windows XP, Oracle 10g/9i, Sybase 4.5, SQL*Loader, SQL Navigator, SQL, PL/SQL.
