Informatica Developer/IDQ Developer Resume

Woodland Hills, CA

SUMMARY

  • 10 years of expertise in designing, developing and implementing Data Warehousing applications using Informatica PowerCenter 9.x/8.x/7.x, Power Exchange, Data Quality, Master Data Management and Informatica Cloud.
  • Experience in various domains including Health Care, Oil and Gas, Retail, Insurance, Finance and Pharmaceuticals.
  • Experience in System Analysis, Design and Development in the field of Databases.
  • Functional and Technical experience in Decision Support Systems - Data Warehousing and ETL (Extract, Transform and Load) using Informatica PowerCenter tools.
  • Experience working as part of a team through the Software Development Life Cycle (SDLC) to analyze and gather project requirements and to develop, test and maintain software using approaches such as Waterfall and Agile.
  • Experience in OLAP concepts, Data Warehousing concepts and Relational & Dimensional Data Modeling (Star and Snowflake Schema).
  • Designed customized ETL process to manage Metadata Dependencies and Relationships.
  • Extensive experience in Power Center Repository Manager, Mapping Designer, Mapplet Designer, Transformation Developer, Workflow Manager and Workflow Monitor.
  • Experience in finding and fixing Data Quality problems using tools like Informatica Analyst and Data Explorer, which enable profiling and analysis of data along with universal connectivity to all types of data sources.
  • Experience in working on Informatica Cloud using various tasks like Data Replication, Data Synchronization etc.
  • Experience working on Informatica Process Developer for cloud-based activities and developing ICRT processes for application integration involving token-based authentication using REST APIs.
  • Well acquainted with Performance Tuning of sources, targets, mappings and sessions to overcome bottlenecks in mappings using Informatica and Database concepts.
  • Highly proficient in designing and developing complex mappings from varied transformation logic like Lookups, Router, Filter, Transaction Control, Aggregator, Stored Procedure, Joiner, Update Strategy, SQL, Java, Union etc.
  • Understanding & Working knowledge of InformaticaCDC (Change Data Capture) and Veeva Salesforce Data tables (SFDC).
  • Involved in Parsing, Standardization, Matching, and ETL integration implementations.
  • Thorough understanding of Match and Merge concepts in defining Match Path components and Match rules, and of configuring the Match and Merge rules to optimize the match percentage using IDQ results.
  • Involved in creating Data Maps, extracting (incremental) CDC data from Oracle sources, exporting and updating them to the repository, importing the required source files on the staging environment by using Informatica Power Exchange.
  • Extensively worked with ETL tools to extract data from relational and non-relational sources such as Oracle, SQL Server, DB2 and flat files.
  • Extensive experience in database development using PL/SQL, T-SQL, Stored Procedures, Triggers, Functions and Packages.
  • Experience in Master Data Management concepts, Methodologies and ability to apply this knowledge in building MDM solutions along with installation and configuration of core MDM hub components such as Hub Console, Hub Store, Hub Server, Cleanse Match Server and Cleanse Match Adapter on Windows.
  • Experience in creating Mappings, Trust and Validation rules, Match Path, Match Column, Match Rules, Merge properties and Batch Group Creation.
  • Strong knowledge and experience in performance tuning both Informatica and Databases.
  • Knowledge on implementing hierarchies, relationship types, packages and profiles for hierarchy management in MDM Hub implementation.
  • Involved in maintaining jobs and supporting jobs in Production Environment.
  • Experience in creating Test cases for different sources like Relational sources, Flat files etc.
  • Experience in Upgrading, Installing and Configuring Power Center 8.x/7.x/6.x in Client/Server environments.
  • Experience writing UNIX scripts as pre-session and post-session commands.
  • Excellent analytical, problem-solving and communication skills; ability to interact with individuals at all levels.

TECHNICAL SKILLS

ETL Tools: Informatica PowerCenter 9.x/8.x/7.x/6.x, Informatica Power Exchange, DataStage 8.1

Cloud Integration Tools: Informatica Cloud

EIM Tools: Informatica Data Quality 9.x, Informatica MDM Multi Domain

Databases: Oracle 11g/10g/9i, SQL Server, DB2, Teradata V2R5, Netezza

Database Tools: TOAD, SQL*Plus, Oracle SQL Developer, WinSQL

Operating Systems: MS Windows 2000/XP/Win7, UNIX.

Languages: C/C++, PL/SQL, UNIX/Linux Scripting, HTML, SQL

Modeling Tools: Toad Data Modeler, Microsoft Visio, Erwin 4.x/7.2

Scheduling Tools: UC4 Operations Manager 8.x/6.x, Control-M, Tivoli, Autosys

PROFESSIONAL EXPERIENCE

Confidential, Warren, NJ

Sr. Informatica/IDQ Developer

Environment: HR Mart - Informatica Power Center 9.1, Informatica Data Quality 9.1, Oracle SQL Developer, Flat Files, Autosys, UNIX

Responsibilities:

  • Assessed and Created Technical Specification Documents based on the requirements.
  • Involved in ETL coding using Informatica tool to extract, transform, cleanse and load data from different source systems according to the DataMart standards.
  • Developed Power Center mappings using various transformations like Aggregator, SQL, Normalizer, Union, and Lookups etc.
  • Optimized some of the older code by changing the logic, reducing running time and encouraging the use of reusable objects such as reusable transformations.
  • Implemented business rules by Standardizing, Cleansing and Validation of data.
  • Implemented data profiling and scorecards, created reference tables and documented Data Quality metrics/dimensions like Accuracy, Completeness, Duplication, Validity and Consistency (a SQL sketch of such checks follows this list).
  • Developed mappings using various Data Quality transformations like Address Validator, Parser, Labeler, Match, Exception, Association, Standardizer.
  • Involved in migration of mappings from IDQ to Power Center.
  • Developed the mappings, applied rules and transformation logics as per the source and target system requirements.
  • Involved in data conversions, standardizing, correcting, error tracking and enhancing the source data.
  • Developed transformation logic and designed various simple and Complex Mappings in Designer with performance, accuracy and efficiency to achieve operational objectives.
  • Created shell scripts to implement miscellaneous tasks like copying source files, creating indicators, cleaning up files once the process is complete etc.
  • Set up Informatica batch jobs for the DataMart using Informatica Scheduler.
  • Provided monitoring and production support for the ClaimPATH nightly run loads.
  • Performed audits on the data whenever a break in the data was identified.
  • Involved in Migration of objects from Dev and Test server to Production servers.
  • Created Autosys scripts to schedule and process the jobs on a daily basis.
  • Involved in monitoring tasks, workflows and performance for mappings in Workflow monitor.
  • Fine-tuned ETL processes by addressing mapping and session performance issues.
  • Prepared/maintained documentation on aspects of ETL processes to support knowledge transfer to other team members.
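
A minimal sketch, assuming a hypothetical staging table MEMBER_STG with MEMBER_ID and EMAIL columns: the scorecards themselves were built in IDQ/Informatica Analyst, but the completeness, duplication and validity metrics they report correspond to checks like the query below.

```sql
-- Hypothetical staging table and columns; illustrates the completeness,
-- duplication and validity metrics reported on the IDQ scorecards.
SELECT COUNT(*)                                   AS total_rows,
       ROUND(100 * COUNT(email) / COUNT(*), 2)    AS email_completeness_pct,
       COUNT(*) - COUNT(DISTINCT member_id)       AS duplicate_member_ids,
       SUM(CASE WHEN NOT REGEXP_LIKE(email, '^[^@]+@[^@]+\.[^@]+$')
                THEN 1 ELSE 0 END)                AS invalid_email_count
FROM   member_stg;
```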

Confidential, Mooresville, NC

Sr. Informatica Developer

Environment: Informatica PowerCenter 9.5, DB2, UNIX, Oracle SQL Developer, JCL, Mainframe.

Responsibilities:

  • Interfaced with business to understand the requirements for the project.
  • Assessed and Created Technical Specification Documents based on these meetings.
  • Worked with technical lead to formulate the design for the new PVS and Adhoc process.
  • Estimated and planned the development work using Agile Software Development.
  • Involved in improving the functionality of existing Dotcom load process.
  • Working on ETL coding using Informatica tool to extract, transform, cleanse and load data.
  • Developing Power Center mappings using various transformations like Aggregator, Normalizer, Union, XML, Java etc.
  • Created and developed the Adhoc process related to adding and removing of different products on Confidential website.
  • Developed the configuration of Informatica webservices to automate the eID requests using web services consumer transformation.
  • Involved in creating the process to replicate and replace the current BODL model.
  • Developed the process to transfer existing and new attributes from the BODL attribute dictionary to the commerce database.
  • Assisted in the process of transferring product details and prices from Confidential to Google and Ebay feeds.
  • Worked with QA testers and actively involved in setting up test cases.
  • Involved in monitoring tasks, workflows and performance for mappings in Workflow monitor.
  • Tuned the existing ETL processes to address mapping and session performance issues.
  • Set up Informatica jobs using JCL on the mainframe.
  • Prepared/maintained documentation on aspects of ETL processes to support knowledge transfer to other team members.

Confidential, Commack, NY

Sr. Informatica Developer

Environment: Informatica PowerCenter 9.5, Informatica Cloud Services (ICS), Oracle 11g, Veeva Salesforce, TOAD, UNIX, Informatica Scheduler, Windows 7.

Responsibilities:

  • Involved in assessing the technical and business requirements needed for replacing the CRM system.
  • Involved in estimates, planning and identifying options for potential risks/data gaps.
  • Analyzed existing application and system to formulate logic for new process.
  • Consult with Business analysts and Architects to understand, anticipate and meet current and future needs for the project.
  • Used Informatica Cloud as a Data Replication tool to extract data from Veeva Salesforce.
  • Worked on ETL coding using Informatica to cleanse, enrich and standardize data from various sources into staging.
  • Created Data Replication jobs to replicate Salesforce tables in Oracle; Data Synchronization jobs were also created to synchronize data back to the Salesforce tables.
  • Used Application Source qualifier to retrieve data from Salesforce and load data into SAP using BAPI transformations.
  • Created scripts to create new tables, views and queries for new enhancements in the application using TOAD.
  • Developed Power Center mappings using various transformations like Aggregator, Union, Lookup, Joiner, Expression, Router, Filter, and Update Strategy.
  • Developed an ICRT process for application-to-application integration using token-based authentication so that the URL (Swagger/SOAP) could be shared with third-party vendors for their use.
  • Involved in implementing processes to capture Delta changes along with Data profiling.
  • Created process to extract and load product data from Oracle to SAP using BAPI transformation.
  • Fine-tuned ETL processes by addressing mapping and session performance issues.
  • Helped with performance tuning for Oracle using features such as the Query Optimizer, Execution Plans, Indexes and Database Partitioning (see the sketch after this list).
  • Designed and set up tasks using the Informatica scheduler to run the jobs on a schedule or on demand.
  • Created task flows and schedules to run Cloud jobs on a schedule.
  • Provided on-call support for the production systems and provided solutions when needed.
  • Supported minor enhancements and service requests for the ETL process.
  • Created, updated and maintained ETL Technical Design Documents (TDD).
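
A rough illustration of that tuning loop, using hypothetical object names (ACCOUNT_STG and its columns are not from the actual project): inspect the optimizer's plan for a slow extract query and, if it shows an avoidable full scan on a selective predicate, add a supporting index.

```sql
-- Hypothetical objects; shows the typical cycle of checking the execution
-- plan and adding a composite index for a selective predicate.
EXPLAIN PLAN FOR
SELECT a.account_id, a.account_name
FROM   account_stg a
WHERE  a.territory_cd = :territory
AND    a.last_modified_dt >= :since_dt;

SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);

-- If the plan shows a full table scan on ACCOUNT_STG for this predicate,
-- a composite index can remove the bottleneck.
CREATE INDEX account_stg_terr_dt_ix
    ON account_stg (territory_cd, last_modified_dt);
```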

Confidential, Woodland Hills, CA

Informatica Developer/IDQ Developer

Environment: Informatica PowerCenter 9.1, Informatica Data Quality (8.6/9.1), Informatica Data Explorer, DB2, Flat Files, Oracle 11g, UNIX, DbVisualizer, Control-M, Windows 7.

Responsibilities:

  • Involved in assessing the technical and business suitability of the requirements for EPDSv2.
  • Involved extensively in estimates, planning, translating client’s requirements, identifying options for potential risks/data gaps.
  • Coordinated with technical architects, data modelers and DBA’s on ETL strategies to improve EPDS.
  • Worked on ETL coding using Informatica tool to extract, transform, cleanse and load data.
  • Developed Power Center mappings using various transformations like Aggregator, Union, Lookup, Joiner, Expression, Router, Filter, and Update Strategy.
  • Involved in implementing processes to capture data change (CDC), Business Validation and Data profiling.
  • Worked as a Tech Lead for one of the tracks loading source data into EPDSv2.
  • Managed an offshore team for the different tracks related to the EPDS project.
  • Worked with IDQ to ensure accurate Address matching/cleansing and data quality for all source data types.
  • Defined Data Quality standardization and cleansing rules in Informatica Data Quality (IDQ) based on findings discovered from the profile results.
  • Performed Data Quality checks, cleansed the incoming data feeds and profiled the source systems data as per business rules using IDQ.
  • Worked with various developer modules like profiling, standardization and matching.
  • Designed various mappings using transformations such as Key Generator, Match, Labeler, Case Converter, Standardizer, Parser and Lookup (a plain-SQL analogue of the key-generation and matching step is sketched after this list).
  • Worked extensively with address validator to cleanse the address elements. Created input and output templates.
  • Created different types of profiles, such as column-level profiles, summary profiles and drill-down profiles, using IDE.
  • Worked on Match and Merge rules, address validations and reusable error-handling rules using IDQ.
  • Created DQ mapplets and incorporated into PowerCenter mappings to test the standardized and cleansed data.
  • Handled various issues related to monitoring and performance tuning of data.
  • Involved in code checks as well as testing to ensure all the requirements defined in the RTM are implemented.
  • Used scheduling tool Control-M to automate Informatica Workflows for daily Batch Jobs.
  • Designed and set up tasks using Informatica processes for on-call support.
  • Created, updated and maintained ETL technical documents.
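
The key generation and matching above were done with IDQ transformations; purely as an illustration, the query below is a plain-SQL analogue of that step, with hypothetical table and column names (PROVIDER_STG and its columns are assumptions, not the actual model).

```sql
-- Hypothetical names; builds a simple standardized match key and lists groups
-- of candidate duplicates, analogous to the Key Generator + Match step in IDQ.
WITH keyed AS (
    SELECT provider_id,
           UPPER(TRIM(last_name)) || '|' ||
           SUBSTR(UPPER(TRIM(first_name)), 1, 1) || '|' ||
           SUBSTR(REGEXP_REPLACE(postal_cd, '[^0-9]', ''), 1, 5) AS match_key
    FROM   provider_stg
)
SELECT   match_key, COUNT(*) AS candidate_duplicates
FROM     keyed
GROUP BY match_key
HAVING   COUNT(*) > 1
ORDER BY candidate_duplicates DESC;
```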

Confidential, Houston, TX

ETL Consultant/Informatica Developer

Environment: Informatica PowerCenter 9.1, SAP ECC 6.0, SQL Server 2008, Oracle 11g, UNIX, TOAD, Windows 7.

Responsibilities:

  • Involved in Extraction, Transformation and Loading of data using Informatica.
  • Involved in ETL coding using Informatica tool from various source systems according to the PPDM (Professional Petroleum Data Management) standards.
  • Extracted data from SAP R/3 4.6 and loaded into SAP ECC 6.0.
  • Generated and Installed ABAP Program/SAP R/3 Code Using Informatica 9.1.
  • Loaded data into SAP ECC 6.0 using IDOC’s, BAPI and Function Modules.
  • Developed Power Center mappings using various transformations like Stored Procedure, Union, Lookup, Joiner, Expression, Router, Filter, and Update Strategy.
  • Involved in implementing processes to capture changed data (CDC) and to perform data profiling.
  • Created reusable sessions and commands in the Workflow Manager.
  • Set up EDW jobs and job schedules using the TIDAL scheduler.
  • Designed and set up tasks using the ABC process for EDW failures for on-call support.
  • Created, updated and maintained ETL technical documents.
  • Set up the complete EDW job folder hierarchy, naming conventions and job schedule management in UC4 Operations Manager.

Confidential, Oklahoma City, OK

ETL Consultant/Sr. Developer

Environment: Informatica PowerCenter 9.x/8.x/7.x, Oracle 11g/10g, SQL Server, UC4, AIX, Windows 7/XP.

Responsibilities:

  • Extensively involved in Performance tuning of ETL processes in Data Warehouse.
  • Set up the complete EDW job folder hierarchy, naming conventions and job schedule management in UC4 Operations Manager.
  • Design, develop and set up call operators in the UC4 tool for job failures to provide EDW on-call support.
  • Prepare and maintain ETL standards and common-practices documents on SharePoint.
  • Set up the EDW on-call support schedule and support ETL code development in the Production environment.
  • Develop schedules, events, calendars and master job plans in UC4 tool for automation of EDW processes.
  • Worked on implementing procedures/functions to support daily ETL process miscellaneous tasks.
  • Created physical data model in the Data Warehouse and designed custom ETL execution plans.
  • Assisted in managing metadata-driven dependencies and relationships, capturing deleted records and handling index management as DAC administrator.
  • Involved in implementing data transformation processing for Relational database (Oracle) using Informatica push down optimization option.
  • Provided error reporting and email monitoring to isolate bottlenecks as part of daily process.
  • Applied Informatica Data Quality methods to apply rules to financial, customer and asset data to profile, specify and validate rules, and monitor data quality over time.
  • Actively involved in testing the Informatica EDW Power Center upgrades to versions 7, 8 and 9.
  • Actively involved in testing the Oracle EDW database upgrade to version 11g.
  • Provided expertise to Business analysts on how to prepare ETL process documents.
  • Prepare and maintain documents to support EDW daily UC4 job operations and yearly maintenance procedures.
  • Create customized metadata reporting process for Informatica Repository.
  • Involved in implementing processes to capture changed data (CDC) using triggers on tables, applying status or timestamp indicators to rows of data (see the trigger sketch after this list).
  • Develop Packages, Stored Procedures, and Functions using PL/SQL for the automation of database maintenance EDW processes.
  • Create customized metadata reporting process for UC4 Operations Manager Repository.
  • Develop UNIX shell scripts to invoke Informatica jobs using the pmcmd commands.
  • Design and develop unit test cases for the ETL processes.
  • Implement Informatica partitioning to improve data load time for various EDW Mappings.
  • Design and develop customized data profiles using the Power Center Designer.
  • Various complex data cleansing logic was developed using the Power Center Designer.
  • Develop Mapplets to be used in various other Mappings using the Power Center Designer.
  • Create complex mappings that involve slowly changing dimensions (Type 1, Type 2 and Type 3), target load order and constraint-based loading using the PowerCenter Designer (a set-based SQL sketch of the Type 2 pattern follows this list).
  • Prepared/maintained documentation on aspects of ETL processes to support knowledge transfer to other team members.
  • Provide training to support EDW daily ETL activities and expertise on Informatica related issues.
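
The trigger-based change capture mentioned above can be sketched roughly as below; the table (ASSET) and audit columns are hypothetical, and the production implementation may have differed in detail.

```sql
-- Hypothetical table and audit columns; stamps each inserted/updated row with
-- a timestamp and a status indicator so the nightly ETL can pull only deltas.
CREATE OR REPLACE TRIGGER asset_cdc_trg
BEFORE INSERT OR UPDATE ON asset
FOR EACH ROW
BEGIN
    :NEW.last_modified_dt := SYSDATE;
    :NEW.cdc_status_ind   := CASE WHEN INSERTING THEN 'I' ELSE 'U' END;
END;
/
```

The incremental extract can then filter on LAST_MODIFIED_DT against the previous run's high-water mark.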
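
For the Type 2 dimensions, the PowerCenter mappings implement logic equivalent to the set-based sketch below (CUSTOMER_DIM, CUSTOMER_STG and the tracked columns are hypothetical): expire the current row when tracked attributes change, then insert a new current version.

```sql
-- Hypothetical objects; expire the current row for customers whose tracked
-- attributes changed...
UPDATE customer_dim d
SET    d.current_flag     = 'N',
       d.effective_end_dt = TRUNC(SYSDATE) - 1
WHERE  d.current_flag = 'Y'
AND    EXISTS (SELECT 1
               FROM   customer_stg s
               WHERE  s.customer_id = d.customer_id
               AND   (s.address_line1 <> d.address_line1 OR s.city <> d.city));

-- ...then insert a new current version for both changed and brand-new customers.
INSERT INTO customer_dim
       (customer_key, customer_id, address_line1, city,
        effective_start_dt, effective_end_dt, current_flag)
SELECT customer_dim_seq.NEXTVAL, s.customer_id, s.address_line1, s.city,
       TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
FROM   customer_stg s
WHERE  NOT EXISTS (SELECT 1
                   FROM   customer_dim d
                   WHERE  d.customer_id = s.customer_id
                   AND    d.current_flag = 'Y');
```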

Confidential, San Diego, CA

Informatica Developer

Environment: Informatica PowerCenter 8.6, DataStage/QualityStage 8.1, Erwin 7.2, Oracle 10g, Teradata V2R5, SQL Server 2008, HP AIX, Tidal & Windows XP.

Responsibilities:

  • Created and Imported/Exported various Sources, Targets, and Transformations using Informatica PowerCenter 8.6.
  • Extracted data from several source systems and loaded the data into the EDW (Enterprise Data Warehouse).
  • Worked on Power Center Designer client tools like Source Analyzer, Warehouse Designer, Mapping Designer and Mapplet Designer.
  • Extensively worked with the data conversions, standardizing, correcting, error tracking and enhancing the data.
  • Developed Packages, Stored Procedures and Functions using PL/SQL for the automation of database maintenance EDW processes (a minimal PL/SQL sketch follows this list).
  • Worked on implementing procedures/functions to support daily ETL process miscellaneous tasks.
  • Involved in performance tuning with Informatica and database.
  • Involved in Migration of objects from Dev and Test server to Production servers.
  • Worked on DataStage/QualityStage 8.1.
  • Worked on Teradata BTEQ & T-SQL Tools for querying and retrieving data.
  • Worked with MultiLoad, FastLoad and FastExport scripts for loading and exporting data from the Teradata database.
  • Worked with several applications like Cheetah mail, Wine.com, Florist Express etc.
  • Prepared Source to Target mapping documents and other related support documentation.
  • Scheduled mappings/jobs using the Tivoli scheduler tool.
  • Design and develop unit test cases for the ETL processes.
  • Involved in Development, Testing and Production support.
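
One such PL/SQL maintenance routine can be sketched roughly as follows; the procedure name, parameter and table handling are illustrative assumptions, not the actual production code.

```sql
-- Illustrative maintenance routine: truncate a staging table before the
-- nightly load and refresh its optimizer statistics. Names are hypothetical.
CREATE OR REPLACE PROCEDURE prep_stage_table (p_table_name IN VARCHAR2) AS
BEGIN
    EXECUTE IMMEDIATE 'TRUNCATE TABLE '
                      || DBMS_ASSERT.SIMPLE_SQL_NAME(p_table_name);
    DBMS_STATS.GATHER_TABLE_STATS(ownname => USER, tabname => p_table_name);
END prep_stage_table;
/
```

A pre-load workflow step could call it as, for example, EXEC prep_stage_table('ORDERS_STG'); the table name here is purely an example.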

Confidential, Atlanta, GA

Informatica Consultant/Developer

Environment: Informatica PowerCenter 6.0, Erwin 4.0, XML, DB2, SQL Server 2000, Teradata, Oracle 8.x, AIX, HPUX, Solaris and Windows 2000.

Responsibilities:

  • Created and Imported/Exported various Sources, Targets, and Transformations using Informatica Power Center, Repository Manager and Designer.
  • Developed various Mappings with the collection of all Sources, Targets, and Transformations.
  • Created Mapplets with the help of Mapplet Designer and used those Mapplets in the Mappings.
  • Developed and scheduled various pre- and post-session tasks and batches for all mappings to load data from source TXT files and tables into target tables.
  • Populated the tables for the initial-load inserts and then updated them in the daily load using incremental aggregation and the Update Strategy transformation.
  • Created and managed the global and local repositories and permissions using Repository Manager.
  • Created Mappings between Source and Target using Power Center Designer.
  • Worked extensively with Slowly Changing Dimensions, i.e. Type 1 and Type 2.
  • Used Dimensional Modeling techniques to create Dimensions, Cubes and Fact tables (a minimal star-schema sketch follows this list).
  • Involved in Logical and Physical Database Design, forward engineering & reverse engineering using Erwin tool.
  • Imported an XML file to Designer, performed multiple operations, used in the Mappings and exported into an XML file.
  • Optimized the mappings by changing the logic, reducing running time and encouraging the use of reusable objects such as mapplets.
  • Performed data cleansing activities like Data Enhancement, Parsing, Matching and Consolidation.
  • Analyzed and Created Facts and Dimension Tables.
  • Worked on Teradata BTEQ & T-SQL Tools for querying and retrieving data.
  • Worked with MultiLoad, FastLoad and FastExport scripts for inserting and exporting data from the Teradata database.
  • Involved in Development & Testing Phase.
