
Sr. Informatica Data Quality (IDQ) Developer Resume


Jersey City, NJ

SUMMARY

  • Over 8 years of progressive hands-on experience in analysis, ETL processes, and the design and development of enterprise-level data warehouse architectures, including designing, coding, testing, and integrating ETL.
  • Experience in Dimensional data modelling techniques, Slowly Changing Dimensions (SCD), the Software Development Life Cycle (SDLC) (Requirement Analysis, Design, Development & Testing), and Data warehouse concepts - Star Schema/Snowflake Modelling, FACT & Dimension tables, Physical & Logical Data Modelling.
  • Experienced in integration of various data sources like Oracle … MS SQL Server, XML files, Teradata, Netezza, Sybase, DB2, Flat files, Salesforce, and Mainframe sources into the staging area and different target databases.
  • Extensively used DataStage Tools like Infosphere DataStage Designer, Infosphere DataStage Director for developing jobs and to view log files for execution errors.
  • Expertise in integration of various data sources like SQL Server, Oracle, Teradata, Sybase, Flat files, DB2 Mainframes.
  • Expertise in the ETL Tool Informatica and have extensive experience in Power Center Client tools including Designer, Repository Manager, Workflow Manager/ Workflow Monitor.
  • Extensively worked with complex mappings using various transformations like Filter, Joiner, Router, Source Qualifier, Expression, Union, Unconnected/Connected Lookup, Aggregator, Stored Procedure, XML Parser, Normalizer, Sequence Generator, Update Strategy, Reusable Transformations, User Defined Functions, etc.
  • Extensively worked on Relational Database Systems like … MS SQL Server and Teradata, and source files like flat files, XML files, and COBOL files
  • Experienced in mapping techniques for Type 1, Type 2, and Type 3 Slowly Changing Dimensions
  • Strong experience in SQL, PL/SQL, Tables, Database Links, Materialized Views, Synonyms, Sequences, Stored Procedures, Functions, Packages, Triggers, Joins, Unions, Cursors, Collections, and Indexes in Oracle
  • Experience in integration of various data sources with Multiple Relational Databases like DB2, Oracle, SQL Server, MS Access, Teradata, Flat Files, XML files and other sources like Salesforce, etc.
  • Sound knowledge of Linux/UNIX and shell scripting; experience with command-line utilities like pmcmd to execute workflows in non-Windows environments.
  • Experienced in extracting data from SAP & Salesforce Source databases by using ETL tools.
  • Hands-on experience with tools like Address Doctor, which is used for address validation
  • Implemented change data capture (CDC) using Informatica PowerExchange to load data from the Clarity DB to the Teradata warehouse.
  • Experience in maintaining Batch Logging, Error Logging with Event Handlers and Configuring Connection Managers using SSIS.
  • Experienced in creating DTS and SSIS packages and scheduling them using Windows Scheduler, SQL Server Agent, and Autosys.
  • Proficient in Oracle Tools and Utilities such as TOAD and SQL*Loader.
  • Identified and fixed bottlenecks and tuned the complex Informatica mappings for better Performance.
  • Excellent analytical, problem solving, technical, project management, training, and presentation skills.

TECHNICAL SKILLS

  • Informatica Power Center 10.1/9.6.1
  • Data Quality 9.6.1
  • UNIX
  • SQL
  • MDM
  • Linux
  • Perl
  • PL/SQL
  • DB2
  • Tidal
  • Autosys
  • Oracle 11g/10g
  • Microsoft SQL Server 2008
  • Microsoft Visual Studio

PROFESSIONAL EXPERIENCE

Sr. Informatica Data Quality (IDQ) Developer

Confidential - Jersey City, NJ

Responsibilities:

  • Involved in gathering and analyzing business requirements, writing requirement specification documents, and identifying data sources and targets.
  • Used DataStage as an ETL tool to extract data from source systems and load it into SQL Server, DB2, and Oracle databases.
  • Created various SSIS packages for the ETL functionality of the data and importing data from various tables to the rollup tables.
  • Designed the Target Schema definition and Extraction, Transformation and Loading (ETL) using SSIS.
  • Performed Data Analysis and Reporting using multiple transformations provided by SSIS, such as Data Conversion, Conditional Split, Bulk Insert, Merge, and Union All.
  • Designed reference data and data quality rules using IDQ and was involved in cleansing the data in the Informatica Data Quality 9.1 environment.
  • Created mappings in Informatica Data Quality (IDQ) using Parser, Standardizer, and Address Validator transformations.
  • Developed several complex IDQ mappings using a variety of Power Center transformations, Mapping Parameters, Mapping Variables, Mapplets & Parameter Files in the Mapping Designer of Informatica Power Center.
  • Experienced in creating IDQ mappings using Labeler, Standardizer, Address Validator transformations with Informatica Developer and migrated to Informatica Power Center.
  • Profiled the data using Informatica Data Explorer (IDE) and performed Proof of Concept for Informatica Data Quality (IDQ).
  • Created data quality jobs using Informatica Data Quality (IDQ) and address doctor to profile customer information as per the user requirement.
  • Handled different types of transformations in IDQ, including Address Validator (Address Doctor), Merge, and Consolidation transformations.
  • Designed, documented, and configured the Informatica MDM Hub to support loading and cleansing of data.
  • Involved in implementing the Land Process of loading the customer/product Data Set into Informatica MDM from various source systems.
  • Worked on data cleansing and standardization using the cleanse functions in Informatica MDM, and on maintaining the master data using Informatica MDM
  • Integrated Address Doctor with the Address Validator transformation to cleanse address data.
  • Exported the mapplets from IDQ into Informatica Power Center to use them in various mappings implementing Address Doctor.
  • Imported the mappings developed in Data Quality (IDQ) into Informatica Designer.
  • Worked with the Informatica Analyst tool (IDQ) to produce scorecard reports on data issues.
  • Wrote UNIX Shell Scripts for Informatica Pre-Session, Post-Session and Autosys scripts for scheduling the jobs (work flows).
  • Involved in creating UNIX shell scripts for Datastage job and Informatica workflow execution.
  • Actively involved in the exception-handling process using the IDQ Exception transformation after loading the data into MDM, and notified the Data Stewards of all exceptions.
  • Scheduled the workflows using Tidal as per business needs and passed parameters to the workflows directly from Tidal to run the mappings.
  • Worked on CDC (Change Data Capture) to implement SCD (Slowly Changing Dimensions) Type 1 and Type 2.
  • Used Autosys to schedule the Informatica workflows and performed testing using the Autosys scheduling tool.
  • Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads, and process flow of the mappings.
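
Pre/post-session wrapper scripts like those described above typically drive Informatica through the pmcmd command-line utility. A minimal sketch, assuming hypothetical service, domain, folder, and workflow names; it only builds and prints the pmcmd call rather than invoking it:

```shell
#!/bin/sh
# Hypothetical pre-session wrapper: build the pmcmd call that starts a workflow.
# All names below (service, domain, user, folder, workflow) are placeholders.
INT_SERVICE="IS_Dev"
DOMAIN="Domain_Dev"
INFA_USER="etl_user"
FOLDER="DW_LOADS"
WORKFLOW="wf_daily_load"

# -pv reads the password from the INFA_PWD environment variable;
# -wait blocks until the workflow finishes so the scheduler sees the real exit code.
CMD="pmcmd startworkflow -sv $INT_SERVICE -d $DOMAIN -u $INFA_USER -pv INFA_PWD -f $FOLDER -wait $WORKFLOW"
echo "$CMD"
```

A scheduler such as Autosys or Tidal would call a wrapper like this and key its success/failure handling off the script's exit code.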

Environment: Informatica Power Center 10.1/9.6.1, Data Quality 9.6.1, UNIX, SQL, MDM, Linux, Perl, PL/SQL, DB2, Tidal, Autosys, Oracle 11g/10g, Microsoft SQL Server 2008, and Microsoft Visual Studio.

Sr. Informatica / MDM Developer

Confidential -Dallas, TX

Responsibilities:

  • Developed the ETL components as well as Oracle procedures, functions & triggers.
  • Defined Trust and Validation rules for the base tables and created PL/SQL procedures to load data from source tables to staging tables
  • Created Oracle PL/SQL Cursors, Triggers, Functions and Packages
  • Created, executed, and managed ETL processes using Oracle Data Integrator (ODI) and customized ODI Knowledge Modules such as Loading Knowledge Modules and Integration Knowledge Modules.
  • Developed, executed, monitored and validated the ETL DataStage jobs in the DataStage designer and Director Components.
  • Developed common modules for error checking and different methods of logging in SSIS.
  • Extensively used SSIS Import/Export Wizard, for performing the ETL operations.
  • Worked with DataStage Director to schedule, monitor, analyze performance of individual stages and run DataStage jobs.
  • Implemented incremental loads and used Event Handlers to clean the data from different data sources
  • Involved in implementing the Land Process of loading the customer Data Set into Informatica MDM from various source systems.
  • Involved in Installing and Configuring of Informatica MDM Hub Console, Hub Store, Cleanse and Match Server, Address Doctor, Informatica Power Center applications.
  • Used IDQ transformations such as Labeler, Standardizer, Parser, Address Validator (Address Doctor), Match, and Exception for standardizing, profiling, and scoring the data.
  • Worked on data cleansing and standardization using the cleanse functions in Informatica MDM.
  • Performed land process to load data into landing tables of MDM Hub using external batch processing for initial data load in hub store.
  • Worked with IDD (Informatica Data Director), the data governance application for the Informatica MDM Hub that enables business users to effectively create, manage, consume, and monitor master data.
  • Worked on IDQ parsing, IDQ Standardization, matching, IDQ web services.
  • Responsible for using Data Integration Hub (DIH) to create topics and applications that publish and subscribe data.
  • Scheduled the workflows using Tidal as per business needs and passed parameters to the workflows directly from Tidal to run the mappings.
  • Scheduled the batch jobs in Autosys to automate the process.
  • Designed and Developed SSIS Packages using various Control Flow and Data Flow items to Transform and load the Data from various Databases using SSIS.
  • Configured checkpoints, package logging, error logging, and event handling to redirect error rows and fix errors in SSIS.
  • Worked on Teradata utilities like FastLoad, MultiLoad & FastExport.
  • Created Teradata external loader connections such as MLoad Upsert, MLoad Update, and FastLoad while loading data into the target tables in the Teradata database.
  • Involved in writing Autosys jobs, JIL file for Box as well as Command jobs.
  • Created run books for job information which is scheduled on Autosys.
  • Created scripts in Teradata to load data in multiple layers; designed and developed FastLoad for the stage load and MultiLoad for the OLAP load, and worked with TPT for the reporting-table load.
  • Extensively tested the Address Doctor files and updated them with the new monthly release files from the Informatica Address Doctor site
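
The Autosys box and command jobs mentioned above are defined in JIL. A hypothetical fragment, with every job, machine, and script name a placeholder:

```
/* Hypothetical JIL: one box job wrapping one command job */
insert_job: dw_daily_box    job_type: BOX
owner: etluser
date_conditions: 1
days_of_week: all
start_times: "02:00"

insert_job: dw_daily_load   job_type: CMD
box_name: dw_daily_box
command: /apps/etl/scripts/run_wf.sh wf_daily_load
machine: etl_host
std_out_file: /apps/etl/logs/dw_daily_load.out
std_err_file: /apps/etl/logs/dw_daily_load.err
```

Grouping command jobs in a box lets the whole load be started, held, or forced as one unit.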

Environment: Informatica Power Center 9.6.1, MDM, UNIX, DB2, Oracle, Linux, Perl, Shell, IDQ, IDS, SSIS, PL/SQL, Tidal, Autosys, Oracle 11g/10g, Teradata 14.0.

ETL / Teradata Developer

Confidential -Richardson, TX

Responsibilities:

  • Documented high and low-level design document specifications for source-target mapping, based on the transformation rules.
  • Documented technical requirements for ETL process and Design documents for each source. Designed, Developed and Supported Extraction, Transformation and Load Process (ETL) for data migration.
  • Loaded data from the operational source system (Oracle 8i) to Teradata.
  • Used the Teradata utilities FLOAD, MLOAD, FEXP, and TPUMP, and created batch jobs using BTEQ.
  • Worked in Teradata SQL Assistant, querying the source/target tables to validate the BTEQ scripts, and imported metadata from Teradata tables.
  • Wrote Teradata BTEQ scripts as well as Informatica mappings using TPT to load data from staging to base.
  • Fine-tuned Teradata BTEQ scripts as necessary using EXPLAIN plans and collected statistics
  • Very good knowledge of the FACETS tool and the healthcare domain; worked on modules such as Subscriber/Member, Groups, Enrollment, Claims, Billing, Accounting, Provider, MTM, and Utilization Management.
  • Good experience on FACETS CTP (Claims Test Pro) and FACETS Testing
  • Used IDQ's standardized plans for addresses and names clean ups.
  • Worked on IDQ file configuration on Confidential users' machines and resolved the issues; used IDQ to complete initial data profiling and remove duplicate data.
  • Designed reference data and data quality rules using IDQ and was involved in cleansing the data in the Informatica Data Quality (IDQ) environment.
  • Created and used the Normalizer Transformation to normalize the flat files in the source data.
  • Worked on Maestro job scheduling and UNIX scripting.
  • Involved in finding production status by using Autosys commands.
  • Developed UNIX shell scripts that use pmcmd to start and stop sessions and batches and to schedule workflows.
  • Involved in migrating the ETL Code to different environments from Dev to UAT and then to Production with ETL Admins.
  • Experience working with the reporting team to build the collection layer for reporting purposes.
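
BTEQ batch jobs like those described above are commonly driven from shell scripts. A minimal sketch, assuming hypothetical logon, database, and table names; it writes the BTEQ script to a file and prints it instead of piping it to bteq:

```shell
#!/bin/sh
# Hypothetical driver: emit a BTEQ script that checks a staging load.
# The .LOGON credentials, database, and table names are placeholders.
BTEQ_FILE="/tmp/validate_stage.bteq"

cat > "$BTEQ_FILE" <<'EOF'
.LOGON tdprod/etl_user,etl_pwd;
SELECT COUNT(*) FROM stage_db.customer_stg;
.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF

# In production this would be run as:  bteq < "$BTEQ_FILE"
cat "$BTEQ_FILE"
```

The non-zero `.QUIT 8` on error is what lets a scheduler distinguish a failed validation from a clean run.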

Environment: Informatica Power Center 9.6.1/9.5.1, IDQ, DB2, Oracle 11g/10g, MySQL, Autosys, Teradata 13.10/12, Flat File, UNIX, Windows.

ETL / Informatica Developer

Confidential -Kansas City, KS

Responsibilities:

  • Designed the dimensional model and data load process using SCD Type 2 for quarterly membership reporting purposes.
  • Derived the dimensions and facts for the given data and loaded them on a regular interval as per the business requirement.
  • Extracted data from multiple sources such as Oracle, XML, and Flat Files and loaded the transformed data into targets in Oracle, Flat Files.
  • Wrote Shell Scripts for Data loading and DDL Scripts.
  • Designed and coded the automated balancing process for the feeds that go out from the data warehouse.
  • Implemented the automated balancing and control process, enabling audit, balance, and control for the ETL code.
  • Improved database access performance by tuning DB access methods: creating partitions, using SQL hints, and using proper indexes.
  • Integrated all the jobs using complex mappings, including mapplets and workflows, in Informatica Power Center Designer and Workflow Manager.
  • Used the Debugger to identify bugs in existing mappings by analyzing data flow and evaluating transformations, and created mapplets that provide reusability in mappings.
  • Analyzed the impact and changes required to incorporate the standards into the existing data warehouse design.
  • Followed the PDLC process to move the code across environments through proper approvals and source-control environments; source control using SCM.
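
An SCD Type 2 load of the kind described above typically expires the current dimension row and inserts a new version. A hedged SQL sketch, with all table, column, and sequence names hypothetical:

```sql
-- Hypothetical SCD Type 2 load: expire the changed current row, then insert the new version.
UPDATE member_dim d
   SET d.current_flag = 'N',
       d.effective_end_dt = TRUNC(SYSDATE) - 1
 WHERE d.current_flag = 'Y'
   AND EXISTS (SELECT 1 FROM member_stg s
                WHERE s.member_id = d.member_id
                  AND s.src_hash <> d.src_hash);  -- attribute change detected

INSERT INTO member_dim
       (member_key, member_id, src_hash, effective_start_dt, effective_end_dt, current_flag)
SELECT member_dim_seq.NEXTVAL, s.member_id, s.src_hash,
       TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
  FROM member_stg s
 WHERE NOT EXISTS (SELECT 1 FROM member_dim d
                    WHERE d.member_id = s.member_id
                      AND d.current_flag = 'Y'
                      AND d.src_hash = s.src_hash);
```

Comparing a hash of the tracked attributes keeps the change-detection predicate short; the same logic is what an Update Strategy transformation implements inside a mapping.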

Environment: Informatica Power Center 9.1/8.5, Power Exchange, UNIX, Oracle 10g, SQL Server 2008, SQL Assistant, DB2.

ETL / DWH Developer

Confidential

Responsibilities:

  • Involved in analysis, design, development, test data preparation, unit and integration testing, and preparation of test cases and test results
  • Coordinated with the client, business, and ETL teams on development
  • Developed batch jobs using extraction programs written in COBOL, using JCL, VSAM datasets, and FTP to load Informatica tables
  • Involved in full project life cycle - from analysis to production implementation and support with emphasis on identifying the source and source data validation, developing logic, and transformation as per the requirement and creating mappings and loading the data into BI database.
  • Based on the business requirements created Functional design documents and Technical design specification documents for ETL Process.
  • Developing code to extract, transform, and load (ETL) data from inbound flat files and various databases into various outbound files using complex business logic.
  • Used most of the common transformations, such as Source Qualifier, Aggregator, Filter, Expression, Unconnected and Connected Lookups, and Update Strategy.
  • Created automated shell scripts to transfer files among servers using FTP, SFTP protocols and download files.
  • Expertise in creating control files to define job dependencies and schedule jobs using Informatica.
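
Automated transfers like the FTP/SFTP pushes described above are often done with sftp in batch mode. A minimal sketch, assuming a hypothetical host, user, and file paths, and key-based authentication; it writes the batch file and prints it instead of connecting:

```shell
#!/bin/sh
# Hypothetical transfer wrapper: push an extract via sftp in batch mode.
# Host, user, and file paths are placeholders; key-based auth is assumed.
BATCH_FILE="/tmp/push_extract.sftp"
REMOTE="etl_user@partner-host"

cat > "$BATCH_FILE" <<'EOF'
cd /inbound/extracts
put /apps/etl/out/member_extract.dat
bye
EOF

# In production:  sftp -b "$BATCH_FILE" "$REMOTE"
cat "$BATCH_FILE"
```

With `-b`, sftp aborts on the first failing command and returns a non-zero exit code, which the calling job can trap.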

Environment: Informatica Power Center 8.5/8.1, ETL, Business Objects, Oracle 9i/8i, PL/SQL.
