
Lead Informatica Data Quality Consultant Resume

Jersey City, NJ

PROFESSIONAL SUMMARY:

  • Over fourteen years of experience with Informatica Data Quality and PowerCenter, designing and developing large-scale data warehouse and client/server applications, including data extraction/conversion, data quality checks, and database administration.
  • Experienced in data quality analysis, data profiling, data cleansing, and master data management.
  • Provided subject matter expertise for Informatica Data Quality, including profiles, scorecards, and the IDQ Developer tool.
  • Good functional knowledge of the finance, insurance, and media domains.
  • Interacted with business users to analyze business processes and made necessary changes to schema objects to cater to their reporting needs.
  • Excellent problem-solving, communication, analytical, and interpersonal skills, with the ability to work independently or as part of a team.
  • Working knowledge of COBOL data structures, mainframes, and CICS screens.

Software Skills:

Databases: Oracle, Teradata, SQL Server, UDB/DB2 and Sybase.

Tools: Informatica Data Quality 10.x, Informatica PowerCenter v10.x, Informatica PowerExchange 9.x (Striva), Informatica Metadata Manager, COGNOS, Business Objects XI R2, CONTROL-M.

PROFESSIONAL EXPERIENCE:

Confidential, Jersey City, NJ

Lead Informatica Data Quality Consultant

Responsibilities:

  • Coded many reusable rules, profiles, and scorecards in IDQ to monitor the data quality of critical data elements.
  • Created mappings in IDQ to load invalid data flagged by rules and profiles.
  • Used Labeler and Standardizer transformations in IDQ to create rules.
  • Coded rules against Finance and Treasury data (e.g., CCAR, ALM, and FRR).
  • Migrated Informatica Data Quality code to higher environments.
  • Coded rules on data residing in Vertica, SQL Server, Oracle, and flat files.
  • Worked with data stewards across lines of business to discuss DQ requirements and code rules in IDQ.
  • Worked in Informatica PowerCenter to create mappings that load rule metadata into a custom Oracle database.
  • Provided bridge-table entries (rule metadata) to data stewards by executing SQL queries against the profile warehouse.
  • Conducted daily scrum meetings with onsite and offshore team members.
  • Created PowerPoint presentations to depict the current and future state of variable-rate automation.
  • Worked with Collibra to obtain data lineage, rule, and physical data element information.
  • Executed many SQL queries using analytic functions (see the sketch after this section).
  • Worked with Data Stewards for Reference Data Management.
  • Used JIRA to allocate and work on Data Quality contracts.

Environment: Informatica Data Quality V10.2.1 (Developer tool), Informatica DQ Analyst, Informatica PowerCenter v10.2.1, Oracle 11g, UNIX, JIRA, Collibra.
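
The bullet on analytic functions refers to monitoring-style queries along the lines of the minimal sketch below; the dq_rule_results table and its columns are hypothetical placeholders, not the actual profile-warehouse schema.

  -- Trend of rule failures per rule, with change versus the prior run.
  -- Table and column names are illustrative only.
  SELECT rule_name,
         run_date,
         failed_count,
         failed_count
           - LAG(failed_count) OVER (PARTITION BY rule_name ORDER BY run_date)
           AS change_vs_prior_run,
         RANK() OVER (PARTITION BY run_date ORDER BY failed_count DESC)
           AS worst_rule_rank
  FROM   dq_rule_results
  ORDER  BY rule_name, run_date;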

Confidential, Jersey City, NJ

Informatica Data Quality Lead

Responsibilities:

  • Worked with the source system team to analyze RCIF data and COBOL file formats.
  • Profiled source data to assess data quality, identify anomalies, and build business rules (see the profiling sketch after this section).
  • Created an IDQ web service process to integrate with Informatica MDM.
  • Executed queries in the Greenplum database for RCIF data analysis.
  • Provided organizations' original and standardized addresses to the Compliance team.

Environment: Informatica MDM on AWS, Informatica Data Quality V10.2.1 (Developer tool), Informatica PowerCenter v10.2.1, Oracle 11g, UNIX, Syncsort, PowerBuilder, Greenplum.
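
A minimal column-profiling sketch of the kind used to assess the RCIF source before building rules; rcif_party and its columns are hypothetical stand-ins for the actual Greenplum tables.

  -- Row count, distinct keys, and null percentages for selected columns.
  SELECT COUNT(*)                                        AS total_rows,
         COUNT(DISTINCT party_id)                        AS distinct_party_ids,
         ROUND(100.0 * SUM(CASE WHEN tax_id IS NULL THEN 1 ELSE 0 END)
                     / COUNT(*), 2)                      AS pct_null_tax_id,
         ROUND(100.0 * SUM(CASE WHEN address_line1 IS NULL THEN 1 ELSE 0 END)
                     / COUNT(*), 2)                      AS pct_null_address
  FROM   rcif_party;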

Confidential, Holmdel, NJ

Senior Informatica Data Quality Consultant

Responsibilities:

  • Designed ETL Process for integrating data from various source systems.
  • Profiled flat files and database tables and applied rules to the profiles.
  • Created reference tables and applied them in the Standardizer transformation in IDQ.
  • Worked with the IDQ Match and Association transformations to find duplicates in the source system.
  • Created Custom Data Objects (CDOs) and Logical Data Objects (LDOs) to join data sets for profiling.
  • Used the IDQ Parser transformation to parse customer names and email IDs.
  • Developed several mappings, workflows, and mapplets in Informatica PowerCenter to source data from Oracle databases, flat files, and XML files and load dimensions, facts, satellites, and hubs.
  • Created scorecards to monitor the Data Quality of McCracken (CRE) files.
  • Used the Labeler transformation to tag numeric and alphabetic data.
  • Wrote SQL against Informatica repository tables (OPB tables) and loaded check-in/check-out information for DEV and UAT Informatica code (see the repository query sketch after this section).
  • Migrated Informatica PowerCenter and Data Quality code to higher environments.
  • Created complex stored procedures and triggers and optimized them for performance.
  • Worked closely with QA team to compare the flat file data to database dimensions and facts.
  • Created multiple UNIX scripts to check file availability.
  • Created jobs in CONTROL-M for scheduling Informatica workflows.
  • Monitored jobs in CONTROL-M for proper execution of Informatica PowerCenter workflows.
  • Created a POC in Informatica BDM to read data from S3 and load into Amazon Redshift.

Environment: Informatica Data Quality V10.2.1 (Developer tool), Informatica PowerCenter v10.2.1, Oracle 11g, UNIX, BMC CONTROL-M, Erwin, TOAD, Informatica BDM.
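
The repository query mentioned above followed the general shape of the sketch below. OPB table and column names vary across PowerCenter versions, so this is an assumption-laden illustration that would need to be verified against the actual repository.

  -- List mappings per folder with last-saved and validity information.
  -- OPB_* names are assumptions; confirm them for the repository version in use.
  SELECT s.subj_name    AS folder_name,
         m.mapping_name,
         m.last_saved,
         m.is_valid
  FROM   opb_mapping m
         JOIN opb_subject s
           ON s.subj_id = m.subject_id
  ORDER  BY s.subj_name, m.mapping_name;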

Confidential, Weehawken, NJ

Informatica Data Quality & PowerCenter Architect\Lead

Responsibilities:

  • Provided estimates for ETL deliverables and oversaw progress to ensure quality deliverables.
  • Designed the data flow and architecture of Confidential database.
  • Created multiple mapplets in Informatica for shorthand implementation.
  • Used Informatica parameter file functionality to parameterize the table names, period date and connections.
  • Designed and implemented complex SQL queries and PL/SQL procedures to process data per MDRM from AXIOM back-end tables.
  • Coded 3,000+ rules in Informatica PowerCenter and the Developer tool to check the data quality of the FR Y-9C, FR Y-14Q, FR Y-14A, and FR Y-14M reports.
  • Heavily used Normalizer and Java transformations in Informatica mappings.
  • Heavily used Parser, Labeler, and Standardizer transformations in IDQ.
  • Used reference tables in Informatica Data Quality for shorthand definition expansion.
  • Created profiles in Informatica Data Quality for identifying unique values for tables in AXIOM (reporting database).
  • Performance-tuned Oracle queries using explain plans and optimizer hints.
  • Performed Data Analysis and Validations on business rules provided by Data governance team.
  • Designed and implemented Oracle stored procedures to gather statistics and improve load performance (see the sketch after this section).
  • Created Tables, Sequences, Stored procedures and Triggers in Confidential database.
  • Designed and implemented mappings in the Developer tool and imported them into PowerCenter as mapplets.
  • Created Profiles in IDQ and Analyst tool on AXIOM tables.
  • Designed and implemented partitioning on Oracle tables.
  • Enforced standards and best practices for data modeling efforts.
  • Worked with Informatica Metadata Manager to create mapping/session documents.
  • Created MS Access database and forms for Business Analysis and maintenance of RLI inventory.

Environment: Informatica PowerCenter v10.1.1 (Designer, Workflow Manager, Workflow Monitor, Repository Manager), Informatica Data Quality 10.1.1 (Developer tool), Informatica Metadata Manager, Oracle 11g, UNIX, MicroStrategy 10.3, AUTOSYS, Erwin, TOAD Data Modeler, functional knowledge of AXIOM front end.
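
A minimal sketch of the gather-stats procedure pattern referenced above, assuming Oracle's DBMS_STATS package; the procedure name and the schema/table in the example call are placeholders.

  -- Gather table (and index) statistics after a load so the optimizer sees
  -- current row counts; parameters are placeholders.
  CREATE OR REPLACE PROCEDURE gather_load_stats (
      p_owner IN VARCHAR2,
      p_table IN VARCHAR2
  ) AS
  BEGIN
      DBMS_STATS.GATHER_TABLE_STATS(
          ownname          => p_owner,
          tabname          => p_table,
          estimate_percent => DBMS_STATS.AUTO_SAMPLE_SIZE,
          cascade          => TRUE);   -- refresh dependent index statistics too
  END gather_load_stats;
  /
  -- Example post-load call (schema and table names are hypothetical):
  -- EXEC gather_load_stats('AXIOM_STG', 'FR_Y14Q_FACT');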

Confidential, Jersey City, NJ

Project Manager\Senior Developer

Responsibilities:

  • Wrote functional specifications for monitoring and reporting of intraday liquidity.
  • Built SQL queries to retrieve payments made to BNP's clients via the FED (Federal Reserve).
  • Performed analysis to retrieve information on the different collateral pledged at the bank.
  • Wrote UNIX scripts for file date/record-count validation and file statistics reporting.
  • Created Informatica mappings for GCARs file processing.
  • Built SQL queries to retrieve BNP's collateral pledged at the ancillary system, CHIPS.
  • Created reference tables in Informatica Data Quality tool to remove repeated and common client names.

Environment: Informatica PowerCenter v9.6.1 (Designer, Workflow Manager, Workflow Monitor, Repository Manager), Informatica Data Quality v9.6.x, Oracle 11g, Business Objects, SQL Developer, UNIX, Tableau, Business Bridge.

Confidential, Jersey City, NJ

Senior Informatica Data Quality \ PowerCenter Lead

Responsibilities:

  • Created Rules and Profiles in Informatica Data Quality tool.
  • Worked with various Data stewards to gather requirements to build their Monitoring rules and profiles.
  • Worked with the IDQ Standardizer, Parser, and Global Address Validation transformations for data cleansing, matching, conversion, and exception handling.
  • Worked extensively in Informatica PowerCenter to create mappings and mapplets that merge data from the Oracle profile warehouse and Ab Initio MDR and load it into a COGNOS data mart.
  • Assigned work and ran daily scrum meetings with the offshore team for status on rule deliverables.
  • Made updates in SharePoint with work-intake status and set up alerts.
  • Worked with business users (data stewards) to gather requirements for COGNOS reporting needs and worked with COGNOS developers to implement them.
  • Worked with B2B Data Transformation tool to Parse MISMO XML file.
  • Profiled large MISMO Property Appraisal XML files which include 400+ hierarchies and 3000 elements.
  • Staged and profiled EBCDIC-format VSAM files using Informatica PowerExchange and PowerCenter.
  • Worked with COGNOS developers to create materialized views for faster COGNOS report performance.
  • Created LDOs (Logical Data Objects) and CDOs (Custom Data Objects) in IDQ tool.
  • Implemented Slowly Changing Dimensions Type 1 and Type 2 (see the SCD Type 2 sketch after this section).
  • Used Power Center debugger to test the mappings.
  • Uploaded/migrated rule metadata into the Ab Initio Metadata Repository.
  • Uploaded table/view metadata into the Ab Initio MDR metadata portal.
  • Created folders in IDQ for new projects and provided privileges to developers.
  • Migrated IDQ code to UAT and PROD.
  • Executed various ad hoc queries in Oracle and Teradata for operational reports presented to the leadership team. Interacted with customers for requirements gathering, analysis, and end-user training.
  • Created monthly metrics for rules and profiles by running ETL jobs and presented them to senior management.
  • Created scorecards on profiles for monitoring data quality.
  • Worked with CONTROL-M for scheduling and executing Ad hoc ETL jobs.

Environment: Informatica PowerCenter v9.6.1 (Designer, Workflow Manager, Workflow Monitor, Repository Manager), Informatica Data Quality v9.6.x, Ab Initio MDR Tool v mh-3.1.2.4, Teradata, Oracle 11g, SQL Server 2005, COGNOS, SQL Developer, UNIX, QlikView, BMC CONTROL-M, Informatica Analyst, Informatica MDM, Informatica PowerExchange.
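
A minimal sketch of the two-step SCD Type 2 pattern referenced above (expire the changed current row, then insert the new version). The dim_customer/stg_customer tables, their columns, and the sequence are illustrative assumptions, not the project's actual schema.

  -- Step 1: close out current rows whose tracked attributes changed in staging.
  UPDATE dim_customer d
  SET    d.current_flag = 'N',
         d.end_date     = TRUNC(SYSDATE) - 1
  WHERE  d.current_flag = 'Y'
  AND    EXISTS (SELECT 1
                 FROM   stg_customer s
                 WHERE  s.customer_id = d.customer_id
                 AND   (s.address <> d.address OR s.status <> d.status));

  -- Step 2: insert a fresh current row for new customers and changed customers.
  INSERT INTO dim_customer
         (customer_key, customer_id, address, status,
          start_date, end_date, current_flag)
  SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.address, s.status,
         TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
  FROM   stg_customer s
  WHERE  NOT EXISTS (SELECT 1
                     FROM   dim_customer d
                     WHERE  d.customer_id  = s.customer_id
                     AND    d.current_flag = 'Y');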

Confidential, New York, NY

Data warehouse \ ETL Developer

Responsibilities:

  • Worked with Business users to gather the requirements.
  • Created LLD/HLD documents and mapping documents.
  • Involved in designing the project data flow.
  • Developed complex SQL queries and views using Oracle's SYS_CONNECT_BY_PATH to build subscription paths (see the hierarchical query sketch after this section).
  • Worked with Architect to design the ETL architecture for flexible business analysis.
  • Created mappings/sessions/workflows to move data from various source systems to reporting data warehouse.
  • Designed and developed the exception process using Oracle stored procedures for Unbundling crosswords and Confidential reporting projects.
  • Involved in the Unit testing and System testing.
  • Developed triggers for logging deletions in the production database.
  • Tuned existing mappings/sessions for better performance.
  • Executed complex queries in Oracle database for real time testing.
  • Developed SCD Type I, Type II, and Type III mappings in Informatica to load the data.
  • Created VISIO diagrams to depict Informatica Workflows.
  • Exposure to software development methodologies such as Scrum and Agile.

Environment: Informatica PowerCenter v8.6.1 (Designer, Workflow Manager, Workflow Monitor, Repository Manager), Oracle 11g, SQL Server 2005, Business Objects XI R2, Erwin 3.x/4.x, Toad for Oracle 10.6.13, UNIX, SQL Developer (3.1.07).
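
A minimal sketch of the SYS_CONNECT_BY_PATH usage referenced above; the subscription table, its columns, and the path separator are illustrative assumptions.

  -- Walk the parent/child subscription hierarchy and build the full path.
  SELECT subscription_id,
         SYS_CONNECT_BY_PATH(subscription_name, ' > ') AS subscription_path,
         LEVEL                                         AS depth
  FROM   subscription
  START WITH parent_id IS NULL
  CONNECT BY PRIOR subscription_id = parent_id
  ORDER SIBLINGS BY subscription_name;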

Confidential, Piscataway, NJ

Lead-Data Quality

Responsibilities:

  • Determined data quality standards for the organization and ensured that processed data adhered to them.
  • Ensured that the data complied with user needs and expectations regarding quality and authenticity.
  • Formulated policies and procedures for data management, processing, and quality assessment.
  • Utilized the Informatica toolset (Informatica Data Explorer and Informatica Data Quality) to profile and analyze data.
  • Documented cleansing rules discovered during data cleansing and profiling.

Environment: Informatica PowerCenter v9.0.1 (Designer, Workflow Manager, Workflow Monitor, Repository Manager), Informatica Data Quality (IDQ) v9.0, Microsoft SQL Server 2005, Informatica Metadata Manager, IBM DB/2 UDB, Informatica Data Explorer.

Confidential, West Trenton, NJ

Senior Informatica Developer\Architect

Responsibilities:

  • Worked with the Informatica Data Quality tool for address standardization, name parsing, and party matching.
  • Created data maps in Informatica PowerExchange (Striva) to pull data from mainframe sources.
  • Worked with Informatica technical support to resolve upgrade issues.
  • Created mappings with Informatica First Logic IQ Link Match/Consolidate, DataRight IQ, and ACE job files.
  • Re-architected and developed the existing application with Business Objects XI R2 (Data Quality 11.7), replacing First Logic.
  • Created complex matching rules in Informatica IDQ to consolidate customer data using SSN, name, address, and DOB information.
  • Created various mappings to capture Exception data to generate exception reports for users.
  • Extensively used Lookup, Aggregator, Normalizer, Expression, and Filter transformations in various mappings.
  • Tuned mappings/sessions for better performance.
  • Managed Developer privileges for Informatica folders.
  • Used deployment groups to migrate the code.
  • Worked on Informatica Association matching to implement the rule of transitivity.
  • Interacted with Business users and Analysts for new requirements.
  • Involved in writing complex queries to support ad hoc user requests for data.
  • Developed SQL stored procedures to replace Informatica Sequence Generators (see the sketch after this section).
  • Created XML targets using the Filename column.
  • Extensively used Informatica debugger to analyze the issues in the application.
  • Worked with XML sources to load data into SQL Server tables.
  • Involved in upgrading the application from Informatica version 6.2.2 to version 8.1.1.
  • Verified load status by accessing repository tables and communicated any issues to users.
  • Worked closely with the data architect to design the application flow.
  • Developed UNIX shell scripts for workflow execution.
  • Trained and provided guidance to junior developers.
  • Created reports in Informatica Power Analyzer.

Environment: Informatica PowerCenter v9.0.1 (Designer, Workflow Manager, Workflow Monitor, Repository Manager), Informatica PowerCenter v8.6.1, Informatica PowerExchange 9.0 (Striva), Business Objects XI R2 (Data Quality tool 11.7), Informatica First Logic IQ Link (ACE Views 7.20c, DataRight IQ Views 7.10c, Match/Consolidate Views 7.30c), Informatica Data Quality (IDQ) v9.0, Informatica PDM, IBM QualityStage, Oracle 11g, Informatica Metadata Reporter, PowerAnalyzer, UDB Workbench, Oracle 10g, Microsoft SQL Server 2005, SQL Server Reporting Services, Red Gate SQL Compare, Quest Central for DB2 v5.2, UNIX, Tivoli, Erwin.
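
A minimal sketch of replacing an Informatica Sequence Generator with a database sequence exposed through a stored procedure, as referenced above; the object names are illustrative assumptions.

  -- Database sequence plus a thin wrapper procedure the ETL calls for surrogate keys.
  CREATE SEQUENCE party_key_seq START WITH 1 INCREMENT BY 1 CACHE 1000;

  CREATE OR REPLACE PROCEDURE get_next_party_key (p_key OUT NUMBER) AS
  BEGIN
      SELECT party_key_seq.NEXTVAL INTO p_key FROM dual;
  END get_next_party_key;
  /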
