
Senior Data Quality Analyst, Consultant Resume

New York, NY

PROFESSIONAL SUMMARY

  • Data analyst with eight years of experience in Data Quality, Data Warehousing/Business Intelligence and Data Integration, complemented by MBA/MS in IT. Additional strengths include:
  • Experienced with Basel Committee on Banking Supervision (BCBS) 239, Anti-Money Laundering (AML) and Sanctions Screening
  • Proficient at reviewing data using Information Analyzer, Informatica Data Quality, Oracle, Teradata, MS SQL Server, UNIX
  • Experienced with Customer, Transaction and Account data domains
  • Experienced with completeness, validity, inconsistency and uniqueness data quality dimensions
  • Ability to recommend enhancements to data quality requirements provided by senior business management
  • Reviewed data lineage in Informatica and Excel
  • Skilled at ETL source to target mapping for OLTP and DW
  • Competent at authoring scripts using languages such as SQL, PL/SQL, XML and Regular Expressions
  • Understand conceptual, logical and physical models of ERDs, Star and Snowflake schemas for OLTP and DW
  • Ability to prioritize criticality of data anomalies
  • Strong data analysis and problem-solving skills
  • Ability to analyze and interpret trends or patterns in complex data sets
  • Experienced in translating complex concepts into easily understood diagrams
  • Strong oral and written communication skills and interpersonal skills
  • Strong ability to manage time and prioritize multiple requests
  • Ability to work independently, within teams, and with distributed onshore and offshore teams
  • Ability to work across peer groups to resolve technical/design issues and work effectively in a matrix environment

TECHNICAL SKILLS

Data Quality Tool: InfoSphere Information Analyzer 11.3, Informatica Data Quality 9.1

Operating System: UNIX, Windows 7/10

ETL Tool: Informatica 9.1

Language: SQL, Regular Expressions, XML, Java

Database/GUI: Oracle 10g, Toad for MS SQL Server 6/Oracle 10, Teradata Studio 14, MS Access

Others: IBM Rational Team Concert, Notepad++, Jive, MS Visio, MS Project 2003/2007, MS Excel

PROFESSIONAL EXPERIENCE

Confidential, New York, NY

Senior Data Quality Analyst, Consultant

Responsibilities:

  • Recommended that senior management add a new consistency requirement to enhance data quality assessment of Critical Data Elements (CDEs), which improved quality of financial crime risk reporting
  • Recommended that senior business management enhance the validity requirement of the Phone Number CDE, which identified 60% more data quality issues and improved quality of financial crime risk reporting
  • Created completeness and validity checks on the source system for BCBS 239, which improved financial crime risk reporting
  • Created consistency checks between the source system and NORKOM/Oracle Watchlist Screening (OWS) for AML and Sanctions
  • Reviewed the data lineage from the source system to NORKOM and OWS in Excel for AML and Sanctions
  • Corrected the data lineage for CDEs in Collibra in consultation with the CDO lead, which made the decision-making process more efficient
  • Created data mapping for the ETL post-process to summarize detailed assessment results, which increased efficiency by 30%
  • Created a SIPOC workflow process to identify impacted parties, improving the productivity of the distributed team
  • Reviewed the data quality requirements on Collibra for BCBS and Global Standards Initiatives with SMEs
  • Engaged distributed development team to get Customer and Transaction Domain source data loaded in UNIX environment
  • Profiled data from multiple sources for cardinality, null values, max/min values and inferred datatype using Information Analyzer
  • Reviewed potential data anomalies with SMEs to determine whether they represent true data flaws
  • Prioritized the criticality of anomalies in preparation for defining data quality metrics
  • Created a rule definition template for the completeness, validity, inconsistency and uniqueness data quality dimensions, specifying data rules in IBM Information Analyzer using SQL-like expressions and Regular Expressions (an illustrative rule sketch follows this list)
  • Reviewed model for DW to create required columns in assessment output
  • Validated data rules against the data quality requirements for target population, source column and data rule
  • Reviewed the initial summary metrics results with business for User Acceptance Testing (UAT)
  • Generated detailed data quality exception reports for completeness, validity and inconsistency for the QlikView visualization tool
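
Illustrative only: the completeness and validity rules above were authored as Information Analyzer rule definitions; the SQL below is a simplified equivalent sketch, with hypothetical table and column names (customer_stg, cust_id, phone_number) and an Oracle-style REGEXP_LIKE assumed for the pattern check.

    -- Hypothetical completeness/validity check for a Phone Number CDE.
    -- 'customer_stg' and its columns are illustrative names, not the actual source objects.
    SELECT cust_id,
           phone_number,
           CASE
             WHEN phone_number IS NULL OR TRIM(phone_number) = '' THEN 'FAIL_COMPLETENESS'
             WHEN NOT REGEXP_LIKE(phone_number, '^[0-9]{10}$')    THEN 'FAIL_VALIDITY'
             ELSE 'PASS'
           END AS dq_result
    FROM   customer_stg;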

Environment: Information Analyzer 11.3/11.5, SQL-like expression, UNIX, Regular Expressions, XML, Collibra, Excel Macros, DW

Confidential, Hillsboro, OR

Senior Data Quality Analyst, Consultant

Responsibilities:

  • Created source-to-target mapping for the ETL pre-process for SQL data quality reports, which increased efficiency by 50%
  • Tuned SQL for a data quality report using an indexed column in Teradata, which increased efficiency by 35%
  • Worked with the business to understand requirements for enhancing data quality of Sales and Marketing reports
  • Profiled data sets for cardinality, number of null values and max/min values using Excel and Teradata Studio (a sample profiling query follows this list)
  • Checked completeness, validity and consistency dimensions on high-volume target data sets using database tools such as Teradata and Excel and languages such as Regular Expressions and SQL
  • Traced the data quality issues between source and target
  • Recommended improvement to data quality requirements after analyzing the data
  • Validated SQL data quality report output against data quality requirement to refine SQL logic
  • Created exception report using SQL in Teradata and Excel for business
  • Performed slicing/dicing of data using Pivot tables and charts
  • Performed source to target mapping of data elements for ETL process to pre-process data set
  • Created Data Flow Diagram using Visio
  • Created conceptual, logical and physical model of ERD using Visio for OLTP Database
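
Illustrative only: a minimal sketch of the kind of column profiling described above, written here against a hypothetical sales_detail table and order_amt column; the actual profiling was performed in Teradata Studio and Excel.

    -- Hypothetical profiling query: row count, cardinality, null count and min/max for one column.
    SELECT COUNT(*)                                           AS row_cnt,
           COUNT(DISTINCT order_amt)                          AS cardinality,
           SUM(CASE WHEN order_amt IS NULL THEN 1 ELSE 0 END) AS null_cnt,
           MIN(order_amt)                                     AS min_val,
           MAX(order_amt)                                     AS max_val
    FROM   sales_detail;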

Environment: Teradata Studio 14, SQL, Jive, Visio, Regular Expressions

Confidential, Smithfield, RI

Systems Data Analyst

Responsibilities:

  • Reduced data discrepancy from 9% to 0.07% through data analysis and remediation, improving the quality of the sales report
  • Profiled data set for cardinality, number of null values, max/min values and inferred datatypes using Informatica Data Quality
  • Checked for completeness, validity and consistency dimensions on target data using SQL Server, Toad, Oracle and SQL
  • Created SQL and PL/SQL stored procedures for data quality checks and remediation (a simplified sketch follows this list)
  • Created a data quality exception report for business reporting
  • Documented data quality issues in Issue Management Tool called Rational Team Concert (RTC)
  • Identified and resolved technical issues in production support using Informatica (ETL), Java, UNIX and RTC
  • Interviewed sales and marketing business partners to understand requirements for OLTP, Data Warehouse and ETL systems
  • Reviewed conceptual, logical and physical models of ERDs, Star Schema and Snowflake Schema for OLTP and DW
  • Documented results in PPI (Project Charter) and SRA (Business and Systems Requirements)
  • Diagramed business processes, system ETL workflows and data flows using Visio
  • Mapped from source to target for the ETL process for OLTP and Data Warehouse
  • Documented functional design of ETL and reports in Systems Delivery Specification (SDS)
  • Worked across global cross-functional teams following waterfall and agile development lifecycles
  • Established the project plan and assigned tasks to junior team members
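
Illustrative only: a minimal SQL sketch of a check-and-remediate step of the kind described above; the table and column names (customer_dim, state_cd) are hypothetical, REGEXP_LIKE assumes the Oracle 10g environment, and the production logic was packaged in PL/SQL stored procedures rather than ad hoc statements.

    -- Hypothetical validity check: count state codes that are not two uppercase letters.
    SELECT COUNT(*) AS invalid_state_cnt
    FROM   customer_dim
    WHERE  NOT REGEXP_LIKE(state_cd, '^[A-Z]{2}$');

    -- Hypothetical remediation: standardize casing and whitespace, then re-run the check.
    UPDATE customer_dim
    SET    state_cd = UPPER(TRIM(state_cd))
    WHERE  state_cd <> UPPER(TRIM(state_cd));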

Environment: Oracle 10g, MS SQL Server 2005, Toad for Oracle 10, Informatica Data Quality 9.1, Jive, MS Visio, MS Project 2003, IBM Rational Team Concert(RTC), SQL, UNIX, PL/SQL, Java, Waterfall, Agile

Confidential, Burlington, MA

Business Data Analyst

Accomplishments and Responsibilities:

  • Successfully completed 2 projects involving requirements gathering, data modeling, data integration and process modeling
  • Interviewed and documented objectives, information requirements such as call scripts and data specifications for Interactive Voice Response (IVR) Call Program in Requirements Document
  • Built Call Flow Diagram (CFD) in Visio based on requirements
  • Built dialogue based on CFD by attaching call scripts and audio components using ACUMEN
  • Performed validity and consistency quality assurance of voice and script components of dialogue
  • Tested Outbound and Inbound portions of calls to verify they were operational
  • Validated Silverlink Interaction Results File (SIRF) from test call
  • Mapped client attributes to Silverlink attributes using SSIS
  • Transformed client data into ACUMEN format using ETL to integrate data
  • Cleansed SIRF files containing multiple records per primary key by ranking on a designated attribute (a de-duplication sketch follows this list)
  • Created conceptual, logical and physical models for the data design of the Corrective-And-Preventive-Actions (CAPA) application in MS Access 2007 to store results of Design Document Review
  • Troubleshot technology and data issues using tools such as Microsoft SQL Server and SQL
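
Illustrative only: a minimal SQL Server-style sketch of the de-duplication pattern described above, keeping one record per primary key based on a ranking attribute; the table and column names (sirf_stage, record_id, call_status_rank) are hypothetical.

    -- Hypothetical de-duplication: keep the highest-ranked record for each primary key.
    WITH ranked AS (
        SELECT record_id,
               call_status_rank,
               ROW_NUMBER() OVER (PARTITION BY record_id
                                  ORDER BY call_status_rank DESC) AS rn
        FROM   sirf_stage
    )
    DELETE FROM ranked
    WHERE  rn > 1;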

Environment: MS SQL Server 2005, Process Modeling (Visio and ACUMEN Dialog Editor), MS Access 2007, SSIS, Conceptual Data Model, Logical Data Model, Physical Data Model
