Senior Data Quality Engineer Resume

Miami, Florida

SUMMARY

  • Over 10 years of IT experience in Quality Assurance for ETL, backend, web-based, and client/server applications using ETL, manual, and automated testing tools, including cloud computing platforms such as AWS (Amazon Web Services) and Google Cloud Platform (GCP)
  • Around 9 years of experience in data analysis and in supporting data extraction, transformation, and loading processes in corporate-wide ETL data warehouse systems
  • Expert in testing the cubes in various subject areas and report dashboards for data analytics
  • Proficient in gathering and analyzing Business Requirements and documenting System Requirement Specifications, Functional Requirement Specifications, and the Requirements Traceability Matrix
  • Experience in Travel, Insurance, Healthcare, Retail and Public Services applications
  • Experience in Defining Testing Methodologies; Test Estimation, Requirements Review, creating Test Plans and Test Cases, Verifying and Validating Application Software and Documentation based on standards for Software Development and effective QA implementation in all phases of Software Development Life Cycle (SDLC)
  • Strong working experience in validating DSS (Decision Support System) applications and the Extraction, Transformation and Load (ETL) of data from legacy systems using SSIS, Informatica, DataStage, and Pentaho, and cloud data warehousing tools such as Snowflake for data analytics
  • Hands on experience on programming languages such as C# and Python
  • Wrote test cases to test applications and tracked defects in Quality Center, JIRA, and Rational ClearQuest
  • Expertise in understanding Business Requirements and Functional Requirements and decomposing HLDs and ETL specifications into test cases for positive and negative test conditions
  • Experience in testing and writing SQL statements
  • Expertise in multiple testing techniques including functional, regression, integration, system, parallel, database, performance, smoke, and user acceptance testing
  • Experience in Dimensional Data Modeling using Star and Snowflake schemas
  • Expertise in using data-centric testing for data migration
  • Extensive experience in testing and implementing Extraction, Transformation and Loading of data from multiple sources into Data warehouse
  • Used Visual Studio Test Professional to configure or record manual tests
  • Solid Back End Testing experience by writing and executing SQL Queries
  • Involved in Developing Data Marts for specific Business aspects like Marketing & Finance.
  • Involved in working on reporting services such as SSRS, Business Objects, and Crystal Reports, SQL Server Analysis Services (SSAS), Jasper, and Dundas BI
  • Performed backend testing for Database integrity by executing complex PL/SQL queries for validating the data in the backend database tables
  • Experience using query tools for PostgreSQL, MS SQL Server, Oracle, and DB2 to validate reports and troubleshoot data quality issues
  • Expertise in working in Agile (Scrum), Waterfall, Spiral methodologies
  • Involved in using Agile management tools such as Mingle and JIRA
  • Expertise in version control tools such as Subversion, TFS, and Git
  • Involved in standalone and distributed GUI-based applications and SOA Web Services
  • Experienced in executing UNIX shell scripts for monitoring batch jobs
  • Involved in testing both .NET and web based components
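The back-end SQL testing described above can be sketched as a minimal source-to-target validation. This is an illustrative sketch only: SQLite stands in for the actual RDBMS, and the table names (`src_orders`, `tgt_orders`) are hypothetical.

```python
import sqlite3

# Hypothetical source (staging) and target (warehouse) tables for illustration.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (order_id INTEGER, amount REAL);
    CREATE TABLE tgt_orders (order_id INTEGER, amount REAL);
    INSERT INTO src_orders VALUES (1, 10.0), (2, 20.5), (3, 30.0);
    INSERT INTO tgt_orders VALUES (1, 10.0), (2, 20.5), (3, 30.0);
""")

def counts_match(conn, src, tgt):
    """Basic ETL completeness check: source and target row counts agree."""
    (n_src,) = conn.execute(f"SELECT COUNT(*) FROM {src}").fetchone()
    (n_tgt,) = conn.execute(f"SELECT COUNT(*) FROM {tgt}").fetchone()
    return n_src == n_tgt

print(counts_match(conn, "src_orders", "tgt_orders"))  # True when the load is complete
```

A row-count check like this is usually the first gate before deeper column-level and transformation-rule validation.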

TECHNICAL SKILLS

ETL Tools: Pentaho, SSIS, Informatica

Testing Tools: QTP, Mercury Quality Center, JIRA, Zephyr, ALM, Rational ClearQuest, Microsoft Test Manager 2012, Team Foundation Server 2010/2012, Cucumber

Microsoft Office: Word, Excel, PowerPoint, Office Timeline

BI/DWH Tools: Jasper, Business Objects, SSRS, SSAS, Snowflake, Dundas

Operating Systems: Windows 7, Windows XP, Windows NT, Windows 95/98/2000, OS

Cloud Environment: Amazon Web Services (AWS), Google Cloud Platform (GCP)

RDBMS: PostgreSQL, Oracle 8.1/9i/10g/11g, PL/SQL, SQL Server 2000/2005/2008/2010, Azure, DB2, Siebel, SQL*Plus, SQL*Loader, Rapid SQL

Scheduling: Autosys, Tidal, Opcon

Programming Languages: .Net, C#, Python

Repository Tools: GitHub

Environment: UNIX, MVS, HP-UX, IBM AIX 4.2/4.3, Novell NetWare, Win 3.x/95/98, NT 4.0, Sun-Ultra, Sun-Spark, MS Visual Studio 2010 Professional, Visual Studio Test Professional 2010, Sun Classic, and SCO

PROFESSIONAL EXPERIENCE

Confidential, Miami, Florida

Senior Data Quality Engineer

Responsibilities:

  • Identifying the business requirements for multiple projects in each release and participated in meetings with SME and Business Analysts.
  • Testing different kinds of analytics and advanced analytics report data for the analytics app of CareCloud healthcare SaaS clients
  • Writing advanced SQL scripts to test different report dashboards in data analytics projects
  • Creating test cases, action words, and test scenarios in the Cucumber tool
  • Performed system testing, regression testing, integration testing and documented results using Quality Center.
  • Creating and monitoring automated jobs using the Pentaho tool, validating data between the source and target data warehouse, and reporting issues to developers
  • Running the cubes and testing data for different subject areas of health care providers and clients
  • Participated in fact and dimension table implementation in Star Schema model based on requirements.
  • Tested various types of cubes and MDM data for different subject areas such as appointments, encounters, health claims, and denials, writing MDX queries and views in the Pentaho tool
  • Conducted various management activities by analyzing and verifying test results, providing status reports
  • Analyzed and validated different type of data related to claims, remittance requests, patient and appointments
  • Expertise in understanding and testing MIPS-Advanced Care Information transitional measures and Eligible Professional measures
  • Analyzing graphs and data generated from key performance indicators and revenue and practice performance for daily, monthly, and yearly data of different clients and matching them against the data source
  • Troubleshooting the issues raised by the different clients and responding or creating defects accordingly
  • Created check-and-balance daily, monthly, and yearly jobs to validate data from sources for different DWH fact and dimension tables
  • Involved in creating pie charts, bar charts and various graphical representation of visualized data using Dundas BI tool
  • Involved in testing data generated from cubes into data warehouse tables and reports
  • Effectively communicating with developers and other QA engineers and understand effectively how the business works with the associated data
  • Identifying duplicate records and invalid orphan keys in the target data warehouse before data gets processed
  • Providing reports in excel to the sales and marketing team requested based on business needs
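The duplicate-record and orphan-key checks described above can be sketched with two standard SQL patterns: a GROUP BY/HAVING query for duplicates and a LEFT JOIN with a NULL filter for orphan foreign keys. SQLite and the fact/dimension table names here are hypothetical stand-ins.

```python
import sqlite3

# Illustrative fact/dimension tables; names are hypothetical, not from the resume.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_patient (patient_key INTEGER PRIMARY KEY);
    CREATE TABLE fact_appointment (appt_id INTEGER, patient_key INTEGER);
    INSERT INTO dim_patient VALUES (1), (2);
    INSERT INTO fact_appointment VALUES (100, 1), (101, 1), (101, 1), (102, 9);
""")

# Duplicate records in the fact table (same natural key loaded twice).
dupes = conn.execute("""
    SELECT appt_id, patient_key, COUNT(*) AS n
    FROM fact_appointment
    GROUP BY appt_id, patient_key
    HAVING COUNT(*) > 1
""").fetchall()

# Orphan foreign keys: fact rows whose patient_key has no dimension row.
orphans = conn.execute("""
    SELECT f.appt_id, f.patient_key
    FROM fact_appointment f
    LEFT JOIN dim_patient d ON d.patient_key = f.patient_key
    WHERE d.patient_key IS NULL
""").fetchall()

print(dupes)    # [(101, 1, 2)]  -> appt 101 was loaded twice
print(orphans)  # [(102, 9)]     -> patient_key 9 has no dimension row
```

Running these checks before the data is processed downstream, as the bullet describes, prevents bad joins from silently inflating or dropping report rows.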

Environment: Pentaho 5.0, PostgreSQL 9.4, GitHub, JIRA, Zephyr, T-SQL, SQL, PL/SQL, Unix, Charles, Dundas, Snowflake, OpCon Scheduling, Windows, OS, MS Office, Visio, MS Excel, AWS, GCP

Confidential

Senior ETL Data Quality Analyst

Responsibilities:

  • Understanding technical specifications and business requirements and translating them into test artifacts
  • Performed all aspects of verification, validation including functional, structural, regression, load and system testing
  • Writing complex SQL queries for data validation for verifying the ETL Mapping Rules
  • Created weekly status reports and defect tracking dashboard reports to the management team
  • Worked on different projects related to gratuity, pricing, promotion and guest operation projects
  • Presented DWH testing roadmap timelines to the management team using the PowerPoint Office Timeline tool
  • Used VLOOKUP and HLOOKUP in Excel to compare different sets of data extracted from CRM and revenue management data warehouse systems
  • Worked with BI team to extract the data generated from Hyperion Tools and analyze the data necessary for performing Regression scenarios and resolve issues in production that impacted business
  • Created Test input requirements and prepared the test data for Data Driven testing
  • Collected requirements and tested several business reports
  • Used SQL tools to run SQL queries and validate the data loaded in to the target tables
  • Validated the data of reports by writing SQL queries in PL/SQL Developer against ODS
  • Involved in user training sessions and assisting in UAT (User Acceptance Testing).
  • Developed advanced SQL queries to extract, manipulate, and/or calculate information to fulfill data and reporting requirements, including identifying the tables and columns from which data is extracted
  • Interacting with senior peers or subject matter experts to learn more about the data
  • Extensively written test scripts for back-end validations
  • Created UNIX scripts for file transfer and file manipulation
  • Created ETL execution scripts for automating jobs.
  • Performed extensive DATA validation using SQL queries and back-end testing
  • Tested the different sources such as Flat files, Main Frame Legacy Flat Files to load into the Oracle data warehouse
  • Used workflow manager for session management, database connection management and scheduling of jobs.
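Verifying an ETL mapping rule with SQL, as the bullets above describe, typically means re-applying the documented transformation to the source and diffing it against the target with a set operator. This sketch uses SQLite (which spells set difference EXCEPT, where Oracle uses MINUS); the customer tables and the UPPER(TRIM(...)) rule are hypothetical examples.

```python
import sqlite3

# Hypothetical mapping rule under test: target.name = UPPER(TRIM(source.name)).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_customer (cust_id INTEGER, name TEXT);
    CREATE TABLE tgt_customer (cust_id INTEGER, name TEXT);
    INSERT INTO src_customer VALUES (1, '  alice '), (2, 'bob');
    INSERT INTO tgt_customer VALUES (1, 'ALICE'), (2, 'Bob');
""")

# Rows where the loaded target value violates the mapping rule.
# SQLite uses EXCEPT for set difference; Oracle's equivalent is MINUS.
mismatches = conn.execute("""
    SELECT cust_id, UPPER(TRIM(name)) FROM src_customer
    EXCEPT
    SELECT cust_id, name FROM tgt_customer
""").fetchall()

print(mismatches)  # cust_id 2 was loaded as 'Bob' instead of 'BOB'
```

An empty result means every target row matches the rule; any surviving rows are candidate defects to log against the mapping.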

Environment: Informatica PowerCenter 9.6.1, Toad, Oracle 10g, Quality Center 10.0, T-SQL, SQL, PL/SQL, Unix, OpCon Scheduling, Windows 7, MS Office, Visio, MS Project Timeline, MS Visual Studio 2012 Professional, PowerPoint Office Timeline, Visual Studio Test Professional 2012, Microsoft Test Manager, Team Foundation Server, Rational ClearQuest

Confidential

Test Lead

Responsibilities:

  • Involved in writing test strategies, cases and scripts based on functional and business requirements
  • Proactively worked with the Networking and Server teams to resolve testing issues
  • Developed technical documentation and giving presentations to diverse types of audiences like technical staff, business users and management
  • Worked on Agile management tools like TFS work bench and enterprise software to track the story and product delivery requirements
  • Involved in test planning, writing test cases and defect tracking using Microsoft Test Management tool
  • Worked on different projects like Customer Payments, Payment services, Canada Insurance Import/Extract, PTMS, Quick Fund/Net Fund dealer payments and different cloverleaf Data warehousing ODS, dimension and fact tables
  • Involved in writing SQL queries using the Rapid SQL programming tool, which works across platforms including Oracle, Microsoft SQL Server, and Sybase
  • Extensively used Informatica PowerCenter 9.5.1 to load data sourcing from Source to Target databases
  • Expertise in executing Informatica workflows and testing the output based on the requirement criteria
  • Experience in handling data coming from various sources like Relational, XML and flat files
  • Tested the service contract for each release and the mode of data that goes through XML
  • Worked on Visual studio test case manager to execute test results and post it to the team foundation server
  • Worked on web services testing using the service endpoints and based on the passed parameters
  • Validated data flow from source through to fact and dimension tables using complex queries (left outer joins, subqueries, etc.)
  • Involved in testing the BizTalk applications hitting the web services for validating data coming from different sources such as relational, XML, and flat files
  • Used UNIX shell scripts and compared the files in .csv format to make sure the extracted data is correct.
  • Coordinated with developers in performance tuning of mappings and sessions
  • Documented dependencies of workflows and load process along with logic in specification documents and test results
  • Wrote complex SQL queries to validate EDW data versus EDM source data including identification of duplicate records and quality of data based on Mapping/Transformation rules
  • Involved in peer review of test cases, Test Pool meetings, Impact analysis and test estimate meetings
  • Used Microsoft Test Manager and TFS for defect tracking and reporting
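The .csv extract comparison mentioned above can be sketched as an order-insensitive diff of two files, reporting rows missing from the extract and rows that should not be there. The file contents and column names here are invented for illustration.

```python
import csv
import io

# Hypothetical expected vs. extracted .csv content (stand-ins for real files).
expected = "id,amount\n1,10\n2,20\n3,30\n"
extracted = "id,amount\n1,10\n3,30\n4,40\n"

def rows(text):
    """Parse CSV text into a set of row tuples for order-insensitive comparison."""
    return set(map(tuple, csv.reader(io.StringIO(text))))

missing = rows(expected) - rows(extracted)   # rows lost by the extract
extra = rows(extracted) - rows(expected)     # rows the extract should not contain

print(sorted(missing))  # [('2', '20')]
print(sorted(extra))    # [('4', '40')]
```

For large files the same check is often done in shell with `sort` and `diff`, but the set-difference logic is identical.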

Environment: Oracle SQL/PL-SQL, Rapid SQL, MS SQL Server 2008/2010, Windows 7, MS Office, Visio, MS Visual Studio 2012 Professional, Visual Studio Test Professional 2012, Microsoft Test Manager, XML, CSV files, Subversion, Informatica PowerCenter 9.5.1, Unix, PuTTY, SoapUI 5.0

Confidential

Quality Assurance Analyst

Responsibilities:

  • Using the Agile project management tools Mingle and TFS to determine the stories in BI and DI projects respectively and derive the test details
  • Preparing and Maintaining the Test strategy, test design, test cases, and traceability for applications under test
  • Involved in the preparation of Technical design documents, Source to target (S2T) document.
  • Worked in various projects like PunchCard, GiftCard, Inventory, Customer Platform Service (CPS), BI/DI data warehouse enhancements
  • Involved in understanding the logic behind the stored procedure and running package from SQL Server Management Studio
  • Create and execute manual and automated tests, document testing practices, and communicate results/recommendations to team
  • Involved in testing of the ETL solutions developed using SQL Server Integration Services (SSIS) 2005 / 2008 and SQL / T-SQL
  • Develop and maintain the manual and automated test scripts, functions / SQL code
  • Involved in testing on Filezilla and CoreSFTP in transferring flat files from local to remote server
  • Responsible for testing of business intelligence reporting solutions developed using SQL / T-SQL and Reporting Services SSRS and SQL Server Analysis solutions SSAS
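Testing an SSRS-style reporting solution, as described above, often comes down to recomputing a report aggregate directly from the detail table and diffing it against the published summary. This is a sketch under assumed names (`sales_detail`, `report_summary`), with SQLite standing in for SQL Server.

```python
import sqlite3

# Hypothetical detail table and the report summary built from it.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales_detail (store TEXT, amount REAL);
    CREATE TABLE report_summary (store TEXT, total REAL);
    INSERT INTO sales_detail VALUES ('A', 10), ('A', 5), ('B', 7);
    INSERT INTO report_summary VALUES ('A', 15), ('B', 7);
""")

# Recompute the aggregate from detail rows and keep only mismatching stores.
diffs = conn.execute("""
    SELECT d.store, d.total, r.total
    FROM (SELECT store, SUM(amount) AS total
          FROM sales_detail GROUP BY store) d
    JOIN report_summary r ON r.store = d.store
    WHERE d.total <> r.total
""").fetchall()

print(diffs)  # []  -> report totals reconcile with the detail rows
```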

Environment: SSIS, SSRS, SSAS, TFS, Mingle, MS SQL Server 2008/2010, SQL/PL-SQL, Unix, Windows 7, MS Office, Visio, MS Visual Studio 2010 Professional, Visual Studio Test Professional 2010, XML, CSV files, FileZilla, PuTTY, CoreSFTP, Subversion

Confidential

Datawarehouse Tester

Responsibilities:

  • Analyzed business requirements, system requirements, data mapping requirement specifications, and responsible for documenting functional requirements and supplementary requirements in Quality Center
  • Used Toad and other SQL query tools
  • Demonstrated experience in QA, SIT, and UAT testing for a large data warehouse including writing and executing test scripts
  • Tested mappings with the Design Documents and also performed testing for various sample data
  • Involved in the post implementation validation of UNO CHASE and HELOC-MSP to VLS CONVERSION projects
  • Expert in understanding ETL Packages using several transformations to perform Data profiling, Data Cleansing and Data Transformation
  • Designed and developed UNIX shell scripts as part of the ETL process, automate the process of loading, pulling the data.
  • Performed system testing, regression testing, integration testing and documented results
  • Worked with the ETL group and data analysts to understand and prepare technical specification mappings for dimensions and facts
  • Writing complex SQL queries using Case Logic, Intersect, Minus, Sub Queries, Inline Views, and Union in Oracle
  • Automating/ handling and recording scripts using QTP
  • Reviewed and tested database modifications as per Physical Data Model
  • Involved in converting Informatica code into SQL queries and generating reverse-engineered STTs
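The CASE logic mentioned in the complex-query bullet above is the standard way to band or bucket values during validation. A minimal sketch, with an invented `loans` table and balance bands, using SQLite in place of Oracle:

```python
import sqlite3

# Hypothetical table and banding rule; CASE works the same way in Oracle.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE loans (loan_id INTEGER, balance REAL);
    INSERT INTO loans VALUES (1, 500.0), (2, 15000.0), (3, 0.0);
""")

# Classify each loan into a band with CASE logic, first match wins.
banded = conn.execute("""
    SELECT loan_id,
           CASE WHEN balance = 0 THEN 'closed'
                WHEN balance < 1000 THEN 'small'
                ELSE 'large' END AS band
    FROM loans
    ORDER BY loan_id
""").fetchall()

print(banded)  # [(1, 'small'), (2, 'large'), (3, 'closed')]
```

The same query shape, combined with INTERSECT/MINUS and inline views as listed above, lets a tester assert that every row lands in exactly the band the mapping document specifies.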

Environment: Informatica 8.6.1, Quality Center 10.0, Oracle 9i/10g, TOAD, SQL/PL-SQL, Unix, PuTTY, Business Objects, Windows 2000/XP, MS Office, Visio, XML, CSV files, .NET, QTP

Confidential

Programmer Analyst

Responsibilities:

  • Involved in all phases of Software Development Life cycle from software requirements such as Analysis phase to design, Development, Integration, Regression, Functionality and Usability Testing and Maintenance
  • Developed programming solutions according to technical specifications while adhering to policies and standards for the integrity and safety of data
  • Designed Windows forms using Data Grid, Validation, Login Controls, User Controls
  • Designed Custom classes for Data Validations
  • Used Atalasoft controls for building capture applications, document processing and .Net Imaging
  • Created XML files for each document, converted them into TIFF, and grouped them into PDF
  • Involved in creating Zip files using PGP Encryption
  • Worked with the ETL group to understand mappings for dimensions and facts
  • Performed testing for integration, functional and unit testing of standalone and distributed GUI-based applications and SOA Web Services
  • Extracted data from various sources like flat files and SQL Server
  • Design new systems or enhancements to existing systems, modify, code, debug, test, and document moderately complex application systems

Environment: .NET Framework 3.5, C#, VB.NET, testing tools, Atalasoft DotImage, Visual Studio 2008/10, SQL Server 2005/2008, XML, MS Visio, Subversion, SSRS, Informatica 9.1