
Data Quality Analyst Resume


Overland Park, KS

SUMMARY

  • Software Quality Professional with 11 years of experience in Functional testing on web-based, Client/Server and data warehousing applications across the business domains of Banking, Retail and Telecom.
  • Good experience in testing as ETL/Functional Test Lead, UAT Coordinator, Sr. Test Analyst and Data Analyst; created test deliverables such as Test Plan, Test Strategy, Defect Reporting, Project Metrics and Test Closure.
  • Extensive experience in ETL/ Data warehouse backend testing and BI reports testing.
  • Experience in testing Cognos, Tableau and Salesforce reports.
  • Extensive experience in Functional, Regression, Integration, Reports, UAT, Black Box and GUI Testing.
  • Extensive experience with ETL tools (Ab Initio/DataStage) and databases (Teradata, Oracle, SQL Server, Netezza).
  • Knowledge of creating Salesforce and Power BI reports, and of creating mapping documents using the AnalytixDS Data Mapping Tool.
  • Knowledge in Big Data Hadoop testing (Hive, Sqoop).
  • Knowledge in API Testing with SOAP UI, POSTMAN tools.
  • Experience working with JSON, XML, Swagger.
  • Knowledge of Selenium, UFT and Jenkins.
  • Good knowledge of Data warehousing concepts. Created ETL mapping rules and test data to test the functionality.
  • Experience in creating prod-like Regression environments for baseline testing; managed the test environments and databases to support SIT/UAT testing.
  • Well-versed with all stages of Software Development Life Cycle (SDLC) and Software Testing Life Cycle (STLC).
  • Strong understanding of Agile Scrum and Waterfall SDLC methodologies.
  • Experienced in analysing Business Specifications and estimating the testing effort from the requirements.
  • Client Engagements, Contract Negotiations and Resource Augmentation for Project Management.
  • Planning, scheduling, and monitoring projects.
  • Extensive experience in using HP Quality Center/ALM/JIRA/qTest for Bug Tracking and Defect Reporting.
  • Participating in Reviews for External Design Document and Functional Spec.
  • Coordinate with multiple teams for End-to-End or System Integration testing.
  • Inter-group coordination between Testing and Solutions Delivery.
  • Conducting Project-Based Milestone Reviews.
  • Mentoring team members and reviewing their work.
  • Team building and mentoring - Training and retraining.

TECHNICAL SKILLS

Testing Tools: HP Quality Center, Clear Quest, HP ALM, JIRA, qTest

Databases: Oracle, SQL Server, DB2, Teradata, Netezza

Utility Tools: TOAD 8.5, Teradata SQL Assistant 14.1, DB2 client, SQL Developer

ETL Tools: DataStage, Ab Initio 3.0.4, Talend, SSIS

Operating Systems: Windows Vista, Windows 7, Linux and UNIX

Office Tools: Microsoft Excel, Access, Outlook, PowerPoint, Word, MS Visio, Lotus Notes

Programming Languages: SQL, PL/SQL, Unix Scripting

Automation tools: Selenium

PROFESSIONAL EXPERIENCE

Confidential, Overland Park, KS

Data Quality Analyst

Responsibilities:

  • Work with the Business and Data Analyst to identify the scope of testing for each Confidential.
  • Analyse the requirements for DataPro, Bulk uploader, Salesforce Reports, ETL efforts and create test plan.
  • Create test cases for each phase of testing.
  • Understand the source xml layout and create the test data as required.
  • Create and Validate Salesforce reports as per the business requirements and the mappings provided by the Data Analyst.
  • Analyse the DataStage jobs and business requirements and create SQL queries for performing data validations.
  • Use Beyond Compare and perform the Regression Testing.
  • Coordinate with Sycamore for any defects encountered in testing.
  • Attend the daily scrum meeting, provide status, and update Jira accordingly.
  • Responsible for maintaining QA teams test artifacts in qTest and SharePoint.
  • Publish Weekly status and defect reports.

Environment: DataPro, Bulk uploader, Salesforce, DataStage, SQL, qTest, Beyond Compare
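The SQL-based data validations above can be sketched as follows. This is only a minimal illustration rebuilt on SQLite; the table names and rows are made up and stand in for the actual DataStage source and target systems.

```python
import sqlite3

# Hypothetical source/target tables standing in for the real systems.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE src_customer (id INTEGER, name TEXT, state TEXT);
    CREATE TABLE tgt_customer (id INTEGER, name TEXT, state TEXT);
    INSERT INTO src_customer VALUES (1,'Ann','KS'),(2,'Bob','MO'),(3,'Cal','KS');
    INSERT INTO tgt_customer VALUES (1,'Ann','KS'),(2,'Bob','MO'),(3,'Cal','NE');
""")

# Check 1: record counts must match after the load.
src_count = cur.execute("SELECT COUNT(*) FROM src_customer").fetchone()[0]
tgt_count = cur.execute("SELECT COUNT(*) FROM tgt_customer").fetchone()[0]
assert src_count == tgt_count

# Check 2: an EXCEPT (minus) query surfaces rows that changed in flight.
mismatches = cur.execute(
    "SELECT * FROM src_customer EXCEPT SELECT * FROM tgt_customer"
).fetchall()
print(mismatches)  # any rows here failed the load unchanged
```

The count check catches dropped or duplicated rows cheaply; the minus query then pinpoints field-level differences for defect reporting.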

Confidential - Overland Park, KS

Data Quality Analyst

Responsibilities:

  • Work with the BA, DA, development, and production support teams to identify the scope of testing for each release.
  • Create the test plan for every hop and all phases of testing, including System, System Integration, Regression and End-to-End testing.
  • Execute the DataStage jobs and validate the requirements with the provided mapping documents by creating SQL scripts.
  • Validate the data movement from Oracle and Netezza to SQL Server via the DataStage and BizTalk jobs. Use Beyond Compare to validate huge sets of files and Tableau reports.
  • Utilize the existing SSIS packages to import and export data from MS Excel, SQL Server and flat files.
  • Track the defects, clarifications and suggestions in HP ALM and provide QA sign off for all the requirements.
  • Provide defect report and project status report on a regular basis to the project management.
  • Forecast the future workload and provide estimations for intake of new resources.
  • Publish the test summary report for every release and provide support of User Acceptance Testing (UAT) and Post implementation activities.

Environment: DataStage, BizTalk, SSIS, Netezza, Oracle, Microsoft SQL Management Studio, Tableau, Hadoop, AnalytixDS Data Mapping Tool, Beyond Compare, HP ALM.
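The file-level regression checks above are the kind Beyond Compare automates; they can be sketched as a line-by-line diff. The file layout and contents below are invented for illustration, and the sketch assumes both extracts have the same number of lines.

```python
import pathlib
import tempfile

def diff_extracts(path_a, path_b):
    """Return (line_no, baseline_line, current_line) for each line that differs.

    Assumes both files have the same line count (zip stops at the shorter one).
    """
    diffs = []
    with open(path_a) as fa, open(path_b) as fb:
        for n, (la, lb) in enumerate(zip(fa, fb), start=1):
            if la.rstrip("\n") != lb.rstrip("\n"):
                diffs.append((n, la.rstrip("\n"), lb.rstrip("\n")))
    return diffs

# Usage with two throwaway pipe-delimited extracts:
tmp = pathlib.Path(tempfile.mkdtemp())
(tmp / "baseline.txt").write_text("1|Ann|KS\n2|Bob|MO\n")
(tmp / "current.txt").write_text("1|Ann|KS\n2|Bob|NE\n")
print(diff_extracts(tmp / "baseline.txt", tmp / "current.txt"))
```

Reporting the line number alongside both values makes the diff directly usable in a defect ticket.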

Confidential - Chicago, IL

QA Lead

Responsibilities:

  • Work with the BA, development and production support teams to identify the scope of regression testing for every release.
  • Review all the applications and data sources affected as part of the release, and identify the Ab Initio jobs involved in the functionality.
  • Create the Master Regression Strategy with ETL as well as functional test scenarios; schedule, store and publish the strategy with the Business and Team.
  • Create masking scripts using Ab Initio to generate test data from production files.
  • Execute the ETL batch jobs from Unix and load the respective output tables.
  • Validate the data and record counts for all the output tables, as well as the performance of the jobs.
  • Prepare and analyze the efforts savings using Impact analyzer and share with the business and project teams.
  • Represent the Regression QA team in the triage and release meetings.
  • Maintain the regression suite in ALM by coordinating with the teams to create E2E test cases and keep the suite up to date.
  • Schedule monthly meeting with the BA to add/update/modify the Regression suite.
  • Prepare Go/No Go deck, review with the business teams and present the decision to Release Manager.
  • Prepare quarterly deck for senior management’s review with the percentage of test cases automated, manual effort saved and defect density metrics.

Environment: UFT, Selenium, Oracle, Ab Initio, Unix, Teradata, SQL
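The production-data masking above was done with Ab Initio scripts; as those are proprietary, here is only an illustrative Python sketch of the underlying idea: deterministic, one-way masking of sensitive fields so production records can seed a QA environment. The field names and salt are hypothetical.

```python
import hashlib

def mask(value, salt="qa-env"):
    """One-way, deterministic mask: same input always yields the same token,
    so referential joins across masked files still line up. The salt is an
    illustrative placeholder, not a real project value."""
    digest = hashlib.sha256((salt + value).encode()).hexdigest()
    return digest[:12]

record = {"card_no": "4111111111111111", "name": "Jane Doe", "state": "IL"}
SENSITIVE = {"card_no", "name"}

# Mask only the sensitive fields; pass everything else through unchanged.
masked = {k: mask(v) if k in SENSITIVE else v for k, v in record.items()}
print(masked["state"])  # non-sensitive fields survive intact
```

Determinism is the key property: a card number masked in one file matches the same card number masked in another, which keeps cross-file regression comparisons valid.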

Confidential - Overland park, KS

E-commerce QA Lead

Responsibilities:

  • Work with the delivery leads of Ecommerce sales and analyse the requirements.
  • Attend the grooming sessions and provide test estimates in JIRA.
  • Design the test cases and get them reviewed from the business.
  • Setup the test bed and create Front End users based on the requirements.
  • Create users using Oracle as per the requirements and perform API test Execution using SOAP UI and POSTMAN.
  • Handle onsite-offshore coordination and lead the defect triage meetings.
  • Support daily Sanity testing and production testing after every release.
  • Work with the SQAR teams to perform end to end stage testing.
  • Publish the test summary report for every release and act as SPOC for production roll-out testing.

Environment: SOAP UI, POSTMAN, JIRA, Oracle, SQL
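The SOAP UI / POSTMAN test steps above boil down to assertions on a response payload. A hedged sketch of that kind of check, with a made-up JSON body and field names (no real endpoint is called here):

```python
import json

# Hypothetical response body, standing in for an actual API call's output.
response_body = '{"userId": 101, "status": "ACTIVE", "roles": ["buyer"]}'
payload = json.loads(response_body)

# Collect every violated expectation rather than failing on the first one,
# the way a Postman test step reports all failed assertions at once.
errors = []
if not isinstance(payload.get("userId"), int):
    errors.append("userId must be an integer")
if payload.get("status") not in {"ACTIVE", "INACTIVE"}:
    errors.append("unexpected status value")
if not payload.get("roles"):
    errors.append("roles must be a non-empty list")

print("PASS" if not errors else errors)
```

Collecting all failures in one run shortens the defect triage loop, since a single execution documents every broken field.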

Confidential - Chicago, IL

ETL Test Lead

Responsibilities:

  • Analysed the jobs under consideration for Migration. Identified the Input and Database dependencies.
  • Created the Test Strategy, test scenarios, reviewed with the project and application teams.
  • Co-ordinated with support teams and DBA to create a replica DB and a pre-prod environment for performance and volume testing.
  • As the input files were sensitive, created Ab Initio data masking scripts to mock the data and push it to QA for testing. Also created Ab Initio jobs for validating the output of Ab Initio vs. Talend.
  • Executed the Ab Initio and Talend jobs with the same input and compared their output. Analysed and compared job execution time, warnings, frequency, source data, ease of use, etc., and provided recommendations for query tuning based on these factors.
  • Loaded data into HDFS from external source system and Oracle DB, Teradata using Sqoop and exported the summarized data to the relational databases to perform visualization and report generation jobs.
  • Performed various data warehousing operations like de-normalization and aggregation on Hive using DML statements.
  • Reported the stats on a daily basis to the business.
  • Conducted Test result walkthroughs on a regular basis with the application team to report and discuss on the issues.
  • As multiple vendors were involved in the project, represented TCS and handled project/process related activities, from training new team members to reporting status to WAG management.

Environment: Ab Initio, Talend, JIRA, Netezza, SQL, Unix

Confidential - Chicago, IL

ETL Test Lead

Responsibilities:

  • Identified and analysed all the source systems pushing the card data into EDW. Had discussions with source teams and documented the complete understanding of the system.
  • Participated in BRD, FRD discussions. Raised static defects on the requirements.
  • Analysed the existing business and created Test Scenarios that ensure the existing functional behaviour is intact with the changes to the system. Ensured the new encrypted values are not misleading the end users.
  • Involved in preparing Test Strategy, Test Estimation, and test completion report for the projects. Prepared & Reviewed TOM, Test cases, Business scenarios for all the code releases. Conducted walkthrough sessions with business, Project management, Design and development team for the Test plan & Business scenarios.
  • Ensured the new encrypted values coexist with the existing values until all end users are ready to accept the new ones, without impacting any other projects.
  • Co-ordinated with source teams to generate the test data and verify the end-to-end flow of the data.
  • Loaded the data using Ab Initio and DataStage jobs and validated the performance on job execution times.
  • Created Ab Initio graphs for data creation and validations and performed end-to-end data validations.
  • Ensured the data at rest is properly converted to the new encrypted values.
  • Loaded data into HDFS from external source system and Oracle DB, Teradata using Sqoop and exported the summarized data to the relational databases to perform visualization and report generation jobs.
  • Involved in and supported validations for Data Ingestion - Historical and Incremental logic validation with complex business scenarios; validated data movement from source files to Teradata databases for end-to-end business logic validation.
  • Supported SIT and UAT teams executing the batches and validated the data from source to EDW for historic and incremental loads.
  • Handled data imports from various operational sources, performed transformations using Hive with MapReduce & Spark
  • Verified Pig Latin scripts to support multiple data flows involving various data transformations on input data.
  • Involved in and supported validations for Data Ingestion - Historical and Incremental logic validation with complex business scenarios; validated data movement from HDFS files to Teradata databases for end-to-end business logic validation.
  • Performed validation and standardization of raw data from XML, Salesforce and JSON files with Pig.
  • Implemented ad-hoc queries using HiveQL, created partitions to load data.
  • Verified Hive Incremental updates using four-step strategy to load incremental data from RDBMS systems.
  • Performed various data warehousing operations like de-normalization and aggregation on Hive using DML statements.

Environment: Ab Initio, DataStage, Cognos, HP ALM, JIRA, Teradata 14.1, Oracle 9.1, SQL, Unix, Hive, Sqoop, Pig, Spark.
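The "four-step strategy" for Hive incremental updates mentioned above (ingest, reconcile, compact, purge) centers on a reconcile query that unions the base and incremental data and keeps the latest row per key. A minimal sketch of that reconcile step, rebuilt on SQLite for illustration; the table and column names are hypothetical.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE base_table (id INTEGER, amount REAL, modified TEXT);
    CREATE TABLE incr_table (id INTEGER, amount REAL, modified TEXT);
    -- Base load from the prior cycle:
    INSERT INTO base_table VALUES (1, 10.0, '2019-01-01'), (2, 20.0, '2019-01-01');
    -- Incremental extract: an update to id 2 and a brand-new id 3.
    INSERT INTO incr_table VALUES (2, 25.0, '2019-01-02'), (3, 30.0, '2019-01-02');
""")

# Reconcile: union base and incremental rows, keep the latest row per key.
reconciled = cur.execute("""
    SELECT t.id, t.amount, t.modified
    FROM (SELECT * FROM base_table UNION ALL SELECT * FROM incr_table) t
    JOIN (SELECT id, MAX(modified) AS max_mod
          FROM (SELECT * FROM base_table UNION ALL SELECT * FROM incr_table)
          GROUP BY id) m
      ON t.id = m.id AND t.modified = m.max_mod
    ORDER BY t.id
""").fetchall()
print(reconciled)
```

In the Hive version this query backs a reconcile view; the compact step materializes it into the reporting table, and the purge step drops the consumed incremental partition.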

Confidential

ETL Test Lead

Responsibilities:

  • Participated in BRD, FRD discussions. Involved in preparing Test Strategy, Test Estimation, and test completion report for the projects. Prepared & Reviewed TOM, Test cases, Business scenarios for all the code releases. Conducted walkthrough sessions with business, Project management, Design and development team for the Test plan & Business scenarios.
  • Created the Regression environment, helped the team members in executing the jobs and comparing the data
  • Identified the jobs at source level and executed them using both DataStage and Ab Initio with the same set of data.
  • Performed the validations of count and Data integrity.
  • Analysed and compared the job execution time, warnings, frequency, source data, ease of use etc.
  • Provided the recommendations for performing query tuning based on the above factors.
  • Recommended the upgrade as per the metrics analysed.

Environment: Ab Initio, DataStage, HP ALM, JIRA, Teradata, SQL, UNIX
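The metric comparison above (execution time, warnings and so on across the two ETL tools) can be sketched as a simple per-tool roll-up; the job names and figures below are invented, not real project numbers.

```python
from statistics import mean

# Hypothetical parallel-run results: each job executed once per tool.
runs = [
    {"job": "load_accounts", "tool": "DataStage", "secs": 410, "warnings": 3},
    {"job": "load_accounts", "tool": "Ab Initio", "secs": 365, "warnings": 1},
    {"job": "load_balances", "tool": "DataStage", "secs": 220, "warnings": 0},
    {"job": "load_balances", "tool": "Ab Initio", "secs": 240, "warnings": 2},
]

# Roll the runs up per tool: average runtime and total warning count.
summary = {}
for tool in {r["tool"] for r in runs}:
    rows = [r for r in runs if r["tool"] == tool]
    summary[tool] = {
        "avg_secs": mean(r["secs"] for r in rows),
        "total_warnings": sum(r["warnings"] for r in rows),
    }
print(summary)
```

A roll-up like this is what feeds the upgrade recommendation: the decision rests on aggregates per tool, not on any single job's runtime.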

Confidential

UAT Coordinator

Responsibilities:

  • Met with business clients to begin User Acceptance Test Case development.
  • Analyzed the Test Scenarios created by business users for UAT, identified the test data requirements, investigated dependencies on key upstream systems, and communicated test data requirements to the corresponding teams.
  • Reviewed and mapped Test Cases to Requirements in QC, ensuring all Test Cases are mapped and test data is requested for all scenarios.
  • Verified UAT environment availability and potential overlap with other parallel projects, communicating with the PM as needed.
  • Executed the jobs in the UAT environment on UNIX and prepared the Test Result Report.
  • Performed the Data count and Data Integrity validations.
  • Supported the business users in composing the SQL queries to retrieve the data from the backend.
  • Created and managed all defects at program level in HP Quality Center.
  • Conducted Defect Triage meetings with Developers and Business users on daily basis and created defect management reports using Quality Center for Executive Management and Steering Committee.
  • Responsible for creating testing status reports and summary reports for Go/No-go decision making and supported application deployments.

Confidential

Test Analyst

Responsibilities:

  • Involved in review of all documentation (BRD, SRS) relating to the given Project.
  • Prepared the Test Plan, Test Scenarios and Test cases.
  • Reviewed with the business for sign-off.
  • Defining and conditioning test data.
  • Wrote and executed Unix commands, composed SQL queries to validate the data from source to Confidential, and captured the Test Results.
  • Prepared Daily & Weekly Status reports for the entire project.
  • Prepared & Maintained Risk Response tracker, Query Tracker.
  • Documentation and validation of problems and defects found during testing.
  • Reporting all defects and issues to development team via HP QC.
  • Review the Test Results with business for sign-off.
  • Prepare the Test Summary report and publish.

Confidential

Test Analyst

Responsibilities:

  • Involved in review of all documentation (BRD, SRS) relating to the given Project.
  • Prepared the Test Plan, Test Scenarios and Test cases.
  • Reviewed with the business for sign-off.
  • Defining and conditioning test data.
  • Wrote and executed Unix commands, composed SQL queries to validate the data from source to Confidential, and captured the Test Results.
  • Prepared Daily & Weekly Status reports for the entire project.
  • Prepared & Maintained Risk Response tracker, Query Tracker.
  • Documentation and validation of problems and defects found during testing.
  • Prepared test data for web services validation.
  • Performed API testing/web services using SOAP-UI tool.
  • Reporting all defects and issues to development team via HP QC.
  • Review the Test Results with business for sign-off.
  • Prepare the Test Summary report and publish.
