
Sr. Data Warehouse QA Analyst Resume


Kansas City, MO

SUMMARY

  • Over eight years of experience in information technology as a Software QA professional specializing in testing and test management.
  • Expertise in analyzing standard business processes, Standard Operating Procedures (SOPs), business and technical requirements, data models, ETL design documents, mapping documents, BI requirements, and design and mockup documents.
  • Good understanding of quality assurance testing methodology relative to business analysis, application design and the overall Software Development Life Cycle (SDLC) and Software Testing Life Cycle (STLC).
  • Experience working in QA methodologies: Waterfall, V-model and Agile.
  • Experience in developing a Traceability Matrix based on detail-level requirements and test scenarios/test cases.
  • Experience working with business users, business analysts and technical teams to clarify requirement gaps, ETL and BI design documents, change requests and defects.
  • Experience in project status meetings, brainstorming sessions, requirements identification, walkthroughs and inspection reviews.
  • Proficient in documenting Test Strategy, Test Plan, Test Scenario and Test case, Test execution and Defect reporting.
  • Hands-on experience in functional, integration, regression, system, end-to-end and security testing.
  • Strong experience in Data Warehouse (ETL) and Business Intelligence (BI) components testing.
  • Hands-on experience working on Master Data Management (MDM) using IBM Initiate Master Data Engine, Workbench, Inspector and Data Stewards.
  • Good understanding of data modeling in Star Schema/Snowflake Schema design, fact and dimension tables, physical and logical data modeling, data mapping, data conversion, and data cleansing.
  • Hands-on experience in writing SQL (DDL, DML and DCL) queries for back-end data validation testing.
  • Extensive knowledge of ETL processes, OLAP, OLTP, N-tier data architecture and RDBMS.
  • Strong SQL experience in data analysis, data validation, data profiling, data verification and data loading.
  • Hands-on experience in BI testing on BI tools OBIEE, EPM 11, SAP Business Objects and MicroStrategy, including dashboards, summary reports, master-detail reports, drill-downs and scorecards.
  • Actively participated in internal audits and management reviews, taking necessary action on issues raised regarding the quality management system and process improvement.
  • Extensive working experience in domains like Financial, Healthcare, Retail, Insurance and Banking.
  • Expertise in defect management, bug tracking, bug reports and generating graphs using test management tools such as Application Lifecycle Management (ALM), Quality Center, Test Director, MTM and JIRA.
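The back-end data validation mentioned above usually starts with reconciliation queries between a source system and a warehouse staging table. A minimal sketch follows, using an in-memory SQLite database as a stand-in for a production RDBMS; the table and column names are invented for illustration:

```python
import sqlite3

# In-memory stand-ins for a source table and its staging copy.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE src_orders (order_id INTEGER, amount REAL)")
cur.execute("CREATE TABLE stg_orders (order_id INTEGER, amount REAL)")
rows = [(1, 100.0), (2, 250.5), (3, 75.25)]
cur.executemany("INSERT INTO src_orders VALUES (?, ?)", rows)
cur.executemany("INSERT INTO stg_orders VALUES (?, ?)", rows)

# Record-count reconciliation: counts must match after the load.
src_count = cur.execute("SELECT COUNT(*) FROM src_orders").fetchone()[0]
stg_count = cur.execute("SELECT COUNT(*) FROM stg_orders").fetchone()[0]

# A simple numeric checksum: summing an amount column can catch silent
# truncation or transformation defects even when the row counts agree.
src_sum = cur.execute("SELECT SUM(amount) FROM src_orders").fetchone()[0]
stg_sum = cur.execute("SELECT SUM(amount) FROM stg_orders").fetchone()[0]

count_ok = src_count == stg_count
sum_ok = abs(src_sum - stg_sum) < 1e-9
```

In practice such checks are run per ETL stage (source to staging, staging to EDW, EDW to data mart) and per load window.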

TECHNICAL SKILLS

Languages: VB.NET | VBScript | ASP | HTML | DHTML | JAVA | .NET Framework | Visual Basic | SQL

Software: JIRA | Subversion | Microsoft Office 2003/2007/2010 | SharePoint | Confluence

Technologies: MS .Net Web Technology | Client/Server | PC | Citrix

ETL Tools: Oracle Data Integrator 11.0 | Informatica PowerCenter | DataStage

Tools: Team Foundation Server (TFS) | Microsoft Test Manager (MTM) | ALM | Quality Center | JIRA | SharePoint | Selenium | SOAPUI | Rally | QTP | UFT

Data Warehouse: Informatica PowerCenter | Oracle Data Integrator (ODI) | SSIS | AutoSys | DataStage | Master Data Management (MDM) | IBM Initiate Master Data Engine | Initiate Workbench | Initiate Inspector

Reporting Tools: OBIEE | Business Objects | Hyperion | Essbase | EPM 11

Databases: Oracle | MS Access | DBMS | DB2 | Informix | SQL Server | Teradata

Methodologies: Waterfall | V-model | Agile | Validation & Verification

PROFESSIONAL EXPERIENCE

Confidential, Kansas City, MO

Sr. Data Warehouse QA Analyst

Responsibilities:

  • Participate in requirements definition reviews and functional specification reviews to ensure that critical information is included and testability requirements are met.
  • Participate in gap analysis for the given requirement, Mock up reports and dashboards.
  • Analyze the technical design specification documents, business rules and data validation specifications to identify deficiencies in test plans and procedures, and enhance test coverage accordingly.
  • Responsible for developing the test cases to validate the ETL process, business rules, data transformation rules, report layouts and data.
  • Responsible for developing and executing test cases to validate table/view structures, data conversion, and data mapping from source to staging, staging to EDW, and EDW to data mart.
  • Responsible for writing and executing test cases to validate transformation rules, record counts, logical and physical deletes, and exceptions and errors for the different stages of ETL.
  • Create requirements and Test matrix for ETL and Reporting Test cases in Quality Center.
  • Develop test cases using standard test case template in Excel sheet and upload in Quality Center.
  • Validated access security and navigation to particular screens, dashboards, pages and reports for OBIEE.
  • Validated access to the right data, screens, dashboards, pages and reports; response time to access the application and particular dashboards, pages and reports; the capability and ease of using prompts to select data on a report with specific attributes; and the use of drill-down and navigation on OBIEE reports.
  • Validated that all prompts from the summary reports are applied to the detail reports.
  • Validated that summary reports work from charts, tables and table headings, and that counts match between the summary and detail reports where appropriate.
  • Created and executed regression test cases for 100+ reports and dashboards during the OBIEE upgrade from version 10.x to 11.x.
  • Performed regression, functional and security testing after the upgrade from OBIEE 10g to 11g, verifying that all OBIEE security was completely carried over from version 10g to the newer 11g.
  • Document software defects in defect tracking system Quality Center and keep defects up to date.
  • Prepare test progress and defect report and communicate to the management.
  • Facilitate defect review meetings, improve the existing defect management process, and drive faster defect turnaround through regular status reporting.
  • Developed and generated test progress and defect reports from Quality Center.
  • Responsible for validating the files availability, format and data on UNIX server.
  • Utilize defect tracking tool HP Quality Center to trace, assign, verify and close defects.
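The summary-versus-detail count check described above can be scripted. A minimal sketch, again using SQLite and an invented fact table in place of the real OBIEE subject areas:

```python
import sqlite3

# Hypothetical detail-level fact table; names are illustrative only.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE sales_detail (region TEXT, amount REAL)")
cur.executemany(
    "INSERT INTO sales_detail VALUES (?, ?)",
    [("East", 10.0), ("East", 15.0), ("West", 7.5)],
)

# What a summary report would show: one aggregated row per region.
summary = dict(
    cur.execute("SELECT region, SUM(amount) FROM sales_detail GROUP BY region")
)

# What the drill-down (detail) report would show, re-aggregated by the test.
detail_totals = {}
for region, amount in cur.execute("SELECT region, amount FROM sales_detail"):
    detail_totals[region] = detail_totals.get(region, 0.0) + amount

# The summary and the re-aggregated detail must agree for every region;
# any mismatch would be logged as a report defect.
mismatches = {r for r in summary if abs(summary[r] - detail_totals[r]) > 1e-9}
```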

Environment: ALM 11.5, DataStage, ODI, SQL Server 2012, OBIEE, EPM 11, SQL Developer, Web Client, JIRA, SQL Server, Webservices, SOAP UI 4.5, HTML, XML, API Testing, .Net, JAVA, J2EE, SharePoint, Confluence, Oracle, DB2, Informix, SVN.

Confidential, Denver, CO

Sr. ETL/BI Tester

Responsibilities:

  • Define and document Test Strategy, Test Plans and Test Cases by evaluating functional and non-functional requirements of their user stories.
  • Participate in requirements definition reviews and functional specification reviews to ensure that critical information is included and testability requirements are met.
  • Participate in gap analysis for the given requirement, Mock up reports and dashboards for review by the business users.
  • Analyze the technical design specification documents, business rules and data validation specifications to identify deficiencies in test plans and procedures, and enhance test coverage accordingly.
  • Responsible for conducting QA entrance and exit criteria meetings with project stakeholders.
  • Validated the source to target mappings based on Data model and mapping sheet.
  • Validated the data extracts based on the attribute mappings identified in the document “MDM Extract File Attributes with Mapping.”
  • Validated that the ETL process produces a file for loading into the MDM system based on the specification in the “Data Extract Guide” document.
  • Validated the ETL process from the source system to the staging area, including the SFTP process (load from SFTP to the MDM loader).
  • Validated the successful completion of initial and incremental loads from the staging area to MDM.
  • Validated the successful installation and configuration of the Initiate MDM Engine, WebSphere, Initiate MDM Workbench and Initiate Inspector in the test environment.
  • Validated reports such as the Duplicate Summary Statistics report, Event summary and detail report, and Pre-Merge report generated from IBM Initiate Inspector.
  • Performed security testing on the MDM Inspector through eAuth using the LDAP group within the Initiate Workbench to validate user-level security such as attribute-level access.
  • Updated data in the source system and validated the same in MDM during the daily load.
  • Validated and compared data between the source/staging databases and the MDM database.
  • Document software defects in defect tracking system ALM and keep defects up to date.
  • Write and execute SQL queries using SQL developer, TOAD, Microsoft SQL Server.
  • Prepare post-implementation lessons-learned documents to keep track of all issues through the release testing process.
  • Prepare test progress and defect report and communicate to the management.
  • Identify, create and communicate Issue and Risk using Microsoft SharePoint.
  • Facilitate defect review meetings, improve the existing defect management process, and drive faster defect turnaround through regular status reporting.
  • Preparing and maintaining the release and cycles to keep track of code movement and testing using Quality Center.
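The source-versus-MDM comparison described above amounts to a row-level set difference between the two databases. A minimal sketch with SQLite stand-ins; the party table, columns and sample names are all hypothetical:

```python
import sqlite3

# Toy stand-ins for a source party table and the corresponding MDM records.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE src_party (party_id INTEGER, name TEXT)")
cur.execute("CREATE TABLE mdm_party (party_id INTEGER, name TEXT)")
cur.executemany("INSERT INTO src_party VALUES (?, ?)",
                [(1, "ACME"), (2, "Globex"), (3, "Initech")])
cur.executemany("INSERT INTO mdm_party VALUES (?, ?)",
                [(1, "ACME"), (2, "Globex")])  # record 3 dropped by the load

# Rows present in the source but absent from MDM (and vice versa) surface
# as load defects; EXCEPT gives the set difference directly.
missing_in_mdm = cur.execute(
    "SELECT party_id, name FROM src_party "
    "EXCEPT SELECT party_id, name FROM mdm_party"
).fetchall()
extra_in_mdm = cur.execute(
    "SELECT party_id, name FROM mdm_party "
    "EXCEPT SELECT party_id, name FROM src_party"
).fetchall()
```

On the real engagement the same idea would run against the Oracle source and the MDM database, scoped to the daily load window.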

Environment: IBM MDM, WebSphere, IBM Initiate MDM Engine, IBM Initiate Inspector, IBM Initiate Workbench, Oracle 11i, OBIEE 10.1.3, EPM 11, DataStage, TOAD, Microsoft SQL Server 2008, SQL Server Management Studio, MS Office 2007, SharePoint, Subversion, Application Lifecycle Management (ALM), Quality Center 11.0, Confluence.

Confidential, Roseland NJ

Sr. ETL/BI QA Analyst

Responsibilities:

  • Participate in requirements and functional specification reviews and walkthroughs to ensure that the testability requirements are met.
  • Reviewed and analyzed the requirements documents, Design documents, Data design/flow and Data mapping documents.
  • Coordinate with cross-project teams on business and unit functional requirements and derive integration and non-functional test requirements.
  • Developed Test Scenarios and Test cases for Data Conversion, Functional, Integration, Security and UAT testing.
  • Developed process documents, user manuals and a defect logging process to support UAT testing.
  • Conducted test execution/status meetings with project stakeholders.
  • Conducted meetings and walkthroughs with users and junior team members to discuss the defect management process in Quality Center.
  • Performed ad-hoc, end-to-end and system testing.
  • Identified and logged the defects and issues into Empirix and JIRA and closely worked with development team to fix and re-test the defects.
  • Worked with the Business Analysts to verify the issues and create PCR when needed.
  • Facilitated defect review meetings, improved the existing defect management process, and drove faster defect turnaround through regular status reporting.
  • Wrote and used SQL queries to setup and validate test data.
  • Supported UAT testing and created defect guideline documents for users using Quality Center.
  • Utilized defect tracking tool HP Quality Center to trace, assign, verify and close defects.
  • Worked with an agile team in the SCRUM framework.
  • Involved in understanding, creating and updating Specification Requirement Document (SRD) and Report Specification Document (RSD) to write test cases in different Sprints.
  • Worked with a cross-functional team including business analysts, the Product Owner and the agile team to discuss requirements, design, backlogs and use cases.
  • Created test cases and test data for different Sprints following Agile testing practices.
  • Responsible for developing and executing test cases for System and Integration testing.
  • Tested XML transmission and verified inbound/outbound XML using SoapUI.
  • Developed and customized XML scripts to access and execute the operations embedded in the WSDL file.
  • Identified and replaced actual values for the application under test and selected the correct outgoing WSS configuration while running XML scripts in SoapUI.
  • Modified end points when we worked on different servers and different versions of Web Services.
  • Performed web specific testing like Link checking, Browser page testing, Application Testing and Security Testing.
  • Verified the database using SQL queries to make sure right data was created in the application.
  • Facilitated as UAT (User Acceptance Testing) lead and worked closely with users and supported in test execution and defect management during UAT.
  • Involved in sending weekly/Sprint status reports covering defects, test coverage and test results.
  • Worked closely with development team in defect review and kept track of the defects.
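The inbound/outbound XML verification done in SoapUI above can also be scripted for regression runs. A minimal sketch using Python's standard `xml.etree` parser; the envelope is a trimmed-down stand-in and the `GetAccountResponse` element names are invented, not from any real service:

```python
import xml.etree.ElementTree as ET

# A stand-in for a SOAP 1.1 response body captured from the service.
response = """<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <GetAccountResponse>
      <AccountId>12345</AccountId>
      <Status>ACTIVE</Status>
    </GetAccountResponse>
  </soap:Body>
</soap:Envelope>"""

root = ET.fromstring(response)
ns = {"soap": "http://schemas.xmlsoap.org/soap/envelope/"}

# Checks a tester would script: the Body exists, the expected elements are
# present, and their values match the prepared test data.
body = root.find("soap:Body", ns)
account_id = body.find("./GetAccountResponse/AccountId").text
status = body.find("./GetAccountResponse/Status").text
```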

Environment: SOAPUI, Web Services, .NET, Microsoft SQL Server 2008, SQL Server Management Studio, MS Office 2007, SharePoint, Subversion, Quality Center 9.6, QuickTest Professional (QTP), J2EE, HTML, XML, VB Scripts, ClearQuest.

Confidential, Richfield, MN

Sr. ETL/BI Tester

Responsibilities:

  • Analyzed data, process flows and database structures to facilitate issue resolution and identify process improvement opportunities in data management and reporting.
  • Followed Agile and Scrum; translated business requirements and technical design into requirements in HP Quality Center (QC).
  • Analyzed and reviewed the Project Initiation Document (PID), System Requirements Document (SRD) and System Requirements Specification (SRS) to ensure testability and identify discrepancies or errors in the functions, interface, data structure and performance of the system.
  • Participate in requirements definition reviews and functional specification reviews to ensure that critical information is included and testability requirements are met.
  • Participate in gap analysis for the given requirements, mock-up reports and dashboards for review by the business users.
  • Developed Test Approach and Test Plan documents for testing the different Data Marts.
  • Validated the source to target mappings based on Data model and mapping sheet.
  • Prepared test cases and test plans from the requirements for the mappings developed through the ETL tool.
  • Extensively used Informatica Workflow Manager to run the workflows/mappings and monitored the session logs in Informatica Workflow Monitor.
  • Verified session logs to identify errors that occurred during ETL execution.
  • Created test cases and a traceability matrix based on the mapping document and requirements.
  • Performed complex data validation using SQL queries.
  • Verified the logs to identify errors that occurred during ETL execution.
  • Wrote several complex SQL queries for data verification and data quality checks.
  • Reviewed test cases written against the Change Request document; testing was performed based on Change Requests and Defect Requests.
  • Prepared System Test Results after test case execution.
  • Prepared test scenarios by creating mock data based on the different test cases.
  • Performed defect tracking and reporting with a strong emphasis on root-cause analysis to determine where and why defects were introduced in the development process.
  • Debugged and scheduled ETL jobs/mappings and monitored error logs.
  • Tested reconciliation and incremental ETL loads for the project.
  • Tested data migration to ensure that the integrity of the data was not compromised.
  • Ran DataStage jobs using DataStage Director as well as UNIX scripts.
  • Wrote complex SQL queries for extracting data from multiple tables and multiple databases.
  • Document software defects in defect tracking system Quality Center and keep defects up to date.
  • Write and execute SQL queries using SQL developer, TOAD, Microsoft SQL Server.
  • Prepare post-implementation lessons-learned documents to keep track of all issues through the release testing process.
  • Effectively coordinated with the development team for closing a defect
  • Prepare test progress and defect report and communicate to the management.
  • Identify, create and communicate Issue and Risk using Microsoft SharePoint.
  • Facilitate defect review meetings, improve the existing defect management process, and drive faster defect turnaround through regular status reporting.
  • Preparing and maintaining the release and cycles to keep track of code movement and testing using Quality Center.
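The SQL data-quality checks mentioned above commonly include duplicate-key and mandatory-column tests. A minimal sketch with SQLite; the customer table and its seeded defects are invented for illustration:

```python
import sqlite3

# Illustrative dimension table with two seeded quality problems:
# a duplicated business key and a NULL in a required column.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE dim_customer (cust_id INTEGER, email TEXT)")
cur.executemany("INSERT INTO dim_customer VALUES (?, ?)",
                [(1, "a@x.com"), (2, None), (1, "a@x.com")])

# Duplicate-key check: any business key appearing more than once is a defect.
dupes = cur.execute(
    "SELECT cust_id, COUNT(*) FROM dim_customer "
    "GROUP BY cust_id HAVING COUNT(*) > 1"
).fetchall()

# Mandatory-column check: NULLs in a required attribute.
null_emails = cur.execute(
    "SELECT COUNT(*) FROM dim_customer WHERE email IS NULL"
).fetchone()[0]
```

A non-empty result from either query would be logged as a defect against the load.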

Environment: Oracle 11i, Microsoft Team Foundation Server (TFS) 2012, Microsoft Test Manager 2012, JIRA 7.0, TOAD 8.0, SQL Server 2010, ASP, Microsoft Visual Studio 11.0, VB scripts, Windows NT, SQL, .Net, Web Services, XML, XML Schema, DB2, UNIX, Confluence, SharePoint, IBM MDM, IBM Initiate Inspector.

Confidential, Philadelphia, PA

Sr. QA Analyst/Business Analyst

Responsibilities:

  • Reviewed and analyzed the User Stories, Backlogs, Design documents, Data Model, Data mapping documents and existing test cases.
  • Worked in an Agile Methodology and Scrum framework with respect to the change in application requirements.
  • Participated in daily standup, backlog, functional specification and design review meetings.
  • Met with business users, business analysts and the technical team to clarify any gaps identified during review.
  • Created and updated testing tasks in the task board in Visual Studio.
  • Participated in scrum meetings and weekly status meetings and updated the testing status.
  • Used JIRA to review Change Requests (CR), Production Issue Requests (PIT), issues and risks.
  • Developed and maintained test plans, test scenarios, test cases and test data for functional, integration and security testing, including positive, negative and boundary conditions.
  • Used Microsoft Visual Studio and MTM for Requirements and Backlogs review, Test Plan and Test Cases development, Test execution, Defect management and Reporting.
  • Used SharePoint to manage and maintain all the testing deliverables and reports.
  • Developed the test cases to validate the ETL process, Data transformation rules, Reports layouts and data.
  • Validated report features such as navigation, column sort, drill-down, drill-up, filters, prompts, page forward/backward, bookmarks, hyperlinks, etc.
  • Developed test cases to validate the ETL process, business rules, transformation rules, Reports layouts and data.
  • Create requirements and Test matrix for ETL and Reporting Test cases in Quality Center.
  • Validated the source to target mappings based on Data model and mapping sheet.
  • Validated the transaction/audit log path for the initial load extract and initial load process.
  • Validated the e-authentication process and access, and successful job scheduling and execution for the initial load.
  • Document software defects in defect tracking system Quality Center and keep defects up to date.
  • Write and execute SQL queries using SQL developer, TOAD, Microsoft SQL Server.
  • Prepare post-implementation lessons-learned documents to keep track of all issues through the release testing process.
  • Supported UAT testing and created defect guideline documents for users using Quality Center.
  • Facilitated defect review meetings, improved the existing defect management process, and drove faster defect turnaround through regular status reporting.
  • Utilized defect tracking tool HP Quality Center to trace, assign, verify and close defects.

Environment: Quality Center, DataStage, EPM 11, TOAD, Microsoft SQL Server 2008, SQL Server Management Studio, MS Office 2007, SharePoint, Subversion, QuickTest Professional (QTP), J2EE, HTML, XML, VB Scripts, ClearQuest.

Confidential, Jersey City, NJ

Sr. QA Analyst/Business Analyst

Responsibilities:

  • Analyzed the requirements, understanding the business logic, acquiring sufficient domain knowledge required for the project.
  • Developed the STP (Short Test Plan)/Test Strategy for the project, delivered before the design phase of the project.
  • Prepared a Traceability Matrix to map the test cases authored for all the business requirements and the PRA (Product Risk Analysis).
  • Create test scripts/test cases according to the requirements document.
  • Preparation of Test data and Execution of test cases.
  • Defect Tracking and reporting Bugs through the Defect tracking tool.
  • Report test results (both successful and failed test cases).
  • Re-test all bug fixes and perform Regression Testing for the application.
  • Perform System Testing and Ad-hoc testing as required.
  • Perform Sanity testing for the entire module that has been changed or modified based on requirements.
  • Worked in an Agile Methodology and Scrum framework with respect to the change in application requirements.
  • Participated in daily standup, backlog, functional specification and design review meetings.
  • Met with business users, business analysts and the technical team to clarify any gaps identified during review.
  • Developed and maintained test plans, test scenarios, test cases and test data for functional, integration and security testing, including positive, negative and boundary conditions.
  • Used Microsoft Visual Studio and MTM for Requirements and Backlogs review, Test Plan and Test Cases development, Test execution, Defect management and Reporting.
  • Used SharePoint to manage and maintain all the testing deliverables and reports.
  • Wrote and executed SQL queries to perform the backend data validation using SQL developer, SQL server management studio.
  • Prepared test progress and defect report and communicated to management.
  • Facilitated defect review meetings, improved the existing defect management process, and drove faster defect turnaround through regular status reporting.
  • Responsible for enhancements and the framework for automation scripts using QuickTest Professional (QTP) to run the regression suite.
  • Created new scripts from scratch based on the application test plan.
  • Modified existing scripts for enhancements.
  • Worked with data-driven and keyword-driven frameworks.
  • Involved in refactoring automation scripts.
  • Refactored code in VBScript.
  • Converted WinRunner TSL scripts to VBScript for QTP frameworks.
  • Changed frameworks from keyword-driven to data-driven.

Environment: MTM, SQL, QTP, Quality Center, SQL SERVER 2008, TOAD, Java, DB2 and UNIX.

Confidential

Business Analyst/UAT Tester

Responsibilities:

  • Gathered, documented, reviewed functional specifications and system requirement specifications.
  • Documented and identified Backlogs for the projects and for individual Sprints.
  • Worked with business users and the technical team to clarify gaps in the requirements.
  • Developed UAT test scenarios, test cases and test data.
  • Performed UAT functional testing and integration testing.
  • Worked in an Agile environment following the SCRUM framework.
  • Performed service based testing using the SoapUI tool for web services.
  • Created a security matrix based on business rules and user requirements.
  • Logged defects in the defect-tracking tool Test Director.
  • Identified areas for regression testing and wrote automation scripts for them using QTP.
  • Gathered the Test Data in the pre-testing phase for positive and negative scenarios.
  • Updated the status sheet and helped the lead prepare the status report.
  • Wrote SQL scripts to validate data in the database on the back-end and master files.
  • Reported, reviewed, and analyzed bugs created in Test Director.

Environment: Test Director, Quick Test Pro, Windows XP, C++, VB6.0, JavaScript, HTML, SQL Server.
