DWH Tester / UAT Tester / Performance Tester / App Server Upgrade Tester Resume
Columbus, OH
SUMMARY:
- 9 years of IT experience in Data Analysis, QA Testing, Batch Testing, and ETL/DWH/BI Reports Testing, covering software systems in Data Warehouse, Business Intelligence, Client-Server, Oracle, and web-based environments on Windows and UNIX platforms.
- Extensively worked across the entire QA life cycle, including design, development, and execution of the full QA process.
- Strong knowledge of the Software Development Life Cycle (SDLC) and QA methodologies such as Agile, Scrum, Waterfall, and Iterative development.
- Involved in preparing the Test Plans, Test Strategy documents, Test Cases & Test Scripts based on business requirements, rules, data mapping requirements and system specifications.
- Experience in Functional Testing, System Testing, Regression Testing, Integration Testing, GUI Testing, Security Testing, Smoke Testing, System Integration Testing (SIT), Software Validation Testing, UAT, Batch Testing, Interface Testing, and Performance Testing.
- Involved in preparation of Requirement Traceability Matrices (RTM), Defect Reports, Weekly Status Reports, Checklists, Job Aids, Test Readiness Review documents, and Test Summary Reports.
- Experience in Data Analysis, Data Migration, Data Validation, Data Cleansing, Data Verification and identifying Data Mismatch.
- Experience with QlikView sheet objects, including various chart types, trends, KPIs, custom requests for Excel export, and Fast Change objects for management dashboard reporting.
- Strong experience in extracting, transforming, and loading data in QlikView applications and in developing data models using star and snowflake schemas.
- Extensive experience in writing complex SQL Queries and PL/SQL scripts.
- Tested several complex reports generated by Business Intelligence tools (Cognos, Business Objects BOXI, MicroStrategy), including dashboards, summary reports, master-detail reports, drill-down reports, and scorecards.
- Expertise in test management using HP Quality Center, Microsoft TFS and maintained Versioned objects using Rational ClearCase.
- Excellent skills in overall Defect Management/Problem Solving, which includes reporting and tracking bugs using HP Quality Center, Rational ClearQuest, Bugzilla.
- Performed performance, stress, and load testing using LoadRunner.
- Understanding of data models, data schemas, and ETL; created extensive stored procedures and SQL queries to perform back-end data warehouse testing.
- Have experience in testing reconcile (initial) and daily automated ETL loads using Informatica DVO.
- Validated data extracted from source systems (flat files or relational databases) as well as data loaded into the target.
- Validated mapping documents, the existence of workflows, and the existence of session logs in case of errors.
- Have good experience using Informatica Workflow Manager for running/managing workflows and Informatica Workflow Monitor for monitoring the workflows and checking the session logs.
- Validated automated workflows using Informatica DVO to ensure no data was lost or duplicated during transformation or migration; also performed count validation, null validation, duplicate validation, and event task validation (an illustrative SQL sketch follows this list).
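A minimal illustrative SQL sketch of the count, null, and duplicate validations described above. The table and column names (STG_CUSTOMER, DW_CUSTOMER, CUSTOMER_ID) are hypothetical placeholders, not taken from any specific project:

  -- Row count comparison between a hypothetical staging source and warehouse target
  SELECT (SELECT COUNT(*) FROM stg_customer) AS src_count,
         (SELECT COUNT(*) FROM dw_customer)  AS tgt_count
  FROM dual;

  -- Null check on a business key that must never be null in the target
  SELECT COUNT(*) AS null_keys
  FROM dw_customer
  WHERE customer_id IS NULL;

  -- Duplicate check on the target's natural key
  SELECT customer_id, COUNT(*) AS occurrences
  FROM dw_customer
  GROUP BY customer_id
  HAVING COUNT(*) > 1;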
TECHNICAL SKILLS:
QA/Testing Tools: HP ALM, Quality Center 10.x, 9.x, Test Director, TFS, Rally, Soap UI, Bugzilla, Rational ClearQuest
Version Control Tools: Rational ClearCase, CVS, Share Point
Requirements Tools: Rational RequisitePro
RDBMS (Databases): Oracle 12c/11g/10g, MS SQL Server 2008/2005, MS Access, DB2, Teradata, Netezza 7.x
RDBMS Query Tools: SQL*PLUS, Query Analyzer, TOAD, Oracle SQL Developer, SQL Navigator
BI/Reporting Tools: IBM Cognos, MicroStrategy, Business Objects BOXI, OBIEE
ETL Tools: Informatica Power Center 9.x/8.x/7.x, Informatica DVO, Data Stage, Ab Initio, SSIS
Programming Languages: PL/SQL, C#
Scripting Languages: SQL Scripts, UNIX Shell Scripts
Incident Management Tools: Remedy
Operating Systems: Microsoft Windows family, UNIX (Sun Solaris, HP-UX)
MS Office: MS Word, MS Excel, MS PowerPoint, MS Outlook
Scheduling Tools: Autosys, Cron-Tab
Other Tools: Putty, TextPad, CompareIT, WinSCP, DataFlux, GS Data Generator, LoadRunner
PROFESSIONAL EXPERIENCE:
Confidential, Columbus OH
DWH Tester / UAT Tester / Performance Tester / App Server Upgrade Tester
Responsibilities:
- Analyzed the Business Requirements Document and Functional Specification document to understand the project needs.
- Worked with stakeholders on project status updates and ensured QA deliverables were met.
- Involved in the development of Detailed Test Strategy for Functional and System Testing.
- Performed automated data validation after the ETL process using Informatica DVO (see the illustrative SQL sketch at the end of this list).
- Designed, Developed, Executed and Analyzed Test plan, test cases, test scripts for use in Integration, System, and Regression Testing.
- Reviewed business requirements, added them into Quality Center, and wrote test cases and performed functional testing based on those requirements.
- Developed dashboards pertaining to KPI monitoring using Qlikview by writing complex SQL queries and importing data from different sources.
- Extensively used HP Quality Center as the defect-tracking system to log and close defects, generate reports, and track defects through to resolution.
- Designed and developed extensive QlikView reports using a combination of charts and tables for analysis of air flow of different aircraft companies; generated QlikView documents using the Dashboard, Analysis, and Reports approach.
- Clarified and documented issues raised during data and system model validation; analyzed and proposed changes to the valuation methodology to make it more precise and transparent.
- Validated SDA and inbound data for accuracy and completeness to support system processing and results for downstream user consumption.
- Leveraged MEGA knowledge to map the business process approach, from identification of the services provided to customers through how the results are produced, using real-case scenarios to enable a better understanding of the methods.
- Generated test scripts using LoadRunner.
- Used core LoadRunner features, including recording, reporting, simulation, and monitoring.
- Extracted, transformed, and loaded data from multiple sources into QlikView applications.
- Performed Inbound and Outbound Data Validation.
- Validated the MicroStrategy reports for both data and layout.
- Used Informatica Workflow Manager to schedule and run Informatica mappings and Workflow Monitor to monitor ETL execution and logs.
- Involved in Performance Tuning & Deployment of Qlikview Applications.
- Involved in preparation of test data to test the functionality of the sources.
- Executed load testing and stress testing using LoadRunner.
- Carried out parameterization using LoadRunner.
- Tested various ETL Scripts to load the data into Target Data Warehouse (Oracle / Teradata).
- Involved in understanding of the Specifications for Data Warehouse ETL Process and interacted with the designers and the end users for informational requirements.
- Responsible for weekly status meetings showing progress and future testing efforts to the QA Manager.
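A minimal illustrative sketch of the kind of source-to-target reconciliation run after the ETL loads, written in Oracle SQL; the tables SRC_ORDERS and TGT_ORDERS and their columns are hypothetical placeholders:

  -- Rows present in the source but missing or altered in the target
  SELECT order_id, order_date, amount FROM src_orders
  MINUS
  SELECT order_id, order_date, amount FROM tgt_orders;

  -- Rows present in the target but absent from the source (unexpected inserts)
  SELECT order_id, order_date, amount FROM tgt_orders
  MINUS
  SELECT order_id, order_date, amount FROM src_orders;

Both queries returning zero rows is treated as a pass for that table pair.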
Environment: Oracle 11g, Oracle 12c, QlikView, Netezza 7.2, Informatica 9.6.1, Informatica DVO, SQL, PL/SQL, TOAD, IBM Lotus Notes, Teradata SQL Assistant, LoadRunner, UNIX log files, HP Quality Center, Rational ClearCase, ClearQuest.
Confidential, Tempe AZ
Sr. QA Analyst/Reports Tester / Data Analyst
Responsibilities:
- Performed thorough walkthroughs of business requirements and prototypes with Business Analysts and code reviews with the development team, gaining a solid grasp of the requirements and functionality of the airline system.
- Developed Test plans by coordinating with other team members on the road map
- Developed test cases by reviewing the requirement documents
- Mapped the test scripts to the functional requirements in Quality Center 10.0.
- Identified test data through database queries and created test scripts (see the illustrative SQL sketch at the end of this list).
- Conducted the Functional, System, Integration, Regression, performance, and Smoke Tests for various phases of these applications
- Used core LoadRunner features, including recording, reporting, simulation, and monitoring.
- Generated test scripts using LoadRunner.
- Executed manual test scripts and was responsible for tracking and logging defects using QC.
- Responsible for data analysis, report validation and functional testing
- Worked closely with Technical Project Managers and Development Team in defect tracking, re-testing and validation
- Verified bug fixes and tested all impacted modules in application.
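An illustrative sketch of the kind of query used to identify test data covering distinct scenarios; the FLIGHT_BOOKING table and its columns are hypothetical placeholders:

  -- Pick one candidate booking per fare class / status combination as test data
  SELECT fare_class, booking_status, MIN(booking_id) AS sample_booking_id
  FROM flight_booking
  GROUP BY fare_class, booking_status
  ORDER BY fare_class, booking_status;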
Environment: UNIX, SQL, Oracle, Teradata, Informatica 9.5.1, Informatica DVO, Quality Center, LoadRunner, MS Office, Microsoft Excel, Rally
Confidential, Scottsdale, AZ
Sr. QA Analyst/ETL Tester/Reports Tester / Data Analyst
Responsibilities:
- Analyzed solution architecture and design documents, including source-to-target and detailed design documents needed for each track, and planned test design activities.
- Prepared UAT project planning artifacts such as test scenarios, test plans, test scripts, and test summary reports.
- Assisted QA Manager in developing test estimation, scope and test plan for UAT
- Explained UAT scope, criteria, and entry/exit strategy to less experienced professionals; responsible for driving end-to-end test scenario efforts and identifying test data for UAT.
- Drafted complex SQL queries and performed allocation testing from the front end/user interface.
- Defined test approach, created presentation, reviewed with Business partners and incorporated feedback and distributed final EDW 1.1 test approach.
- Wrote extensive, complex SQL for ETL and report testing.
- Kicked off test case design sessions with the business and set up working sessions with business SMEs to review functional/technical requirements; responsible for translating business requirements into quality assurance test cases, facilitating test case sessions, and identifying test coverage and any gaps during those sessions.
- Validated the responses of web services by passing various requests using SoapUI.
- Worked with Infrastructure to define/redefine environment and test data needs for both the EDW SIT and UAT environments.
- Reviewed shakeout test cases from the team and verified that all users had access to the required databases and the OBIEE environment.
- Worked with SIT test team, OBIEE team, Development team and Performance test team in defining, reviewing and addressing issues around test strategy document.
- Reviewed SIT test cases and test scripts and provided feedback and support in terms of testing techniques and test coverage.
- Summarized final test summary results and reviewed them with Business and IS to obtain approvals.
- Maintained all documents, SQL scripts, approvals, and test results and saved them on SharePoint.
- Managed manual test passes/runs with TFS.
- Prepared test plans using TFS for each release, wrote test cases, and executed them as part of functional testing. Prepared test reports and deliverables and submitted them for version releases.
- Used Informatica Workflow Manager to execute Informatica mappings and Workflow Monitor to monitor ETL execution and logs.
- Validated the data between source and target using the Informatica Data Validation Option (an illustrative SQL equivalent appears at the end of this list).
- Validated that the data flow and control flow transformations in SSIS packages worked according to the specified functionality.
- Identified defects and performed root cause analysis by analyzing data quality issues.
- Prepared test results and gathered test results from other testers in a standardized format.
- Validated List reports, Crosstab reports, Charts, Dash Boards of SSRS reports and SSAS cubes.
- Reviewed the test activities through daily Agile Software development stand-up meetings.
- Used TOAD to develop and execute test scripts in SQL, to validate the test cases.
- Prepared Requirements Traceability Matrices (RTM), positive and negative test scenarios, detailed test scripts, test kickoff documents, a test scorecard for test progress status, test results, a release checklist, lessons learned documents, and a regression test suite for future use.
- Responsible for testing Initial/Reconcile and Incremental/daily loads of ETL jobs.
- Interacted with design/development/DBA team to decide on the various dimensions and facts to test the application.
- Heavily involved in testing reports in OBIEE, validated both the custom and ad hoc reports.
- Used TOAD Software for Querying ORACLE and Used WinSQL for Querying DB2.
- Participated in KT meetings with Business Analysts and SMEs.
- Responsible for documenting the process, issues and lessons learned for future references.
- Responsible for weekly status meetings showing progress and future testing efforts to the QA Manager.
- Reviewed daily test status and issues/defects with the project team and business; prepared communications and created daily status and defect reports.
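An illustrative sketch of a transformation-rule check comparing target values against the value recomputed from the source per the mapping specification. The tables SRC_ACCOUNT and DW_ACCOUNT and the status-code rule are hypothetical placeholders standing in for the project's actual mapping document:

  -- Flag target rows whose derived status disagrees with the mapping rule
  SELECT s.account_id,
         CASE WHEN s.status_code = 'A' THEN 'ACTIVE'
              WHEN s.status_code = 'C' THEN 'CLOSED'
              ELSE 'UNKNOWN' END AS expected_status,
         t.account_status        AS actual_status
  FROM src_account s
  JOIN dw_account  t ON t.account_id = s.account_id
  WHERE t.account_status IS NULL
     OR t.account_status <> CASE WHEN s.status_code = 'A' THEN 'ACTIVE'
                                 WHEN s.status_code = 'C' THEN 'CLOSED'
                                 ELSE 'UNKNOWN' END;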
Environment: Informatica Power Center 9.5, DVO, SQL, PL/SQL, Remedy, DB2, Oracle 11g, TOAD, ClearQuest, ClearCase, Web services, TFS, API, Soap UI, XML, SharePoint, Quality Center ALM 11.0, Agile, OBIEE, Autosys, SQL Server 2012, SSIS, SSRS, SSAS, UNIX
Confidential, Richmond, VA
Sr. ETL /Reports QA Analyst/ UAT Tester
Responsibilities:
- Designed & created Master Test Plan based on the BRD and SRS. Also referred Technical Design document, Source to Target Detailed mapping document & Transformation rules document, ER Diagrams to derive test cases for functional testing.
- Prepared test cases and scripts in Quality Center and executed them in the Test Lab module.
- Performed pre-process validation for the existence of specified database objects such as tables, views, and synonyms, and of the corresponding grants in the database schema.
- Validated target table column definitions against source column definitions, including data type and length (see the illustrative SQL sketch at the end of this list).
- Involved in creating CRs for any type of migration/change using Remedy.
- Validated database structure with ERD (Entity Relationship Diagram).
- Wrote SQL using joins, subqueries, correlated subqueries, group functions, and analytic functions to validate data between source and target.
- Participated in migrating UNIX code from DEV to QA and QA to UAT environments.
- Used JIRA tool for test/ defect/bug management.
- Involved in executing/scheduling ETL jobs through Autosys.
- Validated the custom Cognos reports using Cognos report studio, ad hoc Cognos reports using query studio and cubes using analysis studio.
- Hands-on experience using web service standards such as SOAP and WSDL.
- Validated DataStage mappings that load files into staging tables.
- Involved in extensive end-to-end DATA validation and back-end testing using SQL queries.
- Created BRD-to-SRS and SRS-to-Test Cases traceability matrices for all projects.
- Validated the WSDL using schema compliance assertions in SoapUI.
- Validated PL/SQL and DataStage batch processes scheduled using Autosys.
- Validated various SSIS and SSRS packages according to functional specifications.
- Heavily involved in performance testing and worked with analyzing tables, gathering statistics, tuning of SQL queries used to fetch source data.
- Validated DataStage jobs; used DataStage Director to execute and verify them.
- Used the TOAD GUI tool for querying the Oracle database.
- Presented QA team work update/Status to Project/Program Manager in status meeting and also worked with business team to get approval on test plan/test cases.
- Validated that design meets requirement and functions according to technical and functional specifications.
- Involved with Design and Development team to implement the requirements.
- Developed Test Scripts and executed them manually to verify the expected results and published the same to wider audience upon successful completion along with QA signoff.
- Participated in KT meetings with Business Analysts and SMEs.
- Frequently used Perforce to store documents and update them with the latest versions and additions.
- Participated in walkthrough and defect report meetings periodically.
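An illustrative sketch of the data-dictionary comparison behind the pre-process and column-definition checks above, using Oracle's ALL_TAB_COLUMNS view; the schema names SRC_SCHEMA and TGT_SCHEMA and the table name CUSTOMER are hypothetical placeholders:

  -- Columns missing on either side, or differing in data type or length
  SELECT NVL(s.column_name, t.column_name) AS column_name,
         s.data_type AS src_type, s.data_length AS src_len,
         t.data_type AS tgt_type, t.data_length AS tgt_len
  FROM (SELECT column_name, data_type, data_length
          FROM all_tab_columns
         WHERE owner = 'SRC_SCHEMA' AND table_name = 'CUSTOMER') s
  FULL OUTER JOIN
       (SELECT column_name, data_type, data_length
          FROM all_tab_columns
         WHERE owner = 'TGT_SCHEMA' AND table_name = 'CUSTOMER') t
    ON t.column_name = s.column_name
  WHERE s.column_name IS NULL
     OR t.column_name IS NULL
     OR s.data_type   <> t.data_type
     OR s.data_length <> t.data_length;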
Environment: SQL, PL/SQL, Remedy, DB2, Oracle 11g, DataStage 8.0, TOAD, Rational ClearQuest, ClearCase, JIRA, Oracle SQL Developer, SoapUI, XML, MS TFS, Putty, VB Script, Quality Center 10, Agile, Cognos 8, Autosys, SQL Server 2008, SSIS, SSRS, SSAS, XP, UNIX
Confidential, Bentonville, AR
Data Analyst/ Web Interface Tester/ETL Tester
Responsibilities:
- Involved in meetings with Business Analysts and End Users to review functional/technical requirements and responsible to translate business requirements into quality assurance test cases.
- Used TOAD to develop and execute test scripts in SQL, to validate the test cases.
- Involved in reviewing test scenarios, test cases and test results for data warehouse/ETL testing.
- Prepared Requirements Traceability Matrices (RTM), positive and negative test scenarios, detailed test scripts, test kickoff documents, a test scorecard for test progress status, test results, a release checklist, lessons learned documents, and a regression test suite for future use.
- Responsible for testing Initial/Reconcile and Incremental/daily loads of ETL jobs.
- Interacted with design/development/DBA team to decide on the various dimensions and facts to test the application.
- Planned ahead of time to test the mapping parameters and variables by discussing with BA’s.
- Extensively used Rational ClearQuest to track defects and managed them.
- Validated various Ab Initio Graphs according to business requirement documents.
- Extensively tested several Business Objects reports for data quality, fonts, headers, footers, and cosmetics.
- Validated the cardinality of joins and data integrity on the business objects universe.
- Conducted user acceptance testing (UAT) to validate that the developed application meets the business requirements.
- Extensively involved in testing the ETL process from different data sources (SAP, PeopleSoft, Teradata, SQL Server, Oracle, flat files) into the target Oracle database as per the data models.
- Used Autosys for scheduling the ETL mappings, PL/SQL subprograms.
- Wrote test cases for ETL to compare source and target database systems.
- Mocked up test data to cover all planned scenarios and test cases.
- Used UNIX commands for file management; placing inbound files for ETL and retrieving outbound files and log files from UNIX environment.
- Wrote several complex SQL queries for data verification and data quality checks (an illustrative sketch appears at the end of this list).
- Analyzed the testing progress by conducting walk through meetings with internal quality assurance groups and with development groups.
- Responsible for documenting the process, issues and lessons learned for future references.
- Reviewed the test activities through daily Agile Software development stand-up meetings.
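An illustrative sketch of the kinds of data quality checks written for data verification; the SALES_FACT and PRODUCT_DIM tables, their columns, and the load-window dates are hypothetical placeholders:

  -- Referential integrity: fact rows pointing at a missing dimension key
  SELECT f.sales_id, f.product_key
  FROM sales_fact f
  LEFT JOIN product_dim d ON d.product_key = f.product_key
  WHERE d.product_key IS NULL;

  -- Domain check: amounts must be non-negative and dates inside the load window
  SELECT COUNT(*) AS bad_rows
  FROM sales_fact
  WHERE sales_amount < 0
     OR sales_date NOT BETWEEN DATE '2012-01-01' AND DATE '2012-12-31';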
Environment: Ab Initio 2.13, Teradata V2R6, Teradata SQL Assistant, SQL, Business Objects XI, ClearCase, ClearQuest, RequisitePro, HP Quality Center, SharePoint, Web Logic, Oracle 10g, Data Flux, SQL Server 2008, DB2, PL/SQL, UNIX, Putty, Flat files, Session Logs, Rally, Remedy, Windows XP.
Confidential, Chicago, IL
QA Consultant / ETL Tester / Reports Tester
Responsibilities:
- Involved in writing test plans, test cases, and the RTM, and in analyzing expected versus actual results.
- Worked on extracting data from the mainframe and loading it into Oracle.
- Performed Integration, End-to-End and System testing.
- Responsible for source system analysis, data transformation, data loading and data validation from source systems to Transactional Data system and Warehouse System.
- Extensively used ETL to load data from Flat files to Oracle.
- Executed various SQL Queries to perform the backend testing.
- Tested DataStage jobs for various ETL functions and transformations.
- Held regular meetings with developers to report issues.
- Clearly communicated and documented test results and defect resolutions during all phases of testing.
- Simulated several production cycles; worked with data validation, constraints, record counts, source-to-target row counts, random sampling, and error processing (see the illustrative SQL sketch at the end of this list).
- Performed Black box testing, White-box testing, System Testing, Data Integrity Testing and end to end testing.
- Implemented SDLC, QA methodologies and concepts in the Project.
- Involved in Developing Test Plans and Developed Test Cases/Test Strategy with input from the assigned Business Analysts.
- Tested UNIX batch jobs according to the specifications and the functionality.
- Developed test scripts based upon business requirements and processes. Processed transactions from system entry to exit.
- Used ETL methodologies to support data extraction, transformation, and loading in a corporate-wide ETL solution built with Informatica PowerCenter.
- Involved in developing Unix Shell wrappers to run various SQL Scripts.
- Extensively created UNIX shell scripts for scheduling and running the required jobs.
- Tested the ETL DataStage jobs and other ETL processes (data warehouse testing).
- Tested the ETL process both before and after the data validation step; tested the messages published by the ETL tool and the data loaded into various databases.
- Performed the tests in the SIT, QA, and contingency/backup environments.
- Involved in extensive DATA validation using SQL queries and back-end testing.
- Heavily involved in testing Cognos reports; extensively used SQL to validate report data.
- Used TOAD Software for Querying ORACLE and Used WinSQL for Querying DB2.
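An illustrative sketch of the row-count and random-sampling checks mentioned above, in Oracle syntax; the SRC_TXN and TGT_TXN tables and their columns are hypothetical placeholders:

  -- Source vs. target row counts side by side
  SELECT 'SRC' AS side, COUNT(*) AS row_count FROM src_txn
  UNION ALL
  SELECT 'TGT' AS side, COUNT(*) AS row_count FROM tgt_txn;

  -- Random ~1% sample of source rows to spot-check values carried into the target
  SELECT s.txn_id, s.txn_amount AS src_amount, t.txn_amount AS tgt_amount
  FROM (SELECT txn_id, txn_amount FROM src_txn SAMPLE (1)) s
  LEFT JOIN tgt_txn t ON t.txn_id = s.txn_id;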
Environment: Oracle 9i, TOAD, DB2, SQL, DataStage, PL/SQL, Cognos, MS Visio, Informatica, Rapid SQL, Mainframes, .NET, XML, COTS, RUP, Windows, Autosys, Remedy, SQL Server 2005, UNIX, Putty