
Senior Quality Analyst Resume


Detroit, MI

Professional Summary:

Over 8 years of professional experience in Information Technology, with extensive experience in performing Manual and Automated Testing using HP automated tools.

  • Extensive experience in using HP automated tools with n-tier, web-based, mainframe, client-server and ERP applications.
  • Extensive experience in all phases of the Software Development Life Cycle and the QA life cycle.
  • Expertise in performing Integration, System, End-to-End, Regression, Performance and User Acceptance Testing.
  • Experience in Black Box Testing of applications in the Insurance, Healthcare, Finance, Automobile and Banking verticals.
  • Experience in performance, load and stress testing of systems by generating web traffic and performing precise measurements using LoadRunner and Performance Center.
  • Experience in Functional, System, Integration, Regression, Performance, Back End/Front End and User Acceptance Testing.
  • Proficient in using Mercury automated testing tools such as Performance Center, WinRunner, QuickTest Pro, LoadRunner and Quality Center.
  • Expertise in software management tools such as TestDirector, PVCS Tracker and ClearQuest for defect tracking and reporting.
  • Experience in writing Test Plans, defining Test Cases, developing and maintaining test scripts, and analyzing the results.
  • Experienced Quality Assurance professional with multiple skill sets and a proven record of successfully completing projects.
  • Solid hands-on experience in management and testing of complex IT systems.
  • Possess a unique mixture of management, technical and people skills critical to the success of IT projects.
  • Strong experience in developing Test Plans, Test Cases and Test Scripts from Requirements and Use Cases.
  • Experienced in testing under UNIX and Windows XP/NT/98/2000 environments.
  • Good knowledge of LoadRunner and Performance Center architecture.
  • Capable of working independently and as part of a team.
  • Strong application knowledge, interpersonal skills, good verbal and written communication capabilities, presentation skills and leadership qualities.
  • Experience with testing environments like Java, Visual Basic, ASP, HTML, DHTML, XML, JavaScript, VBScript and COM/DCOM, and with relational DBMSs like Oracle and SQL Server, including writing SQL queries.

Technical Skills:
Testing Tools: Performance Center 8.1, WinRunner 7.x/6.x, LoadRunner 7.x/8.x, QTP 6.5/7.0
Bug Tracking Tools: PVCS Tracker and Quality Center
Test Management Tool: HP Test Director 8.x/7.x
Operating Systems: Windows XP/2000, Windows NT and UNIX
Languages: C, C++, Java, SQL and PL/SQL
GUI: Visual Basic, Visual Interdev and Front Page
Web-Technologies: HTML, XML, and ASP
Databases: Oracle, MS-Access, DB2 and SQL Server
Script Languages: JavaScript, VB Script and UNIX Script
Other Tools: TOAD, MS Office, MS Visio

Education:
Bachelor's in Computer Science Engineering

Professional Experience

Confidential, Detroit, MI Nov '06 - Aug '09

PQA (Performance Quality Analyst) & V&V
GMAC (General Motors Acceptance Corporation) Financial Services has a diversified portfolio of business operations, including Automotive Finance, Personal Lines Insurance, Dealer and Real Estate Finance, and other Commercial Business. Worked as a Performance Quality Analyst and administrator for Performance Center and TLab (Test Lab) activities, handling applications such as GMAC Dealer, Smart Auction, Smart Cash, Open Pages, Account Access, PLCP, Gmacfs, SAM 1 and SAM 2.

Roles
Performed the role of a Sr. QA Analyst and Performance Center Administrator.

Responsibilities:
Senior Quality Analyst

  • Assisted management in developing standard processes and principles (SDP) for a successful implementation of the QA life cycle (QALC).
  • Facilitated knowledge transfer by conducting various sessions with the V & V (Verification and Validation) team.
  • Generate, validate or modify the test scripts for the required business processes and make sure the scripts reflect the objectives of the test.
  • Generate, validate or modify the test data required for the desired load tests.
  • Execute or monitor the performance test execution to achieve the desired results as per the test plan.
  • Organized meetings with the project manager to go through the V & V process and engaged the V & V team for the testing.
  • Organized daily or weekly application status meetings and reported to higher management where necessary.
  • Made sure that the application teams were compliant with the testing process (SDP).
  • Resolved issues through open-ended communication with the various testing teams.
  • Created and articulated guidelines for the Master Test Plan.
  • Supporting the testing activities during all the phases of QALC.
  • Reviewed and approved the Test Plan and Test Cases, and published a Test Schedule.
  • Providing recommendations to the Test Plan and Test Cases.
  • Monitoring day to day testing achievements during the execution phase.
  • Facilitated sessions for the Acceptance review.
  • Conducted brainstorming sessions with the SMEs and BAs to validate the requirements and provide input from a testing standpoint.
  • Ensure test results represent a complete view of an application's performance characteristics.
  • Ensure performance tests are accurate predictors of production performance.
  • Review the Test Plan and Test Cases for all phases (System Testing, System Integration Testing, Regression Testing, Performance Testing, User Acceptance Testing) and perform acceptance of the test results.
  • Implemented best practices with the application to get the best product deployed.
  • Creating monthly metrics to monitor the status of the applications.

Responsibilities:
Performance Center Administrator

  • As an administrator of Performance Center, managed different projects and maintained the test calendar.
  • Work along with the TLab admin to make sure the TLab infrastructure is ready for testing, ensuring hardware and software are procured and installed or ready for installation in the test environment.
  • Making sure connectivity between the TLab infrastructure, the capacity environment and the load generators is available and the environment is working well.
  • Guiding the application team through the TLab process and providing them with the TLab toolkit.
  • Organize meetings to discuss the architecture of the application and recommend the best approach for performance testing.
  • Understanding the requirements of the application, which involve the number of Vusers, transaction volumes and scenarios to be tested, along with the targeted response times.
  • Estimated the application's network bandwidth based on the requirements.
  • Make sure the existing protocols are sufficient for scripting, or initiate procurement of the new protocols.
  • Ensure performance tests use a consistent and repeatable process.
  • Perform test plan reviews with the application team to make sure all the projections made meet the requirements.
  • Generate or Validate or Modify the test scripts for the required business process and make sure the scripts reflect the objectives of the test.
  • Generate or Validate or Modify the test data required for the desired load tests.
  • Monitor the performance test execution to achieve the desired results per the test plan.
  • Managing User Privileges, Host machines, Servers and accessing Privilege Manager.
  • Managing Test Runs, User request and Event log.
  • Demonstrated strong analytical skills in preparing performance reports.
  • Analyzed the results for bottlenecks and checked CPU utilization, throughput and bandwidth.
  • Reviewing the test results to determine if the results fall within the expected ranges.
  • Generated analysis reports by comparing the test results with the acceptance criteria to validate whether the goals of the performance tests were achieved.
  • Identify areas where the performance goals were not met, or where other performance vulnerabilities and risks are involved.
  • Analyzed the performance test results, created reports for the signoffs and approval.
  • Presented the analysis report to the technical and business owners followed by meetings to discuss the outcome of the test.
  • Generated Performance Analysis Report for every application release.

Environment: Performance Center 8.1, Quality Center 9.0, LoadRunner 8.1, SQL 2000, Windows 2000

Confidential, Hartford Aug '05 - Aug '06
Sr Quality Analyst

The project comprises various applications across multiple platforms. These applications provide data to various marketing groups of a major pharmaceutical company. The data is received from an external vendor, processed, loaded into various databases, and supplied to the respective business owners for further analysis and processing.

Responsibilities:

  • Facilitated Joint Application Development Sessions (JAD) to identify business rules and requirements and then documented them in a format that can be reviewed and understood by both business people and technical people.
  • Created Business Requirements and converted them into System Requirement Specifications (SRS) and Functional Requirements Specifications (FRS).
  • Worked on Developing, executing and maintaining Test Plans, Test Scenarios, and Test Cases Document based on business requirements.
  • Involved in User Acceptance Testing (UAT).
  • Manual execution of generated Test cases for both the user and administrator modules.
  • Performed Development Integration, System Integration, End to End and User Acceptance Testing for the data services.
  • Performed Functional testing, Batch testing, System testing Performance testing and Regression Testing to see the entire functionality.
  • Involved in application security testing for securing the application with proper authentication & authorization.
  • Creating test cases and test sets, tracing them to requirements and executing them in Mercury Quality Center.
  • Reporting defects in Mercury Quality Center and generating reports for the daily status meetings.
  • Preparing status reports and defect reports for daily status meetings and sending them to all the stakeholders of the project.
  • Extensive use of UNIX commands to move source files (XML and txt) and trigger files to the ETL QA server.
  • Using workflow manager to edit, validate and run workflows, worklets and tasks.
  • Using workflow monitor to monitor the status of workflows and sessions.
  • Performing test data set-up in XML for testing and update test data based on the requirements.
  • Validating the data in staging (umf) and permanent (EDW and data mart) tables using Aqua Data Studio.
  • Querying data from different database tables as per the requirement and executing the source queries from workflows.
  • Involved in database testing by writing and executing SQL and PL/SQL queries using TOAD to validate that data is populated into the appropriate tables, and manually comparing the results with front-end values (a representative validation query is sketched after this list).
  • Analyzed all reported bugs for continuous process improvement and kept track of the SDLC critical path.
  • Responsible for reporting bugs through the Quality Center and verified known bugs against new builds.
  • Used Quality Center to store all testing results, metrics, implemented Test plan Document, created Test Cases, Defect Tracking & Reporting.
  • Assigned bugs to programmers using Quality Center, a bug-tracking tool, and verified that all reported bugs were fixed by the software developers.
  • Interacted with the developers in fixing all the problems.
  • Performed web testing of the application for browser dependencies.
  • Conducted User Acceptance Testing (UAT) with users and customers and wrote an issues log based on the outcome of UAT.
  • Initiated and participated in conference calls, walkthrough and review meetings.

Environment: Windows NT 4.0, Quality Center 9.2, UNIX, XML, SQL Server 2005, Oracle 10g, Shell Scripts, HTML

Client: Confidential, IL Feb '04 - Jun '05
Project: ADW (Allstate Data Warehouse)
Quality Analyst
Allstate started migrating data from the existing EDW (Enterprise Data Warehouse) in DB2 to ADW (Allstate Data Warehouse) in Oracle for consolidated management reporting. I was involved in testing the Medical Management, Claims Reporting and Household Relations Management subsystems in ADW. The project was divided into two parts: history migration (moving history data from EDW to ADW) and redirecting the interfaces to ADW.

Responsibilities:

  • Understanding the source to target mapping rules and technical requirements documents.
  • Documented Test cases corresponding to business rules in Test Director.
  • Executing test cases/test scripts documented in Test Director.
  • Involved in Developing Test Plans & Test Scripts.
  • Executed test scripts and reported the bugs to development team using Test Director.
  • Wrote SQL statements to extract data from tables and to verify source-to-target mappings; performed backend testing using TOAD (a representative check is sketched after this list).
  • Entered, tracked and followed up on issues (bugs) found during testing.
  • Performed system, Integration, End-to-End and functional testing.
  • Reported the bugs, Emailed notifications to the developers using the Test Director.
  • Monitored Test Director to close the bugs/cases as and when developers fix the bugs.
  • Responsible for providing regular test reports and test matrices to management.
  • Created Vuser scripts for load and performance testing using the Vuser Generator in LoadRunner.
  • Parameterized and correlated the Vuser scripts for real-world scenario execution.
  • Generated various business scenarios to simulate real-world load using Performance Center.
  • Monitored run time, transactions, web resources, transaction response times, network delay, hit rate, throughput, etc., using the Controller.
  • Analyzed test results, reported defects, tracked defects and maintained test results.

Environment: LoadRunner 7.8, TestDirector 7.0, Oracle 8i/9i, SQL, TOAD, MS Office, Windows NT 4.0 and UNIX.

Confidential, Windsor, CT Jul '03 - Dec '03
QA Analyst

Fleet Boston Financial Lockbox Operations is a service provided to companies looking to outsource the collection of payments. This business process benefits companies in their efforts to expedite the preparation of payments for deposit to the company's lockbox account. The process also allows companies to post their accounts receivable file (posting file), which the customer collects via dialup (Connect Mailbox), FTP, NDM (Connect:Direct) or diskette. The CEP process makes the daily deposits to the customer's lockbox accounts and Treasury Express. This is a re-engineering project from the Fleet Boston Financial Banc Tech environment to a new platform. The project also involves setting up new customers and ongoing maintenance.

Responsibilities:

  • Formulated Test Plans that contains test scenarios for testing the functionality of the application.
  • Conducted walkthroughs of the Test Plans with business analysts, Development and Design teams.
  • Conducted White Box Testing and Black Box Testing.
  • Performed Manual and Automated Testing.
  • Wrote PL/SQL scripts to verify the database updates, inserts and deletes of records (an illustrative verification query appears after this list).
  • Prepared test data for the negative and positive test cases.
  • Performed the Data Driven tests that deal with different sets of data.
  • Verified the images to make sure that they are loading properly.
  • Executed Error Handling and Exception cases.
  • Worked with the development team to ensure testing issues were resolved in a timely manner.
  • Generated detailed bug reports, pass/fail reports and comparison charts.
  • Prepared detailed documentation to include test procedures and test execution results.
  • Supported and generated the performance test scripts using LoadRunner.
  • Executed the LoadRunner scripts to simulate real-world scenarios and determine the performance of the application.

Environment: Microsoft Windows 2000/Windows NT 4.0, Microsoft IIS 5.0, Embarcadero Rapid SQL 6.0.2, Microsoft SQL Server 7.0/2000, Erwin 3.5.2, Visual Studio 6.0

Testing Tools: LoadRunner 6.0, WinRunner 6.0, TestDirector

Confidential, OH Dec '02 - Jun '03
QA Tester

SWR (Supplier Warranty Reporting) portfolio is a web-based application accessed by suppliers, STAs, buyers and other internal resources to track a part's performance on vehicles. The project's goal is to interface existing and new SWR processes with actual vehicle production history from the MVP repository, to better associate an engineering supplier with a given vehicle's warranty claims and to provide part usage summaries, so that warranty costs can be more accurately attributed to Honda suppliers for cost reclamation and future strategic sourcing decisions.

Responsibilities:

  • Reviewed Business Requirement Documents and the Technical Specification.
  • Writing test cases by reviewing business requirement documents, use cases, program specifications and process flow diagrams.
  • Documenting test cases for various screens of the application by using the user interface documents.
  • Involved in Positive testing, negative testing, performance testing and regression testing using Win Runner.
  • Maintained and executed Test cases and Test Scripts using Test Director by preparing Test Sets.
  • Editing the test scripts. Documented the testing methodology and procedures.
  • Parameterized and correlated the user scripts for real-world scenario execution.
  • Generated various business scenarios to simulate real-world usage.
  • Monitored run time, transactions, web resources, transaction response times, network delay, etc., using the Controller.
  • Involved in helping Business Analysts during User Acceptance Testing.
  • Reported defects in a timely manner and ensured that they were fixed and retested.

Environment: Oracle, UNIX, JAVA, JSP, JDBC, and SQL Plus
Testing tools: Win Runner 7.5, Test Director 7.0

Confidential, India Apr '01 - Aug '02

Project: Credit Card Billing System
The application keeps track of customers, the monthly minimum amount paid, the balance amount to be recovered, and any claims, and generates detailed reports on the monthly activity of the customers.
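Testing the monthly reports would typically mean recomputing the reported figures directly from the base tables and comparing them with the application's output. A minimal sketch of that kind of cross-check is below; account, txn and their columns are hypothetical names chosen only to illustrate the idea.

    -- Recompute outstanding balance and payments for one month per account (illustrative names only)
    SELECT a.account_id,
           SUM(CASE WHEN t.txn_type = 'CHARGE'  THEN t.txn_amt ELSE 0 END)
         - SUM(CASE WHEN t.txn_type = 'PAYMENT' THEN t.txn_amt ELSE 0 END) AS balance_to_recover,
           SUM(CASE WHEN t.txn_type = 'PAYMENT'
                     AND t.txn_dt BETWEEN DATE '2002-04-01' AND DATE '2002-04-30'
                    THEN t.txn_amt ELSE 0 END) AS paid_in_month
    FROM   account a
           JOIN txn t ON t.account_id = a.account_id
    GROUP  BY a.account_id;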

Responsibilities:

  • Developed test cases referring to the business requirements documents, program specifications and design documents.
  • Participated in project review meetings to better understand the applications under test.
  • Involved in preparing test data for various scenarios identified.
  • Involved in performing different types of testing: functional, integration, system and regression testing.
  • Involved in testing the functionality and usability of the system from end user perspective.
  • Documented and communicated test results.

Environment: Oracle, Visual Basic, Test Director, Windows, SQL, and MS Access
