QA Analyst Resume
Charlotte, NC
SUMMARY
Five years of experience in Quality Assurance testing of web-based and software applications
- Good understanding of Quality Assurance Methodologies.
- Proficient in the software development life cycle (SDLC). Tested applications in waterfall and agile development processes, including writing test cases using test plans and functional specifications.
- Analyzed business requirements specifications; experienced in various types of testing, including black box, white box, functional, system, regression, and performance testing.
- Excellent manual testing skills using Mercury Interactive tools such as Test Director, Quick Test Professional (QTP), and Quality Center.
- Extensive experience using Test Director as the Test Management tool
- Profound experience in detecting, logging, reporting, and tracking the bugs present in the Application Under Test (AUT)
- Excellent working knowledge of automation tools HP Mercury Quick Test Pro, WinRunner, and LoadRunner
- Expertise in creating manual test scripts for Functional and GUI testing using SQL queries on databases such as MySQL Server and Microsoft Access
- Detail oriented with excellent problem solving and analytical skills
- Positive, quality-oriented and reliable team player with capability to work independently under minimal supervision
- Strong work ethics, excellent oral and written communication skills
- Excellent time management skills with ability to multitask and prioritize tasks as needed
TECHNICAL SKILLS:
Software Testing Tools: Rational Quality Manager, Rational Functional Tester, QTP, HP Quality Center
Programming Languages: SQL
Operating Systems: Windows XP, Windows Vista, DOS
Application Package: MS Office 2003, MS Office 2007
Databases: Oracle, MySQL, MS Access
Web Technologies: HTML, JavaScript
Browsers: Internet Explorer, Mozilla Firefox
PROFESSIONAL EXPERIENCE
Confidential, Charlotte, NC Feb ’11 - June ’11
QA Analyst
- Reviewed the entire project and the full SDLC documentation, including Requirements, Analysis/Design, and Development, gaining a thorough understanding of the SDLC methodology
- Participated in various meetings with business and development teams to ensure project feasibility
- Reviewed Business Requirement Document and Design Specification Documents
- Formalized the test strategy and prepared a test plan to define the scope of testing, test approach, test environments, types of testing, test milestones, etc.
- Conducted walkthroughs with testing team to communicate strategy stated in Test Plan
- Developed and executed Test Cases for Functional Testing and Backend Testing
- Executed test cases in the Integration and Regression environments
- Participated in test case walkthrough and assessment meetings
- Performed Backend Testing by retrieving data from the SQL Server database using SQL queries
- Used Mercury Quality Center to write, review, and execute test cases, log defects, prepare bug reports, and maintain the requirement traceability matrix
- Performed Test Data Automation and Data-Driven Testing using WinRunner
- Created and configured GUI maps for standard and custom objects; created virtual users to check the performance of the system using LoadRunner
- Extensively used Quick Test Pro (Version 11) to carry out Functional, Integration, End-to-end and Regression Testing
- Reported bugs and problems, completing testing per the prepared test plan; held several meetings with software developers and technical content writers to update the test documents
- Coordinated test environment setup and maintenance through various releases
- Environment: Mercury Tool (Version 9.0), STB, Windows
Confidential, IL Apr ’10 - Dec ’10
QA Analyst
- Reviewed business requirements and design documents and prepared the test plan for the assigned project
- Worked according to the activities laid down in each phase of Software development life cycle and coordinated the environment setup for the testing
- Participated in various business meetings along with the business team to gather system requirements
- Analyzed the system requirements to prepare a detailed Test Plan that included the scope, approach, and objectives of the testing
- Prepared test data for positive and negative test scenarios as documented in the test plan, using an MS Access database
- Documented the requirements and Test Cases using Quality Center
- Performed Smoke Testing to ensure the system stability before going for automation testing
- Used Mercury Quality Center to write, review, and execute test cases, log defects, prepare bug reports, and maintain the requirement traceability matrix
- Conducted walkthroughs with the development team and end users to communicate the testing strategy and analyze the associated risks
- Collected statistics and graphs after running the tests, analyzed the results, and reported performance bottlenecks as necessary
- Identified critical performance results such as throughput, hits per second, response time, and network delay
- Organized meetings with the development and testing teams to communicate transaction response times and the summary of test results
- Used Quality Center for Defect Tracking and Reporting
- Environment: Quality Center, Windows, MS Access, Rational Tool, Internet Explorer
Confidential, Pueblo, CO Oct ’07 - Feb ’09
QA Analyst
- Involved in testing system functionalities
- Analyzed requirement documents, broke down the requirements into different modules, and documented the details in Test Director’s Requirement Module
- Created Test Plans and Test Cases for different test types such as Black Box Testing, Functional Testing, Integration Testing, System Testing, Compatibility Testing, and UAT
- Created and executed Automation Test Scripts for Functional and Regression Testing using WinRunner
- Authored and updated Test Cases based on use cases and design documents
- Prepared Test Summary Reports
- Used Test Director for bug tracking, reporting and analysis
- Environment: Windows, Internet Explorer, Excel
Confidential, CA Jan ’07 - Dec ’08
QA Tester
- Examined business requirements and use cases to understand the functionality of the project
- Created Test Plans and Test Cases based on requirements and specifications using Mercury Test Director
- Worked with the team to analyze the user environment and set up a matching environment for testing
- Executed test cases manually to verify actual results against expected results
- Identified test cases to be automated and created test scripts for Functional, GUI, and Regression Testing
- Administered defect tracking and monitored change management using Mercury Quality Center
- Worked through each phase of the Software Development Life Cycle and coordinated the environment setup for testing
- Environment: Mercury Test Director, Windows 2000/XP, JAVA
Confidential, Long Island, NY Nov ’05 - Sep ’06
QA Engineer
- Analyzed requirement documents, broke down the requirements into different modules and documented the details in Test Director’s Requirement Module
- Created Test Plans and Test Cases for different test types such as Black Box Testing, Functional Testing, Integration Testing, System Testing, Compatibility Testing, and UAT
- Created and executed Automation Test Scripts for Functional and Regression Testing using WinRunner
- Performed Test Data Automation and Data-Driven Testing using WinRunner
- Enhanced the created test scripts using both recording and programming features; while recording tests, various checkpoints such as Standard, Text, Bitmap, Table, and Database were inserted to check the behavior of the application
- Authored and updated Test Cases based on use cases and design documents
- Prepared Test Summary Reports
- Tested functionality, navigation, alignments, and broken links within the firewall and outside the firewall
- Used Test Director for bug tracking, reporting and analysis
- Environment: Test Director, Java, Oracle, SQL, Excel, and Windows