QA Analyst Resume
NJ
SUMMARY
- Diversified experience in software design, development, and testing. Expertise in designing and developing test plans and test cases and in generating test scripts.
- Hands-on experience in manual and automated testing using WinRunner, LoadRunner, Quick Test Pro, and Test Director. Involved in integration testing, GUI testing, volume testing, stress testing, functional testing, system testing, end-to-end testing, and cross-browser testing.
- Strong experience in C, C++, Java, Oracle, SQL, and J2EE. Excellent communication and interpersonal skills, with a clear understanding of the business and the ability to work as part of a team.
TECHNICAL SKILLS
Languages: Java, C, C++, SQL, VB.
Internet Skills: JSP, ASP, HTML, DHTML, XML, VBScript, JavaScript.
Databases: MS Access, DB2 7.0, Oracle 10g/8i/7.x.
OS: Windows NT/95/98/2000, AS/400, UNIX (Solaris)
Servers: IBM WebSphere 5.0, WebLogic 5.1, Java Web Server
Other Tools: Rational ClearCase, ClearQuest.
Testing Tools: WinRunner 7.6/8.0, LoadRunner 8.0, Quick Test Pro 8.0, Test Director
PROFESSIONAL EXPERIENCE
Confidential, NJ
QA Analyst
Responsibilities:
- Worked with business analysts and other groups to gather the requirements.
- Created Test plans and test cases for various applications.
- Actively contributed to preparing the test plan and wrote test cases according to business requirements, risk analysis, and severity/hazard classifications.
- Created the UAT test strategy, UAT test plans, and UAT defect reports.
- Managed testing activities such as regression, browser, accessibility, performance, and cacheability testing according to the testing schedule and scope.
- Used Quality Center for logging and tracking defects.
- Developed a test-environment tool in MS Access to help QA maintain an effective and productive software project test operation.
- Analyzed requirement documents, design documents, and business rules.
- Involved in designing and Writing System test Scripts.
- Involved in executing test scripts in Integration/Regression Testing phases.
- Fully involved in UAT testing phase to support client.
- Extensively used TFS (Team Foundation Server) for version control when testing different modules of the code in a project.
- Used TFS as a project portal for project status and for communication between team members.
- Also used TFS for requirement and defect tracking.
- Involved in Client Management calls and delivering solutions on time.
- Fully involved in Defect Prioritization, Defect Tracking and reporting to the test lead.
- Maintained a defect log that gave a clear indication of the quality and stability of the product.
- Developed a Traceability Matrix to document that every specification had been covered by tests.
- Worked closely with the other members of the Development Team and infrastructure team members to resolve application issues.
- Responsible for resolving production issues.
- Supported production releases every week.
- Actively communicated with the end users and business users to create realistic test cases.
- Excellent problem-solving skills, experience working on group projects, and the desire and ability to learn and apply new technologies. Self-motivated and detail-oriented.
- Provided corporate training and facilitated training sessions and presentations on the application context.
- Extensively involved in build acceptance testing, functional testing, and database testing using SQL Server 2005; good experience in black-box and white-box testing.
- Performed User Acceptance Testing (UAT).
- Validated data migration.
- Used Build Forge to deploy applications in QA environments.
- Used the IBM administrative console to change data sources and bounce the application servers.
- Verified various system log files to make sure the application was properly deployed and working in the UNIX environment.
- Executed shell scripts to generate application statistics and also to load data into the LDAP server.
- Used Toad, SQL Navigator, and SQL Developer to change database properties for different applications in the test environment.
- Used the database to troubleshoot production issues.
- Prioritized test cases for manual vs. automated execution.
- Used Toad, SQL Navigator, and SQL Developer to run SQL for data validation.
- Used a test client for functional testing of XML, HTTP, and SOAP requests, verifying that requests and responses from the server were appropriate.
- Defined and documented test cases to exercise the product to reveal potential defects
- Optimized QTP scripts for Regression testing of the application with various data sources and data types.
- Designed the Automation Testing Framework for the regression testing and actively involved in review of manual test scripts and QTP automated test scripts.
- Planned and executed systems testing and supported UAT testing.
- Worked with Agile Environments for various projects.
- Written and Executed SQL Scripts for Backend validation.
- Mapped the custom objects to the standard objects where necessary, and inserted GUI, Bitmap and Text checkpoints where needed, to compare the current behavior of the application being tested to its behavior in the earlier version using Quick Test Pro (QTP).
- Parameterized the fixed values in checkpoint statements, created data tables for the parameters, and wrote functions to read new data from the table on each iteration (data-driven testing).
- Extensive experience writing and executing automated testing scripts primarily with Quick Test Pro (QTP 9.1). Optimized QTP scripts for Regression testing of the application with various data.
- Performed back-end testing on Oracle and SQL Server 2005; working knowledge of UNIX and of bug-tracking tools such as JIRA and Quality Center 9.1.
- Tested the functionality of different applications on smartphones with the IE, Opera Mini, Blazer, and BlackBerry browsers; feature phones with the Openwave and Access NF browsers; and hybrid phones with the Teleca and Openwave browsers.
- Supported production releases of Vzstart and other applications for every change control.
- Worked with JMeter for performance testing and performance tuning.
- Gathered performance test requirements from Business users and Analysts.
- Created various performance test scenarios to identify application bottlenecks.
- Worked with System administrators to resolve performance issues.
- Used JMeter to create performance test scenarios.
- Performed load testing to compare two different servers like message broker server and Application server.
- Tested different Mobile Handsets for various applications to check functionality.
- Used Device Anywhere for Mobile app testing.
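The data-migration validation and SQL-based data checks above can be sketched as follows. This is a minimal illustration in Python, using the standard-library sqlite3 module as a stand-in for the Oracle/SQL Server databases; the table names are hypothetical:

```python
import sqlite3

def row_counts_match(conn, source_table, target_table):
    """Basic data-migration check: compare row counts between a source
    table and its migrated target table."""
    cur = conn.cursor()
    cur.execute(f"SELECT COUNT(*) FROM {source_table}")  # illustrative only
    src = cur.fetchone()[0]
    cur.execute(f"SELECT COUNT(*) FROM {target_table}")
    tgt = cur.fetchone()[0]
    return src == tgt, src, tgt

# Demo with an in-memory database standing in for the real back end.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers_src (id INTEGER, name TEXT);
    CREATE TABLE customers_tgt (id INTEGER, name TEXT);
    INSERT INTO customers_src VALUES (1, 'a'), (2, 'b');
    INSERT INTO customers_tgt VALUES (1, 'a'), (2, 'b');
""")
ok, src, tgt = row_counts_match(conn, "customers_src", "customers_tgt")
print(ok, src, tgt)  # → True 2 2
```

In practice the same count (and checksum) queries would be run through Toad, SQL Navigator, or SQL Developer against the source and target schemas.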
Environment: Windows, SQL, Oracle, IBM Build Forge, JMeter, TFS, Device Anywhere, Quality Center, Mozilla Firefox, MS Access, Toad, SQL Navigator, QTP, UNIX, BlackBerry simulators, JIRA.
Confidential, NJ
Sr. QA Analyst
Responsibilities:
- Developed detailed Testing Strategy for the entire application and developed various test cases.
- Initiated, coordinated, and implemented the system testing process.
- Involved in the Manual/Automated Testing.
- Extensively worked on Insurance Modules pertaining to commercial and personal lines.
- Involved in Testing the Quotes for Auto insurance, Health Insurance for different states.
- Also involved in testing the Property and Casualty Insurance pertaining to Small Business, Large Business.
- Created various Vuser Scripts for the Application using LoadRunner.
- Enhanced and modified the scripts according to the test case scenarios.
- Extensively worked on Virtual User generator and Controller.
- Parameterized the scripts and enhanced them according to the test case.
- Used Performance monitor and LoadRunner graphs to analyze the results.
- Performed regression testing for the application.
- Used Quick Test Pro to perform automated testing.
- Converted all manual test cases into QT Pro automated scripts.
- Inserted Database, XML, Table, and GUI checkpoints in Quick Test Pro scripts to check the validity of data.
- Worked on XML and SOAP Services.
- Modified QT Pro scripts in expert mode and various user-defined functions were written.
- Developed Scripts for performance and data driven test using QT Pro.
- Performed black-box testing and user acceptance testing, and extensively used WinRunner for regression testing.
- Performed regression testing and generated additional scripts for each version.
- Extensively used Test Director, for test planning, bug tracking and reporting.
- Scheduled and invoked the WinRunner 7.6 scripts using Test Director.
- Used Controller to Perform Load Test, Longevity test and Stress Test.
- Ramped the load in increments of 10, from 5 virtual users at 20 iterations up to 250 virtual users, until CPU utilization reached 100%.
- Determined the Web interface protocols before generating the test scripts.
- Worked with HTML and JSP pages to get the response times.
- Analyzed the average CPU usage, response time, and TPS for each scenario.
- Used the Event Log Viewer to analyze Application, System, and Security errors; depending on the severity of the errors, contacted the developers and had the problems rectified.
- Worked on SQL for backend Validations.
- Performed back-end testing on Oracle, executing various DDL and DML statements.
- Worked on WinRunner 7.6, LoadRunner 7.8, Test Director 8.0.
- Performed both Manual and Automated testing.
- Worked on Web Interfaces functionality and Load Testing them.
- Extensively used the NT Performance Monitor to analyze system bottlenecks such as memory leaks and CPU utilization, as well as network bottlenecks.
- Also analyzed the LoadRunner reports to calculate response time and transactions per second (TPS).
- Performed database testing in Real Time Inventory module.
- Held regular meetings with the management team and provided updates on the ongoing QA process.
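The response-time and TPS analysis above can be sketched as follows. This is a minimal Python illustration of the arithmetic that LoadRunner's analysis reports perform over recorded transaction timings; the sample timings are invented:

```python
def summarize(transactions):
    """transactions: list of (start_time, end_time) pairs in seconds.
    Returns (average response time, transactions per second) over the
    measured window."""
    durations = [end - start for start, end in transactions]
    avg_rt = sum(durations) / len(durations)
    # Window spans from the first transaction start to the last end.
    window = max(e for _, e in transactions) - min(s for s, _ in transactions)
    tps = len(transactions) / window
    return avg_rt, tps

# Hypothetical sample: four transactions recorded over a 2-second window.
sample = [(0.0, 0.5), (0.2, 0.9), (1.0, 1.4), (1.5, 2.0)]
avg_rt, tps = summarize(sample)
print(round(avg_rt, 3), round(tps, 2))  # → 0.525 2.0
```

The same averages would be read off the LoadRunner Controller and Analysis graphs rather than computed by hand; this just shows what the reported numbers mean.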
Environment: Java, JSP, JavaScript, HTML, WebLogic, J2EE, VBScript, Oracle, ClearCase, ClearQuest, Windows NT, WinRunner 7.6, LoadRunner 7.8/8.0, Quick Test Pro, Test Director 8.0, UNIX.
Confidential, CA
QA Analyst
Responsibilities:
- Coordinated the team and mentored members on various tools and the application's process flow.
- Used WinRunner to perform regression and Integration Testing.
- Maintained the master GUI map in the central repository and updated it for every version.
- Performed regression testing and generated additional scripts for each version.
- Used Quick Test Pro and generated various scripts for performing functional testing.
- Performed data-driven testing and enhanced the scripts.
- Used Test Director to maintain the scripts and invoke all the automated scripts.
- Performed volume and stress testing using LoadRunner.
- Generated and modified the LoadRunner Web scripts to perform data-driven testing.
- Monitored web server performance using the Performance Monitor and generated various reports.
- Identified and analyzed various system bottlenecks such as CPU utilization and memory utilization.
- Used Test Director to generate reports on the regression testing performed for the application.
- Performed Functionality, Regression, Integration and Compatibility Testing.
- Performed stress/volume testing using LoadRunner 7.0.
- The application was developed using ASP and SQL Server as the backend. Windows NT Cluster Server was used for fail over.
- Also executed various SQL Queries to perform the backend testing.
- Extensively worked using DDL and DML Statements.
- Had regular meeting with Developers to report various problems.
- Parameterized the scripts for different login IDs; they were initially run in stand-alone mode and various bugs were rectified.
- The generated Scripts were invoked into the Controller and four workstations were made active to run the Scenario.
- Used the Controller to create scenarios for 10, 20, and 30 virtual users, ramped up to 500 virtual users, each run for 20 iterations.
- Analyzed the output log after running each scenario to check whether all the ASPs in the script were properly redirected.
- The Scenario Execution, Performance Summary, Failed Transaction Reports were generated and analyzed.
- Worked with various teams to fix problems and duplicate online problems in the lab.
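The data-driven parameterization described above (the same script run against different login IDs) can be sketched in Python. The data table and login step here are hypothetical stand-ins for a QTP/LoadRunner parameter file and a real script that drives the application UI:

```python
import csv
import io

# Hypothetical parameter table, as a QTP data table or LoadRunner
# parameter file would hold it.
DATA_TABLE = """login_id,password
user01,secret1
user02,secret2
user03,secret3
"""

def run_data_driven(test_step, table_csv):
    """Run one test step once per data-table row -- the essence of
    data-driven testing."""
    results = []
    for row in csv.DictReader(io.StringIO(table_csv)):
        results.append(test_step(row["login_id"], row["password"]))
    return results

# Stand-in test step; a real script would log into the application
# and verify checkpoints.
def login_step(login_id, password):
    return f"logged in as {login_id}"

print(run_data_driven(login_step, DATA_TABLE))
```

In QTP the equivalent is iterating the action over DataTable rows; in LoadRunner, attaching a parameter file to the recorded script.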
Environment: ASP 2.0, VBScript, JavaScript, HTML, DHTML, MS SQL Server 6.5, ClearCase, ClearQuest, IIS 4.0, Windows NT/95, UNIX, LoadRunner, Quick Test Pro, WinRunner, Test Director.
Confidential, CA
Performance Engineer/ Quality Analyst
Responsibilities:
- Played significant role in planning and implementation phases of Testing. Part of Process team and was extensively involved in improving the Testing Strategy.
- The application was divided into various sub-applications, for which the test specifications were prepared.
- Extensively used LoadRunner 6.0 for Performance and Load testing.
- Used Astra QuickTest to generate various scripts, which were then parameterized and modified in the Virtual User Generator.
- Used Controller to Perform Load Test, Longevity test and Stress Test.
- Ramped the load in increments of 10, from 5 virtual users at 20 iterations up to 250 virtual users, until CPU utilization reached 100% on FEW.
- Analyzed the average CPU usage, response time, and TPS for each scenario.
- Used the Event Log Viewer to analyze Application, System, and Security errors; depending on the severity of the errors, contacted the developers and had the problems rectified.
- Extensively used the NT Performance Monitor to analyze system bottlenecks such as memory leaks and CPU utilization, as well as network bottlenecks.
- Also analyzed the LoadRunner reports to calculate Response time and Transactions Per Second (TPS).
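The ramp-up pattern described above (increments of 10, from 5 virtual users up to 250) can be sketched as a small schedule generator. This is a Python illustration of the scenario arithmetic, not the LoadRunner Controller itself:

```python
def ramp_schedule(start, step, ceiling):
    """Virtual-user ramp-up schedule: start at `start` users and add
    `step` users per interval until `ceiling` is reached."""
    users, schedule = start, []
    while users < ceiling:
        schedule.append(users)
        users += step
    schedule.append(ceiling)  # final step holds at the target load
    return schedule

steps = ramp_schedule(5, 10, 250)
print(steps[0], steps[1], steps[-1], len(steps))  # → 5 15 250 26
```

In the Controller this corresponds to a scenario schedule that adds Vusers in fixed increments until the target load (or 100% CPU on the server under test) is reached.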
Environment: Java, JavaScript, HTML, DHTML, Oracle, Windows NT, LoadRunner 5.02/6.0
