QA Analyst Resume
Career Objective:
Over seven years of experience in the software testing industry, with diversified experience in manual and automated testing of web and client/server based applications in Windows and UNIX environments.
Seeking a position as a Quality Assurance Analyst / Software Test Engineer (manual/automation).
TECHNICAL SKILLS:
- Testing Tools: QuickTest Professional (QTP), WinRunner, LoadRunner, Quality Center.
- Bug Tracking Tools: TestDirector, Quality Center, PVCS Tracker, and Bugzilla.
- Databases: MS Access, SQL Server 2000/2005, Oracle (10g, 11g), and PL/SQL Developer.
- Software Applications: MS Office, FrontPage, MS Outlook, Adobe Photoshop, and MS Visio.
- Operating Systems: Windows Vista, 2003, XP, 2000, NT, UNIX, Linux, and DOS.
- Languages: C++, Java, XML, J2EE, VBScript, SQL, PL/SQL.
Professional Summary:
- Strong experience in the Information Technology industry as a Quality Assurance Analyst.
- In-depth understanding of the Software Development Life Cycle (SDLC), CMMI, ITIL, and methodologies such as Waterfall, Agile, and Iterative.
- Experience writing and executing test plans and test cases in highly structured environments and performing all types of testing.
- Solid hands-on experience in Functional, Integration, Regression, Black-box, GUI, Back-end, User Acceptance (UAT), and Load/Performance testing.
- Expertise in testing tools such as TestDirector, Quality Center, PVCS Tracker, QuickTest Professional (QTP), WinRunner, LoadRunner, and Rational Robot.
- Performed manual and automated testing on both client/server and web-based applications, including Functional, Regression, User Acceptance, and Performance testing.
- Able to document and track defects using Quality Center, TestDirector, and PVCS Tracker.
- Excellent at creating automated test scripts using QuickTest Pro (QTP) and LoadRunner.
- Expert in querying and testing RDBMSs such as Oracle and MS Access for data integrity; proficient in MS Access, SQL, PL/SQL, Oracle (10g, 11g), and UNIX (shell).
- Conducted research and performed data analysis, providing a good perspective on statistical methodologies.
- Excellent communication skills; able to work both as part of a team and independently.
WORK EXPERIENCE:
Confidential
Foster Plaza, Pittsburgh, PA
Job Title: QA Analyst
December 2012 to Present.
Working in the Clinical Business Unit's IT department. This unit performs drug utilization review using the Aetna Rx Check DUR application and supports the clinical operations of the Aetna Rx Check Drug Utilization Review program.
Responsibilities:
- Testing software builds, versions, new releases, and any modifications and updates.
- Writing test plans and test cases.
- Attending walkthrough meetings with the Business Analysts, Project Managers, Business Managers, and QA Lead.
- Attending requirement review meetings and providing feedback to the Business Analysts.
- Working in different databases such as Oracle and DB2.
- Writing SQL queries to retrieve data from the databases using PL/SQL Developer (a representative query appears after this list) and executing UNIX commands extensively.
- Performing various kinds of manual testing to ensure validation and verification of the project.
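A minimal sketch of the kind of data-retrieval query run in PL/SQL Developer against the DUR data; the table and column names here are hypothetical, for illustration only:

    -- Hypothetical DUR check: confirm that every claim flagged for review in the
    -- application has a matching, recent record in the claims table.
    SELECT r.review_id,
           r.claim_id,
           c.member_id,
           c.drug_ndc,
           c.fill_date
      FROM dur_review r
      JOIN pharmacy_claim c
        ON c.claim_id = r.claim_id
     WHERE r.review_status = 'PENDING'
       AND c.fill_date >= TRUNC(SYSDATE) - 30;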
Environment:
Oracle database, UNIX operating systems, PowerBuilder application.
Confidential
Bethesda, Maryland
Job Title: Software Test Engineer
November 2009 to December 2012
Responsibilities:
- Prepared test plans and test cases.
- Developed test cases for the automation team for regression testing.
- Involved in manual and automation testing of web and client/server applications.
- Analyzed business requirements, functional specifications, and the documents required for testing.
- Performed Functional, Negative, Smoke, System, Integration, Regression, UAT, and Performance testing.
- Involved in system integration testing.
- Allocated priorities to all the test cases, taking into consideration the product module priorities.
- Conducted elaborate manual testing on test cases and provided feedback to the development team.
- Validated tests to check boundary conditions and error messages.
- Used Quality Center for requirements management, planning, scheduling, running tests, defect tracking, and managing defects.
- Wrote SQL statements in scripts to retrieve data from the database and verify that accurate data was stored, as part of back-end testing (a sample query appears after this list).
- Analyzed automation tools and their usage on the web-based application, and made recommendations for performance testing lab methodology as well as scripting for intranet/internet.
- Designed automation testing strategies and implemented automation scripts linked to, and validated against, the end-user service-level requirements developed.
- Developed test approaches and test designs; wrote test plans and created production simulations and test data.
- Installed, configured, and administered the automated test tool.
- Defined the automated testing process, developed coding standards, and created and implemented a 3-tier architecture that served as the automation backbone; configured hardware (server/client) for the project.
- Identified the testing methodology (functional, load, and stress testing) based on the business processes, analyzed the business requirements along with the Product Manager and Project Manager, and determined synchronization points for tests using LoadRunner and QTP.
- Conducted GUI, Data-driven, Back-end, and Functionality testing using QTP.
- Responsible for performance testing script development and analysis reporting for applications, as well as testing methodology and software development lifecycle process improvement.
- Drew statistically valid conclusions from quantitative test results.
- Documented the Test Results and presented the daily and weekly report to QA Manager.
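The back-end checks referenced above typically reduced to simple verification queries; a representative sketch (the table, column, and key values are hypothetical) looks like:

    -- Hypothetical back-end check: verify that the record created through the
    -- front end was persisted with the expected values.
    SELECT o.order_id,
           o.status,
           o.total_amount,
           o.last_updated
      FROM orders o
     WHERE o.order_id = 1001234       -- key captured during the front-end test
       AND o.status   = 'SUBMITTED';  -- expected value; zero rows = defect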
Environment: Quality Center Enterprise Edition (QC), QuickTest Professional (QTP), HP LoadRunner, .NET, Windows NT, SQL*Plus, Java, J2EE, XML, IIS, MS Access, SQL Server 2000/2005, Oracle (10g, 11g).
Confidential
Reston, VA
Job Title: QA Analyst
September 2007 to October 2009
Responsibilities:
- Reviewed Business Requirements Documents and the Technical Specification
- Involved in Planning, coordinating, developing Test Plans, Test Procedures, and Test cases based on the Requirements and Design Documents
- Conducted Functional, System, Integration, Regression, Performance, UAT, and Smoke testing of Sprint's Customer Care and Billing application (Ensemble), with specific focus on the CSM (Customer Service Management), nView (CRM), Resource Management, Wireless Local Number Portability (WLNP), and eCare modules.
- Developed Test Scripts and conducted Regression testing for various application modules using QTP.
- Analyzed Test results to meet the end user needs.
- Tested the application manually by executing test cases prior to automation.
- Used Quality Center to create and maintain the Test cases and the Test scripts for both System Testing and Regression Testing.
- Developed system-level, detailed, and overall test plans using the business specifications.
- Conducted system integration testing of the application manually for different modules.
- Interacted extensively with development groups and end-users.
- As a member of the test engineering team, developed test plans to outline the scope, approach, resources, and schedule of testing.
- Designed manual test cases based on functional requirements and technical documents, and executed test cases during System, Regression and User Acceptance testing.
- Executed SQL queries to validate the front-end data against the database (back end); used DML to manipulate the data where applicable to verify functionality (see the sketch after this list).
- Performed End-to-end Testing which involves complete application environment in a situation that mimics real-world use, such as interacting with a database, applications or backend system.
- Evaluated and documented the application's ability to meet the established requirements.
- Involved in the team meetings with representatives from Development, Database Management, Configuration Management, and Requirements Management to identify and correct defects.
- Tested many functional areas, including uploading of published texts into the database, classification, search-result relevance estimation and sorting, the Graphical User Interface, and information sharing.
- Performed testing of a web application developed for customer access: developed test cases for functionality testing based on the functional specification and feature descriptions, executed test scripts, and verified bug fixes.
- Attended weekly meetings to update project managers, and discussed and resolved outstanding issues.
- Occasionally met with developers to solve issues before opening defects.
- Parameterized the scripts to use different data sets.
- Responsible for automated testing of Smoke Test and Regression Test using QTP.
- Used rendezvous points to better control and generate peak load on the server, thereby stressing it and measuring its performance.
- Monitored CPU utilization, network traffic, database operations, disk I/O utilization, the web server, and WebLogic-specific parameters.
- Monitored the system under test for areas of concern, including but not limited to CPU utilization, memory usage, and network load.
- Monitored the application server, DB server, web server, and Windows resources, and identified bottlenecks during performance tests.
- Used SiteScope to monitor server performance.
- Monitored where time was being spent, memory leaks, and the efficiency of I/O operations.
- Analyzed transaction profile diagrams to identify the business processes that needed load testing using LoadRunner.
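A sketch of the front-end versus back-end comparison, and the occasional DML setup, mentioned above; the Ensemble table and column names here are purely illustrative:

    -- Illustrative setup: force a known billing condition for a test account.
    UPDATE customer_account
       SET account_status = 'SUSPENDED'
     WHERE account_id = 555001;
    COMMIT;

    -- Illustrative verification: the balance shown on the Customer Care screen
    -- should equal the sum of unpaid invoices stored in the billing tables.
    SELECT a.account_id,
           a.current_balance,
           SUM(i.amount_due) AS invoice_total
      FROM customer_account a
      JOIN invoice i
        ON i.account_id = a.account_id
     WHERE a.account_id = 555001
       AND i.paid_flag = 'N'
     GROUP BY a.account_id, a.current_balance;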
Environment: HP QuickTest Pro (QTP), HP LoadRunner, TestDirector, Windows NT, SQL, Java, J2EE, HTTP, XML, and IIS.
Confidential
New York, NY
Job Title: Software Test Analyst
June 2005 to August 2007
Responsibilities:
- Generated and implemented Test Plans, Test cases, Test Scripts on different applications.
- Involved in Manual and Automated testing of the front-end application.
- Performed Functional, Regression, Integration and End to End testing using WinRunner.
- Created Database Checkpoints and conducted Back End Testing.
- Performed Data Driven testing using WinRunner.
- Wrote scripts manually in Excel spreadsheets and recorded them electronically using WinRunner, along with the required test cases.
- Tested the functionality of the most commonly used panels using WinRunner; created and added logic to the scripts with conditional statements, loops, and arithmetic operators to create more powerful and complex tests.
- Performed data validation and data integration for back-end testing using SQL queries manually (a sample check appears after this list).
- Reported issues in the bug-reporting system (Quality Center) and followed up with the development team until they were resolved.
- Performed Performance/Stress testing using tools such as LoadRunner, along with batch testing.
- Actively participated in the team meeting with the developer and business to understand the test requirements effectively.
- Involved in testing the applications during System testing, Integration testing and Regression testing.
- Involved in performance analysis, documentation, and code review, and provided an assessment of potential performance bottlenecks.
- Provided a comprehensive final document containing detailed statistical analysis of testing results, conclusions and guidance, and suggestions for ways performance could be improved.
- Worked with the users to write requirements and design scripts related to application testing.
- Linked the Test requirements and procedures in Quality Center.
- Actively involved in testing the online screens and the batch processes used to run and maintain the functional processes of the application.
- Involved in capturing the test results, which were analyzed for completeness of coverage, percentage executed and percentage passed.
- Tracked defects and generated reports along with the MS Access products.
- Utilized Quality Center as a request tracking system for tracking the Defects and generating reports.
- Performed Stress, Performance and Endurance testing using LoadRunner.
- Conducted server load tests by enabling/disabling rendezvous points.
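The manual data-validation and data-integration checks noted above were typically plain SQL comparisons; a hedged example (the staging and target table names are hypothetical):

    -- 1. Row counts should match between the staged feed and the loaded table.
    SELECT (SELECT COUNT(*) FROM stg_transactions) AS staged_rows,
           (SELECT COUNT(*) FROM transactions)     AS loaded_rows;

    -- 2. No loaded transaction should reference a customer that does not exist.
    SELECT t.transaction_id
      FROM transactions t
      LEFT JOIN customers c
        ON c.customer_id = t.customer_id
     WHERE c.customer_id IS NULL;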
Environment: WinRunner, Mercury LoadRunner, TestDirector, Windows XP, SQL Server 2000.
EDUCATION:
Bachelor of Computer Information Systems (CIS)