Performance Test Engineer Resume
Piscataway, NJ
SUMMARY
- Over 8 years of experience in defining testing methodologies, designing test plans and test cases, and verifying and validating application software and documentation against software development standards, with effective QA implementation in all phases of the Software Development Life Cycle (SDLC).
- Strong experience in preparing Performance Test Plans, Performance Test Strategies, and Performance Test Analysis Reports.
- Technical expertise in Automation, Regression, System, Security, Integration, User Acceptance, and Functional Testing.
- Experienced in manual and automated testing using test suites such as QTP, Quality Center, WinRunner, and LoadRunner.
- Expertise in problem solving and bug reporting using bug tracking tools; worked on Agile and Waterfall models.
- Proficient in analysis of Business Requirements, Use Case Documents, Functional Requirements and System Requirement Specifications.
- Experience in implementing complex functional tests that require an understanding of the application logic and excellent problem analysis and bug reporting.
- Good experience in setting up performance test environments, monitoring strategy, and configuration.
- Experience in configuring and using the SiteScope performance monitor to monitor and analyze server performance, generating reports on metrics such as CPU utilization, memory usage, and load average.
- Extensively worked with the Web, Citrix, Click and Script, Oracle, Siebel, Winsock, and SOAP protocols.
- Good experience in analyzing, interpreting and summarizing meaningful and relevant results in a complete Performance Test Report.
- Hands on experience with Functionality Testing, Integration Testing, System Testing, GUI Testing, Regression Testing, Performance Testing, Stress Testing, Load Testing, Volume Testing, User Acceptance Testing, Database Testing, Smoke Testing and Sanity Testing.
- Proficient in working with VuGen, Performance Center, LoadRunner, WinRunner, Quick Test Pro, SiteScope, JIRA, Test Director, and Quality Center.
- Experienced in testing web-based and client/server applications.
- Extensive experience working with VBScript for implementing business logic in Quick Test Professional.
- Experience with the programming languages C, C++, and VBScript.
- Experienced in UNIX (process, network, system information, pattern searching, and directory/file commands) and in Shell and Perl scripting for testing.
- Experienced in Developing complex SQL Queries and Procedures to perform database testing.
- Experienced in Developing and Maintaining Test Scripts, analyzing bugs and interacting with development team members in fixing the defects.
- Strong working experience in fast-paced environments, taking initiative in project planning and handling multiple assignments, with an excellent understanding of testing methodologies and test tools and an eagerness to keep honing testing skills.
- Highly motivated self-starter able to work independently and collaboratively within a diverse technical team; excellent verbal and written communication skills.
TECHNICAL SKILLS
Software Programming: SQL, PL/SQL, TSL, FoxPro, SQA Basic, C and C++, UNIX
Software Tools: MS Office, Visual SourceSafe, MS Project
Application Software: MS Access, MS Excel, MS PowerPoint, Telnet, and FTP
Operating Systems: Windows Family
Web Technologies: HTML, DHTML, CSS, XML, XSD, XSL, XSLT, XPath, AJAX, Struts, Eclipse.
Databases: MS SQL Server, Oracle, MS Access, Sybase.
Testing Tools: LoadRunner, Wily Introscope, WinRunner, Mercury Quality Center, QTP, Selenium IDE, Selenium RC, SoapUI, LoadUI.
PROFESSIONAL EXPERIENCE
Confidential - Piscataway, NJ
Performance Test Engineer
Responsibilities:
- Analyzed system and business requirements, identified test scenarios based on those requirements, and was involved in preparing test case templates.
- Responsible for developing the performance test plan and performance test strategy based on business specifications and user requirements.
- Scripted and executed load tests using LoadRunner.
- Developed Vuser scripts and enhanced the baseline scripts by adding custom code (see the sketch at the end of this section).
- Prepared data for parameterizing script values across multiple scenarios by querying the Oracle database.
- Introduced rendezvous points in the scripts to stress the application on specific transactions.
- Responsible for developing baseline Scenarios and Load Testing Harnesses for load/performance testing of the application.
- Performed no-load, medium-load, and full-load tests and analyzed the system response.
- Responsible for performance monitoring and analysis of response times and memory leaks using throughput graphs.
- Analyzed LoadRunner reports to calculate response times and transactions per second.
- Monitored system resources such as CPU usage, percentage of memory occupied, and vmstat and iostat output.
- Responsible for monitoring netstat output via UNIX shell scripting to check connectivity, load balancing, and network traffic for each of the JVMs.
- Captured Java threads and Exceptions in the application logs for analysis.
- Collected and maintained PBD (ProbeBuilder Directive) metrics using Wily Introscope.
- Responsible for monitoring and tracking network traffic using F5 BIG-IP default graphs.
- Developed weekly reports of Performance data and metrics.
- Created performance narrative documents.
- Participated in defects meeting to discuss the bottlenecks and long running queries.
- Involved in walkthroughs and meetings with the Performance team to discuss related issues.
Environment: LoadRunner, Performance Center, Quality Center, HTML, WebLogic, XML, SQL, Windows XP/Vista, Sun Solaris 10.x, CA Wily Introscope, UNIX Shell Scripting.
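Below is a minimal VuGen sketch (Web HTTP/HTML protocol, C) of the scripting pattern described above: a parameterized request, a rendezvous point, and transaction timing. The URL, the {OrderID} parameter, and the transaction name are illustrative assumptions, not details of the actual application.

    Action()
    {
        // Rendezvous point: the Controller releases all Vusers together here,
        // stressing this specific transaction.
        lr_rendezvous("search_orders");

        lr_start_transaction("search_orders");

        // {OrderID} is a hypothetical parameter fed from a data file built by
        // querying the Oracle database.
        web_url("search_orders",
                "URL=http://app.example.com/orders?id={OrderID}",
                "Resource=0",
                "Mode=HTML",
                LAST);

        lr_end_transaction("search_orders", LR_AUTO);

        // Think time paces iterations like a real user.
        lr_think_time(5);

        return 0;
    }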
Confidential - Dallas, TX
Performance Engineer
Responsibilities:
- Responsible for overall project QA activities, including system requirements and design review; test strategy and test case development; test results documentation, prioritization, and resolution; and end-user acceptance testing.
- Analyzed business and user requirements /specifications to ensure the application adheres to business standards.
- Involved in the entire QA life cycle, including designing, developing, and executing the QA process and documenting test plans, test cases, test procedures, and test scripts in QC as well as manually in MS Word and MS Excel.
- Created the project test plan, requirements traceability matrix, and execution plan for the entire release.
- Designed an end-to-end test environment to simulate real-time production user scenarios.
- Extensively worked with the Web Services (SOAP), HTTP/Web, Web (Click and Script), and .NET protocols.
- Generated, parameterized and modified VUser Scripts in LoadRunner.
- Automated the test scripts for the performance and verification of the response time under different load conditions.
- Used Performance Center to perform load, longevity, and stress tests.
- Responsible for developing test scenarios with the specified number of users based on load distribution percentages.
- Developed test cases and tested service layer components using JUnit and Java.
- Created common functions for LoadRunner scripts using C pointers and structures (see the sketch at the end of this section).
- Tracked, reviewed, analyzed, and compared defects using QC; also used QC for requirements gathering, for maintaining test plans and test cases as a repository, and for analysis, scheduling, generating and running test cases, and generating reports; documented and communicated test results daily to my manager and the development team.
- Used StarTeam as a repository for all specifications, PowerPoint presentations, and UI specifications shared across the QA team, developers, Business Analysts, and other groups.
- Diagnosed performance bottlenecks, performed tuning (OS and applications), retesting, and system configuration changes for application performance optimization.
- Involved in preparing end-to-end scenarios that covered the complete flow of the application.
- Configured a large number of metrics for Windows and UNIX servers in SiteScope, based on standard templates and custom monitors.
- Used SQL queries to retrieve data from tables and perform back-end testing.
- Published test strategy document and final project reports for the changes required to be made in the production environment.
Environment: LoadRunner, QTP, Performance Center, Quality Center, JUnit, HTML, UNIX Shell Script, JavaScript, Windows/UNIX, SQL.
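A minimal sketch of the kind of reusable LoadRunner helper written with C pointers and structures mentioned above: a small struct carries a transaction name and an SLA threshold, and a shared function closes the transaction as pass or fail. The URL, transaction name, and SLA value are assumptions for illustration.

    // Shared definitions, typically kept in a common file included by each script.
    typedef struct
    {
        char   *name;      /* transaction name                   */
        double  sla_secs;  /* agreed response-time threshold (s) */
    } txn_sla;

    // Ends a transaction as PASS or FAIL depending on whether its measured
    // duration stayed within the SLA carried in the struct.
    void end_txn_with_sla(txn_sla *t)
    {
        double elapsed = lr_get_transaction_duration(t->name);

        if (elapsed > t->sla_secs) {
            lr_error_message("%s took %.2fs (SLA %.2fs)", t->name, elapsed, t->sla_secs);
            lr_end_transaction(t->name, LR_FAIL);
        } else {
            lr_end_transaction(t->name, LR_PASS);
        }
    }

    Action()
    {
        txn_sla login = { "login", 3.0 };  /* illustrative SLA */

        lr_start_transaction(login.name);
        web_url("login", "URL=http://app.example.com/login", LAST);
        end_txn_with_sla(&login);

        return 0;
    }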
Confidential, Chantilly, VA
Performance Tester
Responsibilities:
- Analyzed System Requirement Documents (SRDs) and Business Requirement Documents (BRDs) and contributed to the test plan, which included test objectives, test strategies, test environment, test priorities, traceability matrices, and test cases, as well as bug isolation testing.
- Developed comprehensive test plans and test cases based on business requirements, system requirements, high-level design documents, and detailed design documents for various modules in QC as well as in MS Word and Excel.
- Responsible for designing and developing the QA test strategy and test scope during test planning, and involved in test execution during the QA phase.
- Coordinated with Developers for defect analysis and also with Business Analysts for any changes in specification while performing Regression Testing.
- Met with client groups to gather performance requirements and goals in order to determine test strategies.
- Extensively worked with the Citrix, Oracle NCA, and Web protocols in LoadRunner.
- Involved in Preparing Test Plans and Test Cases based on business requirements.
- Developed VuGen scripts and scenarios using the LoadRunner Controller (see the correlation sketch at the end of this section).
- Analyzed LoadRunner/Performance Center test results.
- Used QTP to develop scripts to perform Functionality and GUI testing.
- Inserted rendezvous points in order to simulate heavy loads for conducting Load Testing.
- Simulated real-time scenarios by using ramp-up and ramp-down in LoadRunner.
- Enhanced the scripts by adding control and conditional statements using VBScript.
- Analyzed and identified bottlenecks in the server using LoadRunner's Online Monitors.
- Responsible for identifying functionality and performance issues, including deadlock conditions, database connectivity problems, and system crashes under load.
- Responsible for performing vertical scaling and garbage collection.
- Confirmed the scalability of the new servers and application under test after the architecture redesign.
- Conducted weekly meetings with Project development teams.
Environment: LoadRunner, Quality Center, Performance Center, J2EE, Oracle, QTP, SiteScope, MS Office, MS Access, MS Visio, MS Project.
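A brief sketch, under assumed boundaries and URLs, of the correlation and conditional-handling pattern used when enhancing the VuGen scripts in this project; the SessionID parameter and login page are hypothetical.

    Action()
    {
        // Correlation: capture a dynamic session token from the login response
        // so later requests can reuse it. Boundaries are assumed for illustration.
        web_reg_save_param("SessionID",
                           "LB=sessionId=",
                           "RB=\"",
                           "Ord=1",
                           "NotFound=warning",
                           LAST);

        lr_start_transaction("login");
        web_url("login", "URL=http://app.example.com/login", LAST);

        // Conditional handling: if the token was never captured, fail the
        // transaction and stop this iteration instead of continuing blindly.
        if (strcmp(lr_eval_string("{SessionID}"), "{SessionID}") == 0) {
            lr_error_message("Login did not return a session id");
            lr_end_transaction("login", LR_FAIL);
            return 0;
        }
        lr_end_transaction("login", LR_PASS);

        return 0;
    }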
Confidential, Auburn Hills, MI
Performance Tester
Responsibilities:
- Involved in gathering business requirements, developing test plans and test cases, coordinating code promotions with the Software Configuration team, performing regression testing, and certifying the product.
- Extensively interacted with Business Analysts, Developers and end users to test according to their requirements.
- Followed the Agile methodology for the software development life cycle.
- Created Vuser scripts using the Vuser Generator (VuGen) for protocols such as Web (HTTP/HTML) to verify application performance under heavy and peak loads.
- Created and scheduled scenarios using the LoadRunner Controller.
- Configured the load generators to ramp up and exercise the application with the required number of virtual users using LoadRunner.
- Created test data for interpreting positive and negative results during functional testing.
- Conducted tests against the requirements in line with the test plan and test schedule.
- Conducted different types of performance testing such as load, stress, volume, endurance, and failover testing.
- Performed functionality testing of web page objects such as HTML links and topic paths.
- Validated that the application fetches trade details correctly during market open and market close hours (see the sketch at the end of this section).
- Verified that traded funds could be exchanged and that margin borrowing functionality was processed as expected.
- Created automated test scripts for different modules using Quality Center.
- Validated reports that reflect the trades processed for the day.
- Identified reusable test cases/test scenarios from the existing test case repository for regression tests.
- Logged defects using PVCS Tracker.
- Conducted multiple runs to validate fixes, ensuring that performance bottlenecks had been resolved and identified SLAs were achieved.
- Involved in defect tracking and monitoring till defect closure.
- Responsible for leading the testing effort with business analysts, customers, engineering, and offshore QA testers.
- Created detailed QA documentation, including QA reports, and actively participated in SQA and project status meetings.
- Wrote SQL scripts to test the areas defined in each requirement.
- Worked as a Project Coordinator and performed all QA management activities to ensure QA milestones are achieved.
Environment: Java, HTML, JSP, VBScript, SQL Server, Windows XP, LoadRunner, Quality Center, PVCS Tracker.
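An illustrative sketch of the text-verification pattern behind the trade-details validation described above; the checked text, URL, and transaction name are assumed placeholders rather than the real application's values.

    Action()
    {
        int found;

        // Register a text check before the request; SaveCount records how many
        // times the expected text appears in the server response.
        web_reg_find("Text=Trade Details",
                     "SaveCount=TradeFoundCount",
                     LAST);

        lr_start_transaction("view_trades");
        web_url("view_trades",
                "URL=http://app.example.com/trades/today",
                LAST);

        found = atoi(lr_eval_string("{TradeFoundCount}"));
        if (found > 0) {
            lr_end_transaction("view_trades", LR_PASS);
        } else {
            lr_error_message("Expected trade details were not rendered");
            lr_end_transaction("view_trades", LR_FAIL);
        }

        return 0;
    }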
Confidential, San Francisco, CA
QA Tester
Responsibilities:
- Worked with the Recoveries team to analyze the portfolio of all loans granted by the company and identify bad loans.
- Worked extensively with the Recoveries team to analyze bad loans and determine the root cause of the problem.
- As part of the Recoveries group, developed new promotional strategies to help loan defaulters repay their loans.
- Involved in developing strategies such as penalty waivers and reduced interest rates to aid recovery of bad loans.
- Conducted meetings with loan officers and loan defaulters to help mediate problems and come up with new solutions and recommendations.
- Designed Test cases from Requirements, Functional Specifications and Design Documents.
- Involved in GUI and functional testing of the screens used to view, enter, update, and delete information for customers who are defaulters.
- Performed database testing by writing SQL Queries for validating the data.
- Define & implement QA Processes & Standards.
- Used Test Director to write the test cases.
- Used Test Director to create reports and graphs.
- Successfully handled and executed manual testing.
- Tested the software for peak load with the maximum number of users using LoadRunner (see the sketch at the end of this section).
Environment: Visual Basic, Oracle9i, Windows NT, MS Word, MS Excel, MS Project.
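A minimal sketch, under assumed URLs and pacing values, of the peak-load Vuser flow referenced above; the actual concurrency would be driven by the Controller scenario rather than the script itself.

    Action()
    {
        lr_start_transaction("home_page");
        web_url("home_page",
                "URL=http://app.example.com/",
                LAST);
        lr_end_transaction("home_page", LR_AUTO);

        // Randomized think time (3-7 s) so the maximum-user scenario approximates
        // real user pacing instead of back-to-back requests.
        lr_think_time(3 + rand() % 5);

        return 0;
    }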