Performance Tester Resume
Palo Alto, CA
SUMMARY:
- Over 7 years of experience in the IT industry in automation, performance/load, and manual testing of web-based applications.
- Strong experience in Manual, Automation, and Performance Testing of client/server and web-based applications.
- Expertise in preparing Test Plans, developing, reviewing and executing Test Cases and Test Scripts based on Functional Requirements, Business Requirements and Use Case Documents.
- Experience in Smoke Testing, Functional Testing, Integration Testing, GUI Testing, Regression Testing, Load/Performance Testing, System Testing, User Acceptance Testing.
- Expertise in developing, documenting, and executing test cases manually and generating automated scripts using QuickTest Professional, Selenium, and WinRunner.
- Extensive experience in Performance Testing using LoadRunner.
- Hands-on experience in modifying Vuser scripts using LoadRunner.
- Experienced with performance diagnostics tools such as HP Diagnostics and CA Wily Introscope for detecting bottlenecks such as high CPU usage and memory leaks, and assisting developers in optimizing code.
- Expertise in analyzing, interpreting, and summarizing meaningful and relevant results in a complete Performance Test Report.
- Expertise in designing, developing, and executing customized performance tests in LoadRunner.
- Experience in Test Management and Defect Reporting Tools like Quality Center and Test Director for analyzing requirements, documenting and executing test cases, defect tracking and status reporting.
- Strong hands-on knowledge of the Waterfall software development model and a very good understanding of Agile (Scrum) and Lean methodologies.
- Executed test scripts to validate test cases, including creating and executing complex automated test scripts.
- Experience in creating test documents such as QA Status Reports, QA Summary Reports, QA Test Logs, and Test Sign-offs, and in maintaining issue and defect databases.
- Working experience with Quality Center, QTP, NetStorm, and LoadRunner tools.
- Good team player with strong analytical and communication skills; able to work independently with minimal supervision as well as part of a team.
TECHNICAL SKILLS:
Testing Tools: Selenium (IDE, RC, WebDriver), SoapUI, HP QTP, HP Quality Center, LoadRunner, Postman, JMeter, NetStorm, WinRunner
Defect Tracking Tools: Jira, Bugzilla, Confluence.
Programming Languages: C, C++, Java 1.7/1.8, PL/SQL, SQL.
Web Technologies: HTML5, CSS3, XML, JSON.
Domain Knowledge: Banking, E-commerce, and Web-based Applications.
Version Control Tools: SVN, GitHub
Database: SQL Server 2014/2012, Oracle, DB2
Operating Systems: Windows XP/Vista/7/8.1, Linux, UNIX
PROFESSIONAL EXPERIENCE:
Confidential, Palo Alto, CA
Performance Tester
Responsibilities:
- Involved in complete Software Testing Life Cycle from analyzing requirements, test planning, developing test cases, setting up the test environment, test execution, defect reporting and test closure.
- Analyzed the business requirements, functional specifications, and use case documents, and developed the test plan.
- Documented test cases and test scripts using Quality Center.
- Designed, developed, and executed performance test scripts in LoadRunner.
- Recorded and modified Virtual User (Vuser) scripts using LoadRunner VuGen.
- Created, scheduled, and executed load and stress test scenarios using the LoadRunner Controller.
- Executed multi-user performance tests with LoadRunner, using online monitors, real-time output messages, and other features of the LoadRunner Controller.
- Designed performance test scenarios for smoke, baseline, scalability, and stress tests.
- Analyzed performance, transaction, and server resource monitors for meaningful results for the entire test run using LoadRunner Analysis.
- Used dynaTrace to measure website performance in the test environment and capture performance metrics of key product features.
- Configured various online monitors to capture runtime measurements, transaction measurements, web resource measurements, network delay monitors, application server monitors and documented them.
- Performed Functionality testing, Usability testing and interface testing.
- Developed performance test scripts and scenarios in JMeter to simulate user actions, executed performance test runs, and analyzed the test execution results.
- Conducted cross-browser testing with Internet Explorer, Mozilla Firefox, Google Chrome, and Safari.
- Used Quality Center for documenting all phases of QA process.
- Created Vuser scripts using the HTTP and Web Services protocols (a minimal script sketch follows this list).
- Validated requests and responses in SoapUI.
- Wrote and executed SQL queries against the SQL Server database for data verification.
- Performed Link verification using XENU.
- Reported and tracked defects using Quality Center.
- Participated in QA status and bug review meetings.
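A minimal, hypothetical VuGen (web/HTTP) script sketch of the kind of Vuser script referenced above; the transaction name, URL, and {UserName} parameter are illustrative placeholders, not values from the actual project.

    /* Minimal LoadRunner VuGen (web/HTTP) Action sketch; names and URL are hypothetical */
    Action()
    {
        lr_start_transaction("Home_Page");            /* start timing the home page step */

        web_url("Home",
                "URL=http://example.com/home",        /* hypothetical application URL */
                "Resource=0",
                "Mode=HTML",
                LAST);

        lr_end_transaction("Home_Page", LR_AUTO);     /* pass/fail based on replay status */

        lr_think_time(5);                             /* simulate user think time */

        /* {UserName} is supplied by a parameter (data) file configured in VuGen */
        lr_output_message("Running iteration as %s", lr_eval_string("{UserName}"));

        return 0;
    }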
Environment: LoadRunner 12.0, dynaTrace, JMeter, Quality Center 11.0, VBScript, Vuser scripts, HTML, DHTML, SQL, SoapUI, Windows.
Confidential, San Antonio, TX
Performance Tester
Responsibilities:
- Analyzed system and business requirements, identified test scenarios based on the requirements, and was involved in preparing test case templates.
- Responsible for developing the Performance Testing Plan and Performance Testing strategy based on the business specifications and user requirements.
- Scripted and executed load tests using LoadRunner.
- Developed Vuser scripts and enhanced the basic scripts by adding custom code.
- Prepared data for parameterization of values in the scripts across multiple scenarios by querying the Oracle database.
- Set up monitors for CPU utilization and memory utilization on the web, application, and database servers using HP SiteScope.
- Introduced rendezvous points in the scripts to stress the application at specific transactions (illustrated in the sketch after this list).
- Responsible for developing baseline Scenarios and Load Testing Harnesses for load/performance testing of the application.
- Used Performance Center to perform load, longevity, and stress tests.
- Generated, parameterized, and modified Vuser scripts in LoadRunner.
- Performed testing under no-load, medium-load, and full-load conditions and analyzed the system response.
- Responsible for performance monitoring and analysis of response times and memory leaks using throughput graphs.
- Analyzed LoadRunner reports to calculate response times and transactions per second.
- Monitored system resources such as CPU usage and percentage of memory occupied using vmstat and iostat.
- Monitored netstat output to check connectivity, load balancing, and network traffic in each of the JVMs using UNIX shell scripting.
- Captured Java threads and exceptions from the application logs for analysis.
- Collected and maintained metrics defined through ProbeBuilder Directives (PBDs) using CA Wily Introscope.
- Monitored and tracked network traffic using F5 BIG-IP default graphs.
- Developed weekly reports of Performance data and metrics.
- Created performance narrative documents.
- Involved in walkthroughs and meetings with the Performance team to discuss related issues.
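A minimal, hypothetical VuGen sketch of the rendezvous and parameterization usage noted above; the rendezvous name, transaction, URL, and {AccountId} parameter are illustrative placeholders.

    /* Hypothetical sketch: rendezvous point plus a parameterized request */
    Action()
    {
        lr_rendezvous("submit_order");                /* hold Vusers so the next step fires concurrently */

        lr_start_transaction("Submit_Order");

        web_url("SubmitOrder",
                /* {AccountId} is fed from a data file built from Oracle query results */
                "URL=http://example.com/order?account={AccountId}",
                "Resource=0",
                "Mode=HTML",
                LAST);

        lr_end_transaction("Submit_Order", LR_AUTO);

        return 0;
    }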
Environment: LoadRunner, Performance Center, Quality Center, HTML, WebLogic, XML, SQL, Windows XP/Vista, UNIX shell scripting.
Confidential, Chicago, IL
Performance Tester
Responsibilities:
- Gathered and Analyzed Business User Requirements.
- Designed, executed, and monitored scenarios with LoadRunner based on the test plan, task distribution diagrams, and usage profiles.
- Developed and managed LoadRunner scripts during the development phase.
- Responsible for parameterization and manual correlation of Vuser scripts in order to create reusable, flexible scripts that better satisfy the business requirements (a correlation sketch follows this list).
- Assisted in the development and maintenance of the performance testing framework, which provides efficient and reliable error handling logic, easy and automatic run-time data management, easy access to test data from an external database, reliable and reusable custom functions, and error reporting functionality.
- Developed custom C functions, invoked from the Vuser scripts, to load test critical application functionality as specified in the business testing requirements.
- Executed a full load test of 30,000 concurrent users for the Internet banking domain.
- Responsible for monitoring the database, network, and application servers during execution to identify bottlenecks, bandwidth problems, and infrastructure problems, and to establish scalability and reliability benchmarks.
- Participated in cross-functional team meetings to identify and communicate risks, provide overall test direction, and communicate critical information and to understand the business requirements in detail.
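The sketch below illustrates, with hypothetical boundaries and names, the manual correlation approach referenced above: a dynamic value is captured from one response and reused in a later request.

    /* Hypothetical manual-correlation sketch; boundaries, URLs, and names are placeholders */
    Action()
    {
        /* Register the capture BEFORE the request whose response returns the dynamic value */
        web_reg_save_param("session_id",
                           "LB=sessionId=",           /* left boundary (hypothetical) */
                           "RB=\"",                   /* right boundary (hypothetical) */
                           "Ord=1",
                           LAST);

        web_url("Login_Page",
                "URL=http://example.com/login",
                "Resource=0",
                "Mode=HTML",
                LAST);

        /* Reuse the captured value instead of the hard-coded ID recorded by VuGen */
        web_url("Account_Summary",
                "URL=http://example.com/summary?session={session_id}",
                "Resource=0",
                "Mode=HTML",
                LAST);

        return 0;
    }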
Environment: LoadRunner 9.0, Oracle DB, WebSphere and Windows.
Confidential, San Francisco, CA
QA Analyst
Responsibilities:
- Designed and developed test strategy, test procedures, test cases, test scripts, test plans, and schedules by understanding the business logic and user requirements for manual and automated testing.
- Involved in requirements management, planning, scheduling, executing test cases, and tracking and managing defects.
- Involved in all phases of test life cycle such as test design, test execution and defect logging and tracking.
- Manually executed non-automated Test Cases/Steps using different testing techniques such as: Positive/Negative, Security, Business Cases/Use Cases, Valid/Invalid data, Field Validation, Test Data Setup and Graphic User Interface Testing.
- Analyzed the results and reported bugs using Quality Center.
- Responsible for functionality and regression tests to ensure successful integration with existing products.
- Participated in periodic walkthrough and defect report meetings with business users and developers.
- Performed manual testing and Data integrity/Backend testing by executing SQL statements.
- Involved in the automation script creation process using HP QTP.
- Provided back end testing for database auditing and data validation using SQL scripts.
- Performed Smoke test after each build.
- Wrote positive and negative test cases and conducted tests as needed.
- Verified text checkpoints, database checkpoints, and synchronization points in the application using QTP.
- Verified various logs on the application, web, and database servers.
Environment: QuickTest Professional 9.1, Quality Center, SQL Server 2005, VBScript, shell script, Oracle 9i, WinRunner, MS Office, Windows NT/XP, Mozilla Firefox, IE.
Confidential, Plano, TX
QA Analyst
Responsibilities:
- Involved in testing the functionality of Banking System application.
- Developed test cases by reviewing functional and business requirements.
- Performed manual smoke testing to verify the basic functionality of the system.
- Extensively tested the functional areas of account creation, maintenance, reports, interest posting, and certificates of deposit.
- Involved in testing back end and front end of the application.
- Automated the test scripts using WinRunner.
- Generated GUI maps for the application using WinRunner's RapidTest Script Wizard.
- Performed system integration testing to ensure integrity with the database, using database checkpoints in WinRunner.
- Used WinRunner for regression testing and LoadRunner for performance testing.
- Involved in different kinds of testing such as functional, load, and installation testing.
- Effectively used Test Director for Test design, development and defect tracking.
- Involved in the execution of test cases for system testing, regression testing, integration testing and user acceptance testing.
- Worked closely with developers, test leads, and business analysts to discuss environment issues, prioritization of defect fixes, and functional issues.
- Responsible for documentation of test results against each work order.
Environment: WinRunner 8.0, LoadRunner 8.0, Test Director 8.0, SQL Server, Oracle, UNIX, Windows XP.