Performance Test Engineer Resume
Minneapolis
SUMMARY:
- 8+ years of experience in Information Technology with an emphasis on Software Quality Assurance.
- Experience in analyzing business requirement documents and developing test scripts.
- Expertise in scripting the SAPGUI, Citrix, HTTP, Web Services, and SAP HTML protocols using HP LoadRunner.
- Experienced in managing an offshore/onshore delivery model.
- Effective time management skills and a consistent ability to meet deadlines.
- Excellent management and requirements-gathering skills.
- Proven ability to manage a team of QA testers and set test strategies.
- Presided over walkthroughs and other related meetings.
- Excellent communication, presentation, and interpersonal skills.
PROFESSIONAL EXPERIENCE:
Confidential, Minneapolis
Performance Test Engineer
Responsibilities:
- Conducted performance testing in an agile development environment.
- Developed performance test scripts using HP LoadRunner 12.02.
- Managed an offshore/onshore delivery model.
- Executed scripts under normal and peak loads to observe system characteristics.
- Conducted load, stress, and volume testing.
- Executed a Day-in-the-Life scenario simulating real user load.
- Managed and coordinated testing activity using HP Performance Center 12.02.
- Analyzed results and shared them with the team.
Environment: LoadRunner 12.02, Quality Center 10, SOASTA, and MS Office.
Confidential, Minneapolis
Performance Test Engineer
Responsibilities:
- Conducted performance testing in an agile development environment.
- Developed performance test scripts using HP LoadRunner 11.52.
- Managed an offshore/onshore delivery model.
- Executed scripts under normal and peak loads to observe system characteristics.
- Conducted load, stress, and volume testing.
- Executed a Day-in-the-Life scenario simulating real user load.
- Managed and coordinated testing activity using HP Performance Center.
- Analyzed results and shared them with the team.
Environment: LoadRunner 11.52, Quality Center 10, New Relic, SOASTA, and MS Office.
Confidential, Minneapolis
Performance Test Engineer
Responsibilities:
- Conducted performance testing in an agile development environment.
- Set up test data to support performance testing activities.
- Developed and enhanced scripts in HP VuGen using the Web (HTTP/HTML) protocol.
- Conducted load, stress, and volume testing.
- Gathered user behavior patterns and analyzed the data during the open enrollment period.
- Managed and coordinated testing activity with multiple teams using HP Performance Center.
- Monitored system and resource utilization for the application using New Relic.
Environment: LoadRunner 11.52, Quality Center 10, New Relic, SOASTA, and MS Office.
Confidential, Minneapolis
Performance Test Engineer
Responsibilities:
- Gathered business requirements and identified performance test scenarios.
- Involved in the preparation of the test plan and test strategy.
- Participated in the design and development of performance test scripts.
- Streamlined the defect tracking process using HP ALM.
- Developed performance test scripts using HP LoadRunner.
- Identified the test environment for performance testing.
- Helped ensure overall product quality and on-time delivery.
- Communicated with the development team and clients to deliver a quality product.
Environment: LoadRunner 11.0, Quality Center 10, and MS Office.
Confidential, Minneapolis
Performance Test Engineer
Responsibilities:
- Involved in the preparation of the performance test plan.
- Coordinated with the functional team to document test cases and test data.
- Captured key performance metrics while gathering requirements.
- Managed an offshore/onshore delivery model.
- Involved in VTO (Volume Test Optimization) along with SAP resources.
- Executed scripts under normal and peak loads to observe system characteristics.
- Executed a Day-in-the-Life scenario simulating real user load.
- Communicated with the development team and clients to deliver a quality product.
Environment: LoadRunner 11, SAPGUI 7.2, MS Office 2007, and Quality Center 10.
Confidential, Minneapolis
Performance Test Engineer
Responsibilities:
- Documented the test plan and test cases for the application under test.
- Involved in the preparation of the performance test plan and strategy.
- Executed scripts under normal and peak loads to observe system characteristics.
- Executed a Day-in-the-Life scenario simulating real user load.
- Communicated with the development team and clients to deliver a quality product.
Environment: LoadRunner 11, SAPGUI 7.2, MS Office 2007, Quality Center 10, Windows XP, and Fiddler.
Confidential, Minneapolis
Performance Test Engineer
Responsibilities:
- Coordinated with the functional team to document test cases and test data.
- Tested the business-critical templates identified under the Expense ARR / GL / POS application.
- Coordinated between offshore and onshore teams.
- Captured key performance metrics while gathering requirements.
- Used Fiddler to capture HTTP/HTTPS requests between the client and server applications.
- Used the BPC LoadRunner toolkit to develop LoadRunner scripts.
- Executed scripts under normal and peak loads to observe system characteristics.
- Executed a Day-in-the-Life scenario simulating real user load.
Environment: LoadRunner 11.0, Quality Center 10, Fiddler 2.3.5.2, SAP BPC NW 7.5 SP 8.1, and MS Office.
Confidential, Minneapolis, MN
Performance Test Engineer
Responsibilities:
- Gathered requirements, captured key metrics, and documented the business process.
- Involved in the preparation of the test plan and test strategy document.
- Coordinated with the functional team to document test cases and test data.
- Coordinated a team of 5 performance testers across offshore and onshore locations.
- Split script development between offshore and onsite teams in a 60:40 ratio.
- Used the SAP Web, SAP GUI, and HTTP/HTML protocols to develop scripts.
- Tuned and benchmarked scripts for normal and peak user loads.
- Tested the groups of transactions identified as "Day-in-the-Life" scenarios to confirm that they met their performance expectations when run concurrently.
- Conducted network latency testing using Shunra.
- Designed load, stress, endurance, and Day-in-the-Life testing scenarios using LoadRunner.
- Analyzed LoadRunner reports and results to identify bottlenecks in the application.
- Analyzed and documented results and shared them with the business for review.
- Used Quality Center to document results and log defects.
Environment: LoadRunner 11.0, Quality Center 10.0, and MS Office.
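The shape of a concurrent "Day-in-the-Life" run can be sketched in plain Python. This is a minimal illustration, not a LoadRunner artifact: the transaction names, weights, user count, and simulated delays below are all hypothetical stand-ins for a real workload model.

```python
import random
import statistics
import threading
import time

# Hypothetical transaction mix; names and weights are illustrative only.
TRANSACTION_MIX = {"login": 0.2, "search": 0.5, "checkout": 0.3}

results = {}              # transaction name -> list of response times (seconds)
results_lock = threading.Lock()

def run_transaction(name):
    """Stand-in for a real request; sleeps briefly to simulate server work."""
    start = time.perf_counter()
    time.sleep(random.uniform(0.001, 0.005))
    elapsed = time.perf_counter() - start
    with results_lock:
        results.setdefault(name, []).append(elapsed)

def virtual_user(iterations):
    """One concurrent user drawing transactions from the weighted mix."""
    names = list(TRANSACTION_MIX)
    weights = list(TRANSACTION_MIX.values())
    for _ in range(iterations):
        run_transaction(random.choices(names, weights=weights)[0])

# Run 10 virtual users concurrently; a peak-load run would simply raise this.
users = [threading.Thread(target=virtual_user, args=(20,)) for _ in range(10)]
for u in users:
    u.start()
for u in users:
    u.join()

for name, times in sorted(results.items()):
    print(f"{name}: n={len(times)}, avg={statistics.mean(times) * 1000:.2f} ms")
```

In a real tool the per-transaction averages and percentiles collected this way are what get compared against the performance expectations for the concurrent scenario.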
Confidential, Linthicum Heights, MD
Performance Tester
Responsibilities:
- Prepared test cases and documented the test data.
- Involved in script development and enhancement using LoadRunner.
- Inserted transaction points to measure application transaction response times.
- Coordinated effectively between onshore and offshore teams.
- Developed test data by executing SQL queries and extracting data from the database.
- Conducted load, stress, scalability, and endurance testing using LoadRunner for the Day-in-the-Life scenario.
- Analyzed LoadRunner results and graphs to identify bottlenecks in the application.
Environment: LoadRunner 9.5, SQL, and MS Office.
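Extracting test data with SQL, as described above, typically means querying a slice of records and writing them to a parameter file the load tool can consume. The sketch below uses an in-memory SQLite database; the `accounts` table, its columns, and the filter are hypothetical examples, not the actual application schema.

```python
import csv
import sqlite3

# Hypothetical schema: stand-in for whatever the application under test uses.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE accounts (id INTEGER PRIMARY KEY, username TEXT, region TEXT)"
)
conn.executemany(
    "INSERT INTO accounts (username, region) VALUES (?, ?)",
    [("user%d" % i, "east" if i % 2 else "west") for i in range(1, 101)],
)

# Pull a bounded slice of records to parameterize the load scripts.
rows = conn.execute(
    "SELECT username FROM accounts WHERE region = ? ORDER BY id LIMIT 50",
    ("east",),
).fetchall()

# Write a CSV parameter file (the format LoadRunner-style tools can read).
with open("test_users.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["username"])
    writer.writerows(rows)

print(f"extracted {len(rows)} usernames")
```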
Confidential, Boston, MA
Performance Tester
Responsibilities:
- Involved in both manual and automated testing.
- Developed test cases based on the requirement documents.
- Generated scripts using the Vuser Generator based on workflows gathered from business users.
- Inserted transaction points and rendezvous points to measure the performance and response times of the application.
- Created scenarios in the LoadRunner Controller based on user loads.
- Conducted load, stress, and volume testing.
- Analyzed the results using the graphs and reports generated from the tests.
Environment: LoadRunner, SQL, and MS Office.
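The transaction points mentioned above bracket a business step so its elapsed time can be reported; in LoadRunner this is done with `lr_start_transaction()`/`lr_end_transaction()`. A rough Python analogue of that bracketing, using a context manager and an illustrative "login" step, looks like this:

```python
import time
from contextlib import contextmanager

# Records elapsed wall-clock time per named step, mimicking how a transaction
# point brackets a business action. The step name below is illustrative.
timings = {}

@contextmanager
def transaction(name):
    start = time.perf_counter()
    try:
        yield
    finally:
        timings[name] = time.perf_counter() - start

with transaction("login"):
    time.sleep(0.01)  # placeholder for the real request being measured

print(f"login took {timings['login'] * 1000:.1f} ms")
```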
Confidential, Washington
Test Analyst
Responsibilities:
- Developed and validated test scripts based on business-critical scenarios.
- Conducted regression testing for subsequent versions of the application using LoadRunner and WinRunner.
- Analyzed scenario performance, cross results, graphs, and reports in LoadRunner.
- Categorized test cases and test data for functionality, integration, and end-to-end testing.
- Performed various types of testing, such as functional, negative, regression, and user acceptance testing.
- Performed integration testing during different testing phases.
- Used Quality Center to identify, log, report, and prioritize defects.
Environment: TOAD, QTP, LoadRunner, and Quality Center.