Performance Test Engineer Resume
Menomonee Falls, WI
SUMMARY
- Performance Tester with 8+ years of IT experience.
- A self-driven, adaptable, and quick-learning professional with an in-depth understanding of the Software Development Life Cycle and Performance Testing.
- An effective communicator and team player, passionate about continually enhancing technical and functional competence to deliver results proficiently.
- Good knowledge of Silk Performer, LoadRunner, JMeter, and other load-generation tools.
- Strong documentation and report-writing skills.
- Extensive understanding of WebLogic, operating systems, networks, and Java software.
- Excellent time-management and time tracking abilities.
- Good collaboration skills for systems tuning.
- Very detail-focused and highly analytical.
- Above-average oral and interpersonal abilities.
- Extensive experience in Performance Testing of Client-Server, Web Services, Mobile and Web based applications in various domains like Retail, Logistics, Telecommunication and Finance.
- Proficient in test script and scenario development with 8 years of hands-on experience in LoadRunner, Silk Performer and JMeter.
- Proficient in testing tools such as HP Quick Test Professional (QTP), HP LoadRunner, HP Performance Center, SOAP UI, and HP Quality Center (QC).
- Well versed in various software development methodologies, such as Waterfall and Agile, across the SDLC.
- Solid understanding of requirements and creation of test cases out of Business Requirement Documents and Functional Requirement Documents.
- Strong experience in preparing Performance test plan, developing Performance test strategies, test cases and test scripts based on the requirements.
- Experienced in conducting Smoke, System, Functional, Regression, Stress, Load, and User Acceptance testing.
- Experience working with Network and Java Application Performance Testing using JMeter, LoadRunner.
- Experience in performance monitoring tools like SiteScope, Wily Introscope, AppDynamics, Dynatrace, SPLUNK, TeamQuest, and HP Performance Manager.
- Experience creating scripts to meet load-testing requirements according to the agreed Service Level Agreement (SLA).
- Extensive experience and expertise in Production performance issue resolution, working with architects in resolving performance issues in critical applications and working closely with the capacity team on resource utilization decisions.
- Worked on creating small web services such as SOAP and REST API scripts using requests like web_service_call and web_custom_request.
- Good experience with JVM concepts, heap utilization, and web server settings and tuning.
- Hands on working knowledge on back-end testing using SQL, PL/SQL.
- Expertise in problem solving and defect tracking/reporting using defect-tracking tools (JIRA, Quality Center).
- Good experience in successfully managing onsite-offshore model as an Onsite coordinator.
- Solid presentation skills, including experience with PowerPoint, Word, and Excel.
- Excellent verbal, written, and analytical skills, with the ability to work both in a team and independently in a fast-paced, dynamic environment.
TECHNICAL SKILLS
- HP LoadRunner V 12, HP Performance Center V 12
- Micro Focus Silk Performer
- Apache JMeter 5.0 with Taurus and Jenkins, BlazeMeter
- Quick Test Pro, HP ALM 11
- HP ALM/Quality Center/Test Director, Bugzilla and Clear Quest
- Splunk, Wily Introscope, HP SiteScope and Windows Resources
- TeamQuest Vityl Monitor, Tableau, Grafana
- Dynatrace APM
- MS Office Suite 2007, 2003, 2000
- WebSphere, WebLogic, Tomcat
- MongoDB Cloud, Oracle 10g/11g, DB2, MS SQL Server
- Windows, UNIX, LINUX
- Groovy scripting, C, Shell Scripting, Java
- Web (HTTP/HTML), Web Services, TruClient and Flex
- Agile, Waterfall, V Model
PROFESSIONAL EXPERIENCE
Confidential, Menomonee Falls, WI
Performance Test Engineer
Responsibilities:
- Collaborating on multiple projects with DBAs, leads, and managers; identifying performance bottlenecks; and gathering baseline performance metrics to compare against test metrics.
- Performing test data management/automation and data-driven testing using JMeter with Jenkins and Taurus; Jenkins was used for building and deploying every release into the performance environment.
- Developed scripts and scenarios for testing new and enhanced web-based products using JMeter and Cavisson NetStorm.
- Generated JMeter automation scripts and prepared the test data accurately with the help of additional sub-scripts using Groovy scripting and JSON manipulation.
- Performance tested all the API web services across Confidential's Planning Suite.
- Analyzed the report and validated that the forecasted load levels can be reached with acceptable response times of Open Pages for given functionalities.
- Identifying the product's breakpoint via stress testing by increasing the number of users the application can support until performance degrades.
- Used Java JConsole monitoring tool to understand if there are any memory leaks during the Capacity & Endurance Tests.
- Monitoring performance of the application and database servers during the test run using Grafana.
- Involved in agile testing practices and participated in different meetings to gather specifications and requirements (Load Metrics, Performance Requirements, SLA, Workflows, etc.) prior to testing.
- Monitored End User Experience, Overall Application Performance, Business Transaction Performance, and Application Infrastructure Performance across all tiers (web/app/DB) of the applications using Grafana.
- Collected the frequency of JVM Heap & Garbage Collection in DCT Server during Test.
- Worked on performance testing report and made recommendation for systems application performance improvement.
- Identifying problems, prioritizing them, and communicating bugs to the developers using the bug-tracking tool Quality Center.
Environment: Apache JMeter 5.0, Cavisson NetStorm, Web services, SOAP UI, Mongo DB, Fiddler, Grafana, Confluence, JIRA, Jenkins and Taurus, DevOps.
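The Groovy/JSON test-data preparation described above can be sketched as follows. This is an illustrative Java sketch only; the field names (orderId, sku, qty) are hypothetical and not taken from the actual project:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of the test-data preparation step: generating
// unique JSON payloads that a JMeter JSR223 sampler could feed into a
// data-driven test. All field names here are illustrative assumptions.
public class TestDataPrep {
    static String orderPayload(int orderId, String sku, int qty) {
        return String.format("{\"orderId\":%d,\"sku\":\"%s\",\"qty\":%d}",
                orderId, sku, qty);
    }

    // Pre-generate one payload per virtual user so each thread gets unique data.
    static List<String> generate(int vusers) {
        List<String> payloads = new ArrayList<>();
        for (int i = 1; i <= vusers; i++) {
            payloads.add(orderPayload(i, "SKU-" + i, 1));
        }
        return payloads;
    }

    public static void main(String[] args) {
        generate(3).forEach(System.out::println);
    }
}
```

Pre-generating data this way keeps each virtual user's requests unique, which avoids server-side caching artificially flattering the results.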
Confidential, Green Bay, WI
Performance Test Engineer
Responsibilities:
- Involved in designing performance test plans including test objectives, environment requirements, test data, types of testing, and realistic production-like scenarios.
- Developed scripts and scenarios for testing new and enhanced web-based products using JMeter and Silk Performer.
- Monitoring performance of the application and database servers during the test run using TeamQuest Vityl Monitor tool.
- Participated and implemented agile testing practices for widely distributed teams, involved in different meetings to gather specifications and requirements (Load Metrics, Performance Requirements, SLA, Workflows, etc.) prior to testing.
- Generated JMeter automation scripts and prepared the test data accurately with the help of additional sub-scripts.
- Executed different performance tests like Smoke Test, Baseline Test, Load Test, Stress Test, High Availability Test, Capacity Test, and Endurance Test.
- Created load test scripts using JMeter in following protocols: HTTP, AJAX, SOAP, JDBC, and Java (Web Services) and enhanced the basic script by adding Custom code.
- Performance tested SOA based application using Web Services Protocol.
- Involved in Localization testing and Performance testing of web-based modules, handled Load testing using JMeter.
- Analyzed the report and validated that the forecasted load levels can be reached with acceptable response times of Open Pages for given functionalities.
- Identified the product's breakpoint via stress testing by increasing the number of users the application can support until performance degrades.
- Identified the response times of specific system components to troubleshoot performance bottlenecks.
- Worked with the SDLC team to troubleshoot the root cause of database and application server issues using the Vityl Monitor tool.
- Used the Splunk tool to verify whether messages were reaching the Confidential back end.
- Used Java JConsole monitoring tool to understand if there are any memory leaks during the Capacity & Endurance Tests.
- Used APM tools TeamQuest Vityl Monitor, Grafana to monitor End User Experience, Overall Application Performance, Business Transaction Performance, and Application Infrastructure Performance across all tiers (web/app/DB) of the applications.
- Collected the frequency of JVM Heap & Garbage Collection in DCT Server during Test.
- Involved in creating Dynatrace and AppDynamics dashboards and reports using built-in and/or custom measures to present testing and analysis results effectively.
- Responsible for setting up user profiles and configuring and adding application servers in the Dynatrace tool.
- Worked on performance testing report and made recommendation for systems application performance improvement.
- Identified and prioritized problems and communicated bugs to the developers using the bug-tracking tool Quality Center.
Environment: Apache JMeter 5.0, HP Quality Center, Performance Center, Web services, SOA, Dynatrace, SOAP UI, SQL server, XML, Fiddler, Oracle, Remote Desktop, Splunk, TeamQuest Vityl Monitor, Grafana Influx DB.
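Load levels for test scenarios like the ones above are commonly sized with Little's Law: concurrent users ≈ throughput × (response time + think time). A minimal sketch with made-up numbers, not figures from the actual engagement:

```java
// Hedged sketch of workload-model sizing via Little's Law.
// All inputs below are illustrative, not project data.
public class WorkloadModel {
    static double concurrentUsers(double txPerSec, double respSec, double thinkSec) {
        // Little's Law: N = X * (R + Z)
        return txPerSec * (respSec + thinkSec);
    }

    public static void main(String[] args) {
        // e.g. 5 tx/sec at 2 s response time and 8 s think time -> 50 vusers
        System.out.println(concurrentUsers(5.0, 2.0, 8.0));
    }
}
```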
Confidential, Providence, RI
Performance Tester
Responsibilities:
- Involved in different meetings to gather specifications and requirements (Load Metrics, Performance Requirements, SLA, Workflows, etc.) prior to testing.
- Participated and implemented agile testing practices for widely distributed teams and executed Performance Test Plan and Test Cases with a standard format.
- Generated LoadRunner automation scripts and prepared the test data accurately with the help of additional sub-scripts.
- Developed performance test suites, creating thread groups and setting up samplers using the JMeter testing tool.
- Executed different performance tests (Smoke Test, Baseline Test, Load Test, Stress Test, Capacity Test, and Endurance Test).
- Created load test scripts using VUGen in following protocols: HTTP, AJAX, SOAP, ODBC, Terminal Emulator, and Java (Web Services)
- Developed VUGen scripts and enhanced the basic script by adding Custom code.
- Executed performance tests in the cloud using BlazeMeter.
- Used rendezvous concept of LoadRunner to generate peak load onto the server thereby stressing it and measuring its Performance.
- Identified application performance bottlenecks and recommended areas of performance improvement to developers.
- Worked with the SDLC team to troubleshoot the root cause of database and application server issues using the Dynatrace tool.
- Used Dynatrace to measure web site performance in test environment to capture performance metrics of key product features.
- Used APM tool Dynatrace, AppDynamics to Monitor End User Experience, Overall Application Performance, Business Transaction Performance and Application Infrastructure Performance across all tiers (web/app/DB) of the applications. Adding Dynatrace headers to the VUGen scripts to monitor response times closely.
- Used SPLUNK tool to check whether the messages are triggering Confidential back end.
- Used HP Diagnostics and Wily Introscope to further monitor graphs such as JVM heap, GC, thread status, Java process utilization, JVM exceptions, collection leaks, and context switches/sec to pinpoint issues.
- Collected the frequency of JVM Heap & Garbage Collection in DCT Server during Test.
- Identified and prioritized problems and communicated bugs to the developers using the bug-tracking tool Quality Center.
Environment: HP Load Runner 12.5, JMeter, HP Quality Center, Performance Center, VUGen, Web services, AppDynamics, Cloud watch monitoring tool, DynaTrace, SOAP UI, ASP.net, SQL server, XML, XML Spy, Fiddler, Oracle, TOAD, Splunk, Linux, FitNesse and Wily Introscope.
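The rendezvous concept used above (all virtual users pausing at a point and being released simultaneously to create a synchronized load spike) can be illustrated with Java's CyclicBarrier standing in for LoadRunner's lr_rendezvous; the timed transaction is stubbed and the vuser count is arbitrary:

```java
import java.util.concurrent.BrokenBarrierException;
import java.util.concurrent.CyclicBarrier;
import java.util.concurrent.atomic.AtomicInteger;

// Illustrative sketch only: each "vuser" thread blocks at the rendezvous
// point, and all are released at once, as in a LoadRunner stress spike.
public class RendezvousDemo {
    static int runSpike(int vusers) {
        CyclicBarrier rendezvous = new CyclicBarrier(vusers);
        AtomicInteger completed = new AtomicInteger();
        Thread[] threads = new Thread[vusers];
        for (int i = 0; i < vusers; i++) {
            threads[i] = new Thread(() -> {
                try {
                    rendezvous.await();          // wait until all vusers arrive
                    completed.incrementAndGet(); // stand-in for the timed transaction
                } catch (InterruptedException | BrokenBarrierException ignored) {
                }
            });
            threads[i].start();
        }
        for (Thread t : threads) {
            try {
                t.join();
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        }
        return completed.get();
    }

    public static void main(String[] args) {
        System.out.println(runSpike(10)); // all 10 vusers fire together
    }
}
```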
Confidential, Dallas, TX
Performance Tester
Responsibilities:
- Involved in creating performance test plans including test objectives, environment requirements, test data, types of testing, and realistic production-like scenarios.
- Developed scripts and scenarios for testing new and enhanced web-based products using LoadRunner and JMeter.
- Created Correlation as well as Parameterization using LoadRunner VuGen.
- Designed the LoadRunner scenarios to meet the User load requirements close to production environment.
- Simulated heavy load using the JMeter master-slave model to test the application's resilience and analyze overall performance under different load types.
- Monitored the performance of the application and database servers during test runs using tools like AppDynamics and SiteScope.
- Analyzed server logs and performed in-depth analysis using Dynatrace.
- Extensively used Wily Introscope to analyze system resource bottlenecks such as memory leaks, CPU utilization, response time, and TPS, as well as problematic application and DB components.
- Executed Load Test, Stress Test, Endurance Test, Spike Test.
- Worked on JMeter to create Thread Groups and test Web Application using SOAP/REST protocols for various loads on key business scenarios.
- Worked closely with business team, application team and identified their requirements and objective for performance testing.
- Added various monitoring parameters (CPU, Memory) to the LoadRunner controller for monitoring, also using SiteScope for monitoring database and application servers.
- Analysed the Application and Database server resource utilization for any bottlenecks.
- Prepared and reviewed final Performance test reports.
- Used Performance Center together with its VuGen and Analysis components.
Environment: HP LoadRunner 11.52, Controllers, Performance Center 11.52, Apache JMeter 3.0, Ajax Truclient, AppDynamics, SiteScope, Wily Introscope, HP Diagnostics, Web Servers, Cloud watch monitoring tool, Dynatrace, Oracle 11g, Java, HP ALM (Quality Center).
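Performance test reports like the ones reviewed above typically center on percentile response times. A minimal sketch of a nearest-rank 90th-percentile computation over hypothetical sample data:

```java
import java.util.Arrays;

// Minimal sketch of the 90th-percentile response-time metric that tools
// such as LoadRunner Analysis report; sample values below are made up.
public class Percentile {
    static double p90(double[] samplesMs) {
        double[] sorted = samplesMs.clone();
        Arrays.sort(sorted);
        // nearest-rank method: ceil(0.9 * n) gives a 1-based index
        int rank = (int) Math.ceil(0.9 * sorted.length);
        return sorted[rank - 1];
    }

    public static void main(String[] args) {
        double[] samples = {120, 95, 310, 180, 250, 140, 160, 200, 130, 500};
        System.out.println(p90(samples)); // 90% of samples fall at or below this
    }
}
```

Reporting the 90th percentile rather than the mean keeps a few slow outliers from hiding the experience of typical users.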
Confidential
Performance Analyst
Responsibilities:
- Designed performance test plans as required by the customer, provided necessary support, and ensured that the development process was carried out in accordance with the strategy.
- Responsible for performance testing (Load, Stress and Volume) using HP LoadRunner (Controller, Virtual User Generator, Analysis).
- Involved in installation and Setup of Performance Center HP LoadRunner.
- Developed automated tests, measured, validated performance of system against requirement, and maintained automated tests environment and scripts for performance tests.
- Monitored CPU usage, Idle Thread Counts, GC Heap, Open Session Current Counts by using Wily Introscope and WebLogic Console.
- Developed Work Load Model (WLM) for every release.
- Collected performance monitoring statistics and coordinated with technical architects and business analysts to analyze performance bottlenecks and provide recommendations to improve application performance.
- Created scripts for Regression, Security, GUI, Integration and Database testing.
- Prepared and executed test scripts using JMeter and SoapUI to perform web services testing, and ran load tests in BlazeMeter.
- Used JMeter to test the performance of both static and dynamic resources, determine the maximum number of concurrent users the website can handle, and provide graphical analysis of performance reports.
- Conducted WSDL review meetings to understand the requirements of each web service.
- Executed each web service manually by testing each operation in the WSDL.
- Performance tested SOA based application using Web Services Protocol.
- Designed and developed automated scripts using LoadRunner based on business use cases for the application.
- Designed and conducted Smoke, Load, Soak, Stress, and Scalability tests using Performance Center.
- Used Random Pacing between iterations to get the desired transactions per hour.
- Extensively used SQL queries to check the business transaction flows, editing existing batch jobs.
- Created VuGen scripts using different protocols like Web-HTTP, Web services.
Environment: LoadRunner 11.0, JMeter, Agile methodology, HP Diagnostics, Quality Center 11.0, SOA, Remote Desktop, Java, load testing tools, Wily Introscope.
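The random-pacing bullet above rests on a simple relationship: to hit a target transactions-per-hour rate, each vuser must start a new iteration every 3600 × vusers / targetTPH seconds, with the random range centered on that value. A hedged sketch with illustrative numbers:

```java
// Sketch of deriving iteration pacing from a transactions-per-hour
// target; the target and vuser count below are illustrative only.
public class Pacing {
    static double pacingSeconds(int targetTph, int vusers) {
        // each vuser contributes targetTph / vusers iterations per hour
        return 3600.0 * vusers / targetTph;
    }

    public static void main(String[] args) {
        // 3600 tx/hour spread over 10 vusers -> one iteration every 10 s per vuser
        System.out.println(pacingSeconds(3600, 10));
    }
}
```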