Performance Engineer Resume
Mount Horeb, WI
SUMMARY
- 6 years of experience in functional and performance testing.
- Highly proficient in performance testing using LoadRunner, Performance Center, StormRunner, NeoLoad, Apache JMeter, and SOASTA CloudTest.
- Experienced with monitoring tools such as Dynatrace and New Relic.
- Experienced in the development and implementation of testing processes to ensure successful testing and delivery of cloud-based products.
- Hands-on experience in conducting POCs and writing test plan, NFR, and test report documents.
- Involved in all phases of performance testing: requirements gathering, scripting, testing, monitoring, analysis, reporting, and tuning.
- 1 year of experience working with NeoLoad.
- 2 years' experience working with LoadRunner.
- 2 years' experience working with JMeter and SmartMeter.
- Ran test scripts from a Java program using the JMeter API.
- Created JMX files and executed them from a Java program.
- Analyzed results such as server response times, JVM heap memory, and CPU utilization.
- Experienced in using the JProfiler tool to conduct profiling.
- Proficient in developing scripts with various samplers in Apache JMeter and executing load tests.
- Extensively used NeoLoad for creating and executing test scripts.
- Executed manual test scripts and tracked and logged defects using Quality Center/ALM.
- Expert in conducting different varieties of tests: load, stress, endurance, and failover tests.
- Worked in different domains including Banking/Finance, Insurance, E-commerce, and Warehouse Management.
- Expert in analyzing results, reporting, and recommending changes.
- Experienced in using and configuring Jenkins with LoadRunner/Performance Center as part of the CI/CD process.
- Experienced in VBScript in the QuickTest Professional (QTP) tool.
- Experienced in test script creation, test execution, and analyzing the results of each test for a productive outcome.
- Expert in recording/coding VuGen scripts using different protocols in all types of environments.
- Expert in manual correlation, parameterization, and C scripting.
- Experienced in scripting applications using different protocols such as Web (HTTP/HTML), Web (Click and Script), Web Services, RTE, and Ajax TruClient.
- Worked mainly on .NET- and Java-based applications.
- Excelled in testing the end-to-end performance of applications with different types of testing: benchmark, load, performance, stress, soak, and capacity.
- Expert in monitoring Linux-based and .NET-based applications.
- Monitored the performance of applications along with web, system, network, and database server resources.
- Involved in test status review meetings.
- Good Analytical and communication skills.
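The results analysis mentioned above (server response times from JMeter runs) can be sketched in plain Java; the simplified JTL column layout and sample rows below are illustrative assumptions, not output from a real test run:

```java
import java.util.List;

// Minimal sketch: computing average elapsed time from JMeter JTL (CSV) rows.
public class JtlStats {
    // Returns the mean of the "elapsed" field (column index 1) across data rows.
    static double averageElapsedMs(List<String> rows) {
        return rows.stream()
                   .mapToLong(r -> Long.parseLong(r.split(",")[1]))
                   .average()
                   .orElse(0.0);
    }

    public static void main(String[] args) {
        // timeStamp,elapsed,label,responseCode  (simplified JTL columns)
        List<String> rows = List.of(
            "1600000000000,120,Login,200",
            "1600000001000,240,Search,200",
            "1600000002000,180,Checkout,200");
        System.out.println("Average response time: " + averageElapsedMs(rows) + " ms");
    }
}
```

Real JTL files have more columns (latency, success flag, thread name), but the per-label aggregation follows the same pattern.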
TECHNICAL SKILLS
Performance Testing Tools: NeoLoad, HP LoadRunner, SOASTA CloudTest, Apache JMeter, Performance Center, SmartMeter
Performance Analysis: Profiling/Tuning (performance and functional testing)
Monitoring Tools: AppDynamics, Dynatrace, Splunk, Wily, New Relic
Profiling Tools: Statspack, JProfiler
Defect/Bug/Issue Tracking: Bugzilla, Quality Center (QC), JIRA.
Operating Systems: Linux, UNIX, Windows XP, Windows Vista, Windows 7, Windows 10, Windows Server 2003/2008
Remote Access Tools: MobaXterm, WinSCP
Deployment Tools: Octopus, Jenkins
Web/Application Servers: Apache, Tomcat, WebSphere, WebLogic
Version Controlling Tools: Perforce
Programming Languages: C, C++, Core Java
Databases: Oracle, SQL Server, MySQL, DB2
PROFESSIONAL EXPERIENCE
Confidential - Mount Horeb, WI
Performance Engineer
Responsibilities:
- Completed performance testing for two projects.
- Completed requirements gathering, test strategy, test design, test execution, and results analysis for two projects.
- Discussed with application architects and product owners to document business requirements and identify business-critical scenarios for non-functional testing, in order to enhance and scale up existing and new products.
- Defined the strategy for carrying out all performance engineering activities for the application and planned the required resources, including the toolset used for test execution.
- Completed performance testing of an e-commerce website to measure system performance under peak load conditions.
- Gathered requirements for performance tests of the e-commerce website.
- Prepared the test strategy and test plan documents for the e-commerce website.
- Prepared the workload for the e-commerce website performance tests.
- Prepared test scripts for the test scenarios and gathered the required data to pass into the test scripts.
- Pulled data from the database by executing SQL queries to prepare test data.
- Executed the above test scripts and analyzed the performance of the system.
- Analyzed results such as CPU usage, memory usage, garbage collection/heap size, server response times, database response times, and active/idle threads using Dynatrace.
- Prepared the final performance report for the e-commerce project.
- Completed performance testing of a retail point-of-sale (POS) system.
- Gathered requirements for performance tests of the POS system.
- Prepared the test strategy and test plan documents for the POS system.
- Prepared workloads for performance testing the POS system.
- Prepared test scripts for the test scenarios and gathered the required data to pass into the test scripts.
- Executed the test scripts and analyzed the performance of the POS system.
- Prepared the final performance report for the POS system.
- Set up a CI/CD pipeline using Jenkins and NeoLoad for performance/load testing scripts.
- Created Vuser scripts in LoadRunner 2020.610.
- Developed Vuser scripts using LoadRunner for the Web (HTTP/HTML) and Web Services protocols based on user workflows.
- Defined and configured SLAs for hits/sec, throughput, and transactions per second in LoadRunner.
- Configured web, application, and database server performance monitoring using the LoadRunner Controller.
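SLA thresholds like those above are configured in the LoadRunner Controller/Analysis UI, but the pass/fail logic itself is simple to express. A minimal Java sketch in which the threshold constants are hypothetical illustrations, not values from the actual projects:

```java
// Minimal sketch of an SLA pass/fail check on run-level metrics.
public class SlaCheck {
    // Hypothetical thresholds for illustration only; real SLAs live in the
    // LoadRunner Controller/Analysis configuration.
    static final double MIN_TPS = 50.0;
    static final double MIN_HITS_PER_SEC = 200.0;

    // True when a run's measured throughput meets both thresholds.
    static boolean meetsSla(double tps, double hitsPerSec) {
        return tps >= MIN_TPS && hitsPerSec >= MIN_HITS_PER_SEC;
    }

    public static void main(String[] args) {
        System.out.println(meetsSla(72.4, 310.0)); // sample run that passes
        System.out.println(meetsSla(12.0, 310.0)); // sample run that fails on TPS
    }
}
```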
Environment: NeoLoad, LoadRunner 2020.610, Dynatrace, VMware, Splunk, JavaScript
Confidential - Lehi, UT
Sr Performance Test Engineer
Responsibilities:
- Created performance test plans.
- Conducted load tests in different environments using the SOASTA CloudTest tool.
- Created and debugged test scripts for load tests.
- Created clear documentation for scripts for the benefit of peers not involved in the scripts' creation.
- Analyzed requirements and detailed design, and formulated the test plan for functional testing of the application.
- Developed scripts in the Web (HTTP/HTML) and Web Services protocols in the SOASTA CloudTest tool.
- Inserted transactions and checkpoints into load test scripts; parameterized, paced, and correlated the scripts.
- Executed test cases and interacted with the development team to report and correct errors for every version release.
- Responsible for integration and regression testing.
- Monitored the JVM and GC with the JProfiler tool to conduct profiling.
- Edited automated scripts by inserting logical commands to handle complicated test scenarios.
- Monitored performance measurements such as end-to-end response time, network response time, server response time, and middleware-to-server response time.
- Performed system, functional, and regression testing using Selenium.
- Used Jenkins to automate batch testing and load testing.
- Ran SQL queries to retrieve data from backend systems.
- Supported testing during code deployments to production.
- Collected the frequency of JVM heap metrics and garbage collection.
- Involved in checking system functionality in pre/post-production release processes.
- Responsible for monitoring infrastructure behavior using New Relic during load test execution to identify any performance bottlenecks.
- Monitored tests through Performance Center and the Controller.
- Used Performance Center to create scenarios and set up monitors to track load generators for performance testing; involved in determining scalability and bottleneck testing of applications.
- Responsible for monitoring graphs such as throughput, hits/sec, transaction response time, and Windows resources while executing scripts from Performance Center.
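The JVM heap and GC-frequency figures collected above can also be read in-process through the JDK's standard management beans. A minimal sketch using only `java.lang.management` (no project-specific code):

```java
import java.lang.management.GarbageCollectorMXBean;
import java.lang.management.ManagementFactory;

// Minimal sketch: reading heap usage and GC counts from the running JVM.
public class JvmMetrics {
    // Currently used heap, in bytes.
    static long heapUsedBytes() {
        return ManagementFactory.getMemoryMXBean().getHeapMemoryUsage().getUsed();
    }

    // Total collections across all garbage collectors (young + old generation).
    static long totalGcCount() {
        long count = 0;
        for (GarbageCollectorMXBean gc : ManagementFactory.getGarbageCollectorMXBeans()) {
            count += Math.max(gc.getCollectionCount(), 0); // -1 means "unavailable"
        }
        return count;
    }

    public static void main(String[] args) {
        System.out.println("Heap used: " + heapUsedBytes() / (1024 * 1024) + " MB");
        System.out.println("GC collections so far: " + totalGcCount());
    }
}
```

Sampling these values on a timer during a load test gives the same heap/GC-frequency trend that tools like New Relic chart externally.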
Environment: SOASTA CloudTest, Performance Center 12.53, New Relic, Web Services, VMware, Confluence, Kibana, Splunk, Docker services, JavaScript.
Confidential - Nashville, TN
Sr Performance Engineer
Responsibilities:
- Worked in an agile development environment with frequently changing requirements and feature sets.
- Created and analyzed the relevant testing environment so that the application under test met production requirements.
- Interacted with business analysts and developers in requirements analysis, design reviews, testing, and documentation for applications developed in an agile environment.
- Educated clients on the need to performance test applications before deploying to production and categorized different load test scenarios.
- Developed scripts using the Web (HTTP/HTML) and Web Services protocols with the LoadRunner tool and enhanced the scripts to support error/data handling.
- Involved in various performance tests such as load, stress, and volume testing using StormRunner, SmartMeter, and Apache JMeter.
- Ran test scripts from a Java program using the JMeter API.
- Created JMX files and executed them from a Java program.
- Evaluated think time and pacing calculations when preparing scenario designs for testing with loads ranging from 5 TPS to 600 TPS.
- Implemented continuous integration of load tests with Jenkins as part of the DevOps CI/CD process.
- Executed single-user tests from the front end to do page-load analysis for test transactions using the HTTPWatch tool.
- Analyzed Tomcat server metrics and JVM heap usage metrics: used heap after collection, percentage of time spent in GC, young generation usage, old generation usage, PermGen usage, and process CPU.
- Developed scripts for data testing (DT).
- Collected the frequency of JVM heap metrics and garbage collection.
- Analyzed front-end metrics including response times, hits, and throughput obtained from the LoadRunner tool for each executed load test.
- Assisted the project manager with deliverables, maintaining project plans and schedules.
- Defined the performance test strategy and developed the performance test plan, including all necessary details of performance testing (objectives, scope, resources, scenario details, test cases, and schedules).
- Performed web log analysis to deduce the workload and understand peak workload use cases and peak connected sessions with the different timings involved for performance testing.
- Involved in creating test scripts and developed custom scripts for different workflows, user roles, and business transactions.
- Created basic scenarios for execution in the Controller.
- Developed scripts in the Web (HTTP/HTML) and Web Services protocols in the StormRunner tool.
- Responsible for executing load tests.
- Measured performance metrics (response times, throughput, hits/sec, etc.) and monitored resource demand metrics (CPU percentage, memory, etc.).
- Involved in preparing the load test summary report based on the performance metrics and summarized findings into meaningful charts and graphs.
- Added new graphs to the Analysis reports, compared results with SLAs, merged two or more graphs to compare results, and exported HTML reports.
- Coordinated with developers and database administrators while running tests to monitor server performance, which helped identify performance bottlenecks.
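The pacing side of the scenario design above follows a simple relationship: if each Vuser completes one transaction per iteration, pacing (seconds) = Vusers / target TPS. A small Java sketch; only the 5-600 TPS range comes from the testing above, and the Vuser counts are hypothetical:

```java
// Minimal sketch of the pacing calculation used in scenario design.
public class Pacing {
    // pacing (s) = virtual users / target TPS,
    // assuming one transaction per iteration per Vuser.
    static double pacingSeconds(int vusers, double targetTps) {
        return vusers / targetTps;
    }

    public static void main(String[] args) {
        // e.g., 300 Vusers at 600 TPS -> each Vuser iterates every 0.5 s
        System.out.println(pacingSeconds(300, 600.0));
        // at the low end, 100 Vusers at 5 TPS -> 20 s pacing
        System.out.println(pacingSeconds(100, 5.0));
    }
}
```

Think time is then folded in by making pacing the iteration-start-to-iteration-start interval rather than a sleep after each iteration.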
Environment: Performance Center, HP StormRunner, Oracle, UNIX, SQL.
Confidential - Anaheim, CA
Senior Performance Test Engineer
Responsibilities:
- Interacted with business leads, solution architects, and application teams to develop and mimic production usage models by collecting non-functional requirements for a multi-year rollout of a large-volume SOA.
- Involved in gathering requirements from various teams and worked on the test plan and test strategy documents for projects based on the NFRs.
- Monitored system performance using AppDynamics.
- Tested cloud applications using the SOASTA CloudTest tool.
- Worked on mobile/app testing using cloud testing services.
- Monitored the JVM and GC with the Statspack tool to conduct profiling.
- Analyzed Tomcat server metrics and JVM heap usage metrics: used heap after collection, percentage of time spent in GC, young generation usage, old generation usage, PermGen usage, and process CPU.
- Developed and implemented testing processes to ensure successful testing and delivery of cloud-based products.
- Worked with developers to understand the code and application in order to develop the load scripts.
- Created the detailed performance load test plan and test case design document with input from developers and functional testers.
- Involved in various performance tests such as load, stress, and volume testing using LoadRunner and Apache JMeter.
- Developed scripts in the Web (HTTP/HTML) and Web Services protocols in the JMeter tool.
- Collected the frequency of JVM heap metrics and garbage collection.
- Developed performance scripts using VuGen.
- Analyzed scalability, throughput, and load testing metrics against AWS test servers to ensure maximum performance per requirements.
- Developed scripts and executed load test in production simulating peak day volume.
- Developed test plans, test scenarios, test cases, test summary reports and test execution metrics.
- Developed robust Vuser scripts using the Web (HTTP/HTML), Ajax TruClient, Web Services, and Citrix protocols in LoadRunner.
- Created effective NeoLoad scripts using the Web and Web Services protocols.
- Used NeoLoad to apply load to a server, group of servers, network, or object to test its strength or analyze overall performance under different load types.
- Worked extensively with JSON/XML data and SOAP protocols in non-UI web services (SOA) testing.
- Conducted testing in the vendor environment to capture baselines and benchmarks after migrating to Azure.
- Wrote and executed SQL queries to validate test results.
- Responsible for setting up SiteScope monitors to monitor network activity and bottlenecks and to gather metrics from application/database servers.
- Configured and used Dynatrace for performance monitoring; troubleshot bottlenecks found during performance testing by analyzing response times and profiling the application to locate performance issues.
- Monitored metrics on the application server, web server, and database server.
- Used Splunk to monitor and collect metrics from performance test servers.
- Reported various performance analysis graphs and reports collected from performance tools and discussed bottlenecks such as memory leaks, JVM heap, CPU utilization, and network time.
- Analyzed JVM verbose GC logs and heap dumps to find potential memory leak issues.
- Responsible for analysis, reporting, and publishing of test results.
- Tested native mobile apps on different mobile platforms such as Android devices to track new feature performance and bug fixes to ensure the stability of releases.
- Tested the mobile application for UAT, usability, performance, compatibility, and load on iOS and Android devices.
- Involved in automation environment setup using Eclipse, Java, Selenium WebDriver Java language bindings, and TestNG jars.
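Verbose GC log analysis of the kind described above often starts by summing pause times. A minimal sketch, assuming a simplified classic-format pause field ("…, N.NNNN secs"); real GC log formats vary by JVM version and flags, and the sample lines are illustrative:

```java
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Minimal sketch: summing GC pause times from verbose GC log lines.
public class GcLogScan {
    // Matches the pause-time field in lines like "... 512M->128M(1024M), 0.0450 secs]"
    private static final Pattern PAUSE = Pattern.compile("([0-9]+\\.[0-9]+) secs");

    // Total GC pause seconds across the given log lines.
    static double totalPauseSeconds(List<String> lines) {
        double total = 0.0;
        for (String line : lines) {
            Matcher m = PAUSE.matcher(line);
            if (m.find()) total += Double.parseDouble(m.group(1));
        }
        return total;
    }

    public static void main(String[] args) {
        List<String> sample = List.of(
            "[GC (Allocation Failure) 512M->128M(1024M), 0.0450 secs]",
            "[Full GC (Ergonomics) 900M->300M(1024M), 0.9000 secs]");
        System.out.println("Total GC pause: " + totalPauseSeconds(sample) + " s");
    }
}
```

Dividing the summed pause time by the elapsed test window gives the "percentage of time spent in GC" metric tracked in the bullets above.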
Environment: HP LoadRunner 12.53, HP Performance Center, HTTP/HTML, JIRA, Dynatrace, Windows, Linux, Excel, SQL, Oracle Database 11g/12c, Oracle SQL Developer, Splunk.
Confidential
Performance Test Analyst
Responsibilities:
- Prepared the PDD (Performance Design Document) with the project and portfolio details along with the plan.
- Copied the latest build version of NAF from the installer box to the performance environment servers.
- Deployed the latest build on our performance Linux server box for both the Node and REST servers.
- Scripted the identified business flows in JMeter.
- Monitored the JVM and GC with the JProfiler tool to conduct profiling.
- Ran load tests and analyzed the aggregate report and JTL files.
- Pulled the Node and REST server components' access and server logs.
- Executed sar, top, vmstat, mpstat, iostat, and tail to pull system statistics as needed.
- Pulled the AWR report through the Linux server box.
- Involved in thread dump and heap dump analysis.
- Analyzed the results to debug triggered errors and exceptions.
- Reported the findings to the concerned portfolio team.
- Involved in performance testing the regression business scenarios.
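Thread dumps like those analyzed above can also be captured programmatically via the JDK's `ThreadMXBean`, which is useful when `jstack` isn't available on the test box. A minimal sketch using only standard JDK APIs:

```java
import java.lang.management.ManagementFactory;
import java.lang.management.ThreadInfo;
import java.lang.management.ThreadMXBean;

// Minimal sketch: a programmatic thread dump (names and states only),
// a lightweight cousin of what jstack produces.
public class ThreadDump {
    static String dump() {
        StringBuilder sb = new StringBuilder();
        ThreadMXBean bean = ManagementFactory.getThreadMXBean();
        // false, false: skip lock monitor / synchronizer details for brevity
        for (ThreadInfo info : bean.dumpAllThreads(false, false)) {
            sb.append('"').append(info.getThreadName()).append("\" ")
              .append(info.getThreadState()).append('\n');
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.print(dump());
    }
}
```

Triggering this on a timer during a soak test makes it easy to spot threads stuck in BLOCKED or WAITING states across samples.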
Environment: Oracle 12c, JMeter 3.0/2.13, WinSCP, LoadRunner 11/12.53