
Performance Engineer Resume

Wilmington, DE

SUMMARY:

  • 8+ years of experience in Quality Assurance methodologies using HP/Mercury Interactive tools, mainly HP LoadRunner and Quality Center (QC).
  • Extensive experience with HP LoadRunner for monitoring and testing of web-based and client/server systems built on platforms such as .NET, Java, and SQL.
  • Worked within the Agile development lifecycle on cross-functional teams; strong experience in the Software Development Life Cycle and testing methodologies from project definition to post-deployment documentation.
  • Developed scripts to meet load-testing requirements according to the agreed Service Level Agreement (SLA).
  • Excellent working knowledge in developing and implementing complex Test Plans, Test Cases, and Test Scripts using automated test solutions for client/server and web-based applications.
  • Enhanced Vuser scripts in LoadRunner with parameterization, correlation, transaction points, and checkpoints.
  • Very strong in custom coding, error handling, and file handling.
  • Excellent knowledge of software development and testing life cycles, SQA methodology, test process documentation, and end-user training.
  • Developed Vuser scripts using the Web (HTTP/HTML), Ajax TruClient, Citrix, and Web Services protocols.
  • Experience in understanding business processes from the requirements.
  • Knowledge of manual and automated testing of applications developed in Windows environments.
  • Extensive experience with load, stress, and performance testing using LoadRunner, including development of VuGen test scripts.
  • Expert in finding performance bottlenecks on both the client side and the server side.
  • Analyzed CPU utilization, memory usage, garbage collection, and DB connections to verify application performance.
  • Developed a VuGen script formatting tool in Python.
  • Developed a WebLogic log parser in Python (a sketch of this kind of parser follows this list).
  • Good knowledge of the JVM, garbage collection, heap dumps, and thread dumps.
  • Hands-on experience creating a RESTful stub service to keep performance-test requests from reaching production.
  • Good knowledge of Python.
  • Knowledge of JavaScript and AngularJS.
  • Proficient with Linux commands to stop and start services, change service property files, and delete log files after performance tests, using PuTTY and WinSCP.
  • Good experience logging in to servers with WinSCP to review logs and change property files.
  • Good knowledge of Core Java.
  • Good experience working with SQL Developer and writing SQL queries.
  • Good experience with monitoring tools such as HP SiteScope, HP Diagnostics, Dynatrace, Wily, Foglight, and Perfmon.
  • Hands-on experience configuring Dynatrace for IIS servers and installing .NET and IIS agents on servers.
  • Expert in analyzing results using the HP LoadRunner Analysis tool, including Oracle database connections, sessions, and log files.
  • Good knowledge of JMeter.
  • Expertise in writing reusable, modular scripts for automation testing of business applications in domains such as banking and finance.
  • Comfortable with industry-leading operating systems (Windows NT/95/98/2000/XP/Vista/7 and UNIX).
  • Excellent interpersonal abilities; a self-starter with strong communication, presentation, problem-solving, analytical, and leadership skills.
  • Experience coordinating onshore and offshore resources.
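
The WebLogic log parser mentioned above was an internal helper; below is a minimal sketch of that kind of parser, assuming the usual WebLogic server-log layout in which each record starts with "####<" and carries its fields in angle brackets. The file name, severity filter, and field positions here are assumptions, not the original tool.

    import re
    import sys
    from collections import Counter

    # Each WebLogic server-log record starts with "####<" and holds its fields
    # in angle brackets: timestamp, severity, subsystem, machine, ..., message.
    RECORD = re.compile(r"^####<(?P<fields>.*)>\s*$")

    def parse_record(line):
        """Split one '####<...>' record into its angle-bracketed fields."""
        match = RECORD.match(line)
        if not match:
            return None  # continuation line of a multi-line message
        return [field.strip() for field in match.group("fields").split("> <")]

    def summarize(path):
        """Count records per severity and echo Error/Warning entries."""
        severities = Counter()
        with open(path, encoding="utf-8", errors="replace") as log:
            for line in log:
                fields = parse_record(line)
                if not fields or len(fields) < 3:
                    continue
                timestamp, severity, subsystem = fields[0], fields[1], fields[2]
                severities[severity] += 1
                if severity in ("Error", "Warning"):
                    print(f"{timestamp} [{severity}] {subsystem}: {fields[-1][:120]}")
        return severities

    if __name__ == "__main__":
        print(summarize(sys.argv[1]))  # e.g. python weblogic_parser.py server.log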

TECHNICAL SKILLS:

Testing Tools: HP LoadRunner, JMeter, SoapUI, Quality Center, Performance Center.

Languages: Core Java, C, C++.

Databases: MySQL, DB2, Oracle, SQL Server.

Web Technologies: HTML, VBScript, JavaScript.

Environment: Windows 7/2003 Server/95/98/NT/2000/XP, UNIX, SoapUI.

Communication: MS Outlook 2003, MS Office.

Methodologies: Waterfall, Agile (Scrum).

PL/SQL: Oracle PL/SQL.

Browsers: IE 8/9/10/11, Firefox, Chrome, Safari.

WORK EXPERIENCE:

Confidential, Wilmington, DE

Performance Engineer

Responsibilities:

  • Prepared the Test Plan and Test Specifications based on Functional Requirement Specifications and System Design Specifications.
  • Responsible for creating LoadRunner test scripts using various protocols, including Web HTTP/HTML, Web Services, and Citrix.
  • Planned and generated Vuser scripts with VuGen and enhanced them with correlation, parameterization, and functions.
  • Working with Dynatrace to capture response times at the server level.
  • Working on Linux and Windows servers.
  • Using NMON to collect server metrics, since Wily was consuming too many server resources.
  • Using the vCOps monitoring tool to monitor VMs.
  • Expert in developing workload models for performance testing.
  • Parameterized large and complex test data to accurately depict production trends.
  • Scheduled scenarios using the LoadRunner Controller and analyzed the results in the Analysis tool.
  • Monitored the servers and logged the metrics using the monitoring tools.
  • Identified disk usage, CPU, and memory for web and database servers, and how the servers were being loaded, using SiteScope.
  • Worked with the DBAs to ensure the databases were re-pointed to the original environments once the load-test environment was released.
  • Using JMeter as projects are migrated from LoadRunner to JMeter.
  • Good hands-on experience with JMeter.
  • Knowledge of BlazeMeter.
  • Created test cases based on the requirements and test conditions in Mercury Quality Center and identified test data to match the requirements.
  • Executed SQL queries for backend testing of the application to ensure business rules are enforced and data integrity is maintained.
  • Performed usability and navigation testing of web pages and forms.
  • Analyzed cross-result and cross-scenario comparisons, overlay graphs, and merged graphs.
  • Developed a couple of automation tools in Python for repeated tasks, e.g., starting and stopping NMON (see the sketch after this list).
  • Responsible for getting the database rolled back after the load tests were completed.
  • Independently executed the test scenarios and analyzed execution statistics by monitoring the online graphs.
  • Coordinated with technical teams to monitor database queries, CPU utilization, and memory.
  • Worked closely with the Development team in the performance tuning efforts of the various subsystems.
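
The NMON start/stop helper mentioned above was a small internal Python tool; below is a minimal sketch of the idea, assuming password-less SSH to the monitored hosts and nmon installed on them. The host names, capture interval, and output directory are placeholders.

    import subprocess

    HOSTS = ["app01.example.com", "db01.example.com"]   # placeholder hosts
    NMON_DIR = "/tmp/nmon"                              # placeholder output directory

    def ssh(host, command):
        """Run a command on a remote host over ssh and return its stdout."""
        result = subprocess.run(
            ["ssh", host, command],
            capture_output=True, text=True, check=True, timeout=60,
        )
        return result.stdout.strip()

    def start_nmon(interval_sec=30, snapshots=240):
        """Kick off an nmon capture on every host before the load test."""
        for host in HOSTS:
            ssh(host, f"mkdir -p {NMON_DIR} && "
                      f"nmon -f -s {interval_sec} -c {snapshots} -m {NMON_DIR}")
            print(f"nmon started on {host}")

    def stop_nmon():
        """Stop nmon on every host; capture files are left in NMON_DIR."""
        for host in HOSTS:
            # SIGUSR2 tells nmon to close its output file cleanly.
            ssh(host, "pkill -USR2 nmon || true")
            print(f"nmon stopped on {host}")

    if __name__ == "__main__":
        start_nmon()
        input("Run the load test, then press Enter to stop nmon... ")
        stop_nmon()
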
Confidential, Charlotte, NC

Performance Test Lead

Responsibilities:

  • Prepared the Test Plan and Test Specifications based on Functional Requirement Specifications and System Design Specifications.
  • Responsible for creating LoadRunner test scripts using various protocols, including Web HTTP/HTML, Web Services, and Citrix.
  • Planned and generated Vuser scripts with VuGen and enhanced them with correlation, parameterization, and functions.
  • Worked with Dynatrace to capture response times at the server level.
  • Expert in developing workload models for performance testing (a sizing sketch follows this list).
  • Worked on JMeter for a proof of concept.
  • Worked on Selenium.
  • Parameterized large and complex test data to accurately depict production trends.
  • Scheduled scenarios using the LoadRunner Controller and analyzed the results in the Analysis tool.
  • Monitored the servers and logged the metrics using the monitoring tools.
  • Identified disk usage, CPU, and memory for web and database servers, and how the servers were being loaded, using SiteScope.
  • Worked with the DBAs to ensure the databases were re-pointed to the original environments once the load-test environment was released.
  • Created test cases based on the requirements and test conditions in Mercury Quality Center and identified test data to match the requirements.
  • Executed SQL queries for backend testing of the application to ensure business rules were enforced and data integrity was maintained.
  • Performed usability and navigation testing of web pages and forms.
  • Analyzed cross-result and cross-scenario comparisons, overlay graphs, and merged graphs.
  • Responsible for getting the database rolled back after the load tests were completed.
  • Independently executed the test scenarios and analyzed execution statistics by monitoring the online graphs.
  • Coordinated with technical teams to monitor database queries, CPU utilization, and memory.
  • Worked closely with the Development team in the performance tuning efforts of the various subsystems.
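
Workload modeling here largely meant turning a target transaction volume into Vuser counts and pacing; the sketch below shows the arithmetic, using Little's Law (concurrency = arrival rate x time per iteration). The script names, target volumes, and headroom factor are illustrative, not figures from the engagement.

    import math

    SCRIPTS = {
        # script name: (target transactions per hour, seconds per iteration incl. think time)
        "login_and_search": (6000, 45),
        "submit_payment":   (1200, 90),
    }

    def size_script(target_per_hour, iteration_sec, headroom=1.2):
        """Return (Vusers, pacing seconds) needed to sustain the target rate."""
        tps = target_per_hour / 3600.0
        # Little's Law: concurrent users = arrival rate x time each user spends per pass.
        vusers = math.ceil(tps * iteration_sec * headroom)
        # Pacing: how often each Vuser must start an iteration to hold the rate.
        pacing_sec = vusers / tps
        return vusers, pacing_sec

    for name, (per_hour, iter_sec) in SCRIPTS.items():
        vusers, pacing = size_script(per_hour, iter_sec)
        print(f"{name}: {vusers} Vusers, pacing ~{pacing:.0f}s per iteration")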

Environment: LoadRunner 12.02, Performance Center 12.02, Dynatrace, Java, SQL, Mercury QC.

Confidential, Philadelphia, PA

Senior Performance Engineer

Responsibilities:

  • Prepared the Test Plan and Test Specifications based on Functional Requirement Specifications and System Design Specifications.
  • Responsible for creating LoadRunner test scripts using various protocols, including Web HTTP/HTML, Web Services, and Citrix.
  • Planned and generated Vuser scripts with VuGen and enhanced them with correlation, parameterization, and functions.
  • Worked with Dynatrace to capture response times at the server level.
  • Took on the challenge of creating a RESTful stub service to keep performance-test requests from reaching production (a sketch follows this list).
  • Used Linux commands to stop and start services, change service property files, and delete log files after performance tests, using PuTTY and WinSCP.
  • Hands-on experience executing SQL queries.
  • Expert in developing workload models for performance testing.
  • Good experience using JIRA.
  • Good experience using Graylog.
  • Worked with HDFS, Kafka, and ODS services.
  • Parameterized large and complex test data to accurately depict production trends.
  • Scheduled scenarios using the LoadRunner Controller and analyzed the results in the Analysis tool.
  • Monitored the servers and logged the metrics using the monitoring tools.
  • Identified disk usage, CPU, and memory for web and database servers, and how the servers were being loaded, using SiteScope.
  • Worked with the DBAs to ensure the databases were re-pointed to the original environments once the load-test environment was released.
  • Created test cases based on the requirements and test conditions in Mercury Quality Center and identified test data to match the requirements.
  • Executed SQL queries for backend testing of the application to ensure business rules were enforced and data integrity was maintained.
  • Performed usability and navigation testing of web pages and forms.
  • Analyzed cross-result and cross-scenario comparisons, overlay graphs, and merged graphs.
  • Responsible for getting the database rolled back after the load tests were completed.
  • Independently executed the test scenarios and analyzed execution statistics by monitoring the online graphs.
  • Coordinated with technical teams to monitor database queries, CPU utilization, and memory.
  • Worked closely with the Development team in the performance tuning efforts of the various subsystems.
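
The RESTful stub mentioned above existed so that downstream calls made during a load test never reached production; below is a minimal sketch of such a stub using Python's standard http.server. The route and canned payload are invented for illustration and would be shaped to match the real downstream contract.

    import json
    from http.server import BaseHTTPRequestHandler, HTTPServer

    # Canned response standing in for the real downstream (production) service.
    CANNED = {"status": "OK", "source": "performance-test stub"}

    class StubHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # Any GET under /api/ returns the canned payload; everything else is 404.
            if self.path.startswith("/api/"):
                body = json.dumps(CANNED).encode()
                self.send_response(200)
                self.send_header("Content-Type", "application/json")
                self.send_header("Content-Length", str(len(body)))
                self.end_headers()
                self.wfile.write(body)
            else:
                self.send_response(404)
                self.end_headers()

        def log_message(self, fmt, *args):
            pass  # keep the stub quiet under load

    if __name__ == "__main__":
        # Point the application's downstream endpoint at this host:port during tests.
        HTTPServer(("0.0.0.0", 8080), StubHandler).serve_forever()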

Environment: LoadRunner 12.02, Performance Center 12.02, Dynatrace, Java, SQL, Mercury QC.

Confidential, Phoenix, AZ

Senior Performance Engineer

Responsibilities:

  • Prepared the Test Plan and Test Specifications based on Functional Requirement Specifications and System Design Specifications.
  • Responsible for creating LoadRunner test scripts using various protocols, including Web HTTP/HTML, Web Services, and Ajax TruClient.
  • Planned and generated Vuser scripts with VuGen and enhanced them with correlation, parameterization, and functions.
  • Worked with Dynatrace to capture response times at the server level.
  • Addressed the challenge of allocating load generators (LGs) for a growing number of users as more workflows were added to the load test.
  • Worked to introduce browser-level monitoring with Dynatrace.
  • Expert in developing workload models for performance testing.
  • Used the ramp-up/ramp-down, rendezvous point, start/end transaction, parameterization, and correlation features of LoadRunner.
  • Maintained the Defect Log, Test Log, Status Report, and Traceability Matrix, which give a clear indication of product quality and stability for UAT.
  • Executed load, stress, and endurance tests simulating processes with more than 1,000 virtual users.
  • Parameterized large and complex test data to accurately depict production trends.
  • Scheduled scenarios using the LoadRunner Controller and analyzed the results in the Analysis tool.
  • Monitored the servers and logged the metrics using the monitoring tools.
  • Identified disk usage, CPU, and memory for web and database servers, and how the servers were being loaded.
  • Worked with the DBAs to ensure the databases were re-pointed to the original environments once the load-test environment was released.
  • Studied application performance and maximum scalability against critical parameters such as number of users, response times, hits per second (HPS), and throughput using LoadRunner (see the analysis sketch after this list).
  • Created test cases based on the requirements and test conditions in Mercury Quality Center and identified test data to match the requirements.
  • Executed SQL queries for backend testing of the application to ensure business rules were enforced and data integrity was maintained.
  • Performed usability and navigation testing of web pages and forms.
  • Analyzed cross-result and cross-scenario comparisons, overlay graphs, and merged graphs.
  • Responsible for getting the database rolled back after the load tests were completed.
  • Independently executed the test scenarios and analyzed execution statistics by monitoring the online graphs.
  • Coordinated with technical teams to monitor database queries, CPU utilization, and memory.
  • Worked closely with the Development team in the performance tuning efforts of the various subsystems.
  • Produced accurate, regular project status reports for senior management to ensure on-time project launch.
  • Actively participated in defect review meetings with the Test Coordinator, Developers, Business Analysts, and Project Managers to report defect status to management.
  • Prepared weekly testing status reports.
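
Response-time and throughput analysis was normally done in the LoadRunner Analysis tool; when raw results were exported to CSV for ad-hoc reporting, a short script like the sketch below summarized them. The column names and file name are assumptions about the export layout, not a fixed format of the Analysis tool.

    import csv
    import math
    import statistics
    from collections import defaultdict

    def load_times(path):
        """Group response times by transaction from an exported results CSV.

        Assumed columns: 'Transaction Name' and 'Response Time' (seconds).
        """
        times = defaultdict(list)
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                times[row["Transaction Name"]].append(float(row["Response Time"]))
        return times

    def report(times, test_duration_sec):
        """Print count, average, 90th percentile, and rate per transaction."""
        for name, values in sorted(times.items()):
            values.sort()
            p90 = values[max(0, math.ceil(0.9 * len(values)) - 1)]
            print(f"{name}: count={len(values)} avg={statistics.mean(values):.2f}s "
                  f"90th={p90:.2f}s rate={len(values) / test_duration_sec:.2f}/s")

    if __name__ == "__main__":
        report(load_times("raw_results.csv"), test_duration_sec=3600)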

Environment: LoadRunner 11.52, Performance Center 11.52, Dynatrace, Java, SQL, Mercury QC.

Confidential

Senior Quality Analyst

Responsibilities:

  • Assisted the team lead in preparing the Test Plan and Test Strategy documents.
  • Responsible for load-testing coordination with the other projects involved in the load-testing activity.
  • Developed scripts in the Web (HTTP/HTML) and Web Services protocols for LoadRunner.
  • Analyzed graphs and reports to determine where performance delays occurred: network or client delays, CPU performance, I/O delays, database locking, or other issues at the database server.
  • Analyzed performance bottlenecks using LoadRunner monitors, Dynatrace, and HP SiteScope.
  • Also involved in Vuser settings for different scenarios and business processes in the Controller, and analyzed graphs to evaluate system performance.
  • Presented results to the team, analyzed the bottlenecks, and followed up on issue resolution.
  • Responsible for scheduling the load tests using HP Performance Center across a variety of load scenario combinations.
  • Executed multi-user performance tests with LoadRunner, using online monitors and real-time output messages.
  • Executed different scenarios for different applications in the Controller and created LoadRunner Analysis reports and graphs.
  • Interacted with developers during testing to identify memory leaks, fix bugs, and optimize server settings at the web and app levels.
  • Analyzed results using the HP LoadRunner Analysis tool, including sessions and log files.
  • Responsible for generating load test results and publishing them to SharePoint.
  • Worked closely with the development team to narrow down defect reproduction cases and scenarios.
  • Responsible for performance monitoring and analysis of response times and memory leaks using throughput graphs.
  • Responsible for configuring and installing the Performance Center infrastructure for executing and scheduling the load tests.
  • Gathered user requirements and designed the Test Plans and Test Scenarios accordingly.
  • Responsible for coordinating new transports to the performance testing environments.
  • Analyzed server resources such as Available Bytes and Process Bytes for memory leaks (a trend-check sketch follows this list).
  • Interacted with developers to report and track bugs using Quality Center.
  • Analyzed, interpreted, and summarized meaningful and relevant results in a complete Performance Test Report.
  • Participated in daily stand-up meetings with management to report day-to-day activities and updates.
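
Memory-leak analysis in this role mostly meant watching whether Available Bytes kept falling, or a process's bytes kept growing, across a long soak test; the sketch below fits a simple least-squares slope to sampled counter values to flag a sustained trend. The sample values and the alert threshold are illustrative.

    def slope(samples):
        """Least-squares slope of (minute, value) pairs, in value units per minute."""
        n = len(samples)
        mean_x = sum(x for x, _ in samples) / n
        mean_y = sum(y for _, y in samples) / n
        num = sum((x - mean_x) * (y - mean_y) for x, y in samples)
        den = sum((x - mean_x) ** 2 for x, _ in samples)
        return num / den if den else 0.0

    # Process private bytes (MB) sampled every 5 minutes during a soak test; illustrative.
    private_bytes_mb = [(0, 512), (5, 530), (10, 548), (15, 566), (20, 585), (25, 603)]

    growth = slope(private_bytes_mb)
    print(f"growth ~{growth:.1f} MB/minute")
    if growth > 1.0:  # threshold is a judgment call per application
        print("memory keeps climbing across the soak test -> possible leak")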

Environment: LoadRunner, Quality Center, Agile, Dynatrace, Performance Center, SQL.

Confidential

Performance Test Engineer

Responsibilities:

  • Involved in developing scripts to check server connectivity (see the sketch after this list).
  • Imported WSDLs and performed unit tests using SoapUI projects.
  • Worked with the business analyst to gather the requirements and SLA details from the client.
  • Developed LoadRunner test scripts according to test specifications and requirements.
  • Involved in the full life cycle of the project, from requirements gathering to transition, using Agile methodology.
  • Worked extensively with HP Diagnostics and HP SiteScope to find performance issues on the server side and at the application code level.
  • Worked on JMeter to prepare test scripts for the GE client.
  • Involved in collecting performance metrics using Perfmon when server access was provided.
  • Involved in retrieving the application logs from IIS servers.
  • Developed and implemented load and stress tests with LoadRunner, presented performance statistics to application teams, and provided recommendations on how and where performance could be improved.
  • Developed and enhanced scripts using LoadRunner VuGen and designed scenarios using Performance Center to generate realistic load on the application under test.
  • Logged defects in JIRA and updated test cases with the respective defect IDs.
  • Coordinated with functional teams to identify the business processes to be performance tested.
  • Inserted transactions and checkpoints into Mercury LoadRunner Web VuGen scripts, and parameterized and correlated the scripts.
  • Monitored and administered hardware capacity to ensure the necessary resources were available for all tests.
  • Customized scripts for error detection and recovery.
  • Worked in a shared environment and tested different applications.
  • Independently executed the Mercury LoadRunner test scenarios and analyzed execution statistics by monitoring the online graphs.
  • Involved in planning and coordination efforts throughout the QA life cycle.
  • Designed tests for benchmark and load testing.
  • Diagnosed and troubleshot web/app server performance issues using LoadRunner J2EE Diagnostics/Deep Diagnostics.
  • Worked with LoadRunner to analyze application performance under varying load and stress conditions.
  • Responsible for generating the LoadRunner Analysis files from the results file produced by the load test and filtering the analysis data to the required durations.
  • Generated detailed test status reports, performance/capacity reports, web trend analysis reports, and graphical charts for upper management.
  • Assisted the QA Lead with administrative tasks such as meeting notes and defect database cleanup.
  • Provided daily/weekly application availability reports to management.
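
The connectivity-check scripts mentioned at the top of this list simply verified that each server in the test environment answered on its service port before a run; below is a minimal sketch. The host and port values are placeholders.

    import socket

    # Placeholder (host, port) pairs for the environment under test.
    ENDPOINTS = [
        ("app01.example.com", 443),
        ("app02.example.com", 443),
        ("db01.example.com", 1521),  # Oracle listener
    ]

    def is_reachable(host, port, timeout=5):
        """Return True if a TCP connection to host:port succeeds within the timeout."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    if __name__ == "__main__":
        for host, port in ENDPOINTS:
            status = "OK" if is_reachable(host, port) else "UNREACHABLE"
            print(f"{host}:{port} {status}")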

Environment: LoadRunner 9.52, HP Performance Center, Layer 7, WebLogic App Server, XML, Web Services, HP Diagnostics, HP SiteScope, Oracle 10g/9i, SharePoint, Windows 2000/XP/Vista.
