Lead Performance Engineer Resume
Augusta, ME
SUMMARY:
- Over 10 years of experience in performance testing across the Healthcare, Banking, and Insurance domains.
- Experience in the Software Development Life Cycle (SDLC) and Software Testing Life Cycle (STLC).
- Good understanding of collecting business requirements, development, testing, deployment and documentation.
- Hands-on experience with project management tools such as HP ALM, Atlassian JIRA and Microsoft SharePoint.
- Hands-on experience with performance testing tools such as HP LoadRunner, JMeter, NeoLoad and VSTS.
- Hands-on experience using automated tools such as LoadRunner, Performance Center, and Quality Center.
- Expertise in setting up performance test environments on Hyper-V virtual machines configured to be identical to production.
- Good understanding of Java programming, JavaScript, WebSphere and WebLogic.
- Expert in writing and executing test cases, working with various databases, and generating reports.
- Expertise in manual and automated correlation to parameterize dynamically changing values (see the VuGen sketch after this list).
- Experience working in on-premises and cloud-based environments, including Salesforce, Microsoft Azure and AWS.
- Hands-on experience with Dynatrace, New Relic, SiteScope and AppDynamics for triaging performance bottlenecks on the client and server side.
- Expertise in designing and planning stress, load and endurance testing, and executing test scenarios using performance tools.
- Expertise in finding the root cause of performance degradation using Dynatrace, SQL Activity Monitor and the session monitor in SQL Developer.
- Expertise in various LoadRunner protocols, including Web HTTP/HTML, Web Services, Ajax, TruClient, JDBC, SAP Web, SAP GUI and Winsock.
- Expertise in executing the full testing life cycle, from gathering test requirements from stakeholders through planning, development, test execution and defect logging.
- Hands-on experience with Oracle databases and AWR reports for identifying performance bottlenecks: slow-running SQL queries, indexing issues and the top CPU-consuming processes.
- Hands-on experience with Perl scripting for developing performance testing frameworks.
- Hands-on experience with SQL to verify test results in the database using Microsoft SQL Server Management Studio, SQL Developer and Toad.
- Experience with Team Foundation Server, Rally, and JIRA/Confluence.
- Experience working with Parasoft virtualization tools for stubbing downstream services.
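A minimal VuGen (C) sketch of the manual correlation and parameterization pattern referenced above. The URL, boundary strings and parameter names ({User}, {Passwd}, SessionToken) are hypothetical placeholders, not values from any actual engagement:

    /* Capture a dynamic token from one response and replay it in the
       next request; registration must precede the request whose
       response contains the value. */
    Action()
    {
        web_reg_save_param("SessionToken",
                           "LB=name=\"token\" value=\"",  /* left boundary */
                           "RB=\"",                       /* right boundary */
                           "ORD=1",
                           LAST);

        web_url("Login_Page",
                "URL=https://app.example.com/login",
                "Mode=HTML",
                LAST);

        /* {User} and {Passwd} come from a parameter file, so each Vuser
           submits distinct credentials; {SessionToken} replays the
           correlated value captured above. */
        web_submit_data("Login",
                        "Action=https://app.example.com/auth",
                        "Method=POST",
                        ITEMDATA,
                        "Name=username", "Value={User}",         ENDITEM,
                        "Name=password", "Value={Passwd}",       ENDITEM,
                        "Name=token",    "Value={SessionToken}", ENDITEM,
                        LAST);

        return 0;
    }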
PROFESSIONAL WORK EXPERIENCE:
Confidential, Augusta, ME
Lead Performance Engineer
Responsibilities:
- Involved in gathering business requirements, studying the application and collecting information from developers and the business.
- Developed the test plan, which included scope, test strategies, test scenarios, types of tests, test summary reports and test execution metrics.
- Ensured the compatibility of all application platform components, configurations and upgrade levels with production, and made the necessary changes to the lab environment to match production.
- Designed test scenarios to properly load/stress the system in a lab environment and monitored/debugged performance and stability problems.
- Hands-on experience with LoadRunner using the HTTP/HTML, Ajax TruClient and Web Services protocols.
- Developed performance test scripts using the VuGen component of LoadRunner and customized them with custom logic (see the sketch after this list).
- Experience with JMeter for developing performance test scripts using the HTTP and Web Services protocols.
- Used JMeter to execute test runs in GUI and non-GUI mode and integrated it with Maven and Jenkins.
- Experience with Kubernetes for setting up nodes for performance testing in the AWS cloud.
- Responsible for developing and executing baseline, stress, regression and volume performance tests.
- Experience with PuTTY for monitoring the application server, core tier server and database server logs.
- Hands-on experience with Dynatrace for monitoring server-side and client-side metrics.
- Configured dashlets in Dynatrace to monitor performance counters such as CPU utilization, memory utilization, processes, disk I/O, network utilization, deadlocks and SQL locks.
- Configured PurePaths in Dynatrace to monitor client-side metrics such as average transaction response times, throughput, hits per second and other key metrics.
- Experience with New Relic for monitoring performance counters on the app server, core tier and DB server.
- Experience with ANTS Profiler and JProfiler for performance tuning.
- Experience working with the build tools Bitbucket, Maven and Jenkins, and integrated them with JMeter to execute scripts in non-GUI mode.
- Hands-on experience with Oracle databases for DB tuning and resolving performance bottlenecks.
- Hands-on experience with application server tuning on Tomcat and JBoss.
- Experience with JIRA/Confluence for project management and defect management.
- Monitored system resources such as CPU usage, percentage of memory occupied, and vmstat and iostat output.
- Experience with AWR reports for triaging database-related performance bottlenecks.
- Hands-on experience with thread dump and heap dump analysis for debugging performance bottlenecks.
- Analyzed application CPU usage, heap memory, GC activity and threads using the VisualVM Java profiling tool.
- Experience with HttpWatch and Fiddler for validating the request/response of service calls.
- Conducted result analysis and communicated with developers, architects, and managers in the project team.
- Experience with TortoiseSVN for check-ins.
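As a concrete illustration of the custom logic mentioned above, a hedged VuGen (C) sketch of a REST call wrapped in a transaction with an explicit pass/fail check. The endpoint, JSON body, parameter name and check text are hypothetical placeholders:

    Action()
    {
        /* Count occurrences of the expected status marker in the response. */
        web_reg_find("Text=\"status\":\"OK\"",
                     "SaveCount=StatusOkCount",
                     LAST);

        web_add_header("Content-Type", "application/json");

        lr_start_transaction("Submit_Claim");

        web_custom_request("Submit_Claim",
                           "URL=https://api.example.com/claims",
                           "Method=POST",
                           "Body={\"memberId\":\"{MemberId}\"}",
                           LAST);

        /* Custom logic: pass or fail the transaction based on the check,
           and abort the current iteration on failure. */
        if (atoi(lr_eval_string("{StatusOkCount}")) > 0) {
            lr_end_transaction("Submit_Claim", LR_PASS);
        } else {
            lr_end_transaction("Submit_Claim", LR_FAIL);
            lr_exit(LR_EXIT_ITERATION_AND_CONTINUE, LR_FAIL);
        }

        return 0;
    }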
Environment: LoadRunner/Performance Center 12.53, Unix, Dynatrace, New Relic, Apache Tomcat, JBoss, Oracle 11g, QC/ALM, JMeter 5.0, C#, JConsole, JProfiler, HttpWatch, Fiddler, JIRA, Confluence, Bitbucket, Maven, Jenkins, TortoiseSVN, AWR.
Confidential, Jersey City, NJ
Performance Test Lead
Responsibilities:
- Responsible for system study and information gathering; participated in initiative walkthrough group meetings.
- Worked alongside the business lead to identify relevant test data, followed by data preparation for the test cases.
- Used Quality Center to document requirements, view and modify the requirements tree, and convert requirements to tests.
- Customized VuGen scripts in LoadRunner using the Web HTTP/HTML protocol.
- Performed load testing with LoadRunner: created scripts, configured Controller and agent machines, set up scenarios, executed load tests, and prepared load test results and reports.
- Developed test scripts in HP LoadRunner VuGen and modified them with the required correlations, parameterization, logic, think times, iterations, pacing and logging options (see the sketch after this list).
- Created image and text verification checks in Vuser scripts using the LoadRunner Vuser Generator for validation purposes.
- Performance tested SOAP and RESTful services using LoadRunner and JMeter.
- Created transactions in LoadRunner VuGen to capture response times.
- Automated performance test scripts and verified response times under different load conditions using LoadRunner.
- Experience with AppDynamics for monitoring the app tier, core tier and database server logs.
- Experience with AppDynamics for drilling down into methods and SQL queries with high response times.
- Experience with Splunk logs for triaging performance issues and service failures.
- Experience with AppDynamics for monitoring server-side and client-side metrics.
- Executed multi-user performance tests with the LoadRunner Controller, using online monitors, real-time output messages and other Controller features.
- Used the Controller to launch 1,500 concurrent Vusers, generating load through 16 load generators.
- Used load generators in different locations to generate load on the servers.
- Experience with JBoss application servers for tuning performance issues.
- Analyzed average CPU usage, response times and TPS for each scenario by creating graphs and reports with the LoadRunner Analysis tool.
- Watched for failures/errors and monitored metrics (transaction response times, running Vusers, hits per second and the Windows Resources graph) during tests.
- Documented average and 90th-percentile response times and reported them to the application team.
- Experience with Perfmon and Resource Monitor for monitoring performance counters and server-side metrics.
- Identified bottlenecks and performance issues using multi-user test results, online monitors, real-time output messages and the LoadRunner Analysis tool.
- Created and analyzed the load test results and reported them to the project manager.
- Used Quality Center for defect management: adding defects, tracking changes and sending defect e-mail notifications.
- Created SQL and PL/SQL queries against SQL Server to reproduce the data behind the metrics.
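A brief VuGen (C) sketch of the script customizations listed above (text verification check, named transaction, think time); the page URL, parameter name and verification string are hypothetical placeholders:

    Action()
    {
        /* Text check registered before the request; the step fails if
           the confirmation string is absent from the response. */
        web_reg_find("Text=Order Confirmation",
                     "Fail=NotFound",
                     LAST);

        lr_start_transaction("View_Order");

        web_url("View_Order",
                "URL=https://shop.example.com/order/{OrderId}",
                "Mode=HTML",
                LAST);

        /* LR_AUTO: the transaction passes or fails with the check above. */
        lr_end_transaction("View_Order", LR_AUTO);

        /* Emulate user read time; iterations and pacing are configured
           in the run-time settings rather than in code. */
        lr_think_time(5);

        return 0;
    }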
Environment: LoadRunner/Performance Center 11.00, AppDynamics, PuTTY, Microsoft SQL Server, QC/ALM, JMeter 4.0, C#, JConsole, JProfiler, HttpWatch, Fiddler, JIRA, Confluence, Bitbucket, Maven, Jenkins, TortoiseSVN.
Confidential, Morristown, New Jersey
Performance Test Lead
Responsibilities:
- Coordinated requirements-gathering sessions with the business analysts in the project to understand the applications' non-functional requirements, peak volumes and performance testing needs.
- Created performance testing artifacts such as the Performance Test Strategy document, Performance Test Plan and Script Design document.
- Hands-on experience with Microsoft Visual Studio for developing web performance scripts and Coded UI scripts.
- Set up extraction rules for dynamic values during script development in Microsoft Visual Studio.
- Configured Test Agents and Test Controllers in Microsoft Visual Studio to set up distributed load tests.
- Configured a virtual environment of load generators, application servers and database servers for the performance framework in the Microsoft Azure cloud.
- Configured Microsoft Visual Studio Ultimate in the cloud environment along with its Test Controller and Test Agent components.
- Hands-on experience with LoadRunner using the HTTP, Web Services and Ajax TruClient protocols (see the sketch after this list).
- Developed performance test scripts based on the script design document and applied various customizations, such as setting up extraction rules and adding data sources and context parameters.
- Experience with SOAP and RESTful services, using SoapUI for testing the various test scripts and interfaces.
- Utilized SQL Server 2012 to create various tables and views in the database and used them during load test execution.
- Hands-on experience driving root cause analysis with Dynatrace and creating PurePath decks for performance counters.
- Used HttpWatch to record the response times of various transactions and reported them to the development team.
- Hands-on experience with ANTS Profiler for debugging page response times and identifying which methods and SQL calls had high response times.
- Understood and defined the performance testing strategy for the project, across releases, by analyzing the project's requirements.
- Hands-on experience with Microsoft Test Manager to create test cases and organize them into test plans and suites.
- Facilitated testing discussions and planning sessions with test leads from the project's other tracks (Customer Experience Management, Communication Services, Release Management and Automation Testing) to ensure optimal performance test coverage.
- Hands-on experience with Microsoft Test Manager (MTM) for project management.
- Utilized TFS for requirements gathering and for tracking defects and scenarios.
- Utilized TFS project management functions to shape the project team around a user-specifiable software process and to enable planning and tracking with Microsoft Excel and Microsoft Project.
- Configured Microsoft Visual Studio across multiple test machines, along with Test Controllers and Test Agents, to perform distributed load tests.
- Created and customized various web application scripts with Microsoft Visual Studio and conducted stress tests for performance testing.
- Monitored performance test executions in Microsoft Visual Studio and used its analysis component to report response times to the project team and stakeholders.
- Experience with Subversion source control for storing and updating documentation in a timely manner.
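A sketch of a Web Services-style call as it might be scripted in LoadRunner (C), per the protocol list above; the endpoint, SOAPAction, envelope and parameter name are hypothetical placeholders:

    Action()
    {
        web_add_header("Content-Type", "text/xml; charset=utf-8");
        web_add_header("SOAPAction", "\"urn:GetQuote\"");

        lr_start_transaction("WS_GetQuote");

        /* Post a SOAP envelope as a raw HTTP request; {Symbol} is a
           VuGen parameter substituted on each iteration. */
        web_custom_request("WS_GetQuote",
                           "URL=https://svc.example.com/quote",
                           "Method=POST",
                           "Body=<soap:Envelope xmlns:soap=\"http://schemas.xmlsoap.org/soap/envelope/\">"
                                "<soap:Body><GetQuote><symbol>{Symbol}</symbol></GetQuote></soap:Body>"
                                "</soap:Envelope>",
                           LAST);

        lr_end_transaction("WS_GetQuote", LR_AUTO);

        return 0;
    }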
Environment: Microsoft Visual Studio Ultimate 12.0, TAC, LoadRunner/Performance Center, Unix, SiteScope, Microsoft Azure, Dynatrace, SQL Server 2012, C, C++, ANTS Profiler, HttpWatch, Subversion, Team Tracker, TeamCity, Team Foundation Server.
Confidential, Chicago, Illinois
Performance Tester
Responsibilities:
- Coordinated with the business team to gather the performance requirements for load testing, stress testing and capacity planning.
- Developed the Performance Test Plan and Test Case Design document with input from developers and functional testers.
- Created automated test scripts with Unified Functional Testing and customized them with various checkpoints for error handling.
- Utilized LoadRunner and Performance Center for conducting performance tests.
- Extensively used the LoadRunner Virtual User Generator to script and customize the performance test harness with the Web protocol.
- Utilized the LoadRunner Controller to execute multiple scenarios.
- Used the test results to provide summary reports, response times and monitor averages.
- Provided results by analyzing the average response time, throughput and hits-per-second graphs.
- Extensive familiarity with protocols such as Web (HTTP/HTML), Web Services and Citrix.
- Parameterized scripts to emulate realistic load (see the sketch after this list).
- Involved in performing load and stress tests on the application and server by configuring LoadRunner to simulate hundreds of virtual users, and provided key metrics to management.
- Hands-on experience working with Shunra for monitoring network latency and providing optimal network virtualization solutions.
- Experience working on the Salesforce cloud, building a performance testing framework on a virtual environment consisting of load generators, app servers and DB servers.
- Configured and used the SiteScope performance monitor to monitor and analyze server performance, generating reports on CPU utilization, memory, disk and other OS metrics.
- Created test scripts using NeoLoad, executed the tests and presented the analysis and breakdown to the project team and stakeholders.
- Involved in conducting stress and volume tests against the application using LoadRunner.
- Helped DBAs identify and resolve bottlenecks.
- Hands-on experience with Microsoft Test Manager to create test cases and organize them into test plans and suites.
- Collected event logs, IntelliTrace data, video and other diagnostic data with Microsoft Test Manager during test execution.
- Utilized Microsoft Test Manager to record actions, screenshots and other diagnostic data for inclusion in test results and bug reports.
- Used Quality Center to invoke the scripts, performed the initial baseline testing, organized all the scripts systematically and generated reports.
- Responsible for analyzing requirements, and for the design, debugging, execution and report generation for the existing legacy system and the new application.
- Executed baseline, load and endurance tests.
- Analyzed average response times of business-critical transactions.
- Responsible for creating automated performance scripts for load testing using LoadRunner.
- Involved in installing LoadRunner components on multiple desktops.
- Coordinated with the application owner and system administrators to identify bottlenecks and fine-tune the application.
- Conducted performance analysis meetings with stakeholders, developers, architects, test analysts and other team members associated with the business.
- Conducted meetings to discuss response times and their breakdown, reporting the analysis with the help of reports and graphs.
- Presented performance analysis graphs and reports collected from various performance tools and discussed bottlenecks such as memory leaks, JVM heap usage, CPU utilization, network time, page refresh time and page rendering time.
- Involved in performance tuning through constant engagement with the application development team, test architects and the networking team.
- Used HP Diagnostics for high-level performance analysis and established a framework and standards for performance optimization.
- Used AppDynamics for performance tuning, high-level analysis and reporting services.
- Worked with vendor teams to identify bottlenecks and performed regression testing to compare results.
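To make the parameterization item above concrete, a minimal VuGen (C) sketch in which each iteration draws a distinct row from a data file instead of replaying one recorded value; the file, URL and parameter names are hypothetical placeholders:

    Action()
    {
        /* {AccountId} is defined in the Parameter List, backed by a data
           file (e.g. accounts.dat) with "Select next row" set to Unique,
           so concurrent Vusers exercise different accounts. */
        lr_output_message("Iteration uses account: %s",
                          lr_eval_string("{AccountId}"));

        lr_start_transaction("Account_Summary");

        web_url("Account_Summary",
                "URL=https://bank.example.com/accounts/{AccountId}/summary",
                "Mode=HTML",
                LAST);

        lr_end_transaction("Account_Summary", LR_AUTO);

        return 0;
    }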
Environment: LoadRunner/Performance Center, NeoLoad, Shunra, Unix, SiteScope, Microsoft Azure, AppDynamics, SQL Server 2012, Microsoft Visual Studio, C, C#, ANTS Profiler, HttpWatch, Subversion, Team Tracker, TeamCity, Team Foundation Server.