
Lead Performance Engineer Resume

SUMMARY

  • 8 years of experience in Performance Testing for Human Services, Insurance, Banking, Retail, and Financial clients.
  • Solid experience in Software Development Life Cycle (SDLC), Software Testing Life Cycle (STLC) in the areas of business requirements, development, testing, deployment and documentation.
  • Experience in performance testing using HP LoadRunner and VSTS, with test management in Quality Center, HP ALM, and JIRA.
  • Strong experience in preparing Performance Test Plans, Performance Test Strategies, and Performance Test Analysis Reports.
  • Experienced in setting up performance environments and monitoring strategies configured to match Production.
  • Experienced in workload analysis, load testing, and performance monitoring and tuning of web, application, and database servers.
  • Good knowledge of designing and planning various performance tests (Load, Stress, and Endurance) and executing test scenarios using performance tools.
  • Experience in Monitoring tools like Microsoft Perfmon.
  • Experience in root cause analysis using tools like Dynatrace.
  • Experience in load testing tools such as LoadRunner, JMeter, and VSTS (a representative VuGen sketch follows this list).
  • Worked with Web, Web Services, Ajax, and TruClient protocols.
  • Expertise in Smoke Testing, Black-Box Testing, User Acceptance Testing (UAT), Functional Testing, Positive/Negative Testing, System Testing, and Regression Testing.
  • Solid experience in developing Test Scripts used for Bug Fixes, Enhancements, Functional and Regression testing of software releases according to the product functional specifications and use cases.
  • Experience in applying Testing Methodologies, creating Test Plans, Executing Test Scripts, Automation of Test Cases, Defect Tracking and Report Generation, Requirement Traceability Matrix.
  • Experience writing SQL queries for backend testing using SQL Developer and Toad.
  • Involved in designing required Quality Center (QC) templates, managed the user profiles, configured the defect tracking flow and defined the rules for automatic email generation.
  • Worked with cross-functional teams to execute full system, interface, and end-to-end testing.
  • Excellent team member with problem-solving and troubleshooting skills.
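
As context for the LoadRunner scripting referenced above, the sketch below shows the shape of a minimal VuGen (Web HTTP/HTML) action; the transaction name and URL are illustrative placeholders, not details from any engagement.

    Action()
    {
        // Mark the business step so the Controller/Analysis can report its response time.
        lr_start_transaction("01_Home_Page");

        // A simple request in the Web (HTTP/HTML) protocol; the URL is hypothetical.
        web_url("home",
            "URL=https://app.example.com/",
            "Resource=0",
            "Mode=HTML",
            LAST);

        lr_end_transaction("01_Home_Page", LR_AUTO);

        return 0;
    }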

TECHNICAL SKILLS

Technologies: LoadRunner, Quality Center/ALM, Test Director, JMeter, Bugzilla, IBM ClearQuest, Windows Mobile, Visual Basic and VBScript, HTML, XML, ASP, .NET, MS Office, Windows 2000/7, Ubuntu Linux, T-SQL, PL/SQL Developer, Oracle, TOAD.

Frameworks: .NET Framework 4.5/4.0/3.5/2.0

Databases: MS SQL Server 2005/2008, Oracle.

Scripting: VB Script, Automation Scripting.

Operating Systems: Windows Server 2003/2008/2012 R2/2016, Unix, Linux.

Project Mgmt. tools: MS Project, TFS, JIRA, ALM, SharePoint, MTM.

Standards/Methodologies: CMMI, Agile, TDD.

Version Control: Team Foundation Server, Subversion

Testing tools: NUnit, Microsoft Visual Studio, LoadRunner, JMeter, NeoLoad, QTP, MTM, UFT 11.5

PROFESSIONAL EXPERIENCE

Confidential

Lead Performance Engineer

Responsibilities:

  • Played a leading role in performance testing: developed the performance test strategy, workload model, performance test plan, and performance test scripts using LoadRunner with protocols such as Web HTTP, Web Services, Oracle 2-Tier, and TruClient.
  • Took a leading role in environment setup and requirements gathering meetings.
  • Conducted meetings with Business Analysts to gather the application's non-functional requirements, performance test needs, and load and stress expectations.
  • Gathered the performance requirements and developed an understanding of the application architecture.
  • Used LoadRunner to execute multi-user performance tests, using online monitors, real-time output messages, and other features of the LoadRunner Controller.
  • Performed load, scalability and capacity testing on applications for life insurance and auto insurance quotations.
  • Designed performance tests to closely mimic the auto insurance quote application with multipage flows.
  • Developed test scripts using VuGen, applying customizations such as correlation, parameterization, and other string functions (see the sketch after this list).
  • Configured the performance test environment in the Azure cloud using Windows PowerShell scripts and installed all necessary components on the IIS application server.
  • Developed LoadRunner test scripts using protocols such as Web HTTP/HTML, Ajax TruClient, Web Services (XML- and JSON-based), Oracle 2-Tier, and Ajax Click and Script.
  • Developed LoadRunner test script suites to simulate 3,000 transactions per second against a highly scalable claims-processing web application, with the right amount of throughput replication between data centers.
  • Tested SOAP and RESTful services using SoapUI across various test scripts and interfaces.
  • Customized LoadRunner scripts with string manipulation, including loops.
  • Interacted directly with developers and project managers on the development, execution, and reporting of all testing efforts.
  • Utilized Dynatrace for performance monitoring and tuning, drawing clearer response-time results and analysis.
  • Monitored Garbage collection process and CPU utilization using Dynatrace.
  • Identified, analyzed, and documented defects using Team Foundation Server as the defect tracking system.
  • Reviewed testing artifacts and deliverables (test scripts, LoadRunner schedules, and volumes) prepared by Test Analysts and testers.
  • Performed root cause analysis using Dynatrace and ANTS Profiler, capturing the methods and SQL queries causing high response times.
  • Tuned SQL queries causing high response times and applied table indexing to improve them.
  • Executed endurance tests to check for memory leaks and was involved in JVM tuning, recommending heap settings.
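
A minimal sketch of the correlation and parameterization customizations described above, in VuGen's C scripting; the boundaries, parameter names, and URLs are hypothetical placeholders.

    Action()
    {
        // Correlation: capture a dynamic session token from the next response.
        // The left/right boundaries here are hypothetical.
        web_reg_save_param_ex(
            "ParamName=SessionToken",
            "LB=sessionToken=\"",
            "RB=\"",
            SEARCH_FILTERS,
            "Scope=Body",
            LAST);

        web_url("login_page",
            "URL=https://quotes.example.com/login",
            "Resource=0",
            "Mode=HTML",
            LAST);

        // Parameterization: {Username} is fed from a VuGen data file;
        // the captured {SessionToken} is replayed on the follow-up request.
        web_submit_data("login",
            "Action=https://quotes.example.com/login?token={SessionToken}",
            "Method=POST",
            ITEMDATA,
            "Name=user", "Value={Username}", ENDITEM,
            LAST);

        // String-function customization: log the captured value for debugging.
        lr_output_message("Captured token: %s", lr_eval_string("{SessionToken}"));

        return 0;
    }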

Environment: Windows Server, SQL Server, TFS, SharePoint, LoadRunner 11.52, Performance Center, Dynatrace, ANTS Profiler, Fiddler, Workbench, PuTTY, HttpWatch, WiX, IIS, InstallShield, Subversion.

Confidential

Sr. Performance Tester

Responsibilities:

  • Responsible for system study & information gathering. Participated in initiative walkthrough group meetings.
  • Worked alongside the business lead to identify relevant test data, followed by data preparation for the test cases.
  • Used Quality Center to document requirements, view and modify the requirements tree, and convert requirements to tests.
  • Customized VuGen scripts in LoadRunner using the Web HTTP/HTML protocol.
  • Performed load testing: created scripts, configured the Controller and agent machines, set up scenarios, executed load tests, and prepared load test results and reports using LoadRunner.
  • Developed test scripts in HP LoadRunner VuGen, modifying scripts with the required correlations, parameterization, logic, think times, iterations, pacing, and logging options and preferences.
  • Created image and text verification checks in Vuser scripts using the LoadRunner Vuser Generator for validation (see the sketch after this list).
  • Ran smoke, load, and stress tests with high iteration counts using HP LoadRunner to confirm production readiness.
  • Created various transactions in LoadRunner VuGen to capture response times.
  • Automated performance test scripts and verified response times under different load conditions using LoadRunner.
  • Designed manual scenarios in the HP LoadRunner Controller, scheduling tests by scenario/group and setting the virtual user initiation period, ramp-up, test duration, and ramp-down times.
  • Executed multi-user performance tests with the LoadRunner Controller, using online monitors, real-time output messages, and other Controller features.
  • Created test scripts with JMeter, executed tests, and presented the performance analysis and response-time breakdowns to the project team and stakeholders.
  • Used the Controller to launch 1,500 concurrent Vusers across 5 load generators.
  • Used Load Generators to generate load from different locations onto servers.
  • Analyzed average CPU usage, response time, and TPS for each scenario by creating graphs and reports with the LoadRunner Analysis tool.
  • Watched for failures/errors and monitored metrics (Transaction Response Times, Running Vusers, Hits per Second, and Windows Resources graphs) during tests.
  • Documented average and 90th-percentile response times and reported them to the application team.
  • Identified bottlenecks and performance issues using multi-user test results, online monitors, real-time output messages, and the LoadRunner Analysis tool.
  • Created and analyzed load test results and reported them to the Project Manager.
  • Used Quality Center for defect management: adding defects, tracking changes, and sending defect email notifications.
  • Created PL/SQL and SQL queries to reproduce the metrics data.
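
A minimal sketch of the transaction, think-time, and text-verification customizations described above; the transaction name, URL, and check text are hypothetical.

    Action()
    {
        // Text verification: fail the step if the confirmation text is missing.
        web_reg_find("Text=Order Confirmed", "Fail=NotFound", LAST);

        // Transaction markers let the Controller/Analysis report this step's response time.
        lr_start_transaction("02_Submit_Order");

        web_url("submit_order",
            "URL=https://shop.example.com/order/submit",
            "Resource=0",
            "Mode=HTML",
            LAST);

        // LR_AUTO passes or fails the transaction based on the request and the check above.
        lr_end_transaction("02_Submit_Order", LR_AUTO);

        // Think time emulates user pauses; Run-Time Settings control its replay policy.
        lr_think_time(5);

        return 0;
    }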

Environment: LoadRunner/Performance Center, Unix, WebLOAD, WebSphere, WebLogic, SiteScope, Dynatrace, Toad, Oracle 11g, QC/ALM, JMeter, vmstat, Fiddler, HttpWatch.

Confidential, Mount Laurel, New Jersey

Sr. Performance Tester

Responsibilities:

  • Coordinated requirements-gathering sessions with the project's Business Analysts to understand the applications' non-functional requirements, peak volumes, and performance testing needs.
  • Created performance testing artifacts such as the Performance Test Strategy document, Performance Test Plan, and Script Design document.
  • Configured Microsoft Visual Studio Ultimate in Cloud Environment along with its components of Test Controller and Test Agent.
  • Developed performance test scripts based on the script design document, applying customizations such as extraction rules, data sources, and context parameters.
  • Hands-on experience with Shunra for monitoring network latency and providing optimal network virtualization solutions.
  • Tested SOAP and RESTful services using SoapUI across various test scripts and interfaces (see the sketch after this list).
  • Used SQL Server 2012 to create tables and views in the database and used them for load test execution of applications.
  • Drove root cause analysis with Dynatrace, creating PurePath decks for performance counters.
  • Used HttpWatch to record the response times of various transactions and reported them to the development team.
  • Used ANTS Profiler to debug page response times and identify which methods and SQL calls had high response times.
  • Understood and defined the performance testing strategy for the project, across releases, by analyzing the project requirements.
  • Hands-on experience with Microsoft Test Manager (MTM) to create test cases, organize them into test plans and suites, and support project management.
  • Facilitated testing discussions and planning sessions with test leads from the project's other tracks (Customer Experience Management, Communication Services, Release Management, and Automation Testing) to ensure optimal performance testing coverage.
  • Utilized TFS for requirements gathering and for tracking defects and scenarios.
  • Utilized TFS project management features to organize the project team around a user-specified software process and to enable planning and tracking with Microsoft Excel and Microsoft Project.
  • Configured Microsoft Visual Studio across multiple test machines, along with Test Controllers and Test Agents, to perform distributed load tests.
  • Created and customized web application test scripts in Microsoft Visual Studio and conducted stress tests for performance testing.
  • Monitored performance test executions in Microsoft Visual Studio and used its analysis component to report response times to the project team and stakeholders.
  • Used Splunk log analysis to capture performance bottlenecks across application tiers; performed method-level tuning to debug the variances causing high response times.
  • Used Subversion source control to store and update documentation in a timely manner.
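
For illustration of the JSON service checks mentioned above: this engagement used SoapUI and Visual Studio for service testing, but the same kind of call, sketched here in the LoadRunner C syntax used elsewhere in this resume, looks like the following; the endpoint, {CustomerId} parameter, and payload are hypothetical.

    Action()
    {
        // Ask the service for JSON responses.
        web_add_header("Accept", "application/json");

        // Send a JSON body and treat the response as data, not HTML resources.
        web_custom_request("get_quote",
            "URL=https://api.example.com/quotes",
            "Method=POST",
            "EncType=application/json",
            "Body={\"customerId\": \"{CustomerId}\", \"product\": \"auto\"}",
            LAST);

        return 0;
    }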

Environment: Microsoft Visual Studio Ultimate, JMeter, Dynatrace, Splunk, ANTS Profiler, WebLogic, Shunra, SoapUI, Java, .NET, Report Server, Oracle Database, Cisco F5, SSL, Windows XP, JDK, SQL Navigator, UFT 11.5, Fiddler, HttpWatch.

Confidential, New York NY

Performance Tester

Responsibilities:

  • Coordinated with business team to get the performance requirements for the Load Testing, Stress Testing and Capacity Planning.
  • Developed Performance Test plan and Test Case Design Document with the input from developers and functional testers.
  • Created automated test scripts with Unified Functional Testing (UFT) and customized them with checkpoints for error handling.
  • Utilized LoadRunner and Performance Center for conducting performance tests.
  • Extensively used LoadRunner's Virtual User Generator to script and customize the performance test harness with the Web protocol.
  • Utilized LoadRunner Controller to execute multiple scenarios.
  • Used test results to provide summary reports, response times, and monitor averages.
  • Provided results by analyzing the average response time, throughput, and hits-per-second graphs.
  • Extensive familiarity with protocols such as Web (HTTP/HTML), Web Services, and Citrix.
  • Parameterized scripts to emulate realistic load.
  • Performed load and stress tests on the application and servers by configuring LoadRunner to simulate hundreds of virtual users, and provided key metrics to management (see the sketch after this list).
  • Configured and used the SiteScope performance monitor to analyze server performance, generating reports on CPU utilization, memory, disk, and other OS metrics.
  • Created test scripts using NeoLoad, executed tests, and presented the analysis and breakdowns to the project team and stakeholders.
  • Involved in conducting stress test and volume tests against the application using LoadRunner.
  • Helped DBAs identify and resolve bottlenecks.
  • Hands-on experience with Microsoft Test Manager to create test cases and organize them into test plans and suites.
  • Collected event logs, IntelliTrace data, video, and other diagnostic data with Microsoft Test Manager during test execution.
  • Used Microsoft Test Manager to record actions, screenshots, and other diagnostic data for inclusion in test results and bug reports.
  • Used Quality Center to invoke the scripts, performed the initial baseline testing, organized all scripts systematically, and generated reports.
  • Responsible for requirements analysis, design, debugging, execution, and report generation for the existing legacy system and the new application.
  • Executed baseline, load, and endurance tests.
  • Analyzed average response times of business-critical transactions.
  • Responsible for creating automated Performance scripts for load testing using LoadRunner.
  • Involved in installing LoadRunner components on multiple desktops.
  • Coordinated with application owners and system administrators to identify bottlenecks and fine-tune the application.
  • Conducted performance analysis meetings with stakeholders, developers, architects, test analysts, and other team members associated with the business.
  • Conducted meetings to discuss response times and their breakdowns, reporting the analysis with graphs and reports.
  • Presented performance analysis graphs and reports collected from various performance tools and discussed bottlenecks such as memory leaks, JVM heap usage, CPU utilization, network time, page refresh time, and page rendering time.
  • Involved in performance tuning through constant engagement with the application development team, test architects, and the networking team.
  • Used HP Diagnostics for high-level performance analysis, tuning, and reporting, and established a framework and standards for performance optimization.
  • Worked with Vendor teams to identify the bottlenecks and performed regression testing to compare results.
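
A minimal sketch of how the hundreds-of-Vusers concurrency described above is typically scripted with a rendezvous point; the rendezvous name and URL are hypothetical, and the rendezvous policy itself is configured in the Controller scenario.

    Action()
    {
        // Rendezvous: hold Vusers here so many of them hit the next step together,
        // producing a true concurrency spike rather than a staggered load.
        lr_rendezvous("submit_spike");

        lr_start_transaction("03_Search");

        web_url("search",
            "URL=https://app.example.com/search?q=policy",
            "Resource=0",
            "Mode=HTML",
            LAST);

        lr_end_transaction("03_Search", LR_AUTO);

        return 0;
    }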

Environment: LoadRunner/Performance Center, Unix, WebLOAD, WebSphere, WebLogic, SiteScope, Dynatrace, NeoLoad, Toad, Oracle 11g, QC/ALM, JMeter, vmstat, Workbench, PuTTY.
