
Lead Performance Engineer Resume

SUMMARY

  • 11 years of experience in performance testing across domains including health and human services, insurance, and financial services.
  • Solid experience in Software Development Life Cycle (SDLC), Software Testing Life Cycle (STLC) in the areas of business requirements, development, testing, deployment and documentation.
  • Experience in performance testing using LoadRunner, JMeter, NeoLoad, and WebLOAD.
  • Strong experience in preparing Performance Test Plans, Performance Test Strategy, Performance Test Analysis Reports.
  • Experienced in setting up Performance Environment, Monitoring Strategy and Configuration as identical to Production.
  • Experienced in Workload Analysis, Load Testing, Performance Monitoring and Tuning of Web, App Servers, and Database Servers.
  • Good knowledge in designing and planning various performance tests (Load, Stress and Endurance), executing test scenarios using performance tools.
  • Experience in Monitoring tools like Microsoft Perfmon.
  • Experience in root cause analysis using tools like Dynatrace and App Dynamics.
  • Experience in load testing tools like LoadRunner, JMeter, and VSTS.
  • Worked with the Web, Web Services, Ajax, and TruClient protocols.
  • Worked with SoapUI, Postman, and Swagger for functional validation of services.
  • Created Pipelines using AWS cloud for performance testing.
  • Solid experience in developing Test Scripts used for Bug Fixes, Enhancements, Functional and Regression testing of software releases according to the product functional specifications and use cases.
  • Experience in applying Testing Methodologies, creating Test Plans, Executing Test Scripts, Automation of Test Cases, Defect Tracking and Report Generation, Requirement Traceability Matrix.
  • Experience writing SQL queries for backend testing using SQL Developer and Toad.
  • Involved in designing required Quality Center (QC) templates, managed the user profiles, configured the defect tracking flow and defined the rules for automatic email generation.
  • Worked with cross - functional teams to execute full system, interface, and end-to-end testing.
  • Excellent team member with problem-solving and trouble-shooting skills.

PROFESSIONAL EXPERIENCE

Confidential

Lead Performance Engineer

Responsibilities:

  • Played a leading role in performance testing: developed the performance test strategy, workload model, performance test plan, and performance test scripts using LoadRunner, implementing protocols such as Web HTTP, Web Services, Oracle 2-Tier, and TruClient.
  • Took a leading role in environment setup and requirements gathering meetings.
  • Used LoadRunner to execute multi-user performance tests, using online monitors, real-time output messages, and other features of the LoadRunner Controller.
  • Performed load, scalability and capacity testing on applications for life insurance and auto insurance quotations.
  • Designed performance tests to closely mimic the auto insurance quote application with multipage flows.
  • Developed test scripts using VuGen, applying customizations such as correlation, parameterization, and other string functions.
  • Configured Performance Test Environment in Azure cloud using Windows Powershell scripts and installing all necessary components on IIS Application server.
  • Scaled up the Apache Tomcat application server configuration based on scalability test results.
  • Hands on experience with database server tuning and monitoring application server, Web Server and database server logs.
  • Configured application and database server along with load generator in cloud Environment.
  • Developed LoadRunner test scripts using protocols such as Web HTTP/HTML, Ajax TruClient, Web Services (XML- and JSON-based), Oracle 2-Tier, and Ajax Click and Script.
  • Developed LoadRunner test script suites to simulate 3,000 transactions per second for a highly scalable claims-processing web application, with the right amount of throughput replicated between data centers.
  • Tested SOAP and RESTful services using SoapUI across various test scripts and interfaces.
  • Customized LoadRunner scripts with string manipulation, including loops.
  • Interacted directly with developers, project managers for the development, execution and reporting of all testing efforts.
  • Utilized Dynatrace for performance monitoring and tuning, producing clearer response-time results and analysis.
  • Monitored code-level and database calls via PurePaths using Dynatrace.
  • Used JIRA for creating user stories for performance testing and respective sub-tasks for current sprint.
  • Reviewed testing artifacts/deliverables (test scripts, LoadRunner schedules, and volumes) prepared by test analysts and testers.
  • Hands-on experience with root cause analysis using Dynatrace and JProfiler, capturing the methods and SQL queries causing high response times.
  • Performed SQL tuning on queries causing high response times and applied indexes to tables to improve response times.
  • Executed endurance tests to check for memory leaks, participated in JVM tuning, and recommended heap settings.
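The workload sizing behind a target like 3,000 transactions per second can be sketched with Little's Law (concurrency = arrival rate × time in system). The function and figures below are purely illustrative, not taken from any engagement described here:

```python
import math

def required_vusers(target_tps: float, response_time_s: float,
                    think_time_s: float, pacing_gap_s: float) -> int:
    """Little's Law: concurrent Vusers = arrival rate * time per iteration."""
    iteration_s = response_time_s + think_time_s + pacing_gap_s
    return math.ceil(target_tps * iteration_s)

# e.g. a 3,000 TPS target with a 2 s response time, 5 s think time,
# and a 3 s pacing gap per iteration:
print(required_vusers(3000, 2, 5, 3))  # 30000 Vusers
```

The same arithmetic works in reverse when sizing scenarios in the Controller: fixing the Vuser count and solving for the pacing gap that holds the arrival rate at the target.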

Environment: Windows Server, SQL Server, TFS, SharePoint, LoadRunner 11.52, Performance Center, Apache Tomcat, Dynatrace 6.3, ANTS Profiler, Fiddler, Workbench, PuTTY, HttpWatch, WiX, IIS, InstallShield, Subversion, Oracle 11g

Confidential

Performance Test Lead

Responsibilities:

  • Responsible for system study & information gathering. Participated in initiative walkthrough group meetings.
  • Worked alongside business lead to identify relevant test data, followed by data preparation for the test cases.
  • Used Quality Center for documenting requirements, viewing and modifying the requirements tree, and converting requirements to tests.
  • Customized the VuGen scripts using Load Runner with Web HTTP/HTML protocol.
  • Performed Load Testing, with creation of scripts, configuration of Controller and Agent machines, setting up Scenarios, execution of Load Tests and Preparation of Load Test Results and Reports Using Load Runner.
  • Developed test scripts in HP LoadRunner VuGen; modified scripts with the required correlations, parameterization, logic, think times, iterations, pacing, logging options, and preferences.
  • Created JMeter pipelines on AWS cloud using a Bitbucket application repository.
  • Hands on experience with App Dynamics for monitoring the client side and server-side metrics.
  • Ran smoke, load, and stress tests with high iteration counts using HP LoadRunner to verify production readiness.
  • Hands on experience tuning Apache Tomcat Web Server based on performance SLA requirements.
  • Created various transactions to capture response times using LoadRunner VuGen.
  • Automated performance test scripts and verified the response time under different load conditions using load runner.
  • Created test scripts with Jmeter and performed test executions and presented performance analysis, response times and its breakdown to the project team and stakeholders.
  • Used controller to launch 1500 concurrent Vusers to generate load using 5 load generators.
  • Used Load Generators to generate load from different locations onto servers.
  • Used Splunk log to monitor log files during and post-performance test.
  • Experience with jstack for thread dump analysis.
  • Documented average response times, 90% response times and reported them to the application team.
  • Identified bottlenecks and performance issues using multi-user test results, online monitors, real-time output messages, and LoadRunner Analysis.
  • Created and analyzed load test results and reported them to the project manager.
  • Used Quality Center for defect management: adding defects, tracking changes, and sending defect e-mail messages.
  • Created PL/SQL and SQL queries to reproduce the data behind the reported metrics.
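The statistics documented above (average and 90th-percentile response times) reduce to a simple nearest-rank computation; the sample data below is made up for illustration:

```python
import math

def percentile(samples, pct):
    """Nearest-rank percentile, the convention many load-test reports use."""
    ordered = sorted(samples)
    rank = math.ceil(pct / 100 * len(ordered))  # nearest-rank index (1-based)
    return ordered[rank - 1]

# hypothetical transaction response times in seconds
response_times_s = [0.8, 1.1, 0.9, 2.4, 1.0, 1.3, 0.7, 3.1, 1.2, 0.95]
avg = sum(response_times_s) / len(response_times_s)
p90 = percentile(response_times_s, 90)  # 2.4 for this sample
print(f"avg={avg:.2f}s p90={p90:.2f}s")
```

Reporting the 90th percentile alongside the average matters because a handful of slow outliers can leave the mean looking healthy while a tenth of users see multi-second pages.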

Environment: LoadRunner/Performance Center, AppDynamics, PuTTY, Unix, jstack, WebSphere, WebLogic, Apache Tomcat, SiteScope, Dynatrace, Toad, Oracle 11g, QC/ALM, JMeter, vmstat, Fiddler, HttpWatch.

Confidential

Performance Engineer

Responsibilities:

  • Coordinated requirements-gathering sessions with the project's business analysts to understand the applications' non-functional requirements, peak volumes, and performance testing needs.
  • Created performance testing artifacts like Performance Test Strategy document, Performance test Plan and Script design document.
  • Configured Microsoft Visual Studio Ultimate in Cloud Environment along with its components of Test Controller and Test Agent.
  • Developed performance test scripts based on the script design document and applied customizations such as setting up extraction rules and adding data sources and context parameters.
  • Hands on experience working with Shunra for monitoring Network Latency and providing optimum solutions for network virtualization.
  • Tested SOAP and RESTful services using SoapUI across various test scripts and interfaces.
  • Utilized SQL Server 2012 to create various tables and views in the database and used them during load test execution of applications.
  • Hands-on experience driving root cause analysis with Dynatrace and creating PurePath decks for performance counters.
  • Used HttpWatch to record the response times of various transactions and reported them to the development team.
  • Hands-on experience with ANTS Profiler for debugging page response times and identifying which methods and SQL calls had high response times.
  • Understood and defined the performance testing strategy for the project, across releases, by analyzing project requirements.
  • Hands-on experience with Microsoft Test Manager to create test cases and organize them into test plans and suites.
  • Facilitate testing discussions and planning sessions with test leads from the other tracks of the project, i.e. Customer Experience Management, Communication Services, Release Management tracks, Automation Testing, to ensure optimal coverage of performance testing.
  • Hands on experience with Microsoft Test Manager (MTM) for project management.
  • Utilized TFS for requirement gathering, tracking defects and scenarios.
  • Utilized TFS project management functions to shape the project team around a user-specified software process, enabling planning and tracking with Microsoft Excel and Microsoft Project.
  • Configured Microsoft Visual Studio across multiple test machines, along with test controllers and test agents, to perform distributed load tests.
  • Created and customized various web application scripts with Microsoft Visual Studio and conducted stress tests for performance testing.
  • Monitored performance test executions with Microsoft Visual Studio and created descriptive analyses with its analyzer component to report response times to the project team and stakeholders.
  • Experience working with Splunk log analysis to capture performance bottlenecks in various tiers of the application; performed method-level tuning to debug the performance variances causing high response times.
  • Experience with Subversion source control for storing and updating various documentation in a timely manner.
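The log-analysis work described above can be sketched as a small scan over access-log lines to surface the slowest endpoints. The log format ("METHOD /path STATUS millis") and the sample entries are hypothetical, standing in for whatever fields the real Splunk queries extracted:

```python
from collections import defaultdict

def slowest_endpoints(lines, top_n=3):
    """Return (path, worst_ms) pairs, slowest first."""
    worst = defaultdict(float)
    for line in lines:
        _method, path, _status, ms = line.split()  # assumed 4-field format
        worst[path] = max(worst[path], float(ms))
    return sorted(worst.items(), key=lambda kv: kv[1], reverse=True)[:top_n]

# made-up log lines for illustration
log = [
    "GET /quote 200 420",
    "POST /claims 200 1850",
    "GET /quote 200 390",
    "GET /policy 500 2300",
]
print(slowest_endpoints(log))
# [('/policy', 2300.0), ('/claims', 1850.0), ('/quote', 420.0)]
```

In practice the same ranking would come from a Splunk search over the application tier's logs; the point of the sketch is only the shape of the analysis: aggregate per endpoint, then sort by worst observed latency.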

Environment: Microsoft Visual Studio Ultimate, JMeter, Dynatrace, Splunk, ANTS Profiler, WebLogic, Shunra, SoapUI, Java, .NET, Report Server, Oracle Database, Cisco F5, SSL, Windows XP, JDK, SQL Navigator, UFT 11.5, Fiddler, HttpWatch.
