Performance Engineer Resume
SUMMARY
- Over 5 years of experience with tools such as JMeter, BlazeMeter, LoadRunner, Postman, and NeoLoad, testing stand-alone, client/server, and web-based applications.
- Exposed to all phases of the Software Development Life Cycle (SDLC) and Software Testing Life Cycle (STLC).
- Hands-on experience with various JMeter protocols such as HTTP, Web Services, JDBC, and Selenium WebDriver.
- Hands-on experience with end-to-end performance testing frameworks in both on-premises and cloud environments; worked on Salesforce, Microsoft Azure, and AWS.
- Hands-on experience with capacity planning for applications under test; performed capacity planning for application, web, and database servers.
- Hands-on experience with the Apache Tomcat and IIS application servers, tuning performance issues such as method-level exceptions.
- Configured JMeter for distributed load tests across multiple test machines using test controllers and test agents, and set up machines to collect test data.
- Hands-on experience using JMeter to conduct performance, stress, and load testing.
- Hands-on experience with Splunk log analysis to debug app-, DB-, and core-tier log files.
- Created dashboards and wrote regex functions in Splunk.
- Hands-on experience with LoadRunner VuGen for developing test scripts.
- In-depth experience with SDLC methodologies such as Agile Scrum and Waterfall.
- Good exposure to creating test scripts and test plans and coordinating with development teams to fix bugs.
- Good understanding of applications built on Java, .NET, MVC architectures, and the SAP platform.
- Experience with Datadog, AppDynamics, New Relic, and AppOptics.
- Experience with Splunk log analysis; created dashboards and visualizations in Splunk.
- Experience with Docker and Kubernetes orchestration in cloud environments.
- Experience with Datadog, AppOptics, and AppDynamics for monitoring cloud applications hosted on AWS.
TECHNICAL SKILLS
Technologies: JMeter, Quality Center/ALM, TestDirector, LoadRunner, JIRA, Azure DevOps, MS Office, Windows 2000/7, Ubuntu, Unix, Linux
Databases: MS SQL Server 2005/2008
Scripting: Automation Scripting.
Operating Systems: Windows Server 2003/2007, Windows XP, Vista, 2000
Project Mgmt. tools: MS Project
Standards/Methodologies: Agile, TDD.
Testing tools: JMeter, LoadRunner, NeoLoad, Postman
PROFESSIONAL EXPERIENCE
Confidential
Performance Engineer
Responsibilities:
- Experience with Baseline, Scalability, Load, Stress and Endurance Testing.
- Hands-on experience developing test scripts in JMeter, including the use of BeanShell processors.
- Hands-on experience developing LoadRunner performance test scripts using the TruClient protocol.
- Experience using BlazeMeter to develop test scripts as .jmx files and later enhancing them in JMeter.
- Used JMeter to create test scripts for the user interfaces of web applications under test.
- Uploaded all JMeter scripts to a GitLab repository for distributed load testing.
- Hands-on experience with Datadog for performance monitoring, using PurePaths and dashlets to capture server-side and client-side performance metrics.
- Experience working on the AWS cloud for performance testing: setting up virtual machines and configuring application servers on the cloud platform.
- Performance-tested microservices in an AWS cloud environment.
- Experience using CloudWatch to monitor client-side and server-side metrics of web applications hosted on AWS.
- Experience tuning performance bottlenecks in Oracle Database, including indexing tables and creating views for performance testing.
- Experience with root-cause analysis using localhost and application server log files to identify performance bottlenecks.
- Hands-on experience with capacity planning for applications under test, including POCs and scalability testing.
- Performed capacity planning for application and database servers based on the production environment.
- Experience using Kubernetes on AWS infrastructure, configuring multiple nodes to run containers from Docker images.
- Experience with JIRA/Confluence for project management and for presenting results and analysis to stakeholders.
- Experience integrating JMeter with Jenkins, creating shell scripts for test execution and analysis.
- Experience working in an Agile test environment for performance testing.
- Created performance testing artifacts such as the performance test strategy, performance test plan, and script design documents.
- Hands-on experience with the Apache Tomcat application server, tuning performance bottlenecks such as method-level exceptions.
- Configured available performance counters in Datadog and AppDynamics and monitored utilization of application and database servers.
- Monitored key performance metrics such as memory, process, network, disk I/O, deadlocks, and SQL locks.
- Experience using Splunk log analysis to identify bottlenecks in core- and web-tier log files.
- Experience writing regex queries in Splunk.
- Experience with performance tuning of web applications: using profilers to identify methods causing high response times and profiling SQL queries causing latency.
- Experience raising performance defects in JIRA.
Environment: JMeter 5.x, LoadRunner/Performance Center 12.55, Unix, NeoLoad, Postman, Oracle 11g, QC/ALM, Azure DevOps, HttpWatch, JIRA, Splunk.
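The JMeter executions and result analysis described above are often scripted. As a minimal sketch in Python, assuming JMeter's default JTL (CSV) output columns, this aggregates response times and errors per transaction label; the sample data and the function name are illustrative, not taken from any actual project:

```python
import csv
import io
import statistics

# Minimal JTL (CSV) sample using JMeter's default column names (illustrative data).
SAMPLE_JTL = """timeStamp,elapsed,label,responseCode,success
1617000000000,120,Login,200,true
1617000000100,340,Login,200,true
1617000000200,95,Search,200,true
1617000000300,510,Search,500,false
"""

def summarize(jtl_text):
    """Aggregate per-label response times and error counts from JTL text."""
    rows = csv.DictReader(io.StringIO(jtl_text))
    buckets = {}
    for row in rows:
        bucket = buckets.setdefault(row["label"], {"elapsed": [], "errors": 0})
        bucket["elapsed"].append(int(row["elapsed"]))
        if row["success"] != "true":
            bucket["errors"] += 1
    # Report average and worst-case latency plus error count per transaction.
    return {
        label: {
            "avg_ms": statistics.mean(b["elapsed"]),
            "max_ms": max(b["elapsed"]),
            "errors": b["errors"],
        }
        for label, b in buckets.items()
    }

print(summarize(SAMPLE_JTL))
```

In practice the same aggregation would read the `.jtl` file produced by a non-GUI run, which is what makes it easy to call from a Jenkins shell step after each execution.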
Confidential
Performance Engineer
Responsibilities:
- Coordinated requirements-gathering sessions with the project's business analysts to understand the applications' non-functional requirements, peak volumes, and performance testing needs.
- Configured the performance test environment in the Azure cloud.
- Experience developing performance test scripts.
- Created performance testing artifacts like Performance test Strategy document, Performance test Plan and Script design document.
- Experience with defect management and change management, ensuring all defects were successfully addressed and logged in Team Tracker, and that all changes were tested before the release cycle.
- Hands-on experience developing JMeter test scripts and executing them in non-GUI mode.
- Experience with BlazeMeter to record workflows, later customizing the .jmx files in JMeter.
- Experience creating virtual test environments in Microsoft Azure for performance testing.
- Worked on developing in-house performance utility tools as Windows Forms applications in .NET.
- Developed performance test scripts based on the script design document and customized them, for example by setting up extraction rules and adding data sources and context parameters.
- Experience testing SOAP and RESTful services with SoapUI across various test scripts and interfaces.
- Used SQL Server 2012 to create tables and views in the database and used them for load test execution.
- Hands-on experience driving root-cause analysis with Datadog and creating PurePath decks for performance counters.
- Used HttpWatch to record the response times of various transactions and reported them to the development team.
- Hands-on experience with AppDynamics for monitoring web applications under test.
- Configured key performance counters in AppDynamics for client-side and server-side metrics.
- Understood and defined the project's performance testing strategy, across releases, by analyzing the project's requirements.
- Hands-on experience with Microsoft Test Manager to create test cases and organize them into test plans and suites.
- Facilitated testing discussions and planning sessions with test leads from the project's other tracks (Customer Experience Management, Communication Services, Release Management, and Automation Testing) to ensure optimal performance testing coverage.
- Configured JMeter across multiple test machines, along with test controllers and test agents, to perform distributed load tests.
- Created and customized scripts for web applications in JMeter and conducted stress tests for performance testing.
- Monitored performance test executions with JMeter and produced descriptive analyses.
- Hands-on experience with the Microsoft Azure cloud.
Environment: JMeter 5.x, LoadRunner/Performance Center 12.55, Unix, NeoLoad, Postman, Oracle 11g, QC/ALM, Azure DevOps, HttpWatch, JIRA, Splunk.
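The Splunk log-analysis and regex work listed in both roles boils down to extracting fields from log lines and rolling them up. A small Python sketch of the same idea, assuming a hypothetical log format; the line layout, pattern, and function name are illustrative and not from any actual system:

```python
import re
from collections import Counter

# Illustrative app-server log lines; the format is an assumption for this sketch.
LOG_LINES = [
    "2021-03-01 10:00:01 ERROR [OrderService] timeout after 5000 ms",
    "2021-03-01 10:00:02 INFO  [OrderService] request completed in 120 ms",
    "2021-03-01 10:00:03 ERROR [PaymentService] timeout after 5000 ms",
]

# Regex similar in spirit to a Splunk field extraction: capture level and service.
PATTERN = re.compile(r"^\S+ \S+ (?P<level>\w+)\s+\[(?P<service>\w+)\]")

def error_counts(lines):
    """Count ERROR entries per service, the kind of rollup a dashboard panel shows."""
    counts = Counter()
    for line in lines:
        match = PATTERN.match(line)
        if match and match.group("level") == "ERROR":
            counts[match.group("service")] += 1
    return dict(counts)

print(error_counts(LOG_LINES))
```

The equivalent Splunk query would do the extraction with `rex` and the rollup with `stats count by`; the point of the sketch is only the extract-then-aggregate pattern behind the dashboards described above.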
