Sr. Performance Engineer Resume
Charlotte, NC
SUMMARY
- 8 years of diversified experience as a Performance Engineer.
- Experience includes requirement analysis, SLA gathering, automation, and quality assurance of client/server and web-based applications.
- Extensive experience designing performance testing strategies, building workload models, and running various types of performance tests.
- Hands-on experience with testing tools such as LoadRunner, JMeter, and BlazeMeter.
- Used Dynatrace, Wily Introscope, AppDynamics, New Relic, and Splunk to monitor server-side metrics and identify potential bottlenecks.
TECHNICAL SKILLS
Languages: C, C++, SQL, Unix Shell, HTML, Java, XML, JavaScript, JSP, Servlets.
Performance & Monitoring tools: LoadRunner, Performance Center, JMeter, HP ALM, Dynatrace, Wily Introscope, SiteScope, Splunk.
Applications/Tools: MS Office, SOA apps, 3-tier apps, Quality Center, SoapUI.
Databases: Oracle, Sybase, Teradata, SQL Server, DB2.
Protocols: Web, Web Services, Citrix, Ajax TruClient, Java, Oracle NCA, MAPI, SMTP, RDP.
Operating Systems: Unix, Windows 2000/2003/XP Pro/Vista, Solaris.
PROFESSIONAL EXPERIENCE
Confidential, Charlotte, NC
Sr. Performance Engineer
Responsibilities:
- Initiated and led meetings to gather performance testing requirements and created the test plan.
- Worked with product owners, project managers, and application development managers to understand their performance testing requirements and execute tests accordingly.
- Improved the performance testing process by reviewing and evaluating existing practices against standard testing guidelines.
- Coordinated with the offshore team to complete the project within deadlines.
- Worked as Performance Test Lead across different partitions of the Imaging & Content Management Platform (ICMP) applications.
- Developed complex JMeter scripts for the ICMP application using Apache JMeter.
- Integrated smoke test runs into Jenkins jobs for continuous performance testing.
- Configured Jenkins with the Performance, Dynatrace, and Introscope plugins to run automated scripts from the Jenkins server.
- Ran short-duration tests every week with 10,000 users.
- Performed stress tests at 125%, 150%, 175%, and 200% of the short-duration test volume.
- Executed 8-hour long-duration tests with 10,000 users and shared results and findings with the offshore team.
- Integrated JMeter with Dynatrace and Splunk by passing header information in the scripts to correlate performance metrics in the APM tool and in log analysis (see the header sketch after this list).
- Collected, prepared, and processed test data; scripted using VuGen; validated test data; and created and validated scenarios for UI and web services.
- Created CCMS scripts in JMeter for the end-to-end user business flow and executed tests with 200 users.
- Identified performance scenarios/interfaces in the application based on the NFRs.
- Finalized volumes for API and UI scripts in scenarios after analyzing production volumes.
- Created scenarios for various test types such as load, stress, and duration/endurance using tools such as Gatling and JMeter.
- Good experience with Splunk, Prometheus, and Grafana for logging and monitoring.
- Developed a shift-left test execution process and Grafana dashboards for multiple teams to help them identify performance issues early in the development cycle.
- Customized Splunk dashboards, visualizations, and configurations using custom Splunk queries.
- Hands-on experience with UNIX/Linux commands, including the vmstat/iostat utilities, searching, traversing directories, and editing files.
- Experience tuning J2EE-compliant application servers such as Tomcat in Linux environments.
- Analyzed and monitored logs and graphs for client-side and server-side metrics such as transaction response time, hits per second, pages downloaded per second, throughput, memory and CPU utilization, and available pods using Grafana and Dynatrace.
- Analyzed test results and prepared various test reports using Dynatrace.
- Configured the JMeter test plan for running the tests and verified that the JMeter scripts worked as expected on different load generator machines.
- Experience normalizing SQL query results for use in other data analysis tools such as Elasticsearch and Kibana.
- Tested the application together with integrated applications from other teams, including core Confidential, at the same time to verify performance impacts.
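The header sketch referenced in the list above is shown below. It is a minimal example in Java syntax for a JSR223/BeanShell PreProcessor; the header name, key/value layout, and test-run name follow the common Dynatrace request-tagging convention and are assumptions rather than details of the actual project, and the script presumes an HTTP Header Manager is attached to the sampler.

```java
// JSR223/BeanShell PreProcessor sketch: tag each outgoing request so that
// load-test traffic can be correlated with server-side traces in the APM tool
// and matched against log entries. Header name and value format are assumptions.
import org.apache.jmeter.protocol.http.control.Header;

String testName = "ICMP_Smoke";              // hypothetical test-run name
String transaction = sampler.getName();      // current sampler/transaction label
String tag = "LTN=" + testName
        + ";LSN=" + transaction
        + ";VU=" + ctx.getThreadNum()        // virtual user number
        + ";SI=JMeter";

// Assumes an HTTP Header Manager is already attached to this sampler.
sampler.getHeaderManager().add(new Header("x-dynatrace-test", tag));
```

The same tag value can also be searched for in Splunk so that server-side log entries line up with individual virtual users and transactions.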
Confidential
Performance Test Lead
Responsibilities:
- Initiated and led meetings to gather performance testing requirements and created the test plan.
- Organized weekly status meetings with the performance team to keep project efforts on track.
- Provided per-project estimates of team members' work effort.
- Used LoadRunner to build scripts for web and web services.
- Worked on performance testing of Digital Experience, a web-based application.
- Digital Experience is the main application for Allianz Life, used by different user roles to perform day-to-day activities.
- Hands-on experience with BlazeMeter; converted recordings to .jmx format.
- Created scripts using the MAPI and SMTP protocols to retrieve passcodes from the Outlook mailbox for users' multi-factor authentication.
- Performed native mobile application performance testing using LoadRunner.
- Created scripts for both native and hybrid mobile apps using VuGen Mobile TruClient and mobile protocols.
- Designed and developed test tools/scripts supporting multiple distributed platforms (Windows, Linux).
- Collaborated with development and support teams to support a continuous delivery environment using Docker as the build platform.
- Used Kubernetes to orchestrate the deployment, scaling, and management of Docker containers.
- Worked on mobile apps; created and executed scripts for native mobile apps.
- Analyzed results, reports, and charts to review response times of individual transactions for web applications (HTTP/HTML), Flex, mobile, web services, and the Ajax TruClient protocol.
- Developed JMeter scripts for the Front Office application using Apache JMeter.
- Developed JMeter scripts for the IBM BPM application.
- Tested Front Office with a load of 1,000 users.
- Developed scripts to test stored procedures and compare results against previous releases.
- Developed a framework to test stored procedures from the Oracle client and PL/SQL database (see the JDBC sketch after this list).
- Created TruClient Firefox scripts for testing ServiceNow, an enterprise cloud platform.
- Used Apache JMeter to create test scripts and BlazeMeter to run load, stress, and endurance tests in the AWS cloud.
- Developed LoadRunner scripts in the Web, Web Services, and TruClient protocols and executed load tests in LoadRunner Cloud.
- Created Salesforce Lightning framework scripts using Fiddler and converted them to LoadRunner.
- Integrated LoadRunner scripts with Dynatrace and added Dynatrace headers.
- Created 118 interactive digital workflows encompassing various users and scenarios.
- Created scripts using the TruClient IE protocol for Dynatrace UEM testing and compared them with transactions of the web interactive application.
- Created POCs for open-source tools such as Gatling that fit performance modeling and monitoring, performance testing and tuning, and capacity planning.
- Performed SQL query tuning and provided tuning recommendations for time- and CPU-consuming queries.
- Executed Exasol tests and compared them with SQL stored procedure transactions to tune the database.
- Executed baseline, performance, load, volume, stress, endurance, and capacity-planning tests with the LoadRunner Controller and Performance Center.
- Created Dynatrace dashboards and alerts for business-critical transactions.
- Analyzed response time hotspots for business-critical transactions.
- Tuned the number of full GCs and their CPU spikes under high-memory conditions by increasing heap size, thereby eliminating JVM abnormalities.
- Worked with members of the development, infrastructure, and database teams to determine essential performance counters to capture, and set up hardware and software monitoring across all tiers using Dynatrace.
- Monitored degraded transactions by drilling down into their PurePaths.
- Monitored the application using Dynatrace Real User Monitoring; created dashboards and alerts in both production and staging environments.
- Worked with the production monitoring team to set up complete end-to-end dashboards using Dynatrace.
- Provided developers and the team with analysis of degraded transactions, including execution times of methods and calls.
- Created performance testing pipelines using CI/CD tools such as Azure DevOps and Jenkins.
- Customized JMeter scripts in Java using the BeanShell preprocessor.
- Worked extensively with web services and REST API testing.
- Introduced rendezvous points in the scripts to stress the application on specific transactions.
- Worked extensively on fine-tuning .NET and Java applications.
- Tested major monthly releases for the production digital website.
- Performed load tests with 350 users for web and 330 users for SIG Services (web services).
- Used Fiddler for HTTP debugging.
- Used Polarion to raise defects and assign them to the respective teams with complete analysis of degraded transactions.
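The JDBC sketch referenced in the list above is a minimal Java example for timing a single Oracle stored-procedure call so its elapsed time can be compared against a previous release's baseline. The connection URL, credentials, package/procedure name, and parameter are placeholders, and the Oracle JDBC driver is assumed to be on the classpath.

```java
// Sketch: time one Oracle stored-procedure call over JDBC and print the
// elapsed time for comparison with a previous release's baseline.
// URL, credentials, and the procedure name/parameter are placeholders.
import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.DriverManager;

public class StoredProcTimer {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection(
                "jdbc:oracle:thin:@//dbhost:1521/SERVICE", "perf_user", "change_me")) {
            long start = System.nanoTime();
            try (CallableStatement cs =
                         conn.prepareCall("{call ORDER_PKG.GET_ORDER_SUMMARY(?)}")) {
                cs.setInt(1, 12345);   // sample input parameter
                cs.execute();
            }
            long elapsedMs = (System.nanoTime() - start) / 1_000_000;
            System.out.println("Elapsed: " + elapsedMs + " ms");
        }
    }
}
```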
Confidential
Sr. Performance Engineer
Responsibilities:
- Used JMeter to apply heavy load on a server, group of servers, network, or object to test its strength and analyze overall performance under different load types.
- Worked on the Order Capture Engine (OCE) application, the core backend system; OCE is the centralized order-processing engine for all types of orders received.
- Configured Fuse so that incoming orders are stored in a temporary database called IDB, with a second instance handling all the AMQs.
- Worked with the project team and explained the performance test script workflows.
- Configured listeners for different components across the FFPT environment.
- Worked on the Iconic 2017 project, covering iPhone 8 orders for the September release and iPhone X for the October release.
- Recorded test cases using BlazeMeter and exported them to JMeter for performance test execution.
- Worked on API testing and batch performance testing.
- Worked with the product team to design and run extensive capacity, scalability, stability, and stress tests using LoadRunner and WebLOAD.
- Created scripts for web services using REST protocols in JMeter.
- Maintained the performance test environment servers.
- Created a script for CRU bulk-line offline orders, uploading files onto the server.
- Created a script for CRU bulk-line online orders using XML requests from production payloads.
- Monitored enqueue and dequeue counts using JConsole (see the JMX sketch after this list).
- Created scripts for Fed upgrade orders and ATG Inquire orders.
- Extensively monitored the UNIX servers; monitored WebSphere using HP Diagnostics.
- Performed load tests for monthly releases with 250K concurrent users for upgrade orders.
- Monitored server metrics using New Relic.
- Monitored and analyzed the activity and performance reports created using Wily Introscope.
- Monitored memory (heap/non-heap usage, garbage collection graphs), threads (active and idle thread counts per pool, number of active JVM threads), HTTP sessions (active, expired, and rejected session counts), and app server transactions (active, top-level, nested, aborted, and committed transactions).
- Pulled data from the Oracle database to validate whether orders had been processed.
- Used Fiddler for HTTP debugging.
- Worked in an Agile methodology and used Quality Center to raise defects.
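The JMX sketch referenced in the list above shows one way to read the same enqueue/dequeue counters that JConsole displays, so queue drain can be tracked programmatically during a test. The JMX endpoint, broker name, queue name, and the ActiveMQ-style ObjectName pattern are assumptions and vary with broker version and configuration.

```java
// Sketch: read enqueue/dequeue counters over JMX (the same MBean attributes
// JConsole shows) to track queue depth during a test run.
// Host, port, broker name, and queue name below are placeholders.
import javax.management.MBeanServerConnection;
import javax.management.ObjectName;
import javax.management.remote.JMXConnector;
import javax.management.remote.JMXConnectorFactory;
import javax.management.remote.JMXServiceURL;

public class QueueDepthProbe {
    public static void main(String[] args) throws Exception {
        JMXServiceURL url = new JMXServiceURL(
                "service:jmx:rmi:///jndi/rmi://mqhost:1099/jmxrmi");     // placeholder endpoint
        try (JMXConnector connector = JMXConnectorFactory.connect(url)) {
            MBeanServerConnection mbs = connector.getMBeanServerConnection();
            ObjectName queue = new ObjectName(
                    "org.apache.activemq:type=Broker,brokerName=localhost,"
                  + "destinationType=Queue,destinationName=ORDERS.IN");  // placeholder queue
            System.out.println("Enqueued: " + mbs.getAttribute(queue, "EnqueueCount"));
            System.out.println("Dequeued: " + mbs.getAttribute(queue, "DequeueCount"));
            System.out.println("Depth:    " + mbs.getAttribute(queue, "QueueSize"));
        }
    }
}
```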
Confidential - Charlotte, NC
Performance Test Engineer
Responsibilities:
- Initiated and led meetings to gather performance testing requirements and created the test plan.
- Installed and configured JMeter and HttpWatch on the RDP machine.
- Used JMeter to apply heavy load on a server, group of servers, network, or object to test its strength and analyze overall performance under different load types.
- Worked with JMeter to simulate load on the servers and check performance under different load types.
- Good experience working with performance testing tools such as HP LoadRunner, JMeter, Performance Center, and the BlazeMeter cloud load testing tool.
- Customized JMeter scripts in Java using the BeanShell preprocessor.
- Used JMeter to test the performance of both static and dynamic resources, determine the maximum number of concurrent users the website can handle, and provide graphical analysis of performance reports.
- Worked with the offshore team and explained the performance test scripts/workflows.
- Worked on the GTS Sales Source project.
- Created scripts for different user roles such as GTS General User, Contributor, and User Admin.
- Created JMeter scripts for advanced search, downloading various file types, adding documents to the cart, and emptying the cart.
- Created scripts for the Project Build tool and for project actions (edit, download, delete, copy).
- Created JMeter scripts for navigation screens.
- Executed load tests with a maximum load of 5,000 users.
- Configured Jenkins with the Performance, Dynatrace, and Introscope plugins to run automated scripts from the Jenkins server.
- Created the workload model for test script scenarios (see the sketch after this list).
- Worked in an Agile methodology and used Rally to write user stories and raise defects.
- Executed low-volume, peak-load, and aging tests with JMeter and Jenkins.
- Created daily status reports for performance testing and published them to the team.
- Performed manual browser tests using HttpWatch.
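The workload-model sketch referenced in the list above is a worked example of the underlying arithmetic, using Little's Law to derive the virtual-user count and pacing from a target throughput. All numbers are illustrative, not figures from this engagement.

```java
// Illustrative workload-model arithmetic (Little's Law). The inputs are
// example values only; substitute real production volumes and timings.
public class WorkloadModel {
    public static void main(String[] args) {
        double targetTps = 50.0;        // target transactions per second (example)
        double avgResponseSec = 1.2;    // expected response time per transaction (example)
        double thinkTimeSec = 8.0;      // scripted think time per iteration (example)

        // Little's Law: concurrent users = throughput x (response time + think time)
        double users = targetTps * (avgResponseSec + thinkTimeSec);

        // Pacing per iteration so the user count sustains the target rate
        double pacingSec = users / targetTps;

        System.out.printf("Required virtual users: %.0f%n", users);
        System.out.printf("Pacing per iteration:   %.1f s%n", pacingSec);
    }
}
```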