Performance Engineer Resume
Cincinnati, OH
PROFESSIONAL SUMMARY:
- 8 years of professional IT experience in performance engineering, with extensive knowledge of end-to-end application performance testing and engineering; certified AWS Solutions Architect with experience designing and supporting migrations from on-premises environments to AWS.
- Strong knowledge of AWS architectures and their analysis through end-to-end performance testing, component testing, and chaos engineering.
- Expertise in the shift-left testing approach, including CI/CD pipelines built with Jules/Jenkins and integrated with BlazeMeter.
- Strong proficiency in HP LoadRunner, with experience across multiple protocols such as Web (HTTP/HTML), Flex, Web Services, and Ajax TruClient.
- Extensive experience using JMeter and BlazeMeter across various protocols.
- Good experience in service virtualization using CA DevTest and WireMock for component testing.
- Good knowledge of using Splunk for server log analysis and presenting component-level performance analysis.
- Strong debugging and performance analysis skills for identifying the root cause of performance issues across the end-to-end application flow of multi-tier architectures.
- Good knowledge of performance testing mobile applications and database stored procedures.
- Good knowledge of scripting and performance testing IBM MQ.
- Worked with various monitoring tools, such as CA Introscope (Wily), Dynatrace Managed, and New Relic, to debug performance bottlenecks at the infrastructure level.
- Expertise in AWS monitoring tools such as Datadog and CloudWatch Logs.
- Good knowledge of performance counters for web and application servers such as Apache Tomcat, WebSphere, and JBoss.
- Extensive experience finding bottlenecks in application servers and databases using thread dumps and heap dumps.
- Good knowledge of developing client-server applications using Java/J2EE technologies.
- Good knowledge of SQL concepts and relational database systems.
- Ability to benchmark application performance in an Agile environment using a test-driven development approach.
- Excellent communication skills; adept at facilitating walkthroughs and review sessions.
TECHNICAL SKILLS:
Testing Tools: HP Performance Center 12.53, LoadRunner, JMeter, BlazeMeter, CA DevTest, WireMock
Monitoring Tools: Splunk, Dynatrace, New Relic, CA Introscope (Wily), Datadog, CloudWatch
Programming Languages: C, Java, VB, shell scripting
Certifications: AWS Solutions Architect - Associate
Databases: MS SQL Server, Oracle 12, IBM DB2
Application Servers: WebSphere, WebLogic, JBoss, Tomcat
Cloud Platforms: AWS
PROFESSIONAL EXPERIENCE:
Confidential (Cincinnati, OH)
Performance Engineer
Responsibilities:
- Gathering performance requirements from application stakeholders and driving fact-based testing by verifying that test workloads match current peak production load, using tools such as Splunk and DCRUM.
- Pulling production metrics from Splunk for all critical business scenarios.
- Setting up environments with service virtualization, using DevTest mocks for downstream components of the multi-tier architecture that have no supporting environment for performance tests.
- Supporting AWS migrations from on-premises environments and performing initial environment validations through shakeout activities and monitoring with Datadog.
- Designing scripts and scenarios for test executions using JMeter and BlazeMeter.
- Designed JMeter scripts to test IBM MQ queues and for stored-procedure performance testing.
- Automating sprint-wise performance tests through a CI/CD pipeline integrated with BlazeMeter scenarios.
- Performing performance analysis of all critical business transactions (CBTs) and identifying the root cause of high response times, such as pinpointing the methods at each tier that impact response times.
- Performing per-tier performance analysis using Splunk and Dynatrace Managed.
- Identifying and reporting bottlenecks during performance tests to application development (AD) teams for resolution and providing tuning recommendations.
- Carrying out integrated and component-level performance testing to evaluate the maximum threshold of each component.
- Analyzing all client and server metrics and presenting them to AD teams, certifying performance with a go/no-go recommendation for production.
- Identifying potential bottlenecks and methods impacting performance using profiling tools such as Dynatrace.
- Performing AWS performance testing and deep-dive analysis of infrastructure-level metrics for each AWS component, such as EC2, EKS, ALB, RDS, and ElastiCache.
- Performing resiliency testing in cloud environments with chaos engineering tools such as Gremlin.
- Supporting all resiliency test scenarios and evaluating application performance during failover.
- Interacting with development teams to propose recommendations for improving application performance.
Environment & Tools: JMeter, BlazeMeter, Dynatrace Managed, Splunk, Apache, Tomcat, IBM DB2, AWS (EC2, EKS, RDS, ElastiCache, ALB), Datadog, CloudWatch, Jules/Jenkins.
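Tools like CA DevTest and WireMock stand in for downstream components during performance tests; conceptually, a stub is just an HTTP endpoint that returns a canned response with realistic latency. A minimal stdlib sketch of that idea (the endpoint, payload, and latency values are illustrative assumptions, not the project's actual stubs):

```python
import json
import threading
import time
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical canned payload and latency for a downstream service
# that has no performance-test environment of its own.
CANNED_RESPONSE = {"status": "OK", "accountId": "TEST-123"}
SIMULATED_LATENCY_S = 0.02  # mimic the downstream's typical response time


class MockDownstreamHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        time.sleep(SIMULATED_LATENCY_S)  # keep end-to-end timings realistic
        body = json.dumps(CANNED_RESPONSE).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging
        pass


def start_mock(port=0):
    """Start the stub on an ephemeral port and return the server object."""
    server = HTTPServer(("127.0.0.1", port), MockDownstreamHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

Real virtualization tools add request matching, stateful scenarios, and per-stub latency profiles on top of this basic pattern.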
Confidential (Denver, CO)
Performance Engineer
Responsibilities:
- Gathering performance requirements from application stakeholders and driving fact-based testing by verifying that test workloads match current peak production load, using tools such as Splunk.
- Pulling production metrics from Splunk for all critical business scenarios.
- Setting up environments with service virtualization, using DevTest mocks for downstream components of the multi-tier architecture that have no supporting environment for performance tests.
- Supporting AWS migrations from on-premises environments and performing initial environment validations through shakeout activities and monitoring with Datadog.
- Designing scripts and scenarios for test executions using JMeter and BlazeMeter.
- Automating sprint-wise performance tests through a CI/CD pipeline integrated with BlazeMeter scenarios.
- Performing performance analysis of all critical business transactions (CBTs) and identifying the root cause of high response times, such as pinpointing the methods at each tier that impact response times.
- Performing per-tier performance analysis using Splunk and Dynatrace Managed.
- Identifying and reporting bottlenecks during performance tests to application development (AD) teams for resolution and providing tuning recommendations.
- Carrying out integrated and component-level performance testing to evaluate the maximum threshold of each component.
- Analyzing all client and server metrics and presenting them to AD teams, certifying performance with a go/no-go recommendation for production.
- Identifying potential bottlenecks and methods impacting performance using profiling tools such as Dynatrace.
- Performing AWS performance testing and deep-dive analysis of infrastructure-level metrics for each AWS component, such as EC2, EKS, ALB, RDS, and ElastiCache.
- Performing resiliency testing in cloud environments with chaos engineering tools such as Gremlin.
- Supporting all resiliency test scenarios and evaluating application performance during failover.
- Interacting with development teams to propose recommendations for improving application performance.
Environment & Tools: JMeter, BlazeMeter, Dynatrace Managed, Splunk, Apache, Tomcat, IBM DB2, AWS (EC2, EKS, RDS, ElastiCache, ALB), Datadog, CloudWatch, Jules/Jenkins.
Confidential (Jacksonville, FL)
Performance Engineer
Responsibilities:
- Gathering performance requirements and identifying critical business scenarios and objectives.
- Creating test plans and strategies for test executions.
- Involved in LoadRunner and JMeter scripting (HTTP/HTML, Flex, and Web Services protocols), load test executions, and server monitoring.
- Analyzing server logs and AWR reports to identify potential bottlenecks in the application server and database.
- Analyzing server-level metrics on the application and database servers to identify infrastructure issues.
- Analyzing thread dumps and heap dumps to identify performance bottlenecks in application code.
- Performing in-depth analysis of client- and server-level metrics impacting performance and raising bugs.
- Participating in defect scrum meetings to discuss defects with the development team.
- Interacting with various development teams and providing daily status reports.
- Tuning systems for optimal performance, characterizing systems on multiple platforms, and recommending configuration changes.
- Defining and configuring SLAs for hits/sec, throughput, transactions per second, etc. in BlazeMeter.
Environment & Tools: Performance Center, JMeter, BlazeMeter, CA Introscope (Wily), SoapUI, Postman, Splunk/Kibana, Oracle, Rally, SolarWinds.
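BlazeMeter evaluates SLA thresholds natively; conceptually, such a check reduces measured samples to throughput and latency percentiles and compares them to targets. A small sketch of that reduction (the SLA dictionary keys and thresholds are hypothetical, not a tool's actual schema):

```python
def percentile(samples, pct):
    """Nearest-rank percentile of a list of response times in milliseconds."""
    ordered = sorted(samples)
    k = max(0, round(pct / 100.0 * len(ordered)) - 1)
    return ordered[k]


def evaluate_slas(samples_ms, duration_s, sla):
    """Compare measured throughput and p90 latency against SLA targets.

    sla is a dict like {"min_tps": 5, "max_p90_ms": 200} (hypothetical keys).
    Returns (measured metrics, pass/fail flag).
    """
    measured = {
        "tps": len(samples_ms) / duration_s,          # transactions per second
        "p90_ms": percentile(samples_ms, 90),         # 90th-percentile latency
        "avg_ms": sum(samples_ms) / len(samples_ms),  # mean latency
    }
    passed = (measured["tps"] >= sla["min_tps"]
              and measured["p90_ms"] <= sla["max_p90_ms"])
    return measured, passed
```

Percentile targets (p90/p95) are usually preferred over averages for SLAs because a small tail of slow transactions barely moves the mean.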
Confidential
Performance Operations Engineer
Responsibilities:
- Analyzed business, functional, and non-functional requirements, scope, and design documents of the application.
- Developed performance test plans and strategies based on business specifications, working closely with application and business teams to gather volume details and SLAs (Service Level Agreements).
- Wrote performance test plans incorporating testing objectives, test environment, user profiles, risks, test scenarios, tooling, schedules, analysis approach, monitoring, and presentation of results.
- Executed tests, reported performance results, and drove the weekly performance status call with stakeholders.
- Reported to executives on progress toward performance KPIs (Key Performance Indicators), trends, and projects.
- Worked on deployment automation for all microservices, pulling images from the private Docker registry and deploying to a Docker Swarm cluster using Ansible.
- Implemented a continuous delivery pipeline with Docker, GitHub, and AWS.
- Worked on microservice deployments on AWS ECS and EC2 instances.
- Created scripts using LoadRunner VuGen and JMeter.
- Conducted performance testing with LoadRunner across entire applications using scenarios designed to reflect real-world usage.
- Used SQL queries to check data integrity and perform back-end testing.
- Involved in troubleshooting server exceptions and log analysis.
- Monitored the performance of application and database servers during test runs using Dynatrace.
- Analyzing thread dumps and heap dumps to identify performance bottlenecks in application code.
- Performing in-depth analysis of client- and server-level metrics impacting performance and raising bugs.
- Identifying and reporting bottlenecks during performance tests to AD teams for resolution and providing tuning recommendations.
- Used ALM extensively for logging and tracking defects.
- Created user stories in CA Rally and linked them to defects in ALM.
- Modified existing LoadRunner scripts to support new builds of the application.
- Analyzed various performance monitors for WebLogic and WebSphere application servers.
- Created and configured standard dashboards to monitor client- and server-level metrics.
- Designed and implemented Jenkins-based solutions for build automation, including code coverage, unit testing, and regression testing, using Dynatrace for build-over-build performance comparisons.
- Actively worked with HP Support to resolve identified LoadRunner issues.
- Involved in the CI/CD (Continuous Integration/Continuous Delivery) pipeline across multiple environments, improving business results through faster, more uniform automated service creation and infrastructure provisioning.
Environment & Tools: LoadRunner, Performance Center, JMeter, BlazeMeter, HP ALM, Dynatrace, Splunk, SoapUI, Web Services, Akamai, Fiddler, Java, UNIX, WebSphere, Jenkins.
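Thread-dump analysis typically begins by counting thread states to spot pools stuck in BLOCKED or WAITING before drilling into individual stacks. A minimal sketch of that first pass over jstack-style output (the sample dump text below is fabricated for illustration):

```python
import re
from collections import Counter

# Match the state line jstack emits for each thread,
# e.g. "   java.lang.Thread.State: BLOCKED (on object monitor)"
STATE_RE = re.compile(r"java\.lang\.Thread\.State:\s+(\w+)")


def thread_state_counts(dump_text):
    """Tally thread states across a full thread dump."""
    return Counter(STATE_RE.findall(dump_text))


# Fabricated three-thread dump: two executor threads blocked on a monitor
# is the classic signature of lock contention in a request pool.
SAMPLE_DUMP = """\
"http-nio-8080-exec-1" #31 daemon prio=5
   java.lang.Thread.State: BLOCKED (on object monitor)
"http-nio-8080-exec-2" #32 daemon prio=5
   java.lang.Thread.State: RUNNABLE
"http-nio-8080-exec-3" #33 daemon prio=5
   java.lang.Thread.State: BLOCKED (on object monitor)
"""
```

Comparing these counts across several dumps taken seconds apart shows whether threads are persistently stuck or just transiently contended.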
Confidential, NY
Performance Engineer
Responsibilities:
- Involved in analyzing requirements from developers and business users.
- Gathered requirements from various teams and worked on test plan and test strategy documents for projects.
- Presented the test plan and test strategy documents to various stakeholders.
- Created the data required for the performance test effort.
- Set up the JMeter tool across various environments.
- Created test scripts for various scenarios using JMeter.
- Performance tested Java applications using JMeter across various protocols.
- Created Web Services test scripts using SoapUI.
- Downloaded and configured various JMeter listeners required for reporting.
- Worked with the system admin team to understand scalability differences between the performance and production environments.
- Scheduled test results review meetings with project teams to walk through reports and discuss identified performance bottlenecks.
- Executed load, stress, and endurance tests for different applications.
- Analyzed, interpreted, and summarized relevant results in a complete performance test report.
- Provided daily status in meetings and shared reports with developers and the business team.
- Worked closely with the offshore team, assigning tasks, providing project context, and analyzing reports.
Environment & Tools: JMeter, CA Introscope (Wily), SoapUI, Quality Center, NMON, Java, JSP, UNIX, WebSphere, Maven, Jenkins, Oracle Enterprise Manager.
Confidential
Performance Engineer
Responsibilities:
- Gathered requirements from project stakeholders and developed the performance test plan.
- Extensive experience using LoadRunner and Performance Center components such as Virtual User Generator (VuGen), Controller, LoadRunner agents, result analysis, and diagnostic tools.
- Developed scripts using Web (HTTP/HTML), Flex, and Web Services protocols in LoadRunner, enhancing them to support error and data handling.
- Created project plans, test plans, and estimations; developed and tracked projects at every phase.
- Monitored the performance of individual JVMs to find memory leaks and optimize JVM heap settings.
- Monitored performance using Windows Performance Monitor and LoadRunner monitors.
- Used SiteScope and CA Introscope to monitor databases, application servers, and web servers for performance bottlenecks while conducting load, stress, volume, and endurance tests.
- Extensively used Dynatrace to diagnose and troubleshoot web/app server performance issues.
- Worked closely with software developers to isolate, track, debug, and troubleshoot defects.
- Identified functionality and performance issues, including deadlock conditions, database connectivity problems, and system crashes under load.
- Verified that new or upgraded applications met specified performance requirements by issuing signoff to project teams after successful completion of performance testing.
Environment & Tools: LoadRunner, JMeter, Performance Center, Dynatrace, CA Introscope, Quality Center, WebSphere, Java, MS SQL.
Confidential
Performance Tester
Responsibilities:
- Responsible for developing test scenarios, test suites, test plans, and test cases for performance testing using LoadRunner and Performance Center.
- Gathered requirements from project stakeholders and developed the performance test plan.
- Developed scripts using Web (HTTP/HTML), Flex, and Web Services protocols in LoadRunner, enhancing them to support error and data handling and to mimic real-world scenarios, including data setup.
- Monitored and analyzed system performance during load tests using CA Introscope and Dynatrace.
- Experienced in configuring and using CA Introscope and Dynatrace for performance monitoring.
- Extensively used Dynatrace to diagnose and troubleshoot web/app server performance issues.
- Worked closely with software developers to isolate, track, debug, and troubleshoot defects.
- Configured web, application, and database server performance monitoring using LoadRunner Controller and HP Diagnostics Server.
- Extensively used performance monitors to analyze system bottlenecks such as memory leaks.
- Identified functionality and performance issues, including deadlock conditions, database connectivity problems, and system crashes under load.
- Monitored CPU, memory, and network utilization on UNIX servers using Introscope monitors.
- Verified that new or upgraded applications met specified performance requirements by issuing signoff to project teams after successful completion of performance testing.
Environment & Tools: LoadRunner, Performance Center, Dynatrace, SiteScope, UNIX, Windows, Java, WebLogic, Oracle, SQL Server.
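A common way to flag a memory leak during an endurance run is to fit a trend line to periodic heap-usage samples: a persistently positive slope after warm-up suggests memory is not being reclaimed. A simple least-squares sketch of that check (the slope threshold is an illustrative assumption, not a monitoring-tool default):

```python
def heap_trend_slope(samples_mb):
    """Least-squares slope (MB per sampling interval) of heap usage over time."""
    n = len(samples_mb)
    mean_x = (n - 1) / 2
    mean_y = sum(samples_mb) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(samples_mb))
    den = sum((x - mean_x) ** 2 for x in range(n))
    return num / den


def looks_like_leak(samples_mb, slope_threshold=1.0):
    """Flag a run whose heap grows faster than the (hypothetical) threshold."""
    return heap_trend_slope(samples_mb) > slope_threshold
```

A steady sawtooth from normal GC cycles produces a near-zero slope, while an unreclaimed allocation pattern drives the slope well above zero; confirming the leak still requires a heap dump of the suspect objects.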
