Performance Engineer Resume
Tampa, FL
SUMMARY
- Over 8 years of experience as a Performance Engineer on Web-based and Web services applications, with exposure to diverse business domains
- Experience in Micro Focus/HP LoadRunner, JMeter, Dynatrace, Splunk, and CA APM
- Excellent skills in analyzing business requirements and use cases
- Experience in gathering non-functional requirements such as response-time SLAs, volumetrics, user load, and environment and architecture details
- Worked with Micro Focus/HP LoadRunner scripts: enhanced scripts with functions, parameterized inputs, stored dynamic content through LoadRunner's correlation functions, and created scenarios for concurrent users
- Expertise in recording/coding VuGen scripts using different protocols (Web (HTTP/HTML), Web Services)
- Expert in conducting Load, Stress, Volume, Scalability, and Endurance Testing using LoadRunner and JMeter
- Experience in using Micro Focus Performance Center to create scenarios based on workload models and conduct Load, Stress, Endurance, and other required tests
- Experience using JMeter to design scripts and execute load tests
- Proficient in creating a variety of test scenarios to uncover different kinds of bottlenecks
- Experience and knowledge in monitoring technologies and best practices including creating, configuring and maintaining CA APM, Dynatrace, Splunk monitors, alerts, and reports
- Configured Dynatrace dashboards to identify performance bottlenecks and method hotspots of the application
- Troubleshot performance issues, analyzed bottlenecks, and made recommendations to improve performance
- Identified functionality and performance issues, including deadlock conditions, exception error messages, and system crashes under load during peak hours
- Experience in analyzing test results, merging graphs and preparing test reports
- Experience in JVM and Heap monitoring
- Experience in Thread dump and Heap dump analysis
- Knowledge of profiling tools such as JProfiler and ANTS Profiler
- Good understanding of the Agile Methodologies, Software Development Life Cycle (SDLC) as well as the Software Testing Life Cycle (STLC)
- Experience in using Application Life-cycle Management (ALM)
- Good understanding of SQL, Java test script creation and enhancement, databases, and monitoring and analysis tools and techniques
- Understanding of Web/App/ESB/Database tiers for results analysis, product tuning, and implementation of recommendations
- Knowledge of APM/performance tools in the market and how new tools might help a given client
- Provided tools and frameworks to engineering team to benchmark and measure performance of the product during the development lifecycle
- Provided inputs and assisted with development of continual process improvement
- Involved in testing applications using the Scrum (Agile) methodology
- Experience in Agile tools JIRA and CA Rally
- Experience working in a fast-paced environment while delivering quality work within given timelines; actively involved in training end users
- Strong analytical and problem-solving skills; excellent communication and presentation skills; a self-starter, quick learner, and team player
Technical Skills:
- Test Strategies: Performance Testing, Performance Engineering, Load, Stress, Endurance testing, Thread dump Analysis, Heap Dump Analysis, Code profiling
- Operating Systems: Windows, UNIX, Linux, Solaris, AIX
- Testing Tools: HP LoadRunner, HP Performance Center, Apache JMeter, HP ALM
- Agile: CA Rally, Jira
- Web/Application Servers: MS IIS, Apache Tomcat, WebSphere, WebLogic
- Monitoring Tools: Dynatrace, Splunk, CA APM
- Other Tools: TOAD, SQL Loader
PROFESSIONAL EXPERIENCE
Confidential, Tampa, FL
Performance Engineer
Responsibilities:
- Gathered performance test requirements from the application team based on the performance test request submitted
- Determined business-critical scenarios from production logs and current production load using Splunk
- Worked closely with the development team to identify the performance test needs and its deliverables
- Designed performance test strategies and performance test plans considering the performance test requirements
- Organized the test cases and test plans in ALM
- Designed performance test scripts in LoadRunner, enhanced them to meet the test scenarios, troubleshot run-time errors, and set run-time settings as required
- Created correlations for dynamic values in the scripts based on server responses
- Scheduled and executed the test scripts for performance testing assigning multiple Load generators to meet the test needs
- Designed and executed performance test scenarios in Performance Center
- Collated the execution results and analyzed them
- Monitored performance of the application and database servers during the test run using tools like Dynatrace, CA APM and Splunk
- Performed thread dump and heap dump analysis to troubleshoot performance bottlenecks
- Drilled down on slow-response transactions, HTTP errors, memory spikes, JVM memory, CPU usage, web requests, hosts, application processes, and the database
- Debugged performance issues and shared reports with all stakeholders
- Used defect reporting tools like ALM and Jira
- Supported several applications with JMeter scripting and execution
- Set up thread groups, samplers, listeners, and assertions in Apache JMeter
- Involved in client interactions whenever feedback was needed to improve the testing process; attended weekly project meetings and discussed issues raised according to their priority level
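The correlation work described above is typically done by registering a capture with web_reg_save_param ahead of the request that returns the dynamic value, then replaying the parameter in later requests. A minimal VuGen-style sketch (the boundaries, token name, and URLs are placeholders, not from the actual project; this runs only inside LoadRunner):

```c
Action()
{
    // Capture a dynamic session token from the next server response.
    // LB/RB boundaries are illustrative placeholders.
    web_reg_save_param("SessionToken",
                       "LB=name=\"token\" value=\"",
                       "RB=\"",
                       LAST);

    lr_start_transaction("Login");
    web_url("Login",
            "URL=https://example.com/login",   // placeholder URL
            "Mode=HTML",
            LAST);
    lr_end_transaction("Login", LR_AUTO);

    // Replay the captured value in a later request.
    web_url("Dashboard",
            "URL=https://example.com/dashboard?token={SessionToken}",
            "Mode=HTML",
            LAST);

    return 0;
}
```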
Confidential, Hartford, CT
Performance Engineer
Responsibilities:
- Worked with the Project Manager, Business Analysts, and Developers to understand performance requirements and design the test plan
- Attended weekly project meetings and discussed issues raised according to their priority level
- Responsible for designing of Test Strategy, Test Plans, and Test Cases and Execution of Test Cases and generating Test Reports and Defect reports
- Created scripts in Virtual User Generator to emulate the behavior of real users. Tested the Load using HP Performance Center
- Used Correlation, Parameterization, and Content Check features to validate the script
- Performed data driven tests to ensure unique set of data values are inputted to the application
- Enhanced the script with Checkpoints, Rendezvous Points and Transactions to check the performance of the application and added General Virtual user Functions, Protocol Specific Functions, and Standard C Functions to handle bottlenecks of the application
- Created scenarios based on client requirements; monitored and analyzed application performance and maximum scalability via critical parameters such as number of users, response times, hits per second (HPS), and throughput using LoadRunner Analysis
- Created LoadRunner scenarios using Controller and scheduled the Virtual Users to generate realistic load on the server using LoadRunner (Load generator machine)
- Performed Performance, Stress, and Load Testing on N-Tier & Web Based applications
- Analyzed the App and DB servers resource utilization for any bottlenecks
- Checked Splunk logs for methods taking a long time in particular transactions during the test run period
- Communicated with the development team to resolve defects as needed
- Used monitoring tools like Dynatrace, Splunk and CA APM
- Used JIRA as a bug tracking, issue tracking and project management software
- Performed in-depth failure analysis against thousands of automated test executions
- Responsible for keeping up with the test schedule and interacting with software engineers to ensure clear communications on requirements and defect reports
Confidential
Performance Tester
Responsibilities:
- Involved in preparation of estimation, capacity matrix, testing plan and details, capacity plan, and performance strategy documents; conducted assessments and data modeling using Excel
- Designed and implemented performance test frameworks for improving test efficiency
- Worked on developing performance/load test strategies, detailed test plans, test cases, test data, test coverage, and reporting status
- Worked on recording, scripting, capturing dynamic value, parameterization and execution of the scripts
- Conducted Performance Testing, Load testing, Stress testing and Endurance testing of the application
- Worked on web services REST API testing using JMeter and performed in-depth analysis to isolate points of failure in the application
- Set up warm-up periods for load testing and simulated browser cache during load tests
- Worked on programmatically setting the number of users for load tests
- Created and executed various scripts of the application and identified the response time under the load. Reported the bottlenecks of the application
- Configured TIMEOUT for deployment of load tests to different agents
- Configured Application Performance Analyzer for monitoring system resources and activity
- Assisted in production of testing and capacity certification reports
- Investigated and troubleshot performance problems in a lab environment, including analysis of performance problems in a production environment
- Created Test Schedules and worked closely with clients, reported the performance test results
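The JMeter REST testing above is usually organized as a thread group containing an HTTP sampler plus listeners and assertions. An abridged .jmx-style fragment as a sketch (the endpoint, user counts, and test names are placeholders, and the surrounding jmeterTestPlan/hashTree scaffolding is omitted, so this is not a loadable file on its own):

```xml
<!-- Abridged sketch: jmeterTestPlan/hashTree wrappers omitted -->
<ThreadGroup testname="REST Load - 50 users">
  <stringProp name="ThreadGroup.num_threads">50</stringProp>
  <stringProp name="ThreadGroup.ramp_time">300</stringProp>
  <stringProp name="ThreadGroup.duration">3600</stringProp>
</ThreadGroup>
<HTTPSamplerProxy testname="GET /orders">
  <stringProp name="HTTPSampler.protocol">https</stringProp>
  <stringProp name="HTTPSampler.domain">api.example.com</stringProp>
  <stringProp name="HTTPSampler.path">/orders</stringProp>
  <stringProp name="HTTPSampler.method">GET</stringProp>
</HTTPSamplerProxy>
```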
Confidential
QA Analyst
Responsibilities:
- Analyzed Business Requirement Specifications (BRS) and Functional Specification documents to develop manual test cases
- Involved in writing Master test plan as per QA methodology
- Performed manual testing considering the base line of developed test plan and test cases. Participated in the entire development life cycle from the functionality requirements through deployment & maintenance
- Performed Parameterization using various data to test the application
- Responsible for conducting Regression Testing based on the automated Test Scripts using QTP
- Provided technical and procedural support for User Acceptance Testing (UAT)
- Worked in all phases of Testing cycle, mainly Black Box Testing, Business Functionality Testing, GUI testing, and system testing
- Interpreted code designs, maintained and updated numerous test cases for testing software functionality and data flow designs
- Executed and documented various results from regressive testing of software functionality to support Banking operations, utilizing pre-defined test cases
- Used Quality Center for defect tracking and defect reporting
- Provided documentation on all testing results with detailed root-cause failure analysis
- Managed completion of all tasks to ensure on-time project delivery
