Performance Engineer Resume
Texas
SUMMARY:
- Around 8 years of diversified experience as a Performance Engineer, including requirement analysis and quality assurance of client/server and web-based applications. Extensive experience using automated tools such as HP LoadRunner, Confidential, and Confidential. Experienced in planning performance test strategy and setting up environments for testing applications; a self-starter and motivated team player with excellent communication and interpersonal skills.
- Experienced in performance testing using HP LoadRunner.
- Experienced in monitoring applications and analyzing application logs with Confidential and Confidential.
- Excellent working knowledge of designing and implementing performance test strategies, plans, and automated test solutions for client/server and web applications with LoadRunner.
- Worked extensively with the Web (HTTP/HTML), Web Services, and Java Vuser protocols in LoadRunner.
- Hands-on experience designing, developing, and executing different types of load tests for web applications and mobile iOS/Android applications using the LoadRunner tool.
- Worked in Agile and Waterfall testing methodologies.
- Worked extensively with LoadRunner in writing test scripts, setting up test scenarios, executing tests, and preparing test reports.
- Involved in the end-to-end SDLC.
- Handled performance test environment readiness: code deployment followed by test server/service restarts before the start of validation and test runs.
- Experienced in developing different test scenarios such as single-user/smoke tests, load tests, scalability/stress tests, reliability/endurance tests, and performance regression tests.
- Hands-on experience configuring and using Dynatrace, HP Diagnostics, and AppDynamics to set up performance monitors for monitoring and analyzing server performance during load tests.
TECHNICAL SKILLS:
Testing tools: LoadRunner, Performance Center, SOAP UI
Scripting Languages: Shell script, HTML, XML
User Analytics Reporting Tools: IBM Tealeaf, Confidential, Confidential
Environments: Windows 98/NT/2003/08/12, UNIX, JBoss, Tomcat, WebLogic, and Amazon AWS Cloud
Programming Languages: C, C++, Java, SQL
Databases: MS SQL Server 2008/2012, DB2, Oracle, MySQL
Web Technologies and Scripting: HTML/DHTML, XML, ASP, .NET, JSP, VBScript, JavaScript, Flash, Dynatrace 5.5 & 6.1, VisualVM, AppDynamics, HP Diagnostics, Confidential, Performance Manager tool.
Tools: Office, Excel, PowerPoint, Project, Visio, Outlook, WinSQL, Quality Center, JIRA, Rally
EXPERIENCE:
Performance Engineer
Confidential, Texas
Responsibilities:
- Gathered requirements from business teams and stakeholders, prepared test plans.
- Set up meetings with the Product Owner, business team, architecture team, and development team to prepare the non-functional requirements for the application under test.
- Used Confidential tool queries to pull peak-hour load volumes for the business-critical transactions to be performance tested.
- Prepared the test plan/strategy and shared it with key stakeholders, the product manager, business lead, and development team for sign-off.
- Prepared and documented performance test cases and the test data setup process before the start of testing.
- Developed scripts using the Web (HTTP/HTML) and Web Services protocols in the LoadRunner tool and enhanced them to support error/data handling.
- Evaluated Think Time & Pacing calculations in preparing the scenario designs for testing with various loads ranging from 5 TPS to 600 TPS.
- Test execution included smoke tests, capacity tests, stress tests, endurance tests, and HTTP Watch tests (client-side page analysis).
- Executed single-user tests from the front end to perform page-load analysis for test transactions using the HTTP Watch tool.
- Analyzed front-end metrics (response times, hits, and throughput) obtained from the LoadRunner tool for each executed load test.
- Analyzed Tomcat server metrics - JVM heap usage metrics: used heap after collection, % time spent in GC, young generation usage, old generation usage, perm gen usage; process CPU.
- Analyzed UNIX server CPU and memory usage at the box level using SiteScope and vmstat.
- Prepared the consolidated test report for each executed load test in MS Word, MS Excel, and email format after completing analysis.
- Tracked and tabulated defects pertaining to Key Performance Indicators not meeting historical baseline results/SLAs, and functionalities not working as expected during manual validation after code deployment.
- Supported testing in the production region to replicate production issues (P1 tickets) in the test region, then validated the fix and provided sign-off. The P1 tickets mostly involved the issues listed below:
- Unhealthy heap usage (heap memory not fully reclaimed after GC); the fix involved enabling multithreading on the JRuby single container, which utilized thread-local storage space effectively after invocations and GC.
- Connection terminations with CPU reaching the 85% threshold; the fix involved optimizing the search Java code to avoid duplicate calls hitting the DB.
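As an illustration of the think-time and pacing calculations used for the 5-600 TPS scenario designs above, a minimal sketch; the function names and sample figures are hypothetical, not taken from the actual test plans:

```python
import math

def pacing_seconds(vusers, target_tps):
    """Iteration pacing so that `vusers` users collectively generate
    `target_tps` transactions per second: each user must start one
    iteration every vusers / target_tps seconds."""
    return vusers / target_tps

def vusers_needed(target_tps, iteration_seconds):
    """Little's Law: concurrency = arrival rate x time per iteration."""
    return math.ceil(target_tps * iteration_seconds)

# 600 TPS with iterations taking ~0.5 s needs at least 300 concurrent users.
print(vusers_needed(600, 0.5))   # -> 300
# 100 vusers driving 5 TPS -> one iteration per user every 20 s.
print(pacing_seconds(100, 5))    # -> 20.0
```

In a real scenario design the pacing interval must also cover response time plus think time, which is why both attributes are evaluated together.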
Environment: Web Server, App Server (Tomcat), Java, .NET, UNIX, Microsoft SQL Server, DB2 database, MAUI - Mainframes, Stubs Server, Load Balancer, IBM Support Assistant 4.1, Confidential, LoadRunner 12.50 & 12.53, Quality Center 11, Confidential Ajax Client, VisualVM, Graphite.
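The heap-reclamation symptom behind the first P1 ticket above can be checked mechanically from GC samples; a rough sketch, where the sample numbers and 30% threshold are illustrative assumptions:

```python
def reclaimed_fraction(before_mb, after_mb):
    """Fraction of used heap reclaimed by one GC cycle."""
    return (before_mb - after_mb) / before_mb

def flag_unhealthy(samples, min_reclaimed=0.3):
    """Flag GC cycles that reclaim less than `min_reclaimed` of used heap,
    the 'heap not fully reclaimed after GC' symptom."""
    return [i for i, (before, after) in enumerate(samples)
            if reclaimed_fraction(before, after) < min_reclaimed]

# (used heap before GC, used heap after GC) in MB, per collection
samples = [(900, 300), (950, 800), (980, 900)]
print(flag_unhealthy(samples))  # -> [1, 2]: later cycles reclaim too little
```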
Performance Engineer
Confidential, CA
Responsibilities:
- Involved in analyzing the requirements from the developers and Business users.
- Installed Agents on Confidential server and configured them with application to monitor metrics (CPU, Memory, Heap, Response Times)
- Configured and deployed the Confidential monitoring system for managing all the Linux warehouse systems.
- Worked closely with customers to evaluate system architecture, deployment requirements, and Confidential configuration needs.
- Worked with customers to resolve issues related to adoption and utilization of the Confidential product.
- Introduced current synthetic-testing customers to additional resources within Confidential to show them how to find and fix issues causing performance problems within their firewall.
- Involved in gathering all the requirements from various teams and worked on the test plan and test strategy documents for projects.
- Involved in reviewing the test plan and test strategy documents with various stakeholders.
- Involved in creating the data required for the performance test effort.
- Involved in creating the test scripts for the various scenarios using the JMeter tool.
- Performance tested the Java applications using JMeter for various protocols.
- Created the Scripts for Web Services to test using SOAP UI.
- Involved in downloading various JMeter listeners required for reporting.
- Worked with the system admin team to understand the scaling differences between the performance and production environments.
- Executed Baseline, Scale, Endurance tests for different Applications.
- Analyzed, interpreted, and summarized relevant results in a complete Performance Test Report.
- Provided daily status meetings and reports to the developers and business team.
- Worked closely with the offshore team, assigning tasks and providing the details needed to understand the projects and analyze the reports.
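The results analysis and summarization step above typically boils down to a few percentile computations over the recorded response-time samples; a minimal sketch, where the sample data and the nearest-rank 90th-percentile convention are assumptions (JMeter's own listeners compute these figures in their aggregate reports):

```python
import statistics

def summarize(samples_ms):
    """Average, 90th percentile, and max of response-time samples: the
    core numbers carried into a performance test report."""
    ordered = sorted(samples_ms)
    p90_index = int(0.9 * (len(ordered) - 1))  # nearest-rank style percentile
    return {
        "avg": statistics.mean(ordered),
        "p90": ordered[p90_index],
        "max": ordered[-1],
    }

# hypothetical response times in milliseconds from one test run
samples = [120, 130, 110, 500, 140, 125, 135, 115, 128, 450, 122]
report = summarize(samples)
print(report["p90"], report["max"])  # -> 450 500
```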
Environment: JMeter, Java, AppDynamics, SoapUI, Quality Center, NMON, SiteScope, ASP, JSP, Oracle, UNIX, WebSphere, Maven, Jenkins, Oracle Enterprise Manager
Performance Engineer
Confidential, CO
Responsibilities:
- Gathered Performance requirements for the application and designed performance tests for the multiple clients within the organization
- Gathered business requirements, analyzed the application load testing requirements and developed the performance test plans for Store POS/Enterprise applications.
- Developed Test Plans, Test Scenarios, Test Cases, Test Summary Reports and Test Execution Metrics.
- Developed Vuser scripts using the Java Vuser, Web (HTTP), Web Services, and RDP protocols in LoadRunner for applications.
- Customized LoadRunner scripts in C, e.g., string-manipulation functions.
- Worked extensively with XML data and SOAP protocols in non-UI web services (SOA) testing.
- Responsible for setting up monitors to monitor network activities and bottlenecks.
- Installed SiteScope and configured monitors for analysis.
- Used SiteScope to get metrics from app/database servers.
- Monitored Metrics on Application server, Web server and database server.
- Created rendezvous points in performance test scenarios to find deadlocks.
- Worked with developers in understanding the code and application in order to develop the Load scripts.
- Involved in defect tracking, reporting and coordination with various groups from initial finding of defects to final resolution.
- Monitored system resources such as CPU usage and % of memory occupied using SiteScope and Linux utilities (vmstat, iostat).
- Monitored the database during testing for any out-of-tablespace risks.
- Responsible for analyzing, reporting, and publishing the test results.
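The vmstat-based CPU monitoring described above can be sketched as a simple parse-and-threshold check; the captured output and the 85% busy threshold below are illustrative, and the column positions assume vmstat's default layout:

```python
# Two-sample vmstat capture with header rows, as emitted by `vmstat 5`.
VMSTAT = """\
procs -----------memory---------- ---swap-- -----io---- -system-- ------cpu-----
 r  b   swpd   free   buff  cache   si   so    bi    bo   in   cs us sy id wa st
 2  0      0 812345  10240 204800    0    0     5    12  150  300 45 10 40  5  0
 8  0      0 512345   9240 104800    0    0    20    40  900 1500 80 15  3  2  0
"""

def over_threshold(output, threshold=85):
    """Return data rows whose busy CPU (100 - idle) exceeds `threshold`."""
    alerts = []
    for line in output.splitlines():
        fields = line.split()
        if fields and fields[0].isdigit():      # data rows start with run-queue count
            idle = int(fields[14])              # 'id' column in the default layout
            if 100 - idle > threshold:
                alerts.append(line)
    return alerts

print(len(over_threshold(VMSTAT)))  # -> 1: the second sample is 97% busy
```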
Environment: HP LoadRunner 12.02, WebSphere, HP SiteScope, Windows 2008, Linux, Excel, SQL, Oracle Database 11g/12c, MuleSoft, SoapUI, Oracle SQL Developer, IBM Support Assistant.
Performance Engineer
Confidential
Responsibilities:
- Set up meetings with Product Managers (USA, UK, and Australia), architecture team, and development team in preparing the Non-Functional Requirements for the application.
- Prepared instrumentation reporting requirements for the identified Key Performance Indicators (i.e., business-critical transactions) of the application, for monitoring performance and reliability in the test and production environments. The reporting tools used here were SAS and UNICA.
- Prepared the peak-hour load volume design for the Key Performance Indicators of the application using SAS and UNICA (NetInsight) reports containing production data, plus anticipated load estimates shared by the Capacity Planning team.
- Prepared reliability, availability, performance, and capacity assessment documents for the applications prior to the start of actual load testing.
- Prepared the test strategy and test plan, and shared them with key stakeholders, the product manager, business lead, and development team for sign-off.
- Worked with business and functional teams in setting up the required test data for load testing.
- Prepared project sizing estimates for offshore resources.
- Evaluated pacing and think-time attributes in preparing scenario designs for testing with loads from 50 to 200 simultaneous users, contributing request volumes of 5K to 20K.
- Applications tested: Lexis Advance Mobile Web, Lexis Advance iPad app, Lexis Advance iPhone app, LexisNexis International - Australia, and LexisNexis Practical Guidance - Australia.
- Developed scripts using the Web (HTTP/HTML) and Web Services protocols of the LoadRunner tool.
- Prepared a high-level script flow design document for the application, including all the Key Performance Indicators along with the target peak-hour load associated with each.
- Provided application walkthroughs related to Key Performance Indicators, architecture walkthroughs, and test data information to offshore resources.
- Led offshore resources in scripting and testing activities on a daily basis through WebEx meetings.
- Tests performed mainly comprised baseline tests, compatibility tests, stress tests, and reliability (long-duration, 4+ hours) tests.
- Performed network and page-rendering tests using stopwatch and HTTP Watch tools to assess the true customer-perceived response time.
- Assessed the performance, reliability, and capacity of applications after each executed load test against the product tier requirements, using SAS reports.
- Analyzed front-end metrics (response times, hits, and throughput) obtained from the LoadRunner tool for each executed load test.
- Prepared the consolidated test report for each executed load test in MS Word, Excel, and PowerPoint, using the complete analysis obtained from the LoadRunner tool, network and page-rendering tests, SAS reports, app server graphs, Java server graphs, and database server graphs.
- Tracked and tabulated defects pertaining to Key Performance Indicators not meeting performance requirements; reliability and capacity not meeting product tier requirements; code unavailability; functionalities not working as expected; and application architecture issues encountered during load testing (related to app servers (.NET), Java servers, and database servers), through Web-Teams (test environment issues) and Web-Stars (production issues).
- Prepared pre-release and post-release assessments for the applications tested.
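The load-volume arithmetic behind the 50-200 simultaneous-user scenario designs above can be sketched as follows; the 36-second iteration interval is a hypothetical value chosen so the figures line up with the stated 5K-20K request volumes:

```python
def requests_per_hour(users, seconds_per_iteration, requests_per_iteration=1):
    """Requests generated in one hour by `users` concurrent virtual users,
    each starting a new iteration every `seconds_per_iteration` seconds."""
    iterations = users * 3600 / seconds_per_iteration
    return int(iterations) * requests_per_iteration

# 50 users pacing one iteration every 36 s -> 5,000 requests per peak hour;
# 200 users at the same pacing -> 20,000.
print(requests_per_hour(50, 36), requests_per_hour(200, 36))  # -> 5000 20000
```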
Environment: C, proxy servers - UNIX boxes, CRM, LoadRunner, Performance Center, VuGen, HTTP Watch, shared services - Java layer infrastructure, and MarkLogic database servers.
Performance Tester
Confidential
Responsibilities:
- Set up daily status meetings with business and IT Development teams for requirement gathering related to non-functional requirements.
- Performed impact analysis and provided recommendations in preparing the test strategy.
- Prepared and shared the performance test strategy and test plan with key stakeholders for sign-off.
- Led offshore resources in scripting and testing activities on a daily basis through WebEx meetings.
- Configured Controllers and load generators for load testing activities.
- Performed test data setups and validation through scripts for effective load testing that replicates production behavior from end user perspective.
- Tests performed mainly comprised baseline, integration, stress, and reliability (long-duration, about 4 hours) tests.
- Developed scripts using the Web (HTTP), Web Services, and FTP protocols of the LoadRunner tool.
- Evaluated pacing and think-time attributes in preparing scenario designs for testing with loads from 10 to 6,000 simultaneous users, contributing request volumes from K.
- Restarted and validated JVM status using JVM logs showing the “open (ready) for business” message.
- Sized JVM configurations (instances and heap size) in the load test environment identically to production before the start of load testing.
- Analyzed front-end metrics (response times, hits, throughput, and connections per second) obtained from the LoadRunner tool for each executed load test.
- Analyzed and validated the application architecture and stability after each executed load test by running DB (SQL) queries through a TOAD interface connected to the load test environment’s DB instance, and tracked the status and response time of each message processed end to end.
- Monitored the WSG (Web Services Gateway) for each load test using metrics involving payload size, number of hits, and request response times meeting the SLAs agreed under the SOA governance policy.
- Scheduled meetings with JVM analyst, DB analyst & IT team in preparing the consolidated analysis result.
- Tracked and tabulated issues pertaining to business transactions not meeting performance requirements, application architecture, code availability, DB issues, functional issues, and JVM issues using defects raised through Quality Center.
- Prepared consolidated test reports for executed load tests in MS Excel, Word, and PowerPoint using the complete analysis obtained from the LoadRunner tool, JVM metrics, WSG metrics, and application message processing times obtained from SQL queries, and added any database recommendations pertaining to SQL tuning, locking issues, and indexing.
- Performed end-to-end failover testing involving two intermediate databases (primary and secondary) operated in parallel that talk to the master database through JVMs and the WSG; the load requests were automated through the LoadRunner tool.
- Identified tablespace issues by running SQL queries on the load test environment’s DB instance before the start of load testing, to avoid load test failures and redundant test data preparations.
- Scheduled meetings with key stakeholders and the business team for results presentation and discussion of the testing performed, for sign-off.
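SLA tracking of the kind described for the WSG metrics above reduces to comparing each transaction's observed response time against its agreed limit; a sketch with hypothetical transaction names and limits:

```python
def sla_breaches(measured_ms, slas_ms):
    """Transactions whose observed response time exceeds the agreed SLA;
    transactions without an SLA entry are never flagged."""
    return {name: ms for name, ms in measured_ms.items()
            if ms > slas_ms.get(name, float("inf"))}

# hypothetical per-transaction response times and SLA limits (milliseconds)
measured = {"login": 850, "search": 2100, "checkout": 1400}
slas = {"login": 1000, "search": 2000, "checkout": 1500}
print(sla_breaches(measured, slas))  # -> {'search': 2100}
```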
Environment: Java, LoadRunner, VuGen, standalone Controller, Java/WAS, TOAD, MS SQL servers, shell scripting, UNIX, and SiteScope.