Performance Engineer Resume
Conway, AR
SUMMARY:
- Around 7 years of experience as a Performance Tester working with LoadRunner protocols such as Web (HTTP/HTML), Web Services, Ajax TruClient, and Citrix. Experienced in designing performance testing strategies, workload modeling, and running various types of performance tests.
- Experience in all phases of the software life cycle for developing, maintaining, and supporting web applications and client-server applications.
- Experienced in evaluating and implementing test automation tools such as HP LoadRunner, JMeter, and ALM.
- Proficient in creating scripts with custom code in C and Java.
- Worked on various protocols such as Web HTTP/HTML, Ajax TruClient, Citrix, and Web Services using LoadRunner.
- Worked in a collaborative Agile environment.
- Experience working across Windows and UNIX operating systems, along with Oracle and Microsoft SQL Server databases.
- Involved in integration, regression, load, and system testing for application enhancements.
- Extensive experience using the LoadRunner Controller, Performance Center, and Virtual User Generator.
- Simulated different levels of user load with Vusers (simultaneous/concurrent virtual users).
- Developed and maintained the overall test methodology and strategy; documented test plans, test cases, and test scripts using HP Quality Center and Test Manager.
- Expertise in analyzing performance test results and identifying bottlenecks.
- Good experience with the Wily Introscope performance monitoring tool.
- Good knowledge of the BSM (Business Service Management) monitoring tool for checking application availability.
TECHNICAL SKILLS:
Testing tools: HP LoadRunner, JMeter, Quality Center, Fiddler, WinRunner, SoapUI
Scripting languages: C, JavaScript
Performance monitoring tools: Dynatrace, Wily Introscope, Splunk, HP SiteScope
Performance reporting tools: LoadRunner Analysis, IBM nmon Analyzer
Programming languages: C, Java
Bug tracking tools: JIRA, Quality Center, TeamTrack
Operating systems: Windows, UNIX, Linux
Web technologies: HTML, DHTML, XML
Application servers: WebSphere, IIS, WebLogic
PROFESSIONAL EXPERIENCE:
Confidential, Conway, AR
Performance Engineer
Responsibilities:
- Worked closely with business analysts and developers to gather application requirements and business processes in order to formulate the test plan.
- Involved in building the test environment; designed the load (capacity) model based on current volume and the projected percentage increase in volume, and created test data.
- Developed VuGen test scripts in HP LoadRunner using the Web HTTP/HTML and Web Services protocols.
- Defined performance goals and objectives based on client requirements and inputs.
- Involved in functional, system, integration, performance, load, stress, and regression testing during various phases of development using LoadRunner.
- Validated web services using SoapUI.
- Monitored graphs such as throughput, hits per second, transaction response time, and Windows resources (memory utilization, CPU utilization, threads) during execution from the Controller.
- Determined the source of bottlenecks by correlating the performance data with end-user loads and response times.
- Configured dashlets in Dynatrace with all performance counters to capture server-side metrics, and PurePaths to capture client-side metrics.
- Analyzed heap behavior, throughput, and garbage collection pauses, and tracked down memory leaks.
- Created business transactions, dashboards, and reports in Dynatrace.
- Tuned the number of full GCs and the resulting CPU spikes under high-memory conditions by increasing the heap size, thereby eliminating JVM abnormalities.
- Worked in both UNIX and Windows server environments.
- Analyzed test results and created complete test reports.
Environment: LoadRunner, ALM/Performance Center, SoapUI, Dynatrace, Web Services, UNIX.
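The heap-sizing and GC-logging work described above can be sketched with JVM options of the following kind; the flag values and application name here are illustrative assumptions, not the project's actual settings.

```shell
# Sketch of GC tuning under stated assumptions:
# equal initial/max heap avoids resize churn and reduces full-GC frequency,
# and GC logging records pause times and full-GC counts for later analysis.
java -Xms4g -Xmx4g \
     -verbose:gc -XX:+PrintGCDetails -Xloggc:gc.log \
     -jar app.jar   # app.jar is a placeholder for the tested application
```

The gc.log output can then be correlated with the load test timeline to confirm that full-GC counts and CPU spikes dropped after the heap increase.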
Confidential, Salt Lake City, UT
Performance Engineer
Responsibilities:
- Responsible for developing test scenarios, test suites, test plans, and test cases for performance testing using LoadRunner and Performance Center.
- Used Virtual User Generator to create VuGen scripts for the Web (HTTP/HTML), Web Services, and Citrix protocols.
- Executed baseline, load, stress, and endurance testing through Performance Center on the website's workflows to identify and report performance bottlenecks.
- Analyzed hits per second, average throughput, and response times of individual transactions over a specified duration using LoadRunner.
- Worked with business analysts and the application team to define appropriate SLAs for response times and peak-hour transaction volumes based on production data, including for brand-new operations and business processes.
- Performed GUI smoke/sanity tests and regression tests for each test cycle, build, and release.
- Verified transaction logs via PuTTY for exceptions and application processing details.
- Analyzed the test results with LoadRunner Analysis to find response times, transaction counts, errors, and other details of the executed tests.
- Monitored application response time, CPU, heap, memory, and GC time spent on both the application and web servers using Dynatrace.
- Identified functionality and performance issues, including deadlock conditions, database connectivity problems, and system crashes under load.
- Involved in reporting and tracking defects using Quality Center.
- Prepared load test reports by analyzing the results from LoadRunner Analysis.
- Created the final performance test report covering the whole testing methodology, results, and findings, including test design, test duration, response times, transaction counts, and detailed server behavior metrics.
Environment: LoadRunner, ALM/Performance Center, SoapUI, Dynatrace, Web Services, Oracle SQL Developer, PuTTY Connection Manager, UNIX, Quality Center.
Confidential, Portsmouth, NH
Performance Tester
Responsibilities:
- Involved in the full life cycle of the project, from requirements gathering to transition, following Agile methodology.
- Worked with the various teams on board to gather project information and coordinated performance testing with the offshore team.
- Interacted with the business analyst and application teams to discuss performance requirements and test strategy; developed the performance test plans and load test strategies and interacted with the end client.
- Gathered and analyzed Business Requirement Documents (BRDs) and Functional Requirement Documents (FRDs) for performance test planning.
- Performed large-scale, end-to-end load and volume testing using large user data files.
- Performed load testing against applications using LoadRunner scripts to emulate users and monitor system performance.
- Parameterized and manually correlated the scripts, using IP spoofing to simulate user load.
- Used Performance Center to manage LoadRunner scripts, scenarios, and test documentation.
- Used the LoadRunner online monitors to watch for possible bottlenecks in the application.
- Monitored load tests and analyzed test results using Wily Introscope.
- Used Wily Introscope to monitor the production and performance testing environments.
- Monitored tests by checking CPU utilization, memory leakage, thread count, and GC heap using Wily Introscope.
- Created dashboards in Wily Introscope to monitor server metrics.
- Monitored application availability using BSM.
- Responsible for determining the room for performance improvement for any application or service during testing, implementation, and retesting.
- Verified that new or upgraded applications met the specified performance requirements.
- Monitored graphs such as transaction response time, and analyzed server performance status, hits per second, throughput, Windows resources, database server resources, etc.
- Analyzed the system resource graphs, network monitor graphs, and error graphs to identify transaction performance, network problems, and scenario results, respectively.
Environment: LoadRunner, ALM/Performance Center, ALM/Quality Center, SoapUI, Wily Introscope, Web Services, BSM, SQL Server, WebLogic, WebSphere, Load Balancer
Confidential
Performance Tester
Responsibilities:
- Gathered requirements from the SMEs and coordinated testing of the application with the business analysts, development team, and testing team.
- Analyzed and assisted in creating a feasible and relevant testing environment so that the tested application would meet requirements in production.
- Performed performance tests with JMeter using Test Plans, Thread Groups, Samplers, Listeners, and Assertions.
- Performed testing with JMeter based on the existing automation test framework for server-side testing.
- Developed an Apache JMeter test suite for functional load testing, analyzed the logs, and reported errors.
- Created a proof of concept (POC) for performance testing using JMeter.
- Performed data-driven tests using test data from Excel files.
- Executed different test types, including single-user tests, load tests, and soak tests.
- Executed batch processes and traced transactions using the Transaction Trace session in Wily Introscope to find long-running SQL statements or methods.
- Monitored tests by checking CPU utilization, memory leakage, thread count, and GC heap using Wily Introscope.
- Involved in web service (SOAP) testing using SoapUI and JMeter.
- Participated in weekly meetings to discuss bottlenecks and defects.
- Designed UNIX shell scripts to execute the jobs for testing.
- Involved in web application testing with a specific focus on performance testing using JMeter.
- Correlated JMeter data with system reports from the various servers.
- Monitored the performance of the WebLogic application server in the testing environment.
- Analyzed response time, CPU usage, memory usage, and various other metrics.
- Reported the bottlenecks to the developers and assisted them in fixing the bugs.
- Worked with SoapUI and LoadUI to test web services with the necessary WSDLs, and analyzed end-of-day XML files.
Environment: JMeter, HP LoadRunner, Performance Center, Splunk, HP Diagnostics, SQL, JIRA
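The JMeter load runs described in this engagement are typically executed in non-GUI mode; the file names and property values below are illustrative placeholders, not the project's actual artifacts.

```shell
# Non-GUI run: -n (no GUI), -t (test plan), -l (results log).
# threads/rampup are user-defined properties read inside the Thread Group
# via the __P() function, e.g. ${__P(threads)}.
jmeter -n -t server_load.jmx -l results.jtl -Jthreads=100 -Jrampup=60
```

Running non-GUI keeps the load generator's own CPU and memory overhead out of the measured results, and the results.jtl file feeds the listener reports and error analysis.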
Confidential
Performance Tester
Responsibilities:
- Involved in creating the detailed test plan, test scenarios, and test cases according to the business requirements, and updated the Requirement Traceability Matrix document to ensure complete coverage.
- Created the performance testing environment and installed all the necessary LoadRunner components on the remote machines.
- Used LoadRunner for web services-based performance testing.
- Performed load testing with LoadRunner, simulating user counts from a minimum to a maximum level.
- Performed non-functional, functional, security, cross-browser, backend, integration, and end-to-end testing by executing the test cases.
- Conducted duration, stress, and baseline tests using LoadRunner.
- Executed smoke and sanity testing on each newly received build to check the stability of the application build in the performance testing environment.
- Performed performance regression testing on the builds received after defect fixes.
- Performed load testing against internal applications and services using LoadRunner scripts to emulate users and monitor system performance.
- Developed scripts and scenarios for performance and load testing of applications using JMeter and LoadRunner.
- Designed VuGen scripts using the LoadRunner Virtual User Generator and executed them in the LoadRunner Controller across distributed load generators.
- Used rendezvous points, start/end transactions, parameterization, and correlation features in LoadRunner's Virtual User Generator.
- Enhanced scripts by inserting checkpoints to verify that virtual users were accessing the pages they were supposed to access.
- Identified performance issues, including deadlock conditions, database connectivity problems, and system crashes under load.
- Generated application performance reports and delivered them to the performance testing analysis group for fine-tuning application performance.
- Shared the design with stakeholders and worked with design architects to ensure design changes accounted for performance impacts.
- Identified and reported performance test defects to the development team, applying priority and severity concepts, using Quality Center.
- Documented and communicated test results; coordinated development, testing, and documentation activities with the offshore team.
Environment: Quality Center, JMeter, WinRunner, LoadRunner, Performance Center, HP J2EE Diagnostics, SQL, UNIX, API, WebSphere.
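A minimal sketch of the kind of VuGen script these rendezvous, transaction, parameterization, correlation, and checkpoint features produce; the URLs, boundary strings, and parameter names are illustrative assumptions, not the tested application's actual values, and the script runs only inside the LoadRunner runtime.

```c
Action()
{
    /* Checkpoint: fail the step if the expected text is missing. */
    web_reg_find("Text=Welcome", LAST);

    /* Correlation: save the dynamic session token from the next response. */
    web_reg_save_param("sessionToken", "LB=token=\"", "RB=\"", LAST);

    /* Rendezvous point: hold Vusers here so they hit the server together. */
    lr_rendezvous("login_storm");

    lr_start_transaction("Login");
    /* {username} is parameterized from a VuGen data file. */
    web_url("login",
            "URL=https://app.example.com/login?user={username}",
            LAST);
    lr_end_transaction("Login", LR_AUTO);

    /* Re-use the correlated token on a later request. */
    web_url("home",
            "URL=https://app.example.com/home?token={sessionToken}",
            LAST);

    return 0;
}
```

The registration functions (web_reg_find, web_reg_save_param) must be called before the request whose response they inspect, which is why they appear first.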