
Sr Performance Tester Resume


Phoenix, AZ

SUMMARY

  • 8+ years of Quality Assurance experience with strong expertise in Performance/Load & Stress Testing using HP Performance Center/LoadRunner.
  • Experienced in designing multiple LoadRunner scripts (VuGen) using various protocols to load test different applications, including Web (HTTP/HTML), Ajax, Web Services, LDAP, Citrix ICA, FLEX, Oracle NCA, Java over HTTP Vuser, and JMS.
  • Significant experience in Load Testing various applications including Java, J2EE, .NET, COM/DCOM, SQL Server implementations.
  • Well experienced in executing Baseline, Benchmark, Performance, Stress and Memory Leak tests.
  • Experience in writing Test Plans, Developing Test Scenarios along with Test Cases, Test Procedure with reference to Business Requirement Documents (BRD), Functional Specification & Technical Specifications to meet Functional SLAs.
  • Accountable for supporting all performance testing activities, load and stress testing processes, methodologies, and tools relating to the client's application portfolio.
  • Accountable for consulting and advising all teams, assisting with the development and maintenance of reporting metrics to determine the effectiveness of Quality Assurance efforts across client’s applications.
  • Extensive experience in Quality Assurance methodologies and strategies with better understanding of Software Development Life Cycle (SDLC).
  • Hands on experience and exposure in all phases of project development lifecycle and Software Development Life Cycle (SDLC) right from Inception, Transformation to Execution, which includes Design, Development, and Implementation.
  • Strong experience writing SQL queries for back - end testing, UNIX commands for verifying log files, shell scripts to bounce/maintain QA servers, database refresh for QA environments, XML API testing.
  • Experienced in monitoring CPU, memory, network, web connections, and throughput while running Baseline, Performance, Load, Stress and Soak tests.
  • Heavily involved in Performance Testing, Functional Testing and Regression Testing using automated testing tools including LoadRunner, Performance Center, QuickTest Pro, Quality Center, WinRunner and TestDirector.
  • Experienced in supporting the team of on-site and off-shore to provide Performance Testing services 24/7 for internal client applications.
  • Well-versed in Software development which follows Agile methodologies.
  • Well-versed in implementing best practices for VuGen scripting, Performance Testing and reporting of performance test analysis.
  • Installed and set up Performance Center and multiple LoadRunner agents.

TECHNICAL SKILLS

Testing Tools: LoadRunner 8.x/9.x/11.x, SilkTest 7.x, Silk Performer 7.x, QTP 9.x/10, Quality Center 8.x/9.x/11.x, Selenium

LoadRunner Protocols: Web - HTTP/HTML, Web Services, Citrix ICA, FLEX, Winsock, AJAX, and Oracle NCA

Scripting: JavaScript, BDL, Shell, VB

Programming Languages: C, C++, C#, Java, HTML

Web Server/App Server: MS IIS, Apache, HIS, WebLogic, WebSphere

Database: Oracle, DB2, and SQL Server

Service Oriented Architecture (SOA): Web Services, XML, SoapUI, WSDL

PROFESSIONAL EXPERIENCE

Confidential, Phoenix, AZ

Sr Performance Tester

Responsibilities:

  • Gathering and analyzing business and technical requirements for Performance testing purposes.
  • Coordinating with Functional Teams to identify the Business Processes to be Performance tested.
  • Communicating with cross-functional teams to obtain updates and coordinate next steps.
  • Arranging daily stand-up meetings with the offshore team and planning accordingly with the development and functional teams.
  • Used SoapUI for testing web services.
  • Extensively used Web (HTTP/HTML), Web Services, and J2EE.
  • Utilized WSDL files to perform web services integration testing using SoapUI and Performance Center.
  • Creating various scenarios to do the performance testing according to scope of the project.
  • Managed a team of 6 members in onsite / Offshore model.
  • Created Test Plan/Strategy, which includes Testing Resources, Testing Strategy, Risks and testing of end-to-end scenarios.
  • Prepared test estimations and presented them to senior management for approval.
  • Created automated scripts by including timers for recording response times, and test checks for confirming the retrieval of the correct page.
  • Involved in performance testing of server’s load and scalability by creating multiple Virtual Users by using Load Runner Virtual User Generator component.
  • Designed multiple LoadRunner scripts (VuGen) with different protocols such as Web, Flex, AJAX, TruClient, Citrix, and Web Services for load testing GUI and other applications.
  • Created detailed test status reports, performance capacity reports, web trend analysis reports, and graphical charts for upper management using Load Runner analysis component.
  • Created, Executed and Monitored the feasibility of various manual and goal oriented scenarios of an application with Load Runner Controller.
  • Run full formal performance test including Load, Peak, Breakpoint, Burst, Longevity and Fail over.
  • Effectively used all the components of LoadRunner 11.52, including the Controller, and efficient in writing LoadRunner 11.52 functions.
  • Identified system/application bottlenecks and worked with Bottom-line to facilitate tuning of the application/environment, optimizing capacity and improving performance under peak workloads simulated via the Mercury Interactive LoadRunner tool.
  • Configured Web, Application, and Database server performance monitoring using LoadRunner Controller, Wily Introscope, Splunk & HP Diagnostics.
  • Created Vusers to emulate concurrent users, inserted Rendezvous points in the Vuser scripts, and executed the Vuser scripts in various goal-oriented and manual scenarios using LoadRunner.
  • Configured the LoadRunner Controller for running tests and verified that the LoadRunner scripts worked as expected on different load generator machines.
  • Added various monitoring parameters (CPU, Memory) to the LoadRunner controller for monitoring, also using SiteScope for monitoring database and application servers.
  • Monitoring the various machines during load tests and informing the corresponding teams in case of issues.
  • Used SoapUI for load testing different APIs.
  • Extensively used UNIX commands for debugging, and modified and ran shell scripts for daily reports and data collection.
  • Responsible for analyzing the results like CPU usage, memory usage, garbage collection/heap size, server response times, database response times, active/idle threads, size of weblogic queues, etc.
  • Monitored UNIX logs for different types of exceptions during load tests, both manually and using the Failbox tool.
  • Extensively used SQL queries for database testing: verified backend records after front-end updates, modifications, and deletions, and vice versa.
  • Monitored the Oracle and PL/SQL database under load for CPU utilization, storage IOPS, storage KBs, I/O wait percentage, AWR reports, etc., and identified issues within the database.
  • Used SVN for copying JAR/WAR files from a remote repository to a local machine for LoadRunner script generation.
  • Identified bottlenecks in a clustered environment relating to indexes, connection pools, garbage collection, and memory heap size, and fixed them by changing configurations with the help of the DB team.
  • Using Quality Center for complete defect management and reporting.
  • Coordinate with Off-Shore QA team.

Environment: Java, J2EE 1.4, Web Services, SQL Server, UNIX.

Confidential, Des Moines, IA

Sr. Performance Engineer.

Responsibilities:

  • Participated in all meetings planned for a particular release, including design reviews and test execution timeline discussions, to obtain the necessary technical requirements.
  • Met with the project team to define project business volume metrics.
  • Gathering and analyzing business and technical requirements for Performance Testing purposes.
  • Configure all necessary hardware and software to support Performance Center.
  • Planning, development and testing of scripts.
  • Independently developed Performance Center VuGen scripts according to test specifications/requirements to validate against Performance SLAs.
  • Enhanced Vuser scripts with correlation, parameterization, transaction points, rendezvous points and various LoadRunner functions.
  • Parameterized the Performance Center scripts to access data sheets based on environment like QA, UAT and Production.
  • Created automated scripts for API WSDLs/portal application using VuGen in LoadRunner 9.52 (Web Services protocol/portal, frontend Web HTTP/HTML protocol) for regression scenarios.
  • Created scenarios in LoadRunner and set up monitors to track load generators for performance testing.
  • Performed correlation by capturing dynamic values and parameterized the data dependencies that are part of the business process.
  • Conducted several Load tests such as 1 Hour peak production load, Reliability and Stress tests to identify the performance issues.
  • Ran the scripts for multiple users using controller in Performance Center for GUI/API regression/Load testing.
  • Involved in determining scalability and bottleneck testing of applications.
  • Identifying bottlenecks in Network, Database and Application Servers using Performance Center Monitors.
  • Monitored Average Transaction Response Time, Network Data, Hits per Second, Throughput, and Windows resources such as CPU usage and available/committed memory bytes.
  • Analyzed Throughput Graph, Hits/Second Graph, Transactions per second Graph and Rendezvous Graphs using LR Analysis Tool.
  • Analyze, interpret, and summarize meaningful and relevant results in a complete Performance Test Analysis report.
  • Analyzed results and provided Developers, System Analysts, Application Architects and Microsoft Personnel with information resulting in performance tuning the Application.
  • Develop and implement load and stress tests with HP Performance Center and present performance statistics to application teams, and provide recommendations of how and where performance can be improved.
  • Verify that new or upgraded applications meet specified performance requirements.
  • Worked in Quality Process - Prepared monthly Quality Reports, Benchmarking Reports.

Environment: .NET, C#, ASP.NET, HP-UX, HIS 4.0/7.0, Web Services, DB2

Confidential, San Antonio, TX

Sr LoadRunner Tester.

Responsibilities:

  • Participated with the Business Support Group in design, review and requirement analysis meetings.
  • Reviewed the test scripts prepared by the team and the Business User Support group.
  • Worked in process development and metrics evaluation for Feature Testing, which involved tracking feature movement for the entire Feature Testing cycle, and in metrics and process development for the Regression Testing cycle.
  • Performed Functional testing, Regression testing and End-to-End Testing.
  • Created performance scripts using LoadRunner's VuGen.
  • Developed UAT test scripts and maintained them in HP Quality Center.
  • Parameterized the LoadRunner scripts to access data sheets based on environment like QA, UAT and Production.
  • Used LoadRunner Controller to generate load and monitor the performance of the application under load.
  • Worked on logging, recording, tracking and maintaining defects using Bugzilla.
  • Generated reports using Bugzilla on a daily basis.
  • Entered defects in Quality Center and participated in the defect-review calls every day.
  • Analyzed the results using online monitors and graphs to identify bottlenecks in the server resources using LoadRunner and JMeter.
  • Developed and executed SQL Scripts to verify the Analytics data and the corresponding data displayed in the UI.
  • Aided performance modification, managed and resolved technical issues.
  • Prepared testing status reports and reviewed the status with client and project management.
  • Followed Quality process and standard operating procedures.

Environment: Java, Oracle DB, Oracle Apps (AP, AR, GL), Web Services, XML, Windows 2000, Citrix Client, SiteMinder

Confidential

Sr LoadRunner Tester.

Responsibilities:

  • Initiated meetings with different teams to find the performance test scope.
  • Participated in the team meetings to discuss the issues arising out of testing.
  • Worked in requirement analysis and test strategy documentation of the required system and functional testing efforts for all test scenarios, including positive and negative tests.
  • Identified the test requirements based on application business requirements.
  • Generated Test Cases for each specification in Requirement Specification Document corresponding to each module.
  • Created Test Data for the test cases for Functional and Automated testing.
  • Documented standards, guidelines, and strategic plans to develop a robust Performance test environment and streamlined the existing Performance testing Process.
  • Lead the development, documentation, and maintenance of LoadRunner scripts along with enhanced standards, procedures and processes.
  • Took the lead in creating Performance Center LoadRunner scripts using Web-HTTP, Web Services, and Oracle NCA protocols.
  • Ran tests using LoadRunner and monitored the system under load in conjunction with capacity planning.
  • Troubleshot, tracked and escalated issues related to test execution.
  • Worked on executing Baseline, Benchmark, Stress, and Memory Leak test scenarios for internal and customer-facing applications based on each application's workload model.
  • Ensured daily production support and administration of portal operations like end user performance, identifying the bottlenecks before and after production migration and maintenance upgrades.
  • Generated weekly status reports and reported to management.
  • Worked on sharing application performance analysis with Technical Directors and business teams to support Go/No-Go decisions.
  • Communicated project status to Leadership.
  • Worked in Internal and External Audits.

Environment: Java Script, Windows NT/2000, Oracle 8i/9i, SQL Server 2000/2005

Confidential, Boston, MA

LoadRunner Analyst

Responsibilities:

  • Developed the Master Test Plan, which includes the overall testing plan, testing resources, testing strategy and testing of end-to-end scenarios.
  • Created automated scripts by including timers for recording response times, and test checks for confirming the retrieval of the correct page.
  • Recording, Debugging, correlation and Parameterization of LoadRunner scripts.
  • Involved in the entire life cycle of the project including, requirements gathering, design, test and production support.
  • Designed performance test suites by creating Web (GUI/HTTP/HTML), Web service and Click & Script test scripts, workload scenarios, setting transactions. Extensively used VUGen to create Load Test Scripts.
  • Identified system/application bottlenecks and facilitated tuning of the application/environment to optimize capacity and improve performance under peak workloads simulated via the Mercury Interactive LoadRunner tool.
  • Created Vusers to emulate concurrent users, inserting Rendezvous points in the Vuser scripts and executed the Vuser Scripts in various scenarios which were both goal oriented and manual using Load Runner.
  • Correlated and Parameterized test scripts to capture Dynamic data and input various test data as per business requirements.
  • Prepared load test summary reports for each run, comparing the results with previous runs.
  • Configured the LoadRunner Controller for running tests and verified that the LoadRunner scripts worked as expected on different load generator machines.
  • Identified Disk usage, CPU, Memory and other metrics for all the servers like App Servers, Database Servers, DH and EU’s and how the servers are getting loaded.
  • Added various monitoring parameters (CPU, Memory) to the LoadRunner controller for monitoring.
  • Monitoring the various machines during load tests and informing the corresponding teams in case of issues.
  • Created detailed test status reports, performance capacity reports, web trend analysis reports, and graphical charts for upper management using Load Runner analysis component.
  • Using Quality Center for complete defect management and reporting.
  • Database testing using SQL queries to compare data accuracy of backend for reports.
  • Monitored the database while running load, reading AWR reports and identifying issues within the database.
  • Responsible for providing on-call production support for the application.
  • Participate in Deep Dive meetings after each Release to work upon improvement areas in the project.

Environment: Java, Windows NT/2000, SQL, MS SQL Server, Web Sphere 6.0

Confidential, New York City, NY

LoadRunner Tester

Responsibilities:

  • Good knowledge of the Product Business flow.
  • Supervised resources and gathered metrics.
  • Experienced in defining the performance scenarios based on the client provided QA use cases and inputs.
  • Creating and executing scenarios for Focus tests to target key business use cases.
  • Developed performance workload distribution test models.
  • Documented delays and coordinated test results.
  • Participated in conference calls with client and effectively communicated issues, project updates etc.
  • Established test plans, outlined test environment, scenarios and test scripts.
  • Contributed in preparation and verification of test case pages for all the sub-systems.
  • Worked on developing a customized script framework and complex scripts for LoadRunner using the C language, and on automating all the use cases in the product.
  • Developed several utility functions in C language for optimizing and enhancing LR scripts.
  • Creating and executing performance scenarios for benchmarking to optimize JVM and Application configuration.
  • Creating and executing LoadRunner scenarios for Performance Verification Testing (PVT), which enables comparison of performance across different versions and across different dimensions of the data model within the same version.
  • Performed the analysis for various graphs for the client side and server side metrics like Transaction Response Time, Hits per second graph, Pages download per second, Throughput, Memory & CPU utilization, GC logs, AWR reports and trace logs.
  • Experienced in Generating, Analyzing and interpreting the Oracle AWR reports and MS SQL performance dashboard reports.
  • Effectively analyzed logs and provided necessary recommendations to the client.
  • Identified key performance bottlenecks in the application and performed end-to-end root cause analysis to pinpoint the reasons; presented the analysis in reports to the client, which helped improve performance.
  • Provide adequate supporting information through reports for bottleneck analysis.
  • Aided performance modification, managed and resolved technical issues.

Environment: Java, J2EE 1.4, Oracle 8i, Windows 2000, Web Sphere 6.0
