
Performance Tester Resume

San Antonio, TX

SUMMARY

  • Performance test engineer with 7 years of experience across all phases of the Software Development Life Cycle (SDLC), developing test plans and test cases, executing tests thoroughly, and reporting analyzed results to stakeholders.
  • Excellent understanding of the SDLC and of the performance tester's role in the development, enhancement, and maintenance of software applications.
  • Experienced in both manual and automated functional testing of web and client/server applications using HP/Mercury tools (HP UFT, QuickTest Pro 8.0/8.2/9.2, TestDirector/Quality Center 7.5/8.0/8.2/9.2/10.0) and LoadRunner.
  • Delivered regular performance test progress, status, defect, projected vs. actual execution, risk assessment, and impact reports to management, technical and non-technical audiences, and onsite/offshore project teams.
  • Designed and executed automated test scripts with HP LoadRunner for client/server, Windows, UNIX/Linux, web services, and web-based applications.
  • Experience in using SharePoint.
  • Knowledge of programming languages such as C, C++, Java, and PL/SQL for debugging and executing LoadRunner scripts.
  • Experienced in performance and load testing using tools such as LoadRunner, JMeter, NeoLoad, and LoadUI.
  • Knowledge of automation tools like Selenium and SoapUI.
  • Expertise in executing VuGen scripts through the LoadRunner Controller for performance, load, and stress testing, and in generating reports with the LoadRunner Analysis tool. Extensively worked with the Web (HTTP/HTML), SAP GUI, SAP Web, and Web Services (SOAP and REST) protocols; see the VuGen sketch after this summary.
  • Analyzed performance bottlenecks, end-to-end performance, and web performance measures such as server response time, throughput, and network latency. Worked with system performance test types such as spike, stress, and endurance tests.
  • 3+ years of scripting or programming experience (C#, XML, Visual Studio, Eclipse IDE)
  • Analyzed results using LoadRunner Analysis, NeoLoad, and Wily Introscope.
  • Expertise in automated testing under different delivery methodologies such as Agile and Waterfall.
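
For illustration, a minimal VuGen Web (HTTP/HTML) Action of the kind referenced above might look like the sketch below; the URL and transaction names are placeholders, not details from any engagement.

    // Action.c - minimal Web (HTTP/HTML) Vuser action (illustrative sketch;
    // the URL and transaction names are placeholders).
    Action()
    {
        // Time the landing page as its own transaction so it appears
        // separately in Controller monitors and in LoadRunner Analysis.
        lr_start_transaction("T01_Home_Page");

        web_url("Home",
            "URL=https://example.com/",   // placeholder URL
            "Resource=0",
            "Mode=HTML",
            LAST);

        lr_end_transaction("T01_Home_Page", LR_AUTO);

        return 0;
    }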

PROFESSIONAL EXPERIENCE

Confidential, San Antonio, TX

Performance Tester

Responsibilities:

  • Executed multi-user performance tests with LoadRunner, using online monitors, real-time output messages, and other LoadRunner features.
  • Analyzed, interpreted, and summarized meaningful and relevant results in a complete performance test report.
  • Developed and implemented load and stress tests with LoadRunner, presented performance statistics to application teams, and provided recommendations on how and where performance could be improved.
  • Monitored and administered hardware capacity to ensure the necessary resources were available for all tests.
  • Identified disk usage, CPU, and memory utilization for web, application, Tuxedo, and database servers and how the servers were loaded under test.
  • Tested Citrix applications.
  • Worked on CRM and HR applications in PeopleSoft.
  • Worked extensively on Salesforce ERP applications.
  • Monitored logs through Splunk.
  • Extensively tested API performance in an Agile environment.
  • Gathered requirements and compiled them into the test plan.
  • Responsible for implementing LoadRunner, architecting the load-testing infrastructure, and integrating hardware and software with LoadRunner.
  • Prepared test cases, LoadRunner scripts, and test data; executed load tests; validated results; managed defects; and reported results.
  • Responsible for creating MQ scripts and monitoring them through Java VisualVM or CA Wily.
  • Interfaced with developers, project managers, and management in the development, execution, and reporting of test automation results.
  • Responsible for monitoring JMX and JVM through Wily Introscope.
  • Responsible for monitoring GC through CA Wily.
  • Executed performance tests (load, capacity, and stress) using LoadRunner.
  • Identified and eliminated performance bottlenecks during the development lifecycle.
  • Produced regular, accurate project status reports for senior management to ensure on-time project launch.
  • Verified that new or upgraded applications met specified performance requirements.
  • Identified memory leaks in different components; analyzed and reported protocol-level response times, web page breakdowns, and component sizes.
  • Extensively parameterized LoadRunner scripts to reproduce realistic load conditions; see the parameterization sketch after this list.
  • Configured and ran scenarios in Conductor.
  • Communicated test progress, test results, defects/issues, and other relevant information to project stakeholders and management.
  • Used Windows SharePoint Services to maintain and upload project documentation (requirements, test plan, test cases, etc.) and to publish the weekly status report to the SharePoint portal.
  • Monitored load, stress, and endurance tests using Wily; tracked GC, heap, and active threads and collected Wily graphs.
  • Hands-on experience with network analysis tools such as dynaTrace and HttpWatch for tuning web site performance by measuring download times, caching, and the number of network round trips, and for measuring site performance in manual and automated test environments by capturing metrics such as JavaScript execution time, rendering time, CPU utilization, asynchronous requests, and network requests.
  • Designed and developed structured test automation frameworks in collaboration with the testing and development teams within the QTP and Quality Center workflow.
  • Converted manual test cases and scripts into automated scripts using QuickTest Pro.
  • Continually updated traceability for all artifacts through the traceability matrix and set up a process to capture standard test reporting metrics.
  • Held weekly status meetings with development and management teams to discuss bugs and other issues.
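
As an illustration of the parameterization noted above, a hard-coded value in a recorded LoadRunner script is typically replaced with a {parameter} backed by a data file; the URL, parameter, and form-field names below are hypothetical.

    // Illustrative parameterization (hypothetical field and parameter names):
    // the recorded constants are replaced by {pUserName} and {pPassword},
    // which VuGen substitutes from a data file (e.g. users.dat) per iteration.
    lr_start_transaction("T02_Login");

    web_submit_data("login",
        "Action=https://example.com/login",   // placeholder URL
        "Method=POST",
        ITEMDATA,
        "Name=username", "Value={pUserName}", ENDITEM,   // parameterized
        "Name=password", "Value={pPassword}", ENDITEM,   // parameterized
        LAST);

    lr_end_transaction("T02_Login", LR_AUTO);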

Environment: LoadRunner 11.5, JMeter, DynaTrace, Performance Center 11.5, Visual Studio 2013, VSTS, HP ALM/Quality Center, SOA, HTML, XML, JavaScript, Web Services.

Confidential, Charlotte, NC

Performance Analyst

Responsibilities:

  • Involved in project planning and coordination and implemented the performance testing methodology.
  • Developed Performance Test Plans and Test Strategies based on business requirements
  • Conducted Performance testing by creating Virtual Users and Scenarios using LoadRunner
  • Recorded and enhanced LoadRunner Web (HTTP/HTML) and web services scripts.
  • Enhanced the scripts by employing manual/automatic correlation, parameterization techniques, and LoadRunner-specific functions; see the correlation sketch after this list.
  • Responsible for performance testing, debugging, executing, and analyzing complex applications using HP LoadRunner and HP ALM Performance Center.
  • Developed and executed JMeter scripts.
  • Created SAP scripts and performed testing on ERP applications.
  • Debugged and validated the test scripts.
  • Used the Scheduler in LoadRunner Controller to configure user ramp-up/ramp-down for scenarios.
  • Monitored graphs such as Transaction Response Time and analyzed server performance status, Hits per Second, Throughput, etc.
  • Set pacing and think time according to the SLA for test executions.
  • Tracked and monitored defects using HP ALM.
  • Monitored resources to identify performance bottlenecks and tune the JVM; planned and implemented server component-level testing and monitoring.
  • Analyzed JVM heap and GC logs in WebSphere during test execution.
  • Executed load, stress, and endurance tests by uploading the VuGen scripts into Performance Center 11.5.
  • Designed and executed performance tests and analyzed application bottlenecks using LoadRunner Analysis.
  • Executed performance tests (load, capacity, and stress) using HP LoadRunner and Microsoft Visual Studio Load Test.
  • Explored and recommended changes, and re-tested to validate the fixes.
  • Investigated the backend logs created during execution of LoadRunner scripts.
  • Created LoadRunner scenarios and executed different tests per the requirements.
  • Created test reports documenting test results and listing any performance bottlenecks.
  • Documented Summary Reports and Closure Reports for each Test execution.
  • Responsible for Performance Tuning for Load, Stress, Endurance Test executions.
  • Worked with development members on bug reproduction and fixes.
  • Updated management on testing results, activities and planning.
  • Developed performance test plans, test scripts, and test scenarios based on business requirements.
  • Recorded and enhanced test scripts in protocols such as Web (HTTP/HTML), Oracle NCA, and Oracle Web Applications 11i with parameterization, correlation, and added ANSI C and Oracle NCA functions.
  • Executed multi-user performance tests using online monitors, real-time output messages, and other features of the LoadRunner Controller.
  • Analyzed transaction and server resource monitors for meaningful results across the entire test run using LoadRunner Analysis.
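
To illustrate the manual correlation and think-time work described above, the sketch below captures a dynamic value and reuses it; the boundaries, parameter name, and URLs are assumptions, and iteration pacing itself is normally configured in Runtime Settings rather than in code.

    // Illustrative manual correlation (hypothetical boundaries and URLs):
    // capture a dynamic session token from the next response so later
    // requests replay it instead of the hard-coded recorded value.
    web_reg_save_param("pSessionId",
        "LB=sessionId=",      // left boundary (assumed)
        "RB=&",               // right boundary (assumed)
        "NotFound=warning",
        LAST);

    web_url("Start",
        "URL=https://example.com/start",   // placeholder URL
        "Mode=HTML",
        LAST);

    // Reuse the captured value; simulate user think time per the SLA
    // (iteration pacing is set in Runtime Settings).
    web_url("Next",
        "URL=https://example.com/next?sessionId={pSessionId}",
        "Mode=HTML",
        LAST);

    lr_think_time(5);   // seconds of simulated think time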

Environment: LoadRunner 11.5, JMeter, DynaTrace, Performance Center 11.5, Visual Studio 2013, VSTS, HP ALM/Quality Center, SOA, HTML, XML, JavaScript, Web Services.

Confidential, Houston, TX

Performance Analyst

Responsibilities:

  • Prepared test planning document based on analysis of requirements and design documents.
  • Executed performance/volume testing using HP Performance Center to ensure all development deliverables were production ready.
  • Wrote, maintained, and executed test case scripts; documented detailed results and summary reports.
  • Multi-tasked between current-release and install activities and prioritized tasks for the next release.
  • Coordinated with project team members, business analysts, and developers on testing requirements.
  • Monitored and resolved testing issues.
  • Developed and executed JMeter scripts.
  • Managed testing tasks concurrently on multiple projects.
  • Coordinated the overall execution of the test plan with group and business representatives.
  • Good knowledge of Spotlight software.
  • Responsible for testing authentication against Tandem authorization.
  • Participated in performance and infrastructure testing of web-based and middleware applications and of a set of web services built on varying technologies (.NET and J2EE-based web services).
  • Good understanding of the project life cycle, from analysis to production implementation, with emphasis on test data metric analysis, performance testing, load/stress testing, and auditable documentation of plan and results.
  • Analyzed and documented the SharePoint architecture, including Central Administration, SQL Server, and Windows Server 2008 R2.
  • Used the Winsock and ODBC protocols in LoadRunner to execute stored procedures for the database migration from Sybase to SQL Server.
  • Used SoapUI to test the web services under the implemented SOA framework; see the SOAP request sketch after this list.
  • Used manual and automated correlation to parameterize dynamically changing values.
  • Prepared data for the parameterized values in the scripts across multiple scenarios.
  • Developed Vuser scripts and enhanced the basic scripts by parameterizing constant values in LoadRunner.
  • Extensively used Quality Center for test planning, maintaining test cases and test scripts, test execution, and bug reporting.
  • Researched production issues.
  • Additional responsibilities as directed by either the QA Lead or Management.
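
As a sketch of the kind of SOAP-over-HTTP call exercised with SoapUI and the web services scripts above, a LoadRunner web_custom_request can post an envelope directly; the endpoint, SOAPAction, and payload below are hypothetical.

    // Illustrative SOAP request (hypothetical endpoint, action, and payload).
    web_add_header("Content-Type", "text/xml; charset=utf-8");
    web_add_header("SOAPAction", "\"urn:example:GetAccount\"");   // assumed action

    lr_start_transaction("T03_GetAccount");

    web_custom_request("GetAccount",
        "URL=https://example.com/services/AccountService",   // placeholder URL
        "Method=POST",
        "Body=<soapenv:Envelope xmlns:soapenv=\"http://schemas.xmlsoap.org/soap/envelope/\">"
             "<soapenv:Body><GetAccount><id>12345</id></GetAccount></soapenv:Body>"
             "</soapenv:Envelope>",
        LAST);

    lr_end_transaction("T03_GetAccount", LR_AUTO);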

Environment: HP LoadRunner 11/11.5, HP UFT, Performance Center, JMeter, HP ALM, SharePoint, SiteScope, Java, C, VBScript, TSL, Visual Studio 2013, VSTS, XML, HTML, MS Office, SQL, PL/SQL, VTS, SOA, Crystal Reports, Web Services, WebSphere (WAS), UNIX and Windows.
