
Sr. Performance Test Engineer Resume


New York, NY

SUMMARY

  • Over 8 years of experience in the IT industry with emphasis on Performance Testing & Quality Assurance.
  • Excellent skills in Manual Testing, Performance Testing and Automated Testing.
  • Generating and implementing templates for Test Plan, Test Cases, Test Scripts, Business Analysis, Gap Analysis, Test Defect Log, Test Case Checklist etc.
  • Experienced in automation of the software testing process using the Mercury Interactive test suite (Win Runner, Load Runner, Quick Test Pro and Test Director). Excellent skills in testing web-based applications.
  • Performance Testing Tools: Administration and installation experience with Performance Center, Quality Center, SiteScope and Test Director. Highly skilled in using Load Runner 9.2, JMeter and customized UNIX/Linux load generators written in Perl.
  • Experience in Designing Test plans, Test Cases, Test Scripts and Test Procedures.
  • Strong skills in performing System, Acceptance, Regression, Stress, Performance, Load, Functionality, Front End and Back End Testing.
  • Experience in Backend Testing of the applications by executing SQL commands.
  • Expertise in testing Performance, Load and Stress for Web and Client/Server Applications.
  • Experience with Web and Application Servers.
  • Knowledge of Object Oriented Software Development Methodology.
  • Experience in writing System test plans, defining test cases, developing and maintaining test scripts and Documenting all phases of QA process.
  • Participated in requirements analysis reviews and working sessions to understand the requirements and system design.
  • Experience in testing RDBMS database applications in ORACLE and SQL Server.
  • Experience in estimating test effort and coordinating the test schedule with the overall project schedule.
  • Experience in developing business based functional test scenarios.
  • Expertise in Problem solving and Bug Tracking Reports using Bug tracking Tools.
  • Excellent understanding of the Software Development Life Cycle and the role of QA.
  • Experience in Front-end testing, System testing, Unit testing, Integration testing, Performance testing, Stress testing, Backend testing and Regression Testing of Web based and Client/Server applications.
  • Excellent communication, presentation and interpersonal skills.
  • Contributed to the completion of all projects on time.
  • Ability to work in tight schedules and on different applications concurrently.
  • Solid analytical and troubleshooting skills.

TECHNICAL SKILLS

Testing Tools: Mercury Testing Suite (Quick Test Pro (QTP), Win Runner, Load Runner, Test Director), Rational Suite, Rational Robot, Rational ClearQuest, Rational ClearCase.

Test Management and Defect Reporting: Test Director

Java Technologies: JSP, EJB, JDBC, JavaScript and VisualAge

Web Design: HTML and FrontPage 2000

Scripting Language: TSL, UNIX shell scripting and Perl

Languages: Java, C++ and SQL

Operating Systems: Windows 95/98/2000/NT, DOS and Solaris (UNIX)

Databases: Oracle, SQL Server and MS Access

PROFESSIONAL EXPERIENCE

Sr. Performance Test Engineer

Confidential

Responsibilities:

  • Defined project scope, goals and deliverables that support business goals, in collaboration with senior management and stakeholders.
  • Applied appropriate test methodologies including writing test plans and test cases.
  • Created Test Scripts using the web (HTTP/HTML) protocol for generating Virtual load on the application under test.
  • Prioritized and reported defects using ClearQuest and generated documents and reports for further analysis.
  • Installed and Configured load generators on the Client machines.
  • Determined testing scope and, when needed, created new test cases and added them to the testing repository.
  • Responsible for formulating and defining system scope and objectives of projects based upon both user needs and an understanding of business systems and industry requirements.
  • Planned testing using established testing methodologies; performed System, Regression, Integration, Volume, Stress, and Disaster Recovery testing.
  • Preparing and adhering to time estimates (SOW) and managing the project testing schedule
  • Prepared Test Plan for each component of DA.
  • Conducted the performance tests for CMS, EMS, DTL and ACTT.
  • Worked with EMC team members in preparing the Load Runner scripts for Documentum.
  • Enhanced Load Runner Vuser scripts by Parameterization, checkpoint, and correlation to test the new builds of the application.
  • Created correlation rules, to optimize correlation
  • Monitored and Analyzed activity Report and Performance Report created using Load Runner.
  • Created Scripts for the running of various Metrics using Load Runner for performance testing
  • Handled all the testing activities of CMS from identifying the NFRs, preparing the test scripts, conducting the load test and analyzing the results.
  • Used QTP to automate and measure the client side response time.
  • Used XMetaL, XML-based authoring & content collaboration software developed by JustSystems, to create XMLs.
  • Successfully automated scripts in Load Runner to create XMLs in XMetaL and Webtop.
  • Deployed and validated test harness for each build of EMS.
  • Used SQL queries, triggers, and schemas extensively to test the application for data validation.
  • Used Perfmon, VMware to analyze server side statistics.
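
The correlation bullets above rest on one core trick: capturing a dynamic value (a session ID, a state token) from a server response between known left and right boundaries, so that later requests replay fresh data instead of the stale recorded value. A minimal Java sketch of that boundary-based extraction, analogous to what LoadRunner's `web_reg_save_param` does; the class name, sample response and boundaries are illustrative, not LoadRunner's API:

```java
public class Correlator {
    /** Returns the text between lb and rb in body, or null if either boundary is absent. */
    public static String extract(String body, String lb, String rb) {
        int start = body.indexOf(lb);
        if (start < 0) return null;
        start += lb.length();
        int end = body.indexOf(rb, start);
        if (end < 0) return null;
        return body.substring(start, end);
    }

    public static void main(String[] args) {
        // A made-up server response containing a dynamic session token.
        String response = "<input name=\"sessionId\" value=\"A1B2C3\"/>";
        // The boundaries mirror web_reg_save_param's LB= / RB= arguments.
        String token = extract(response, "value=\"", "\"/>");
        System.out.println("Correlated value: " + token);
    }
}
```

In a real VuGen script the captured value would then be substituted into subsequent requests in place of the recorded literal.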

Environment: LoadRunner 9.1, QTP, IBM Clear Quest, Java scripts, .NET, Report Server AS10G, MS Sql, Microsoft Windows

Sr Performance Engineer

Confidential, New York, NY

Responsibilities:

  • Helped in preparing the Performance Testing Test Plan.
  • Updated the scripts from the previous release to work with the current upgrade.
  • Created test scripts based on the test cases in Performance Center 9.10 (the Web version of Load Runner).
  • Set up multiple load test scenarios based on the test strategy in Performance Center 9.10.
  • Carried out performance, stress and load testing in Performance Center 9.10.
  • Used Performance Center 9.10 to design, analyze and perform the load tests.
  • Used SiteScope and the HP Diagnostics tool to monitor the Engine and Apache servers.
  • Worked with the developers in finding bottlenecks.
  • Conducted Load, stress, volume and fail over tests.
  • Involved in development of test cases from functional requirements, technical specification and use cases.
  • Involved in Complete SDLC (Software Development Life Cycle).
  • Parameterized the data values used in Load Runner scripts so that each script execution can have different values.
  • Called different external operating system functions from a VUGen script using the DLL load function.
  • Created Load/Stress scenarios for performance testing using the Load Runner Controller.
  • Created Vuser scripts in Load Runner by recording, incorporating Rendezvous Points.
  • Created Scripts for the running of various Metrics using Load Runner for performance testing.
  • Used Load Runner performance monitor to analyze the performance/stress/load condition of the application
  • Enhanced Load Runner Vuser scripts by Parameterization, checkpoint, and correlation to test the new builds of the application.
  • Monitored and Analyzed activity Report and Performance Report created using Load Runner.
  • Conducted testing on the servers using Load Runner to establish the load capacity of the server.
  • Defined requirements for large-scale Load Runner performance tests of an application server.
  • Performed Load Testing by generating Vusers using Load Runner.
  • Created and executed SQL queries to fetch data from an ORACLE database server to validate and compare expected results with those actually obtained.
  • Developed shell scripts in UNIX.
  • Performed backend testing using SQL queries and analyzed the server performance on UNIX OS.
  • Involved and responsible for creating weekly status reports regarding the progress of testing process.
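
The parameterization described above, so that "each script execution can have different values", amounts to reading rows from a data table (the .dat file) and substituting them into request templates, wrapping around when the data runs out. A hedged Java sketch of that pattern; the class, placeholder syntax and sample data are invented for illustration, not LoadRunner's API:

```java
import java.util.List;

public class Parameterizer {
    private final List<String> rows;  // stands in for the .dat file contents
    private int cursor = 0;

    public Parameterizer(List<String> rows) { this.rows = rows; }

    /** Sequential selection with wrap-around, like LR's "Sequential / Each iteration". */
    public String next() {
        String value = rows.get(cursor);
        cursor = (cursor + 1) % rows.size();
        return value;
    }

    /** Replaces a {param} placeholder in a request template with a concrete value. */
    public static String substitute(String template, String param, String value) {
        return template.replace("{" + param + "}", value);
    }

    public static void main(String[] args) {
        Parameterizer users = new Parameterizer(List.of("alice", "bob"));
        String template = "GET /login?user={username}";
        for (int i = 0; i < 3; i++) {
            // Third iteration wraps back to the first data row.
            System.out.println(substitute(template, "username", users.next()));
        }
    }
}
```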

Environment: Mercury (Load Runner, Performance Center 9.10), Oracle, SQL, MS Project, Windows & UNIX platform.

Quality Assurance Analyst

Confidential, Minneapolis, MN

Responsibilities:

  • Created Test Plans which describes the features and functions to be tested
  • Participated in end to end SDLC
  • Created and managed system testing schedule
  • Created manual and automated tests
  • Executed test cases manually to verify the expected results
  • Used Test Director/Mercury Quality Center to track, analyze and document defects
  • Involved in developing Entry & Exit criteria and defined the pass and fail standards
  • Performed Positive & Negative Testing
  • Participated in end user requirement meetings and discussed Enhancement and Modification Request issues
  • Handled change requests and coordination to the development team for bug fixes
  • Performed Integration testing, System testing and Regression testing
  • Execution of the test scenarios and scripts and review of product functionality
  • Performed End-to-End testing manually.
  • Involved in firm wide deployment and roll out of Performance Center.
  • Created Java Vusers in VuGen and configured scenarios to meet the load testing requirements in Performance Center.
  • Analyzed the performance test reports using the Load Runner 9.2 Analysis tool.
  • Set up the test lab environment for performance testing: installed and configured HP Performance Center, LR Controller, Load Runner 9.2 VUGen, DB Server, File Server, Utility Server, and Performance Center Agent; installed and configured the monitoring tools HP Diagnostics and SiteScope.
  • Effective coordination between the development and testing teams
  • Used Load Runner performance monitor to analyze the performance/stress/load condition of the application.
  • Created Scripts for the running of various Metrics using Load Runner for performance testing
  • Conducted testing on the servers using Load Runner to establish the load capacity of the server.
  • Defined requirements for large-scale Load Runner performance tests of an application server.
  • Performed Load Testing by generating Vusers using Load Runner.
  • Created automated test scripts using Load Runner.
  • Created Vuser Scripts using Load Runner by recording, incorporating Transactions, Rendezvous points, Correlation and Think Time.
  • Parameterized the data values using the .dat files in VUGen Scripts.
  • Developed the Load Test scripts using the Load Runner Virtual User Generator (VUGen) and enhanced the scripts by including transactions, parameterize the constant values and correlating the dynamic values.
  • Enhanced the script to remove the wasted time in on-line graphs in the Load Runner controller and in transaction response time graphs in Load Runner analysis.
  • Conducted Load Test for multiple users connected by TCP/IP using Load Runner
  • Inserted Shunra wrappers in the script to capture the transactions in the VE module and emulate the bandwidth of different countries.
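
A rendezvous point, used in several of the bullets above, is essentially a thread barrier: each Vuser blocks until all have arrived, then all are released together so the timed transaction starts under peak concurrency. A self-contained Java sketch of that mechanism using `CyclicBarrier`; the method name and thread model are illustrative, since real Vusers are scheduled by the Controller:

```java
import java.util.concurrent.CyclicBarrier;
import java.util.concurrent.atomic.AtomicInteger;

public class Rendezvous {
    /** Blocks `vusers` threads at a barrier, releases them together,
     *  and returns how many passed (which should equal vusers). */
    public static int release(int vusers) {
        CyclicBarrier barrier = new CyclicBarrier(vusers);
        AtomicInteger passed = new AtomicInteger();
        Thread[] pool = new Thread[vusers];
        for (int i = 0; i < vusers; i++) {
            pool[i] = new Thread(() -> {
                try {
                    barrier.await();           // analogous to lr_rendezvous(...)
                    passed.incrementAndGet();  // the timed transaction would start here
                } catch (Exception e) { throw new RuntimeException(e); }
            });
            pool[i].start();
        }
        for (Thread t : pool) {
            try { t.join(); } catch (InterruptedException e) { throw new RuntimeException(e); }
        }
        return passed.get();
    }

    public static void main(String[] args) {
        System.out.println(release(4) + " Vusers released together");
    }
}
```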

Environment: Microsoft .NET, JSP, Servlets, Mercury (WinRunner, Load Runner, Performance Center, Quality Center), Oracle, SQL, MS Project, Windows & UNIX platforms

Sr. Performance Tester

Confidential, New York, NY

Responsibilities:

  • Involved in development of test cases from functional requirements, technical specification and use cases.
  • Reviewed manual testing methods and developed and executed automated scripts.
  • Execute System, Integration, End-to-End, and User Acceptance Test (UAT) test cases for Web-based and JAVA applications
  • Used Load Runner performance monitor to analyze the performance/stress/load condition of the application.
  • Created test scripts based on the test cases in Performance Center (the Web version of Load Runner).
  • Carried out performance, stress and load testing in Performance Center.
  • Created Scripts for the running of various Metrics using Load Runner for performance testing
  • Conducted testing on the servers using Load Runner to establish the load capacity of the server.
  • Defined requirements for large-scale Load Runner performance tests of an application server.
  • Performed Load Testing by generating Vusers using Load Runner.
  • Created automated test scripts using Load Runner.
  • Created Vuser Scripts using Load Runner by recording, incorporating Transactions, Rendezvous points, Correlation and Think Time.
  • Parameterized the data values using the .dat files in VUGen Scripts.
  • Developed the Load Test scripts using the Load Runner Virtual User Generator (VUGen) and enhanced the scripts by including transactions, parameterize the constant values and correlating the dynamic values.
  • Enhanced the script to remove the wasted time in on-line graphs in the Load Runner controller and in transaction response time graphs in Load Runner analysis.
  • Conducted Load Test for multiple users connected by TCP/IP using Load Runner.
  • Used IP Spoofing to emulate realistic load using Load Runner.
  • Monitored and Analyzed activity Report and Performance Report created using Load Runner.
  • Performed Functionality, Database, Black-box, Unit, Integration, System, and Load testing in Load Runner.
  • Used Quick Test Pro to validate that links, objects, images and text on Web pages continue to function properly.
  • Installed, customized and administered Mercury Interactive Test Director, Load Runner test tools.
  • Manually tested the Decision Engine against the database using extensive SQL statements embedded in VB code.
  • Designed, tested and supported maintenance for applications written in Visual Basic and Oracle with embedded SQL commands.
  • Used SQL queries for backend testing which was residing on the Sun Solaris UNIX.
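
Analyzing the activity and performance reports mentioned above centers on one headline statistic: the 90th-percentile transaction response time, which LoadRunner Analysis reports per transaction. A nearest-rank percentile sketch in Java; the sample timings are invented for illustration:

```java
import java.util.Arrays;

public class Percentile {
    /** Nearest-rank percentile over a copy of the samples (milliseconds). */
    public static double percentile(double[] samplesMs, double p) {
        double[] sorted = samplesMs.clone();   // don't disturb the caller's data
        Arrays.sort(sorted);
        int rank = (int) Math.ceil(p / 100.0 * sorted.length);
        return sorted[Math.max(0, rank - 1)];
    }

    public static void main(String[] args) {
        // Ten made-up response times for a "checkout" transaction, in ms.
        double[] timings = {120, 95, 430, 110, 150, 105, 980, 130, 115, 140};
        System.out.println("90th percentile: " + percentile(timings, 90) + " ms");
    }
}
```

The 90th percentile is preferred over the average because a few very slow outliers (like the 980 ms sample) inflate the mean while most users see far better times.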

Environment: Java, J2EE, Servlets, JSP, Mercury (WinRunner, Load Runner, Performance Center, Test Director), WebLogic, Oracle, UNIX.

Sr. Performance Tester

Confidential, Bensenville, IL

Responsibilities:

  • Gathered and analyzed user/business requirements and developed System test plans.
  • Interacted wif developers.
  • Performed execution of test cases manually to verify the expected results.
  • The Application was developed in Java, HTML, Java Script, Servlets, JSP and Oracle as the Backend.
  • Involved in Web integration of the project where all the development was done in JSP, Java Beans etc.
  • Performed database operations through EJB with Oracle as the back end.
  • Used Quick Test Pro checkpoints to automatically capture and verify properties such as the number of links.
  • Called different external operating system functions from a VUGen script using DLL load function.
  • Created Load/Stress scenarios for performance testing using the Load Runner Controller.
  • Created Vuser scripts in Load Runner by recording, incorporating Rendezvous Points.
  • Created Scripts for the running of various Metrics using Load Runner for performance testing.
  • Enhanced Load Runner Vuser scripts by Parameterization, checkpoint, and correlation to test the new builds of the application.
  • Performed stress testing of each application to verify that the required load would have no negative performance effect. This was done by creating and executing different scripts in Load Runner.
  • Generated the test scripts using the Automated-testing tools Load Runner
  • Made rearranging actions possible by enabling the multi-protocol GUI while recording the script in Load Runner.
  • Customized Load Runner to suit the requirements of the testing effort. As a Performance Engineer, was responsible for setting up access privileges and creating user profiles.
  • Identified bottlenecks using online monitors and analyzing graphs using Load Runner.
  • Writing Test cases to test the performance of the application using Load Runner.
  • Conducted stress testing by using Load Runner.
  • Created various Scenarios to be run, used rendezvous points to create intense Virtual User load on the server, configured the scenarios before executing them in the Load Runner.
  • Enhanced Load Runner scripts to test the new builds of the application.
  • Created and Configured non-standard objects to standard objects mapping them to Object Repository of Quick Test Pro.
  • Experienced in creating the test scripts using QTP and Test Director.
  • Wrote Modification Request (MR) for bugs found during test execution using Test Director.
  • Oracle Table manipulation using SQL.
  • Used SQL queries to verify the data in the Oracle database, checked the PL/SQL packages developed as part of backend testing, and used shell scripts to facilitate batch testing in the UNIX environment.
  • Worked on MS Office to create daily reports.
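
Sizing load scenarios like the ones above, deciding how many Vusers are needed to drive a target transaction rate, is commonly done with Little's Law: concurrent users ≈ throughput × (response time + think time). The resume does not name this method; the sketch below is a standard back-of-envelope calculation with illustrative numbers:

```java
public class ScenarioSizing {
    /** Little's Law: Vusers needed = target throughput (tx/sec)
     *  times the full cycle time per user (response + think), in seconds. */
    public static int vusersNeeded(double txPerSec, double respSec, double thinkSec) {
        return (int) Math.ceil(txPerSec * (respSec + thinkSec));
    }

    public static void main(String[] args) {
        // Example: to sustain 10 tx/s when each user spends 2 s waiting on the
        // server and 8 s thinking, the scenario needs 10 * (2 + 8) = 100 Vusers.
        System.out.println(vusersNeeded(10, 2, 8) + " Vusers needed");
    }
}
```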

Environment: Java, J2EE, JSP, Servlets, Mercury (WinRunner, Load Runner, Test Director), Oracle, SQL, MS Project, Windows & UNIX platforms

QA Analyst / Performance Tester

Confidential, NY

Responsibilities:

  • Involved in gathering specifications and requirements from development personnel prior to testing.
  • Manual Testing was done to perform functional testing on the User interface
  • Performed functional, load, integration and regression testing; viewed and analyzed results; handled risk assessment, defect tracking, defect reporting and documentation.
  • Created Vuser scripts in Load Runner by recording, incorporating Transactions, Rendezvous Points and Think Time.
  • Parameterized the data values using the .dat files in VUGen Scripts.
  • Used VUGen in Load Runner to generate a sequence of script code according to the sequence of screens and user steps specified in the business process list.
  • Created Vuser Scripts using Load Runner by recording, incorporating Transactions, Rendezvous points, Correlation and Think Time.
  • Developed the Load Test scripts using the Load Runner Virtual User Generator (VUGen) and enhanced the scripts by including transactions, parameterize the constant values and correlating the dynamic values.
  • Conducted result analysis using online monitors and graphs to identify bottlenecks in the server resources using Load Runner.
  • Edited, debugged, and adjusted the script by running it within VUGen with run-time setting logs set to display all messages, and then ran it in the Controller to set the full test runtime settings.
  • Created Scripts for the running of various Metrics using Load Runner for performance testing.
  • Inserted Rendezvous points in the VUGen script to ensure that all specified Vusers begin a transaction at precisely the same time.
  • Made rearranging actions possible by enabling the multi-protocol GUI while recording the script in Load Runner.
  • Enhanced the script to remove the wasted time in on-line graphs in the Load Runner controller and in transaction response time graphs in Load Runner analysis.
  • Conducted testing on the servers using Load Runner to establish the load capacity of the server.
  • Conducted Functionality and Regression testing during the various phases of the application using Win Runner.
  • Recorded the Test cases using Automation tool Winrunner for web based application and checked the functionality of the application.
  • Used SQL to perform backend testing; involved in automated testing, including Load Testing and Regression Testing, using Win Runner and Load Runner.
  • Developed shell scripts in UNIX.
  • Created test cases, executed and recorded results of test cases using Test Director as the tool.
  • Responsible for weekly status to show the Progress of the automation testing effort.
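
The transactions incorporated into the Vuser scripts above bracket a business step with a start/end timer, the role played by `lr_start_transaction` / `lr_end_transaction`, so response times can be reported per named step. A Java sketch of that bracketing; the class name and the stub workload are illustrative:

```java
public class TransactionTimer {
    /** Runs the action, prints its elapsed time under the given name,
     *  and returns the elapsed milliseconds. */
    public static long time(String name, Runnable action) {
        long start = System.nanoTime();       // lr_start_transaction(name)
        action.run();                          // the business step under test
        long elapsedMs = (System.nanoTime() - start) / 1_000_000;
        System.out.println(name + ": " + elapsedMs + " ms");  // lr_end_transaction
        return elapsedMs;
    }

    public static void main(String[] args) {
        // Stub workload standing in for a real request; sleeps ~50 ms.
        time("login", () -> {
            try { Thread.sleep(50); } catch (InterruptedException ignored) {}
        });
    }
}
```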

Environment: Java, J2EE, Servlets, JSP, Mercury (WinRunner, Load Runner, Test Director), WebLogic, Oracle, UNIX
