
Sr. Performance Test Engineer Resume


New York, NY

SUMMARY

  • Over 8 years of experience in the IT industry with emphasis on Performance Testing and Quality Assurance.
  • Excellent skills in Manual Testing, Performance Testing and Automated Testing.
  • Generating and implementing templates for Test Plan, Test Cases, Test Scripts, Business Analysis, Gap Analysis, Test Defect Log, Test Case Checklist etc.
  • Experienced in automation of the software testing process using the Mercury Interactive test suite (WinRunner, LoadRunner, Quick Test Pro and Test Director). Excellent skills in testing web-based applications.
  • Performance Testing Tools: administration and installation experience with Performance Center, Quality Center, SiteScope and Test Director. Highly skilled in using LoadRunner 9.2, JMeter and a customized UNIX/Linux load generator written in Perl.
  • Experience in Designing Test plans, Test Cases, Test Scripts and Test Procedures.
  • Strong skills in performing System, Acceptance, Regression, Stress, Performance, Load, Functionality, Front End and Back End Testing.
  • Experience in backend testing of applications by executing SQL commands.
  • Expertise in testing Performance, Load and Stress for Web and Client/Server Applications.
  • Experience wif Web and Application Servers.
  • Knowledge of Object Oriented Software Development Methodology.
  • Experience in writing System test plans, defining test cases, developing and maintaining test scripts and Documenting all phases of QA process.
  • Participated in requirements analysis reviews and working sessions to understand the requirements and system design.
  • Experience in testing RDBMS database applications in Oracle and SQL Server.
  • Experience in estimating test effort and coordinating the test schedule with the overall project schedule.
  • Experience in developing business based functional test scenarios.
  • Expertise in Problem solving and Bug Tracking Reports using Bug tracking Tools.
  • Excellent understanding of the Software Development Life Cycle and the role of QA.
  • Experience in Front-end testing, System testing, Unit testing, Integration testing, Performance testing, Stress testing, Backend testing and Regression Testing of Web based and Client/Server applications.
  • Excellent in communication, presentation and interpersonal skills.
  • Contributed to the completion of all projects on time.
  • Ability to work under tight schedules and on different applications concurrently.
  • Solid analytical and troubleshooting skills.
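The load-generation work summarized above (LoadRunner, JMeter and a custom Perl load generator) boils down to driving concurrent virtual users against a target and collecting response times. A minimal sketch of the idea in Python, not the Perl generator itself; `run_load_test` and all its numbers are illustrative:

```python
import statistics
import threading
import time

def run_load_test(request_fn, vusers=5, iterations=10):
    """Drive request_fn concurrently from `vusers` threads and collect latencies."""
    latencies = []
    lock = threading.Lock()

    def vuser():
        for _ in range(iterations):
            start = time.perf_counter()
            request_fn()                      # the transaction under test
            elapsed = time.perf_counter() - start
            with lock:
                latencies.append(elapsed)

    threads = [threading.Thread(target=vuser) for _ in range(vusers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

    return {
        "transactions": len(latencies),
        "avg_s": statistics.mean(latencies),
        "p90_s": sorted(latencies)[int(0.9 * len(latencies))],
    }
```

Real tools add ramp-up, think time and server-side monitoring on top of this core loop.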

TECHNICAL SKILLS

Testing Tools: Mercury Testing Suite (Quick Test Pro (QTP), WinRunner, LoadRunner, Test Director), Rational Suite, Rational Robot, Rational ClearQuest, Rational ClearCase.

Test Management and Defect Reporting: Test Director

Java Technologies: JSP, EJB, JDBC, JavaScript and VisualAge

Web Design: HTML and FrontPage 2000

Scripting Language: TSL, UNIX shell scripting and Perl

Languages: Java, C++ and SQL

Operating Systems: Windows 95/98/2000/NT, DOS and Solaris (UNIX)

Databases: Oracle, SQL Server and MS Access

PROFESSIONAL EXPERIENCE

Sr. Performance Test Engineer

Confidential

Responsibilities:

  • Defined project scope, goals and deliverables that support business goals, in collaboration with senior management and stakeholders.
  • Applied appropriate test methodologies, including writing test plans and test cases.
  • Created test scripts using the web (HTTP/HTML) protocol to generate virtual load on the application under test.
  • Prioritized and reported defects using ClearQuest and generated documents and reports for further analysis.
  • Installed and configured load generators on the client machines.
  • Determined testing scope and, when needed, created new test cases and added them to the testing repository.
  • Responsible for formulating and defining system scope and project objectives based on user needs and an understanding of business systems and industry requirements.
  • Planned testing teams using testing methodologies; performed system, regression, integration, volume, stress and disaster recovery testing.
  • Prepared and adhered to time estimates (SOW) and managed the project testing schedule.
  • Prepared a test plan for each component of DA.
  • Conducted performance tests for CMS, EMS, DTL and ACTT.
  • Worked with EMC team members on preparing the LoadRunner scripts for Documentum.
  • Enhanced LoadRunner Vuser scripts with parameterization, checkpoints and correlation to test new builds of the application.
  • Created correlation rules to optimize correlation.
  • Monitored and analyzed the Activity and Performance reports created using LoadRunner.
  • Created scripts for running various metrics using LoadRunner for performance testing.
  • Handled all testing activities for CMS, from identifying the NFRs and preparing the test scripts to conducting the load tests and analyzing the results.
  • Used QTP to automate and measure the client-side response time.
  • Used XMetaL, XML-based authoring and content collaboration software developed by JustSystems, to create XMLs.
  • Successfully automated scripts in LoadRunner to create XMLs in XMetaL and Webtop.
  • Deployed and validated a test harness for each build of EMS.
  • Used SQL queries, triggers and schemas extensively to test the application for data validation.
  • Used Perfmon and VMware to analyze server-side statistics.
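The SQL-based data validation described above can be sketched with a throwaway SQLite database standing in for the Oracle/SQL Server backends; `validate_row_counts` and the table names are hypothetical stand-ins for the real checks:

```python
import sqlite3

def validate_row_counts(conn, source_table, target_table):
    """Backend check: verify a load/migration copied every row.
    Table names are trusted identifiers here, not user input."""
    cur = conn.cursor()
    src = cur.execute(f"SELECT COUNT(*) FROM {source_table}").fetchone()[0]
    tgt = cur.execute(f"SELECT COUNT(*) FROM {target_table}").fetchone()[0]
    return src == tgt, src, tgt

# Build a throwaway in-memory database to demonstrate the check.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders_src(id INTEGER, amount REAL);
    CREATE TABLE orders_tgt(id INTEGER, amount REAL);
    INSERT INTO orders_src VALUES (1, 9.99), (2, 24.50);
    INSERT INTO orders_tgt VALUES (1, 9.99), (2, 24.50);
""")
ok, src, tgt = validate_row_counts(conn, "orders_src", "orders_tgt")
```

In practice the same pattern extends to column-level comparisons of expected versus actual result sets.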

Environment: LoadRunner 9.1, QTP, IBM ClearQuest, JavaScript, .NET, Report Server AS10g, MS SQL, Microsoft Windows

Sr Performance Engineer

Confidential, New York, NY

Responsibilities:

  • Helped prepare the performance testing test plan.
  • Updated the scripts from the previous release to work with the current upgrade.
  • Created test scripts based on the test cases in Performance Center 9.10 (the web version of LoadRunner).
  • Set up multiple load test scenarios based on the test strategy in Performance Center 9.10.
  • Carried out performance, stress and load testing in Performance Center 9.10.
  • Used Performance Center 9.10 to design, analyze and perform the load tests.
  • Used SiteScope and the HP Diagnostics tool to monitor the engine and Apache servers.
  • Worked with the developers on finding bottlenecks.
  • Conducted load, stress, volume and failover tests.
  • Involved in development of test cases from functional requirements, technical specifications and use cases.
  • Involved in the complete SDLC (Software Development Life Cycle).
  • Parameterized the data values used in LoadRunner scripts so that each script execution uses different values.
  • Called external operating system functions from a VuGen script using the DLL load function.
  • Created load/stress scenarios for performance testing using the LoadRunner Controller.
  • Created Vuser scripts in LoadRunner by recording and incorporating rendezvous points.
  • Created scripts for running various metrics using LoadRunner for performance testing.
  • Used the LoadRunner performance monitor to analyze the performance/stress/load condition of the application.
  • Enhanced LoadRunner Vuser scripts with parameterization, checkpoints and correlation to test new builds of the application.
  • Monitored and analyzed the Activity and Performance reports created using LoadRunner.
  • Conducted testing on the servers using LoadRunner to establish the load capacity of the server.
  • Defined requirements for large-scale LoadRunner performance tests of an application server.
  • Performed load testing by generating Vusers using LoadRunner.
  • Created and executed SQL queries to fetch data from an Oracle database server to validate and compare expected results with those actually obtained.
  • Developed shell scripts in UNIX.
  • Performed backend testing using SQL queries and analyzed the server performance on UNIX.
  • Created weekly status reports on the progress of the testing process.

Environment: Mercury (LoadRunner, Performance Center 9.10), Oracle, SQL, MS Project, Windows and UNIX platforms

Quality Assurance Analyst

Confidential, Minneapolis, MN

Responsibilities:

  • Created test plans describing the features and functions to be tested.
  • Participated in the end-to-end SDLC.
  • Created and managed the system testing schedule.
  • Created manual and automated tests.
  • Executed test cases manually to verify the expected results.
  • Used Test Director/Mercury Quality Center to track, analyze and document defects.
  • Involved in developing entry and exit criteria and defined the pass and fail standards.
  • Performed positive and negative testing.
  • Participated in end-user requirement meetings and discussed enhancement and modification request issues.
  • Handled change requests and coordinated with the development team on bug fixes.
  • Performed integration testing, system testing and regression testing.
  • Executed the test scenarios and scripts and reviewed product functionality.
  • Performed end-to-end testing manually.
  • Involved in the firm-wide deployment and rollout of Performance Center.
  • Created Java virtual users in VuGen and configured scenarios to meet the load testing requirements in Performance Center.
  • Analyzed the performance test reports using the LoadRunner 9.2 Analysis tool.
  • Set up the test lab environment for performance testing: installed and configured HP Performance Center, the LoadRunner Controller, LoadRunner 9.2 VuGen, the DB server, file server, utility server and Performance Center agent; installed and configured the monitoring tools HP Diagnostics and SiteScope.
  • Effective coordination between the development and testing teams.
  • Used the LoadRunner performance monitor to analyze the performance/stress/load condition of the application.
  • Created scripts for running various metrics using LoadRunner for performance testing.
  • Conducted testing on the servers using LoadRunner to establish the load capacity of the server.
  • Defined requirements for large-scale LoadRunner performance tests of an application server.
  • Performed load testing by generating Vusers using LoadRunner.
  • Created automated test scripts using LoadRunner.
  • Created Vuser scripts using LoadRunner by recording and incorporating transactions, rendezvous points, correlation and think time.
  • Parameterized the data values using .dat files in VuGen scripts.
  • Developed the load test scripts using the LoadRunner Virtual User Generator (VuGen) and enhanced the scripts by including transactions, parameterizing the constant values and correlating the dynamic values.
  • Enhanced the scripts to remove wasted time in the online graphs in the LoadRunner Controller and in the transaction response time graphs in LoadRunner Analysis.
  • Conducted Load Test for multiple users connected by TCP/IP using Load Runner
  • Inserted Shunra wrappers in the script to capture the transactions in the VE module and emulate the bandwidth of different countries.
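Several bullets above mention correlation: capturing a dynamic, server-generated value from one response and feeding it into the next request, the way `web_reg_save_param` does in a VuGen script. A minimal Python sketch of the idea; the session field and pattern are illustrative:

```python
import re

def correlate(response_body, pattern):
    """Extract a dynamic value (e.g. a session ID) from a server response,
    so it can be substituted into subsequent requests."""
    match = re.search(pattern, response_body)
    return match.group(1) if match else None

# Hypothetical login response carrying a dynamic session token.
login_response = '<input type="hidden" name="session_id" value="A1B2C3">'
session_id = correlate(login_response, r'name="session_id" value="([^"]+)"')
next_request = f"/checkout?session_id={session_id}"
```

Without this step, a recorded script replays the stale value captured at record time and the server rejects it.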

Environment: Microsoft .NET, JSP, Servlets, Mercury (WinRunner, LoadRunner, Performance Center, Quality Center), Oracle, SQL, MS Project, Windows and UNIX platforms

Sr. Performance Tester

Confidential, New York, NY

Responsibilities:

  • Involved in development of test cases from functional requirements, technical specifications and use cases.
  • Reviewed manual testing methods and developed and executed automated scripts.
  • Executed system, integration, end-to-end and User Acceptance Test (UAT) test cases for web-based and Java applications.
  • Used the LoadRunner performance monitor to analyze the performance/stress/load condition of the application.
  • Created test scripts based on the test cases in Performance Center (the web version of LoadRunner).
  • Carried out performance, stress and load testing in Performance Center.
  • Created scripts for running various metrics using LoadRunner for performance testing.
  • Conducted testing on the servers using LoadRunner to establish the load capacity of the server.
  • Defined requirements for large-scale LoadRunner performance tests of an application server.
  • Performed load testing by generating Vusers using LoadRunner.
  • Created automated test scripts using LoadRunner.
  • Created Vuser scripts using LoadRunner by recording and incorporating transactions, rendezvous points, correlation and think time.
  • Parameterized the data values using .dat files in VuGen scripts.
  • Developed the load test scripts using the LoadRunner Virtual User Generator (VuGen) and enhanced the scripts by including transactions, parameterizing the constant values and correlating the dynamic values.
  • Enhanced the scripts to remove wasted time in the online graphs in the LoadRunner Controller and in the transaction response time graphs in LoadRunner Analysis.
  • Conducted Load Test for multiple users connected by TCP/IP using Load Runner.
  • Used IP spoofing to emulate realistic load using LoadRunner.
  • Monitored and analyzed the Activity and Performance reports created using LoadRunner.
  • Performed functionality, database, black-box, unit, integration, system and load testing with LoadRunner.
  • Used Quick Test Pro to validate that links, objects, images and text on web pages continue to function properly.
  • Installed, customized and administered the Mercury Interactive Test Director and LoadRunner test tools.
  • Manually tested the decision engine against the database using extensive SQL statements embedded in VB code.
  • Designed, tested and supported maintenance of applications written in Visual Basic and Oracle with embedded SQL commands.
  • Used SQL queries for backend testing of the database residing on Sun Solaris UNIX.

Environment: Java, J2EE, Servlets, JSP, Mercury (WinRunner, LoadRunner, Performance Center, Test Director), WebLogic, Oracle, UNIX.

Sr. Performance Tester

Confidential, Bensonville, IL

Responsibilities:

  • Gathered and analyzed user/business requirements and developed system test plans.
  • Interacted with developers.
  • Executed test cases manually to verify the expected results.
  • The application was developed in Java, HTML, JavaScript, Servlets and JSP, with Oracle as the backend.
  • Involved in web integration of the project, where all the development was done in JSP, Java Beans, etc.
  • Performed database operations through EJB with Oracle as the backend.
  • Used Quick Test Pro checkpoints to automatically capture and verify properties such as the number of links.
  • Called external operating system functions from a VuGen script using the DLL load function.
  • Created load/stress scenarios for performance testing using the LoadRunner Controller.
  • Created Vuser scripts in LoadRunner by recording and incorporating rendezvous points.
  • Created scripts for running various metrics using LoadRunner for performance testing.
  • Enhanced LoadRunner Vuser scripts with parameterization, checkpoints and correlation to test new builds of the application.
  • Performed stress testing of each application to verify that the required load would have no negative performance effect, by creating and executing different scripts in LoadRunner.
  • Generated test scripts using the automated testing tool LoadRunner.
  • Made rearranging actions possible by enabling the multi-protocol GUI while recording the scripts in LoadRunner.
  • Customized LoadRunner to suit the requirements of the testing effort; as a performance engineer, was responsible for setting up access privileges and creating user profiles.
  • Identified bottlenecks using online monitors and analyzed graphs using LoadRunner.
  • Wrote test cases to test the performance of the application using LoadRunner.
  • Conducted stress testing using LoadRunner.
  • Created various scenarios to be run, used rendezvous points to create intense virtual user load on the server, and configured the scenarios before executing them in LoadRunner.
  • Enhanced LoadRunner scripts to test new builds of the application.
  • Created and configured non-standard objects as standard objects by mapping them to the Object Repository of Quick Test Pro.
  • Experienced in creating test scripts using QTP and Test Director.
  • Wrote Modification Requests (MRs) for bugs found during test execution using Test Director.
  • Manipulated Oracle tables using SQL.
  • Used SQL queries to verify data from the Oracle database, checked the PL/SQL packages developed as part of backend testing, and wrote shell scripts to facilitate batch testing in the UNIX environment.
  • Worked on MS Office to create daily reports.
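The rendezvous points mentioned above hold each virtual user at a marker until every user arrives, then release them all at once to create a load spike. A sketch of the mechanism using Python's `threading.Barrier`; the vuser count and delays are illustrative:

```python
import threading
import time

NUM_VUSERS = 4
rendezvous = threading.Barrier(NUM_VUSERS)   # all vusers wait here
hit_times = []
lock = threading.Lock()

def vuser(vuser_id):
    time.sleep(0.01 * vuser_id)   # vusers naturally arrive at different times
    rendezvous.wait()             # blocks until all NUM_VUSERS have arrived
    with lock:                    # released together -> simultaneous load spike
        hit_times.append(time.perf_counter())

threads = [threading.Thread(target=vuser, args=(i,)) for i in range(NUM_VUSERS)]
for t in threads:
    t.start()
for t in threads:
    t.join()

spread = max(hit_times) - min(hit_times)   # how tightly the spike landed
```

The smaller the spread, the closer the test comes to a true concurrent hit on the server.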

Environment: Java, J2EE, JSP, Servlets, Mercury (WinRunner, LoadRunner, Test Director), Oracle, SQL, MS Project, Windows and UNIX platforms

QA Analyst / Performance Tester

Confidential, NY

Responsibilities:

  • Involved in gathering specifications and requirements from development personnel prior to testing.
  • Performed manual functional testing on the user interface.
  • Performed functional, load, integration and regression testing; viewed and analyzed results; performed risk assessment, defect tracking, defect reporting and documentation.
  • Created Vuser scripts in LoadRunner by recording and incorporating transactions, rendezvous points and think time.
  • Parameterized the data values using .dat files in VuGen scripts.
  • Used VuGen in LoadRunner to generate a sequence of script code according to the sequence of screens and user steps specified in the business process list.
  • Created Vuser scripts using LoadRunner by recording and incorporating transactions, rendezvous points, correlation and think time.
  • Developed the load test scripts using the LoadRunner Virtual User Generator (VuGen) and enhanced the scripts by including transactions, parameterizing the constant values and correlating the dynamic values.
  • Conducted result analysis using online monitors and graphs to identify bottlenecks in the server resources using LoadRunner.
  • Edited, debugged and adjusted the scripts by running them within VuGen with run-time setting logs set to display all messages, then ran them in the Controller with full test runtime settings.
  • Created scripts for running various metrics using LoadRunner for performance testing.
  • Inserted rendezvous points in VuGen scripts to ensure that all specified Vusers begin a transaction at precisely the same time.
  • Made rearranging actions possible by enabling the multi-protocol GUI while recording the scripts in LoadRunner.
  • Enhanced the scripts to remove wasted time in the online graphs in the LoadRunner Controller and in the transaction response time graphs in LoadRunner Analysis.
  • Conducted testing on the servers using LoadRunner to establish the load capacity of the server.
  • Conducted functionality and regression testing during the various phases of the application using WinRunner.
  • Recorded test cases using the automation tool WinRunner for the web-based application and checked the functionality of the application.
  • Used SQL to perform backend testing; involved in automated testing, including load testing and regression testing, using WinRunner and LoadRunner.
  • Developed shell scripts in UNIX.
  • Created test cases, then executed and recorded test case results using Test Director.
  • Responsible for weekly status reports showing the progress of the automation testing effort.
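Parameterizing from `.dat` files, as described above, replaces hard-coded values in a recorded script with rows of test data selected on each iteration. A Python sketch of the "sequential, next iteration" selection style; the file contents and URL are illustrative:

```python
import io
import itertools

# Stand-in for a VuGen .dat parameter file: a header row plus one value per line.
dat_file = io.StringIO("username\nalice\nbob\ncarol\n")

rows = [line.strip() for line in dat_file][1:]   # skip the header row
values = itertools.cycle(rows)                   # wrap around when data runs out

# Each script iteration picks the next value, as a {username} parameter would.
requests = [f"/login?user={next(values)}" for _ in range(5)]
```

This is what keeps every virtual-user iteration from logging in with the same recorded credentials.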

Environment: Java, J2EE, Servlets, JSP, Mercury (WinRunner, LoadRunner, Test Director), WebLogic, Oracle, UNIX
