
Performance Tester Resume


Dover, NC

SUMMARY

  • Six years of diversified experience in Automated, Manual, Functional, and Performance testing of Web and Client/Server applications on UNIX/Windows.
  • Experience in diversified fields of Software Quality Assurance.
  • Well acquainted with all stages of Software Development Life Cycle (SDLC) and Software Testing Life Cycle (STLC).
  • Understanding and Analyzing the User/Business Requirements and the Software Requirement Specifications (SRS).
  • Skilled in Black Box testing, Grey Box testing, Integration testing, System testing, GUI testing, Functionality testing, Backend testing, and Performance testing of both Client/Server and GUI-based applications on Windows, UNIX, and Linux platforms.
  • Expert in manual, automation, and performance testing using tools such as HP Quality Center ALM, LoadRunner 12, and Test Director.
  • Good experience executing scenarios using Performance Center.
  • Good in using Automated Functional Testing Tools like HP QTP.
  • Solid knowledge of Software quality assurance processes and procedures that includes test planning, design, development, execution and evaluation phases, set up and execution of automated testing.
  • Experience analyzing system & functional specifications, use cases, business requirements and business rules to identify test requirements, design and execute test plan, test cases and test data.
  • Experienced in being solely responsible for the creation and execution of functional test plans and test cases from start to finish, proactively seeking the information required to complete test design.
  • Excellent understanding of CMMI concepts, Software Development Life Cycle (SDLC) and QA methodologies.
  • Good understanding of and experience working in Waterfall, Spiral, and Agile software development life cycles.
  • Expert in identifying use cases, writing test plans, creating and executing test cases/ test scripts, bug/defect logging and tracking.
  • Expert in developing and maintaining Requirements Traceability Matrix (RTM) to make sure customer requirements are captured successfully.
  • Vast experience coordinating and communicating with the various teams and stakeholders involved in a project.
  • Involved in Formulating Test Strategies and Test environment.
  • Experience in working on Scrum Methodology.
  • Experience in preparing Test Plans, Test Cases, and automated tests, and in executing them.
  • Reporting and prioritizing software bugs in conjunction with the Development & QA Managers and analyzing the Test Results.
  • Executed SQL queries to verify successful data transactions and to validate data.
  • Experience in creating performance tests with: Web protocols (HTTP/HTTPS), Flex, Java applications, WebLogic, Oracle ADF applications or WebCenter, Oracle EBS/ERP, and the NCA protocol.
  • Interacted with developers, team members, and users.
  • Good Team Player, excellent communication skills, ability to learn quickly and work independently. Highly motivated and have good interpersonal skills.
  • Experience with MOTS projects.
  • Experience with IBM Rational performance testing tools (Rational Performance Tester).
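The back-end SQL validation described above can be sketched as follows. This is a minimal illustration, not code from any actual project: an in-memory SQLite database stands in for the production RDBMS, and the transactions table and its columns are hypothetical.

```python
# Sketch: validating transaction data with SQL queries.
# SQLite stands in for the real RDBMS; table/column names are made up.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE transactions (id INTEGER, status TEXT, amount REAL)")
cur.executemany(
    "INSERT INTO transactions VALUES (?, ?, ?)",
    [(1, "SUCCESS", 120.00), (2, "SUCCESS", 75.50), (3, "FAILED", 0.0)],
)

# Typical validation queries: count successful transactions, then confirm
# no "successful" row carries an impossible (zero or negative) amount.
cur.execute("SELECT COUNT(*) FROM transactions WHERE status = 'SUCCESS'")
success_count = cur.fetchone()[0]
cur.execute(
    "SELECT COUNT(*) FROM transactions WHERE status = 'SUCCESS' AND amount <= 0"
)
bad_rows = cur.fetchone()[0]
print(success_count, bad_rows)  # prints: 2 0
conn.close()
```

The same pattern applies against Oracle or SQL Server: one query establishes the expected record count, a second hunts for rows that violate a business rule.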

TECHNICAL SKILLS

Automated Testing Tools: LoadRunner 11.5/12, Performance Center 11, Silk Performer 15, WinRunner.

Test Management Tools: Quality Center, ALM, JIRA

Tools & Utilities: MS Office, Microsoft Communicator, SnagIt

Script Languages: VBScript, TSL

Programming Languages: SQL, PL/SQL, UNIX shell scripting

Web Technologies: HTML, VBScript, ASP

RDBMS: Oracle, Microsoft SQL Server, MS-Access 2000

Operating Systems: Windows NT 4.0/2000/XP/2003/7/8, UNIX

Web Application Servers: IIS and WebLogic

Environment: Microsoft .NET products (ASP.NET, VB.NET, C#), SQL Server 2000, Microsoft Visio 2000, VSS, Oracle PL/SQL, QuickTest Pro (QTP) 9.5 and higher and/or Business Process Testing (BPT), Quality Center, IBM Rational Performance Tester 8.5 and higher

PROFESSIONAL EXPERIENCE

Confidential, Dover, NC

Performance Tester

Responsibilities:

  • Effectively implemented different QA methodologies, policies, strategies, and plans in all stages of the SDLC.
  • Participated in business requirement walk through, design walk through and analyzed Business requirements.
  • Involved in preparing the SDD (System Design Document) based on the PRD (Project Requirement Document), the TRD (Technical Requirement Document), and Marketing Analysis team inputs.
  • Created the Test Plan, Test Design, and Test Scripts, and was responsible for test implementation.
  • Involved in analyzing and writing test plan in accordance with business requirements.
  • Formulated methods to perform Positive and Negative testing against requirements.
  • Performed Manual Testing of the application Front-End and Functionality. Identified the critical test scripts to be automated.
  • Performed Functional, Data Validation, Integration, System, and regression testing.
  • Analyzed and verified Functional Specification documents.
  • Reviewed and updated Test Plans, Test Scenarios.
  • Created and manually executed the Test Cases.
  • Involved in Test cases review meetings and recommended enhancements in UI functions.
  • Involved in different types of Black Box testing (Positive, Negative, Compatibility, Usability, Performance), GUI testing, Regression testing, and UAT.
  • Produced and submitted problem reports.
  • Reported bugs using Quality Center ALM bug tracking tool.
  • Worked with Rational Performance Tester to identify the presence and cause of system performance bottlenecks.
  • Examined system behavior and performance using LoadRunner 12.
  • Prepared Test Plan, Test Cases, developed multiple user Scenarios for Load and Performance Testing.
  • Created VuGen scripts, used manual and automatic correlation, parameterization techniques in generating the test scripts for LoadRunner 12.
  • Executed LoadRunner Scenarios using LoadRunner 12 to perform performance, Stress and scalability tests.
  • Used Performance Center to execute scenarios.
  • Worked with the LoadRunner 12 Controller to configure and execute performance test scenarios with multiple virtual users and virtual user scripts; managed and collected metrics from the various system monitors.
  • Used LoadRunner 12 to analyze the response times of business transactions under load, developed reports and graphs to present the stress test results to the management.
  • Defined Rendezvous point to create peak load on the server and thereby measure the server performance under load.
  • Analyzed Throughput graph, Hits per second graph, Transactions per second graph and Rendezvous graph using LoadRunner 12 Analysis tool.
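The LoadRunner concepts in the bullets above (parameterized virtual users, a ramped scenario start, and timed business transactions) can be illustrated with a small Python sketch. This is not VuGen C code and makes no LoadRunner API calls; every name in it is invented for illustration only.

```python
# Conceptual sketch of a load scenario: parameterized virtual users,
# gradual ramp-up, and per-transaction response-time measurement.
import random
import statistics
import threading
import time

# Parameterization: each virtual user draws its own data row, analogous
# to a VuGen parameter file (usernames here are made up).
PARAM_DATA = [f"user{i:03d}" for i in range(10)]

results = []
results_lock = threading.Lock()

def transaction(vuser_id: int, username: str) -> None:
    """One timed 'business transaction', akin to lr_start/end_transaction."""
    start = time.perf_counter()
    time.sleep(random.uniform(0.01, 0.05))  # stand-in for a real server call
    elapsed = time.perf_counter() - start
    with results_lock:
        results.append(elapsed)

# Ramp-up: start virtual users gradually, like a Controller schedule.
threads = []
for i, username in enumerate(PARAM_DATA):
    t = threading.Thread(target=transaction, args=(i, username))
    t.start()
    threads.append(t)
    time.sleep(0.005)  # ramp interval between user starts

for t in threads:
    t.join()

avg = statistics.mean(results)
print(f"{len(results)} transactions, avg response {avg:.3f}s")
```

In a real scenario the Controller handles the scheduling and the Analysis tool aggregates the timings into throughput, hits-per-second, and transactions-per-second graphs; the sketch only shows the shape of the measurement.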

Confidential, Dulles, VA

QA Analyst

Responsibilities:

  • Involved in performance testing of client deliveries against performance requirements using LoadRunner 12.
  • Developed LoadRunner 12 test scripts according to test specifications and requirements.
  • Wrote user-defined functions to validate and run load and performance tests successfully using VuGen in LoadRunner 12.
  • Involved in generating Vusers in LoadRunner for performance and load testing of the application under various loads.
  • Using LoadRunner, executed multi-user performance tests and used online monitors, real-time output messages, and other features of the LoadRunner 12 Controller.
  • Used LoadRunner controller to create scenarios, and analyzed the test results.
  • Used Performance Center to execute scenarios.
  • Analyzed system requirements and developed Test plans, Test cases for Functional, GUI and Regression Testing.
  • Generated test plans, test scripts, test execution and test strategy with the team.
  • Executed Manual Test Cases and verified results with Expected results.
  • Wrote Test Plan and Test Cases according to business requirement.
  • Wrote Test Cases using MS Excel.
  • Performed data-driven testing using the data driver wizard and parameterization.
  • Created and updated test cases based on new and/or updated functional or interface requirements.
  • Extracted data from an Oracle system and created MS Excel and MS Access reports.
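The extract-and-report task above can be sketched in Python. This is a hedged illustration: SQLite stands in for the Oracle source, CSV (which Excel opens directly) stands in for the report format, and all table, column, and file names are hypothetical.

```python
# Sketch: extract rows via SQL and write an Excel-readable CSV report.
# SQLite substitutes for Oracle; names are illustrative only.
import csv
import os
import sqlite3
import tempfile

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE orders (order_id INTEGER, customer TEXT, total REAL)")
cur.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(101, "Acme", 250.0), (102, "Globex", 99.9)],
)

# Extract step: pull the report rows with an ordinary SQL query.
cur.execute("SELECT order_id, customer, total FROM orders ORDER BY order_id")
rows = cur.fetchall()

# Report step: header row plus data rows, written as CSV.
report_path = os.path.join(tempfile.gettempdir(), "orders_report.csv")
with open(report_path, "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["order_id", "customer", "total"])
    writer.writerows(rows)

print(f"wrote {len(rows)} rows to {report_path}")
conn.close()
```

Against a real Oracle source the connection line would use an Oracle driver instead, but the query-then-write structure is the same.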

Confidential, Manhattan, NY

QA Analyst

Responsibilities:

  • Prepared and led JAD sessions with users and development teams, created use cases, analyzed business/functional requirements, and created test cases.
  • Created traceability matrix to ensure complete coverage of requirements through test cases.
  • Analyzed business requirements and functional design documents.
  • Developed Test Strategy and Test Plans to ensure that test cases reflect user needs for the functional, user-interface, performance, usability and security requirements.
  • Developed manual test cases for positive, negative, functional and performance testing
  • Identified testing requirements and formulated Test Cases and Test Scripts and documented them in Quality Center.
  • Executed manual test cases using Quality Center.
  • Conducted navigational testing for the web links using Quality Center to identify defects such as broken links, permanently moved pages.
  • Developed test data that meets requirements.
  • Provided weekly status reports on the effectiveness of testing programs.
  • Worked closely with the development team and business users through an agile methodology and various scrum meetings.
  • Created and executed test cases to ensure all project releases were effectively tested prior to deployment, and ensured proper tracking and reporting of defects across all modules of the project.
  • Assisted in executing the UAT phase of all projects.
