
Performance Test Engineer Resume


Washington, DC

OBJECTIVE:

Results-oriented Performance Engineer / QA Analyst who is creative in identifying problems, expert in analysis and performance tuning, and effective in implementing solutions that meet the current and future expectations of the business.

SUMMARY:

  • Over 6 years of experience in monitoring, analyzing and recommending solutions to performance problems in high-traffic, large-scale distributed systems and client-server architectures.
  • Experienced in Manual and Automation Testing using HP Quality Center, HP LoadRunner, Performance Center and JMeter.
  • Experienced in the E-commerce, Financial, Healthcare and Telecom domains.
  • Strong working Knowledge in ASP, VB, XML, Java, J2EE, .Net.
  • Experienced in Ad-hoc UAT, Compatibility, Load, Stress, Performance and Scalability Testing.
  • Experienced in working in V-Model, waterfall and Agile/Scrum methodologies.
  • Good working Knowledge in creating Test plan, Test cases, Test scripts, Test metrics, Test strategies, Test Configuration and Change Management Disciplines.
  • Good knowledge of test coverage, test traceability matrices and defect reporting in Quality Center.
  • Expertise in bug tracking systems and processes using Quality Center and TestDirector.
  • Experienced in testing PL/SQL code for conversion to DataStage.
  • Extensively experienced with Toad, PL/SQL Developer, Oracle and SQL Server.
  • Experienced in analyzing the Performance of the application using LoadRunner with different Virtual users.
  • Experienced in Load testing, Scenario creation and execution, Measured Throughput, Hits per second, Response time, and Transaction time using LoadRunner Analysis.
  • Experienced in using JMeter for database back-end testing with JDBC & ODBC connections.
  • Extensively experienced in creating and executing batch files & UNIX shell scripts.
  • Expert in scheduling jobs on both Windows & UNIX systems through Task Scheduler & crontab.
  • Experienced in performing web testing and web services testing.
  • Strong Interpersonal and Analytical skills.
  • Team player and have good verbal and written communication skills.
  • In-depth knowledge of analyzing systems and evaluating system performance and responses.
  • Experienced in setting up monitoring and benchmarking tools in the performance lab and generating reports regularly.
  • Experienced in working across entire business teams to collect non-functional requirements, formulate scalable test strategies, and enforce performance testing.
  • Experienced in creating, coordinating and managing the performance test environment (PTE) with necessary testing tools and ensure cross leveraging of the PTE across the applications needs.
  • Experienced in setting up and executing performance and scalability tests with the required number of concurrent users, and in profiling and fixing problems.
  • Extensively experienced in LoadRunner automation with scheduled batch files; expert in LoadRunner Analysis with custom templates.
  • Experienced in load testing of LDAP, FTP and SOAP services using JMeter.
  • Expert in the Windows typeperf & Perfmon utilities to create custom config files, collect Windows resource statistics remotely and generate reports with PAL.
  • Experienced with monitoring tools such as Task Manager, Process Explorer, Performance Monitor, Resource Monitor and Data Warehouse Monitor on Windows, JConsole to monitor Java-based applications, and System Monitor and topas on UNIX.
  • Experienced in collecting and analyzing database performance using SQL Profiler, Activity Monitor and Dynamic Management Views (DMVs) in MS SQL Server, and the Statspack and TKPROF utilities in Oracle.
  • Extensively experienced with the MS Office suite to create and analyze reports and graphs.
  • Experienced in using MS SharePoint for business collaboration.
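
The resource-statistics collection and cron scheduling described in the summary can be sketched as a small shell script driven by a crontab entry. The sampled metrics, file paths and schedule below are illustrative assumptions, not details from this resume.

```shell
#!/bin/sh
# Minimal sketch of a periodic resource-sampling job (assumed metrics/paths).
OUT=/tmp/perf_stats.csv

# Write a header on the first run only.
[ -f "$OUT" ] || echo "timestamp,load_1min,free_kb" > "$OUT"

TS=$(date '+%Y-%m-%d %H:%M:%S')
# 1-minute load average, parsed from uptime(1).
LOAD=$(uptime | awk -F'load average[s]*: ' '{print $2}' | cut -d, -f1)
# Free space on /tmp in KB -- a stand-in metric; sar/vmstat would be used in practice.
FREE=$(df -k /tmp | awk 'NR==2 {print $4}')

echo "$TS,$LOAD,$FREE" >> "$OUT"

# A crontab entry to take a sample every 5 minutes might look like:
# */5 * * * * /usr/local/bin/perf_stats.sh
```

On Windows the same idea maps to typeperf with a counter config file, scheduled through Task Scheduler.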

TECHNICAL SKILLS:

Platforms: Windows, UNIX, Linux and Mainframe

Testing Tools: HP LoadRunner, HP Performance Center, HP Business Availability Center, HP Quality Center, JMeter

Defect Tracking and Bug Reporting Tools: Quality Center, IBM ClearQuest.

Databases: Oracle and SQL Server, DB2

Web Application Servers: WebLogic, WebSphere, and ColdFusion

Enterprise Monitoring: Dynatrace

PROFESSIONAL EXPERIENCE:

Confidential, Washington DC

Performance Test Engineer

Responsibilities:

  • Created and presented a Performance Planning Strategy that outlines Current Production Workload distribution, Performance Requirements, Test/Production Environment Requirements and Performance Optimization results.
  • Analyzed System Requirement Specifications and developed test plans and test cases to cover all the requirements.
  • Participated in Design Reviews, and Estimations of the project.
  • Responsible for developing Performance Test strategy.
  • Debugged, identified and fixed script errors by replaying scripts in VuGen.
  • Involved in all phases of testing - Smoke Testing, Endurance testing, System testing, Integration testing, GUI testing, Regression Testing, Performance Testing and UAT testing.
  • Performed Smoke, User Acceptance, Compatibility, Load, Stress, Performance, and Scalability Testing.
  • Designed and Developed Load Test Scripts with VuGen, executed Test Scenarios in Load Runner controller and Analyzed the results with LoadRunner Analysis tool.
  • Created monitors and reports using SiteScope by determining the type of system environment (Web/App or DB servers) and analyzing CPU utilization, memory, network activity, processes, connection attempts/sec and Get requests/sec.
  • Reported bugs and sent e-mail notifications to the developers using HP Quality Center.
  • Used HP Quality Center for defect tracking in different environments.
  • Worked with business analyst to define Performance requirement (business transactions, user profile, and test data).
  • Installed and configured LoadRunner.
  • Used LoadRunner to Design, develop, and calibrate tests so that workload being executed is accurate.
  • Analyzed key scenarios to identify crucial functional areas of the application, and created and executed LoadRunner test scripts against those areas for stress testing.
  • Responsible for the creation and implementation of the Performance Testing Framework using LoadRunner.
  • Responsible for monitoring application using Dynatrace.
  • Responsible for coordinating with other cross functional groups.
  • Prepared Performance Test Results for upper management.
  • Provided Test Matrix to management when each project completed.
  • Developed and conducted system and user acceptance test plans on completion of system testing before installation of the application on user environment.
  • Involved in Metrics collection for Iterations and Updating Release Notes.
  • Collected status reports from teammates, consolidating and updating them through Remedy.
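
The throughput and response-time summaries mentioned above can also be approximated outside the Analysis tool with standard UNIX text utilities. The CSV layout and transaction names here are hypothetical sample data, not LoadRunner's actual raw-results format.

```shell
#!/bin/sh
# Sketch: per-transaction average and a simplified 90th-percentile response time.
RESULTS=/tmp/results.csv
cat > "$RESULTS" <<'EOF'
transaction,elapsed_ms
login,120
login,95
search,310
search,280
login,150
EOF

# Sort each transaction's samples ascending, then reduce with awk.
tail -n +2 "$RESULTS" | sort -t, -k1,1 -k2,2n | awk -F, '
  { n[$1]++; sum[$1] += $2; vals[$1 "," n[$1]] = $2 }
  END {
    for (t in n) {
      p = int(0.9 * n[t]); if (p < 1) p = 1   # crude percentile index
      printf "%s avg=%.1fms p90=%sms\n", t, sum[t] / n[t], vals[t "," p]
    }
  }' | tee /tmp/summary.txt
```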

Environment: LoadRunner, Quality Center, ClearQuest, Performance Center, .Net, Java, Oracle, Windows, Internet Explorer, DB2 and MS Office.

Confidential, Issaquah, WA

Performance Analyst

Responsibilities:

  • Created and presented a Performance Planning Strategy that outlines Current Production Workload distribution, Performance Requirements, Test/Production Environment Requirements and Performance Optimization results.
  • Analyzed System Requirement Specifications and developed test plans and test cases to cover all the requirements.
  • Participated in Design Reviews, and Estimations of the project.
  • Performed Smoke, User Acceptance, Compatibility, Load, Stress, Performance, and Scalability Testing.
  • Reported bugs and sent e-mail notifications to the developers using IBM ClearQuest.
  • Used IBM ClearQuest for defect tracking in different environments.
  • Wrote SQL queries for checking the Data Transactions and Database Integrity in both SQL Server and Oracle.
  • Worked with business analyst to define Performance requirement (business transactions, user profile, and test data).
  • Responsible for developing Performance Test strategy.
  • Installed and configured LoadRunner.
  • Used LoadRunner to Design, develop, and calibrate tests so that workload being executed is accurate.
  • Captured results gathered during test execution and worked with the Oracle on Demand (OOD) team to analyze and draw conclusions from these results using LoadRunner.
  • Analyzed key scenarios to identify crucial functional areas of the application, and created and executed LoadRunner test scripts against those areas for stress testing.
  • Responsible for the creation and implementation of the Performance Testing Framework using LoadRunner.
  • Responsible for monitoring application using Dynatrace.
  • Responsible for coordinating with other cross functional groups.
  • Prepared Performance Test Results for upper management.
  • Provided Test Matrix to management when each project completed.
  • Developed and conducted system and user acceptance test plans on completion of system testing before installation of the application on user environment.
  • Involved in Metrics collection for Iterations and Updating Release Notes.
  • Collected status reports from teammates, consolidating and updating them through Remedy.
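
The SQL integrity checks mentioned above typically hunt for orphaned or mismatched rows between related tables. A minimal sketch, with hypothetical table names and sqlite3 standing in for the Oracle / SQL Server sessions used on the job:

```shell
#!/bin/sh
# Assumed schema: payments.order_id should always reference an existing order.
DB=/tmp/integrity_demo.db
rm -f "$DB"

sqlite3 "$DB" <<'SQL'
CREATE TABLE orders   (order_id INTEGER PRIMARY KEY, cust_id INTEGER);
CREATE TABLE payments (pay_id   INTEGER PRIMARY KEY, order_id INTEGER);
INSERT INTO orders   VALUES (1, 10), (2, 11);
INSERT INTO payments VALUES (100, 1), (101, 2), (102, 99); -- 99 has no order
SQL

# Count orphaned payments; a non-zero result flags an integrity defect.
sqlite3 "$DB" "SELECT COUNT(*) FROM payments p
               LEFT JOIN orders o ON o.order_id = p.order_id
               WHERE o.order_id IS NULL;"
```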

Environment: LoadRunner, Quality Center, ClearQuest, Performance Center, .Net, Java, Oracle, Windows, Internet Explorer, DB2 and MS Office.

Confidential, Washington, DC

Performance Analyst

Responsibilities:

  • Worked with the Development, Application & Infrastructure Architecture and Technical Services teams to ensure that changes in technical or business systems do not adversely impact Application Performance.
  • Analyzed System Requirement Specifications and developed test plans and test cases to cover all the requirements.
  • Participated in Design Reviews, and Estimations of the project.
  • Performed Smoke, User Acceptance, Compatibility, Load, Stress, Performance, and Scalability Testing.
  • Worked extensively with Quality Center for creating Test Plans, Test Cases, Test Design, Test Inputs, Test Logs, and Test Summary Reports.
  • Reported bugs and sent e-mail notifications to the developers using Quality Center.
  • Used Quality Center for defect tracking in different environments.
  • Wrote SQL queries for checking the Data Transactions and Database Integrity in both SQL Server and Oracle.
  • Worked with business analyst to define Performance requirement (business transactions, user profile, and test data).
  • Responsible for developing Performance Test strategy.
  • Installed and configured LoadRunner in an NCA environment.
  • Used LoadRunner to Design, develop, and calibrate tests so that workload being executed is accurate.
  • Worked with the Oracle on Demand (OOD) team to optimize the availability of server resources and improve server scalability by balancing traffic among servers using JMeter.
  • Captured results gathered during test execution and worked with OOD to analyze and draw conclusions from these results using LoadRunner.
  • Analyzed key scenarios to identify crucial functional areas of the application, and created and executed LoadRunner test scripts against those areas for stress testing.
  • Responsible for the creation and implementation of the Performance Testing Framework using LoadRunner.
  • Prepared Performance Test Results for upper management.
  • Provided Test Matrix to management when each project completed.
  • Tested the application compatibility in all versions of Netscape and Internet Explorer browsers by automating the test cases.
  • Developed and conducted system and user acceptance test plans on completion of system testing before installation of the application on user environment.
  • Involved in Metrics collection for Iterations and Updating Release Notes.
  • Collected status reports from teammates, consolidating and updating them through Remedy.

Environment: LoadRunner, JMeter, Quality Center, Performance Center, .Net, Java, Oracle, Windows, VB Scripts, Internet Explorer, SQL Server and MS Office.

Confidential, Silver Spring, MD

Performance Tester

Responsibilities:

  • Worked with Business Analysts, Programmers and Business users through the life cycle of the project.
  • Involved in developing Test plan according to the design specifications and requirements.
  • Involved in planning the test strategy and test resource allocation.
  • Involved in preparing Traceability Matrix, Test Status Report, Test Execution Report, Weekly Summary Report and setting up Test Data.
  • Used HP Quality Center to design Test Documents, including Test Plan, Test Requirements, Test Cases and Test Procedures.
  • Wrote test cases to test the application manually in Quality Center and automated using QuickTest Pro.
  • Reported and reviewed the defects with development team using Quality Center.
  • Responsible in consolidating the test results sent by the testers and updating the Quality Center.
  • Performed Load and Stress tests on Oracle with a 500 concurrent-user load.
  • Conducted Performance benchmarking for Oracle Configuration.
  • Used JMeter for database back-end testing with JDBC & ODBC connections.
  • Involved in using monitoring tools such as Task Manager, Process Explorer, Performance Monitor, Resource Monitor and Data Warehouse Monitor on Windows, JConsole to monitor Java-based applications, and System Monitor and topas on UNIX.
  • Generated Reports, Graphs, Summary data, and Collating execution results to help analyze the performance of the systems using LoadRunner Analysis.
  • Constructed UAT packages that encompassed multiple SIT drops and assisted Business users during UAT testing.
  • Involved with version control, configuration management and change control procedures.
  • Prepared exchange testing reports and issue lists; interacted with the customer regarding bug reports.
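
The OS-level monitoring listed above (Task Manager and Performance Monitor on Windows, System Monitor and topas on UNIX) has simple command-line counterparts. A sketch using standard Linux tools; intervals, sample counts and output paths are illustrative.

```shell
#!/bin/sh
# Three 1-second snapshots of CPU, memory and I/O activity.
vmstat 1 3 > /tmp/vmstat_sample.txt

# Top five memory consumers, akin to a Task Manager / Process Explorer view.
ps -eo pid,comm,rss --sort=-rss | head -6
```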

Environment: Quality Center, LoadRunner, QTP, MS-Office, ASP, Java, J2EE, T-SQL, SQL Server, Toad, Toad for Data Analyst 5, Informatica PowerCenter Workflow Monitor, Linux, Windows, JMeter.

Confidential, Chevy Chase, MD

Performance Tester

Responsibilities:

  • Created and presented a Performance Planning Strategy that outlines Current Production Workload distribution, Performance Requirements and Test/Production Environment Requirements and Performance Optimization results.
  • Participated in the SCRUM meetings involving the Development team and product Management team and worked towards analyzing, understanding and finalizing the product stories and making them part of the sprint backlog.
  • Performed GUI, Functional, System, Integration, Regression, UAT, Back end Testing.
  • Monitored project plan execution and project metrics reporting.
  • Ensured the deliverables are on par with client standards.
  • Prepared Test data and carried out testing and execution.
  • Worked with development teams to investigate and correct software bugs and deficiencies based on testing results using TestDirector.
  • Performed data validation using SQL queries; extensively wrote SQL queries to perform back-end testing.
  • Developed performance scripts using LoadRunner (data parameterization and correlation) with the NCA and Oracle Web protocols.
  • Enhanced scripts by adding checkpoints, parameterization and correlation using LoadRunner.
  • Involved in creating and executing batch files & UNIX shell scripts.
  • Involved in scheduling jobs on both Windows & UNIX systems through Task Scheduler & crontab.
  • Prepared status summary reports with details of executed, passed and failed test cases.
  • Sent the reports to all team members with Daily Status to track the updates in testing.
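
The batch-file and cron scheduling in the bullets above boils down to entries like the following; the paths, times and task name are hypothetical.

```shell
# UNIX: crontab entry (installed via `crontab -e`) launching a nightly run at 2:00 AM.
# 0 2 * * * /opt/perf/run_load_test.sh >> /var/log/load_test.log 2>&1

# Windows: the Task Scheduler equivalent via schtasks.
# schtasks /Create /SC DAILY /ST 02:00 /TN "NightlyLoadTest" /TR "C:\perf\run_load_test.bat"
```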

Environment: LoadRunner, Windows, Java, HTML, SQL, PL/SQL, Rational Requisite Pro, TestDirector, QTP, UNIX, Oracle, MS Word/Excel.
