
Performance Test Engineer Resume

SUMMARY

  • 9 years of extensive hands-on experience with tools such as HP LoadRunner, HP ALM, HP Performance Center, HP UFT, HP SiteScope, HP Diagnostics and JMeter
  • 3 years of experience with CA APM
  • 2 years of experience with Selenium, Jenkins, Jira and Cucumber
  • 1.5 years of experience with Gatling
  • 2 years of experience testing AWS (Amazon Web Services)
  • Experience with Shell Scripting
  • Expertise in Analyzing Business Requirements, Design Specifications and Use Cases to prepare Test Plans, Test Cases and Test Scripts.
  • Expertise in Manual Testing, Web Based Testing and Client/Server Testing.
  • Experienced in defining Testing Methodologies, Designing Test Plans and Test Cases.
  • Experienced in Verifying and Validating Web based applications and Documentation based on standards for Software Development and effective QA implementation in all phases of the Software Development Life Cycle (SDLC).
  • Experienced in working with Waterfall & Agile development methodologies
  • Exposure to the Test Automation life cycle & Automated Script Maintenance
  • Extensive experience with automated tools such as HP Quality Center/ALM for managing Test Cases and Scenarios and for Defect tracking and reporting.
  • Knowledge in Defect Management Tools such as ALM/Quality Center.
  • Strong experience in data manipulation using SQL for the retrieval of data from the Relational database.
  • Experienced in Writing SQL for various RDBMS like Oracle, MySQL & SQL Server.
  • Experienced in Automation testing using HP UFT/Quick Test Professional (QTP).
  • Expertise in developing automated test scripts using VB Script in UFT/QuickTest Professional (QTP).
  • Proficient in protocols such as Web, Citrix, RTE, PeopleSoft, JDBC, Siebel, SAP and Web Services for performance testing using LoadRunner, JMeter and ALM Performance Center
  • Well experienced in using monitoring tools such as CA APM and HP SiteScope
  • Highly proficient in complex ‘C’ programming and VBScript
  • Extensive experience in creating Web Services scripts using LoadRunner by scanning WSDL files and recording the client
  • Experienced in using HP Quality Center/ALM for gathering requirements, planning and scheduling tests, analyzing results and managing defects and issues
  • Very proficient in interacting with Oracle, SQL Server and DB2 databases using SQL
  • Dexterous in tracking and reviewing defects using HP ALM/Quality Center
  • Well experienced in preparation of Test Plans, Test Scenarios, Test Cases and Test Data from requirements and Use Cases
  • Conversant in defining performance test strategy, performance test cases, load scripts and documenting the issues
  • Experienced in developing Performance Test Plan, executing Load Testing, analyzing the results and generating Load Testing reports using LoadRunner
  • Deft in conducting Load testing, Scenario creation and execution, measuring Throughput, Hits per second, Response time and Transaction time using LoadRunner Analysis
  • Experienced in using Monitoring Tools such as Task Manager, Process Explorer, Performance Monitor, Resource Monitor and Data Warehouse Monitor on Windows systems, JConsole to monitor Java based applications, and System Monitor and Topas on Unix systems
  • Ability to work in multi-platform environments such as Windows and Unix with a clear understanding of file systems, environment variables and file transfer protocol (FTP)
  • Effective time management skills and consistent ability to meet client deadlines
  • Excellent interpersonal skills; a self-starter with good communication, problem solving and analytical skills and leadership qualities

TECHNICAL SKILLS

Testing Tools: ALM/Quality Center, LoadRunner, Performance Center, JMeter

Operating Systems: MS-DOS, UNIX, Windows

Mainframe: MVS, CICS

Monitoring Tools: SiteScope, JConsole and MS SCOM

Languages: Java, VB.Net, VBScript, C++, SQL and PL/SQL

RDBMS: SQL Server, Oracle, Sybase and MS-Access

Web Technologies: HTML, DHTML, XML, SOAP, JSP, ASP, PHP and Java Applets

Documentation Tools: MS Word, MS Excel and Text Editor

Web Servers: Apache, WebLogic, WebSphere and IIS

PROFESSIONAL EXPERIENCE

Confidential

Performance Test Engineer

Responsibilities:

  • Responsible for setting up JMeter on AWS for cloud load testing
  • Responsible for designing, creating and executing performance benchmarks and identifying root causes of performance defects
  • Identified areas for Test Automation and provided design prototypes for the Automation suite based on tool.
  • Responsible for creating and executing Automation tests when needed using Selenium and Cucumber
  • Tracked defects using Jira
  • Involved in Test case preparation based on the business requirements.
  • Reported on Testing progress and results and defect resolution, and documented final test results to close out the Testing Engagement.
  • Executed Data mining of a large shared Test environment to set up Test scenarios.
  • Evaluated users’ needs through performing System configurations, Test planning and reporting.
  • Created and loaded Test data sets to validate system or unit functionality.
  • Performed the Integration, System, and Regression testing of Software for both Manual and Automated Test execution.
  • Developed Test Scenarios, Test Cases and Test Data and mapped each Test Case against a Requirement in HP ALM.
  • Used HP ALM as a Test planning and Defect management tool.
  • Wrote SQL scripts to validate the data in the database on the back end & Master files.
  • Scheduled VBScripts in Windows Task Scheduler to trigger stored UFT jobs at desired times.
  • Used UFT's descriptive programming approach to handle dynamic objects.
  • Integrated UFT with ALM and ran automated test scripts stored in ALM by invoking UFT in the background.
  • Reported defects out of UFT.
  • Develop Vuser scripts for Web (HTTP/HTML), Citrix, Oracle and Web Services protocols based on the user workflow
  • Performance test complex SOA based applications using the LoadRunner Web Services protocol to imitate real user activity
  • Identify real world scenarios and Day in Life performance tests
  • Perform complex Usage Pattern Analysis
  • Develop complex ‘C’ Libraries and Utility functions for Code Reusability and Modularity
  • Independently develop LoadRunner test scripts according to test specifications/requirements
  • Use LoadRunner, execute multi-user performance tests, use online monitors, real-time output messages and other features of the LoadRunner Controller
  • Perform in-depth analysis to isolate points of failure in the application
  • Involved in performing load tests using LoadRunner on Oracle applications using Citrix Client
  • Responsible for generating reports on these load testing scenarios, including documenting several factors such as User Think Time, Page Views per Second and number of virtual users for later analysis
  • Responsible for browser compatibility testing as well as HTML 4.0 compliance testing
  • Perform validations to check and make sure that the product design satisfies and fits the intended usage
  • Execute stress tests with a load of Vusers to find the breaking point of the application
  • Monitor metrics such as Response Time and Throughput and server resources such as CPU Utilization, Available Bytes and Process Bytes using LoadRunner Monitors for the IIS and WebLogic servers
  • Monitor the WebLogic server using Foglight, a performance monitoring tool from Quest Software
  • Participate in walkthroughs with the client and the development team and attend Defect reporting meetings
  • Assist in production of testing and capacity certification reports
  • Investigate and troubleshoot performance problems in a lab environment, including analysis of performance problems in a production environment
  • Interface with developers, project managers and upper management in the development, execution and reporting of test automation results

Environment: HP ALM (formerly known as Quality Center), LoadRunner, JMeter, CA APM, SiteScope, VB, HTML, XML, MS-Office, SQL, PL/SQL, Oracle, Unix and Windows, UFT (formerly known as QTP)

Confidential

Performance Test Engineer

Responsibilities:

  • Identified areas for Test Automation and provided design prototypes for the Automation suite based on tool.
  • Involved in Test case preparation based on the business requirements.
  • Reported on Testing progress and results and defect resolution, and documented final test results to close out the Testing Engagement.
  • Executed Data mining of a large shared Test environment to set up Test scenarios.
  • Evaluated users’ needs through performing System configurations, Test planning and reporting.
  • Created and loaded Test data sets to validate system or unit functionality.
  • Performed the Integration, System, and Regression testing of Software for both Manual and Automated Test execution.
  • Developed Test Scenarios, Test Cases and Test Data and mapped each Test Case against a Requirement in HP ALM.
  • Used HP ALM as a Test planning and Defect management tool.
  • Wrote SQL scripts to validate the data in the database on the back end & Master files.
  • Scheduled VBScripts in Windows Task Scheduler to trigger stored UFT jobs at desired times.
  • Used UFT's descriptive programming approach to handle dynamic objects.
  • Integrated UFT with ALM and ran automated test scripts stored in ALM by invoking UFT in the background.
  • Reported defects out of UFT.
  • Develop Vuser scripts for Web (HTTP/HTML), Citrix, Oracle and Web Services protocols based on the user workflow
  • Performance test complex SOA based applications using the LoadRunner Web Services protocol to imitate real user activity
  • Develop complex ‘C’ Libraries and Utility functions for Code Reusability and Modularity
  • Independently develop LoadRunner test scripts according to test specifications/requirements
  • Use LoadRunner, execute multi-user performance tests, use online monitors, real-time output messages and other features of the LoadRunner Controller
  • Perform in-depth analysis to isolate points of failure in the application
  • Involved in performing load tests using LoadRunner on Oracle applications using Citrix Client
  • Responsible for generating reports on these load testing scenarios, including documenting several factors such as User Think Time, Page Views per Second and number of virtual users for later analysis
  • Perform validations to check and make sure that the product design satisfies and fits the intended usage
  • Execute stress tests with a load of Vusers to find the breaking point of the application
  • Monitor metrics such as Response Time and Throughput and server resources such as CPU Utilization, Available Bytes and Process Bytes using LoadRunner Monitors for the IIS and WebLogic servers
  • Monitor the WebLogic server using Foglight, a performance monitoring tool from Quest Software
  • Investigate and troubleshoot performance problems in a lab environment, including analysis of performance problems in a production environment
  • Interface with developers, project managers and upper management in the development, execution and reporting of test automation results

Environment: HP ALM (formerly known as Quality Center), LoadRunner, JMeter, CA APM, SiteScope, VB, HTML, XML, MS-Office, SQL, PL/SQL, Oracle, Unix and Windows, UFT (formerly known as QTP)
