
Performance Engineer Resume



  • 14+ years of extensive experience in regression, performance, load/stress, and customer field testing
  • Hands-on expertise in test documentation, manual and automation testing, DevTest, and execution on client/server, integrated intranet, UNIX, Linux, mainframe, and Internet applications
  • Hands-on experience with tools such as Performance Center, QTP, Test Director, Quality Center, Rally, JIRA, and ClearQuest
  • Used various monitoring tools such as HP SiteScope
  • Strong theoretical and practical experience with software development life cycle (SDLC) approaches (Agile, Waterfall, Iterative) and their implementation in large organizations across projects, with continuous improvement over releases
  • Worked closely with developers and business customers to understand business requirements and overall strategies; provided key inputs to requirements document preparation
  • Prepared high-level and detailed test strategy documents, development support plans, test reports for beta customers, and final GA release documents for customers and internal service teams to support field deployment
  • Solid scripting experience in various VuGen protocols such as Web Services, WinSock, Web HTTP/HTML, Java, RTE, and CRM applications
  • Expertise in manual and automated correlation to parameterize dynamically changing values
  • Experience in performance testing of .NET- and Java-based applications
  • Experience in infrastructure testing for enterprise-wide applications, using HP RUM, HP Diagnostics, and Wily Introscope to track test performance and identify bottlenecks
  • Hands-on experience in all phases of the software development life cycle (SDLC), from inception and requirements through design, construction, and test execution
  • Extensive experience with different test types, including baseline, benchmark, load, stress, and endurance testing of applications in complex environments
  • Proficient in testing application compatibility on UNIX and Windows platforms
  • Strong process and documentation skills for performance testing/engineering
  • Monitored system resources such as CPU usage, memory utilization, vmstat, and iostat output
  • Collected JVM heap usage and garbage collection frequency on WebSphere and WebLogic servers during tests
  • Good knowledge of database performance issues; experienced with SQL queries on DB2 and SQL Server
  • Strong analytical, interpersonal, and problem-solving skills
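
The correlation work described above (capturing a dynamically changing value from one response and feeding it into the next request, as VuGen's web_reg_save_param does with left/right boundaries) can be sketched outside LoadRunner. This is an illustrative Python sketch, not VuGen code; the token name, field names, and URL are hypothetical:

```python
import re

def correlate(response_body: str, pattern: str) -> str:
    """Extract a dynamic value from a server response, like web_reg_save_param
    with its left/right boundaries turned into a regex capture group."""
    match = re.search(pattern, response_body)
    if match is None:
        raise ValueError("correlation failed: boundary not found in response")
    return match.group(1)

# Hypothetical first response containing a server-generated token.
login_response = '<input type="hidden" name="csrf_token" value="a1b2c3d4">'

# Capture the token (the manual-correlation step)...
token = correlate(login_response, r'name="csrf_token" value="([^"]+)"')

# ...and parameterize the follow-up request with it.
next_request = {"url": "/account/summary", "params": {"csrf_token": token}}
print(token)  # a1b2c3d4
```

If the boundary is not found, the script fails fast instead of replaying a stale value, which is the usual cause of load-test logins silently failing.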


Automated Test Tools: LoadRunner 11.0/9.5.x/8.1/7.x, ALM Performance Center 11.0, WinRunner 7.x/6.5, Quality Center, Test Director 8/7.x/6, QuickTest Professional, TOAD, Shell, SoapUI, Rally, JIRA

Management Tools: Microsoft Project, Visio, MS Excel, MS Word, MS PowerPoint

Monitoring Tools: HP SiteScope, HP Diagnostics, Perfmon, WebLogic, Wireshark, Ethereal

Languages: Java, C#, TCL, C++, PL/SQL, Shell

Configuration Mgmt Tools: ClearCase, Visual Source Safe, JIRA, Rally, QC

RDBMS: Oracle 11g, Oracle 8i/7.x, ODBC, Sybase, MS SQL Server 2000/6.x, MS Access 2000/9.x, DB2

Front-End: JSP, Visual Studio 10

Internet: HTML, DHTML, XHTML, VBScript, Java Script, CSS

Others: MS-Office, MS-Excel, OOPS, XML, J2EE, MQ Series, SPSS, IBM iSeries

Operating Systems: UNIX, MS-DOS 6.22, Windows XP/2000/98/95/3.1, Windows 7, Windows NT 4.0/3.51, Windows Server 2008, Linux



Sr. Technical Lead


  • Can adapt to new domains and technologies by quickly learning the product from a user's perspective; proven ability to test a product in a short span of time, with multiple recognitions as a quick learner across products in various domains
  • Worked across various domains (telecom, banking, retail), including UI testing, large corporate web portals with large databases, migration and upgrade paths across releases, and customer data-driven testing
  • Participated in meetings with developers, architecture engineers, and the project team to understand impacted operations
  • Participated in and implemented agile testing practices for widely distributed teams
  • Prepared and documented the scope, needs, and process of performance tests by analyzing comprehensive non-functional, functional, and business requirement documents
  • Prepared performance test plans and test cases by analyzing requirements, and stored them in Quality Center
  • Created complex HP LoadRunner scripts, updated existing scripts, executed different sets of load test scenarios, monitored resources (hardware, network), and provided detailed load test reports
  • Prepared test strategy documents and load models based on requirements
  • Worked closely with the environment management and Performance Center operations teams to maintain the HP LoadRunner and Performance Center environments
  • Coordinated with different teams to understand the current hardware, software, and network configuration when changes were made in a release, and documented the newly impacted areas
  • Worked with capacity/network engineers, senior architects, applied engineering teams, and third parties on the capacity analysis process
  • Worked and coordinated with production support, Wily monitoring, data modeling, DBA, application development, and deployment teams to ensure the performance test environment was working and the correct code had been deployed
  • Prepared test estimates, test summary reports, testing status statistics, and comparison charts using established formats, and performed reviews
  • Organized and presented performance test result analysis and documented it in the appropriate location
  • Performed other performance/load test tasks (e.g., WAN emulation testing using IBM OPNET) assigned by production support when performance-related issues occurred
  • Ensure that all the test cases are updated in the Quality Center along with Master test plan.
  • Retested critical defects, including critical fixes, and coordinated with developers on defect-fix releases under tight timelines
  • Tracked, reviewed, analyzed, and compared defects using Quality Center
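
A load model of the kind mentioned above is usually derived from a target throughput. A minimal sketch of the standard Little's Law arithmetic (virtual users = target TPS x average iteration time); all numbers below are hypothetical:

```python
import math

def required_vusers(target_tps: float, response_time_s: float,
                    think_time_s: float) -> int:
    """Little's Law: concurrency = throughput x time in system.
    Each scripted iteration occupies one virtual user for
    (response time + think time) seconds."""
    iteration_time_s = response_time_s + think_time_s
    return math.ceil(target_tps * iteration_time_s)

# Hypothetical requirement: 50 transactions/sec with a 2 s average
# response time and 8 s of think time per iteration.
print(required_vusers(50, 2.0, 8.0))  # 500
```

The same relation is what a Controller scenario's pacing settings encode: raising think time or pacing at a fixed user count lowers the achieved TPS proportionally.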


Sr. Technical Lead


  • Wrote the performance test plan and test case design document with input from developers and functional testers.
  • Extensively used HP tools to script and customize the performance test harness using the Web protocol.
  • Extensively used the Controller to generate load and define load generators.
  • Used test results to provide summary reports, response times, and monitor averages.
  • Worked with the business team to gather performance requirements for load testing, stress testing, and capacity planning.
  • Extensively used features such as parameterization and correlation, and configured monitors for WebSphere, MQ Series, and the database.
  • Responsible for requirements analysis, design, debugging, execution, and report generation for the existing legacy system and the new Panama application.
  • Responsible for creating a baseline and executing performance and endurance testing.
  • Measured transaction response times of business-critical transactions in web-based applications.
  • Responsible for creating automated performance scripts for load testing with HP LoadRunner to test JSP and HTML pages, including the web server.
  • Installed HP LoadRunner and Virtual User Generator on desktops; installed and configured SiteScope.
  • Developed test harnesses in VuGen; customized test scripts with correlation, parameterization, and run-time settings.
  • Coordinated with application owners and system administrators to communicate bottlenecks and fine-tune the application.
  • Worked with vendor teams to identify bottlenecks and performed regression testing to achieve the pre-spin-off results.
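
Transaction response times like those measured above are conventionally summarized by the 90th percentile, as in a LoadRunner Analysis summary. A minimal sketch of that calculation using the nearest-rank method (the sample values are hypothetical):

```python
import math

def percentile(samples, pct):
    """Return the pct-th percentile of samples using the nearest-rank method,
    i.e. the value below which at least pct% of observations fall."""
    if not samples:
        raise ValueError("no samples")
    ordered = sorted(samples)
    rank = math.ceil(pct / 100 * len(ordered))  # 1-based nearest rank
    return ordered[rank - 1]

# Hypothetical response times (seconds) for one business transaction.
times = [0.8, 1.1, 0.9, 2.4, 1.0, 1.2, 0.7, 3.1, 1.3, 0.9]
print(percentile(times, 90))  # 2.4
```

The 90th percentile is preferred over the mean because a handful of slow outliers (the 3.1 s sample here) inflates the average while most users saw sub-1.5 s responses.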

Environment: HP LoadRunner, SoapUI, Java, .NET, Oracle 9i, AS Forms Server, Report Server AS10g, Developer 6i, Oracle Database, Oracle JInitiator, Cisco F5 SSL, Windows XP, JDK, SQL Navigator, PIMS, IBM SOA


Sr. Software Engineer


  • Identified the test scenario, test suite, test plan and test cases.
  • Identified the maximum set of automatable test scenarios to reduce regression testing time for the Java/JSP application.
  • Responsible for writing the system test plan, test cases, and test scripts for both manual and automated testing using WinRunner.
  • Created a performance test service and performance analysis metrics for the SOA infrastructure to provide SLAs (Service Level Agreements) to consumers.
  • Responsible for setting up configuration management and defect management tools (Rational) for a complete history of versions and defect tracking.
  • Responsible for mentoring different teams and training them on HP LoadRunner, WinRunner, and Test Director.
  • Resolved stability and performance issues in the environment and improved scalability from 50 concurrent users to 1,500.
  • Managed resources and the performance testing process (load, stress, volume, endurance, and failover) using HP LoadRunner (Controller, Virtual User Generator, Analysis) with the Web and Web Services protocols.
  • Extensively worked on UNIX to change database connections, trace logs, monitor machine resources, create users, and execute batch jobs.
  • Coordinated with the tools team to install HP Diagnostics, Wily Introscope, and SiteScope in the performance environments for triage calls to identify bottlenecks.
  • Managed near-shore and off-shore teams to develop test harnesses, execute performance scenarios during nights and weekends, and generate reports.
  • Presented performance test results, along with the project management team, to the client, mainly senior management.
  • Worked with the project management team to schedule testing activities and resource allocation for the TST space.
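
The UNIX-side resource monitoring described above often comes down to parsing vmstat/iostat samples collected during a run. A hedged sketch: the column layout below assumes the typical Linux vmstat format, and the sample lines are hypothetical:

```python
def cpu_idle_from_vmstat(lines):
    """Parse the 'id' (CPU idle %) column from vmstat output and return
    the average idle percentage across the data samples."""
    idle_col = None
    idles = []
    for line in lines:
        cols = line.split()
        if "id" in cols:                      # column-name header row
            idle_col = cols.index("id")
        elif idle_col is not None and cols and cols[0].isdigit():
            idles.append(int(cols[idle_col]))  # data sample row
    return sum(idles) / len(idles)

sample = [
    "procs -----------memory---------- ---swap-- -----io---- -system-- ------cpu-----",
    " r  b   swpd   free   buff  cache   si   so    bi    bo   in   cs us sy id wa st",
    " 1  0      0 812344  10240 204800    0    0     5    12  110  220 40 10 45  5  0",
    " 2  0      0 810112  10240 204900    0    0     6    15  130  250 60 15 20  5  0",
]
print(cpu_idle_from_vmstat(sample))  # 32.5
```

A sustained idle percentage near zero during the steady state of a load test is the usual first flag for a CPU bottleneck on that host.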

Environment: TOAD, Oracle, Mainframe, MQ Series, UNIX, IBM SOA, HTML, DHTML, XML, QTP, IIS, Apache, Quality Center, Agile


Performance Engineer


  • Reviewed and analyzed Business Requirement and Software Specification Requirements
  • Developed clear and concise documentation regarding requirements management plans, functional requirements, supplemental requirements, test plans and test cases.
  • Analyzed the communications, informational, database, and programming requirements of clients; planned, developed, tested, and implemented software programs for engineering applications and highly sophisticated systems.
  • Extensively used Quality Center to document requirements, write and execute test cases, report defects, and generate testing reports
  • Created scripts using various protocols such as HTTP/HTML, WinSock, and Web Services in HP LoadRunner.
  • Conducted IP spoofing using the LoadRunner Controller
  • Conducted resiliency tests for all projects and was responsible for test case preparation.
  • Used rendezvous points to detect database deadlocks
  • Used Performance Center to manage LoadRunner scripts, scenarios, and test documentation.
  • Used the LoadRunner online monitors to identify possible bottlenecks in the application.
  • Involved in reporting and tracking the defects using Quality Center.
  • Conducted result analysis and interacted with Development and Architecture teams.
  • Extensively used Oracle SQL to test integrity of data by querying the database
  • Reported the bugs and sent email notifications to the developers using the Quality Center.
  • Supported customers through their UAT by testing logged issues, routing them to the appropriate resource, and providing feedback to customers.
  • Used LoadRunner monitors to measure transaction response time, network delay, and throughput.
  • Analyzed load test results using the LoadRunner Analysis tool and online monitor graphs, and identified bottlenecks in the system.
  • Responsible for documenting requirements and design specifications, including current-state assessment/gap analysis, for software products.
  • Worked closely with software developers to isolate, track, debug, and troubleshoot defects.
  • Maintained an internet based bug tracking tool for customer UAT.
  • Conducted presentations of Performance Test results.
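
A rendezvous point, as used above to surface database deadlocks, holds each virtual user until all have arrived and then releases them simultaneously; the concept maps directly onto a barrier. A minimal sketch with threads standing in for virtual users (the user count and the post-release work are hypothetical):

```python
import threading

def run_rendezvous(num_users: int = 5):
    """Hold num_users virtual users at a rendezvous, then release them all
    at once; returns the ids of the users that got through."""
    released = []
    lock = threading.Lock()
    barrier = threading.Barrier(num_users)

    def virtual_user(user_id: int):
        barrier.wait()             # like lr_rendezvous: block until all arrive
        with lock:                 # all users now fire their queries at once
            released.append(user_id)

    threads = [threading.Thread(target=virtual_user, args=(i,))
               for i in range(num_users)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return sorted(released)

print(run_rendezvous(5))  # [0, 1, 2, 3, 4]
```

Because no user proceeds until the last one arrives, the database sees a true burst of concurrent access, which is what exposes lock contention and deadlocks that a staggered ramp-up would miss.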

Environment: HP LoadRunner, HP Quality Center, UNIX, Oracle, HP RUM, SQL, ClearQuest, TOAD, Windows XP, Web Services, XML, WSDL, HP Diagnostics, HP Performance Center
