
Performance Engineer Resume


Florida

PROFESSIONAL SUMMARY:

  • Over 6 years of diverse experience as a Performance Engineer/Tester on both distributed client/server and web applications.
  • Hands-on experience implementing LoadRunner and developing load test conditions and test scenarios.
  • Solid experience installing the LoadRunner Controller, Analysis, and Load Generator components on the Windows platform.
  • Experience across the entire SDLC, from requirements gathering to production release.
  • Expertise in performance testing tools such as HP LoadRunner, JMeter, and Performance Center.
  • Good experience with profiling tools such as AppDynamics, DynaTrace, and CA Introscope.
  • Ability to work under tight schedules and on multiple applications concurrently.
  • Excellent communication, presentation, and interpersonal skills.

TECHNICAL SKILLS:

Testing Tools: LoadRunner, JMeter, Performance Center, DynaTrace, AppDynamics, SiteScope, CA Introscope, NMON, Quality Center

Languages: C, Java/J2EE, VBScript, C#/.NET, XML, UNIX Shell Scripting

Databases: Oracle, DB2, SQL Server, MS Access, MySQL

GUI: VB, JSP, Java Applets, ASP, HTML

Web Related: DHTML, XML, VBScript, JavaScript, Java, WebLogic, WebSphere, IIS

Operating Systems: AIX, HP-UX, Solaris, Windows, Linux

PROFESSIONAL EXPERIENCE:

Confidential, Florida

Performance Engineer

Responsibilities:

  • Identify, understand, and plan for the organizational and human impacts of planned systems, and ensure that new technical requirements are properly integrated with existing processes and skill sets.
  • Plan tests of system flows under various load conditions using tools such as HP LoadRunner, HP SiteScope, and HP Performance Center.
  • Identify performance bottlenecks using AppDynamics.
  • Interact with internal users and customers to gather and document requirements, which are then used to produce business requirements documents.
  • Write technical requirements during critical project phases.
  • Interact with designers to understand software limitations.
  • Help programmers during system development, e.g., by providing use cases, flowcharts, or database designs.
  • Develop and deploy test automation scripts for end-to-end performance testing using LoadRunner.
  • Use standard C functions for string manipulation in VuGen scripts.
  • Write custom C functions to extend LoadRunner VuGen scripts (see the sketch after this list).
  • Perform end-to-end performance testing.
  • Deploy the completed system.
  • Document requirements and contribute to user manuals.
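
The custom string-manipulation helpers mentioned above follow the usual VuGen pattern of wrapping standard C calls around LoadRunner parameters. Below is a minimal sketch under assumed names; save_first_token, p_account_list, and p_first_account are illustrative, not taken from the actual project scripts.

    /* Illustrative custom VuGen helper: split a correlated, comma-separated
       value and save the first token into a new LoadRunner parameter. */
    void save_first_token(const char *in_param, const char *out_param)
    {
        char  buf[1024];
        char *token;

        /* Resolve a "{p_account_list}"-style parameter to its runtime value */
        strncpy(buf, lr_eval_string(in_param), sizeof(buf) - 1);
        buf[sizeof(buf) - 1] = '\0';

        /* Plain C string manipulation inside the Vuser */
        token = (char *)strtok(buf, ",");
        if (token == NULL) {
            lr_error_message("No token found in %s", in_param);
            return;
        }

        lr_save_string(token, out_param);
    }

    Action()
    {
        /* "p_account_list" would typically come from web_reg_save_param correlation */
        save_first_token("{p_account_list}", "p_first_account");
        lr_output_message("First account: %s", lr_eval_string("{p_first_account}"));
        return 0;
    }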

Environment: LoadRunner, Performance Center, AppDynamics, Splunk, SiteScope, Excel, Oracle, MS SQL Server, WebLogic, Load Balancer, Java, AJAX, J2EE Diagnostic Tool, Web, Windows, HP-UX, AIX, custom C functions.

Confidential, Minnesota

Performance Engineer

Responsibilities:

  • Responsible for gathering and analyzing requirements for performance testing.
  • Installed and configured JMeter for performance testing.
  • Performance-tested Java applications using JMeter across various protocols.
  • Interacted with internal users and customers to gather and document requirements, which were then used to produce business requirements documents.
  • Developed and deployed load test scripts for end-to-end performance testing using JMeter.
  • Worked exclusively with the Web (HTTP/HTML) sampler in JMeter.
  • Configured JMeter properties files for specific load testing scenarios (see the sketch after this list).
  • Uploaded scripts and ran load tests on BlazeMeter with 1,000 users to check the error rate, logs, timeline report, load report, and aggregate report, and prepared an overall report for each test.
  • Used DynaTrace extensively for monitoring servers and identifying bottlenecks.
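
Scenario-specific settings of this kind are typically kept out of the test plan and supplied through properties overrides. Below is a minimal sketch; the threads/rampup/duration keys are user-defined examples read in the test plan via JMeter's __P() function, and all values shown are placeholders.

    # Illustrative user.properties overrides for one load scenario.
    # The test plan reads these as ${__P(threads)}, ${__P(rampup)}, ${__P(duration)}.
    threads=100
    rampup=300
    duration=3600

    # Keep full response data only for failed samplers, to limit result file size
    jmeter.save.saveservice.response_data.on_error=true

The same keys can also be overridden per run from the command line, for example: jmeter -n -t testplan.jmx -Jthreads=200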

Environment: JMeter, BlazeMeter, DynaTrace, Agile, WebSphere, WebLogic, SQL Developer, Web Services, Java, Oracle, IIS, and UNIX/Linux.

Confidential, Wisconsin

Performance Engineer

Responsibilities:

  • Involved in performance testing (load, stress, and soak testing) of the application using LoadRunner.
  • Ensured that quality issues were appropriately identified, analyzed, documented, tracked, and resolved in Quality Center.
  • Developed and deployed test automation scripts for end-to-end performance testing using LoadRunner.
  • Used standard C functions for string manipulation.
  • Wrote custom C functions to extend LoadRunner VuGen scripts.
  • Independently developed LoadRunner test scripts according to test specifications/requirements.
  • Tagged LoadRunner VuGen scripts with DynaTrace headers so that tagged web requests could be monitored in DynaTrace (see the sketch after this list).
  • Monitored PurePaths, hotspots, and server and heap metrics using DynaTrace.
  • Monitored the load balancer, JDBC connection pools, garbage collection, and the database, and used these metrics to drive performance tuning.
  • Collected server-side performance metrics (disk I/O, memory, and CPU utilization) as needed using NMON on Linux systems.
  • Performance-tested Java applications using JMeter across various protocols.
  • Monitored tests in DynaTrace using customized dashboards, API distribution, runtime diagnostics, the PurePath tree, error analysis, and incidents and alerts.
  • Introduced an agile methodology for the performance testing cycles.
  • Developed LoadRunner scripts in the Web, Web Services, and Database protocols.
  • Analyzed requirement and design documents.
  • Involved in preparing status reports and defect status reports.
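
The DynaTrace tagging mentioned above is usually done by adding the tagging header before each timed request in the VuGen script. The sketch below follows the DynaTrace/AppMon tagging convention, but the exact header fields (VU, SI, TE, NA), transaction names, and URL are illustrative assumptions rather than values from the actual project.

    Action()
    {
        int   vuser_id, scid;
        char *vuser_group;
        char  dt_header[256];

        /* Current virtual user id, used to correlate PurePaths back to Vusers */
        lr_whoami(&vuser_id, &vuser_group, &scid);

        /* Tag the following requests so DynaTrace can group them by test and
           transaction (field names here are illustrative). */
        sprintf(dt_header, "VU=%d;SI=LoadRunner;TE=PeakLoadTest;NA=01_Login", vuser_id);
        web_add_header("x-dynaTrace", dt_header);

        lr_start_transaction("01_Login");

        web_url("login",
                "URL=https://app.example.com/login",   /* placeholder URL */
                "Resource=0",
                "Mode=HTML",
                LAST);

        lr_end_transaction("01_Login", LR_AUTO);

        return 0;
    }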

Environment: JMeter, LoadRunner, Performance Center, NMON, DynaTrace, Excel, Oracle, MS SQL Server, WebLogic, Load Balancer, Java, AJAX, Quality Center, Crystal Reports, J2EE Diagnostic Tool, Web, Windows, HP-UX, AIX, custom C functions.

Confidential, Connecticut

Performance Test Analyst

Responsibilities:

  • Analyzed system documentation, such as requirements documents and user interface specifications, to develop and execute test scripts.
  • Designed the test environment and the scenarios for load testing.
  • Developed the performance testing plan based on business and technical requirements.
  • Performed large-scale, end-to-end load and volume testing using large user data files.
  • Used Performance Center extensively for performance testing the application.
  • Used Virtual User Generator (VuGen) to create scripts in the Web (HTTP/HTML), Java, .NET, and Web Services protocols (see the sketch after this list).
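
For the Web Services protocol work, VuGen scripts of this kind are commonly built around web_custom_request with a SOAP payload. The sketch below is illustrative only: the endpoint, SOAPAction, operation, and transaction names are assumptions, not details from the actual engagement.

    Action()
    {
        /* Placeholder SOAP headers; real values depend on the service contract */
        web_add_header("Content-Type", "text/xml; charset=UTF-8");
        web_add_header("SOAPAction", "\"urn:getQuote\"");

        lr_start_transaction("01_GetQuote");

        web_custom_request("getQuote",
            "URL=https://services.example.com/quote",
            "Method=POST",
            "Resource=0",
            "Body="
            "<soapenv:Envelope xmlns:soapenv=\"http://schemas.xmlsoap.org/soap/envelope/\">"
              "<soapenv:Body><getQuote><symbol>ABC</symbol></getQuote></soapenv:Body>"
            "</soapenv:Envelope>",
            LAST);

        lr_end_transaction("01_GetQuote", LR_AUTO);

        return 0;
    }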

Environment: LoadRunner, Performance Center, IIS, WebLogic, Oracle, QC, SOAtest, MS Office, MS Visio, Java, Wily Introscope, Windows, Linux, .NET.

Confidential, Illinois

Performance Tester

Responsibilities:

  • Responsible for setting up the performance test lab and testing environment for LoadRunner.
  • Developed the testing strategy, test plan, and workflow for the major business functions under test.
  • Analyzed the results of different scenarios run in different environments to compare performance under the same user load.
  • Responsible for performance testing of Web, Web Services (SOA), Siebel, and PeopleSoft applications using HP LoadRunner.
  • Developed performance test scripts in VuGen.

Environment: LoadRunner, Performance Center, WinRunner, Citrix, LDAP, Oracle, MS SQL Server, WebLogic, WebSphere, Java, Windows 2000/XP, Solaris, AIX, IE.
