
Performance Engineer Resume


Tampa, FL

PROFESSIONAL SUMMARY:

  • 5+ years of experience as a Performance Engineer using industry-standard tools and best practices, with hands-on experience across the end-to-end performance testing lifecycle.
  • Extensively involved in all phases of the Performance Testing Life Cycle.
  • In-depth knowledge of preparing Test Plans, Test Cases, and Test Scripts.
  • Executed various performance tests, including Load, Stress, Capacity, and Longevity tests.
  • Thorough understanding and working knowledge of protocols such as Web HTTP/HTML, Web Services, AJAX, and RTE.
  • Extensive experience testing complex Client-Server applications built on Java and .NET platforms.
  • Expertise in monitoring application performance with APM tools including Dynatrace, AppDynamics, and Introscope to identify performance bottlenecks.
  • Expertise in performance testing tools such as HP LoadRunner, Performance Center, and Controller, with hands-on experience in JMeter.
  • Experienced in using Fiddler for call-trace analysis and recording user sessions used to create automation scripts.
  • Proficient in identifying bottlenecks and performing root-cause analysis.
  • Experienced working in an onsite/offshore delivery model.
  • Extensively worked in both Agile and Waterfall environments.
  • A strong team player who consistently contributes toward team goals, working productively and effectively both in teams and individually.

TECHNICAL SKILLS:

Testing Tools: LoadRunner 12.53, Apache JMeter, Performance Center 12.01, ALM Performance Center, Quality Center 10, Fiddler

Protocols: Web (HTTP/HTML), Backend Web Services, WSDL, TruClient (Ajax), Citrix

Languages: C, C++, XML, SQL, Java/J2EE

PROFESSIONAL EXPERIENCE:

Confidential, Tampa - FL

Performance Engineer

Responsibilities:

  • End-to-end Performance Testing & Engineering of the nCino application interface, creating test plans and strategies that stress the system in ways that help predict capacity limits based on business, performance, and volume risks.
  • Working with the Performance Testing team on application reviews and code-level analysis to identify performance constraints, reliability and security risks, and inefficient resource utilization, and suggesting possible remediations and optimizations.
  • Capacity planning/management: defining and calculating workload profiles and sizing hardware to match, including justification of estimates and projections.
  • Implementing industry standards of Performance Testing/Engineering by proposing a technology tool stack that follows best practices.
  • Understanding the current system architecture and working with the Performance Testing/Engineering team to drive initiatives for the implementation of nCino across the board.
  • Identifying a realistic performance test approach and implementing a process-based, industry-standard Testing/Engineering strategy.
  • Designing and developing automated test workflows to measure the performance of the CLM (nCino) application before each scheduled production release.
  • Monitoring nCino's end-to-end design flow and conducting root-cause analysis to identify performance tuning opportunities.
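The workload sizing described above follows Little's Law: required concurrency equals target throughput times the time each user spends per iteration. The sketch below illustrates the arithmetic; all numbers are made-up assumptions, not figures from any engagement.

```python
import math

def required_vusers(target_tps: float, response_time_s: float, think_time_s: float) -> int:
    """Little's Law: concurrency = throughput x time each user spends per iteration."""
    iteration_time = response_time_s + think_time_s
    return math.ceil(target_tps * iteration_time)

# Example: 50 transactions/sec, 2 s average response time, 8 s think time
print(required_vusers(50, 2.0, 8.0))  # 500 vusers needed to sustain the load
```

This kind of back-of-the-envelope estimate is a starting point for sizing a test scenario; the actual vuser count is then refined against measured response times.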

Environment: Silk Performance Testing Tool, Salesforce (Apex platform), CLM (nCino), SoapUI, Web-HTTP/HTML, Web Services, API Testing, Java, HP ALM Octane, Multi-Tenant Cloud Infrastructure, Dynatrace

Confidential, St. Louis- MO

Performance Test Engineer

Responsibilities:

  • Gathered business requirements, studied the application, and collected requirements from stakeholders.
  • Responsible for analyzing problems, developing solutions, and making decisions that impact projects; completed assigned load and performance tasks on time (e.g., analysis/design, development, testing, maintenance, documentation).
  • Developed LoadRunner scripts using protocols such as Web HTTP/HTML, Web Services (SOAP & REST), Citrix, and Ajax TruClient.
  • Designed and modified API scripts to match changing requirements, executed them on a daily basis, and reported test metrics to the project stakeholders.
  • Conducted performance, load, and stress testing using LoadRunner and Performance Center.
  • Used Dynatrace to monitor applications at the business-transaction level, including PurePaths and hotspots.
  • Identified bottlenecks in Dynatrace during test execution, verified that requests were distributed evenly, captured system utilization, and monitored memory.
  • Tagged LoadRunner VuGen scripts for Dynatrace request tagging and monitored the tagged web requests.
  • Used Splunk to check for error messages in server logs.
  • Identified and drilled down into slow-performing SQL queries and application methods in Dynatrace to resolve bottlenecks.
  • Responsible for coordinating testing efforts with both onshore and offshore teams.
  • Analyzed Visual Studio graphs and reports to determine where performance delays occurred: network or client delays, CPU performance, I/O delays, database locking, or other database-server issues, by configuring server counters.
  • Excellent verbal, written, and analytical skills, with the ability to work both in a team and individually in a fast-paced, dynamic environment.
  • Responsible for knowledge-transfer sessions with client personnel during project handover.
  • Actively participated in sprint demos, backlog grooming sessions, retrospectives, sprint planning, and morning stand-up calls.
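The load- and stress-testing work above boils down to driving many concurrent virtual users and reporting response-time statistics. A minimal, tool-agnostic sketch of that loop follows; the transaction here is a stub (`time.sleep`), standing in for the real HTTP/Web Service calls a LoadRunner or JMeter script would issue.

```python
import statistics
import time
from concurrent.futures import ThreadPoolExecutor

def transaction() -> float:
    """Stand-in for one business transaction; returns its response time in seconds."""
    start = time.perf_counter()
    time.sleep(0.01)  # placeholder for real work (e.g., an HTTP request)
    return time.perf_counter() - start

def run_load(vusers: int, iterations: int) -> dict:
    """Run vusers * iterations transactions concurrently and summarize timings."""
    with ThreadPoolExecutor(max_workers=vusers) as pool:
        times = list(pool.map(lambda _: transaction(), range(vusers * iterations)))
    return {
        "avg": statistics.mean(times),
        "p90": statistics.quantiles(times, n=10)[-1],  # 90th percentile
        "max": max(times),
    }

results = run_load(vusers=5, iterations=4)
print(results)
```

Commercial tools add ramp-up schedules, pacing, correlation, and server-side monitors on top of this core loop.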

Environment: LoadRunner, Performance Center, Splunk, Dynatrace, Quality Center, Excel, Oracle, IIS, MS SQL Server, Citrix, WebLogic, Load Balancer, J2EE Diagnostic Tool, Windows, HP-UX, HP Service Test.

Confidential, Aliso Viejo, CA

Performance Engineer

Responsibilities:

  • Working with an automation framework built on open-source technologies.
  • Building functional and regression test coverage using tools such as LoadRunner, Performance Center, JMeter, and Selenium WebDriver.
  • Ensuring that testing and automation tools are wired into the Continuous Integration process using tools like Jenkins.
  • Working with individual engineers to keep product teams focused on quality, shaping strategies (and code) for functional testing and applying the same approach to Performance Testing and Engineering.
  • Executing any automated/manual testing required to move product work through the agile sprints using Visual Studio.
  • Ensuring all testing takes a multi-platform approach, covering the different browsers and devices where iHeartRadio products live.
  • Collaborating with Product Managers and Engineers on test strategies and plans, implementing load and stress testing with LoadRunner against RESTful APIs communicating with the Viero 8 database.

Environment: HP LoadRunner, HP Performance Center, NeoLoad, SCOM Monitoring, Ignite, WebSphere, WebLogic, .NET Framework, C#, SQL, Visual Studio, Splunk

Confidential, CA

Performance Test Analyst

Responsibilities:

  • Interacted with the project team, gathered requirements, and developed performance test plans.
  • Developed and deployed test load scripts for end-to-end performance testing using LoadRunner and JMeter.
  • Provided multiple data sets during test execution and consumed the data randomly, sequentially, and uniquely.
  • Used AppDynamics to monitor overall application performance, business-transaction performance, and application infrastructure performance.
  • Executed load tests on new applications to establish benchmarks for future releases.
  • Performed server-side monitoring and test-result analysis of the application server using Wily / CA Introscope to identify performance bottlenecks.
  • Used Wily Introscope to identify long-running queries and optimized them to improve performance.
  • Monitored graphs such as transaction response time, CPU, memory, GC, threads, and long-running SQL queries using AppDynamics and CA Introscope.
  • Used Quality Center for tracking and reporting the defects
  • Analyzed, interpreted and summarized relevant results in a complete Performance Test Report.
  • Wrote detailed performance test results report including recommendations.
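Using test data "randomly, sequentially and uniquely", as described above, mirrors the three common parameter-selection strategies in load-testing tools (LoadRunner's "Select next row" options). A small sketch of the three strategies; the account numbers are made-up sample data.

```python
import itertools
import random

data = ["ACC-001", "ACC-002", "ACC-003"]  # sample parameter rows (illustrative)

# Sequential: wrap around to the first row when the data is exhausted.
sequential = itertools.cycle(data)

# Random: any row may be picked on each iteration (duplicates allowed).
def pick_random() -> str:
    return random.choice(data)

# Unique: each row is used at most once; the test must stop when data runs out.
unique = iter(list(data))

print(next(sequential), pick_random(), next(unique))
```

Unique selection is what prevents, for example, two virtual users from trying to register the same account during a test.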

Environment: LoadRunner, JMeter, Fiddler, Performance Center, AppDynamics, Quality Center, HP Diagnostics, CA Introscope, Oracle, Java, WebLogic, WebSphere, XML

Confidential

Performance Tester

Responsibilities:

  • Developed performance test plans and managed tasks for performance testing of business applications
  • Performed parameterization in LoadRunner.
  • Added performance measurements for Oracle, WebLogic, and IIS in LoadRunner.
  • Gathered requirements from various teams and worked on the test plan and test strategy documents for projects based on the NFRs.
  • Developed and deployed test automation scripts for end-to-end performance testing using LoadRunner.
  • Added and monitored WebLogic, app servers, and Windows servers during performance testing using SiteScope.
  • Reported and tracked defects using Quality Center.
  • Worked on composite application monitoring using HP Diagnostics.
  • Involved in analyzing, interpreting and summarizing meaningful and relevant results in a complete Performance Test Report.
  • Created project plans, test plans, and estimations, and tracked projects at every phase.
  • Analyzed requirement and design documents.
  • Used LoadRunner tool for testing and monitoring.
  • Analyzed results using the LoadRunner Analysis tool, including Oracle database connections, sessions, and WebLogic log files.
  • Performed business level transaction monitoring and analysis using Dynatrace.
  • Created counters on the Dashboard which uses agents to track down top timed events.
  • Worked closely with software developers to isolate, track, debug, and troubleshoot defects.
  • Conducted presentations of Performance Test results.
  • Verify that new or upgraded applications meet specified performance requirements.
  • Used Performance Center to execute tests and maintain scripts.
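Executing and maintaining scripts in Performance Center involves setting pacing in the run-time settings so a fixed vuser pool holds a steady transaction rate. The arithmetic can be sketched as below; the numbers are illustrative assumptions.

```python
def pacing_interval(vusers: int, target_tps: float) -> float:
    """Seconds from the start of one iteration to the start of the next, per vuser."""
    return vusers / target_tps

# Example: 100 vusers driving 20 transactions/sec
print(pacing_interval(100, 20.0))  # each vuser starts an iteration every 5.0 s
```

If the measured iteration time (response + think time) exceeds this interval, the target rate cannot be met and either vusers must be added or the system tuned.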

Environment: LoadRunner, JMeter, Quality Center, SiteScope, J2EE, Oracle, SQL, MS Office, MS Access, MS Visio, MS Project, Dynatrace, AppDynamics, Web-HTTP/HTML, Web Services, Citrix, Ajax, RTE protocols, Splunk.
