
Senior Performance Engineer / Test Lead Resume


Dallas, TX

SUMMARY

  • Over 9 years of experience in software testing, including performance, manual, and automation testing of client-server, web services (SOA), and web-based applications, as well as manual quality assurance.
  • 7+ years of diverse experience as a Performance Test Engineer/Lead/Consultant with expertise in performance testing, the Software Testing Life Cycle (STLC), test case development and automation, and test scripting in HP LoadRunner, JMeter, and Oracle performance testing tools for client/server and web-based applications and services.
  • Worked across the airline, financial, insurance, and retail domains, applying a broad skill set to solve complex quality assurance challenges.
  • Experience with the full Software Development Life Cycle (SDLC), its methodologies, and the validations needed to ensure quality assurance and control.
  • Experience collaborating with key stakeholders: business representatives, product/project managers, developers, DBAs, infrastructure leads, architects, and middleware teams.
  • Expertise in Agile Testing Methodologies & Software Test Life Cycle (STLC).
  • Used HP tools (Quality Center, LoadRunner, QTP, Performance Center) and the open-source JMeter tool for performance testing.
  • Experience with all LoadRunner components (VuGen, Controller, Analysis, Load Generator) and with the components of JMeter.
  • Worked extensively with LoadRunner 9/9.5/11.5/12.0, especially the Web (HTTP/HTML), Oracle NCA, SAP-GUI, SAP-Web, Citrix, MQ, Web Services, Ajax, Web (Click & Script), RTE, Siebel Web, Oracle 2-Tier, and DB protocols.
  • Experience testing different versions of Oracle EBS, Siebel CRM, and PeopleSoft HCM/FSCM 9.0/9.1, covering most of their modules.
  • Hands-on expertise with monitoring tools such as HP SiteScope, CA Wily Introscope, Dynatrace, and AppDynamics.
  • Used Oracle Enterprise Manager (OEM), SQL Profiler, AWR reports, and other monitoring tools for database monitoring.
  • Well versed in creating dashboards in Dynatrace, Splunk, and other monitoring tools as required.
  • Experienced in preparing performance test strategies, test plans with risk assessments, and test closure reports.
  • Experienced in System Performance Testing Methodologies (Load/Spike/Stress/Endurance Tests).
  • Expert at identifying and analyzing bottlenecks in performance testing, including web throughput, server response time, and network latency.
  • Experienced in analyzing scenario results using LoadRunner online graphs and reports: network delays, client delays, I/O delays, transaction times, CPU and memory usage, and other server-level issues; worked extensively on JVM profiling and tuning.
  • Performed regular test script preparation, test execution, and analysis of performance progress, defects, risk assessments, and impact reports.
  • Analyzed performance graphs for response time, hits per second, CPU utilization, memory, I/O, and throughput.
  • Performed load, spike, stress, endurance, bottleneck, benchmark, and baseline tests using HP LoadRunner against web, application, and database servers at different levels and loads.
  • Reported performance analysis graphs and reports collected from various performance tools and discussed bottlenecks such as memory leaks, JVM heap usage, CPU utilization, network time, page refresh time, and page rendering time.

TECHNICAL SKILLS

Testing Tools: HP StormRunner, HP LoadRunner 8.0/9.5/11.0/11.5/12.2/12.53, HP Performance Center 11.0/11.5/12.2, ALM, HP Quality Center, JMeter 2.5/2.7/2.8/2.9/2.10/3.1, SoapUI, QTP

Languages: C#, C, C++, Visual Basic, PHP

Markup/Scripting Languages: DHTML, CSS, jQuery, JavaScript, XML, HTML

Web Technologies: HTML, CSS, jQuery, WordPress

Packages: MS Office, Adobe Photoshop CS5, Dreamweaver, Flash, Illustrator, InDesign

RDBMS: MS SQL Server, Microsoft Access, Oracle Database

Operating Systems: Windows 98, Windows Server 2003, Windows NT/2000/XP

Monitoring Tools: Performance Center, Wily Introscope, SiteScope, Dynatrace, HP Diagnostics, Transaction Viewer, Splunk, OEM, AppDynamics

PROFESSIONAL EXPERIENCE

Confidential, Dallas, TX

Senior Performance Engineer / Test Lead

Responsibilities:

  • Interacted with business leads, solution architects, and application teams to develop and mimic production usage models by collecting non-functional requirements for a multi-year rollout of high-volume SOA services.
  • Conduct work group meetings across all areas of the product organization to identify, prioritize, and mitigate risks to the responsiveness and scalability of our offerings.
  • Followed the Agile (Scrum) process; performance validation followed the 'Work Done & Ready to Go' approach release to release and, specifically, sprint by sprint.
  • Organized performance testing status meetings with the project stakeholders; ensured the processes and content of all performance testing artifacts were documented, maintained, and transitioned to the client teams per the client's retention and transition policy.
  • Worked with developers in understanding the code and application in order to develop the Load scripts.
  • Involved in mobile app performance testing for different iOS versions.
  • Created Performance Load Test detailed plan and Test Case Design Document with the input from developers and functional testers.
  • Executed production testing using HP StormRunner, AWS, and Akamai, and built the associated load test scenarios for cloud performance testing.
  • Developed scripts and executed load tests in production simulating peak-day volume.
  • Gathered business requirements, analyzed the application load testing requirements, and developed the performance test plans for DOTCOM/Web API and other enterprise applications.
  • Developed Test Plans, Test Scenarios, Test Cases, Test Summary Reports and Test Execution Metrics.
  • Developed robust Vuser scripts using the Web (HTTP/HTML), SAP, Web Services, and Citrix protocols in LoadRunner.
  • Created test data needed for performance tests using service virtualization tools such as LISA and Mountebank.
  • Worked extensively with JSON/XML data and SOAP in non-UI web services (SOA) testing (a brief scripting sketch follows this list).
  • Responsible for setting up SiteScope monitors to monitor network activity and bottlenecks and to collect metrics from the application and database servers.
  • Configured and used Dynatrace for performance monitoring; troubleshot bottlenecks found during performance testing, analyzed response times, and profiled the application to locate performance issues.
  • Set up user profiles and configured and added application servers in Dynatrace.
  • Added headers to the scripts and monitored them using the Dynatrace client.
  • Monitored Metrics on Application server, Web server and database server.
  • Used Splunk to monitor and collect metrics from the performance test servers.
  • Reported performance analysis graphs and reports collected from various performance tools and discussed bottlenecks such as memory leaks, JVM heap usage, CPU utilization, network time, page refresh time, and page rendering time.
  • Analyzed JVM GC verbose logs and Heap dumps to find out potential memory leak issues.
  • Responsible for Analysis, reporting and publishing of the test results.
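
A minimal sketch of the non-UI web service scripting approach referenced in the list above, as it might look in a LoadRunner VuGen Web (HTTP/HTML) script. The endpoint URL, JSON payload, boundaries, and the {customerId} parameter are illustrative placeholders rather than the actual client application; the web_* and lr_* calls are standard VuGen API functions supplied by the LoadRunner runtime.

    Action()
    {
        // Register a check and a capture before the request is sent:
        // flag the step if the service does not report success, and save
        // the order id returned in the JSON body for later steps.
        web_reg_find("Text=\"status\":\"OK\"", LAST);
        web_reg_save_param("orderId", "LB=\"orderId\":\"", "RB=\"", LAST);

        web_add_header("Content-Type", "application/json");

        lr_start_transaction("SOA_CreateOrder");

        // Placeholder endpoint and payload; {customerId} comes from a
        // parameter file so each Vuser submits different test data.
        web_custom_request("CreateOrder",
            "URL=https://api.example.com/orders",
            "Method=POST",
            "EncType=application/json",
            "Body={\"customerId\":\"{customerId}\",\"qty\":1}",
            LAST);

        lr_end_transaction("SOA_CreateOrder", LR_AUTO);

        lr_output_message("Created order %s", lr_eval_string("{orderId}"));

        return 0;
    }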

Environment: HP LoadRunner 12.53, HP Performance Center 12.20, HP StormRunner, WebSphere, HP SiteScope, Dynatrace 6.3, Windows 2008, Linux, SharePoint, Excel, SQL, Oracle Database 11g/12c, Oracle SQL Developer, IBM Support Assistant, Mountebank, LISA, Splunk.

Confidential, Phoenix, AZ

Senior Performance Engineer / Test Lead

Responsibilities:

  • Analyzed and assessed the impact of the Oracle Applications R12 upgrade on the existing 11i EBS suite, produced reports on performance and integration with related systems, and presented the test strategy and plan.
  • Conducted work group sessions with application managers, developers, and business analysts to gather and analyze requirements.
  • Responsible for the creation of the test plan/strategy, test schedule, testing status reporting, test case creation, monitoring of test case execution and script execution where needed.
  • Worked closely with the application managers and DBA managers to build production like environment for performance testing.
  • Completed setup of eight load generators, the LoadRunner 12.02 installation, and monitoring tools.
  • Involved in performance testing the EBS upgrade from 11i to R12; primarily wrote scripts for the Oracle Financials modules (AP, AR, CM, FA, GL) as well as the HR, PROC, STAR, and EAM modules.
  • Worked extensively in VuGen with the multi-protocol Web (HTTP/HTML) and Oracle NCA options to record, correlate, and parameterize the Vuser scripts (a brief scripting sketch follows this list).
  • Reviewed scripts that were developed by teammates and verified scripts in the standalone mode.
  • Uploaded scripts, created timeslots and scenarios, maintained scripts, and ran load tests in Performance Center; analyzed test results for response time, transactions per second, and throughput.
  • Performed baseline, load, and stress testing during the performance testing cycles.
  • Developed test scripts in LoadRunner VuGen and created different scenarios for each application, executing them in ALM.
  • Used CA Wily Introscope Extensively to monitor all the Tiers for Determining any performance Bottlenecks.
  • Used OEM-Oracle Enterprise Manager 12c to Monitor the Processes Performance on the Backend.
  • Wrote queries to identify which statements were taking too long, optimized them through database tuning, and worked with the database administrator to index the database to improve application performance.
  • Monitored Application Server through Analysis. Analyzed various graphs by LoadRunner Analysis and communicated bottleneck issues to the System administrator
  • Monitored resource utilization on Linux, such as CPU usage, memory via vmstat, I/O via iostat, JVM threads, system processing time, and latency; collected JVM heap and garbage collection activity during test execution; ran CPU, disk, I/O, memory, and graphics benchmarks; analyzed response time and TPS/throughput; and tracked network traffic and server capacity.
  • JVM Performance Tuning: GC and Heap Analysis, Thread dumps, Heap dumps, Memory Leaks, Connection Leaks, Core Dump
  • Analyzed the impact of the printing program on application server and database server performance while the OBR program was running.
  • Analyzed NFS latency to verify that throughput and latency on the redo log file systems were within acceptable levels.
  • Communicated with team members to discuss test reports.
  • Participated in defect reviews to discuss bottlenecks; attended walk-through meetings, regular client calls with business stakeholders, SMEs, and technical stakeholders, and weekly status meetings; and sent weekly status reports to the manager.
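
A hedged sketch of how a recorded web login step is typically parameterized in VuGen, in the spirit of the EBS scripting described above. The URL, form field names, and the {ebsUser}/{ebsPassword} parameters are hypothetical placeholders fed from a VuGen parameter file; they do not reflect the actual application's form.

    Action()
    {
        lr_start_transaction("EBS_Login");

        // Recorded form submission with the hard-coded credentials replaced
        // by parameters, so each Vuser logs in with its own test account.
        web_submit_data("login",
            "Action=https://ebs.example.com/OA_HTML/AppsLocalLogin.jsp",
            "Method=POST",
            "Mode=HTML",
            ITEMDATA,
            "Name=username", "Value={ebsUser}",     ENDITEM,
            "Name=password", "Value={ebsPassword}", ENDITEM,
            LAST);

        lr_end_transaction("EBS_Login", LR_AUTO);

        return 0;
    }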

Environment: LoadRunner 12.02, Performance Center 11.5/ALM, Dynatrace, SiteScope, WebLogic 11g, Windows XP, VuGen, webMethods Integration Server, Windows 2008, Windows Vista, web applications, portal applications, XML files, JConsole, SOAP.

Confidential, Cincinnati, OH

Senior Performance Engineer

Responsibilities:

  • Responsible for gathering performance-related requirements from the project team and preparing a test strategy document.
  • Created resource estimates, test plan, and test strategy documents.
  • Designed and created scripts using LoadRunner based on the requirements and protocols.
  • Deployed builds to the performance testing environment and ran load, stress, endurance, and other performance-related tests based on the needs and requirements.
  • Involved in performing volume testing based on the production volumes and cycles.
  • Responsible for creating the Load Distribution tables for various scripting modules involved.
  • Responsible for creating the load scenarios and various runtime configurations for the individual scripts that are part of the load test.
  • Performed manual correlation without relying on LoadRunner's auto-correlation or Correlation Studio features (a brief scripting sketch follows this list).
  • Wrote scripts using various protocols such as Web, Web Services, Ajax, and TruClient (Ajax).
  • Worked closely with Business Owners, Architects, and Developers to do pro-active Capacity planning / Monitoring / Tuning to make sure the applications are scalable and available to the growing business needs
  • Drilled down into the problematic pages in Analysis to find where the performance degradation was occurring.
  • Analyzed all the various performance metrics involved in the test run like Web resources, CPU, Memory, Request Analysis, DB Connection Pool, and Thread Pool etc.
  • Pinpointed the bottlenecks present in different layers of the Application and Identified Memory Leak in the App and made recommendations to overcome the same
  • Monitored tests using several monitoring tools, including Dynatrace, SiteScope, and Wily Introscope.
  • Used Dynatrace and Wily Introscope extensively to monitor all tiers and determine any performance bottlenecks.
  • Created dashboards in Dynatrace Diagnostics and Splunk per project needs.
  • Configured Dynatrace Diagnostics to capture metrics during load tests for analysis.
  • Identified performance bottlenecks in code using Dynatrace Diagnostics and Splunk.
  • Worked on P1 tickets related to out-of-memory and exception errors in production, using Dynatrace Diagnostics to identify the performance bottlenecks.
  • Used diagnostic tools along with load runner to identify the issues in the application.
  • Collected performance metrics from various system components, analyzed the performance data, and presented test reports to audiences ranging from technical groups to senior management and executives.
  • Created Performance test completion reports.
  • Analyzed various graphs during and after the load test (running Vusers, transaction response time, throughput, hits per second, error statistics, and error description graphs).
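
A small sketch of the manual correlation pattern mentioned above: a dynamic value is captured with explicit left/right boundaries before the request that returns it, then replayed in the following request in place of the recorded hard-coded value. The page names, boundaries, and the sessionToken parameter are illustrative only.

    Action()
    {
        // Manual correlation: register the capture *before* the response
        // that contains the dynamic value. "Ord=1" takes the first match.
        web_reg_save_param("sessionToken",
            "LB=name=\"_token\" value=\"",
            "RB=\"",
            "Ord=1",
            LAST);

        lr_start_transaction("Open_Home");
        web_url("home",
            "URL=https://app.example.com/home",
            "Mode=HTML",
            LAST);
        lr_end_transaction("Open_Home", LR_AUTO);

        // Replay the captured token in the next step instead of the
        // hard-coded value the recorder produced.
        lr_start_transaction("Submit_Search");
        web_submit_data("search",
            "Action=https://app.example.com/search",
            "Method=POST",
            "Mode=HTML",
            ITEMDATA,
            "Name=_token", "Value={sessionToken}", ENDITEM,
            "Name=query",  "Value=laptops",        ENDITEM,
            LAST);
        lr_end_transaction("Submit_Search", LR_AUTO);

        return 0;
    }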

Environment: LoadRunner 11.5/11.0/9.52, Performance Center 11.00/9.52, Dynatrace, SiteScope, TOAD, Wily Introscope, Splunk, SQL Server Management Studio, WebLogic 11g, Windows XP, webMethods Integration Server, Windows 2008, Windows Vista, web applications, portal applications, XML files, CMS

Confidential, Providence, RI

Senior Performance Engineer / Performance Monitoring Expert

Responsibilities:

  • Worked with business, product manager, developer, and UAT audiences to gather requirements.
  • Involved in analyzing requirements from the business analysts, verifying they were captured correctly, and interpreting performance test requirements.
  • Developed Test plans to ensure accomplishment of load-testing objectives
  • Tested application performance across workflows and determined whether the application could perform satisfactorily in a production environment.
  • Responsible for scripting (VuGen), execution (Controller), and analysis/reporting.
  • Performed Baseline, Load and Stress Testing Using LoadRunner and Present Performance statistics to the Team.
  • Worked extensively on VuGen scripts in the Web, Mobile, and Web Services (SOAP) protocols in LoadRunner, simulating virtual users and transactions as well as user think time.
  • Developed LoadRunner scripts in VuGen for single-user, baseline, and soak scenarios, storing dynamically varying object IDs in parameters and validating correct download of the HTML pages by checking content in the page source (a brief scripting sketch follows this list).
  • Measured response times of important user actions using the start and end transaction functions.
  • Configured LoadRunner Controller, Load Generator and Execute Performance Test for multiple cycles of test scripts.
  • Developed and Implemented load and stress test with LoadRunner, and present performance statistics for the Application Teams.
  • Worked extensively on performance monitoring and analyzed response time, memory leak, hits/sec, and throughput graphs.
  • Analyzed various graphs generated by LoadRunner Analysis and Communicated bottlenecks to the system administrator.
  • Identified and eliminated the fault tolerance defect of JVM using a patch when ffdc introspection resulted in sudden memory growth within the JVM causing OOM errors in small object area of JVM heap
  • Uploaded scripts to ALM Performance Center, created timeslots and test schedules, and maintained scripts; used Performance Center for scripts in the ALM project and submitted defects.
  • Added headers to the scripts and monitored them using the Dynatrace client.
  • Used Dynatrace and JProbe to profile the application and find where the performance issues were.
  • Performed deep diagnostics using Dynatrace and monitored database and application servers to troubleshoot the root cause of problems.
  • Carried out deep diagnostics using Dynatrace to capture memory leaks in the application during longevity tests.
  • Used Dynatrace to measure web site performance in the test environment and capture performance metrics of key product features.
  • Monitored application health using Dynatrace, reviewed the performance of different web page components, and compared daily, weekly, and monthly trends for deeper analysis.
  • Set up user profiles and configured and added application servers in Dynatrace.
  • Reviewed web components using the Dynatrace client, analyzed them, and gave feedback to improve performance.
  • Monitored hardware capacity to ensure the necessary resources were available for testing.
  • Worked closely with software engineering team members in order to tune and optimize product stability and performance.
  • Assisted in application tuning and infrastructure capacity requirements to support high-volume peak periods of traffic.
  • Performed baseline, stress, and high-volume user tests using JMeter; monitored system performance during the load tests and measured database response time, HTTP requests, login, and proxy server performance.
  • Installed and configured profiling tools such as VisualVM and JConsole; monitored Linux resources during load tests, finding bottlenecks and resolving issues on Linux servers using different monitors.

Environment: LoadRunner 12.0/11.5, Performance Center 11.5/ALM, OEM, CA Wily Introscope, Dynatrace, SiteScope, WebLogic 11g, Windows XP, VuGen, webMethods Integration Server, Windows 2008, Windows Vista, web applications, portal applications, XML files, JConsole, SOAP.

Confidential, Grapevine, TX

Senior Performance Engineer

Responsibilities:

  • Gathered requirements, performed estimation assessments, and created test plans and test scenario designs for all releases.
  • Worked with the business analysts to identify key business scenarios for performance testing.
  • Involved in preparing performance test data.
  • Used HP LoadRunner to design/develop performance testing automation scripts, functions, and scenarios, processes based on complex situations.
  • Created and debugged performance test scripts using the HP LoadRunner 11 VuGen component with the Web (HTTP/HTML), Web Services, and Ajax protocols.
  • Involved in performance testing PeopleSoft HCM and FSCM, including HR, ESS/MSS self-service, Payroll, GL, and AP/AR, using LoadRunner.
  • Performance tested most of the HCM and FSCM modules.
  • Monitored the Web, App, Process Server and DB Servers while the System is under testing
  • Closely monitored SQL traces using PeopleTools SQL Trace, application server logs, batch timing reports, web server access logs, and process monitors.
  • Inserted transactions and checkpoints into the VuGen scripts and parameterized and correlated them (a brief scripting sketch follows this list).
  • Carried out multiple phases of load tests and scheduled them as per requirements.
  • Executed multi-user performance tests using LoadRunner. Monitored Controller through online, real-time output messages.
  • Executed Load, Stress and Endurance Testing to simulate a process with over 1000 Vusers.
  • Involved in preparing performance test plan and defining baseline results.
  • Enhanced the scripts with correlation, parameterization, and run-time settings.
  • Involved in creating and debugging the scenarios in Controller.
  • Set up the schedule and scenario configuration and ran the scenarios.
  • Diagnosed memory leaks and garbage collection issues with HP Diagnostics.
  • Identified problems and bottlenecks and recommended remediation using HP Diagnostics.
  • Involved in configuring SiteScope monitors (WebLogic domains, clusters, and DB connection pools) in the LoadRunner Controller scenario.
  • Analyzed test results using HP Load Runner Analyzer tool on-line graphs and reports and looked for performance delays, network or client delays, CPU performance, I/O delays, database locking, or other issues at the database server.
  • Executing the performance test scripts and logging the performance defects in HP Quality Center.
  • Comparing the results to baseline and providing the results comparison chart to the development team.
  • Followed up with the development team on defect fixes and re-executed tests to validate them.
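
A sketch of how the transactions and think time mentioned above are typically laid out around a multi-step business process. Page names, URLs, field values, and timings are placeholders; the sub-transactions simply let the end-to-end flow and each individual step be reported separately in Analysis.

    Action()
    {
        // End-to-end timer for the whole business process, with a
        // sub-transaction per step so each page is reported separately.
        lr_start_transaction("Submit_Timesheet");

        lr_start_sub_transaction("Timesheet_Open", "Submit_Timesheet");
        web_url("open_timesheet",
            "URL=https://hcm.example.com/timesheet",
            "Mode=HTML",
            LAST);
        lr_end_sub_transaction("Timesheet_Open", LR_AUTO);

        // Simulated user think time; replayed per the run-time settings.
        lr_think_time(10);

        lr_start_sub_transaction("Timesheet_Save", "Submit_Timesheet");
        web_submit_data("save_timesheet",
            "Action=https://hcm.example.com/timesheet/save",
            "Method=POST",
            "Mode=HTML",
            ITEMDATA,
            "Name=hours", "Value=40", ENDITEM,
            LAST);
        lr_end_sub_transaction("Timesheet_Save", LR_AUTO);

        lr_end_transaction("Submit_Timesheet", LR_AUTO);

        return 0;
    }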

Environment: LoadRunner 9.5/11.0, SAP ECC 6.0 and Web (HTTP/HTML), HP Quality Center (ALM), HP SiteScope

Confidential, Fair Lawn NJ

Senior Performance Engineer

Responsibilities:

  • Gathered user requirements and created test plans and test scenario designs.
  • Prepared Test Strategies, Test Plan and Test Cases as per the business requirements and Use Cases.
  • Provided Test Estimates for various phases of the project.
  • Involved in Load Testing of various modules and software application using LoadRunner.
  • Executed automated Functional testing with Soap UI for SOA and web service testing.
  • Performed Load and Performance Testing of critical areas of various Oracle 11i modules like AR, AP, OM and GL
  • Developed the load test scripts using LoadRunner, enhancing them by adding transactions, parameterizing constant values, and correlating dynamic values.
  • Developed Vuser scripts in the Web (HTTP/HTML), Web Services, and Click & Script protocols.
  • Generated test scripts in LoadRunner and customized them with the required logic, correlation, parameterization, pacing, and logging options and preferences (a brief scripting sketch follows this list).
  • Executed Load, Stress, Benchmark and Endurance Tests.
  • Prepared load generators, executed tests, and tracked metrics such as TPS, response time, transaction graphs, run-time graphs, and resource graphs.
  • Analyzed the results of the Load test using the Load Runner Analysis tool, looking at the online monitors and the graphs and identified the bottlenecks in the system.
  • Updated test metrics, test plans, and documentation at each major release and performed Regression testing using LR scripts.
  • Performed QA Process management, identified functional/Code changed vs. business impact and Re-executed Performance tests before releasing to production.
  • Used load test monitors to monitor the Oracle database, the Oracle 11i HTTP server, and UNIX resources on the database and application servers.
  • Reported and tracked defects in JIRA.
  • Worked closely with software developers and take an active role in ensuring that the software components meet the highest quality standards
  • Generating detailed Test Reports and presenting them to Project Team.
  • Responsible for multiple rounds of Performance Testing after system refinement.
  • Preparation of Final Test Reports and coordinating QA sign-off for each project.
  • Supported Production team to understand and execute the processes.
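
A minimal sketch of the kind of script logic customization noted above: a value captured from the response drives a branch, and the iteration is abandoned cleanly when a precondition is not met. The boundaries, URL, and the {itemCount}/{searchTerm} parameters are hypothetical.

    Action()
    {
        // Capture a value from the search results page, e.g. the number
        // of items returned (boundaries are illustrative only). With
        // "NotFound=warning" a miss does not fail the step outright.
        web_reg_save_param("itemCount",
            "LB=totalItems=\"",
            "RB=\"",
            "NotFound=warning",
            LAST);

        lr_start_transaction("Search_Items");
        web_url("search",
            "URL=https://shop.example.com/search?q={searchTerm}",
            "Mode=HTML",
            LAST);
        lr_end_transaction("Search_Items", LR_AUTO);

        // Custom logic: if nothing was returned there is no point in
        // continuing, so fail and move on to the next iteration.
        if (atoi(lr_eval_string("{itemCount}")) == 0) {
            lr_error_message("No items returned for %s",
                             lr_eval_string("{searchTerm}"));
            lr_exit(LR_EXIT_ITERATION_AND_CONTINUE, LR_FAIL);
        }

        return 0;
    }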

Environment: HP LoadRunner 8.5/9.1, HP Quality Center, VisualVM, Oracle 11g, WebLogic 11 Enterprise Manager, WebLogic 12 Cloud Control.

Confidential

Performance Analyst

Responsibilities:

  • Gathering and understanding the requirements
  • Designing load-testing scenarios for various load conditions.
  • Co-ordinated several meetings with the internal client team
  • Tested whether the targeted user load could perform the business processes; modeled the workload after studying user activity.
  • Developed Performance test plan and test scripts for web-based applications.
  • Constructed benchmarks for the applications that include J2EE, and Oracle 11i products.
  • Extensively used the following LoadRunner protocols: Web (HTTP), J2EE, .NET, Citrix, and Oracle.
  • Captured data in VuGen scripts using ANSI C functions, correlation, transaction timings, verification checkpoints, and parameterization (a brief scripting sketch follows this list).
  • Creating Performance scenarios and scripts for doing multiple iterations.
  • Identified functionality and performance issues, including: deadlock conditions, database connectivity problems, and system crashes under load.
  • Responsible for editing, updating, and maintaining load testing of existing scripts.
  • Configured and used SiteScope performance monitors to monitor and analyze server performance, generating reports on metrics from CPU utilization and memory usage to load average.
  • Analyzed LoadRunner Metrics and other performance monitoring tools results during and after performance testing on Application server database and generated various Graphs and Reports.
  • Responsible for analyzing results, reports and charts to see response times of individual transactions with respect to whole applications.
  • Monitored diagnostics on multi-tiered architecture using Foglight and Grid Control
  • Monitored system resources and analyzed the results for identifying bottlenecks responsible for the poor performance
  • Involved in Server Side Monitoring during performance test execution
  • Identified gaps in the performance testing capability, capacity planning, and production monitoring and developed a plan to fill gaps.
  • Analyzing the Performance Test Execution Report and preparing Final Testing Report
  • Presented test results to stakeholders from time to time.
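
A small sketch of the ANSI C transaction-timing usage mentioned above: the elapsed time of a still-open transaction is read mid-flow and published as a custom data point, so it appears alongside the standard LoadRunner metrics in Analysis. The transaction name, URL, and metric name are illustrative only.

    Action()
    {
        double elapsed;

        lr_start_transaction("Generate_Report");

        web_url("report",
            "URL=https://reports.example.com/run?id=42",
            "Mode=HTML",
            LAST);

        // Read the time spent so far in the open transaction and publish
        // it as a user-defined data point for the Analysis graphs.
        elapsed = lr_get_transaction_duration("Generate_Report");
        lr_user_data_point("report_first_step_seconds", elapsed);

        lr_end_transaction("Generate_Report", LR_AUTO);

        return 0;
    }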

Environment: HP LoadRunner, WinRunner, HP BAC, HP J2EE Diagnostics, QC, ClearQuest, Oracle, Foglight, Linux, Grid Control, Wily, J2EE, .NET, Windows, IIS 5, JMeter, Citrix, WebSphere
