Performance Tester Resume

Atlanta, GA

SUMMARY

  • 9+ years of experience in Performance Engineering and Software Quality Assurance, covering capacity, availability, and performance processes.
  • Extensive experience in manual and automated testing of web-based and client/server applications, with proficiency in load and performance testing. Good experience with Agile methodology.
  • Involved in analysis, design, implementation, execution, maintenance and documentation for system testing.
  • Experience in analyzing JVM heap usage, Java garbage collection (GC) logs, system logs, and heap dumps to diagnose performance issues and provide Root Cause Analysis (RCA), fine-tuning the relevant parameters in production and test environments.
  • Proficient in writing test plans, test cases, test scripts and test result reports.
  • Heavily involved in performance testing, functional testing, and regression testing using automated testing tools including LoadRunner, Performance Center, QuickTest Professional, Quality Center, and JMeter.
  • Extensively used Java VisualVM and heap dumps to identify memory leak patterns (root cause analysis, RCA).
  • Monitored the performance of application and database servers during test runs using APM tools such as Dynatrace, AppDynamics, Wily Introscope, AWS CloudWatch, and SiteScope.
  • Experienced in designing multiple LoadRunner (VuGen) scripts with different protocols, such as Web (HTTP/HTML), Ajax, Ajax TruClient, Java Vuser, Web Services, and SAP Web, for load testing different applications.
  • Expertise in Web Services (SOAP/REST) and experienced in using SoapUI for testing in SOA environments.
  • Significant experience in load testing various applications, including .NET, J2EE, and COM/DCOM implementations.
  • Expertise in understanding Business Processes from provided non-functional requirements (NFR) and converting them into practical Test Scenarios and analyzing the test results for reporting.
  • Bottleneck analysis, including JVM analysis (threads, heap, and GC).
  • Knowledge of configuring and administering SCM (SVN), build (Maven), and CI/CD (Jenkins) automation tools.
  • Configured web, application, and database server performance monitoring for various applications using HP Diagnostics, SiteScope, Dynatrace, AWS CloudWatch, and CA Introscope.
  • Experienced in Design and Execution of Test criteria, Scenarios, and Scripts from requirements.
  • Knowledge of Java and J2EE programming.
  • Proficient in Creating and Enhancing scripts, Executing Tests and Analyzing Performance results using LoadRunner, Wily Introscope, SiteScope, Splunk, HP Diagnostics, GUI dashboard and Performance Center.
  • Comprehensive knowledge of Linux, UNIX and Windows Operating Systems.
  • Knowledge of AWS services: EC2, VPC, CloudFormation, CloudFront, CloudWatch, IAM, RDS, and S3.
  • Generate performance graphs, session reports, and other related documentation required for validation and analysis. Publish results and obtain appropriate sign-off. Prepare detailed status reports and monitor all defects and issues.
  • Good experience in analysis, debugging, and identification of SAP issues using transaction codes (ST03N), STAD, SM50, the system log (SM21), EWA reports, AWR reports, etc.
  • Involved in Planning and Translation of Software Business Requirements into test conditions, execution of all types of tests, and identification as well as logging of Software bugs for business process improvement.
  • Quick Learner, excellent problem solving, conflict resolution skills with a strong technical background and good interpersonal and communication skills.
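As an illustration of the GC-log analysis work described above, the sketch below parses GC pause times out of a log and summarizes total and worst-case pause. The log lines and regex here are simplified placeholders, not any specific JVM's exact log format, which varies by JVM version and flags.

```python
import re

# Illustrative GC log excerpt (simplified; real JVM GC log formats differ)
GC_LOG = """\
[12.345s] GC pause (young) 0.0234 secs
[15.678s] GC pause (young) 0.0198 secs
[20.001s] Full GC 0.4512 secs
"""

# Match both young-generation pauses and full GCs, capturing the pause duration
PAUSE_RE = re.compile(r"(Full GC|GC pause \(\w+\)) (\d+\.\d+) secs")

def summarize_gc(log_text):
    """Return (pause_count, total_pause_s, max_pause_s) from a GC log."""
    pauses = [float(m.group(2)) for m in PAUSE_RE.finditer(log_text)]
    return len(pauses), round(sum(pauses), 4), max(pauses)

count, total, worst = summarize_gc(GC_LOG)
```

A long full-GC pause standing out from the young-generation pauses, as in this sample, is the typical starting point for heap-size or collector tuning.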

TECHNICAL SKILLS

Test Automation tools: Performance Center 12.53/12.55, LoadRunner 7.5/7.8/8.1/9.x/11.0/12.01/12.53, SoapUI 3.6.1, HP QC ALM 9.x/10.0/11/12, QTP 9.x/10

Databases: Oracle 10g/8i/8.0/7.0, IBM DB2, MS SQL Server 6.5/7.0/2000, MS Access 7.0/97/2000.

Programming: C, Java, SQL, PL/SQL (stored procedures, functions, triggers, cursors), XML, HTML 4.0, Visual Basic 6.0/5.0, Python, Unix shell scripting, SQL*Plus.

Monitoring/APM Tools: SiteScope, Wily Introscope, Splunk, HP Diagnostics, New Relic, AppDynamics, and Dynatrace

Web/Application Servers: WebSphere 4.x/7.x, Tomcat, Java Web Server 1.2, Microsoft Personal Web Server, WebLogic Server 5.x

Operating Systems: Linux 6.9, Sun Solaris 2.6/2.7, HP-UX, IBM AIX 4.2/4.3, Windows 3.x/95/98/2000/2003, Windows XP, Unix.

PROFESSIONAL EXPERIENCE

Confidential, Dallas, TX

Sr. Performance Engineer

Responsibilities:

  • Created Test Plan which includes Change Request details, Testing Schedule, Testing Resources, Tools required, Risks and testing of end-to-end scenarios.
  • Responsible for analyzing results such as CPU usage, memory usage, garbage collection/Java JVM heap size, server response times, database response times, active/idle threads, etc.
  • Used Performance Center to execute tests and Dynatrace to analyze the root cause of performance bottlenecks.
  • Extensively used Java VisualVM and heap dumps to identify memory leak patterns (root cause analysis, RCA).
  • Analyzed database performance issues with the help of ADDM and AWR reports.
  • Identified bottlenecks for a clustered environment relating to Indexes, Connection Pools, Garbage collections, Memory heap size and fixed them by changing configurations with the help of development and DB team.
  • Designed multiple LoadRunner (VuGen) scripts with different protocols, such as Web (HTTP/HTML) and Web Services, for load testing different GUI and client-based applications.
  • Created detailed test status reports, web trend analysis reports, cross result analysis report and graphical charts for upper management using Load Runner analysis component.
  • Created, executed, and monitored manual and goal-oriented scenarios for the application with the LoadRunner Controller.
  • Ran full formal performance tests, including load, capacity, stress, and duration testing.
  • Effectively used all components of Performance Center ALM 12.53, which covers the complete performance test cycle.
  • Monitored server logs and events using SiteScope.
  • Configured the LoadRunner Controller and Performance Center to run tests and verified that LoadRunner scripts worked as expected on different load generator machines.
  • Added various monitoring parameters to the LoadRunner Controller; also used Dynatrace and HP Diagnostics to monitor database and application servers.
  • Monitored the various machines during load tests and informed the corresponding teams of any issues.
  • Monitored SQL Server during performance tests for buffer cache hit ratio, available bytes, page reads/writes, etc.
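The buffer cache hit ratio monitored above is a simple derived metric. A minimal sketch of one common formulation, the fraction of reads served from memory rather than disk, with made-up counter values:

```python
def buffer_cache_hit_ratio(logical_reads, physical_reads):
    """Fraction of reads satisfied from the buffer cache instead of disk."""
    if logical_reads == 0:
        return 1.0  # no reads at all: treat as fully cached
    return (logical_reads - physical_reads) / logical_reads

# Illustrative counter values sampled during a load test
ratio = buffer_cache_hit_ratio(logical_reads=100_000, physical_reads=2_500)
```

During a load test, a ratio that degrades as concurrency climbs usually points at an undersized buffer cache or missing indexes forcing extra physical I/O.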

Environment: Java, WebSphere 8.x, Performance Center 12.53, SiteScope, Oracle, Linux, XML, VuGen, Java JRE 1.7, Dynatrace, AWS, Web Services, SoapUI.

Confidential, Atlanta, GA

Performance Engineer

Responsibilities:

  • Worked with the application team and developers to select use cases and gather performance test requirements and SLAs.
  • Developed the master test plan, which covers the entire scope, testing resources, testing strategy, and testing of end-to-end scenarios.
  • Involved in developing the automation framework.
  • Designed multiple LoadRunner scripts (VuGen) with different protocols like Web (Http/Html), Ajax, Web services for load testing different applications.
  • Monitored and Analyzed Garbage collection using AppDynamics and participated in tuning/optimization of performance.
  • Created and executed load test scenarios using JMeter and LoadRunner.
  • Executed load, stress, endurance, and failover tests for a variety of security applications.
  • Participated in discussions with the QA manager, Developers and Administrators in fine-tuning the applications based on the Results produced by Analysis Tool.
  • Worked with developers and release management to design and implement Jenkins delivery pipelines.
  • Used SoapUI and LoadUI for load testing different APIs where a frontend application was not available.
  • Worked with the Java Message Service (JMS): published messages to topics/queues, controlled the inbound message rate while publishing, calculated queue depth, and created pending-message trend analyses.
  • Created detailed test status reports, performance capacity reports, web trend analysis reports, and graphical charts for upper management using Load Runner analysis component.
  • Created Vusers to emulate concurrent users, inserted rendezvous points in the Vuser scripts, and executed the Vuser scripts in various scenarios, both goal-oriented and manual, using LoadRunner.
  • Configured web, application, and database server performance monitoring using the LoadRunner Controller and HP SiteScope.
  • Used Splunk to monitor response times and errors for each component of the tier.
  • Responsible for performance, stress and capacity testing assessments for a number of company products.
  • Used SQL Developer, MS Server Management Studio and Toad to retrieve data from Oracle/SQL Server database.
  • Attended strategy and planning meeting with management and Application team on regular basis.
  • Extensively worked with webMethods: created packages in Developer, built workflow and code services, and integrated with UNIX systems to perform middleware performance testing.
  • Extensively used Java VisualVM and AppDynamics to identify memory leak patterns (root cause analysis).
  • Coordinated and collaborated with Developers, Production support team, Project Management folks and Requirements Analysts to resolve issues.
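The JMS queue-depth and pending-message trend analysis mentioned above reduces to tracking the backlog as cumulative enqueues minus dequeues per sampling interval. A minimal sketch, with made-up sample rates:

```python
def queue_depth_trend(enqueued_per_interval, dequeued_per_interval, initial_depth=0):
    """Queue depth at the end of each interval: backlog = in - out, floored at zero."""
    depth = initial_depth
    trend = []
    for enq, deq in zip(enqueued_per_interval, dequeued_per_interval):
        depth = max(0, depth + enq - deq)
        trend.append(depth)
    return trend

# Producer outruns the consumer mid-test, then the backlog drains
trend = queue_depth_trend(
    enqueued_per_interval=[100, 150, 150, 50],
    dequeued_per_interval=[100, 100, 100, 150],
)
```

A trend that climbs and never drains during a steady-state test indicates the consumer cannot keep up with the inbound message rate, which is exactly the condition the inbound-rate throttling described above guards against.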

Environment/Tools: Tomcat, Splunk, Java VisualVM, AppDynamics, Windows Server 2008/2012, JMeter, AIX, DB2, HP ALM Quality Center 12, LoadRunner 12.01, SiteScope 11.24, WinSCP.

Confidential

Performance Tester

Responsibilities:

  • Reviewed and analyzed the requirements of the new system and identified discrepancies that could hinder System, Regression, and User Acceptance Testing (UAT).
  • Participated in all meetings planned for the release, including design reviews and test execution timeline discussions, and obtained the necessary technical requirements.
  • Met with the project team to work out project business volume metrics.
  • Gathering and analyzing business and technical requirements for Performance Testing purposes.
  • Configure all necessary hardware and software to support Performance Center.
  • Planning, development and testing of scripts.
  • Independently developed Performance Center VuGen scripts according to test specifications/requirements to validate against performance SLAs.
  • Enhanced user scripts with correlation, parameterization, transaction points, rendezvous points, and various LoadRunner functions.
  • Parameterized the Performance Center scripts to access data sheets based on environment like QA, UAT and Production.
  • Created automated scripts for API WSDLs and the portal application using VuGen in Performance Center 9.52 (Web Services protocol for the APIs, Web HTTP/HTML protocol for the frontend portal) for regression scenarios.
  • Heavily involved in performance testing, functional testing, and regression testing using automated testing tools including LoadRunner, Performance Center, QuickTest Professional, Quality Center, WinRunner, and TestDirector.
  • Created scenarios in Performance Center and set up monitors to track load generators for performance testing.
  • Performed correlation by capturing the dynamic values and parameterizing the data dependencies that are always part of the business process.
  • Conducted several Load tests such as 1 Hour peak production load, Reliability and Stress tests to identify the performance issues.
  • Ran the scripts for multiple users using controller in Performance Center for GUI/API regression/Load testing.
  • Involved in determining scalability and bottleneck testing of applications.
  • Identifying bottlenecks in Network, Database and Application Servers using Performance Center Monitors.
  • Monitored average transaction response time, network data, hits per second, throughput, and Windows resources such as CPU usage and available/committed bytes for memory.
  • Analyzed Throughput Graph, Hits/Second Graph, Transactions per second Graph and Rendezvous Graphs using LR Analysis Tool.
  • Analyze, interpret, and summarize meaningful and relevant results in a complete Performance Test Analysis report.
  • Analyzed results and provided Developers, System Analysts, Application Architects and Microsoft Personnel with information resulting in performance tuning the Application.
  • Develop and implement load and stress tests with HP Performance Center and present performance statistics to application teams and provide recommendations of how and where performance can be improved.
  • Verify that new or upgraded applications meet specified performance requirements.
  • Worked in Quality Process - Prepared monthly Quality Reports, Benchmarking Reports.
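The analysis and reporting described above centers on summary statistics such as the average and 90th-percentile response time checked against an SLA. A minimal sketch of how those numbers come out of raw transaction timings; the sample data and the 500 ms threshold are illustrative, not from any actual test run:

```python
import math

def percentile(samples, pct):
    """Nearest-rank percentile: smallest value with at least pct% of samples at or below it."""
    ordered = sorted(samples)
    # integer-first multiplication avoids floating-point surprises in the rank
    rank = max(1, math.ceil(len(ordered) * pct / 100))
    return ordered[rank - 1]

# Illustrative per-transaction response times collected during a load test
response_times_ms = [120, 135, 150, 160, 180, 210, 240, 320, 480, 900]

avg_ms = sum(response_times_ms) / len(response_times_ms)
p90_ms = percentile(response_times_ms, 90)
meets_sla = p90_ms <= 500  # illustrative 500 ms SLA threshold
```

Reporting the 90th percentile alongside the average matters because a single slow outlier (the 900 ms sample here) barely moves the percentile while noticeably inflating the mean.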

Environment: .NET, C#, ASP.NET, HP-UX, HIS 4.0/7.0, Web Services, DB2, LoadRunner 9.52/HP Performance Center 11.0, HP ALM Quality Center 11.0, HP Diagnostics Server, HP SiteScope

Confidential

Software Tester/Performance Engineer

Responsibilities:

  • Created automated scripts by including timers for recording response times, and test checks for confirming the retrieval of the correct page.
  • Involved in performance testing of server’s load and scalability by creating multiple Virtual Users by using Load Runner Virtual User Generator component.
  • Created detailed test status reports, performance capacity reports, web trend analysis reports, and graphical charts for upper management using Load Runner analysis component.
  • Ran full formal performance tests, including peak, breakpoint, burst, longevity, and failover.
  • Configured Web, Application, and Database server performance monitoring setup using LoadRunner Controller.
  • Good experience in analysis, debugging, and resolution of SAP issues using transaction codes (ST03N), STAD, SM50, the system log (SM21), EWA reports, AWR reports, etc.
  • Extensively used Wily Introscope for finding out memory leak pattern (root cause analysis - RCA).
  • Determined a baseline for the current production system running Oracle 8i. This baseline represents the best-case response time the application can deliver and provides a benchmark for reference as user transaction volumes scale on the target infrastructure.
  • Responsible for executing load testing in a client/server environment for a Java application to assess the impact of new changes to the application.
  • Testing was conducted on a subset of the Rational tools, including RequisitePro, ClearQuest, Test Manager, and Project Console.
  • Coordinated and collaborated with Developers, Production support team, Project Management folks and Requirements Analysts to resolve issues.
  • Involved in developing the automation framework.
  • Used various parameterization techniques with Data Table, Random, Environment Variable and Action parameters.
  • Modified the automation scripts by inserting checkpoints to verify object properties.
  • Created Data Driven test phases by creating different data tables.
  • Parameterized various links in the application for Functional/Integration testing.
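The parameterization techniques listed above, pulling each iteration's test data from a data table with sequential, random, or unique selection as in LoadRunner, can be sketched generically. The modes, usernames, and function name below are illustrative, not any tool's actual API:

```python
import itertools
import random

def parameter_feed(values, mode="sequential", seed=None):
    """Yield one parameter value per iteration, mimicking common data-table modes."""
    if mode == "sequential":   # walk the table in order, wrapping around at the end
        yield from itertools.cycle(values)
    elif mode == "unique":     # each iteration consumes a fresh row; stops when exhausted
        yield from iter(values)
    elif mode == "random":     # independent random pick each iteration
        rng = random.Random(seed)
        while True:
            yield rng.choice(values)

# Illustrative data table of test accounts
usernames = ["user01", "user02", "user03"]
feed = parameter_feed(usernames, mode="sequential")
first_five = [next(feed) for _ in range(5)]
```

Unique mode is the one that matters for data that cannot be reused across Vusers (for example, one-time order numbers), which is why its iterator deliberately runs dry instead of wrapping.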

Environment: SAP, ABAP, Windows 2003, HP-UX, Linux, Wily Introscope, Oracle 8i & 9i, MS SQL Server, XML, LoadRunner.
