
Performance Tester Resume


New York

PROFESSIONAL SUMMARY:

  • 6 years of extensive experience in performance testing, including 2 years of manual testing. Solid knowledge of the Software Development Life Cycle (SDLC), the QA life cycle, Software Quality Assurance (SQA), and the implementation of various client/server and web-based applications using Agile methodology.
  • Experienced in using the automated testing tools HP LoadRunner, QTP, and Test Director/Quality Center.
  • Expert in performance testing with an excellent command of HP LoadRunner, HP Performance Center, and JMeter; expertise in developing complex performance test scripts using VuGen.
  • Strong working experience in different types of performance and reliability testing (load, stress, endurance, capacity, volume, scalability, reliability, etc.) using LoadRunner and JMeter.
  • Expertise in creating test plan, test strategy, and test case documents and executing them both manually and through automation.
  • Expert in report gathering, load modeling, and report analysis for performance testing.
  • Experienced in reviewing and analyzing business requirement documents and user specifications; strong at developing functional test cases and converting them into test scripts using HP ALM and QTP.
  • Experienced in creating LoadRunner scenarios and scheduling virtual users to generate realistic load on servers using LoadRunner load generator machines.
  • Expert in gathering test data, including preconditions, test inputs, test results, and regression data.
  • Experienced in planning the test strategy for test automation, selecting test cases for regression testing, and automating them using QTP and ALM.
  • Experienced in creating test cases and test scenarios and executing API tests using JMeter and Postman; proficient in writing functions to emulate real-time behavior in LoadRunner scripts.
  • Experienced in developing performance test plans from requirements and application design. Worked closely with developers and support groups to ensure test environment, application, and data readiness. Well trained in assisting application developers and technical support staff in identifying and resolving defects.
  • Well versed in SOAP, REST, HTML, HTTP, and external data source testing, with deep knowledge of web service creation, development, design, and functionality.
  • Managed performance across the application life cycle using an application performance management tool suite.
  • Excellent knowledge of performance tuning.
  • Expert in tracking and reporting bugs using tools such as JIRA and Backlog.
  • Comfortable with various leading operating systems: Windows NT/98/95/2000/XP and UNIX.
  • In-depth knowledge of Oracle Database, SQL, Java, JSP, HTML, and Visual Basic. Excellent working experience in fast-paced environments using Agile methodology.
  • Strong communication skills; a good team player with good time management and the ability to work on different projects concurrently.
  • Strong knowledge of and experience with tools such as ALM, LoadRunner, QTP (UFT), Quality Center, and Performance Center.
  • Used Wily Introscope to monitor the health of various applications. Used APM tools to monitor web applications on Java and .NET platforms.
  • Extensive working experience in server monitoring with SiteScope, New Relic, and HP Diagnostics.
  • Experienced in monitoring and diagnosing application performance and configuration issues. Skilled in setting up performance environments and monitoring databases for session, connection pool, and memory issues.
  • Utilized database, network, application server, and WebLogic monitors during test execution to identify bottlenecks, bandwidth problems, infrastructure problems, and scalability and reliability benchmarks.
  • Configured and used SiteScope performance monitors to analyze server performance, generating various reports covering CPU utilization, memory usage, load, etc.
  • Analyzed LoadRunner metrics and other performance monitoring tool results on the application servers and databases, during and after performance testing, and generated various graphs and reports.

TECHNICAL SKILLS:

Load Testing Tools: LoadRunner 9.x/11.x, HP Performance Center, HP ALM Performance Center, Quality Center 9.0/9.2, HP QuickTest Professional, JMeter, GitHub, and Jenkins.

LoadRunner Protocols: HTTP, Citrix, SAP, PeopleSoft, TruClient, REST API, Web Services, Java RMI, Flex.

Monitoring Tools: New Relic, SiteScope, Wily Introscope, Oracle Enterprise Manager, Dynatrace, HP Diagnostics

Web/Application Servers: IBM WebSphere, BEA WebLogic 7.x/8.x/10.x, Tomcat 5.0/5.5

Operating Systems: Windows 2008 R2, Windows NT, UNIX

Languages: Java, JSP, HTML, Visual Basic, Oracle, C, SQL, C#

Databases: MS SQL Server 7.0/2000/2005/2008, DB2, Oracle 9i/10g/11g, SQL, TOAD

Methodologies: Performance Testing, Quality Assurance, Agile.

WORK EXPERIENCE:

Confidential, New York

Performance Tester

Responsibilities:

  • Involved in various meetings to gather specifications and requirements (load metrics, performance requirements, SLAs, workflows, etc.) prior to testing.
  • Participated in and implemented Agile testing practices for widely distributed teams, and executed the performance test plan and test cases in a standard format.
  • Generated LoadRunner automation scripts and accurately prepared test data with the help of additional sub-scripts.
  • Worked closely with other engineers to determine whether the proposed architecture could handle current and anticipated production volumes.
  • Executed different performance tests (smoke, baseline, load, stress, capacity, and endurance tests).
  • Created load test scripts using VuGen in the following protocols: HTTP, AJAX, SOAP, ODBC, Terminal Emulator, and Java (web services).
  • Developed VuGen scripts and enhanced the basic scripts by adding custom code.
  • Analyzed the reports and validated that the forecasted load levels could be reached with acceptable page response times for the given functionalities.
  • Identified the product's breaking point through stress tests by increasing the number of concurrent users until application performance degraded.
  • Executed cloud-based performance tests using BlazeMeter.
  • Used the LoadRunner rendezvous concept to hold virtual users and release them together, generating peak load on the server to stress it and measure its performance (see the sketch after this list).
  • Identified the response times of specific system components to troubleshoot performance bottlenecks.
  • Worked with the SDLC team to troubleshoot the root cause of issues related to database and application servers using the Dynatrace tool.
  • Monitored applications and used CA Wily APM to generate reports, graphs, and analyses of application latency.
  • Used a Java garbage collection monitoring tool to determine whether there were any memory leaks during the capacity and endurance tests.
  • Used Dynatrace to measure website performance in the test environment and capture performance metrics for key product features.
  • Used the APM tools Dynatrace and AppDynamics to monitor end-user experience, overall application performance, business transaction performance, and application infrastructure performance across all tiers (web, app, DB) of the applications; added Dynatrace headers to VuGen scripts to monitor response times closely.
  • Used Splunk to check whether messages were being triggered at the back end.
  • Used HP Diagnostics and Wily Introscope to further monitor graphs such as JVM heap, GC, thread status, Java process utilization, JVM exceptions, collection leaks, and context switches/sec to pinpoint issues.
  • Collected JVM heap and garbage collection frequency on the DCT server during tests.
  • Worked on performance testing reports and made recommendations for application performance improvements.
  • Identified and prioritized problems and communicated bug issues to developers using the bug-tracking tool Quality Center.
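A minimal sketch of the kind of VuGen enhancement and rendezvous usage described above, assuming the Java Vuser protocol; the transaction name, rendezvous point ("Login", "peak_login"), and think time are illustrative placeholders rather than values from the actual project.

```java
// Minimal LoadRunner Java Vuser sketch (lrapi); names below are placeholders.
import lrapi.lr;

public class Actions {

    public int init() throws Throwable {
        return 0;  // one-time setup (connections, test data, etc.)
    }

    public int action() throws Throwable {
        // Hold all virtual users at this point and release them together,
        // so the server receives the peak concurrent load described above.
        lr.rendezvous("peak_login");

        lr.start_transaction("Login");
        // ... protocol/business calls for the system under test go here ...
        lr.end_transaction("Login", lr.PASS);

        lr.think_time(3);  // emulate real user pacing between iterations
        return 0;
    }

    public int end() throws Throwable {
        return 0;  // cleanup
    }
}
```

In the C-based Web (HTTP/HTML) protocol the equivalent calls are lr_rendezvous, lr_start_transaction, and lr_end_transaction.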

Confidential, Baltimore, MD

Jr Performance Tester

Responsibilities:

  • Developed performance test suites, creating thread groups and setting up samplers using the JMeter testing tool.
  • Involved in localization testing and performance testing of web-based modules; handled load testing using JMeter.
  • Provided support in performance testing using JMeter; tasks included developing test plans, test scripts, and reports.
  • Worked in an Agile development environment.
  • Participated in weekly Scrum meetings for application development.
  • Developed scenario-based testing for the JMeter scripts.
  • Used the GitHub plugin for Jenkins, the most basic plugin for integrating Jenkins with GitHub projects.
  • Used GitHub to schedule builds, pull code and data files from the GitHub repository to the Jenkins machine, and automatically trigger a build on the Jenkins server after each commit to the GitHub repository.
  • Created Jenkins Pipeline scripts in Groovy for running JMeter; pipelines are defined with a Groovy-based DSL, so Jenkins and Java APIs can be used to define the job (see the sketch after this list).
  • Created, scheduled, and ran scenarios using JMeter and generated the necessary graphs.
  • Worked extensively with JMeter to create thread groups and test the web application under various loads for key business scenarios.
  • Developed performance tests for continuous benchmarking using JMeter and Jenkins.
  • Used the BlazeMeter Chrome extension to record all of the HTTP/S requests the browser sends, create a JMeter script, and upload it to BlazeMeter, where it can be executed with a single click; the extension supports recording, browsing, uploading, and running test scripts and can create JMX, JSON, or YAML files for running in JMeter and BlazeMeter.
  • Created and executed JMeter scripts for performance testing of the portal.
  • Verified requirements via automated scripts and installed the Rational Robot tool in different environments.
  • Executed automated scripts for different releases of the Automated Movement and Identification Solutions (PD AMIS) project and verified that the automated scripts identified issues within the product.
  • Involved in modifying the automated scripts based on problem reports raised against the application by end users.
  • Provided support in web application performance testing using the Silk Performer tool.
  • Provided support for functional testing of the web application using SilkTest.
  • Provided support for, and was responsible for, managing and tracking infrastructure requirements and software upgrades.
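The pipeline work above depends on driving JMeter without the GUI. Below is a minimal sketch of assembling and running a small JMeter test plan directly from Java code, the kind of job a Groovy pipeline step can invoke; the host, path, thread counts, and JMeter home directory are placeholder assumptions, not values from the actual project.

```java
import org.apache.jmeter.control.LoopController;
import org.apache.jmeter.engine.StandardJMeterEngine;
import org.apache.jmeter.protocol.http.sampler.HTTPSamplerProxy;
import org.apache.jmeter.testelement.TestPlan;
import org.apache.jmeter.threads.ThreadGroup;
import org.apache.jmeter.util.JMeterUtils;
import org.apache.jorphan.collections.HashTree;

public class PortalLoadTest {
    public static void main(String[] args) {
        // Placeholder JMeter home; point this at a real JMeter install.
        String jmeterHome = "/opt/apache-jmeter";
        JMeterUtils.setJMeterHome(jmeterHome);
        JMeterUtils.loadJMeterProperties(jmeterHome + "/bin/jmeter.properties");
        JMeterUtils.initLocale();

        // HTTP sampler for one key business page (placeholder host/path).
        HTTPSamplerProxy sampler = new HTTPSamplerProxy();
        sampler.setDomain("portal.example.com");
        sampler.setPort(443);
        sampler.setProtocol("https");
        sampler.setPath("/login");
        sampler.setMethod("GET");
        sampler.setName("Open Login Page");

        // Loop controller: how many times each virtual user repeats the sampler.
        LoopController loop = new LoopController();
        loop.setLoops(10);
        loop.setFirst(true);
        loop.initialize();

        // Thread group: 50 virtual users ramped up over 60 seconds.
        ThreadGroup threadGroup = new ThreadGroup();
        threadGroup.setName("Portal Users");
        threadGroup.setNumThreads(50);
        threadGroup.setRampUp(60);
        threadGroup.setSamplerController(loop);

        // Assemble the test plan tree and run it headlessly.
        TestPlan testPlan = new TestPlan("Portal Performance Test");
        HashTree testPlanTree = new HashTree();
        HashTree threadGroupTree = testPlanTree.add(testPlan, threadGroup);
        threadGroupTree.add(sampler);

        StandardJMeterEngine engine = new StandardJMeterEngine();
        engine.configure(testPlanTree);
        engine.run();
    }
}
```

Equivalently, a pipeline stage can simply call JMeter in non-GUI mode (jmeter -n -t plan.jmx -l results.jtl) against a saved JMX file.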

Confidential, Baltimore

Jr Performance Tester

Responsibilities:

  • Analyzed the scalability, throughput and load testing metrics against test servers.
  • Assisted in designing, developing and reviewing test processes.
  • Compared and contrasted system performance with varying levels of physical resources (RAM, CPU, cores, disk caches, network) and compute nodes.
  • Defined problems pertaining to the system and created the test scripts to resolve the issue.
  • Built and implemented testing application suites and conducted analysis.
  • Collaborated with systems architects and software engineers on the development process.
  • Conducted troubleshooting of business and production issues.
  • Provided technical support to the business.
  • Conducted testing procedures and wrote reports for presentation to the teams.

Confidential, Baltimore, MD

Manual Tester

Responsibilities:

  • Wrote test plans and test cases according to business requirements.
  • Performed data-driven testing using the data driver wizard and parameterization.
  • Analyzed business requirement documents and technical specification documents to identify test scenarios and test procedures.
  • Worked in an Agile/Scrum sprint environment in order to accommodate changing requirements and feature sets.
  • Interacted with business analysts to gather requirements for business and manual testing.
  • Responsible for manual and automation testing.
  • Developed test plans and scripts, created defect management templates, created the initial test plan, and developed test cases and test scripts manually.
  • Responsible for checking the login functionality of the Confidential URL and verifying the functionality of the website using different formats of valid and invalid inputs.
  • Created, updated, and maintained the test matrix and traceability matrix.
  • Created and executed test plans and test cases for various types of testing, such as integration, performance, backend, stress, and regression testing.
  • Developed and executed automated test scripts for functional, integration, and regression testing.
  • Wrote SQL queries and performed backend testing for data validation, checking data integrity during migration from the back end to the front end (see the sketch after this list).
  • Identified bugs and debugged issues in the system.
  • Participated in walkthroughs and meetings with business analysts, developers, team leads, and the QA manager on a regular basis.
  • Provided daily updates to the team on bugs, defect issues, and their status.
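A minimal sketch of the kind of backend data validation described above, assuming JDBC connections to the source and target Oracle schemas; the connection strings, credentials, and the customers table are illustrative placeholders only.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class MigrationRowCountCheck {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details; real values came from the test environment.
        try (Connection source = DriverManager.getConnection(
                     "jdbc:oracle:thin:@//source-db:1521/SRC", "qa_user", "qa_pass");
             Connection target = DriverManager.getConnection(
                     "jdbc:oracle:thin:@//target-db:1521/TGT", "qa_user", "qa_pass")) {

            // Compare row counts for one migrated table (placeholder table name).
            long sourceCount = count(source, "SELECT COUNT(*) FROM customers");
            long targetCount = count(target, "SELECT COUNT(*) FROM customers");

            if (sourceCount == targetCount) {
                System.out.println("PASS: customers row counts match (" + sourceCount + ")");
            } else {
                System.out.println("FAIL: source=" + sourceCount + " target=" + targetCount);
            }
        }
    }

    private static long count(Connection conn, String sql) throws SQLException {
        try (Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery(sql)) {
            rs.next();
            return rs.getLong(1);
        }
    }
}
```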
