
Performance Tester Resume


Rosemont, IL

PROFESSIONAL SUMMARY:

  • 7 years of experience in software testing with a focus on LoadRunner and JMeter.
  • Diverse experience in all phases of the Software Development Life Cycle (SDLC) and Software Testing Life Cycle (STLC).
  • Exposure to all stages and methodologies of the SDLC, including Agile/Scrum, the V-model, and the Waterfall model.
  • Well versed in testing concepts and methodologies.
  • Involved in analyzing system requirements and developing test cases for functionality, regression, performance, system, and acceptance testing.
  • Experience in writing and executing test scripts using Selenium, LoadRunner, and JMeter.
  • Extensive experience working with SQL Server 2008/2010.
  • Expertise in understanding user requirements and translating business requirements into technical solutions.
  • Strong conceptual, analytical, and testing skills and excellent communication skills with leadership qualities.
  • Expertise in Virtual User Generator (VuGen) scripting for performance/load testing across multiple protocols (Web HTTP/HTML, Web Services, Ajax TruClient, Mobile, Citrix); a minimal scripting sketch follows this list.
  • Hands-on experience with monitoring and analysis tools such as HP SiteScope, Wily Introscope, Splunk, Dynatrace, and HttpWatch.
  • Responsible for analyzing Throughput, Hits per Second, Transactions per Second, and Rendezvous graphs using the LoadRunner Analysis tool.
  • Skilled in Troubleshooting and debugging of scripts and execution issues.
  • Excellent knowledge and skills in test monitoring for transaction response times, web server metrics, Windows/Linux/AIX system resources, web and application server metrics, database metrics, and J2EE performance.
  • Very good understanding of performance-related aspects of application servers (OS, JVM) and databases.
  • Hands-on experience with test repositories and tracking tools such as HP Quality Center and JIRA.
  • Hands-on experience with other tools and emulators such as SoapUI and RUMBA.
  • Skilled at database testing.
  • Experience in analyzing performance bottlenecks, root causes, and server configuration problems using LoadRunner monitors.
  • Hands-on experience with all types of performance tests (load, stress, endurance, spike, scalability, and volume).
  • Ability to handle multiple projects independently.
  • Good analytical and communication skills.
  • Committed and hardworking, with a drive to learn new technologies and take on challenging tasks.
  • Quick learner and adopter of new tools and technologies and their applicability to testing; excellent written and verbal communication; highly motivated self-starter.
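
The following is a minimal, illustrative sketch of the kind of VuGen (C) Web HTTP/HTML Vuser action referred to above; the endpoint, transaction name, and data-file parameters are hypothetical placeholders, not taken from any project described in this resume.

/*
 * Minimal VuGen (C) Web HTTP/HTML action - illustrative sketch only.
 * The endpoint, transaction name, and parameters are hypothetical.
 */
Action()
{
    /* Content check: verify the page actually rendered, not just that a response came back */
    web_reg_find("Text=Welcome", LAST);

    /* Measure the login step as a named transaction */
    lr_start_transaction("T01_Login");

    web_submit_data("login",
        "Action=https://example.com/login",               /* hypothetical endpoint */
        "Method=POST",
        "Mode=HTML",
        ITEMDATA,
        "Name=username", "Value={pUserName}", ENDITEM,     /* {pUserName} comes from a data file */
        "Name=password", "Value={pPassword}", ENDITEM,
        LAST);

    lr_end_transaction("T01_Login", LR_AUTO);

    lr_think_time(5);   /* emulate user think time between steps */
    return 0;
}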

TECHNICAL SKILLS:

Operating Systems: AIX, HP-UX, DOS, Solaris, Windows, and Linux

Languages: C, C++, Java/J2EE, VBScript, XML, UNIX Shell Scripting

Databases: Oracle, DB2, SQL Server, MS Access, and MySQL

GUI: VB, JSP, Java Applets, ASP, HTML

Web Related: DHTML, XML, VBScript, JavaScript, Applets, Java, JDBC

Testing Tools: LoadRunner, JMeter, SiteScope, HP Diagnostics, Wily Introscope, Dynatrace, AppDynamics

Web/Application Servers: WebLogic, WebSphere, IIS

Other: Quality Center, TeamQuest, Wily Introscope, and Performance Center

PROFESSIONAL EXPERIENCE:

Confidential, Rosemont, IL

Performance Tester

Responsibilities:

  • Understood the business processes and change requests from the business for different travel plans.
  • Created the load test plan and conducted release performance testing using LoadRunner.
  • Extensively used LoadRunner for performance, load, and stress testing.
  • Created Vuser scripts using various protocols in LoadRunner (VuGen) to test the application.
  • Used the LoadRunner Controller to create and execute goal-oriented and manual scenarios.
  • Used LoadRunner transaction and web monitors to pinpoint bottlenecks.
  • Generated, analyzed, and published LoadRunner test results and documented the testing process.
  • Performed day-to-day maintenance and performance tuning.
  • Analyzed GC logs using IBM Support Assistant to verify memory leaks.
  • Managed performance tuning and troubleshooting of WebSphere Application Server 7.0 in QA and production environments.
  • Configured application-specific JVM settings and web container parameters using the Administration Console.
  • Wrote test cases in Mercury Quality Center based on functional requirements, the test plan, and use cases.
  • Developed and executed test plans, test cases, and test strategies.
  • Extensively documented test result sheets and screenshot capture sheets.
  • Extensively executed and verified SQL queries on SQL Server.
  • Performed database integrity testing by executing SQL statements.
  • Assisted application developers and technical support staff in identifying and resolving problems.
  • Documented and communicated test results using Quality Center.
  • Used Quality Center for defect tracking and reporting.
  • Worked with business and technology leads to determine acceptable ranges for test results and performance.

Environment: HP LoadRunner 9.1/8.1, HP Performance Center, HP Diagnostics, Quality Center, Java/J2EE, JDBC, XML, WAS 7.0, IBM DB2 8.1, Tomcat 5.5, and IBM Support Assistant 4.1.

Confidential, Richmond, VA

Performance Tester

Responsibilities:

  • Designed the performance test plan and strategy, including approach, methodology, workload model, metrics, and the types of tests required to validate the project against its SLAs.
  • Installed LoadRunner and updated licenses using the LoadRunner License Utility.
  • Developed LoadRunner scripts using the Web HTTP/HTML and Web Services protocols.
  • Extensively used automatic and manual correlation, parameterization, and content check features; see the sketch after this list.
  • Established test data for testing the application.
  • Inserted transaction points to measure the performance of the application.
  • Set up IP spoofing in the performance environment before running the tests.
  • Designed manual scenarios using the LoadRunner Controller module.
  • Set up, gathered, and evaluated statistics from all monitors.
  • Involved in performance tuning of class objects, paging issues, and database queries.
  • Investigated a garbage collection overhead issue and resolved it with appropriate recommendations.
  • Analyzed various graphs generated by the LoadRunner Analyzer, including database monitor, network monitor, user, error, transaction, and web server resource graphs.
  • Frequently communicated with developers and senior QA team members to assist in clarifying technical issues.
  • Logged and tracked bugs in Bug Tracker.
  • Responsible for team meetings to discuss issues arising from testing.
  • Analyzed Throughput, Hits per Second, Transactions per Second, and Rendezvous graphs using the LoadRunner Analysis tool.
  • Prepared load test analysis reports (CPU utilization, throughput, response times, web server monitor counters, system resource performance counters, and database performance counters).
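
As a rough illustration of the correlation, parameterization, and content-check work mentioned above, the sketch below shows one common way these VuGen (C) features are combined; the boundaries, parameter name, transaction name, and URLs are hypothetical and not taken from this project.

/*
 * Sketch of manual correlation and a content check in VuGen (C).
 * Boundaries, parameter names, and URLs are hypothetical.
 */
Action()
{
    /* Register capture of a dynamic session token from the next response (manual correlation) */
    web_reg_save_param("pSessionId",
        "LB=sessionId=",       /* left boundary (hypothetical) */
        "RB=&",                /* right boundary (hypothetical) */
        "NotFound=ERROR",
        LAST);

    web_url("home",
        "URL=https://example.com/home",   /* hypothetical URL */
        "Resource=0",
        "Mode=HTML",
        LAST);

    /* Content check: fail the next step if the expected text is missing */
    web_reg_find("Text=Itinerary Summary", LAST);

    lr_start_transaction("T02_GetItinerary");

    /* Reuse the correlated value on a subsequent request */
    web_url("itinerary",
        "URL=https://example.com/itinerary?session={pSessionId}",
        "Resource=0",
        "Mode=HTML",
        LAST);

    lr_end_transaction("T02_GetItinerary", LR_AUTO);
    return 0;
}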

Environment: HP LoadRunner 12.5, Performance Center 11.5, .NET 4.5.1, SQL Server 2008, SSRS, SSIS, XML, Oracle 11g, MS IIS 7.5, BugTracker, jQuery, PL/SQL, Web Services

Confidential, Framingham, MA

Performance tester

Responsibilities:

  • Analyzed software specifications and technical service description documents.
  • Involved in developing test procedures for various stages such as integration, system, user acceptance, and positive and negative testing.
  • Prepared a test plan providing a detailed list of conditions under which the system would be tested.
  • Created various documents, such as test cases and test flows, using MS Word and MS Excel.
  • Created application-specific and generic functions to reduce redundant code.
  • Performed black-box testing and regression testing.
  • Reported defects to the team lead through TestDirector and helped developers resolve technical issues.
  • Executed manual test cases and verified actual results against expected results.
  • Generated test scripts using LoadRunner.
  • Used the Controller to perform load, scalability, and stress tests; a rendezvous/pacing sketch follows this list.
  • Ramped different loads in increments of 10, starting from 5 virtual users with 20 iterations, up to 250 virtual users, until CPU utilization reached 100%.
  • Analyzed average CPU usage, response time, and TPS for each scenario.
  • Also analyzed QuickTest Professional and LoadRunner reports to calculate response times and transactions per second (TPS).
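
The sketch below illustrates, in VuGen (C), one typical way a stress-test step is synchronized with a rendezvous point so that ramped Vusers hit the server at the same moment; the transaction name, rendezvous name, and URL are hypothetical, and the Vuser ramp itself (5 to 250 users in steps of 10) is configured in the Controller scenario schedule rather than in script code.

/*
 * Illustrative stress-test step with a rendezvous point (VuGen C).
 * Names and URL are hypothetical placeholders.
 */
Action()
{
    /* All Vusers wait here until the rendezvous policy releases them together */
    lr_rendezvous("rdv_checkout_peak");

    lr_start_transaction("T03_Checkout");

    web_url("checkout",
        "URL=https://example.com/checkout",   /* hypothetical URL */
        "Resource=0",
        "Mode=HTML",
        LAST);

    lr_end_transaction("T03_Checkout", LR_AUTO);

    lr_think_time(10);   /* pacing between iterations is also set in Run-time Settings */
    return 0;
}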

Environment: LoadRunner 9.1/8.1, TestDirector, Windows NT, UNIX, IIS, Oracle, Java/J2EE, HTML, DHTML, JavaScript.

Confidential

Performance Analyst

Responsibilities:

  • Monitored metrics such as response times and server resources such as Total Processor Time, Available Bytes, and Process Bytes using LoadRunner monitors.
  • Debugged and executed LoadRunner scripts.
  • Configured and used SiteScope and Wily Introscope for performance monitoring.
  • Executed load tests on new applications to establish benchmarks for future releases.
  • Enhanced Vuser scripts by introducing timer blocks and parameterizing user IDs so the scripts could run for multiple users; see the sketch after this list.
  • Monitored and analyzed system performance during load tests using SiteScope and Gomez.
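
As a sketch of the timer-block and user-ID parameterization approach described above, the VuGen (C) snippet below times a single step with a custom timer and drives it with a parameterized user ID; the parameter name {pUserId} and the URL are hypothetical, with {pUserId} assumed to be backed by a data file so each Vuser runs as a different user.

/*
 * Timer block around a parameterized request (VuGen C) - illustrative only.
 */
Action()
{
    merc_timer_handle_t timer;
    double elapsed;

    timer = lr_start_timer();             /* custom timer block around the step */

    web_url("profile",
        "URL=https://example.com/profile?user={pUserId}",  /* parameterized user ID */
        "Resource=0",
        "Mode=HTML",
        LAST);

    elapsed = lr_end_timer(timer);        /* seconds since lr_start_timer */

    lr_output_message("Profile fetch for user %s took %.3f seconds",
                      lr_eval_string("{pUserId}"), elapsed);
    return 0;
}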

Environment: LoadRunner, Performance Center, Wily Introscope, SiteScope, Web HTTP/HTML, Web Services, Oracle, UNIX, Web Server, Quality Center, Windows

Confidential

QA Analyst

Responsibilities:

  • Involved in the identification, analysis, and validation of functional and technical specifications to design test strategies.
  • Coordinated with business analysts to resolve requirements issues for functional and user acceptance testing.
  • Responsible for developing and implementing test plans, test cases, and test scripts in Mercury's test management tool, Quality Center.
  • Conducted Functional, Integration, System and Regression testing
  • Created detailed test cases for validating business functions and for regression testing.
  • Prepared test data for positive and negative testing used in data-driven tests.
  • Tested text and image hyperlinks on the home page and other pages.
  • Tested the functionality of each screen to verify proper navigation.
  • Used Quality Center for defect reporting and tracking
  • Extensively used Lotus Notes for organizing business meetings and email.
  • Worked with developers to resolve test environment issues and defects while the application was under test.

Environment: Quality Center, Java, J2EE, JSP, EJB, Web HTTP/HTML, Web Services, Oracle 9i, PL/SQL, TOAD, VSS, Visio, Windows NT/XP.
