
Sr. Performance Tester Resume


Boston, MA

SUMMARY:

  • Around 8 years of Quality Assurance experience with strong expertise in Performance/Load & Stress Testing using HP Performance Center/LoadRunner.
  • Expertise in test documentation, performance testing, and test execution on client/server, intranet, UNIX, Linux, mainframe, and Internet applications.
  • Hands-on experience with automated tools such as LoadRunner, TestDirector, Quality Center, JMeter, and Performance Center.
  • Performance testing experience on J2EE, PeopleSoft, and Oracle applications using Web HTTP/HTML, Web Click & Script, Citrix ICA, and multiple other protocols.
  • Monitored system resources such as CPU usage, memory utilization, vmstat, and iostat.
  • Hands-on experience in all phases of the Software Development Life Cycle (SDLC), from inception through execution, including design, development, and implementation.
  • Proficient in writing test plans, test cases, test scripts and test result reports.
  • Significant experience load testing applications including .NET, WebSphere, J2EE, CRM, Business Objects, and Citrix implementations.
  • Experienced in executing baseline, benchmark, performance, stress, and memory-leak tests.
  • Performed performance, functional, and regression testing using automated tools including LoadRunner, Performance Center, QuickTest Pro, Quality Center, WinRunner, and TestDirector.
  • Experienced in supporting on-site and offshore teams providing 24/7 performance testing services for internal client applications.
  • Well-versed in software development following Waterfall and Agile methodologies.

TECHNICAL SKILLS:

Automation tools: LoadRunner 11.5/11.0/9.5/9.1/8.1/7.8/6.5, WinRunner, QuickTest Pro, IEX, TestDirector, Test Manager, Quality Center.

LoadRunner Protocols: Web Services, Citrix, Oracle NCA, PeopleSoft 8, ODBC, Web HTTP/HTML, Sybase CTLib, IMAP, SMTP, POP3.

Databases: Oracle, Teradata, MS SQL Server, Sybase, MS Access, IBM DB2

Languages: SQL, PL/SQL, Pro*C, C/C++, MFC, HTML.

Web: Java Web Server 1.2, Microsoft Personal Web Server, WebLogic Server 5.x, HTTP, MSMQ, .NET.

Operating Systems: Windows XP/2000/98/NT/95, UNIX, and Linux.

Other: MS Word 2000/XP, Visio 5.0

PROFESSIONAL EXPERIENCE:

Confidential, Boston MA.

Sr. Performance Tester

Responsibilities:

  • Identified the test requirements based on application business requirements.
  • Prepared Test Cases, Test Strategy, and Test Plan based on the non-functional business requirements to meet SLA timings.
  • Had meetings with different teams for performance test scope.
  • Collaborated with development team to analyze the application’s core functionalities and its various dependencies for effectively identifying potential bottlenecks.
  • Worked closely with developers to ensure the software components met the highest quality standards.
  • Recorded Scripts using VuGen with web http/html and Web Services protocols.
  • Developed scenarios in controller and ran multiple tests.
  • Created and enhanced numerous test scripts to handle changes in objects, in the tested application's GUI, and in the environment using Selenium.
  • Analyzed the test results and presented a high level Executive performance testing report.
  • Configured the LoadRunner Controller for running the tests and verified that the LoadRunner scripts worked as expected on different load generator machines.
  • Created a testing framework using Selenium WebDriver and automated scenarios using Selenium.
  • Responsible for setting up monitors to monitor network activities and bottlenecks.
  • Installed SiteScope and configured monitors to collect metrics from the servers for analysis.
  • Analyzed results for bottlenecks and made recommendations.
  • Involved in business functionality meetings and use-case analysis, and developed templates for user documentation.
  • Analyzed LoadRunner on-line graphs and reports to check where performance delays occurred, network or client delays, CPU performance, I/O delays, database locking, or other issues at the database server.
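The response-time analysis described above can be sketched in plain Python; the statistics mirror the average/90th-percentile/maximum columns of a LoadRunner Analysis summary report. This is an illustrative sketch, not tooling from the original work, and the sample timings are hypothetical:

```python
import statistics

def summarize_response_times(samples):
    """Summarize transaction response times (seconds) the way an
    Analysis summary report does: average, 90th percentile, maximum.
    The 90th percentile uses simple nearest-rank selection."""
    ordered = sorted(samples)
    rank = max(0, int(round(0.9 * len(ordered))) - 1)
    return {
        "avg": statistics.mean(ordered),
        "p90": ordered[rank],
        "max": ordered[-1],
    }

# Hypothetical timings for one transaction under load.
timings = [0.8, 1.1, 0.9, 1.4, 2.2, 1.0, 1.3, 0.7, 1.2, 3.1]
print(summarize_response_times(timings))
```

Comparing the 90th percentile against the SLA target, rather than the average alone, is what catches the long-tail delays that the bullet above attributes to locking or I/O.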

Environment: LoadRunner 11, Selenium, Windows 2003, SQL, Java, JSON, Web Services, Performance Center, SoapUI, SiteScope.

Confidential, Atlanta GA.

Sr. Performance Engineer

Responsibilities:

  • Involved in creating high level test plan.
  • Gathered all the requirements for scripting the web services.
  • Had meetings with different teams for performance test scope.
  • Independently developed LoadRunner Vugen scripts according to test specifications/requirements to validate against Performance SLA.
  • Executed multi-user performance tests using online monitors, real-time output messages, and other features of the LoadRunner Controller, VuGen, and Analysis.
  • Analyzed, interpreted, and summarized meaningful, relevant results in complete performance test analysis reports.
  • Monitored average transaction response time, network data, hits per second, throughput, and Windows resources such as CPU usage and available and committed bytes of memory.
  • Involved in performance testing of server load, Cox video testing, and scalability by creating multiple virtual users with the LoadRunner Virtual User Generator component.
  • Understood each change thoroughly and, where necessary, contacted the related business analysts, developers, and other SMEs.
  • Designed multiple LoadRunner scripts (Vugen) with different protocols like Web, Citrix, and Web services for load testing different applications.
  • Did Video Testing using IEX server for Cox TV.
  • Executed IEX scripts for Cox TV to capture the bit frames and to calculate the response times.
  • Parameterized and correlated all web service scripts.
  • Analyzed results and provided Developers, System Analysts, Application Architects and Microsoft Personnel with information resulting in performance tuning the Application.
  • Configured run-time settings for each script individually so it could run in standalone mode.
  • Configured the LoadRunner Controller for running the tests and verified that the LoadRunner scripts worked as expected on different load generator machines.
  • Pinpointed bottlenecks and suggested solutions to improve performance of the system.
  • Monitored the various machines during load tests and informed the corresponding teams of any issues.
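Correlation of the kind mentioned above, capturing a dynamic value from one response so it can be replayed in the next request, the way `web_reg_save_param` does in VuGen, can be sketched in Python. The boundaries, HTML, and token are hypothetical:

```python
import re

def capture_dynamic_value(response_body, left_boundary, right_boundary):
    """Mimic LoadRunner's web_reg_save_param: capture the text between
    a left and a right boundary in a server response, so it can be
    parameterized into subsequent requests instead of being replayed
    as a stale recorded value."""
    pattern = re.escape(left_boundary) + r"(.*?)" + re.escape(right_boundary)
    match = re.search(pattern, response_body)
    return match.group(1) if match else None

# Hypothetical response containing a session token that changes per user.
html = '<input type="hidden" name="sessionId" value="A1B2C3"/>'
token = capture_dynamic_value(html, 'value="', '"/>')
print(token)  # the captured token is substituted into the next request
```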

Environment: Windows 2003, Oracle 10g, Oracle SQL, Java, MFC, Web Services, Performance Center, WebSphere, LoadRunner 11.5, AS/400, IEX Executor, Test Creator, Quality Center.

Confidential, Troy MI

Sr. Performance Engineer

Responsibilities:

  • Created Test Plan, which includes Testing Resources, Testing Strategy and testing of end-to-end scenarios.
  • Created automated scripts by including timers for recording response times, and test checks for confirming the retrieval of the correct page.
  • Participated along with the Business Support Group in design review and requirement analysis meetings.
  • Involved in performance testing of server’s load and scalability by creating multiple Virtual Users by using Load Runner Virtual User Generator component.
  • Designed multiple LoadRunner scripts (Vugen) with different protocols like Web, Siebel, RTE, Citrix, Web services and Winsock for load testing different applications.
  • Created, executed, and monitored various manual and goal-oriented scenarios of an application with the LoadRunner Controller.
  • Identified system/application bottlenecks and worked with Bottom-line to tune the application and environment, optimizing capacity and performance under peak workloads simulated with the Mercury Interactive LoadRunner tool.
  • Configured Web, Application, and Database server performance monitoring setup using LoadRunner Controller & Performance Center.
  • Created Vusers to emulate concurrent users, inserted rendezvous points in the Vuser scripts, and executed the scripts in both goal-oriented and manual scenarios using LoadRunner.
  • Configured the LoadRunner Controller for running the tests and verified that the LoadRunner scripts worked as expected on different load generator machines.
  • Added various monitoring parameters (CPU, Memory) to the LoadRunner controller for monitoring.
  • Created detailed test status reports, performance capacity reports, web trend analysis reports, and graphical charts for upper management using Load Runner analysis component.
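A rendezvous point, as inserted in the Vuser scripts above, makes every virtual user pause until all have arrived, then releases them together to create a synchronized spike. A minimal sketch with a thread barrier standing in for `lr_rendezvous()`; the Vuser count and arrival delays are illustrative:

```python
import threading
import time

def run_vusers_with_rendezvous(vuser_count):
    """Each 'Vuser' arrives at a different time, waits at the barrier,
    and fires its transaction only once all Vusers are present."""
    rendezvous = threading.Barrier(vuser_count)
    hits = []
    lock = threading.Lock()

    def vuser(user_id):
        time.sleep(0.01 * user_id)  # users arrive at staggered times
        rendezvous.wait()           # the rendezvous point
        with lock:
            hits.append(user_id)    # transaction fires for everyone at once

    threads = [threading.Thread(target=vuser, args=(i,))
               for i in range(vuser_count)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return hits

print(sorted(run_vusers_with_rendezvous(5)))  # -> [0, 1, 2, 3, 4]
```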

Environment: Windows 2003, Linux, Oracle 8i & 9i, SQL, MS SQL Server, PeopleSoft, .NET, BizTalk, XML, MSMQ, LoadRunner 11.0, Quality Center, Performance Center, Citrix, Test Manager.

Confidential, Atlanta, GA.

Performance/Load Tester

Responsibilities:

  • Developed Test Plan, which includes entire Testing Plan, Testing Resources, Testing Strategy and testing of end-to-end scenarios.
  • Worked on requirement analysis and test strategy documentation for the required system and functional testing efforts, covering all test scenarios including positive and negative tests.
  • Generated test cases for each specification in the Requirement Specification Document corresponding to each module.
  • Created test data for the test cases for functional and automated testing.
  • Created automated scripts by including timers for recording response times, and test checks for confirming the retrieval of the correct page.
  • Involved in the entire life cycle of the project including, requirements gathering, design, test and production support.
  • Designed performance test suites by creating Web (GUI/HTTP/HTML), Siebel, Web service and Click & Script test scripts, workload scenarios, setting transactions. Extensively used VUGen to create Load Test Scripts.
  • Involved in running the ETL process to convert the data files into standard files for loading the data into control tables.
  • Identified system/application bottlenecks and worked to tune the application and environment, optimizing capacity and performance under peak workloads simulated with the Mercury Interactive LoadRunner tool.
  • Ran tests using LoadRunner and monitored the system under load using in-house tools, in conjunction with capacity planning.
  • Ensured daily production support and administration of portal operations like end user performance, identifying the bottlenecks before and after production migration and maintenance upgrades
  • Took the lead in creating Performance Center/LoadRunner scripts using the Web HTTP, Web Services, and Oracle NCA protocols.
  • Executed baseline, benchmark, stress, and memory-leak test scenarios for internal and customer-facing applications based on each application's workload model.
  • Shared application performance analysis with technical directors and business teams to support Go/No-Go decisions.
  • Participated in the team meetings to discuss the issues arising out of testing
  • Generated weekly status reports and reported to management.
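Sizing a scenario from a workload model like the one above typically follows Little's Law: concurrent Vusers = target throughput × (response time + think time). A minimal sketch, with hypothetical SLA numbers:

```python
import math

def required_vusers(target_tps, avg_response_time, think_time):
    """Little's Law applied to a workload model: the number of
    concurrent Vusers needed to sustain a target transaction rate,
    rounded up so the scenario never undershoots the target."""
    return math.ceil(target_tps * (avg_response_time + think_time))

# Hypothetical SLA: 20 transactions/sec, 1.5 s response, 8.5 s think time.
print(required_vusers(20, 1.5, 8.5))  # -> 200
```

The same arithmetic run in reverse is a quick sanity check on a finished run: if 200 Vusers with a 10-second cycle only achieved 12 TPS, the application, not the scenario, was the limit.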

Environment: .NET, Windows 2000/XP Professional, UNIX, DB2, Oracle 10g, Quality Center 9.0, VuGen, LoadRunner 9.1/9.5, Test Manager.

Confidential, Pleasanton, CA

Sr. Performance Tester

Responsibilities:

  • Reviewed the requirements documents for testability.
  • Developed the Master Test Plan, covering the overall testing plan, testing resources, testing strategy, and testing of end-to-end scenarios.
  • Designed performance test suites by creating Web (GUI/HTTP/HTML), PeopleSoft test scripts, workload scenarios, setting transactions. Extensively used VUGen to create Load Test Scripts.
  • Correlated and Parameterized test scripts to capture Dynamic data and input various test data as per business requirements.
  • Extensively used the C programming language to incorporate business logic and error-handling code into the scripts.
  • Assisted application developers and technical support staff in identifying and resolving defects.
  • Created LoadRunner scenarios and scheduled the virtual users to generate realistic load on the server using LoadRunner load generator machines.
  • Gathered the test input data, including pre-conditions, test inputs, test results, and test regression data.
  • Planned the test strategy for automating the testing; selected the test cases for regression testing and automated them using WinRunner.
  • Created GUI, bitmap, database, and synchronization verification points in WinRunner.
  • Validated the integrity constraints on the database by creating Procedures and Functions.
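Integrity-constraint validation of the kind described above can be illustrated with an in-memory SQLite database standing in for Oracle/Sybase (stored procedures are out of scope for SQLite); the table and column names are hypothetical:

```python
import sqlite3

# Sketch: confirm the database itself rejects rows that violate a
# foreign-key constraint, rather than relying on application code.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # enforcement is per-connection
conn.execute("CREATE TABLE dept (dept_id INTEGER PRIMARY KEY)")
conn.execute("""CREATE TABLE emp (
    emp_id INTEGER PRIMARY KEY,
    dept_id INTEGER NOT NULL REFERENCES dept(dept_id))""")
conn.execute("INSERT INTO dept VALUES (10)")
conn.execute("INSERT INTO emp VALUES (1, 10)")      # valid reference

try:
    conn.execute("INSERT INTO emp VALUES (2, 99)")  # no such department
    violation = False
except sqlite3.IntegrityError:
    violation = True
print(violation)  # the constraint rejected the orphan row
```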

Environment: Windows NT, HP-UX, Oracle 8i, DB2, Sybase, MS SQL Server, Informix, WebLogic 6.x, Quality Center, Performance Center, QuickTest Pro, LoadRunner 7.8

Confidential, Columbus, OH

QA Performance Analyst

Responsibilities:

  • Prepared test automation project milestones and deliverables; prepared the project schedule and attended project status meetings.
  • Designed, developed, and executed reusable and maintainable automated scripts.
  • Created and executed scenarios that emulated typical working conditions using LoadRunner VuGen.
  • Used the Windows Sockets (Winsock) protocol to upload GL records through Excel into PeopleSoft, the RTE protocol to test the mainframe application, and the LoadRunner PeopleSoft protocol to test the PeopleSoft application.
  • Planned the load by analyzing the task distribution diagram, transaction profile, and user profile, and executed performance testing using Mercury Performance Center.
  • Conducted Performance testing and analyzed various graphs against product requirements.
  • Tested functionality of the application using Mercury QuickTestPro.
  • Added various monitoring parameters (CPU, Memory) to the LoadRunner controller for monitoring.
  • Set run-time parameters (think time, pacing, replay options, etc.).
  • Monitored the various machines during load tests and informed the corresponding teams of any issues.
  • Created detailed test status reports, performance capacity reports, web trend analysis reports, and graphical charts for upper management using Load Runner analysis component.
  • Participated in discussions with the QA manager, Developers and Administrators in fine-tuning the applications based on the Results produced by Mercury Analysis.

Environment: PeopleSoft 8.9, Informatica 6.2, Windows 2000/XP Professional, Oracle 9.2.0.5, Oracle 9i, UNIX, Quality Center 8.2, VuGen 7.8, Performance Center 7.8, QuickTest Pro 8.2, IBM WebSphere, and XML.

Confidential

QA Tester

Responsibilities:

  • Tested an internet-based system that customers will access through a secure browser to electronically create payments for bills they receive in the mail, to pay their employees or even to make electronic transfers.
  • Responsibilities included determining test strategies based on requirements, developing test plans and test cases, and executing test scripts.
  • Executed system, integration, end-to-end, and User Acceptance Test (UAT) test cases for web-based and Java applications.
  • Installed TestDirector and used it for requirements management, planning, scheduling, running tests, defect tracking and management, and executing the test cases.
  • Created test procedures and test cases, and handled test plan development and execution.
  • Performed functional, load, integration, and regression testing; viewed and analyzed results; and handled defect tracking, reporting, and documentation.
  • Created LoadRunner scripts using the Virtual User Generator to perform load and performance testing.
  • Used the LoadRunner Controller to execute the Vuser scripts in various scenarios.
  • Used LoadRunner Analysis to analyze the response times of the business transactions under load.
  • Developed reports and graphs to present the stress test results to the management.

  • Regularly followed up with the development team to discuss discrepancies identified during testing and performance tuning.
  • Responsible for weekly status updates showing the progress of the automation testing effort.

Environment: Windows NT, Oracle, SQL, MS Access, Test Director.
