
Sr. Performance Engineer Resume


Atlanta, GA

SUMMARY:

  • 7+ years of Quality Assurance experience with strong expertise in Performance, Load, and Stress Testing using HP Performance Center/LoadRunner.
  • Experienced with the Mercury Test Suite (TestDirector/Quality Center, LoadRunner, WinRunner, and QuickTest Professional) and Rational tools.
  • Extensive experience in automated testing of Web-based and Client/Server applications with proficiency in Load and Performance Testing. Good experience with Agile methodology.
  • Experience in analysis, design, implementation, execution, maintenance, and documentation for system testing.
  • Proficient in writing test plans, test cases, test scripts, and test result reports.
  • Performed Performance Testing, Functional Testing, and Regression Testing using automated testing tools including LoadRunner, Performance Center, QuickTest Pro, Quality Center, WinRunner, and TestDirector.
  • Significant experience load testing a variety of applications, including .NET, WebSphere, J2EE, CRM, Business Objects, and Citrix implementations.
  • Extensive experience using LoadRunner for Performance Testing, Stress Testing, Longevity Testing, and Regression Testing.
  • Proficient in creating and enhancing scripts, executing tests, and analyzing results using LoadRunner and Performance Center.
  • Experienced in designing and executing test criteria, scenarios, and scripts from requirements.
  • Participated in project design and review meetings.
  • Experienced in planning and translating software business requirements into test conditions, executing all types of tests, and identifying and logging software bugs for business process improvement.

TECHNICAL SKILLS:

Automation tools: LoadRunner 11.0/9.5/9.1/8.1/7.8/6.5, WinRunner 6/7/8.2X, QuickTest Pro 6.5/8.0, IEX, TestDirector 6.5/7.0/7.6, Test Manager.

LoadRunner Protocols: Web Services, Citrix, Oracle NCA, PeopleSoft 8, ODBC, Web HTTP/HTML, Sybase CTlib, Sybase DBlib, IMAP, SMTP, POP3.

Databases: Oracle 11i/8i/8.0/7.0, Teradata V2R3/V2R4/V2R5, Sybase 11.x/12.0, MS SQL Server 6.5/7.0/2000, MS Access 7.0/97/2000, IBM DB2 UDB 7.0, Informix

Languages: SQL, PL/SQL (Stored Procedures, Functions, Triggers, Cursors), Pro*C, C/C++, HTML.

Web: Java Web Server 1.2, Microsoft Personal Web Server, WebLogic Server 5.x, HTTP.

Operating Systems: Sun Solaris 2.6/2.7, HP-UX, IBM AIX 4.2/4.3, MS-DOS 6.22, Win 3.x/95/98, Win NT 4.0, Win 2000, Windows XP, SCO Unix, HP9000

Other: MS Word 2000/XP, Visio 5.0, TestDirector 7.2/7.6.

PROFESSIONAL EXPERIENCE:

Confidential, Atlanta, GA.

Sr. Performance Engineer

Responsibilities:

  • Created the high-level test plan.
  • Met with different teams to define the performance test scope.
  • Gathered all requirements for scripting the web services.
  • Performance tested server load and scalability, including Cox video testing, by creating multiple Virtual Users with the LoadRunner Virtual User Generator (VuGen) component.
  • Created Web Services and HTML scripts.
  • Designed multiple LoadRunner (VuGen) scripts with different protocols such as Web, Citrix, and Web Services for load testing different applications.
  • Developed scenarios in the Controller and ran multiple tests.
  • Performed video testing for Cox TV using the IEX server.
  • Executed IEX scripts for Cox TV to capture bit frames and calculate response times.
  • Parameterized and correlated all web service scripts.
  • Handled think time and pacing with C code that randomizes the intervals.
  • Configured run-time settings for each script individually so each could run in standalone mode.
  • Configured the LoadRunner Controller for running tests and verified that the LoadRunner scripts worked as expected on different load generator machines.
  • Added various monitoring parameters (CPU, memory) to the LoadRunner Controller.
  • Pinpointed bottlenecks and suggested solutions to improve system performance.
  • Monitored the various machines during load tests and informed the corresponding teams of any issues.
  • Created detailed test status reports, performance capacity reports, web trend analysis reports, and graphical charts for upper management using the LoadRunner Analysis component.

Environment: Windows 2003, Oracle 10g, Oracle SQL, Java, Web Services, Performance Center, WebSphere, LoadRunner 11.5, AS/400, IEX Executor, Test Creator, Quality Center.

Confidential

Sr. Performance Engineer

Responsibilities:

  • Created the Test Plan, which included testing resources, testing strategy, and testing of end-to-end scenarios.
  • Created automated scripts, including timers to record response times and test checks to confirm retrieval of the correct page.
  • Performance tested server load and scalability by creating multiple Virtual Users with the LoadRunner Virtual User Generator component.
  • Designed multiple LoadRunner (VuGen) scripts with different protocols such as Web, Siebel, RTE, Citrix, Web Services, and Winsock for load testing different applications.
  • Created, executed, and monitored manual and goal-oriented scenarios of the application with the LoadRunner Controller.
  • Identified system/application bottlenecks and worked with Bottom-line to tune the application and environment, optimizing capacity and performance so the application could handle peak workloads simulated with the Mercury Interactive LoadRunner tool.
  • Configured Web, Application, and Database server performance monitoring using the LoadRunner Controller and Performance Center.
  • Created Vusers to emulate concurrent users, inserted rendezvous points in the Vuser scripts, and executed the scripts in both goal-oriented and manual scenarios using LoadRunner.
  • Configured the LoadRunner Controller for running tests and verified that the LoadRunner scripts worked as expected on different load generator machines.
  • Added various monitoring parameters (CPU, memory) to the LoadRunner Controller.
  • Created detailed test status reports, performance capacity reports, web trend analysis reports, and graphical charts for upper management using the LoadRunner Analysis component.

Environment: Windows 2003, Linux, Oracle 8i & 9i, SQL, MS SQL Server, PeopleSoft, .NET, BizTalk, XML, LoadRunner 11.0, Quality Center, Performance Center, Citrix, Test Manager.

Confidential, Atlanta, GA.

Performance/Load Tester

Responsibilities:

  • Developed the Test Plan, covering the overall testing approach, testing resources, testing strategy, and testing of end-to-end scenarios.
  • Created automated scripts, including timers to record response times and test checks to confirm retrieval of the correct page.
  • Involved in the entire life cycle of the project, including requirements gathering, design, test, and production support.
  • Designed performance test suites by creating Web (GUI/HTTP/HTML), Siebel, Web Services, and Click & Script test scripts and workload scenarios, and setting transactions. Extensively used VuGen to create load test scripts.
  • Ran the ETL process to convert the data files into standard files and load the data into control tables.
  • Identified system/application bottlenecks and facilitated tuning of the application and environment, optimizing capacity and performance so the application could handle peak workloads simulated with the Mercury Interactive LoadRunner tool.
  • Correlated and parameterized test scripts to capture dynamic data and supply various test data as per business requirements.
  • Prepared load test summary reports for each run, comparing the results with previous runs.
  • Configured the LoadRunner Controller for running tests and verified that the LoadRunner scripts worked as expected on different load generator machines.
  • Added various monitoring parameters (CPU, memory) to the LoadRunner Controller.
  • Created detailed test status reports, performance capacity reports, web trend analysis reports, and graphical charts for upper management using the LoadRunner Analysis component.

Environment: .NET, Windows 2000/XP Professional, UNIX, DB2, Oracle 10g, Quality Center 9.0, VuGen, LoadRunner 9.1/9.5, Test Manager.

Confidential, Pleasanton, CA

Sr. Performance Tester

Responsibilities:

  • Reviewed the requirements documents for testability.
  • Developed the Master Test Plan, covering the overall testing approach, testing resources, testing strategy, and testing of end-to-end scenarios.
  • Designed performance test suites by creating Web (GUI/HTTP/HTML) and PeopleSoft test scripts and workload scenarios, and setting transactions. Extensively used VuGen to create load test scripts.
  • Correlated and parameterized test scripts to capture dynamic data and supply various test data as per business requirements.
  • Extensively used the C programming language to incorporate business logic and error-handling code into the scripts.
  • Assisted application developers and technical support staff in identifying and resolving defects.
  • Created LoadRunner scenarios and scheduled the Virtual Users to generate realistic load on the server using LoadRunner load generator machines.
  • Gathered the test input data, including pre-conditions, test inputs, test results, and test regression data.
  • Planned the test automation strategy; selected test cases for regression testing and automated them using WinRunner.
  • Created GUI, bitmap, database, and synchronization verification points in WinRunner.
  • Validated database integrity constraints by creating procedures and functions.

Environment: Win NT, HP-UX, Oracle 8i, DB2, Sybase, MS SQL Server, Informix, WebLogic 6.x, Quality Center, Performance Center, QuickTest Pro, LoadRunner 7.8

Confidential, Columbus, OH

QA Performance Analyst

Responsibilities:

  • Prepared test automation project milestones and deliverables; prepared the project schedule and attended project status meetings.
  • Designed, developed, and executed reusable and maintainable automated scripts.
  • Created and executed scenarios that emulated typical working conditions using LoadRunner VuGen.
  • Used the Windows Sockets (Winsock) protocol to upload GL records through Excel into PeopleSoft and the RTE protocol to test the mainframe application; used the LoadRunner PeopleSoft protocol to test the PeopleSoft application.
  • Planned the load by analyzing the task distribution diagram, transaction profile, and user profile, and executed performance testing using Mercury Performance Center.
  • Conducted performance testing and analyzed various graphs against product requirements.
  • Tested application functionality using Mercury QuickTest Pro.
  • Added various monitoring parameters (CPU, memory) to the LoadRunner Controller.
  • Set run-time parameters (think time, pacing, replay options, etc.).
  • Monitored the various machines during load tests and informed the corresponding teams of any issues.
  • Created detailed test status reports, performance capacity reports, web trend analysis reports, and graphical charts for upper management using the LoadRunner Analysis component.
  • Participated in discussions with the QA manager, developers, and administrators on fine-tuning the applications based on the results produced by Mercury Analysis.

Environment: PeopleSoft 8.9, Informatica 6.2, Windows 2000/XP Professional, Oracle 9.2.0.5, Oracle 9i, UNIX, Quality Center 8.2, VuGen 7.8, Performance Center 7.8, QuickTest Pro 8.2, IBM WebSphere, and XML.

Confidential

QA Tester

Responsibilities:

  • Tested an Internet-based system that customers access through a secure browser to electronically create payments for bills they receive in the mail, pay their employees, or make electronic transfers.
  • Determined test strategies based on requirements, developed test plans and test cases, and executed test scripts.
  • Executed System, Integration, End-to-End, and User Acceptance Testing (UAT) test cases for Web-based and Java applications.
  • Installed TestDirector and used it for requirements management, test planning, scheduling, test execution, and defect tracking and management.
  • Created test procedures and test cases; developed and executed test plans.
  • Performed functional, load, integration, and regression testing; viewed and analyzed results; and handled defect tracking, reporting, and documentation.
  • Created LoadRunner scripts using the Virtual User Generator to perform load and performance testing.
  • Used the LoadRunner Controller to execute the Vuser scripts in various scenarios.
  • Analyzed the response times of business transactions under load using LoadRunner.
  • Developed reports and graphs to present the stress test results to management.
  • Regularly followed up with the development team to discuss discrepancies identified during testing and performance tuning.
  • Provided weekly status updates showing the progress of the automation testing effort.

Environment: Windows NT, Oracle, SQL, MS Access, Test Director.
