Sr. Performance Tester Resume


Chicago, IL

SUMMARY

  • 8+ years of experience in performance engineering and software quality assurance, including capacity, availability and performance processes and various performance methodologies.
  • Strong experience in preparing Performance Test Plans, Performance Test Strategy, Performance Test Analysis Reports, Test Scenarios and Test Scripts for Automated Testing for various software applications.
  • Extensive Domain experience in Insurance, Banking, Financial Services, Mortgage, Credit cards, Stock and Mutual Funds.
  • Expertise in understanding Business Processes from provided requirements and converting them into practical Test Scenarios and analyzing the test results for reporting.
  • Proven track record in black box, exploratory, sanity, functional, negative, regression, GUI, system integration and acceptance (UAT) testing, as well as load/performance, security and browser compatibility testing and reporting procedures.
  • Experience in Planning, Installing, Configuring, Administering, Tuning and Troubleshooting IBM WebSphere Application Server ND.
  • Knowledge in Installation, configuration of WebSphere Application Server 5.1.x/6.0/6.1/ on AIX, Linux, Windows and Solaris Platforms.
  • Knowledge in Installation, configuration, Trouble - shooting and performance tuning of IBM HTTP Server, iPlanet, Apache Web server, IIS and Netscape Enterprise Server on different UNIX flavors and Windows Platforms.
  • Extensive experience in Back End, Client/Server and Application testing processes.
  • Performed manual and automated testing across entire software applications.
  • Created automated scripts for load testing of multiple logins after system upgrades.
  • Developed and executed automated test scripts using Silk Performer to streamline performance testing and increase confidence in fixes and upgrades.
  • Expertise in performance tools: LoadRunner, Performance Center 11, Silk Performer, NeoLoad, IBM Green Hat Tester, and JMeter.
  • Developed automated frameworks/libraries to maximize re-use and minimize time to delivery by maximizing automation, improving ROI.
  • Proven experience in defect tracking and reporting using Requisite Pro and Lotus Notes.
  • Comprehensive knowledge of Linux, UNIX and Windows Operating Systems.
  • Created, reviewed and maintained test data and test results documentation.
  • Able to work in a dynamically changing environment, as part of a team, with minimal direction.
  • Excellent written and Verbal communication & interpersonal skills.

TECHNICAL SKILLS

Operating Systems: MS-DOS, UNIX, Windows 2000/2003/XP, Mac OS X and LINUX

Functional Testing Tool: RFT

Defect Tracking Tools: RequisitePro and Lotus Notes

Performance Testing and Monitoring Tools: Silk Performer, HP LoadRunner (VuGen 11.52, Performance Center 11.52), HP Diagnostics, NeoLoad, Green Hat performance tester, Dynatrace, OPNET, SiteScope, Optier, Wily Introscope

Programming Languages: C, C++, VB, PL/SQL, Java, J2EE

RDBMS: SQL Server, Oracle, MS-Access, MySQL, DB2

Web Technologies: HTML, XML, JavaScript, VBScript

PROFESSIONAL EXPERIENCE

Confidential, Chicago, IL

Sr. Performance Tester

Responsibilities:

  • Worked with the AD and DB teams to gather requirements and performance-related issues, and created test cases to meet the requirements.
  • Worked closely with Production Managers, Technical Managers and Business Managers in planning, scheduling, developing, and executing performance tests.
  • Responsible for analyzing the behavior of application components such as Claims, Subscriber/Member, Provider, NetworX and Pricing under heavier loads and optimizing server configurations.
  • Wrote high-level LoadRunner scripts using VuGen (Virtual User Generator) for single-user, baseline and soak (endurance) scenarios, and validated correct downloads of HTML pages by checking content in the page source. Parameterized unique IDs, stored dynamic content in variables and passed the values to web submits under HTTP protocols (see the sketch after this list).
  • Handled dynamic name-value pairs using web_custom_request.
  • Captured and analyzed the web page source when a virtual user failed.
  • Wrote custom reusable C functions for string manipulation and random think times.
  • Connected to Wily Introscope and monitored performance metrics at 15-second intervals for Windows and UNIX boxes.
  • Closely monitored SBI server metrics such as CPU, processes, memory and I/O.
  • Used the RFHUTIL utility to monitor MQ queue depth and to read XML-format output messages from response queues.
  • Worked with LoadRunner VuGen on Web HTTP/HTML, Web Services, Citrix, RTE and ODBC protocols.
  • Performed Regression tests after DB Restore and Update Stats on DB.
  • Analyzed database stored procedure executions, table scans, indexes and deadlocks under load.
  • Used Clear Quest for bug tracking and analysis.
  • Identified and analyzed memory leaks at each component level.
  • Generated Data Driven scripts that access the backend database.
  • Interacted with developers during testing for identifying memory leaks, fixing bugs and for optimizing server settings at web, app and database levels.
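
The correlation, parameterization and custom C functions described above follow a common VuGen scripting pattern. The sketch below is a minimal, illustrative example in LoadRunner's C scripting language; the URLs, correlation boundaries and parameter names (SessionToken, ClaimId) are hypothetical placeholders, not the actual project scripts.

    /* Reusable helper: random think time between min_sec and max_sec seconds. */
    void random_think_time(int min_sec, int max_sec)
    {
        int pause = min_sec + rand() % (max_sec - min_sec + 1);
        lr_think_time(pause);
    }

    Action()
    {
        /* Capture the dynamic token from the next server response. */
        web_reg_save_param("SessionToken",
                           "LB=name=\"token\" value=\"",
                           "RB=\"",
                           LAST);

        lr_start_transaction("Login");
        web_url("LoginPage",
                "URL=https://example.internal/login",
                "Mode=HTML",
                LAST);
        lr_end_transaction("Login", LR_AUTO);

        random_think_time(3, 8);

        /* Replay the correlated token and a parameterized unique ID
           ({ClaimId}, bound to a data file) as dynamic name-value pairs. */
        lr_start_transaction("SubmitClaim");
        web_custom_request("SubmitClaim",
                           "URL=https://example.internal/claims/submit",
                           "Method=POST",
                           "Body=claimId={ClaimId}&token={SessionToken}",
                           LAST);
        lr_end_transaction("SubmitClaim", LR_AUTO);

        return 0;
    }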

Environment: LoadRunner/Performance Center 11.5, Quality Center, QuickTest Pro, Wily Introscope, Diagnostics, SiteScope, Citrix, UNIX, Sybase, Oracle 10g, SQL Server, IBM Mainframes, MQ Series, SOA, WebLogic 9/10, WebSphere, Windows NT, WINTEL, Java, VB, .NET, C, JavaScript, HTML, SOAP, XML

Confidential, Waynesboro, VA

Sr. Performance Test Engineer

Responsibilities:

  • Prepared Test Strategies, Test Plan and Test Cases as per the business requirements and Use Cases
  • Involved in Load Testing of various modules and software application using Load Runner
  • Developed the load test scripts using the LoadRunner Virtual User Generator (VuGen) and enhanced the scripts by adding transactions, parameterizing constant values and correlating dynamic values
  • Enhanced Load Runner scripts to test the new builds of the application
  • Developed Vuser scripts and enhanced the basic scripts by parameterizing constant values
  • Customized scripts using LoadRunner according to business specifications
  • Carried out stress testing by introducing rendezvous points in the script (see the sketch after this list)
  • Conducted testing on the servers using Load Runner & Performance Center to establish the load capacity of the server
  • Using LoadRunner, analyzed the response times of various business transactions and module login times under load, and developed reports and graphs to present the test results
  • Used SiteScope & HP Diagnostics to monitor the load test and to identify bottlenecks
  • Monitored the performance of the Web and Database (SQL) servers during Stress test execution
  • Defined transactions to measure server performance under load by creating rendezvous points to simulate heavy load on the server
  • Used Load Runner for checking the performance of Various Reports and Queries under load
  • Analyzed the results of the Load test using the Load Runner Analysis tool, looking at the online monitors and the graphs and identified the bottlenecks in the system
  • Configured alerts in HP SiteScope to send automated emails on CPU and memory utilization, downtime, query execution time, etc. for database and Linux machines
  • Analyzed results for transaction response time, transactions under load, transaction summary by Vusers, hits per second and throughput
  • Reported and entered bugs in Quality Center
  • Tested for browser compatibility
  • Developed High Level and Detailed Test Plans and reviewed with Team, demonstrated Customer Level experience to team
  • Identified critical functionality at the business level
  • Updated test matrices, test plans, and documentation at each major release and performed Regression testing using automated script
  • Managed/Updated Shared object repository from time to time using Object Repository Manager
  • Used environment variables as global variables to pass the values between actions
  • Carried out the manual testing of different interfaces
  • Provided Test Estimates for various phases of the project
  • Reported and tracked defects in the Quality Center bug tracking system
  • Automated the test cases using Quick Test Professional
  • Monitored CPU, memory, garbage collection, thread usage and network utilization on the UNIX servers in Wily Introscope during test execution
  • Created and monitored dashboards in Wily Introscope
  • Performed QA process management through process automation, identified functional changes vs. business impact and provided cross-business training to the QA team
  • Worked closely with software developers and took an active role in ensuring that the software components met the highest quality standards
  • Supported Production team to understand and execute the processes
  • Created application documentation to assist in the support and training of users
  • Involved in defect tracking, reporting and coordination with various groups from initial finding of defects to final resolution
  • Created comprehensive test results report
  • Managed Performance Center servers, monitored server status, edited server information and checked server performance
  • Managed timeslots, viewed user reservations and monitored availability of time and resources
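
The rendezvous-based stress testing noted above is typically scripted as in the following minimal VuGen-style sketch; the transaction, rendezvous and parameter names (report_spike, ReportId) and the URL are hypothetical placeholders rather than the project's actual scripts.

    Action()
    {
        /* Hold Vusers here until the Controller releases them together,
           producing a synchronized spike in load on the server. */
        lr_rendezvous("report_spike");

        lr_start_transaction("Run_Report");
        web_url("RunReport",
                "URL=https://example.internal/reports/run?id={ReportId}",
                "Mode=HTML",
                LAST);

        /* Pass or fail the transaction based on the HTTP status code. */
        if (web_get_int_property(HTTP_INFO_RETURN_CODE) == 200)
            lr_end_transaction("Run_Report", LR_PASS);
        else
            lr_end_transaction("Run_Report", LR_FAIL);

        return 0;
    }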

Environment: LoadRunner, Performance Center, Quality Center, HP Diagnostics, SiteScope, Wily Introscope, QTP, SQL Server, SQL Profiler, Windows, UNIX, WebLogic, WebSphere, XML

Confidential, Columbus, OH

Performance engineer/ Sr. Performance tester

Responsibilities:

  • Identified the test requirements based on application business requirements
  • Prepared Test Cases, Test Strategy, and Test Plan based on the non-functional business requirements to meet SLA timings
  • Installed and configured LoadRunner tools for automated functionality and performance testing
  • Created virtual user scripts using the Virtual User Generator and created scenarios to conduct load tests using LoadRunner
  • Recorded Scripts using VuGen with web http/html and Web Services protocols
  • Performed correlation by capturing dynamic values and parameterized the data dependencies that are part of the business process
  • Thorough understanding of assigning text checks, rendezvous points, parameterization and correlation (capturing dynamic values like session IDs/cookies) irrespective of the application (see the sketch after this list)
  • Generated Vusers and Vusers Groups in Controller and assigned to the scripts added to the Scenario
  • Simulated hundreds of concurrent users using Controller while monitoring both end-user response times and detailed infrastructure component performance (Servers, Databases, and Networks etc.)
  • Performed load, stress, benchmark profile, failover and failback tests against supported configurations by uploading the VuGen scripts into LoadRunner
  • Extensively worked in UNIX environment for test executions
  • Checked the application server logs on mainframes and identified the various errors/exceptions thrown by the application server.
  • Strong understanding of TCP/IP networking and worked in a secured network with Firewalls
  • Used Sitescope to monitor server metrics and Performed in-depth analysis to isolate points of failure in the application
  • Configured alerts in HP SiteScope to send automated emails on CPU and memory utilization, downtime, query execution time, etc. for database and Linux machines
  • Identified the long-running queries from the Oracle AWR report.
  • Analyzed results for transaction response time, transactions under load, transaction summary by Vusers, hits per second and throughput
  • Performed SQL querying to validate the data in the back-end database and to check the data flow between different modules
  • Architected and set up an enterprise application performance monitoring mechanism using CA Wily Introscope.
  • Identified the various methods/classes and designed PBDs (ProbeBuilder directives) to track specific areas in the code that might be potential bottlenecks in the system.
  • Monitored CPU, memory, garbage collection, thread usage and network utilization on the UNIX servers in Wily Introscope during test execution
  • Created and monitored dashboards in Wily Introscope.
  • Experienced in Unix Shell scripting
  • Conducted volume testing by pulling full production volume of data for various regions from the source to the target system under test and analyzed performance of the database for potential bottlenecks
  • Prepared risk maps to identify the high risk and usage areas and drafted test strategy
  • Collaborated with architecture and development teams to analyze the application's core functionalities and its various dependencies for effectively identifying potential bottlenecks
  • Actively participated in the daily project meetings and walkthroughs
  • Gathered documentation for tracking, maintaining, creating and escalating daily issue logs
  • Coordinated with the offshore team on project issues and executions
  • Attended Defect Meetings and Status Meetings to resolve the bugs
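
The text checks and session-ID correlation mentioned above are commonly scripted as in the following minimal VuGen-style sketch; the page text, URLs, boundaries and parameter names (SessionId, PaymentAmount, AccountNumber) are hypothetical placeholders, not the actual application under test.

    Action()
    {
        /* Verify that the expected text appears in the next response;
           fail the step if it is not found. */
        web_reg_find("Text=Account Summary", "Fail=NotFound", LAST);

        /* Capture the session ID issued by the server for reuse below. */
        web_reg_save_param("SessionId", "LB=jsessionid=", "RB=\"", LAST);

        lr_start_transaction("Open_Account_Summary");
        web_url("AccountSummary",
                "URL=https://example.internal/account/summary",
                "Mode=HTML",
                LAST);
        lr_end_transaction("Open_Account_Summary", LR_AUTO);

        /* Reuse the correlated session ID in a later form submission. */
        lr_start_transaction("Submit_Payment");
        web_submit_data("SubmitPayment",
                        "Action=https://example.internal/payments;jsessionid={SessionId}",
                        "Method=POST",
                        ITEMDATA,
                        "Name=amount", "Value={PaymentAmount}", ENDITEM,
                        "Name=account", "Value={AccountNumber}", ENDITEM,
                        LAST);
        lr_end_transaction("Submit_Payment", LR_AUTO);

        return 0;
    }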

Environment: LoadRunner 11.00, Wily Introscope, UNIX, PerfMon, SiteScope, DB2, HTML, Java, SQL Queries, SQL Server, WebLogic, WebSphere, Mainframes.

Confidential

Performance Engineer

Responsibilities:

  • Performed load tests on Load balancing environment.
  • Drilled down into problematic pages in Analysis to find out where the performance degradation had been occurring.
  • Used HP Diagnostics to identify critical bottlenecks in pre-production or production regions.
  • Hands-on experience with UNIX scripting, used to automate routine tasks.
  • Maintained an exclusive Load Test Database instance dedicated to Load Testing activity.
  • Responsible for generating load test results and publishing them on the internal portal
  • Responsible for performing High Availability testing
  • Responsible for coordinating new transports to the performance testing environments.
  • Responsible for scheduling/kicking off load tests through HP Performance Center/Controller, involving a variety of load combination scenarios.
  • Tested a variety of SOA web services using the LoadRunner Web Services protocol and custom scripting (see the sketch after this list).
  • Analyzed all the various performance metrics involved in the test run, such as web resources, Windows resources (via SiteScope), IIS counters, J2EE monitors, .NET counters, Oracle counters, etc.
  • Pinpointed bottlenecks in different layers of the application, identified a memory leak in the app and made recommendations to overcome it.
  • Performed a cross-comparison of LoadRunner results between iterative baseline tests.
  • Involved in managing WinRunner scripts, LoadRunner scripts, requirements and the bug database using Mercury Quality Center 9.0/8.0.
  • Involved in creating templates for various monitors in HP SiteScope.
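
The hand-scripted SOA web-service calls mentioned above can be illustrated with the following minimal sketch, which posts a SOAP envelope through web_custom_request; the endpoint, SOAPAction and payload names (QuoteService, getQuote, Symbol) are hypothetical placeholders rather than the actual services tested.

    Action()
    {
        /* SOAP calls need the correct content type and SOAPAction headers. */
        web_add_header("Content-Type", "text/xml; charset=utf-8");
        web_add_header("SOAPAction", "\"urn:getQuote\"");

        lr_start_transaction("GetQuote_Service");
        web_custom_request("GetQuote",
            "URL=https://example.internal/services/QuoteService",
            "Method=POST",
            "Body=<soapenv:Envelope xmlns:soapenv=\"http://schemas.xmlsoap.org/soap/envelope/\">"
                 "<soapenv:Body><getQuote><symbol>{Symbol}</symbol></getQuote></soapenv:Body>"
                 "</soapenv:Envelope>",
            LAST);
        lr_end_transaction("GetQuote_Service", LR_AUTO);

        return 0;
    }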

Environment: HP LoadRunner 8.1, HP Quality Center 10, HP Performance Center 9.5, HP SiteScope, HP Service Test, SAP BI, F&R, BW Portal, MM, SD, XML, SAP GUI, SOAP UI, Java/J2EE, HP BAC, SOA, Web Services, Oracle 10g, SQL Server 2008, UNIX, Windows XP/Vista.

Confidential

Performance Engineer

Responsibilities:

  • Identify and eliminate performance bottlenecks during the development lifecycle.
  • Accurately produce regular project status reports to senior management to ensure on-time project launch.
  • Verify that new or upgraded applications meet specified performance requirements.
  • Used the Controller to launch 400, 800 and 1,600 concurrent users to generate load
  • Identified the queries that were taking too long and optimized them to improve performance
  • Work closely with software developers and take an active role in ensuring that the software components meet the highest quality standards.
  • Independently develop LoadRunner test scripts according to test specifications/requirements.
  • Using LoadRunner, execute multi-user performance tests, used online monitors, real-time output messages and other features of the LoadRunner Controller.
  • Analyze, interpret, and summarize meaningful and relevant results in a complete Performance Test Report.
  • Develop and implement load and stress tests with Mercury LoadRunner, and present performance statistics to application teams, and provide recommendations of how and where performance can be improved
  • Monitor and administrate hardware capacity to ensure the necessary resources are available for all tests.
  • Identified disk usage, CPU and memory for web, app, Tuxedo and database servers and how the servers were getting loaded
  • Provide support to the development team in identifying real world use cases and appropriate workflows
  • Prepared Automation Test Plans and Test Data Sheets for Web Testing.
  • Implemented TestDirector for running test sets in batch mode and analyzing test results.
  • Develop and maintain Manual and WinRunner Automation Tests through Test Director.
  • Created driver scripts in TestDirector to run the sanity suite.
  • Used the scheduler to run scripts at particular times
  • Support the use of TestDirector for automation metrics tracking and test execution
  • Built scripts with a data-driven methodology that applies business rules to validate the components displayed on the website (see the sketch after this list).
  • Implemented the Regular Expressions in GUI maps to run the tests in System Test, Integration Test, UAT
  • Customized scripts for error detection and recovery
  • Responsible for writing startup scripts and compiled module functions for front-end and back-end validation.
  • Wrote and executed SQL queries to validate test results
  • Ran TSL test sets from TestDirector using a BAT file.
  • Ran WinRunner scripts from a BAT file.
  • Compare and analyze actual to expected results and report all deviations
  • Used the Virtual User Generator to generate VuGen scripts for web (J2EE), Citrix and MQ
  • Ensure that quality issues are appropriately identified, analyzed, documented, tracked and resolved in Test Director.
  • Developed and deployed test automation scripts to do end to end performance testing using Load Runner.
  • Implemented and maintained an effective automated test environment and the QA Lab.
  • Analyzed the user requirements by interacting with developers and business analysts.
  • Written Test Plan and Test Cases by going through the Design, Functional Requirements and User (Business) Requirements Documentation.
  • Responsible for overall software product quality.
  • Carried out extensive automated testing with different test scripts which reflect the various real time business situations
  • Performed extensive Function, Integration, Regression, Multi-User, End-to-End, User Acceptance testing.
  • Prepared the Test data for interpreting the Positive/negative/regression results.
  • Involved in Batch Process and Front End Application Testing
  • Used Test Director Tool for implementing Test Scripts and Tracking Defects
  • Performed SQL queries to do database testing
  • Performed load and volume testing using LoadRunner
  • Interacted with the various business groups and developers to get the most out of the software testing
  • Experienced in evaluating current test methodologies/testing practices and making recommendations/suggestions for improvement
  • Developed test scripts using WinRunner for Functionality, Security and Regression Testing
  • Automation of test scripts for functional and regression testing using WinRunner.
  • Participated in bug triage meetings to provide explanations of problems
  • Ability to remain agile, focused and make smart decisions in a rapidly changing, often ambiguous environment
  • Maintain bug tracking tool to report application bugs and enhancement requests.
  • Worked on client-server and web-based applications for automation in LoadRunner using SSL and VuGen
  • Ensured the compatibility of all application platform components, configurations and their upgrade levels in production and made necessary changes to the lab environment to match production; responsible for developing and executing performance and volume tests as well as automated regression scripts
  • Developed test scenarios to properly load/stress the system in a lab environment and monitored/debugged performance and stability problems, which required monitoring and debugging the WebLogic, Oracle and Apache components, including their key resources, performance indicators and logs.
  • Partner with the Software development organization to analyze system components and performance to identify needed changes in the application design
  • Performed in-depth analysis to isolate points of failure in the application
  • Assist in production of testing and capacity certification reports.
  • Investigate and troubleshoot performance problems in a lab environment. This will also include analysis of performance problems in a production environment.
  • Interfaced with developers, project managers and management in the development, execution and reporting of test automation results
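
The data-driven scripting and error detection/recovery described above commonly look like the following minimal VuGen-style sketch; the URL and parameter name (OrderId, bound to a hypothetical data file) are placeholders rather than the project's actual scripts.

    Action()
    {
        /* Continue the iteration on error so the recovery logic below can run. */
        lr_continue_on_error(1);

        lr_start_transaction("Lookup_Order");
        web_url("LookupOrder",
                "URL=https://example.internal/orders/{OrderId}",
                "Mode=HTML",
                LAST);

        if (web_get_int_property(HTTP_INFO_RETURN_CODE) != 200) {
            /* Log the failing data row and mark the transaction failed. */
            lr_error_message("Lookup failed for order %s", lr_eval_string("{OrderId}"));
            lr_end_transaction("Lookup_Order", LR_FAIL);
        } else {
            lr_end_transaction("Lookup_Order", LR_PASS);
        }

        /* Restore default error handling for the rest of the script. */
        lr_continue_on_error(0);
        return 0;
    }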

Environment: LoadRunner, WinRunner, Test Director, Oracle 9.x, C, C++, Java, WebLogic 7.6, Windows NT, Tuxedo 6.5, UNIX, DB2, JCL, TSL, JavaScript, HTML, XML, Citrix, AMDOCS ENSEMBLE
