Performance Test Engineer Resume
Chicago, IL
Summary:
- 7 years of extensive experience in Software Quality Assurance and Performance testing
- Expertise in test documentation and in performance test design and execution for Client/Server, Intranet, UNIX, Linux, Mainframe, and Internet applications
- Hands-on experience with automated tools such as LoadRunner, Test Director, Quality Center, and SharePoint.
- Experience in performance testing of Web and Client/Server applications using LoadRunner
- Performance testing experience with J2EE, PeopleSoft, and Oracle applications using the HTTP/HTML, Web (Click & Script), Oracle 2-Tier, and Citrix ICA protocols, as well as multi-protocol scripts.
- Expertise in manual and automated correlation to handle dynamically changing parameter values (a brief sketch follows this summary)
- Monitoring system resources such as CPU usage, memory utilization, vmstat, and iostat
- Collecting JVM heap usage and garbage collection frequency in WebSphere during tests
- Hands-on experience in all phases of the Software Development Life Cycle (SDLC), from inception through execution, including design, development, and implementation.
- Performed black-box, grey-box, and white-box testing of applications built with Java, JavaScript, ColdFusion, Spectra, and other web technologies.
- Worked with LoadRunner protocols including HTTP (Web), Citrix, ODBC, Oracle NCA, Winsock, Ajax, RMI-Java, .NET, RTE, Flex, COM/DCOM, Tuxedo, etc.
- Performed QA/load testing on multimedia (streaming video and audio), Ajax, .NET, and Citrix applications using LoadRunner.
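A minimal sketch of the correlation and parameterization referred to above, written as a LoadRunner VuGen Action in C. The URL, boundaries, and parameter names (UserName, Password, p_SessionId) are illustrative placeholders, not values from any specific engagement.

    Action()
    {
        /* Capture a dynamically generated session id from the login response.
           web_reg_save_param() must be registered before the request it applies to. */
        web_reg_save_param("p_SessionId",
                           "LB=sessionId=",        /* left boundary  (assumed) */
                           "RB=\"",                /* right boundary (assumed) */
                           "NotFound=ERROR",
                           LAST);

        /* UserName and Password are drawn from a VuGen parameter file. */
        web_submit_data("login",
                        "Action=http://appserver.example.com/login",   /* placeholder URL */
                        "Method=POST",
                        ITEMDATA,
                        "Name=username", "Value={UserName}", ENDITEM,
                        "Name=password", "Value={Password}", ENDITEM,
                        LAST);

        /* Replay the captured value on the next request instead of the recorded one. */
        web_url("home",
                "URL=http://appserver.example.com/home?sessionId={p_SessionId}",
                LAST);

        return 0;
    }

Automatic correlation, where it applies, generates an equivalent capture call; the manual form above is what the finished script looks like either way.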
LOADRUNNER EXPERTISE:
- Developing, recording, and enhancing Vuser scripts
- Monitoring Vuser status.
- Filtering and sorting information
- Configuring run-time settings in VuGen and the Controller
- Well versed in all functionality of the Virtual User Generator (VuGen).
- Configuring run-time settings for actions and think time
- Conducted performance and stress testing using LoadRunner.
- Performed IP Spoofing using LoadRunner
- Installing, maintaining and administering LoadRunner software.
- Used Quality Center for tracking and reporting bugs.
- Online monitoring of graphs and monitors
- Analysis of cross-result and cross-scenario comparisons and merged graphs
- Knowledge of Java Virtual Machine internals including class loading, threads, synchronization, and garbage collection
- Analyzing scenario performance, graphs and reports.
- Working with different Vuser types and groups.
- Proficient in adding loops to LoadRunner scripts to run steps for multiple iterations (see the sketch after this list).
- Hands-on experience with multiple versions of LoadRunner.
- Activating and configuring monitors and adding the desired performance counters to the graphs.
- Utilized performance tools such as Oracle Enterprise Manager, pmon, nmon, top, and the WebLogic console to monitor database cluster contention, I/O, user and CPU activity, and overall server performance
- Utilized database, network, application-server, and WebLogic monitors during execution to identify bottlenecks, bandwidth problems, and infrastructure problems, and to establish scalability and reliability benchmarks
- Created different scenarios to isolate bottlenecks, including smoke, scalability, reliability, stress, fault tolerance, and performance regression tests.
- Extensively worked with the Web, Citrix, Click and Script, Oracle, Siebel, Winsock, SOAP, Web Services, RTE, and SAP GUI protocols.
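A short sketch, assuming a VuGen Web (HTTP/HTML) script in C, of the looping, rendezvous, transaction, and think-time constructs referred to in this list. The transaction name, rendezvous name, URL, and iteration count are illustrative only.

    Action()
    {
        int i;

        /* Hold all Vusers at a rendezvous so the load burst starts together. */
        lr_rendezvous("search_burst");                 /* rendezvous name is illustrative */

        /* Repeat the same business step several times within one Action. */
        for (i = 0; i < 5; i++) {

            lr_start_transaction("search");

            web_url("search",
                    "URL=http://appserver.example.com/search?page={PageNumber}",  /* placeholder */
                    LAST);

            lr_end_transaction("search", LR_AUTO);

            /* Pacing between iterations; may be scaled or ignored by run-time settings. */
            lr_think_time(3);
        }

        return 0;
    }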
TECHNICAL SKILLS:
Testing Tools: LoadRunner, JMeter, QuickTest Pro, Quality Center
Monitoring Tools: Wily Introscope, J2EE Diagnostics, SiteScope, TeamQuest
Operating Systems: Windows XP/NT/2000, UNIX, Linux, DOS
Languages: Java, C, C++, Visual Basic, JSP, .NET
Scripting Languages: JavaScript, VBScript, HTML, DHTML, XML
Databases: DB2, Oracle, SQL Server, MS Access
Version Tools: PVCS, Visual SourceSafe, ClearCase, ClearQuest
App Servers: WebSphere, WebLogic, Oracle AS
Web Servers: iPlanet, MS IIS, Oracle 9iAS
Database Tools: TOAD, SQL Navigator
PROFESSIONAL EXPERIENCE:
Client: Confidential
Role: Performance Test Engineer
Location: Chicago, IL
Responsibilities:
- Analyzed system documentation such as requirements documents and user interface specifications to develop and execute test scripts
- Designed the Test Environment and the Scenarios for the Load Testing.
- Coordinated with Business Analysts and Developers to define KPIs for the workflows.
- Conducted performance, load, and stress testing using LoadRunner and JMeter.
- Extensively worked with the HTTP/HTML and Web Services protocols.
- Performed correlation and parameterization on the scripts to ensure they replay successfully, and monitored test activity through the LoadRunner Controller.
- Involved in preparing the performance test plan, test cases, and execution strategies
- Developed test scripts in LoadRunner VuGen and executed them using the LoadRunner Controller
- Developed test scenarios in the Controller and executed multiple cycles of test scripts
- Coordinated with the infrastructure team to ensure servers were functioning properly, both before setting up monitors in the Controller and during script execution
- Analyzed results using the LoadRunner analysis tool and sent out daily updates to relevant stakeholders
- Identified bottlenecks in performance and reported them to the technical/infrastructure teams for fixing defects or tuning for better performance. Logging of defects was done using Mercury Quality Center
- Involved in testing PeopleSoft batch processes by capturing the time taken to process large volumes of data; the data for batch testing was created using LoadRunner (see the sketch after this list)
- Primary person responsible for maintaining documents (test plan, execution plan, meeting minutes) and results, and for preparing the final performance report.
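A sketch of how data for the batch tests might be generated with LoadRunner, assuming a VuGen Web script in C that submits one record per iteration from a parameter file. The URL, form fields, and parameter names are hypothetical placeholders.

    Action()
    {
        /* One record per iteration; {EmployeeId} and {Amount} come from a
           parameter file configured to advance on each iteration. */
        lr_start_transaction("create_batch_record");

        web_submit_data("create_record",
                        "Action=http://psoft.example.com/batch/create",   /* placeholder URL */
                        "Method=POST",
                        ITEMDATA,
                        "Name=employeeId", "Value={EmployeeId}", ENDITEM,
                        "Name=amount",     "Value={Amount}",     ENDITEM,
                        LAST);

        lr_end_transaction("create_batch_record", LR_AUTO);

        return 0;
    }

Running a scenario of this script with a few hundred iterations per Vuser is one way to build up the data volumes the batch process is then timed against.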
Environment: LoadRunner, JMeter, PeopleSoft Portal, Performance Center, Quality Center, Toad, Oracle, Mainframe, MQ Series, Unix, HTML, DHTML, XML, QTP, IIS, Apache
Client: Confidential
Role: Performance Test Engineer
Location: Erie, PA
Responsibilities:
- Extensive experience with LoadRunner Virtual User Generator protocols such as HTTP/HTML, Web Services, Siebel, Citrix, and Oracle NCA.
- Participated in all phases of the SDLC, from requirements and design through development, testing, and implementation.
- Extensively worked on PeopleSoft applications, including load testing and system tuning, and recommended changes to optimize the environment to better handle the application load.
- Gathered and Analyzed Business requirements and procedures.
- Responsible for developing the performance test strategies, plans and cases for Siebel Application.
- Deploying and managing the appropriate testing framework to meet the testing mandate
- Executed performance test on the existing hardware to confirm the scalability of the application.
- Planned the load by analyzing Task distribution diagram, Transaction Profile and User profile.
- Extensively used LoadRunner to conduct performance testing of the application.
- Performed HP Business Process Testing using SoapUI and Quality Center for the .COM web application
- Developed Web Services Vuser scripts for web service calls, using SoapUI to build and validate the requests (see the sketch after this list).
- Prepared LoadRunner scenarios for load and performance testing using different host systems.
- Developed LoadRunner VuGen scripts, using correlation to parameterize dynamic values.
- Used rendezvous points, load balancing, and IP spoofing to load test specific transactions.
- Responsible for setting up performance monitors using SecureCRT and SiteScope.
- Analyzed graphs generated by LoadRunner Analysis, including database monitor, network monitor, Vuser, error, transaction, and web server resource graphs.
- Responsible for collecting JVM heap usage and garbage collection frequency in WebLogic during tests
- Tracked defects using Quality Center.
- Involved in Database tuning to enhance the application performance.
- Provided weekly updates to the client and application team based on test results and analysis.
- Implemented, and mentored others on, the LoadRunner test harness, Controller scenarios, run-time settings, correlation, parameterization, and other LoadRunner functions
- Managed the full range of performance testing (load, stress, volume, endurance, and failover) using LoadRunner (Controller, Virtual User Generator, and Analysis).
- Installed LoadRunner and acquired the LoadRunner licenses
- Participated in design review/walkthrough sessions with Project Managers, Developers, and the Product Team.
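A minimal sketch of a web service Vuser step of the kind described above. Instead of the Web Services protocol's generated calls, this version posts a SOAP envelope directly with web_custom_request, a common pattern once the request has been validated in SoapUI; the endpoint, SOAPAction value, and envelope body are placeholders.

    Action()
    {
        lr_start_transaction("get_account");

        /* The SOAPAction header must be registered before the request it applies to. */
        web_add_header("SOAPAction", "\"urn:getAccount\"");          /* placeholder action */

        web_custom_request("getAccount",
            "URL=http://services.example.com/AccountService",        /* placeholder endpoint */
            "Method=POST",
            "Resource=0",
            "EncType=text/xml; charset=utf-8",
            "Body=<soapenv:Envelope xmlns:soapenv=\"http://schemas.xmlsoap.org/soap/envelope/\">"
                 "<soapenv:Body><getAccount><accountId>{AccountId}</accountId></getAccount>"
                 "</soapenv:Body></soapenv:Envelope>",
            LAST);

        lr_end_transaction("get_account", LR_AUTO);

        return 0;
    }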
Environment: JMeter, LoadRunner, Siebel, ASP, VBScript, Toad, Oracle, Mainframe, MQ Series, Unix, HTML, DHTML, XML, QTP, IIS, Apache, Quality Center, Agile
Client: Confidential
Role: Performance Engineer
Location: Detroit, MI
Responsibilities:
- Managed test planning for LoadRunner performance testing, integration testing, system testing, acceptance testing, regression testing, and cross-browser testing
- Used LoadRunner (with C programming) for load testing.
- Performed manual and automated testing (using WinRunner, Astra Quick Test and Quick Test Pro).
- Performed load, stress, sizing, scalability, and capacity-planning testing of various Witness Systems client/server products, which involved extensive server testing.
- Evaluated Performance Testing Tools from IBM Rational and HP Mercury Suite.
- Implemented LoadRunner 8.1 and acquired the licenses from Mercury Interactive.
- Designed and implemented the performance testing plan for QM, eReporting, OEMs (BT, Avaya), Balance, and WFO.
- Involved in testing HIPAA EDI Transactions.
- Developed test harnesses using the Virtual User Generator in single- and multi-protocol modes (HTTP/HTML, RMI, Citrix, dual Web/Winsock, and Windows Sockets).
- Correlated and parameterized scripts and configured run-time settings in the Virtual User Generator.
- Created scenarios with different schedules, such as ramp-up and duration.
- Monitored performance using Windows performance monitors and LoadRunner monitors.
- Set up test environments for various applications and builds for performance testing, including installing operating systems, configuring RAID using HP tools, installing applications from the suite, and installing multiple databases such as MS SQL Server and Oracle.
- Worked closely with the EDI team to ensure accuracy in data transmissions and shared processes.
- Analyzed tests using the summary analysis, average transaction response time, throughput, Windows resources, network delay, and HTTP status code graphs (see the sketch after this list for the transaction instrumentation behind these graphs)
- Implemented QuickTest Professional and parameterized scripts for data-driven testing.
- Defined, estimated and assigned tasks to other Team members.
- Met with Project Managers in defining and estimating tasks and risk.
- Assisted with planning, tracking and reporting the team’s progress against schedule and reported status to upper management.
- Tested and verified data mapping to appropriate tables and columns.
- Mentored and coached junior QA Analysts.
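The response-time and HTTP-code analysis above depends on how the scripts mark transactions and verify responses. A brief sketch of that instrumentation, assuming a VuGen Web script in C with placeholder URL, field, and transaction names:

    Action()
    {
        /* Text check registered before the request: if the confirmation text is
           missing, the step is flagged as failed and shows up in the error graphs. */
        web_reg_find("Text=Order Confirmation",        /* expected text is illustrative */
                     "Fail=NotFound",
                     LAST);

        lr_start_transaction("submit_order");

        web_submit_data("submit_order",
                        "Action=http://app.example.com/order/submit",   /* placeholder URL */
                        "Method=POST",
                        ITEMDATA,
                        "Name=orderId", "Value={OrderId}", ENDITEM,
                        LAST);

        /* Mark the transaction passed or failed from the HTTP status, so the
           average response-time graph reflects only successful submissions. */
        if (web_get_int_property(HTTP_INFO_RETURN_CODE) == 200)
            lr_end_transaction("submit_order", LR_PASS);
        else
            lr_end_transaction("submit_order", LR_FAIL);

        return 0;
    }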
Environment: LoadRunner, Performance Center, SiteScope, Quality Center, Unix, Windows, Java, JBoss, WebLogic, Oracle, XML, SQL Server, MS Access, MS Visio, MS Project, VB, J2EE analysis, HTML.
Client: Confidential
Role: Performance Engineer
Location: Omaha, NE
Responsibilities:
- Managed the resources and process for performance testing (load, stress, volume, endurance, and failover) using LoadRunner (Controller, Virtual User Generator, and Analysis).
- Responsible for hosting daily status, weekly status, defect status, MRD, and triage calls for all applications in the TST space.
- Extensively worked on UNIX to change database connections, trace logs, monitor machine resources, create users, and execute batch jobs.
- Coordinated with the tools team to install Mercury Diagnostics, Wily Introscope, and SiteScope in the performance environments so bottlenecks could be identified during triage calls.
- Managed onshore and offshore teams for test harness development, performance scenario execution during nights and weekends, and report generation.
- Presented performance testing results, together with the project management team, to the client's senior management.
- Also worked with the project management team to schedule testing activities for the TST space and allocate resources.
- Designed test case documents for performance testing in Quality Center and reported defects.
- Coordinated with process owners to identify the business processes to be performance tested and to perform functional UAT
Environment: LoadRunner, JavaScript, HTML, DHTML, JSP, VBScript, Toad, Oracle, UML, Mainframe, MQ Series, Unix, XML, Windows NT, QTP, Quality Center, Agile.