
Sr. Performance/System Test Engineer Resume


Cleveland, OH

SUMMARY:

  • 10+ years of experience in performance and capacity testing of web-based and client-server applications across various domains
  • Strong knowledge of all phases of the SDLC and strong working knowledge of software testing (functional, integration, performance, quality metrics)
  • Experienced in defining testing methodologies, designing test plans and test cases, verifying and validating web-based e-commerce applications, and documentation
  • Expertise in test documentation, manual and automated testing, and execution on client/server, UNIX, Linux, mainframe and Internet applications
  • Hands-on experience with automated tools such as LoadRunner, TestDirector and Quality Center
  • Worked on Agile projects, ensuring close, daily cooperation between business people and developers
  • Worked closely with the developers and business customers to understand the business requirements and overall strategies
  • Experience in performance testing of web and client/server applications using LoadRunner
  • Developed and deployed load test scripts for end-to-end performance testing using LoadRunner
  • Performed Scalability testing to identify major workloads and mitigate bottlenecks in the application
  • Analyzed test results to identify bottlenecks and suggested possible solutions to optimize application performance
  • Involved in performance testing environment setup and software and hardware configuration
  • Involved in performance tuning to ensure that the application responds to requests within an acceptable timeframe
  • Good understanding of network Load Balancing and server architecture to verify scalability during test execution
  • Experience with AJAX, J2EE, Oracle Apps and .NET applications using the Web Services, HTTP/HTML, Web Click & Script, Oracle NCA, Citrix ICA and AJAX protocols
  • Expertise in manual and automated correlation and in using correlation rules to parameterize values
  • Monitored system resources such as CPU usage and memory utilization with netstat, vmstat and iostat
  • Measured Response times at sub transaction levels at Web, App and Database Server levels
  • Strong knowledge of SQL and databases (SQL Server, DB2, Oracle and Postgres)
  • Expert in writing and executing test cases and in using various tools for bug tracking and report generation
  • Experience in Insurance, Financial, Retail and Pharmaceutical industries
  • Have an ability to handle multiple projects with competing priorities
  • Individual with good analytical, interpersonal and problem-solving skills

TECHNICAL SKILLS:

Testing Tools: LoadRunner 11.0/11.50/12.0/12.50, Quality Center, ALM, QTP 9.5

Monitoring Tools: SiteScope, Compuware (VA), Wily Introscope, Wireshark, AppDyn, Splunk

Operating Systems: MS-DOS, Windows 95/98/2000/XP/NT/7, UNIX, Linux, Mainframes OS/390

Languages: C, C++, Visual Basic, Java, SQL

Scripting Languages: VBScript, JavaScript, HTML, DHTML, XML

Databases: SQL Server 2005, DB2, Oracle 8i/9i, Postgres 8.4

App Servers: IIS, WebSphere 4.0/5.1, WebLogic 5.0/6.1, Oracle AS 10g

Web Servers: IIS, iPlanet, Apache

Database Tools: TOAD, SQL Navigator

ERP Tools: Oracle 9iAS, SAP Business Objects

Version Control & Other Tools: ClearCase, PVCS, CVS, MS Visio, MS Office, SharePoint

PROFESSIONAL EXPERIENCE:

Confidential, Cleveland, OH

Sr. Performance/System Test Engineer

Responsibilities:

  • Reviewed, evaluated and provided test input during business requirements and design specification gathering sessions
  • Developed performance test plans and test cases for normal and stress loads based on client requirements
  • Independently developed and updated LoadRunner (VuGen) test scripts according to test specifications
  • Enhanced the scripts by adding transactions and parameterizing the XML input values along with the APIKey and IGuard token (see the VuGen sketch after this list)
  • Created stress test scenarios for baseline and test build runs using the LoadRunner Controller
  • Executed the scripts in multiple stress (performance testing) environments to compare response times
  • Monitored response times, throughput and server resources such as CPU utilization, Available Bytes and Process Bytes
  • Monitored resources to identify performance bottlenecks, analyzed test results and reported the findings using monitoring tools such as AppDyn and Splunk
  • Wrote custom SQL queries to capture web service method and stored procedure response times in SQL Server Management Studio
  • Captured the average and 95th percentile response times from the CIF (Common Instrumentation Framework) logs in the CIF database
  • Logged outcomes, verified test execution, and analyzed and recovered from execution errors
  • Identified and communicated testing issues, implemented process improvements and issue resolutions
  • Identified the target test items to be evaluated by the test effort and defined the appropriate test requirements
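
A minimal VuGen (C) sketch of the pattern referenced above: wrap the request in a transaction, send the APIKey and IGuard token, and parameterize the XML body. The URL, parameter names and payload below are placeholders, not the project's actual requests:

    Action()
    {
        /* APIKey/IGuardToken are assumed VuGen parameters fed from a data file */
        web_add_header("APIKey", lr_eval_string("{APIKey}"));
        web_add_header("IGuardToken", lr_eval_string("{IGuardToken}"));

        lr_start_transaction("SubmitOrder");

        /* POST an XML body whose field values are parameterized per iteration */
        web_custom_request("SubmitOrder",
            "URL=https://perf.example.com/api/orders",   /* placeholder URL */
            "Method=POST",
            "EncType=text/xml",
            "Body=<order><id>{OrderId}</id><qty>{Qty}</qty></order>",
            LAST);

        lr_end_transaction("SubmitOrder", LR_AUTO);

        return 0;
    }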

Environment: LoadRunner 12.50, Web Services, ASP.NET, MS Windows Server 2008, SQL Server 2008, Visual Studio 2012, DM, BizTalk, Oracle, IIS Web/App, MyAppsQA, AppDyn and Splunk, Responders, Performance Analyzer (PAL), SharePoint, SCM

Confidential, Cleveland, OH

Sr. Performance Test Analyst

Responsibilities:

  • Developed test approach, test plan and test scenario documents to cover all performance testing
  • Identified load-critical transactions, along with their response times and number of hits per peak day or peak hour, from production data in SSRS reports in order to mimic real-world scenarios in the test
  • Set up the environment and configured the Responders to make external calls
  • Developed and deployed test automation scripts for end-to-end performance testing using LoadRunner
  • Developed LoadRunner scripts and enhanced them to accommodate different numbers of concurrent virtual users
  • Captured dynamic values such as __VIEWSTATE and __EVENTVALIDATION in the .NET application and created correlation rules in the recording options (see the sketch after this list)
  • Executed multi-user performance tests with LoadRunner, using online monitors, real-time output messages and other features of the LoadRunner Controller
  • Prepared test data for the parameterized values in the LoadRunner scripts for multiple scenarios
  • Worked extensively with the PAL performance monitoring tool to analyze CPU, available memory, memory leaks and hits/sec
  • Configured resource counters before execution and analyzed the results to identify potential performance bottlenecks
  • Analyzed the LoadRunner reports to calculate response time and transactions per second
  • Analyzed transaction performance using SQL Profiler and captured execution plans to validate stored procedure performance
  • Used the SCM automation tool to deploy code to the performance test servers and updated the Web.config files for the environment
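
A minimal VuGen (C) sketch of the __VIEWSTATE/__EVENTVALIDATION correlation referenced above; the same boundaries can instead be defined once as a correlation rule in the recording options. The page URL, form fields and boundary strings are illustrative assumptions:

    Action()
    {
        /* Register the ASP.NET hidden fields before the page request */
        web_reg_save_param_ex(
            "ParamName=ViewState",
            "LB=__VIEWSTATE\" value=\"",
            "RB=\"",
            SEARCH_FILTERS, "Scope=Body", LAST);
        web_reg_save_param_ex(
            "ParamName=EventValidation",
            "LB=__EVENTVALIDATION\" value=\"",
            "RB=\"",
            SEARCH_FILTERS, "Scope=Body", LAST);

        lr_start_transaction("Open_Login_Page");
        web_url("Login", "URL=https://perf.example.com/Login.aspx", LAST);  /* placeholder URL */
        lr_end_transaction("Open_Login_Page", LR_AUTO);

        /* Replay the captured values on the postback */
        lr_start_transaction("Login_Postback");
        web_submit_data("Login.aspx",
            "Action=https://perf.example.com/Login.aspx",
            "Method=POST",
            ITEMDATA,
            "Name=__VIEWSTATE",       "Value={ViewState}",       ENDITEM,
            "Name=__EVENTVALIDATION", "Value={EventValidation}", ENDITEM,
            LAST);
        lr_end_transaction("Login_Postback", LR_AUTO);

        return 0;
    }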

Environment: LoadRunner 11.50, Web Services, ASP.NET, MS Windows Server 2008, MF, SQL Server 2005, DB2, IIS Web/App, Performance Analyzer (PAL), SharePoint, SCM, Quality Center

Confidential

Load Performance Analyst

Responsibilities:

  • Developed Performance test plans and test cases for Normal, Peak, Stress and Endurance loads based on client requirement
  • Independently developed LoadRunner test scripts according to test specifications/requirements
  • Developed the Load Test scripts using the LoadRunner Virtual User Generator (VUGen)
  • Enhanced the scripts by adding transactions, parameterizing the constant values and correlating the dynamic values
  • Created Load/Stress scenarios for performance testing using the LoadRunner Controller
  • Involved in Load Testing of various modules and software application using LoadRunner
  • Customized scripts according to business specifications using LoadRunner built-in functions
  • Used manual and automated correlation to parameterize dynamically changing values
  • Conducted load testing on multiple servers to establish their load capacity at 10,000 users
  • Analyzed the response times of various business transactions and module login times under load
  • Monitored the performance of the web and database (SQL) servers during stress test execution
  • Defined transactions to measure server performance under load and created rendezvous points to simulate heavy load on the server (see the sketch after this list)
  • Monitored resources to identify performance bottlenecks, analyzed test results, reported the findings to the clients and provided recommendations for performance improvements as needed
  • Identified performance issues, including deadlock conditions and database connectivity problems
  • Used LoadRunner for checking the performance of Various Reports and Queries under load
  • Analyzed the results of the Load test using the LoadRunner Analysis tool, looking at the online monitors and the graphs and identified the bottlenecks in the system
  • Reported and entered bugs in bug tracking tool called Site Help Desk
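
A minimal VuGen (C) sketch of the rendezvous-point pattern referenced above: all Vusers block at the rendezvous so the transaction hits the server as a synchronized spike. The form fields, URL and parameter names are illustrative assumptions; the actual ramp-up and release policy lived in the Controller scenario:

    Action()
    {
        /* Vusers wait here until the Controller's rendezvous policy releases them */
        lr_rendezvous("login_spike");

        lr_start_transaction("Login");
        web_submit_data("login",
            "Action=https://perf.example.com/login",   /* placeholder URL */
            "Method=POST",
            ITEMDATA,
            "Name=username", "Value={UserId}",   ENDITEM,  /* parameterized credentials */
            "Name=password", "Value={Password}", ENDITEM,
            LAST);
        lr_end_transaction("Login", LR_AUTO);

        lr_think_time(5);   /* pacing between business steps */
        return 0;
    }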

Environment: LoadRunner 9.10/9.50, QTP 10.0, IIS, JavaScript, Jet Script, Apache, MS Windows Server 2003, HTML, DHTML, XML, Barracuda and BIG-IP (F5) load balancers, Wireshark, Snagit

Confidential, New York City, NY

Performance Tester

Responsibilities:

  • Developed test scenarios and test cases in Test Director to cover all Performance testing
  • Identified load-critical transaction characteristics such as load profile, load levels, volume of records in the database and hardware/software components
  • Developed and deployed test automation scripts for end-to-end performance testing using LoadRunner
  • Installed the LoadRunner and SiteScope software and configured them according to our requirements
  • Developed LoadRunner scripts and enhanced them to accommodate different numbers of concurrent virtual users
  • Captured dynamic values such as session IDs and __VIEWSTATE in the .NET application and created correlation rules in the recording options
  • Executed multi-user performance tests with LoadRunner, using online monitors, real-time output messages and other features of the LoadRunner Controller
  • Debugged the scripts using step-by-step mode and the lr_debug_message function (see the sketch after this list)
  • Used LoadRunner for checking the performance of Various Reports and Queries under load
  • Defined transactions to measure server performance under load by creating rendezvous points to simulate heavy load on the server
  • Prepared large data sets for the parameterized values in the scripts for multiple scenarios
  • Conducted Performance testing by creating Virtual Users and Scenarios using LoadRunner
  • Worked extensively with the SiteScope performance monitoring tool to analyze CPU, available memory, memory leaks and hits/sec
  • Configured resource counters before execution and analyzed the results to identify potential performance bottlenecks
  • Analyzed the LoadRunner reports to calculate response time and transactions per second
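
A minimal VuGen (C) sketch of the session ID correlation and lr_debug_message usage referenced above. The left/right boundaries and URL are illustrative assumptions:

    Action()
    {
        /* Capture the session ID returned by the server (assumed boundaries) */
        web_reg_save_param("SessionId",
            "LB=sessionId=", "RB=&",
            "Ord=1", "Search=Body", LAST);

        lr_start_transaction("Home");
        web_url("Home", "URL=https://perf.example.com/home", LAST);   /* placeholder URL */
        lr_end_transaction("Home", LR_AUTO);

        /* Printed only when Extended Log is enabled, so load runs stay quiet */
        lr_debug_message(LR_MSG_CLASS_EXTENDED_LOG,
            "Captured SessionId = %s", lr_eval_string("{SessionId}"));

        return 0;
    }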

Environment: LoadRunner 9.10, ASP.NET, MS Windows Server 2003, Oracle 9i, IIS Web/App, Linux, SQL Navigator, SiteScope, Quality Center

Confidential, Minneapolis, MN

QA Analyst/Performance Tester

Responsibilities:

  • Developed test cases using Test Director and Quality Center to cover overall QA testing
  • Documented test cases, performed manual testing, and added, reviewed and traced defects to completion using TestDirector
  • Responsible for documenting and executing Test Scenarios by reviewing all the Business, Design and Architecture documents
  • Performed Functionality, Integration, End to End and User Acceptance Testing
  • Performed database testing in Real Time Inventory module
  • Involved in creating Automated Test Scripts, wrote SQL statements
  • Worked with other QA testers to ensure that all projects are executed accurately and completely
  • Performed System, Integration, End to End and User Acceptance testing and monitored the applications behavior during different phases of testing
  • Used Quality Center as the repository for requirements analysis, test case design and execution, bug tracking and reporting
  • Used SharePoint as a centralized repository for shared documents, as well as browser-based management and administration
  • Generated Vuser scripts and executed performance tests using LoadRunner
  • Developed Vuser scripts and enhanced the basic scripts by parameterizing the constant values in LoadRunner (see the sketch after this list)
  • Used Manual and Automated Correlation to Parameterize Dynamically changing Parameters
  • Monitored the metrics such as response times, throughput and server resources such as CPU utilized, Available Bytes and Process Bytes by using LoadRunner Monitors
  • Monitored system resources such as CPU usage and memory utilization using vmstat and iostat
  • Monitored network statistics, load balancing and network traffic in each of the JVMs
  • Collected JVM heap usage and GC cleanup frequency in WebSphere during test execution
  • Analyzed the reports in LoadRunner Analysis to calculate response time and transactions per second
  • Collected the information and compiled the reports in Excel spreadsheets for reporting
  • Summarized performance data and metrics in weekly reports
  • Participated in defect meetings to discuss bottlenecks and long-running queries
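
A minimal VuGen (C) sketch of the parameterization pattern referenced above, assuming a hypothetical {AccountNo} parameter mapped to a data file (e.g. with "Select next row = Unique") in place of the recorded constant:

    Action()
    {
        lr_start_transaction("AccountLookup");

        /* {AccountNo} replaces the hard-coded value captured at record time */
        web_url("AccountLookup",
            "URL=https://perf.example.com/account?id={AccountNo}",   /* placeholder URL */
            LAST);

        lr_end_transaction("AccountLookup", LR_AUTO);

        /* Log the substituted value while debugging the script */
        lr_output_message("Looked up account %s", lr_eval_string("{AccountNo}"));

        return 0;
    }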

Environment: Java, J2EE, JSP, Servlets, EJB, HTML, IPlanet, WebSphere, DB2, XML, SQL, Windows XP, UNIX, Linux, LoadRunner 8.1 FP4, TestDirector, Quality Center, SharePoint
