
Performance Tester Resume


Buffalo Grove, IL

SUMMARY

  • 7+ years of strong expertise in Performance/Load & Stress Testing using HP LoadRunner/Performance Center.
  • Extensive experience in automated testing of web-based and client/server applications with proficiency in Load and Performance Testing. Good experience working in Agile methodology.
  • Experience working with LoadRunner performance testing, Splunk, and Dynatrace.
  • Good working knowledge of APM tools such as AppDynamics for diagnosing complex performance problems.
  • Extensive experience in Functional Testing, System Testing, Integration Testing, Regression Testing, Interface Testing, Load Testing, SOAK Testing, User acceptance Testing, UI Testing & Sanity Testing.
  • Performed performance monitoring of the application and database servers during test runs using tools like AppDynamics and SiteScope.
  • Executed performance tests (load, capacity, and stress) using HP LoadRunner and Microsoft Visual Studio Load Test 2010.
  • Strong experience with profiling tools like Dynatrace and AppDynamics.
  • Executed automated test scripts using Mercury tools (TestDirector/Quality Center, LoadRunner, and QTP) and JMeter based on business/functional specifications.
  • Experience in automating web tests using VSTS.
  • Good experience with Selenium IDE and creating scripts in Selenium RC using Java.
  • Expertise in various monitoring tools like HP SiteScope and HP Diagnostics to keep track of test performance and identify bottlenecks.
  • Good working knowledge of using VSTS for performance testing.
  • Experience in log file analysis for finding performance bottlenecks using tools such as Splunk and GC Viewer.
  • Expertise in SQL queries to perform backend testing.
  • Experience in monitoring web servers and application servers such as Microsoft IIS, WebLogic, and WebSphere, and database servers such as SQL Server and Oracle during performance tests with and without firewalls.
  • Expertise in Web Services and experience using SoapUI for testing in an SOA environment.
  • Participated in Integration, System, Smoke and User Acceptance Testing.
  • Experience in performance testing of web and client/server applications using LoadRunner.
  • Well versed in all functionality of Virtual User Generator (VuGen), including correlating statements and configuring run-time settings for HTTP, iterations, and simulated modem speeds to bring test scenarios closer to real-world conditions (a correlation sketch follows this list).
  • Strong knowledge of single- and multi-protocol scripts in LoadRunner VuGen, including Web HTTP/HTML, Web Services, Ajax TruClient, Web Click and Script, Citrix ICA, ODBC, and Oracle NCA.
  • Good understanding of object oriented methodologies, software development life cycle (SDLC) and software testing methodologies
  • Excellent ability to understand complex scenarios and business problems, and transfer the knowledge to other users/developers in the most comprehensible manner
  • Good knowledge of object-oriented programming; experienced with C#, HTML, XML, and CSS.
  • Experience in coding .NET applications.
  • Experienced in working with developers on white-box testing and debugging code for better performance results.
  • Quick learner with respect to the latest technologies, best practices, and systems.
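
For illustration, a minimal LoadRunner VuGen (C) sketch of the correlation technique referenced in the summary above; the host, boundary strings, and parameter names are illustrative assumptions rather than details of any actual engagement:

    Action()
    {
        // Register a correlation before the request that returns the dynamic
        // value; the left/right boundaries and parameter name are assumed.
        web_reg_save_param("SessionToken",
                           "LB=sessionToken=\"",
                           "RB=\"",
                           "NotFound=ERROR",
                           LAST);

        lr_start_transaction("Login");

        web_submit_data("login",
                        "Action=https://example-app.local/login",
                        "Method=POST",
                        "Mode=HTML",
                        ITEMDATA,
                        "Name=username", "Value={UserName}", ENDITEM,
                        "Name=password", "Value={Password}", ENDITEM,
                        LAST);

        lr_end_transaction("Login", LR_AUTO);

        // Reuse the captured token in a later request instead of the
        // hard-coded value picked up at record time.
        web_url("account_summary",
                "URL=https://example-app.local/account?token={SessionToken}",
                "Resource=0",
                "Mode=HTML",
                LAST);

        return 0;
    }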

TECHNICAL SKILLS

Testing Tools: LoadRunner, QTP, Performance Center, Silk Performer, JMeter

LoadRunner Protocols: Web HTTP/HTML, Web Services, TruClient, Java Vuser, Oracle NCA

Scripting: JavaScript, VBScript, Shell

Programming Languages: C#, VB.NET, ASP.NET, HTML

Web/Application Servers: MS IIS, Apache, WebSphere, WebLogic

Database: Oracle, DB2, SQL Server

Service Oriented Architecture (SOA): Web Services, XML, SoapUI, WSDL, WCF

Monitoring Tools: HP SiteScope, TMART, Dynatrace, HP BPM, Wily Introscope, Splunk

PROFESSIONAL EXPERIENCE

Confidential, Philadelphia, PA

Sr. Performance Tester

Responsibilities:

  • Gathered business requirements, studied the application, and collected information from Business Analysts, Project Managers, Solution Architects, Developers, and SMEs.
  • Responsible for all phases: planning, developing scripts, executing Performance Center scenarios, and analysis in an Agile environment.
  • Responsible for monitoring the application's performance under load using the key web server monitors, web application server monitors for WebSphere, IIS 5.0, Apache monitors, and NT performance monitors.
  • Executed test scenarios using the latest version of the Performance Center 12.20 tool.
  • Responsible for smoke and performance testing.
  • Created new scripts after each code drop and debugged existing scripts after maintenance releases.
  • Worked with different teams - the MicroStrategy team, Portal team, DB2 team, and development team - in debugging issues and supporting them in the process.
  • Involved in writing Test Plans by incorporating Performance Testing Objectives, Testing Environment, User Profiles, Risks, Test Scenarios, Explanation about the Tools used, Schedules and Analysis, Monitors and Presentation of results.
  • Worked with App-Dev, Production, Technical, and Business Managers in planning, scheduling, developing, and executing performance tests.
  • Created GUI, bitmap, database, and synchronization verification points in WinRunner.
  • Developed LoadRunner test scripts according to test specifications/requirements.
  • Responsible for testing Web, Web Services, and Ajax TruClient requests.
  • Extensively monitored all the applications using HP Performance Center and SiteScope.
  • Created various Vuser scripts based on the critical transactions used by real-time users, using LoadRunner VuGen; identified and eliminated performance bottlenecks during the performance lifecycle (a transaction-timing sketch follows this list).
  • Verified that new or upgraded applications met specified performance requirements.
  • Used HP Diagnostics to obtain performance data for problem solving, trend analysis, and capacity planning.
  • Planned, designed, executed, and evaluated performance tests of web applications and services, and ensured optimal application performance using LoadRunner.
  • Using Performance Center, performed manual ramp-up and ramp-down scenarios to determine how many users could execute a transaction at a time.
  • Set run-time parameters (think time, pacing, replay options, etc.), ramp-up, and load distribution.
  • Worked on executing baseline, benchmark, stress, and memory-leak test scenarios for internal and customer-facing applications based on each application's workload model.
  • Responsible for preparing System test environment (UNIX OS) using FTP.
  • Ensured daily production support and administration of portal operations, such as end-user performance, identifying bottlenecks before and after production migration and maintenance upgrades.
  • Implemented monitoring solutions with Graphite.
  • Created performance scripts in JMeter, VSTS, and VuGen.
  • Actively involved in automating the regression testing process of the application from the existing manual testing scenarios using C#, ADO.NET, and VSTS.
  • Implemented SQL automation from the VSTS tool using ADO.NET connections.
  • Extensively used Unix commands for debugging and used, modified & ran Shell Scripts for daily reports and data collection.
  • Worked closely with software developers and took an active role in ensuring that the software components met the highest quality standards.
  • Deployed Graphite + Tasseo for real time metrics collection and dashboards.
  • Configured the LoadRunner Controller for running the tests and verified that the LoadRunner scripts worked as expected on different load generator machines.
  • Monitored the various machines during load tests and informed the corresponding teams in case of issues.
  • Database testing using SQL queries to compare data accuracy of backend for reports.
  • Monitored different graphs like transaction response time and analysed server performance status, hits per second, throughput, Windows resources, database server resources, etc.
  • Developed load test scripts using VuGen to make them flexible and useful for Regression testing.
  • Analysed the results and created an Analysis Report through Performance Center Analysis; prepared and submitted an Exit Report with recommendations.
  • Analysed the system resource graphs, network monitor graphs, and error graphs to identify transaction performance, network problems, and scenario results respectively.
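
As referenced in the bullet on critical transactions above, a hedged VuGen (C) sketch of how a single transaction might be timed, checked, and passed or failed; the URL, verification text, and think time are placeholders, and pacing/ramp-up are configured in the Controller or Performance Center scenario rather than in the script itself:

    Action()
    {
        // Register a content check; the expected text is an assumption.
        web_reg_find("Text=Order Confirmation", "SaveCount=OrderFoundCount", LAST);

        lr_start_transaction("Submit_Order");

        web_url("submit_order",
                "URL=https://example-app.local/order/submit",
                "Resource=0",
                "Mode=HTML",
                LAST);

        // Fail the transaction explicitly if the confirmation text is missing,
        // so the failure shows up in the Analysis transaction graphs.
        if (atoi(lr_eval_string("{OrderFoundCount}")) == 0)
            lr_end_transaction("Submit_Order", LR_FAIL);
        else
            lr_end_transaction("Submit_Order", LR_PASS);

        // Emulate user think time; run-time settings can replay or cap it.
        lr_think_time(5);

        return 0;
    }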

Environment: SQL Server 2012, C#, HTML, HP ALM Quality Center, SoapUI, LoadRunner (Web HTTP/HTML, Web Services, Ajax TruClient, JavaScript Vuser), HP Performance Center 12.x.

Protocols: Java over HTTP, Web Services, Web (HTTP/HTML)

Confidential, Buffalo Grove, IL

Performance Tester

Responsibilities:

  • Participated in all meetings planned for a particular release, including design reviews and test execution timeline discussions, and obtained the necessary technical requirements.
  • Involved in writing test plans and test cases using requirements and use case documents and business requirement documents.
  • Created Test Strategy and Test plan for the testing effort.
  • Conducted Smoke, Non-Functional, Functional, System, and Integration testing.
  • Used SoapUI Pro to perform Web Service tests.
  • Gathered and analysed business and technical requirements for performance testing purposes.
  • Responsible for generating the key virtual user scripts using the LoadRunner VuGen utility for Web (HTTP/HTML), LDAP, and WinSock protocols.
  • Uploaded and configured WADL file to SOAPUI and JMeter applications to test the web services application.
  • Created Vusers to emulate concurrent users, inserted rendezvous points in the Vuser scripts, and executed the scripts in both goal-oriented and manual scenarios using LoadRunner.
  • Developed web services automated scripts from API document to verify Restful web service calls using XML and JSON format.
  • Responsible for monitoring the application's performance under load using the key web server monitors, web application server monitors for WebSphere, IIS 5.0, Apache monitors, and NT performance monitors.
  • Responsible for running the LoadRunner scenarios for the Vusers using the LoadRunner Controller and monitoring server response times, throughput, hits/sec, transactions/sec, transaction response times under load, web server monitors, app server monitors, system monitors such as Java processes, and a host of other performance metrics.
  • Implemented IP spoofing techniques to simulate unique users' requests while running the tests.
  • Enhanced the recorded scripts through correlation, parameterization, debugging messages, string manipulation, and other script enhancements as and when needed (a rendezvous and debug-message sketch follows this list).
  • Configured various Web Sphere monitors for WAS applications to figure out which of the several servlets/JSPs caused the problem.
  • Created quantifiable load with test-scenarios for various applications (both standalone and integration) using Load Runner's Controller.
  • Used various servers and ran SQL queries in SQL Server 7.0 on the back end to ensure the proper transaction of data during various tests.
  • Participated in discussions with the QA manager, Developers and Administrators in fine-tuning the applications based on the Results produced by Analysis Tool.

Environment: Windows NT, Unix, Linux, Oracle 8i & 9i, SQL, MS SQL Server, Oracle Finance, .NET, Web Services, XML, HP LoadRunner (LDAP, WinSock, Web HTTP/HTML), Test Manager.

Protocols: Web HTTP, Oracle NCA, Microsoft ADO.NET.

Confidential, Kansas City, MO

Performance Tester

Responsibilities:

  • Prepared the test plan and test specifications based on Functional Requirement Specifications and System Design Specifications for an e-commerce application.
  • Analysed system design specifications and developed test cases for overall quality assurance testing.
  • Performed extensive functional testing using WinRunner.
  • Identified tests to be automated and converted to Test Scripts using QTP for Functional and Regression testing
  • Executed scenarios using HP Performance Center and analysed the results through Performance Center Analysis to find bottlenecks in networks and server resources, including deadlock conditions, database connectivity problems, and system crashes under load.
  • Parameterized actions and created data files using random, sequential, and unique options in VuGen (a parameterization sketch follows this list).
  • Developed Virtual User Scripts using protocols like Web (http/html), Web Services, .Net, Oracle NCA.
  • Reviewed and analysed Web Services contracts, WSDL, XSD and XML files.
  • Used correlation to handle dynamic return values and scheduled tests under various workload compositions.
  • Developed load test scripts using VuGen to make them flexible and useful for regression testing.
  • Configured monitors to observe the performance of individual hosts' behaviour under load.
  • Created scenarios using HP Controller to do Load and Stress test.
  • Extensively used HP Diagnostics to monitor performance bottlenecks.
  • Used LoadRunner to simulate multiple Vuser scenarios; defined rendezvous points to create intense load on the server and thereby measure server performance under load.
  • Analysed the results and created an Analysis Report through Performance Center Analysis; prepared and submitted an Exit Report with recommendations.
  • Analysed the system resource graphs, network monitor graphs, and error graphs to identify transaction performance, network problems, and scenario results respectively.
  • Enhanced Vuser scripts with transactions, functions, parameterization and correlation.
  • Measured the response time at different points in the application using the SiteScope monitoring tool.
  • Configured web, application, and database server performance monitoring using the LoadRunner Controller and HP SiteScope.
  • Parameterized various links in the application for Functional/Integration testing.
  • Created and maintained Requirement Traceability matrix.
  • Performed UAT testing for each UAT release build.
  • Provided back end testing for database auditing and data validation using SQL scripts.
  • Built and executed SQL queries to verify data updates to various tables and ensure data quality and integrity.
  • Created detailed test status reports, performance capacity reports, web trend analysis reports, and graphical charts for upper management using the LoadRunner Analysis component.
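
A brief VuGen (C) sketch of the data-file parameterization and correlation described above; it assumes UserName and AccountId file parameters whose next-row policy (sequential, unique, or random) is set in the VuGen parameter list, and the field names, boundaries, and host are placeholders:

    Action()
    {
        // {UserName} and {AccountId} are file parameters substituted per
        // iteration according to their configured next-row setting.
        lr_output_message("Iteration data: user=%s account=%s",
                          lr_eval_string("{UserName}"),
                          lr_eval_string("{AccountId}"));

        // Capture a dynamic order id returned by the server for later steps.
        web_reg_save_param("OrderId", "LB=orderId=", "RB=&", "NotFound=WARNING", LAST);

        lr_start_transaction("Search_Account");

        web_url("search_account",
                "URL=https://example-app.local/search?user={UserName}&account={AccountId}",
                "Resource=0",
                "Mode=HTML",
                LAST);

        lr_end_transaction("Search_Account", LR_AUTO);

        return 0;
    }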

Environment: Agile Methodology, .NET, IIS 6.0, Oracle, UNIX, HP LoadRunner 9.10/9.51 (Oracle Web Applications, ODBC, Web HTTP/HTML), HP SiteScope 9.02, Quality Center, Test Creator.

Protocols: Web HTTP, Microsoft ADO.NET, JavaScript Vuser

Confidential

Performance Tester

Responsibilities:

  • Responsible for Requirements gathering for the new enhancement in Defect Tracking system.
  • Designing the Test Architecture and the Scenarios for the Automation.
  • Created and documented the Test Scenarios for each functional area mentioned in Test Plans to develop the test scripts (automated scripts).
  • Responsible for writing test case purposes, checkpoints, and test case steps.
  • Validated the test cases based on the functional specifications.
  • Modified the test cases, which were not following the specification flow.
  • Participated in internal and external reviews of the Test Description document and tracked the review records until closure.
  • Monitored the various machines during load tests and informed the corresponding teams in case of issues.
  • Developed LoadRunner scripts in Web, Web Services, and database protocols (a Web Services request sketch follows this list).
  • Responsible for database testing to verify records in the backend after update, modification, and deletion of records from the front end, and vice versa.
  • Handled day-to-day activities to client and was involved in meetings and calls with offshore management.
  • Executed system test cases and regression test cases.
  • Complete defect management and reporting.
  • Participated in regular meetings with developers for reviews, walkthroughs, and bug triage.
  • Reported bugs using Quality Center Bug Tracking system and verified fixes with every deployment.
  • Using Test Director for complete defect management and reporting.
  • Responsible for weekly status updates showing the progress of the automation testing effort.
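
A hedged VuGen (C) sketch of a Web Services style request of the kind listed above, sent as a raw SOAP envelope through web_custom_request; the endpoint, payload, and verification text are illustrative assumptions (a script generated from the service WSDL in the Web Services protocol would normally be used instead):

    Action()
    {
        // Check that the response contains the expected status element.
        web_reg_find("Text=<status>OK</status>", LAST);

        lr_start_transaction("GetCustomer_WS");

        web_custom_request("GetCustomer",
                           "URL=https://example-app.local/services/CustomerService",
                           "Method=POST",
                           "EncType=text/xml; charset=utf-8",
                           "Body=<soapenv:Envelope xmlns:soapenv=\"http://schemas.xmlsoap.org/soap/envelope/\">"
                                "<soapenv:Body><GetCustomer><id>1001</id></GetCustomer></soapenv:Body>"
                                "</soapenv:Envelope>",
                           LAST);

        lr_end_transaction("GetCustomer_WS", LR_AUTO);

        return 0;
    }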

Environment: Windows 2000/XP Professional, Oracle 9.2.0.5, Oracle 9i, UNIX, Quality Center 8.2, TestDirector, .NET, IBM WebSphere, and XML.
