
Jr. Performance Tester Resume


SUMMARY

  • 5 years of professional experience in software testing and quality assurance. Exposed to all phases of the Software Development Life Cycle and Software Testing Life Cycle. Possess strong analytical, verbal and interpersonal skills with the capability to work independently.
  • Experience in testing of Web based, Client/Server and SOA applications
  • Diverse experience in information technology with emphasis in Performance Testing and Performance Engineering using various performance testing and monitoring tools.
  • Experience in performance testing on multi-tier environments for different software platforms.
  • Expertise in developing Test Plans, Test Functional /Non-Functional automation scripts, creating test scenarios, analyzing test results, reporting Bugs/Defects, and documenting test results.
  • Expertise in different testing methodologies such as Agile, Scrum and Waterfall, as well as mobile testing.
  • Experience in collaborating with key stakeholders: business representatives, product/project managers, developers, DBAs, infrastructure leads, architects, middleware teams, etc.
  • Strong experience in designing, documenting, and executing test cases using Mercury Quality Center.
  • Created Performance scenarios and scripts for various types of tests (load, stress, baseline/ benchmark/ Capacity).
  • Good experience in Unit, Smoke, Sanity, Functional, Regression, Integration, System, Load, Performance, Stress and UAT testing methodologies.
  • Sound knowledge of load, stress and performance testing using various performance testing tools such as LoadRunner, Performance Center, ALM Performance Center, QA Load and JMeter.
  • Expertise in recording/Coding Vugen scripts using different protocols in all types of environments.
  • Expertise in parameterization, manual correlation, run time settings and C.
  • Excellent knowledge and skills in test monitoring for transaction response times, web server metrics, Windows/Linux/AIX system resources, web and app server metrics, database metrics, and J2EE performance.
  • Experience in analyzing Performance bottlenecks, Root cause Analysis and server configuration problems using Load Runner Monitors, Analysis, Site Scope and J2EE Diagnostics.
  • Knowledge of Java Virtual Machine internals including class loading, threads, synchronization, and garbage collection.
  • Monitored and analyzed the performance of web applications and database servers.
  • Worked on various Load Runner Protocols like Web (HTTP/HTML), Web Services, Siebel Web, FLEX, Citrix, SAP-GUI/Web, .NET, JAVA Vuser, Java Over HTTP, Sybase CTlib/DBlib, Oracle 2-Tier, Ajax and Multi Protocols.
  • Proficient in monitoring all tiers (Web, App, Network, DB, Mainframes) in QA, Pre-Prod (Performance) and Prod environments using various monitoring tools such as SiteScope, HP Diagnostics, Wily Introscope, Dynatrace, AppDynamics, TeamQuest, Perfmon, NMon, etc.
  • Well experienced in installing, configuring and maintaining the performance test tool environment (hands-on experience with all versions of LoadRunner and JMeter).
  • Prioritizes tasks, accomplishing deliverables on time while working independently with minimum supervision toward realizing targets.
  • Well versed with different defect tracking tools (Test Director, Quality Center, ALM, JIRA, Rational ClearQuest, etc.) for managing defects.
  • Experience in Tuning and resolving performance bottlenecks
  • Strong in analyzing functional and business requirements and developing test plans, test scripts and test cases, and executing them for manual and automated testing.
  • Did JVM tuning on the garbage collection, which is a key aspect of Java server performance, and revised JVM heap sizes after analyzing the performance of the application.
  • Took thread dumps and heap dumps for finding and analyzing the bottleneck areas.
  • Expert in deliverables like Test Report and Test Analysis (Weekly Status Report, Work Breakdown Structure, Defect Trend, etc.).
  • Good experience in engaging with business contacts and stakeholders for requirements gathering, architecture review and results analysis.
  • Good analytical, interpersonal and communication skills. Driven, committed and hardworking, with a quest to learn new technologies and undertake challenging assignments.
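As context for the JVM bottleneck analysis described above, a thread dump like the ones taken with jstack can also be captured programmatically. A minimal sketch using the standard java.lang.management API (class and method names here are illustrative, not from any project mentioned in this resume):

```java
import java.lang.management.ManagementFactory;
import java.lang.management.ThreadInfo;
import java.lang.management.ThreadMXBean;

// Minimal sketch: capture an in-process thread dump -- the same data a
// jstack thread dump reports -- via the standard ThreadMXBean API.
public class ThreadDumpSketch {
    public static String capture() {
        ThreadMXBean mx = ManagementFactory.getThreadMXBean();
        StringBuilder sb = new StringBuilder();
        // true, true -> include locked monitors and locked synchronizers,
        // which is what makes deadlock/contention analysis possible.
        for (ThreadInfo info : mx.dumpAllThreads(true, true)) {
            sb.append(info.toString());
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        // Every running JVM has at least the "main" thread,
        // so the dump is never empty.
        System.out.println(capture().isEmpty() ? "empty" : "dump captured");
    }
}
```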

TECHNICAL SKILLS

Testing Tools: HP Load Runner 8.1/9.5/11.0/11.50/12.02/12.50, HP Performance Center 9.5/ 11.0/11.5/12, ALM 12.50, HP Quality Center, JMeter 2.5/2.7/2.8/2.9/2.10, SOAP UI and UFT

Languages: Microsoft C#, C, C++, .Net, Java

Markup/Scripting Languages: DHTML, CSS, jQuery, JavaScript, XML, HTML

RDBMS: Microsoft SQL Server, Microsoft Access, Oracle Database

Operating Systems: AIX, HP-UX, Solaris, UNIX, Windows XP/2003/2000/Vista, Windows NT and Linux

Monitoring Tools: Performance Center, Wily Introscope, SiteScope, Dynatrace, AppDynamics, HP Diagnostics, Transaction Viewer, Splunk, Windows Performance Monitor, NMon

PROFESSIONAL EXPERIENCE

Confidential

Jr. Performance Tester

Responsibilities:

  • Involved in requirements gathering and developing performance test plans.
  • Designed performance test suites by creating Web (HTTP/HTML), Web Services, Java over HTTP and Click & Script test scripts and the workload model. Extensively used VuGen and JMeter to create test scripts.
  • Identified system/application bottlenecks and worked to facilitate tuning of the application/environment in order to optimize capacity and improve performance of the application under peak workloads generated via LoadRunner to simulate activity.
  • Created scenarios to emulate concurrent Vusers, tested using rendezvous points in the Vuser scripts, and executed the Vuser scripts in various scenarios, both goal-oriented and manual, using LoadRunner.
  • Correlated and parameterized test scripts to capture dynamic values and input various test data as per business requirements. Involved in the creation of the PT calendar, which gives the details of the load test schedules.
  • Developed and executed the test scripts for Baseline, Load, Stress and Endurance (Soak) testing.
  • Performed load, stress and smoke tests for the scripts created in UFT by simulating multiple users in LoadRunner.
  • Parameterized test data to accurately depict production trends.
  • Performed Baseline, Load, Stress and Endurance (Soak) testing by simulating multiple Vusers using LoadRunner.
  • Inserted transaction points to measure the performance of the application in LoadRunner and JMeter.
  • Validated the scripts by running multiple iterations to make sure they executed correctly and met the scenario description.
  • Executed test scenarios and monitored Web, App and DB servers' usage of CPU, heap, throughput, hit ratio, transaction response time and garbage collections.
  • Prepared test analysis and test result summaries in HTML reports and presented them with graphs in Excel sheets for all runtime metrics.
  • Monitored application behavior with AppDynamics/Dynatrace by creating custom dashboards and reports.
  • Wrote text checks and created scenarios for concurrent users. Configured run-time settings for HTTP iterations. Used maximum bandwidth speed to bring the testing scenario closer to the real world.
  • Monitored CPU, memory, transaction response time, deadlocks, thread count, hogging thread count, queue length and throughput while running Baseline, Load, Stress and Soak testing.
  • Used JMeter for load testing client/server applications and web services, providing performance analysis and testing server/script/object behavior under heavy concurrent load.
  • Wrote high-level LoadRunner scripts for a single user by storing dynamically varying object IDs in variables and validating correct downloads of HTML pages by validating content in sources.
  • Parameterized unique IDs, stored dynamic content in variables and passed the values to web submit requests under HTTP protocols.
  • Used ALM and JIRA as defect tracking tools to track, report and coordinate with various groups from initial finding of defects to final resolution.
  • Uploaded the scripts to GitHub.
  • Configured the Jenkins servers and wrote Groovy code to execute the pipeline from Jenkins.
  • Facilitated various performance-tuning exercises and worked with developers to resolve performance-related issues.
  • Created and delivered detailed performance test results, documented test findings, and made recommendations on application performance during the tests to relevant stakeholders, including developers, DBAs, Product Owners, and Project Managers.
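The correlation step described above (capturing a dynamic server value so the next request can replay it, which VuGen automates with web_reg_save_param) can be sketched outside the tool. In this illustrative sketch the response body and the "sessionId" attribute are hypothetical examples, not taken from any project above:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Sketch of manual correlation: extract a dynamic token from a server
// response with a left/right-boundary-style regex, so a follow-up
// request can be parameterized with it. The "sessionId" attribute and
// the sample response body are hypothetical.
public class CorrelationSketch {
    public static String extractToken(String responseBody) {
        Matcher m = Pattern.compile("sessionId=\"([^\"]+)\"").matcher(responseBody);
        return m.find() ? m.group(1) : null; // null -> correlation failed
    }

    public static void main(String[] args) {
        String body = "<input type=\"hidden\" name=\"sid\" sessionId=\"A1B2C3\"/>";
        // In a load script, the captured value would replace the recorded
        // hard-coded token in the next HTTP submit.
        System.out.println("token=" + extractToken(body));
    }
}
```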

Environment: HP LoadRunner 12.0/12.5, HP Unified Functional Testing (UFT), Performance Center 11.5/ALM, JMeter, Jenkins, GitHub, Dynatrace, AppDynamics, Windows XP, VuGen, Integration Servers, Windows 2008, Windows Vista, web applications, XML files, JConsole, JVisualVM, SOAP UI.

Confidential, Dallas, TX

Sr. Performance Test Analyst

Responsibilities:

  • Analyzed Business requirements and use case documents and developed Performance test cases; test scenarios and test plan accordingly.
  • Created Test Strategy, Test plan, Test closure reports, and metric reports for all Performance Testing efforts.
  • Handled projects independently and led projects with an offshore team.
  • Planned, designed, Implemented and Executed Stress/Load/Performance Testing.
  • Coordinated, scheduled and implemented the testing phase for new applications as well as upgrades to existing applications.
  • Responsible for scripting and analyzing results using JMeter and HP LoadRunner 12.0/12.50.
  • Extensively used Load Runner Protocols Web HTTP/HTML, Siebel Web and Mobile Web Http/Html Protocols.
  • Responsible for creating scripts and running performance test for Android Applications as part of Mobile performance testing.
  • Performed endurance tests by executing the test for longer hours in order to record the memory footprint and find any memory leaks, slow resource consumption problems, etc.
  • Analyzed the results of the tests, which were used to assist in the identification of system defects, bottlenecks and breaking points.
  • Involved in working with development and other concerned teams to fix performance issues.
  • Reported Defects in Defect Tracking Tool (JIRA).
  • Simulated hundreds of concurrent users using Load Generator while monitoring both end-user response times and detailed infrastructure component performance (Servers, Databases, and Networks etc.)
  • Conducted Duration test, Stress test, Baseline test and several other performance tests.
  • Monitored application behavior with AppDynamics/Dynatrace by creating custom dashboards and reports.
  • Used Splunk to monitor the logs in depth for all the servers of the application.
  • Used Dynatrace and JProbe for profiling the application to find where the performance issues were.
  • Did deep diagnostics using Dynatrace; monitored DB and application servers to troubleshoot the root cause of problems.
  • Carried out deep diagnostics using Dynatrace to capture memory leaks in the application by carrying out longevity tests.
  • Used Dynatrace to measure website performance in the test environment to capture performance metrics of key product features.
  • Analyzed heap behavior, throughput and pauses in garbage collections, as well as tracking down memory leaks.
  • Tuned abnormally long GC pauses by breaking them down into smaller incremental pauses.
  • Reduced the number of full GCs and the associated CPU spikes under high-memory conditions by increasing heap size, thereby eliminating JVM abnormalities.
  • Made capacity planning/sizing recommendations (e.g., increased JVM heap memory, JVM database connections, additional JVMs, additional hosts) based on current production metrics/capacity baselines.
  • Monitoring application health using Dynatrace and reviewing performance of different components of web pages, also comparing daily, weekly and monthly trends for deep down analysis.
  • Setting up user profiles, configuring and adding application servers on Dynatrace tool.
  • Reviewing web components using Dynatrace client, analyzing and giving feedback to improve performance.
  • Provided tuning support with metrics from AWR reports and explain plans.
  • Worked closely with the development and business teams to get an understanding of the system architecture, system component interactions, application load pattern and the performance SLAs.
  • Developed and maintained a performance library with a focus on potential reuse.
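The GC pause and throughput numbers analyzed above come from counters every JVM exposes. A minimal sketch of reading them with the standard GarbageCollectorMXBean API (collector names vary by JVM, so the code sums across whatever collectors are present):

```java
import java.lang.management.GarbageCollectorMXBean;
import java.lang.management.ManagementFactory;

// Minimal sketch: read cumulative GC counts and pause time per collector,
// the raw numbers behind GC pause/throughput analysis and heap sizing.
public class GcStatsSketch {
    public static long totalGcTimeMillis() {
        long total = 0;
        for (GarbageCollectorMXBean gc : ManagementFactory.getGarbageCollectorMXBeans()) {
            // getCollectionTime() returns -1 if the collector does not
            // report elapsed time; treat that as zero.
            long t = gc.getCollectionTime();
            if (t > 0) total += t;
        }
        return total;
    }

    public static void main(String[] args) {
        System.gc(); // request a collection so the counters move (best effort)
        for (GarbageCollectorMXBean gc : ManagementFactory.getGarbageCollectorMXBeans()) {
            System.out.println(gc.getName() + ": " + gc.getCollectionCount() + " collections");
        }
        System.out.println("cumulative GC pause ms: " + totalGcTimeMillis());
    }
}
```

Sampling these counters at intervals during a soak test gives the pause-time trend that tools like Dynatrace chart.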

Environment: HP LoadRunner 12.0/12.5, Performance Center 11.5/ALM, Dynatrace, AppDynamics, Splunk, Android/iOS, mobile performance testing, Windows XP, VuGen, Integration Servers, Windows 2008, Windows Vista, web applications, XML files, JConsole, JVisualVM, SOAP UI.

Confidential, Clearwater, FL

QA Analyst

Responsibilities:

  • Performed End-to-End system testing and reported defects in Quality Center.
  • Maintained knowledge of Medicare and Medicaid rules and regulations pertaining to Facets and evaluated the impact of proposed changes in rules and regulations.
  • Carried out various types of testing at teh deployment of each build including Unit, Sanity, & Smoke Testing
  • Provided management with test metrics, reports, and schedules as necessary using MS Project and participated in design walkthroughs and meetings.
  • Worked with the clients on the final sign-off process in the User Acceptance stages.
  • Maintained the Test Case Execution Matrix.
  • Processed test files using web services and made sure they were converted into standard XML.
  • Prepared and executed test scripts using Selenium IDE, WebDriver and Eclipse IDE.
  • Ran tests using Selenium IDE on Firefox browsers.
  • Performed Browser Compatibility Testing and Web Testing.
  • Participated in daily Scrum meetings and provided feedback from QA standpoint. Also worked as Scrum Master in certain stand up meetings.
  • Executed SQL queries to retrieve data from databases to validate data mapping.
  • Tested and delivered Inbound/Outbound Facets interfaces
  • Wrote and executed complex SQL queries to validate successful data migration and transformation.
