Performance Tester Resume
Miami, FL
PROFESSIONAL SUMMARY:
- 5+ years of IT experience as a Performance/Software Tester using tools like LoadRunner/Performance Center, JMeter, Gatling, Postman, Jenkins, GitHub, AWS, TFS, and various other testing tools
- Experience in understanding technical and non-functional specifications (test environments, test methodologies, test strategies, and test procedures), writing test plans and test scenarios, and reviewing and monitoring test approaches
- Extensive experience in working with stakeholders to understand business requirements for testing purposes
- Expert in scripting, debugging, performance testing and performance tuning of various applications
- Extensive experience in using LoadRunner components (VuGen, Controller, Analysis, Load Generator) and Performance Center
- Experience in writing test plans and test strategies, and in planning, scripting, executing, analyzing, and reporting based upon system requirement specifications or SLAs
- Extensively used LoadRunner VuGen to develop scripts
- Expertise in parameterization, correlation, run-time settings, and C functions to enhance scripts and make them ready for performance testing
- Working experience in Load Testing, Stress Testing, Endurance Testing and Spike testing
- Involved in monitoring using Splunk, New Relic, HP SiteScope, AppDynamics, HP Diagnostics and Dynatrace
- Strong knowledge of all phases of Software Development Life Cycle (SDLC) and Application Life Cycle Management (ALM) with concentration on implementation of performance testing for various applications
- Expertise in manual as well as automatic correlation of dynamic values returned by the server
- Highly experienced in scripting of Web (HTTP/HTML), Web Services, and TruClient Web using LoadRunner VuGen
- Proficient in performance test tools like LoadRunner and JMeter
- Experience in writing and executing scripts using HP Performance Center, LoadRunner, and JMeter
- Well experienced with the LoadRunner Controller and its functions, such as threads, rendezvous points, ramp-up rate, duration, and ramp-down rate
- Expertise in creating test scenarios, analyzing test results, reporting bugs/defects, and documenting test results in various formats
- Experience in analyzing performance bottlenecks, root causes and server configuration problems using Performance Center, Monitor and LR Analysis
- Well versed with Adobe Reader, Adobe Flash Player, MS Project, MS PowerPoint, MS Excel, MS Word, and other MS Office applications
- Strong skills and working experience in different types of performance and reliability testing (Load, Stress, Endurance, Capacity, Volume, Scalability, Reliability, etc.) using LoadRunner
- Excellent working experience in a fast-paced environment using agile methodologies
- Knowledge on Performance Tuning Activities
- Hands-on experience in working with technologies like Java, Pega, and .Net enterprise applications
- Managed performance across the application lifecycle by using Application Performance Management tool suite
- Hands-on experience with Rally, JIRA, TFS, and Quality Center test management tools
- Documented, tracked, and escalated issues using TFS
- Experience with defect tracking and analysis using defect tracking tools such as TFS (Team Foundation Server) and Quality Center
- Strong knowledge of software architectures like client-server, n-tier, web services and service-oriented architecture
- Coordinated activities related to one or more modules, investigating software defects and interacting with developers to resolve technical issues
- Able to prioritize and work within tight time scales
- Experience in working onsite/offsite and offshore models
- Excellent problem-solving skills, a strong technical background, and good interpersonal skills
- At ease in high-stress environments requiring the ability to effectively handle multiple levels of responsibility
- Flexible and versatile to adapt to new environments and work on new projects
- Sound understanding of all technical aspects of testing.
- Driven to get the job done, with a willingness to work out of hours at short notice. Ability to work around setbacks and blocks to achieve the solution
- Excellent problem-solving skills and the ability to find solutions to issues, work around conflicting priorities (different test phases/projects all with different priorities) with a view to commercial reality
- Experience in understanding different application architectures to find potential performance bottlenecks or security vulnerabilities
- Ability to recommend design changes to improve performance of the application
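The correlation and parameterization bullets above can be illustrated with a minimal sketch. This is my own simplified Java illustration of what a boundary-based extractor (in the spirit of VuGen's web_reg_save_param) does; the boundaries, field names, and session-id format are hypothetical, not taken from any actual project script.

```java
// Simplified illustration of boundary-based correlation: capture the dynamic
// value that sits between a left boundary and a right boundary in a server
// response, so the next request can reuse it. All names below are hypothetical.
public class BoundaryCorrelation {

    // Returns the text between leftBoundary and rightBoundary, or null if absent.
    static String extractBetween(String response, String leftBoundary, String rightBoundary) {
        int start = response.indexOf(leftBoundary);
        if (start < 0) return null;
        start += leftBoundary.length();
        int end = response.indexOf(rightBoundary, start);
        return end < 0 ? null : response.substring(start, end);
    }

    public static void main(String[] args) {
        String response = "<input name=\"sessionId\" value=\"A1B2C3\"/>";
        // Correlate the dynamic session id out of the response body.
        System.out.println(extractBetween(response, "value=\"", "\"/>")); // A1B2C3
    }
}
```

In VuGen the same capture would be declared before the request that returns the value; here the point is only the left-boundary/right-boundary mechanic itself.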
TECHNICAL SKILLS:
Testing Tools: Micro Focus LoadRunner, Micro Focus Performance Center, Splunk, New Relic, Apache JMeter, HP Quality Center/Test Director, HP SiteScope, Wily Introscope, HP Diagnostics, SoapUI, HP QTP/UFT, Bitbucket, Crank
Databases: MS Access, MS SQL Server Management Studio R2, Oracle 10g, DB2, SQL
ECommerce: VBScript, Application Server, WebLogic, Backend, C
Web Technologies: HTML/XHTML, XML/JSON, PHP, JAVA, .Net
Protocols: Web (HTTP/HTML), SAP Web, Web Services, TruClient Web, SAP GUI, Web Click and Script, TruClient, Oracle NCA, Citrix ICA
Operating Systems: Windows NT/2000/2003/XP/Vista/7/10, UNIX/OS X
PROFESSIONAL EXPERIENCE:
Confidential, Miami, FL
Performance Tester
Responsibilities:
- Gathered the NFRs (non-functional requirements) from the client
- Identified the business-critical transactions and generated the test scripts
- Migrated the existing load test scripts from Gatling to JMeter
- Recorded the JMeter scripts using BlazeMeter and implemented the scenarios based on the Gatling project
- Established the database connection and added JDBC requests to fetch values from SQL queries
- Captured the security token using a Boundary Extractor
- Added the required assertions, matching those in the Gatling scripts
- Passed generic values into the JMeter scripts in the form of user-defined variables
- Captured dynamic values using the Regular Expression and XPath Extractors and passed them into subsequent requests as variables
- Created unique usernames with the help of a JSR223 sampler
- Created the load test simulation using the Stepping Thread Group in JMeter
- Ran the tests from a local Linux server and generated reports by running the scripts in non-GUI mode
- Developed, enhanced, and applied different logic in the Apache JMeter scripts for load and stress testing
- Emulated the real-world scenarios for the identified business workflow
- Coordinated non-functional test activities across multiple internal and external teams
- Executed different performance tests (Smoke Test, Baseline Test, Load Test, Stress Test, Capacity Test, and Endurance Test).
- Interacted with developers, DBAs, infrastructure, the Scrum Master, the Project Manager, and other team members when raising defects
- Worked closely with software developers and took an active role in ensuring that software components meet the highest quality standards
- Created various Apache JMeter scripts based upon the critical transactions/workflows used by real users
- Monitored memory, CPU, disk, and network graphs during load test execution
- Performed bi-weekly production analysis through Splunk
- Used JMeter to execute load tests over the Web Services protocol
- Used Jenkins to automate batch testing and load testing
- Created a CI/CD pipeline in Jenkins to execute the performance tests
- Developed and deployed test load scripts to do end to end performance testing using Performance Center
- Worked on Pega applications: recorded JMeter scripts through the BlazeMeter extension, identified WebSockets in Pega, and configured Pega handling in JMeter
- Implemented and maintained an effective performance test environment.
- Identified and eliminated performance bottlenecks during the development lifecycle.
- Verified the stability of the application (i.e., that the system was free of memory leaks) using long-duration endurance tests
- Created Jira tickets and accurately produced regular project status reports to senior management to ensure on-time project launch
- Created JMeter scripts using the BlazeMeter extension and executed them through Jenkins
- Debugged and executed the JMeter scripts
- Extensively worked in JMeter to create Thread Groups and test the web application under various loads on key business scenarios
- Ongoing role within a small Agile software product development team
- Experience creating and executing tests within an Agile continuous integration environment
- Performance test development for 'Continuous Benchmarking' using JMeter and Jenkins
- Designed, developed, and executed load tests using JMeter and Jenkins
- Provided support in performance testing using JMeter; tasks included developing test plans, test scripts, and reports
- Developed performance test suites, creating thread groups and setting up samplers using JMeter
- Involved in Localization testing and Performance testing of web-based modules, handled Load testing using JMeter
- Presently working in an Agile development environment; participate in weekly Scrum meetings for application development
- Developed scenario-based tests for the JMeter scripts
- Created, scheduled, and ran the scenarios using JMeter and generated the necessary graphs
- Recorded scripts using the BlazeMeter extension and converted them to JMeter
- Created and executed JMeter scripts for performance testing of the portal
- Used the APM tools Dynatrace and AppDynamics to monitor end-user experience, overall application performance, business transaction performance, and application infrastructure performance across all tiers (web/app/DB); added Dynatrace headers to the VuGen scripts to monitor response times closely
- Used Splunk to check whether messages were being triggered at the back end
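As a sketch of the JMeter scripting steps above: in a JSR223 sampler this logic would normally be written in Groovy, but the plain-Java version below shows the same two ideas described in the bullets, building a unique username per virtual user and capturing a token from a response with a regular expression. The username format and token pattern are hypothetical, chosen only for illustration.

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Illustration of logic the JSR223-sampler and Regular-Expression-Extractor
// bullets describe. Names and formats below are assumptions, not project code.
public class JMeterStyleHelpers {

    // Derive a unique username from the thread (virtual-user) number and
    // the test iteration, as a JSR223 sampler might do.
    static String buildUsername(int threadNum, int iteration) {
        return String.format("loaduser_%03d_%03d", threadNum, iteration);
    }

    // Mirror a Regular Expression Extractor: capture the first group of a
    // token pattern from the previous sampler's response body.
    static String extractToken(String responseBody) {
        Matcher m = Pattern.compile("\"token\"\\s*:\\s*\"([^\"]+)\"").matcher(responseBody);
        return m.find() ? m.group(1) : null;
    }

    public static void main(String[] args) {
        System.out.println(buildUsername(7, 12));                          // loaduser_007_012
        System.out.println(extractToken("{\"token\":\"abc123\",\"ttl\":300}")); // abc123
    }
}
```

In a real test plan the extracted value would be stored in a JMeter variable so the following requests can reference it, exactly as the "passed them into subsequent requests as variables" bullet describes.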
Confidential, Arlington, VA
Performance Tester
Responsibilities:
- Attended all the project meetings to understand the project business flow
- Generated scripts in the VuGen component of LoadRunner and in JMeter
- Customized scripts by doing correlation and parameterization.
- Applied different logic using the C language in VuGen scripts
- Executed scripts in Controller
- Monitored while scripts were executing
- Forwarded the execution reports to the Client
- Performed performance testing using LoadRunner and BlazeMeter
- Analyzed typical business day traffic scenario discussing the requirements and business activity with the end users and business analysts
- Verified the systems logs to identify the highly hit pages and modules of the application
- Identified the key scenarios impacting application response time and created VuGen scripts using the Web (HTTP/HTML) and Java protocols
- Monitored performance tests using Wily Introscope
- Customized VuGen scripts, including features like documentation, transactions, rendezvous points, and think time
- Parameterized test scripts and provided the required unique test data for multiple users.
- Created Test Scenarios to meet the requirements by creating the necessary User groups, allocation of Vusers for scripts, configuring the load generators and set up of execution schedules
- Participated in Performance analysis meetings and helped the project team in tuning the application and improving the application response time
- Created performance scripts in JMeter, VSTS, and VuGen
- Prepared different LoadRunner scenarios as per test plan
- Developed a comprehensive Performance Test Plan and executed all aspects of the plan. Ensured that the implementation, key user types, locations, and scenarios are assessed as part of overall plan
- Worked with the implementation team to identify critical business processes and key application transactions to define performance requirements
- Coordinated all aspects of test design, planning, and execution with product vendor and service providers
- Created Scripts for Web Services and Executed from Performance Center
- Worked on Rest API, Created Scripts for Rest API and Executed from Performance Center
- Closely worked on SOAP UI and conducted Functional Testing
- Conducted Analysis of Testing Results and recommended solutions for software and infrastructure tuning
- Used New Relic to troubleshoot application bottlenecks and identify which modules were consuming excessive resources
- Measured response times, including API response times, during performance testing of Ajax TruClient protocol scripts
- Worked with various protocols like Web (HTTP/HTML), Web (Click and script), Ajax (Click & Script), LR Java protocol for performance testing on HP LoadRunner 12.5
- Worked with the business team to set expectations and establish key performance benchmarks to use in end user communications as well as QA/Performance Team.
- Set up processes to monitor application performance on a regular basis and take remedial actions
- Ensured all Performance Work stream deliverables are delivered on time and on budget.
- Developed status reports and communicates appropriate level of detail on Testing, Results, and Issues to team members and senior management
- Identified the critical transactions to be load tested and baselined performance using ALM Performance Center
- Executed different performance tests (Smoke Test, Baseline Test, Load Test, Stress Test, etc.)
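A minimal sketch of the arithmetic behind the Vuser allocation described above (creating test scenarios and allocating Vusers for scripts): by Little's Law, the concurrency a Controller scenario needs is the target throughput times the time each iteration holds a user. The numbers below are illustrative assumptions, not figures from any actual engagement.

```java
// Back-of-envelope Vuser sizing via Little's Law:
//   Vusers = target TPS * (average response time + think time) in seconds.
// Purely illustrative; real scenarios also account for pacing and ramp-up.
public class VuserSizing {

    static int vusersFor(double targetTps, double avgResponseSec, double thinkTimeSec) {
        return (int) Math.ceil(targetTps * (avgResponseSec + thinkTimeSec));
    }

    public static void main(String[] args) {
        // e.g. 50 tx/s with a 1.5 s response time and 8.5 s think time
        System.out.println(vusersFor(50, 1.5, 8.5)); // 500
    }
}
```

The same formula works in reverse when a scenario's Vuser count is fixed and the achievable throughput needs to be estimated.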
Confidential, Dallas, Texas
Performance/QA Tester
Responsibilities:
- Involved in analyzing the Functional Requirements.
- Designed and Created Test Cases using Test Director
- Project was developed following Agile and Scrum methodologies.
- Checked user profiles and their login scripts.
- Modified the Test Plan and Test Scripts.
- Checked the basic functionality by Manual testing
- Involved in examining the severity and priority of the Defects and Test Results Reporting
- Performed User Acceptance Testing (UAT), interacted with users for execution of test cases in UAT
- Performed Functional, Integration, and Regression Testing under various browsers
- Verified the data from backend by running the SQL and PL/SQL Queries
- Reported various defects in user-friendly format using Quality Center as a test management tool and defect-tracking tool
- Performed End-to-End system testing and reported defects in Quality Center.
- Maintained knowledge of Medicare and Medicaid rules and regulations pertaining to Facets and evaluated the impact of proposed changes in rules and regulations
- Carried out various types of testing at the deployment of each build including Unit, Sanity, & Smoke Testing
- Provided the management with test metrics, reports, and schedules as necessary using MS Project and participated in the design walkthroughs and meetings
- Worked with the clients on the final signing process in the User Acceptance stages
- Maintained the Test Case Execution Matrix
- Processed test files using web services and ensured they were converted into standard XML
- Performed Browser Compatibility Testing and Web Testing
- Participated in daily Scrum meetings and provided feedback from QA standpoint. Also worked as Scrum Master in certain stand-up meetings
- Executed SQL queries to retrieve data from databases to validate data mapping
- Tested and delivered Inbound/Outbound Facets interfaces
- Wrote and executed complex SQL queries to validate successful data migration and transformation
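The data-migration validation bullets above boil down to comparing snapshots pulled from the source and target databases. The sketch below assumes each SQL query's result has already been reduced to a key-to-value map (e.g. primary key to a row checksum); the keys and values shown are hypothetical examples, not real project data.

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.Map;
import java.util.Objects;

// Illustrative migration check: report every source key whose value is
// missing or different in the target snapshot. Assumes both snapshots were
// produced by prior SQL queries; the data below is made up for the example.
public class MigrationCheck {

    static List<String> mismatchedKeys(Map<String, String> source, Map<String, String> target) {
        List<String> bad = new ArrayList<>();
        for (Map.Entry<String, String> e : source.entrySet()) {
            if (!Objects.equals(e.getValue(), target.get(e.getKey()))) {
                bad.add(e.getKey());  // missing in target also counts as a mismatch
            }
        }
        Collections.sort(bad);
        return bad;
    }

    public static void main(String[] args) {
        Map<String, String> src = Map.of("1001", "A", "1002", "B");
        Map<String, String> tgt = Map.of("1001", "A", "1002", "X");
        System.out.println(mismatchedKeys(src, tgt)); // [1002]
    }
}
```

A symmetric pass over the target's keys would additionally catch rows present in the target but absent from the source.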