Performance Tester Resume
Suitland, MD
SUMMARY
- Performance tester with six years of experience testing Web-based and Client/Server applications across a variety of technologies, platforms, and domains using performance, manual, and automation testing (Web services).
- As a Software Quality Assurance Test Engineer, experienced in all phases of the software development lifecycle: developing test cases and test plans, performance testing, manual testing, and automation testing using LoadRunner, Performance Center, JMeter, Badboy, UFT (Unified Functional Testing, the successor to HP QTP and HP Service Test), and automating Web services.
- Experience in Functional and non-functional testing, Load testing, Stress testing, Soak testing, Volume testing, Scalability testing, Integration testing, Regression testing, GUI testing, Back-end testing, Browser Compatibility testing, Black Box Testing, and System Testing.
- Significant exposure to test tools such as LoadRunner, Performance Center, JMeter, Badboy, QTP/HP UFT, SoapUI, and Quality Center.
- For project reporting and bug tracking: HP ALM, Jira, and Bugzilla.
- Expertise in performance, manual, and automation testing (Web services) under different methodologies such as Agile/Scrum, V-model, and Waterfall. Good skills in data analysis, problem solving, test strategies, and test cases.
- Experience using monitoring tools such as New Relic, Dynatrace, and SiteScope to drill down into performance issues, including concurrency and component-level problems.
- Experience with DBMSs (MySQL, MS SQL Server, Oracle, and IBM DB2) for database testing and for retrieving parameters for application enhancement.
- Experience with application server environments (WebLogic 12c, JBoss, and WebSphere), investigating thread contention using monitoring tools.
- Experience in Functional testing, Unit testing, Integration testing, System testing, Regression testing, GUI testing, back-end testing.
- Tested Web services by validating XML, JSON, and WSDL using LoadRunner and SoapUI.
- Expertise in QA Testing in distributed Unix/Linux and Windows Environment and performed end-to-end testing.
- Experience in preparing, analyzing, executing Performance Test Plans and Test Cases.
- Identifying the test environment, identifying and analyzing performance acceptance criteria, and planning and designing the test workload model (how to spread load across scripts according to functionality).
- Configuring the test environment with the consent of system administrators and developers, and launching the application in the Virtual User Generator (VuGen).
- Maintaining test data in Excel files, importing them from the Resources module in Quality Center (HP ALM) into the HP UFT data table, and reading values based on functional requirements.
- Performing performance and load testing with LoadRunner scripts generated in VuGen (HTTP/HTML protocol) and with JMeter (HTTP).
- Experience creating virtual user scripts, defining user behavior, and conducting load test scenarios: inserting transaction points, rendezvous points, content checks, correlations, parameterizations, and comments into Vuser scripts to better understand load conditions.
- Experience using LoadRunner to analyze application performance under varying load and stress conditions.
- Extensive experience in web-based application testing: system testing, functional testing (Web services), integration testing, regression testing, back-end testing, and UAT.
- Familiar with all phases of the Software Development Life Cycle (SDLC) and STLC, including Agile, Waterfall, and V-model.
- Experience designing and executing test scenarios in the Controller.
- Experience identifying the performance bottlenecks of a website under heavy load using LoadRunner and checking its compatibility on different browsers.
TECHNICAL SKILLS
TESTING TOOLS: LoadRunner, JMeter, Badboy, Performance Center, HP ALM (Quality Center), Unified Functional Testing (UFT), SoapUI
OPERATING SYSTEMS: Windows, UNIX (Solaris), Linux (Red Hat Enterprise, Ubuntu LTS)
PROGRAMMING AND SCRIPTING: C, Java, XML, JavaScript
RDBMS: MS SQL Server, IBM DB2, Oracle, MySQL
WEB TECHNOLOGY/PROTOCOLS: WebServices, WSDL, SOAP, HTTP, HTML, XML, JavaScript, CSS
PROFESSIONAL EXPERIENCE
Confidential, Suitland, MD
Performance Tester
Responsibilities:
- Web services testing using LoadRunner, SoapUI, JMeter, and Badboy; validated requests and responses in XML and JSON formats.
- Used JMeter to load test SOAP and REST APIs and validate responses with assertions, adding SOAP/XML-RPC Request samplers.
- Parameterization in JMeter by adding a CSV Data Set Config; correlation in JMeter through the Regular Expression Extractor and Regular Expression Tester; debugging scripts with the Debug Sampler, Debug PostProcessor, and log viewer; generating JMeter reports.
- Browser Compatibility testing, Black Box testing, and System Testing.
- Complete performance requirement analysis with the client and developers, and develop the Performance Test Plan.
- Identify test scenarios and prepare test cases for UAT test phase.
- Execute Integration and Smoke / Build Acceptance testing manually.
- Use ALM to track, fix and maintain Defects and to produce defect reports.
- Extensively use LoadRunner's Web (HTTP/HTML) and Web Services protocols.
- Develop, maintain, and upgrade automated test scripts and architectures for testing application products load behavior under LoadRunner.
- Insert rendezvous points to create intense load on the server and thereby measure server performance.
- Complete performance measurements for Oracle and WebSphere servers in the LoadRunner Controller, monitoring online transaction response times, hits/sec, TCP/IP connections, throughput, CPU utilization, memory utilization, various HTTP requests, etc.
- Identify the test environment, identify and analyze performance acceptance criteria, and plan and design the test workload model (spreading load across scripts according to functionality).
- Configure the test environment; hold stakeholder walkthrough meetings; write test cases according to the modules and features of the application.
- Configure the test environment with the consent of system administrators and developers, and launch the application in VuGen.
- Create load test scripts by capturing the client's business processes in VuGen (LoadRunner 12.02) and enhance the scripts with correlation, parameterization, content checks, and rendezvous points.
- Develop baseline, load, stress, and endurance tests of the application by creating virtual users in the Controller.
- Monitor tests with monitoring tools to identify the performance bottlenecks (concurrency and component issues) of the application under heavy load using LoadRunner, and check compatibility on Internet Explorer browsers.
- Analyze test results using the LoadRunner Analysis tool, generate reports, and offer recommendations.
- Attend functional and technical presentations, walkthrough meetings, and Subject Matter Expert meetings, and provide feedback to managers.
- Prepare and present daily, weekly, and monthly (high-level) status reports to the project's various stakeholders.
- Worked in application server environments (WebLogic 12c, JBoss, and WebSphere), investigating thread contention using monitoring tools.
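The scripting work above can be illustrated with a minimal VuGen Action in LoadRunner's C-based scripting language, showing transactions, a content check, correlation, parameterization, think time, and a rendezvous point together. The URL, correlation boundaries, and parameter names are placeholders (not from any real project), and the script runs only inside VuGen, not as standalone C:

```c
Action()
{
    /* Correlation: capture a dynamic session id from the next server
       response so later requests can replay it (boundaries hypothetical). */
    web_reg_save_param("SessionId", "LB=sessionId=\"", "RB=\"", LAST);

    /* Content check: fail the step if the expected text is absent. */
    web_reg_find("Text=Welcome", LAST);

    lr_start_transaction("Login");
    web_submit_data("login",
        "Action=https://example.com/login",
        "Method=POST",
        "Mode=HTML",
        ITEMDATA,
        /* {pUser}/{pPassword} come from a parameter file (parameterization). */
        "Name=username", "Value={pUser}", ENDITEM,
        "Name=password", "Value={pPassword}", ENDITEM,
        LAST);
    lr_end_transaction("Login", LR_AUTO);

    lr_think_time(5);              /* emulate real user pacing          */
    lr_rendezvous("peak_load");    /* synchronize vusers for a spike    */

    lr_start_transaction("Search");
    web_url("search",
        "URL=https://example.com/search?sid={SessionId}",
        "Mode=HTML",
        LAST);
    lr_end_transaction("Search", LR_AUTO);

    return 0;
}
```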
Environment: LoadRunner, Performance Center, Unified Functional Testing, WebLogic, J2EE, DB2, Oracle, Visual Basic, Windows, and UNIX.
Confidential, Reston, VA
Performance Tester
Responsibilities:
- Create, customize and edit LoadRunner scripts for identified business processes.
- Create load testing scenarios that simulate real life application usage and its business processes.
- Create scripts using LoadRunner protocols such as HTTP/HTML and Web Services, and execute the scripts with virtual users.
- Using LoadRunner 12.02 and 12.50, analyze results and generate various reports for upper management.
- Work with the development, network teams and other project teams to tune the test environment as necessary.
- Extensively worked on Vugen and used Controller to perform Load Test and Stress Test.
- Define the performance test strategy, performance test cases, and load scripts; document issues and re-test software fixes to ensure they are resolved.
- Identify bottlenecks in application performance, conduct analysis by monitoring the available graphs, and provide recommendations to supervisors to improve performance.
- Work across all LoadRunner components: Virtual User Generator, Controller, and Analysis.
- Test with up to 1,000 virtual users using the Controller.
- Monitor and analyze server performance under various virtual user loads.
- Create scripts using JMeter and HP LoadRunner as load testing tools and generate reports for management.
- Insert JMeter Thread Groups, Listeners, Assertions, Samplers, and Timers (including the Synchronizing Timer); handle dynamic values and data-driven testing in JMeter.
- Conduct daily Scrum meetings with the Product Owner, Scrum Master, and business and development teams to verify bug fixes and update bug status for third-party applications.
- Prepare test strategies documents for Manual & Automation testing.
- Identify the test environment, identify and analyze performance acceptance criteria, and plan and design the test workload model (spreading load across scripts according to functionality).
- Configure the test environment with the consent of system administrators and developers; launch the application in VuGen, capture the client's business processes, and enhance the script before moving to the Performance Center Controller.
- Customize Vuser scripts in LoadRunner with parameterization, correlation queries, transaction points, and rendezvous points, and create scenarios using the LoadRunner Controller.
- Used ALM/Quality Center 12.01 to attach screenshots, assign defects the proper severity and priority, and link the related test script from which each defect was generated.
- Used Quality Center (ALM) to open tickets for defects.
- Using the ALM Site Administrator, created domain names, project names, and user names, and assigned users to projects.
- Responsible for tracking and metrics of program accuracy and defect reporting/trending
- Performed and managed defect reporting and tracking using HP Quality Center (ALM).
- Worked in application server environments (WebLogic 12c, JBoss, and WebSphere), investigating thread contention and performing thread dump analysis.
Environment: LoadRunner, WebSphere, Performance Center, DB2, QTP, Oracle, Windows, and UNIX/Linux.
Confidential, Atlanta, GA
Performance Tester
Responsibilities:
- Analyze the business and technical requirements, create test scenarios, and develop the test plan and test cases.
- Attend functional spec, test strategy, test plan, and test case reviews, ensuring they meet business requirements; estimate the effort required to build the automation scripts.
- Configure the test environment with the consent of system administrators and developers; launch the application in VuGen, capture the client's business processes, and enhance the script before moving to the Performance Center Controller.
- Complete Performance Testing by creating Virtual users and analyzing the reports in LoadRunner.
- Create Vuser scripts in VuGen and insert transactions and think time within the virtual user scripts to emulate heavy user load on the server.
- Use parameterization and correlation of the VuGen scripts to reproduce real-life load conditions in LoadRunner.
- Perform scalability tests encompassing many hundreds of users, broken down into stress, workload generation, reliability, and configuration tests using LoadRunner.
- Prepare the Requirement Traceability Matrix, test data requirements, test strategy, and test coverage matrix; conduct system, integration, and regression testing and document the results in Quality Center.
- Use Quality Center to track defects and log errors.
Environment: Java, MySQL, Oracle, HP Quality Center, LoadRunner, MS Excel, PowerPoint, Windows, and UNIX/Linux.
