
Performance Test Engineer Resume

Columbus, OH

SUMMARY:

  • 7 years of professional experience in Information Technology, specializing in performance testing, manual and automated testing, and quality assurance of client/server and web-based applications; extensive experience with automated tools such as HP LoadRunner and JMeter.
  • Experienced in insurance, banking, healthcare, and telecom applications.
  • Solid understanding of the Software Development Life Cycle (SDLC); good understanding of web technology and client/server architecture, with the ability to troubleshoot server issues and identify problems.
  • Experience in performance test design, planning, workload modeling, and elicitation of non-functional requirements for testing.
  • Hands-on experience in profiling and performance monitoring, and with HP LoadRunner, Performance Center, and JMeter.
  • Experience with a variety of VuGen protocols such as Web, Mobile, TruClient, SOA, Web Services, RDP, Citrix, Java, .NET, and Oracle/Siebel.
  • Experience with Java profiling tools and their integration with JMeter and BlazeMeter; hands-on flexibility with tools such as Dynatrace, JVisualVM, HP Diagnostics, and PerfMon.
  • Solid understanding of creating dashlets and dashboards for various applications, with knowledge of debugging using PurePath technology in the Dynatrace Client.
  • Experience running load tests in the BlazeMeter cloud using JMeter and Jenkins.
  • Strong knowledge of front-end application profiling; hands-on experience with Fiddler and HttpWatch for website debugging, security testing, and performance evaluations.
  • Experienced in load, stress, and performance testing, with hands-on profiling and performance monitoring.
  • Experience with RDBMSs such as Oracle, DB2, and SQL Server; expert in developing complex SQL scripts for database testing.
  • In-depth knowledge of Microsoft Dynamics CRM.
  • Excellent knowledge of mobile performance testing on Android, Windows Phone, and iPhone devices.
  • Excellent analytical, problem-solving, communication, and interpersonal skills; a motivated, hard-working self-starter with a professional attitude and strong work ethic.

TECHNICAL SKILLS:

Testing Tools: HP LoadRunner, JMeter, Performance Center/ALM, BlazeMeter, HP Quality Center, PuTTY, CRM Dynamics, Quick Test Pro (QTP), HP Diagnostics, SoapUI, MS Diagnostics.

CM Tools: SVN, StarTeam, Git, and JIRA

Web Technology Tools: Fiddler, HttpWatch, YSlow, PageSpeed Insights, and WebPageTest.

Profiling / APM Tools: Dynatrace Client, JVM, Dynatrace AJAX Edition, SiteScope, Wily Introscope, Standard Set, .NET, JConsole

Languages: VBScript, C, UNIX Shell, XML, WHDL, SQL, HTML, CSS, SOAP/REST Services, JSON, C#, Java and PHP.

Databases: MySQL, Oracle, SQL Server, DB2 and Rapid SQL.

Operating Systems: Windows, Linux, Mac OS X, AS/400, DOS, Fedora, Red Hat, Solaris.

Methodologies: Agile/Scrum (Confidential), Waterfall, Iterative, V-Model, and RUP.

Servers: IIS, Apache Tomcat, WebSphere, JBoss, Apache HTTP Server, WebLogic, J2EE, FTP Pro, and Microsoft Azure cloud.

Others: Microsoft Office 2013, Microsoft Project, TFS, VMware, PerfAnalyzer, SharePoint, JSON Viewer, Git, Jenkins & Taurus; mobile technologies: iPhone Tester, DeviceAnywhere, MobiTest, SeeTest Automation.

PROFESSIONAL EXPERIENCE:

Confidential, Columbus, OH

Performance Test Engineer

Responsibilities:

  • As a Senior Performance Test Engineer, collaborating with relevant stakeholders, Business Analysts, and the development team to understand, identify, and define performance engineering requirements.
  • Planning, designing, executing, and evaluating performance tests of web applications and services to ensure optimal application performance using HP LoadRunner and HP Performance Center.
  • Developing and executing test scripts based on the identified scenarios, monitoring the system under test, and maintaining test environments and performance test scripts.
  • Designing and developing HP LoadRunner 12.01 and 12.50 scripts using the HTTP/Web, Web Services, and Mobile API protocols.
  • Creating VuGen scripts, executing them in the Controller, and performing analysis and reporting; applying correlation and parameterization for load simulations.
  • Performing load and stress tests on the application with LoadRunner and JMeter, and communicating performance reports to my lead and others.
  • Responsible for implementing LoadRunner, Performance Center, and JMeter based test infrastructure, including architecting the load tests.
  • Monitoring and measuring resource consumption of components, processes, and latency at measurable points in the systems under test; monitoring average response time, TPS, throughput, and web page breakdown graphs, and analyzing each scenario.
  • Collaborating with the development team to ensure testing issues are resolved on the basis of defect reports; monitoring application and database servers, analyzing server access logs, and debugging application performance issues.
  • Modifying run-time settings during execution to emulate real-time users by changing browser versions, think time, pacing, content caching, and action blocks.
  • Analyzing GET and POST methods, error counts, and warning messages using HttpWatch.
  • Recording HTTP traffic and working with website load and performance.
  • Performing security and performance inspection using Fiddler and HttpWatch, and creating performance scripts such as SmartView scenarios.
  • Identifying performance issues across various applications; exercising Confidential REST Web APIs, web services, and Java/JSON samplers using the JMeter tool.
  • Configuring CSV data sets and performing load and stress tests with 300 to 350 concurrent users using JMeter plugins.
  • Monitoring performance and measuring application response time with JMeter plugins such as WebDriver (Selenium) and the standard set, for performance metrics like CPU, TPS, hits per second, and memory utilization.
  • Uploading scripts to the BlazeMeter cloud, analyzing the test summary, throughput, errors, and response times, and sharing reports within the performance team.
  • Installing and configuring Dynatrace AppMon 6.5, adding Dynatrace header calls to JMeter scripts, and analyzing those header calls.
  • Patching and unpatching LoadRunner scripts using the Dynatrace Client, and monitoring business transaction hotspots, response time hotspots, and application servers.
  • Using PurePath technology to find the root cause of bottlenecks.
  • Using Linux commands such as top, wget, sar, and vmstat, and monitoring Windows PerfMon counters and resources such as CPU, memory, and threads.
  • Performing database testing and writing SQL queries for MS SQL Server; working closely with the Test Lead during performance testing and design.
  • Managing requirements using Quality Center; performing smoke, integration, functional, performance, regression, and backend testing.
  • Participating in QA team meetings and walkthroughs for weekly QA testing reviews.
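
The monitoring bullets above repeatedly reference average response time, TPS, and throughput. As a rough illustration (the class and method names below are hypothetical, not from any project deliverable), these metrics can be derived from a set of recorded transaction durations:

```java
import java.util.Arrays;

// Hypothetical helper: derives the core load-test metrics named above
// (average response time, 90th-percentile response time, TPS)
// from a set of recorded transaction durations.
public class PerfMetrics {

    // Average response time in milliseconds.
    public static double averageMs(long[] durationsMs) {
        return Arrays.stream(durationsMs).average().orElse(0.0);
    }

    // 90th-percentile response time (nearest-rank method).
    public static long percentile90Ms(long[] durationsMs) {
        long[] sorted = durationsMs.clone();
        Arrays.sort(sorted);
        int rank = (int) Math.ceil(0.90 * sorted.length) - 1;
        return sorted[Math.max(rank, 0)];
    }

    // Transactions per second over the whole test window.
    public static double tps(int completedTransactions, double windowSeconds) {
        return completedTransactions / windowSeconds;
    }

    public static void main(String[] args) {
        long[] timings = {120, 150, 180, 200, 950};      // ms, sample data
        System.out.println("avg ms: " + averageMs(timings));       // 320.0
        System.out.println("p90 ms: " + percentile90Ms(timings));  // 950
        System.out.println("TPS:    " + tps(600, 60.0));           // 10.0
    }
}
```

In practice the LoadRunner Analysis tool and JMeter listeners report these same figures; the sketch only makes the arithmetic behind the graphs explicit.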

Environment: LoadRunner, JMeter, Dynatrace AppMon, Performance Center, Fiddler, HttpWatch, BlazeMeter, ASP, Quick Test Pro, Requisite Pro, ClearQuest, SQL Server, MS Office, Oracle, Java, JavaScript, VBScript, C, HTML & DHTML, SSL, Windows, Unix, IE, NS, WebSphere, XML, JIRA, TCP/IP, SNMP

Confidential, Washington, DC

Performance Tester/ QA Test Analyst

Responsibilities:

  • Contributed to the IT process with the Agile team and collaborated with Business Units' subject-matter experts (SMEs) and Business Analysts to perform technical analysis, determine scope, and estimate the level of effort for implementing functional and non-functional requirements.
  • Participated in continuous process improvement, development and institutionalization of technical standards and their implementation on all assigned projects.
  • Conducted Integration, System, Functional, GUI, Regression, Smoke, Database Integrity, User-Acceptance (UAT) and Performance testing.
  • Designed, developed, and executed complex automated software test plans, test scripts, and test scenarios, and analyzed performance tests using JMeter, LoadRunner, Citrix, Jenkins, SOA, and BlazeMeter.
  • Performed automated load/performance testing across multiple messaging protocols (including HTTP, Web Services, and mobile protocols), focusing on overall application performance and leveraging knowledge of computer science and software development.
  • Independently developed test scripts using the appropriate LoadRunner protocols, created test scenarios, and executed tests in the LR Controller.
  • Created Database virtual user scripts including multiple Transaction Points and Rendezvous Points using Virtual User Generator.
  • Analyzed the full workload model and designed for accurate load-test ramp-up and system breaking-point recognition, progressing to 2,000 virtual users using LoadRunner Controller scenario scheduling.
  • Measured the response time of important user actions using start- and end-transaction functions.
  • Extensively worked on performance monitoring, analyzing response time, memory leaks, hits/sec, and throughput graphs.
  • Analyzed test results using LoadRunner, wrote test reports, and coordinated with other testers to resolve test problems.
  • Performed troubleshooting and debugging of routine issues in RESTful services via JSON, and analyzed test results to look for bottlenecks in software and/or infrastructure.
  • As a QA Tester, implemented data-driven testing using QTP to check the functionality of the application; improved scripts using features such as library files and code reusability.
  • Used parameterization and various checkpoints, such as text, standard, table, and database checkpoints, in QTP.
  • Responsible for creation of Test Plan, Test Cases, Test Execution, and Test Reporting using ALM Quality Center.
  • Conducted troubleshooting by performing root-cause analysis, analyzed patterns in production, and conducted performance tests to create production-like scenarios.
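
One bullet above describes ramping to 2,000 virtual users with LoadRunner Controller scenario scheduling. The sketch below (hypothetical class and parameter values, not tied to the actual scenario settings) shows the arithmetic behind such a linear ramp-up schedule:

```java
// Hypothetical sketch of a linear ramp-up schedule like the one a
// LoadRunner Controller scenario defines: start N users every interval
// until the target load (e.g. 2,000 virtual users) is reached.
public class RampSchedule {
    private final int targetUsers;
    private final int usersPerStep;
    private final int stepSeconds;

    public RampSchedule(int targetUsers, int usersPerStep, int stepSeconds) {
        this.targetUsers = targetUsers;
        this.usersPerStep = usersPerStep;
        this.stepSeconds = stepSeconds;
    }

    // Number of virtual users active t seconds into the ramp-up
    // (the first batch starts at t = 0).
    public int activeUsersAt(int elapsedSeconds) {
        int stepsCompleted = elapsedSeconds / stepSeconds + 1;
        return Math.min(stepsCompleted * usersPerStep, targetUsers);
    }

    // Seconds from test start until the final batch of users is started.
    public int secondsToFullLoad() {
        int steps = (int) Math.ceil((double) targetUsers / usersPerStep);
        return (steps - 1) * stepSeconds;
    }

    public static void main(String[] args) {
        RampSchedule ramp = new RampSchedule(2000, 50, 15); // 50 VUs every 15 s
        System.out.println(ramp.activeUsersAt(0));    // 50
        System.out.println(ramp.activeUsersAt(60));   // 250
        System.out.println(ramp.secondsToFullLoad()); // 585
    }
}
```

With 50 users added every 15 seconds, the final batch starts 585 seconds in; the Controller expresses the same schedule declaratively in the scenario scheduler.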

Environment: LoadRunner, JMeter, BlazeMeter, QTP, Jenkins, Citrix, HTML, C++, XML, ASP, Visual Basic, Oracle, SQL, PL/SQL, ClearQuest, ALM QC, Microsoft Visio, Java, JSON, JavaScript, VBScript, HTML & DHTML, SSL, Windows, Unix, WebSphere.

Confidential, Chicago, IL

Performance Tester/ QA Test Analyst

Responsibilities:

  • Contributed to and collaborated with the Agile team and Business Analysts to perform technical analysis, determine scope, and estimate the level of effort for implementing functional and non-functional requirements.
  • Designed and developed test strategies and test plans.
  • Created customized LoadRunner VuGen 11.04 scripts and scenarios using a variety of VuGen protocols such as Web/HTTP, Web Services, Database, and Oracle/Siebel.
  • Designed and developed LoadRunner scripts using Oracle Web Applications 12 and 11i, and reviewed the scripts with the performance team and leads.
  • Designed and developed test scenarios and executed tests in Performance Center 11.0.
  • Executed Test scripts based on the scenarios identified, monitored system under the test, measured and recorded key performance indicators.
  • Analyzed and interpreted results, debug errors, generated tests reports and presented them to management with recommendations as needed.
  • Involved in setting up and configuring HP SiteScope 11.0 monitors; monitored application response time, CPU, and database calls with the performance team, and helped create Excel datasheets of performance test results.
  • Worked as a web admin performing functional, integration, and regression testing, and communicated with developers.
  • Wrote SQL queries to test data integrity and referential integrity and performed backend testing; tested data on online screens and wrote SQL to retrieve data from the database and verify it was updated properly.
  • Implemented Data Driven Testing Using QTP to check for the functionality of the application. Scripts were improved using features like library files, and code reusability.
  • Used parameterization and used various checkpoints like text checkpoint, standard Checkpoints, table checkpoints, and database checkpoints in QTP.
  • Responsible for creation of Test Plan, Test Cases, Test Execution, and Test Reporting using Quality Center.
  • Actively coordinated with the team lead in conducting internal walkthroughs on a weekly basis.
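
The SQL-based integrity checks described above boil down to a simple set operation: every foreign-key value in a child table must exist in the parent table. A minimal, hypothetical sketch of that check (table and key names invented for illustration):

```java
import java.util.*;

// Hypothetical illustration of the referential-integrity check that
// such SQL queries perform: find child rows whose foreign key has no
// matching parent row (orphans). A healthy database returns none.
public class IntegrityCheck {

    // Returns the foreign-key values in childFks with no match in parentKeys.
    public static List<Integer> findOrphans(Set<Integer> parentKeys,
                                            List<Integer> childFks) {
        List<Integer> orphans = new ArrayList<>();
        for (Integer fk : childFks) {
            if (!parentKeys.contains(fk)) {
                orphans.add(fk);
            }
        }
        return orphans;
    }

    public static void main(String[] args) {
        Set<Integer> customerIds = new HashSet<>(Arrays.asList(1, 2, 3));
        List<Integer> orderCustomerIds = Arrays.asList(1, 2, 2, 4); // 4 is orphaned
        System.out.println(findOrphans(customerIds, orderCustomerIds)); // [4]
    }
}
```

In SQL the same check is a LEFT JOIN from child to parent filtered on NULL parent keys; the Java version just makes the set logic explicit.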

Confidential, Reston, VA

Software QA Engineer

Responsibilities:

  • Reviewed Business Requirement Documents (BRD) and the Technical Specification
  • Involved in developing test automation strategy
  • Responsible for designing Automation Test Plan and developing scripts using QTP
  • Created various verification points to verify properties, object data, and window existence, and to confirm that the AUT inserts, updates, and deletes records correctly using QTP
  • Designed and Created Test plans and Test Scripts as per business requirements and functional specifications using QTP
  • Performed manual testing by executing Test cases for all modules before creating automated scripts to validate the test procedure
  • Wrote and executed test cases and test procedures for different scenarios based on the business requirements in QC
  • Developed SQL queries to test for the data validations and other business functionalities such as required fields, data format and data integrities
  • Extensively involved in back-end testing writing SQL Queries using Oracle
  • Performed regression testing for modifications in the application and new releases
  • Worked closely with the developers to resolve defects, issues and understand the functionality
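
The data validations mentioned above (required fields, data format) amount to per-row predicate checks. A minimal, hypothetical sketch, with field names and the date format invented for illustration:

```java
import java.util.*;
import java.util.regex.Pattern;

// Hypothetical sketch of the row-level checks that such data-validation
// queries express: required fields must be present and non-blank, and a
// field's value must match an expected format (here, YYYY-MM-DD).
public class RowValidator {

    private static final Pattern DATE = Pattern.compile("\\d{4}-\\d{2}-\\d{2}");

    // A required field must be present and non-blank.
    public static boolean hasRequired(Map<String, String> row, String field) {
        String v = row.get(field);
        return v != null && !v.trim().isEmpty();
    }

    // Format check: value must look like YYYY-MM-DD.
    public static boolean isValidDate(String value) {
        return value != null && DATE.matcher(value).matches();
    }

    public static void main(String[] args) {
        Map<String, String> row = new HashMap<>();
        row.put("customer_name", "Acme");
        row.put("created_on", "2015-06-30");

        System.out.println(hasRequired(row, "customer_name"));  // true
        System.out.println(hasRequired(row, "email"));          // false
        System.out.println(isValidDate(row.get("created_on"))); // true
    }
}
```

The equivalent SQL simply counts rows violating each predicate (NULL or empty required columns, values failing the format pattern) and expects zero.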
