
Performance Test Engineer Resume

West Chester, PA

SUMMARY

  • Around 9 years of experience in Performance Testing of various Web Applications across multiple domains.
  • Extensive experience in automated testing of Web-based and Client/Server applications, with proficiency in Load and Performance Testing.
  • Good experience in agile methodology.
  • Monitored application and database servers during test runs using tools such as Dynatrace and Splunk.
  • Executed automated test scripts using Mercury tools (Quality Center, LoadRunner, and QTP), Performance Center, and Apache JMeter, based on business/functional specifications.
  • Expertise in SQL queries to perform Backend testing.
  • Experience in monitoring Web Servers and Application Servers such as Microsoft IIS, WebLogic, and WebSphere, and Database Servers such as SQL Server and Oracle during Performance Tests, both with and without firewalls.
  • Expertise in Web Services, with experience using SoapUI for testing SOA environments.
  • Participated in capacity planning along with performance testing.
  • Experience in performance testing of Web and Client/Server applications using LoadRunner.
  • Well versed in Virtual User Generator functionality, including correlating statements and configuring run-time settings for HTTP, iterations, and simulated modem speeds to bring test scenarios closer to real-world conditions.
  • Strong knowledge of single and multiple protocols in LoadRunner VuGen, including Web HTTP/HTML, Web Services, Ajax TruClient, Web Click and Script, ODBC, and Oracle NCA.
  • Good understanding of object-oriented methodologies, the software development life cycle (SDLC), and software testing methodologies.
  • Good experience with HP ALM and Quality Center.
  • Excellent ability to understand complex scenarios and business problems and to transfer that knowledge to other users/developers in the most comprehensible manner.
  • Working experience with Development, QA, DBA, and SiteOps teams to measure, analyze, and help optimize the performance and scalability of new releases.
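As a concrete illustration of the backend SQL testing mentioned above, here is a minimal post-test database check; the schema, table name, and counts are hypothetical, and sqlite3 merely stands in for the Oracle/SQL Server backends named elsewhere in this resume:

```python
import sqlite3

# Hypothetical schema for illustration: an "orders" table populated during the test run.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, status TEXT)")
conn.executemany("INSERT INTO orders (status) VALUES (?)",
                 [("COMPLETED",)] * 48 + [("FAILED",)] * 2)

expected_submissions = 50  # transactions driven by the load tool (illustrative)

# Backend check: did every scripted transaction reach the database,
# and how many failed server-side?
total = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
failed = conn.execute(
    "SELECT COUNT(*) FROM orders WHERE status = 'FAILED'").fetchone()[0]

print(f"persisted={total} expected={expected_submissions} failed={failed}")
assert total == expected_submissions, "rows lost between app tier and DB"
```

Comparing row counts and status distributions against the load tool's own transaction counts is one simple way such backend validation is typically done.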

TECHNICAL SKILLS

Programming Languages: C, C++, Java, MySQL, HTML, XML

Operating Systems: Windows 9x, Windows XP Pro, Windows 2000 Advanced Server, UNIX

Project Management and Testing & Monitoring Tools: Microsoft Project, HP LoadRunner 9.x/11.x/12.x, Performance Center & HP ALM, Dynatrace 6.3/6.5/7.0, Thread-dump Analyzer, IBM Heap Analyzer, Perfmon, MQmon, Nmon, Fiddler

Automation and CI Tools: HP LoadRunner/Performance Center/StormRunner, HP ALM, Quality Center, Jira, SharePoint, TFS, QTP, SoapUI, Postman.

Databases: ORACLE, MS SQL Server.

Monitoring/APM: Dynatrace, Splunk, AppInsights, and SiteScope

PROFESSIONAL EXPERIENCE

Confidential - West Chester, PA

Performance Test Engineer

Responsibilities:

  • Reviewed product requirement documents, functional specifications, and involved in developing test strategy, test plan and test case documents.
  • Gathered non-functional requirements; designed, developed, and executed a performance measurement plan used as the basis for assessing process capability.
  • Developed HP LoadRunner VuGen scripts using Virtual User Generator to emulate load-critical application transactions, and conducted test executions to capture system and application performance.
  • Performed root cause analysis and identified system bottlenecks using the APM tool Dynatrace.
  • Coordinated with the functional QA team and Developers regarding the issues.
  • Performed Automated Load, Stress, Endurance and Peak Hour testing.
  • Monitored resources to identify performance bottlenecks, analyzed test results with the development and database teams, and reported the findings to clients using LoadRunner.
  • Involved in performance test planning, setting goals for the release, and participated in project-level status meetings.
  • Tracked and reported the errors discovered using ALM Performance Center.
  • Involved in preparing the high-level Test Plan and developed Test Cases in accordance with the functional specifications.
  • Responsible for creating test scenarios with WebLOAD, executing them, and documenting the scenarios and results.
  • Created automated LoadRunner test scripts for Load, Stress, and Performance testing of SAP, HTTP/HTML, and Web Services protocol-based applications.
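The concurrent-transaction measurement described above can be sketched in miniature. This is an illustrative stand-in, not the actual LoadRunner tooling: a stub function replaces the real scripted HTTP transaction, and all numbers are made up.

```python
import statistics
import time
from concurrent.futures import ThreadPoolExecutor

def business_transaction():
    """Stand-in for one scripted transaction (a real test would issue HTTP calls)."""
    start = time.perf_counter()
    time.sleep(0.01)  # simulated server processing time
    return time.perf_counter() - start

# Drive 50 iterations across 10 concurrent "virtual users" and collect timings.
with ThreadPoolExecutor(max_workers=10) as pool:
    timings = list(pool.map(lambda _: business_transaction(), range(50)))

avg = statistics.mean(timings)
p90 = statistics.quantiles(timings, n=10)[-1]  # 90th-percentile response time
print(f"avg={avg:.3f}s p90={p90:.3f}s")
```

Reporting both the mean and a high percentile, as here, mirrors the way load-test results distinguish typical from worst-case user experience.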

Environment: Windows, Linux, LoadRunner (VuGen, Controller, and Analysis), Performance Center, Dynatrace Managed, Jenkins, SoapUI, WSDL, XML, JSON, Azure, CI/CD

Confidential - Cary, NC

Performance Engineer

Responsibilities:

  • Reviewed product requirement documents, functional specifications, and involved in developing test strategy, test plan and test case documents.
  • Created Test Plans incorporating performance testing objectives, the test environment, user profiles, risks, test scenarios, the tools used, schedules, analysis, monitors, and presentation of results.
  • Analyzed the requirement and design documents.
  • Wrote LoadRunner scripts, enhanced scripts with C functions, parameterized users, stored dynamic content using LoadRunner functions, and used client-side secure certificates.
  • Worked extensively with the Web HTTP/HTML, Web Services, and Ajax TruClient protocols in LoadRunner.
  • Wrote text checks and created scenarios for concurrent (rendezvous) and sequential users.
  • Created Single User, Baseline, and Soak test scenarios; introduced random pacing between iterations to achieve the desired transactions per hour.
  • Added performance measurements for Oracle, Web Logic, and IIS in Load Runner Controller.
  • Worked extensively on TDM services and MQ (Message queues) technology.
  • Used Splunk to retrieve volume data from Production so that Load/Endurance/Stress tests matched production traffic.
  • Used DataPower during performance testing.
  • Responsible for analyzing application and components behavior with heavier loads and optimizing server configurations.
  • Created, scheduled, and ran scenarios using JMeter and prepared the results.
  • Worked closely with Production Managers, Technical Managers and Business Managers in planning, scheduling, developing, and executing performance tests.
  • Interacted with developers during testing for identifying memory leaks, fixing bugs and for optimizing server settings at web, app and database levels.
  • Used Load Runner/ Performance Center tool for testing and Dynatrace for monitoring.
  • Analyzed results using the LoadRunner Analysis tool and Splunk, and examined Oracle database connections, sessions, and WebLogic log files.
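The "random pacing between iterations" technique mentioned above can be sketched as a small calculation. The function and all traffic numbers are illustrative, showing only the arithmetic behind picking a pacing interval for a target transactions-per-hour rate:

```python
import random

def pacing_seconds(target_tph, vusers, response_time, jitter=0.2):
    """Interval each virtual user should wait between iteration starts so the
    scenario as a whole hits the target transactions-per-hour rate."""
    interval = 3600.0 * vusers / target_tph   # seconds per iteration per user
    think = interval - response_time          # time left after the transaction itself
    # Random +/- jitter keeps virtual users from firing in lockstep.
    return max(0.0, think * random.uniform(1 - jitter, 1 + jitter))

# Illustrative: 20 users targeting 3,600 transactions/hour
# -> one iteration every 20 s per user.
base = 3600.0 * 20 / 3600
print(f"base interval per user: {base:.1f}s")
```

LoadRunner's run-time settings expose pacing natively; the point of the sketch is the sizing arithmetic, which is the same regardless of tool.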

Environment: Micro Focus LoadRunner 12.55, Performance Center, VuGen, JMeter, Jenkins, Splunk, ORACLE, SQL Developer, Dynatrace, Putty, Java, HTML, Microsoft Excel.

Confidential -Charlotte, NC

Automation Tester

Responsibilities:

  • Analyzed the Business Requirements and closely worked with the Business Team to get the clarifications addressed.
  • Developed Performance Test Plan, Test Cases and Test scripts for the load & Performance testing to perform testing at different load levels.
  • Designed performance test suites by creating test scripts, workload scenarios, transactions, and custom properties, and ran them in general mode to validate the scripts in SOASTA Akamai CloudTest.
  • Enhanced the scripts by adding custom properties, parameters based on the scenario.
  • Used JavaScript to add logic to load testing scripts.
  • Configured run-time settings and servers according to the scenarios to be tested.
  • Created different scenario types such as Load, Stress, and Endurance tests.
  • Identified bottlenecks and improved the quality of performance testing.
  • Created intense load on the server to measure the server performance under load.
  • Reviewed results captured from SOASTA and analyzed transaction response times, transaction summaries, database and server metrics, hits per second, latency, and throughput for all microservices using AppDynamics and Instana.
  • Monitored infrastructure resources using AWS CloudWatch and checked whether any of the microservices required scaling.
  • Purposefully introduced failures into the system to test its resiliency in chaotic situations.
  • Also monitored AWS resources and applications running on Amazon infrastructure and set up alarms to trigger when any resource reached its threshold.
  • Prepared detailed test reports and summary report highlighting the different load tests conducted and the performance achievement made from the engagement.
  • Provided testing analysis of complex systems to ensure systems are compliant with requirements and can perform well in production.
  • Used Octopus to verify that the same Prod version was deployed to the Clone environment before conducting performance testing in Clone.
  • Diligently worked with development team and guided them in finding and fixing the performance defects.
  • Utilized Jenkins to create different jobs that can trigger the tests in SOASTA.
  • Defect tracking and reporting using JIRA.
  • Actively participated in the daily project meetings and walkthroughs.
  • Attended conference calls with the offshore team to discuss testing status and assign defects to the concerned developers.
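The threshold-based alarming described above (CloudWatch firing when a resource crosses its threshold) rests on a simple idea that can be shown in pure Python. This is a toy model, not the CloudWatch API; the class name, window size, and CPU samples are all invented for illustration:

```python
from collections import deque

class ThresholdAlarm:
    """Toy CloudWatch-style alarm: fire when the average of the last
    `periods` datapoints crosses a threshold (names are illustrative)."""
    def __init__(self, threshold, periods=3):
        self.threshold = threshold
        self.window = deque(maxlen=periods)

    def record(self, value):
        self.window.append(value)
        full = len(self.window) == self.window.maxlen
        return full and sum(self.window) / len(self.window) > self.threshold

alarm = ThresholdAlarm(threshold=80.0, periods=3)
cpu_samples = [70, 75, 82, 85, 90]  # simulated CPU% datapoints
states = [alarm.record(v) for v in cpu_samples]
print(states)  # alarm fires only once the rolling average exceeds 80%
```

Averaging over several periods rather than alarming on a single datapoint is the usual way such alarms avoid flapping on momentary spikes.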

Environment: SOASTA Akamai CloudTest 53.42, AWS CloudWatch, JavaScript, JIRA, HTTP/HTML, AppDynamics, Instana, Jenkins, Octopus, Oracle.

Confidential - Raleigh, NC

Performance Tester

Responsibilities:

  • Worked closely with Business Analysts and Developers to gather Application Requirements and Business Processes in order to formulate the test plan.
  • Worked as a liaison between the business and development teams; translated business specifications and requirements into functional test cases; worked closely with the development team, both onsite and offshore, on analysis and design to meet business requirements and site rollout.
  • Developed scripts using Load Runner by recording/playback and as well as by writing custom functions.
  • Created batch files to launch JConsole monitoring tools.
  • Created batch files to start/restart/stop application servers.
  • Involved in test environment build and designed Load (capacity) model on the basis of current volume and projected percentage increase in volume.
  • Developed VuGen test scripts in LoadRunner for Oracle Forms and JSP pages using the NCA and HTTP protocols; used a customized framework for automation in TestComplete.
  • Used Virtual User Generator to generate VUGen Scripts for Web (Http/Html), .Net and Web Services protocol.
  • Actively took ownership of defects and coordinated with different groups from initial finding of defects to final resolution.
  • Analyzed Load Runner on-line graphs and reports to identify network/client delays, CPU /memory usage, I/O delays, database locking and other issues at server level.
  • Responsible for performance tuning of Java applications.
  • Performed both automatic and manual correlation using the options in Load Runner.
  • Analyzed system usage information such as task distribution diagram and load model to create effective test scenarios.
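The load (capacity) model mentioned above can be sketched with Little's Law, which relates concurrency, throughput, and per-iteration time. The function name and traffic numbers here are illustrative, not from any specific engagement:

```python
# Little's Law: concurrent users N = X * (R + Z), where X is throughput
# (transactions/second), R is response time, and Z is think time.
def required_vusers(tx_per_hour, response_time_s, think_time_s):
    throughput = tx_per_hour / 3600.0
    return throughput * (response_time_s + think_time_s)

# Illustrative numbers: 7,200 tx/h at 2 s response time and 8 s think time.
n = required_vusers(7200, 2.0, 8.0)
print(f"virtual users needed: {n:.0f}")  # 7200/3600 * 10 = 20
```

Sizing a scenario this way, from projected production volume plus a growth percentage, is how a load model is typically translated into a virtual-user count.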

Environment: TestComplete, ALM 10.x, LoadRunner, JMeter, JIRA, Microsoft, PL-SQL, JConsole, Hybrid (Waterfall-Agile) Methodology, Windows 7/XP, UNIX. Protocols: Web HTTP/HTML, NCA, Web Services.

Confidential

Performance Tester

Responsibilities:

  • Developed the Test Plan, covering the overall testing strategy and end-to-end scenario testing.
  • Correlated and Parameterized test scripts to capture Dynamic data and input various test data as per business requirements
  • Identified system/application bottlenecks and worked with Bottomline to facilitate tuning of the application/environment to optimize capacity and improve application performance under peak workloads.
  • Performed baseline test with 1 user and 5 iterations. Performed benchmark test under a load of 50 users using LoadRunner controller.
  • Developed VuGen scripts and used the LoadRunner Controller for execution.
  • Enhanced Vuser scripts by introducing timer blocks and parameterizing user IDs to run the script for multiple users.
  • Created detailed test status reports, performance capacity reports, web trend analysis reports, and graphical charts for upper management using Load Runner analysis component.
  • Tracked and reported on the test plan and test activities, including test results, test case coverage, test metrics, required resources, and defects discovered and their status.
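The correlation and parameterization described above can be shown in miniature: capture a dynamic value from a server response by its boundaries and substitute it into the next request. The token name, response body, and URL here are all hypothetical:

```python
import re

# Mock server response containing a dynamic value that would break playback
# if left hard-coded in the script (the classic correlation problem).
response_body = '<input type="hidden" name="session_token" value="a1b2c3d4">'

# Boundary-based capture, analogous to web_reg_save_param in LoadRunner:
match = re.search(r'name="session_token" value="([^"]+)"', response_body)
token = match.group(1)

# The captured value replaces the recorded literal in the next request,
# so every virtual user replays with its own fresh token.
next_request = f"POST /checkout?session_token={token}"
print(next_request)
```

Parameterization works the same way in reverse: recorded literals (user IDs, dates) are swapped for placeholders fed from a data table so each virtual user submits distinct input.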

Environment: HP LoadRunner 9.5/11, SAP ECC, SAP GUI 7.2/7.3, SAP FICO, SAP MM, DB2, Windows 2000/XP Professional, Linux, SQL server 2005/2008, Quality Center 11.0, Sitescope 9.5/11
