Performance Engineer Resume
Dallas, TX
SUMMARY
- Around 5 years of diversified experience in software testing and quality assurance. Exposed to all phases of the software development life cycle and software testing life cycle. Possess strong analytical, verbal and interpersonal skills with the ability to work independently.
- Experience in testing web-based, client/server and SOA applications.
- Experience in performance testing on multi-tier environments for different software platforms.
- Expertise in different testing methodologies like Agile, Scrum and waterfall.
- Hands on experience on Cloud platforms Amazon AWS and Windows Azure.
- Highly proficient in performance testing using LoadRunner, Performance Center, NeoLoad, Apache JMeter and RPT.
- Experience in various protocols in LoadRunner including Web (HTTP/HTML), Web Services, Citrix and Ajax TruClient.
- Created performance scenarios and scripts for various types of tests (load, stress, baseline/benchmark/capacity).
- Identified and prepared test cases for performance-critical business transactions.
- Expertise in recording/coding VuGen scripts using different protocols in all types of environments.
- Expertise in parameterization, manual correlation, runtime settings and C functions.
- Excellent knowledge and skills in test monitoring for transaction response times, web server metrics, Windows/Linux system resources, web and application server metrics, database metrics and J2EE performance.
- Worked with Software Engineering to design, code and test basic functionality via scripting (e.g., JavaScript, Python, Perl) and compiled (e.g., Java, C++/C#) languages.
- Experience in analyzing Performance bottlenecks, Root cause Analysis and server configuration problems.
- Work experience integrating automation scripts into Continuous Integration tools like Jenkins for nightly batch runs of the scripts.
- Experience in SOA/API testing.
- Good experience with Selenium IDE and writing scripts in Selenium WebDriver using Java.
- Good knowledge on SOA, DB, Siebel, PeopleSoft, SAP, Java, J2EE and .Net based applications.
- Proficient in developing scripts using various samplers in Apache JMeter and executing various load tests.
- Extensively used NeoLoad for executing and creating test scripts.
- Executed manual test scripts and was responsible for tracking and logging defects using Quality Center/ALM.
- Used APM tools to monitor web applications on Java and .NET platforms.
- Extensive working experience in server monitoring with the help of HP SiteScope, Dynatrace, AppDynamics, HP Diagnostics, Wily Introscope, Splunk and Perfmon.
- Used Wily Introscope to monitor the health of various applications.
- Experience with Quality Center/HP Application Lifecycle Management (ALM), JIRA and Bugzilla as test management and defect tracking tools.
- Expertise in Web Services testing using SoapUI.
- Expert in analyzing performance bottlenecks such as high CPU usage, memory leaks, Buffer Usage, Queue Length, Connections, Heap Usage, Thread Usage, Missing Indexes, Recursive call ratio, SQL Queries usage, Roundtrip times and Deadlocks.
- Took thread dumps and heap dumps to find and analyze bottleneck areas.
- Familiar with writing SQL queries in the scripts to query the database.
- Good experience in engaging with business contacts and stakeholders for requirements gathering and architecture reviews.
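As an illustration of the manual-correlation technique noted above (capturing a server-generated value between left and right boundaries and feeding it into the next request, as VuGen's web_reg_save_param does), a minimal Python sketch; the response body, field names and URL are hypothetical:

```python
import re

def extract_token(body: str, left: str, right: str) -> str:
    """Capture the text between left/right boundaries, in the spirit of
    VuGen's web_reg_save_param with LB/RB arguments."""
    match = re.search(re.escape(left) + r"(.*?)" + re.escape(right), body)
    if match is None:
        raise ValueError("correlation boundary not found in response")
    return match.group(1)

# Hypothetical login response containing a server-generated session id.
response = '<input name="sessionId" value="A1B2C3"/>'
token = extract_token(response, 'value="', '"')

# The captured value is then parameterized into the next request in the flow.
next_request = "/checkout?sessionId=" + token
```

The same boundary-based capture generalizes to any dynamic value (view states, CSRF tokens) that would otherwise cause replayed scripts to fail.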
TECHNICAL SKILLS
Testing Tools: LoadRunner, Performance Center, IBM RPT, Silk Performer, Apache JMeter, QTP, Selenium, QC/ALM, JIRA.
LoadRunner Protocols: Web-HTTP, Web Services, Citrix, Oracle NCA, SQL Scripting, JavaScript, ADO.NET, Ajax TruClient.
Monitoring Tools: Dynatrace, AppDynamics, Grafana, HP SiteScope, New Relic, Wily Introscope, Splunk, Perfmon.
Programming Languages: C, C++, Java/J2EE, C#, Perl, Python.
Web/Application Servers: MS IIS, Apache, Tomcat, WebSphere, Weblogic.
Database: Oracle, Db2, SQL Server, MySQL
GUI: VB, JSP, Java, Applets, ASP, HTML
Other: Service Oriented Architecture (SOA), Web services, Fiddler, Firebug, SOAPUI, WSDL, WCF.
PROFESSIONAL EXPERIENCE
Confidential, Dallas, TX
Performance Engineer
Responsibilities:
- Gathered requirements from business teams and stakeholders, prepared test plans.
- Set up meetings with Product Owner, Business team, Architecture team, development team in preparing the Non-Functional Requirements for the application under test.
- Prepared the test plan/strategy and shared it with key stakeholders, the product manager, business lead and development team for sign-off.
- Prepared and documented performance test cases and the test data setup process before the start of testing.
- Developed scripts using Web (HTTP/HTML) and Web Services protocols with the LoadRunner tool and enhanced the scripts to support error/data handling.
- Evaluated think time and pacing calculations when preparing scenario designs for tests with loads ranging from 5 TPS to 600 TPS.
- Test execution included: smoke test, capacity test, stress test, endurance test, and HttpWatch test (client-side page analysis).
- Executed single-user tests from the front end to perform page load analysis for test transactions using the HttpWatch tool.
- Analyzed front-end metrics including response times, hits and throughput obtained from LoadRunner for each executed load test.
- Analyze the tomcat server metrics - JVM Heap Usage metrics: Used Heap after Collection, %time spent in GC, Young Generation Usage, Old Generation Usage, Perm Gen Usage; Process CPU.
- Prepared the consolidated test report for each executed load test in MS Word, MS Excel and email format after completion of analysis.
- Used Splunk queries to pull peak-hour load volumes for the business-critical transactions to be performance tested.
- Tracked and tabulated defects for Key Performance Indicators that did not meet historical baseline results/SLAs and for functionality not working as expected during manual validation after code deployment.
- Prepared and published detailed test reports for the application team, solution architects, product owners and management.
- Supported testing in the production region to replicate production issues (P1 tickets) in the test region, then validated the fix and provided sign-off. The P1 tickets mostly involved issues such as:
- Connection termination with CPU reaching the 85% threshold; the fix involved optimizing search Java code to avoid duplicate calls hitting the DB.
Environment: Load Runner, HP Performance Center, HTTP/HTML, Dynatrace, Windows, Splunk, Grafana.
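The think time and pacing calculations mentioned above can be sketched with Little's Law (concurrent users = throughput x time per iteration). A minimal Python sketch, using illustrative response and think times rather than figures from any actual test:

```python
import math

def required_vusers(target_tps: float, resp_time_s: float,
                    think_time_s: float) -> int:
    """Little's Law: concurrency needed to sustain target_tps when each
    iteration takes (response time + think time) seconds."""
    iteration_time = resp_time_s + think_time_s
    return math.ceil(target_tps * iteration_time)

def pacing_interval(vusers: int, target_tps: float) -> float:
    """Fixed pacing so the Vuser group as a whole sustains the target TPS."""
    return vusers / target_tps

# Illustrative numbers only: 600 TPS target, 1.5 s response, 3.0 s think time.
vusers = required_vusers(600, 1.5, 3.0)
pacing = pacing_interval(vusers, 600)
```

In a real scenario design, pacing is usually set a little above this floor to absorb response-time variance without dropping below the target rate.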
Confidential, Anaheim, CA
Performance Engineer
Responsibilities:
- Interacted with business leads, solution architects and the application team to develop and mimic production usage models by collecting non-functional requirements for a multi-year rollout of a large-volume SOA.
- Involved in gathering requirements from various teams and worked on the test plan and test strategy documents for projects based on the NFRs.
- Followed the Agile (Scrum) process; performance validation followed the 'Work Done & Ready to Go' approach release to release and, specifically, sprint by sprint.
- Worked with developers in understanding the code and application in order to develop the Load scripts.
- Created Performance Load Test detailed plan and Test Case Design Document with the input from developers and functional testers.
- Developed the Performance Scripts using VuGen.
- Analyzed scalability, throughput, and load testing metrics against AWS test servers to ensure maximum performance per requirements.
- Developed scripts and executed load tests in production simulating peak-day volume.
- Developed Test Plans, Test Scenarios, Test Cases, Test Summary Reports and Test Execution Metrics.
- Developed robust Vuser scripts using Web (HTTP/HTML), Ajax TruClient, Web Services and Citrix protocols in LoadRunner for applications.
- Worked extensively with JSON/XML data and SOAP protocols in Non-UI Web services (SOA) Testing.
- Conducted Testing on vendor environment to capture baseline and benchmark after migrating to Azure.
- Responsible for setting up SiteScope monitors to monitor network activity and bottlenecks and to collect metrics from application/database servers.
- Configured and used Dynatrace for performance monitoring; troubleshot bottlenecks found during performance testing by analyzing response times and profiling the application to locate the source of performance issues.
- Monitored Metrics on Application server, Web server and database server.
- Used Splunk to monitor and collect the metrics of Performance test servers.
- Reported various Performance Analysis Graphs and Reports collected from various Performance Tools and discuss its bottlenecks such as Memory Leaks, JVM Heap, CPU Utilization, Network time.
- Analyzed JVM GC verbose logs and Heap dumps to find out potential memory leak issues.
- Responsible for Analysis, reporting and publishing of the test results.
Environment: HP LoadRunner 12.53, HP Performance Center, HTTP/HTML, Azure, JIRA, Dynatrace, Windows, Linux, Excel, SQL, Oracle Database 11g/12c, Oracle SQL Developer, Splunk.
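The JVM verbose GC log analysis described above can be sketched in Python. The log line format below follows classic `-verbose:gc` output and is illustrative; real formats vary by JVM version and flags:

```python
import re

# Illustrative classic -verbose:gc line, e.g.
#   [GC (Allocation Failure) 512M->128M(1024M), 0.0150 secs]
GC_LINE = re.compile(r"(\d+)M->(\d+)M\((\d+)M\), ([\d.]+) secs")

def summarize_gc(log_lines, window_s):
    """Return (% of wall time spent in GC, used heap after each collection)."""
    pauses, heap_after = [], []
    for line in log_lines:
        m = GC_LINE.search(line)
        if m:
            heap_after.append(int(m.group(2)))   # MB used after collection
            pauses.append(float(m.group(4)))     # GC pause in seconds
    pct_in_gc = 100.0 * sum(pauses) / window_s
    return pct_in_gc, heap_after

sample = [
    "[GC (Allocation Failure) 512M->128M(1024M), 0.0150 secs]",
    "[GC (Allocation Failure) 640M->200M(1024M), 0.0250 secs]",
]
pct, after = summarize_gc(sample, window_s=10.0)
# A steadily rising heap-after-GC trend across events suggests a memory leak.
```

This mirrors the two signals called out in the bullets: %time spent in GC and used heap after collection, whose upward drift under steady load is the classic leak indicator.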
Confidential, New York, NY
Performance Engineer
Responsibilities:
- Participating in sessions to analyze Business Requirements and the Functional Specifications.
- Studied high level design documents and flow charts and interacted with business analysts and functional managers to clarify issues upon business requirements.
- Attending / conducting daily status meeting to review overall status.
- Reporting daily test execution status to the team / weekly status to the test manager.
- Writing Test Plans and Test Cases corresponding to business rules and requirements.
- Coordinating the test effort with the various teams including developers, designers, architects, BA.
- Preparing Test scripts to run in the Performance environment.
- Implementing an automated performance test process using LoadRunner.
- Used Virtual User Generator (VuGen) to create scripts for the Web protocol.
- Enhanced the performance of the application by identifying bottlenecks and providing advanced performance tuning.
- Analyzing the test results to determine the performance of the business application and reporting.
- Monitored performance measurements such as end-to-end response time, network response time, server response time and middleware-to-server response time.
- Developed User-Acceptance Test scripts and assisted users in conducting UAT.
- Performing System, Functional and Regression Testing using Selenium.
- Involved in various performance testing such as Load, Stress and Volume testing using LoadRunner and Apache JMeter.
- Used Jenkins to automate the batch testing and Load testing.
- Run SQL queries to retrieve data from backend systems.
- Supporting testing during code deployments to production.
- Involved in checking the functionality of the system in pre/post production release processes.
- Responsible for monitoring infrastructure behavior using AppDynamics during load test execution to identify performance bottlenecks, if any.
- Used VTS for test data creation to avoid data mismatches.
- Updated the defect list and performed the regression testing on new version of the application.
- Worked on the multi-tier structure.
Environment: HP LoadRunner, Performance center, ApacheJmeter, Selenium, Jenkins, ALM, AppDynamics, Web(HTTP/HTML), Web Services.
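A minimal sketch of the kind of backend SQL validation query mentioned above, using an in-memory SQLite database as a stand-in; the table and column names are hypothetical:

```python
import sqlite3

# In-memory stand-in for the backend database; schema is illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, status TEXT)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [(1, "COMPLETE"), (2, "PENDING"), (3, "COMPLETE")],
)

# Typical post-test validation: confirm how many transactions completed,
# to cross-check the pass counts reported by the load tool.
(completed,) = conn.execute(
    "SELECT COUNT(*) FROM orders WHERE status = 'COMPLETE'"
).fetchone()
```

Against a real backend only the connection setup changes; the count-and-compare pattern stays the same.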
Confidential, Long Beach, CA
Performance Tester
Responsibilities:
- Created Performance Test Plans
- Designed and developed performance testing automation artifacts (scripts, functions, scenarios, processes) for simple to complex testing situations using HP Load Runner.
- Created scenarios such as basic schedule by load test/group and real-world schedule by load test/group as per the requirements in HP Performance Center.
- Created clear documentation for scripts for the benefit of peers not involved in the scripts' creation.
- Assessed functional analysis documents, business studies and business-related documents to create test plans and test strategies.
- Analyzed requirements, detailed design, and formulated test plan for the functional testing of the application.
- Developed Vuser scripts in Web HTTP/HTML and Web service protocols in Load Runner using Load Runner VuGen.
- Inserted transactions and checkpoints into LoadRunner Web VuGen scripts and parameterized, paced and correlated the scripts.
- Worked on handling the application's responses for positive and negative testing.
- Execution of test cases and interaction with the coding team to report and correct errors for every Version release.
- Responsible for Integration and Regression testing.
- Editing of automated scripts by inserting logical commands to handle complicated test scenarios.
Environment: DB2, CA Wily Introscope, SoapUI, Gomez, HP ALM, Web Services, Ajax TruClient, WSDL, VMware, Citrix, Windows, LoadRunner, Performance Center, JMeter, Java.
Confidential
QA/Performance Tester
Responsibilities:
- Involved in Functional Testing, GUI Testing, Compatibility Testing and Performance Testing.
- Gathered business requirements, studied the application, and collected information from developers and architects.
- Worked closely with Production Managers, Technical Managers and Business Managers in planning, scheduling, developing, and executing performance tests.
- Interacted with developers during testing for identifying memory leaks, fixing bugs and for optimizing server settings at web, app and database levels.
- Involved in Regression Testing using Selenium.
- Wrote LoadRunner scripts and enhanced them with C functions.
- Developed Vuser Scripts using LoadRunner Web HTTP/HTML protocols based on user workflows.
- Developed and maintained stress as well as load test scripts along with business cases for PeopleSoft Timesheets and Labor, Financials, HRMS and eRecruit.
- Involved in automation environment setup using Eclipse, Java, Selenium WebDriver Java language bindings and TestNG jars.
- Validated the scripts to make sure they executed correctly and met the scenario description.
- Analyzed results using LoadRunner Analysis tool.
Environment: HP LoadRunner, Selenium, TestNG, HP QC/ALM, HTML, UNIX/Windows.
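The result analysis described above typically centers on percentile response times (e.g. the 90th percentile reported by LoadRunner Analysis). A minimal sketch using the nearest-rank definition, one common convention; the sample data is illustrative:

```python
import math

def percentile(samples, p):
    """Nearest-rank percentile: the smallest sample such that at least
    p% of all samples are <= it."""
    ordered = sorted(samples)
    # Multiply before dividing to keep the rank computation exact here.
    rank = math.ceil(p * len(ordered) / 100)
    return ordered[rank - 1]

# Illustrative per-iteration response times (seconds) for one transaction.
resp_times = [0.8, 1.1, 0.9, 2.5, 1.0, 1.2, 0.7, 3.1, 1.4, 1.0]
p90 = percentile(resp_times, 90)
```

Percentiles are preferred over averages for SLA reporting because a few slow outliers inflate the mean while the 90th percentile reflects what most users actually see.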