
Performance Engineer Resume

CA

SUMMARY:

  • Around 5 years of diversified experience in software testing and quality assurance. Exposed to all phases of the Software Development Life Cycle (SDLC) and Software Testing Life Cycle (STLC). Possess strong analytical, verbal, and interpersonal skills with the ability to work independently.
  • Experience in testing web-based, client/server, and SOA applications.
  • Experience in performance testing on multi-tier environments across different software platforms.
  • Expertise in different testing methodologies like Agile, Scrum and waterfall.
  • Highly proficient in performance testing using LoadRunner, Performance Center, NeoLoad, and Apache JMeter.
  • Experience in various protocols in LoadRunner including Web (HTTP/HTML), Web Services, Citrix, RTE, Ajax TruClient, and SMTP.
  • Created performance scenarios and scripts for various types of tests (load, stress, baseline/benchmark, capacity).
  • Identified and prepared test cases for performance-critical business transactions.
  • Expertise in recording and coding VuGen scripts using different protocols in all types of environments.
  • Expertise in parameterization, manual correlation, run-time settings, and C scripting (a minimal sketch follows this list).
  • Excellent knowledge and skills in monitoring transaction response times, web server metrics, Windows/Linux system resources, web and application server metrics, database metrics, and J2EE performance.
  • Experience in analyzing performance bottlenecks, performing root cause analysis, and diagnosing server configuration problems.
  • Knowledge of Java Virtual Machine internals including class loading, threads, synchronization, and garbage collection.
  • Good knowledge of using VSTS for performance testing.
  • Experience in automating web tests using VSTS.
  • Good experience with Selenium IDE and writing Selenium WebDriver scripts in Java.
  • Good knowledge on SOA, DB, Siebel, PeopleSoft, SAP, Java, J2EE and .Net based applications.
  • Proficient in developing scripts using various samplers in Apache JMeter and executing various load tests.
  • Extensively used NeoLoad for creating and executing test scripts.
  • Executed manual test scripts and was responsible for tracking and logging defects using Quality Center/ALM.
  • Used APM tools to monitor web applications on Java and .NET platforms.
  • Extensive working experience in server monitoring using HP SiteScope, Dynatrace, AppDynamics, HP Diagnostics, Wily Introscope, Splunk, and Perfmon.
  • Used Wily Introscope to monitor the health of various applications.
  • Experience with Quality Center/HP Application Lifecycle Management (ALM), JIRA, and Bugzilla as test management and defect tracking tools.
  • Expertise in web services testing using SoapUI.
  • Expert in analyzing performance bottlenecks such as high CPU usage, memory leaks, buffer usage, queue length, connections, heap usage, thread usage, missing indexes, recursive call ratio, SQL query usage, round-trip times, and deadlocks.
  • Took thread dumps and heap dumps for finding and analyzing bottleneck areas.
  • Familiar with writing SQL queries in scripts to query the database.
  • Good experience in engaging with business contacts and stakeholders for requirements gathering and architecture review.
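
A minimal VuGen (C) sketch of the parameterization and manual correlation mentioned above, assuming a hypothetical login page that returns a dynamic sessionId value; the URL, boundaries, and parameter names are illustrative placeholders rather than details from a specific engagement:

    Action()
    {
        // Manual correlation: capture the dynamic session id from the login response
        web_reg_save_param("SessionId", "LB=sessionId=", "RB=\"", "NotFound=Error", LAST);

        lr_start_transaction("T01_Login");

        // {pUser} and {pPassword} are fed from a VuGen parameter file (parameterization)
        web_submit_data("login",
            "Action=http://app.example.com/login",
            "Method=POST",
            "Mode=HTML",
            ITEMDATA,
            "Name=username", "Value={pUser}",     ENDITEM,
            "Name=password", "Value={pPassword}", ENDITEM,
            LAST);

        lr_end_transaction("T01_Login", LR_AUTO);

        // Reuse the correlated value in later requests or for logging
        lr_output_message("Correlated session: %s", lr_eval_string("{SessionId}"));

        return 0;
    }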

TECHNICAL SKILLS:

Testing Tools: LoadRunner, Performance Center, IBM RPT, Silk Performer, Apache JMeter, QTP, Selenium, QC/ALM, JIRA.

LoadRunner Protocols: Web (HTTP/HTML), Web Services, Citrix, Oracle, NCA, SQL Scripting, JavaScript, ADO.NET, Ajax TruClient.

Monitoring Tools: Dynatrace, AppDynamics, Grafana, HP SiteScope, Wily Introscope, Splunk, Perfmon, Nmon

Programming Languages: C, C++, Java/J2EE, C#, Perl, Python.

Web/Application Servers: MS IIS, Apache, Tomcat, WebSphere, WebLogic.

Database: Oracle, DB2, SQL Server, MySQL

GUI: VB, JSP, Java, Applets, ASP, HTML

Other: Service-Oriented Architecture (SOA), Web Services, Fiddler, Firebug, SoapUI, WSDL, WCF.

PROFESSIONAL EXPERIENCE:

Confidential, CA

Performance Engineer

Responsibilities:

  • Interacted with business leads, solution architects, and application teams to collect non-functional requirements and develop production usage models for a multi-year rollout of a large-volume SOA platform.
  • Followed the Agile (Scrum) process; the performance validation process followed a 'Work Done & Ready to Go' approach from release to release and, more specifically, sprint by sprint.
  • Worked with developers to understand the code and application in order to develop the load scripts.
  • Involved in Mobile app performance testing for different iOS versions.
  • Created Performance Load Test detailed plan and Test Case Design Document with the input from developers and functional testers.
  • Involved in setting up the AWS performance environment in the cloud.
  • Used CloudWatch to monitor the EC2 instances.
  • Analyzed scalability, throughput, and load testing metrics against AWS test servers to ensure maximum performance per requirements.
  • Executed production testing using HP StormRunner/AWS and built the associated load test scenarios for cloud performance testing.
  • Developed scripts and executed load tests in production, simulating peak-day volume.
  • Gathered business requirements, analyzed the application load testing requirements, and developed performance test plans for DOTCOM/Web API and other enterprise applications.
  • Developed Test Plans, Test Scenarios, Test Cases, Test Summary Reports and Test Execution Metrics.
  • Developed robust Vuser scripts in LoadRunner using the Web (HTTP/HTML), Ajax TruClient, SAP, Web Services, and Citrix protocols.
  • Created test data needed for performance tests using service virtualization tools such as LISA and Mountebank.
  • Worked extensively with JSON/XML data and SOAP protocols in non-UI web services (SOA) testing (see the sketch after this list).
  • Responsible for setting up SiteScope monitors to monitor network activities and bottlenecks and to gather metrics from application and database servers.
  • Configured and used Dynatrace for performance monitoring; troubleshot bottlenecks found during performance testing by analyzing response times and profiling the application to locate the source of performance issues.
  • Added headers to the scripts and monitored the scripts using the Dynatrace client.
  • Monitored Metrics on Application server, Web server and database server.
  • Used Splunk to monitor and collect the metrics of Performance test servers.
  • Reported performance analysis graphs and reports collected from various performance tools and discussed bottlenecks such as memory leaks, JVM heap usage, CPU utilization, network time, page refresh time, and page rendering time.
  • Analyzed verbose JVM GC logs and heap dumps to find potential memory leak issues.
  • Responsible for Analysis, reporting and publishing of the test results.
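
For reference, a minimal Web/HTTP Vuser sketch of a non-UI JSON web service call of the kind described above; the endpoint, JSON fields, and parameter name are hypothetical placeholders:

    Action()
    {
        // Non-UI web service (SOA) request with a JSON payload
        web_add_header("Content-Type", "application/json");

        // Check for a success marker in the response body, not just an HTTP 200
        web_reg_find("Text=\"status\":\"OK\"", LAST);

        lr_start_transaction("T01_GetQuote");
        web_custom_request("getQuote",
            "URL=http://api.example.com/quote",
            "Method=POST",
            "Resource=0",
            "Mode=HTTP",
            "Body={\"customerId\":\"{pCustomerId}\"}",
            LAST);
        lr_end_transaction("T01_GetQuote", LR_AUTO);

        return 0;
    }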

Environment: HP LoadRunner 12.53, HP Performance Center 12.20, HTTP/HTML, Ajax TruClient, HP StormRunner, AWS, WebSphere, HP SiteScope, Dynatrace, Grafana, Windows, Linux, Excel, SQL, Oracle Database 11g/12c, Oracle SQL Developer, Mountebank, LISA, Splunk.

Confidential, NY

Performance Engineer

Responsibilities:

  • Participating in sessions to analyze Business Requirements and the Functional Specifications.
  • Studied high level design documents and flow charts and interacted with business analysts and functional managers to clarify issues upon business requirements.
  • Attending/conducting daily status meetings to review overall status.
  • Reporting daily test execution status to the team / weekly status to the test manager.
  • Writing Test Plans and Test Cases corresponding to business rules and requirements.
  • Coordinating the test effort with the various teams, including developers, designers, architects, and BAs.
  • Preparing Test scripts to run in the Performance environment.
  • Implementing automated performance test process using LoadRunner.
  • Enhanced the performance of the application by identifying the bottlenecks and providing advanced performance tuning.
  • Analyzing the test results to determine the performance of the business application and reporting.
  • Monitored performance measurements such as end-to-end response time, network response time, server response time, and middleware-to-server response time.
  • Developed User-Acceptance Test scripts and assisted users in conducting UAT.
  • Performing system, functional, and regression testing using Selenium.
  • Involved in various types of performance testing, such as load, stress, and volume testing, using LoadRunner and Apache JMeter (a content-check sketch follows this list).
  • Ran SQL queries to retrieve data from backend systems.
  • Supported testing during code deployments to production.
  • Involved in checking the functionality of the system in pre/post production release processes.
  • Responsible for monitoring infrastructure behavior using AppDynamics during load test execution to identify performance bottlenecks, if any.
  • Used VTS for test data creation to avoid data mismatches.
  • Updated the defect list and performed the regression testing on new version of the application.
  • Worked on the multi-tier structure.
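
A minimal VuGen (C) sketch of the kind of content check added to these load scripts, assuming a hypothetical order-submission page; under heavy load, error pages can still come back with HTTP 200, so the transaction is passed or failed on the response text:

    Action()
    {
        // Register a text check before the request; SaveCount records how many matches were found
        web_reg_find("Text=Order Confirmed", "SaveCount=ConfirmCount", LAST);

        lr_start_transaction("T03_SubmitOrder");
        web_submit_data("submitOrder",
            "Action=http://app.example.com/order/submit",
            "Method=POST",
            "Mode=HTML",
            ITEMDATA,
            "Name=itemId", "Value={pItemId}", ENDITEM,
            LAST);

        if (atoi(lr_eval_string("{ConfirmCount}")) > 0) {
            lr_end_transaction("T03_SubmitOrder", LR_PASS);
        } else {
            lr_error_message("Order confirmation not found for item %s",
                             lr_eval_string("{pItemId}"));
            lr_end_transaction("T03_SubmitOrder", LR_FAIL);
        }

        return 0;
    }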

Environment: HP LoadRunner 11/11.5, Performance Center, Apache JMeter, ALM, AppDynamics, Web (HTTP/HTML), NCA, Web Services.

Confidential, Long Beach, CA

Performance Tester

Responsibilities:

  • Created performance test plans.
  • Designed and developed performance testing automation artifacts (scripts, functions, scenarios, processes) for simple to complex testing situations using HP LoadRunner.
  • Created scenarios such as basic schedule (by load test/by group) and real-world schedule (by load test/by group) as per the requirements in HP Performance Center.
  • Created clear documentation for scripts for the benefit of peers who were not involved in the scripts' creation.
  • Assessed functional analysis documents, business studies, and business-related documents to create test plans and test strategies.
  • Analyzed requirements and detailed designs, and formulated the test plan for functional testing of the application.
  • Worked on handling the application response for positive and negative testing.
  • Executed test cases and interacted with the development team to report and correct errors for every version release.
  • Responsible for Integration and Regression testing.
  • Edited automated scripts by inserting logical commands (conditions, loops) to handle complicated test scenarios (see the sketch below).
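
A minimal VuGen (C) sketch of the kind of logical commands inserted into the scripts, assuming a hypothetical catalogue page that lists item ids; the URL and boundaries are placeholders:

    Action()
    {
        int i, count;

        // ORD=All captures every match into a parameter array
        web_reg_save_param("ItemIds", "LB=itemId=", "RB=&", "ORD=All", "NotFound=Warning", LAST);

        web_url("catalogue",
            "URL=http://app.example.com/catalogue",
            "Resource=0",
            "Mode=HTML",
            LAST);

        count = lr_paramarr_len("ItemIds");
        if (count == 0) {
            lr_error_message("No items returned - skipping detail pages");
            return 0;
        }

        // Loop over the first few captured ids and open each detail page
        for (i = 1; i <= count && i <= 3; i++) {
            lr_save_string(lr_paramarr_idx("ItemIds", i), "CurrentItem");

            lr_start_transaction("T02_ItemDetail");
            web_url("itemDetail",
                "URL=http://app.example.com/item?itemId={CurrentItem}",
                "Resource=0",
                "Mode=HTML",
                LAST);
            lr_end_transaction("T02_ItemDetail", LR_AUTO);
        }

        return 0;
    }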

Environment: DB2, CA Wily Introscope, Dynatrace, SoapUI, Gomez, HP ALM, Web Services, Ajax TruClient, WSDL, ESB, VMware, Siebel CRM, Citrix, Windows, LoadRunner 12.02, Performance Center, JMeter, Java.

Confidential, Miami, FL

QA/Performance Tester

Responsibilities:

  • Involved in Functional Testing, GUI Testing, Compatibility Testing and Performance Testing.
  • Gathered business requirements, studied the application, and collected information from developers and architects.
  • Worked closely with Production Managers, Technical Managers and Business Managers in planning, scheduling, developing, and executing performance tests.
  • Interacted with developers during testing for identifying memory leaks, fixing bugs and for optimizing server settings at web, app and database levels.
  • Involved in Regression Testing using Selenium.
  • Wrote LoadRunner scripts and enhanced them with custom C functions (see the sketch after this list).
  • Developed Vuser Scripts using LoadRunner Web HTTP/HTML protocols based on user workflows.
  • Developed and maintained stress as well as load test scripts along with business cases for PeopleSoft Timesheets and Labor, Financials, HRMS and eRecruit.
  • Involved in automation environment setup using Eclipse, Java, Selenium WebDriver Java language bindings, and TestNG JARs.
  • Validated the scripts to make sure they executed correctly and met the scenario descriptions.
  • Analyzed results using LoadRunner Analysis tool.

Environment: HP LoadRunner, Selenium, TestNG, HP QC/ALM, HTML, UNIX/Windows.
