
Senior Performance Engineer Resume


Alpharetta, GA

SUMMARY

  • Over 9 years of experience in performance testing as a Performance Engineer, Performance Consultant, Performance Tester, QA Analyst, and QA Tester.
  • Worked across domains including Banking, Telecom, Finance, Healthcare, and Retail.
  • Extensive experience using LoadRunner protocols such as Web (HTTP/HTML), Web Services, Oracle NCA, TruClient, Siebel, and Siebel Web.
  • Strong experience in application testing and quality assurance of web-based, client-server, SOA (Service-Oriented Architecture), database, and mobile applications.
  • Expert in performance, manual, and automation testing; experienced with tools such as HP Quality Center, ALM, and LoadRunner.
  • Experienced with tools such as JMeter, SOASTA, and SoapUI.
  • Thorough understanding of the Software Development Life Cycle (SDLC) and QA methodologies.
  • Experienced in working with Agile and Scrum methodologies.
  • Well experienced in using tools such as Selenium WebDriver, Cucumber, Dynatrace, QuickTest Professional (QTP), PerfMon, and HP SiteScope.
  • Experienced in working with MongoDB Ops Manager.
  • Experienced in load testing MongoDB using JMeter.
  • Involved in performance tuning of applications using various monitoring tools, covering bottleneck identification, tuning, deployment, and retesting.
  • Maintained close interaction with the architecture team for performance tuning.
  • Thorough exposure to technologies such as Java, Oracle, web services, WebSphere, and SOAP.
  • Strong in different types of performance testing, including scalability, stability, volume, spike, smoke, load, stress, database, and failover testing.
  • Extensive experience determining system configurations for LoadRunner; involved in commercial negotiations with HP for HP LoadRunner on behalf of the client.
  • Expert in performance testing under Agile and Waterfall methodologies; raised and tracked defects using JIRA and QC.
  • Attended Scrum stand-up meetings and took part in various project sprints.
  • Expertise in monitoring tools such as SiteScope, Dynatrace, Introscope, and Splunk.
  • Experience writing complex SQL queries to extract data from Oracle, MS SQL Server, and IBM DB2 databases.
  • Skilled at carrying out back-end testing using SQL queries, TOAD, and Teradata SQL Assistant.
  • Skilled in load, stress, smoke, regression, endurance, integration, and UAT testing (see the load-generation sketch after this list).
  • Expert in analyzing business requirements and writing test cases.
  • Proficient in the installation and administration of Performance Center and Quality Center.
  • Strong communication and documentation skills within the QA process.
  • Thorough in tuning applications to improve response times, queuing, and overall performance using LoadRunner.
  • Experienced in using JMeter for database back end testing with JDBC and ODBC connections.
  • Strong analytical, logical, presentation and communication skills.
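
As a rough illustration of the load and stress testing noted above (the idea LoadRunner and JMeter implement with Vusers/threads, ramp-up, and scenario schedules), the Java sketch below runs a small pool of "virtual users" against a target and reports the average response time. The URL, user count, and iteration count are hypothetical placeholders; a real test would also ramp users up gradually and collect far richer metrics.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.List;
import java.util.concurrent.*;

public class MiniLoadTest {
    // Hypothetical settings; a real scenario would use ramp-up and much higher volumes.
    static final String TARGET_URL = "https://example.com/health";
    static final int VIRTUAL_USERS = 10;
    static final int ITERATIONS_PER_USER = 5;

    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder(URI.create(TARGET_URL)).GET().build();
        ExecutorService pool = Executors.newFixedThreadPool(VIRTUAL_USERS);
        List<Long> latencies = new CopyOnWriteArrayList<>();

        Runnable vuser = () -> {
            for (int i = 0; i < ITERATIONS_PER_USER; i++) {
                long start = System.nanoTime();
                try {
                    client.send(request, HttpResponse.BodyHandlers.discarding());
                    latencies.add((System.nanoTime() - start) / 1_000_000); // elapsed in ms
                } catch (Exception e) {
                    System.err.println("Request failed: " + e.getMessage());
                }
            }
        };

        for (int u = 0; u < VIRTUAL_USERS; u++) pool.submit(vuser);
        pool.shutdown();
        pool.awaitTermination(5, TimeUnit.MINUTES);

        double avg = latencies.stream().mapToLong(Long::longValue).average().orElse(0);
        System.out.printf("Requests: %d, avg response time: %.1f ms%n", latencies.size(), avg);
    }
}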

TECHNICAL SKILLS

Operating Systems: Windows, Solaris and Linux

Languages: Java, JSP, HTML, DHTML, Visual Basic, Oracle, C, C++, SQL, XML, .NET, C#, ASP

Databases: Oracle 9i/10g, DB2, MS SQL Server, MS Access, MySQL

GUI: VB, JSP, ASP, HTML

Web Related: XML, XSLT, XPath, XSL, IIS (7.0/6.0/5.0/4.0), XHTML, SOAP, WSDL, UDDI, XML Web Services, DHTML

Testing Tools: LoadRunner 8.x/9.x/11.x/12.x, HP Performance Center, HP Quality Center/ALM, JMeter, WebLOAD, QuickTest Professional (QTP), WinRunner

LoadRunner Protocols: Web (HTTP/HTML), Web Services, Citrix, Oracle NCA, SQL Scripting, JavaScript, ADO.NET, Ajax TruClient, and Mobile HTTP/HTML

Web/Application Servers: MS IIS, Apache Tomcat, WebSphere, WebLogic

Monitoring Tools: HP SiteScope, HP Diagnostics, Wily Introscope, PerfMon, and Dynatrace

Other tools: MS Project, MS Office, MS Visio

PROFESSIONAL EXPERIENCE

Confidential, Alpharetta, GA

Senior Performance Engineer

Responsibilities:

  • Performed performance, manual, and automation testing using tools such as HP Quality Center, ALM, and LoadRunner.
  • Used JMeter's Regular Expression Extractor post-processor to parameterize input values and correlate dynamic, system-generated values (see the correlation sketch after this list).
  • Worked extensively on web services using SoapUI and checked the performance of UI applications.
  • Identified bottlenecks and resolved issues on Linux servers using various monitors.
  • Implemented Scrum methodology, using JIRA to track and close bugs; actively participated in sprint meetings.
  • Used HP Quality Center and ALM to ensure traceability and track the progress of testing efforts.
  • Used Jenkins to set up CI for new branches, build automation and plugin management.
  • Worked as Software QA tester with full system development lifecycle including designing, developing, and implementing test plans and test cases.
  • Managed performance test activities for multiple projects.
  • Worked on two cloud-based web applications and developed automation scripts using JavaScript, Cucumber.js, and Protractor.
  • Handled project installation and setup for Protractor, Cucumber, and Selenium.
  • Designed and executed test cases with Selenium automation and worked on functional and regression testing.
  • Used BlazeMeter to convert LoadRunner VuGen scripts to JMeter scripts.
  • Used HP Diagnostics, Insight, and Dynatrace Ajax for server-side monitoring during load, stress, longevity, and regression tests.
  • Incorporated code-level optimization changes in the web application based on Dynatrace monitoring during performance testing.
  • Created component-level scripts for web applications using LoadRunner, JMeter, and Visual Studio.
  • Strong in creating test plans, test cases and test scenarios based on non-functional business requirements.
  • Created and monitored dashboards on Wily Introscope for enterprise applications.
  • Used LoadRunner Analysis to analyze performance results and to monitor transactions per second, response time, hits per second, throughput, Windows resources, and database server resources.
  • Expert in performance testing, load testing, and stress testing using LoadRunner and JMeter.
  • Proficient with Performance Center and monitoring tools such as SiteScope and Topaz.
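
As a rough illustration of the correlation work described above (the job that JMeter's Regular Expression Extractor and LoadRunner's correlation functions automate), the sketch below captures a dynamic token from one HTTP response with a regular expression and replays it in the next request. The URLs, form field names, and token pattern are hypothetical.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class CorrelationSketch {
    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();

        // Step 1: load the login page, which embeds a per-session token (hypothetical field name).
        HttpResponse<String> loginPage = client.send(
                HttpRequest.newBuilder(URI.create("https://example.com/login")).GET().build(),
                HttpResponse.BodyHandlers.ofString());

        // Step 2: extract the dynamic value -- the same task a Regular Expression Extractor performs in JMeter.
        Matcher m = Pattern.compile("name=\"csrf_token\" value=\"([^\"]+)\"").matcher(loginPage.body());
        if (!m.find()) {
            throw new IllegalStateException("Dynamic token not found; correlation rule needs updating");
        }
        String token = m.group(1);

        // Step 3: reuse the correlated value in the follow-up request instead of a hard-coded one.
        HttpRequest post = HttpRequest.newBuilder(URI.create("https://example.com/login"))
                .header("Content-Type", "application/x-www-form-urlencoded")
                .POST(HttpRequest.BodyPublishers.ofString(
                        "username=testuser&password=secret&csrf_token=" + token))
                .build();
        HttpResponse<String> result = client.send(post, HttpResponse.BodyHandlers.ofString());
        System.out.println("Login response status: " + result.statusCode());
    }
}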

Environment: LoadRunner 12.50/12.53/12.55, JMeter 3.3/4.0, JIRA, Java, Cucumber, SiteScope, Performance Center, SoapUI, TOAD, SQL, UNIX, HP Quality Center, ALM.

Confidential, Washington DC

Senior Performance Engineer

Responsibilities:

  • Worked on Performance Center to create scenarios and execute load tests by parameterization and correlation with the help of VuGen.
  • Used various LoadRunner protocols to create and test scripts: Web HTTP/HTML, TruClient, Web Services, Siebel, and Siebel Web.
  • Expert in using LoadRunner for application tuning to improve response times, queuing, and overall performance.
  • Developed automation test scripts supporting regression testing using the Selenium WebDriver API and Python on the PyUnit framework.
  • Used parameterization, correlation, and content checks to develop scripts in LoadRunner (see the data-driven sketch after this list).
  • Used Eggplant Functional, JMeter, and Selenium to build proofs of concept for applications.
  • Used Dynatrace to measure web application performance in the test environment and capture performance metrics.
  • Worked with SOASTA and JMeter to create and run performance tests and generate reports.
  • Involved in creating and maintaining projects using HP Performance Center 9.52/11.52, SOASTA CloudTest, and JMeter.
  • Created SOASTA CloudTest scripts, monitored the test runs, and performed various performance tests.
  • Implemented parameterization in JMeter both manually and through data-driven configuration, and recorded scripts in JMeter.
  • Generated various graphs and reports in Splunk.
  • Used JProfiler to resolve performance bottlenecks, pin down memory leaks, and understand threading issues.
  • Implemented performance scripts for overhead, scalability and stability.
  • Performed reliability testing to check the stability of the system under heavy workload.
  • Experienced in defect tracking and test case execution using ALM QC.
  • Performed various types of performance testing, such as baseline, smoke, stress, load, and endurance testing, depending on the requirement.
  • Implemented work using Agile and Waterfall methods.
  • Initiated and worked with the business to discover the potential of ALM (Application Lifecycle Management).
  • Monitored CPU and memory utilization using UNIX utilities and Axibase.
  • Generated scripts using the Web HTTP/HTML, Siebel Web, Oracle, and Web Services protocols.
  • Applied Agile methodology to SFDC and Siebel releases.
  • Created and executed SSO login tests for SAP and Siebel applications.
  • Participated in planning and coordination across the entire QA life cycle.
  • Actively participated in performance test results analysis to establish the individual benchmarks and baseline for J2EE and .NET applications.
  • Worked in an Agile methodology, taking part in Scrum and sprint planning meetings.
  • Monitored and drilled down into issues using tools such as PerfMon, HP SiteScope, Dynatrace, and New Relic.
  • Used HP ALM for application lifecycle management and defect tracking.
  • Used HP Quality Center for test execution and JIRA to report defects and collaborate with the development team.
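
A small sketch of the data-driven parameterization mentioned above (the same idea behind LoadRunner parameter files and JMeter's CSV Data Set Config): test data is read from a CSV file and substituted into each request so every iteration uses different values. The file name, columns, and endpoint are hypothetical.

import java.net.URI;
import java.net.URLEncoder;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

public class ParameterizedSearchTest {
    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();

        // accounts.csv (hypothetical): one accountId,region pair per line, no header row.
        List<String> rows = Files.readAllLines(Path.of("accounts.csv"));

        for (String row : rows) {
            String[] cols = row.split(",");
            String accountId = cols[0].trim();
            String region = cols[1].trim();

            // Substitute the parameterized values into the request, much like {param} tokens in a VuGen script.
            String url = "https://example.com/api/accounts/"
                    + URLEncoder.encode(accountId, StandardCharsets.UTF_8)
                    + "?region=" + URLEncoder.encode(region, StandardCharsets.UTF_8);

            long start = System.nanoTime();
            HttpResponse<String> resp = client.send(
                    HttpRequest.newBuilder(URI.create(url)).GET().build(),
                    HttpResponse.BodyHandlers.ofString());
            long elapsedMs = (System.nanoTime() - start) / 1_000_000;

            System.out.printf("account=%s region=%s status=%d time=%dms%n",
                    accountId, region, resp.statusCode(), elapsedMs);
        }
    }
}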

Environment: LoadRunner 12.00/12.50/12.53, JIRA, JMeter 3.0/3.1, SQL, Dynatrace, SoapUI, AppDynamics, Java, Linux, SiteScope.

Confidential, Cranston, RI

Performance Engineer

Responsibilities:

  • Used LoadRunner to test Python script integration; monitored Python-based applications with AppDynamics.
  • Used the LoadRunner Controller to run scripts and verify they worked as expected across multiple load generators.
  • Used JMeter 2.11 and the Silk tool for recording scripts.
  • Used JMeter and LoadRunner to check application performance; designed and executed test scenarios.
  • Used JIRA to track the test summary and the execution of scenarios in the test plan document; found and logged defects in Rally/JIRA.
  • Followed an Agile test process with Scrum methodology.
  • Analyzed reports, resolved bottleneck issues, and troubleshot using Performance Center with the help of monitoring tools and Splunk performance metrics.
  • Executed code profiling and performance tuning along with database optimization and tuning.
  • Used HP SiteScope for server monitoring of the Export Control backup environment and HP BSM for application monitoring.
  • Executed internal testing using LoadRunner and Performance Center for ERP and non-ERP applications within the team.
  • Reported weekly and monthly on the number of scripts created, number of tests executed, and average completion time for EC, ERP, and non-ERP applications.
  • Prepared VuGen scripts in LoadRunner for each end-to-end functional scenario.
  • Monitored CPU and memory usage alongside throughput, hits/sec, running Vusers, and error graphs during load and stress tests.
  • Used HP LoadRunner to setup multiple load generators to generate the load.
  • Tested web service APIs with the SoapUI tool.
  • Executed ETL jobs during performance testing to monitor the database servers.
  • Used HP Diagnostics to configure and monitor servers.
  • Created test cases, test strategies, test plans, and test scenarios to meet the SLAs for non-functional business requirements.
  • Used Eclipse with Java, Selenium, and TestNG to test GUI functionality per the test cases (see the Selenium/TestNG sketch after this list).
  • Configured AppDynamics for mission-critical applications, including application server configuration for AppDynamics monitoring.
  • Migrated existing SiteScope and Dynatrace monitoring to AppDynamics; developed Dynatrace dashboards and troubleshot issues with Dynatrace.
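
A minimal sketch of the Selenium/TestNG GUI checks referenced above, assuming a hypothetical login page and element IDs: it opens the page, performs a login, and asserts on the landing-page title. It requires the Selenium and TestNG libraries and a chromedriver on the PATH.

import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.testng.Assert;
import org.testng.annotations.AfterClass;
import org.testng.annotations.BeforeClass;
import org.testng.annotations.Test;

public class LoginPageTest {
    private WebDriver driver;

    @BeforeClass
    public void setUp() {
        // Assumes chromedriver is available on the PATH.
        driver = new ChromeDriver();
    }

    @Test
    public void validLoginShowsDashboard() {
        driver.get("https://example.com/login");                  // hypothetical URL
        driver.findElement(By.id("username")).sendKeys("qauser");  // hypothetical element IDs
        driver.findElement(By.id("password")).sendKeys("secret");
        driver.findElement(By.id("loginButton")).click();

        // GUI validation: the dashboard title should be displayed after a successful login.
        Assert.assertEquals(driver.getTitle(), "Dashboard");
    }

    @AfterClass
    public void tearDown() {
        if (driver != null) {
            driver.quit();
        }
    }
}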

Environment: LoadRunner 11.5/12.0/12.2, JMeter 2.11/2.12, SQL, Java, AppDynamics, Dynatrace, JIRA, SiteScope, Silk Performer, Python, Selenium, Performance Center, Splunk, HP ALM, Wily Introscope, Quality Center.

Confidential, Columbus, Ohio

Performance Tester

Responsibilities:

  • Used rendezvous points in LoadRunner to generate load on the server, and used parameterization and correlation while measuring its performance.
  • Generated scripts in LoadRunner using VuGen, defined scenarios according to client requirements, and ran the scripts using the Controller.
  • Used different protocols, including Web, Web Services, Ajax, Ajax TruClient, Citrix, and RTE.
  • Performed performance testing with different tools, including JMeter, HP LoadRunner, and Oracle performance testing tools.
  • Created performance scripts in LoadRunner for applications such as sales tools.
  • Used Linux virtual servers to conduct load testing and database testing.
  • Used automation scripting to perform regression testing.
  • Used performance monitoring to analyze graphs such as response time, hits/sec, TPS, and throughput.
  • Worked to meet the SLA (Service Level Agreement) for the application.
  • Held regular meetings with developers, database analysts, and the business to meet application performance goals.
  • Used SQL Navigator to write and execute SQL commands.
  • Used QTP regression scripts to check PeopleSoft functionality.
  • Used ramp-up and ramp-down concepts in LoadRunner.
  • Used Wily Introscope for root cause analysis.
  • Monitored performance resources across the Oracle WebLogic, Tomcat, and AIX layers.
  • Expertise in SQL, LoadRunner 11.5, Oracle, JMeter 2.7, QTP 11.5, and Dynatrace.
  • Used Mercury LoadRunner to develop and execute the LoadRunner test harness.
  • Used the Web HTTP/HTML, Sybase, Oracle 2-Tier, and RTE protocols in LoadRunner.
  • Designed and used load test, stress test to validate the application.
  • Used LoadRunner to perform scripts for OMS/AMSS/CRM applications.
  • Used JDBC and ODBC connections in JMeter for database testing (see the JDBC sketch after this list).
  • Used the HTTP protocol to create the load test scenarios.
  • Monitored TIBCO services in User Acceptance Testing (UAT) and Vendor Integration Testing (VIT) using TIBCO server operations.
  • Drained JMS queues using Hermes.
  • Performed root cause analysis to resolve data quality issues.
  • Set up TOAD database connections with TNS name priorities and performed extensive back-end database testing with complex PL/SQL queries in TOAD.
  • Handled multiple projects for QA and UAT team.
  • Used HP Quality Center to consolidate test requirements based on the design specifications and upload them.
  • Reviewed system requirement specifications, technical specifications, and functional design documents to better analyze and understand the project.
  • Used Perl to develop and deploy automation test scripts.
  • Used JIRA to report defects.
  • Used the SPIRA defect tracking system to track defects.
  • Identified bottleneck issues and analyzed the results.
  • Monitored memory, thread usage, and network utilization on UNIX servers.
  • Worked with the SFTP, FTPS, and HTTPS protocols using client tools such as Core FTP and PSFTP for SFTP.
  • Worked with different teams (DBA, AD) to troubleshoot issues.
  • Used Dynatrace and JProfiler to monitor and test the application.
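
A brief sketch of the kind of back-end database check described above, here as plain JDBC in Java rather than through a JMeter JDBC sampler or TOAD session. The connection string, credentials, table, and query are hypothetical, and the Oracle JDBC driver is assumed to be on the classpath.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class OrderCountCheck {
    public static void main(String[] args) throws Exception {
        // Hypothetical Oracle connection details.
        String url = "jdbc:oracle:thin:@//dbhost:1521/ORCLPDB1";

        try (Connection conn = DriverManager.getConnection(url, "qa_user", "qa_password");
             PreparedStatement stmt = conn.prepareStatement(
                     "SELECT COUNT(*) FROM orders WHERE status = ? AND created_date >= SYSDATE - 1")) {

            stmt.setString(1, "FAILED");
            try (ResultSet rs = stmt.executeQuery()) {
                rs.next();
                int failedOrders = rs.getInt(1);
                // Data integrity check: flag the run if failed orders appeared during the test window.
                if (failedOrders > 0) {
                    System.err.println("Back-end check failed: " + failedOrders + " failed orders in the last day");
                } else {
                    System.out.println("Back-end check passed: no failed orders in the last day");
                }
            }
        }
    }
}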

Environment: TOAD, TIBCO, Quality Center, LoadRunner 11.5/12.0, UNIX, Dynatrace, JIRA, SPIRA, SQL, JMeter 2.5.1/2.6/2.8/2.9/2.10

Confidential

QA tester

Responsibilities:

  • Defined goals and objectives based on client requirements.
  • Monitored the HP LoadRunner Controller.
  • Used HP LoadRunner to include transaction points and rendezvous points in Vuser scripts.
  • Applied LoadRunner and system administration knowledge to enhance production testing.
  • Prepared scripts using JMeter.
  • Monitored logs in VuGen and the Controller, observed performance, errors, messages, and server logs, and debugged issues.
  • Performed general automation for Oracle J2EE applications, including maintenance and execution.
  • Performed QTP programming and VB scripting with the help of SoapUI validation.
  • Used SOAP and WSDL to develop web services and test back-end systems.
  • Executed and analyzed test results.
  • Maintained direct contact with the client while performance testing was in progress to monitor the system and prepare for optimization.
  • Provided LOE (level of effort) estimates to the project manager based on the requirements.
  • Identified and handled bottlenecks through performance tuning and recorded recommendations.
  • Involved in ETL (extract, transform, and load) testing for data transformation from Skill Port to SURE.
  • Prepared Decision Analysis and Resolution (DAR) for the performance checklist.
  • Prepared monthly dashboards and weekly status reports.
  • Used APM tools such as Dynatrace and AppDynamics to monitor applications and find performance issues.
  • Developed and executed functional and regression test scripts using Selenium WebDriver.
  • Executed test plans, SQL, and PL/SQL, and enhanced the software to meet the system requirements.
  • Created MS Excel and MS Access reports and extracted data from the Oracle system.

Environment: HP LoadRunner 9.52, SQL, PL/SQL, Selenium WebDriver, JMeter 2.3.4, QTP 11.0, J2EE, ETL, SoapUI, Dynatrace.

  • Used VuGen (Virtual User Generator) to develop scripts and perform testing.
  • Used the Controller to create goal-oriented scenarios and run them in LoadRunner.
  • Used parameterization and correlation while generating scripts to capture dynamic values and avoid hard-coded values.
  • Created scenarios for different kinds of performance testing, including load, stress, endurance, and smoke tests.
  • Contributed across the SDLC (System Development Life Cycle) in performance testing, management, analysis, prototyping, and troubleshooting.
  • Used LoadRunner diagnostic tools relating to the Controller and agent machines.
  • Performed automation tasks, including smoke test executions covering 8 applications.
  • Used the Web Services protocol for performance testing of web services (see the SOAP request sketch after this list).
  • Worked with QTP (QuickTest Professional) on functional and regression testing.
  • Used QTP and VB scripting for SoapUI validation.
  • Delivered application test reports, including AWR and ASH reports.
  • Implemented automation testing with different automation tools depending on the project.
  • Worked with the development team to capture performance issues and held regular meetings with developers to discuss them.
  • Reported bugs in JIRA and worked with developers and business analysts to resolve them.
  • Performed GUI testing, including validation and screen display checks.
  • Performed manual testing to assess the usability of the application.
  • Performed regression testing on different levels of the management software.
  • Used SQL queries to test data integrity by querying the database.
  • Submitted mainframe JCL, using TSO and the ISPF editor, to extract test data.
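
A small sketch of the kind of SOAP web-service check otherwise driven through SoapUI: a SOAP 1.1 envelope is posted over HTTP and the response is inspected for a fault. The endpoint, SOAPAction header, and operation are hypothetical; a real check would be generated from the service's WSDL.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class SoapSmokeTest {
    public static void main(String[] args) throws Exception {
        // Hypothetical endpoint and operation.
        String endpoint = "https://example.com/services/AccountService";
        String envelope =
                "<soapenv:Envelope xmlns:soapenv=\"http://schemas.xmlsoap.org/soap/envelope/\" "
              + "xmlns:acc=\"http://example.com/account\">"
              + "  <soapenv:Body>"
              + "    <acc:GetAccountStatus><acc:accountId>12345</acc:accountId></acc:GetAccountStatus>"
              + "  </soapenv:Body>"
              + "</soapenv:Envelope>";

        HttpRequest request = HttpRequest.newBuilder(URI.create(endpoint))
                .header("Content-Type", "text/xml; charset=utf-8")
                .header("SOAPAction", "\"GetAccountStatus\"")
                .POST(HttpRequest.BodyPublishers.ofString(envelope))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());

        // Basic validation: HTTP 200 and no SOAP Fault element in the response body.
        boolean ok = response.statusCode() == 200 && !response.body().contains("Fault");
        System.out.println(ok ? "SOAP check passed" : "SOAP check failed:\n" + response.body());
    }
}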

Environment: QTP 10.0, VB, SoapUI, AWR 10.1.0.5, ASH, Oracle 10g, RTM, LoadRunner 9.51/9.52/11.0, JCL, TSO, ISPF, JAWS JFW 10.0, SQL 2008.
