Lead Performance Engineer Resume

Washington, DC

SUMMARY

  • 8+ years of experience in Performance Engineering and Performance Testing using tools such as HP LoadRunner, JMeter, Performance Center, BlazeMeter, and NeoLoad
  • Proficient in multiple phases of testing, including Sanity, Functional, GUI, Regression, Integration, System, Performance, and User Acceptance Testing (UAT)
  • Extensive hands-on experience with the Web (HTTP/HTML), TruClient (Firefox/IE), Citrix, and Web Services protocols
  • Ability to identify non-functional requirements through business requirements analysis and application architecture information.
  • Experience in creating load models for applications and their service level agreements by working with the project team
  • Work with various project teams to determine performance testing needs and goals, and review and finalize deliverables and documentation
  • Extensive experience on monitoring tools like Dynatrace, Perfmon, Splunk, Wily Introscope, and AppDynamics
  • Extensive experience with end-to-end Performance Engineering/Testing activities: requirement-gathering discussions with business teams, creating test plan documents, selecting the appropriate test type for the requirement, creating test scripts and scenarios, conducting test executions with support teams, analyzing results, creating reports, walking business/application teams through results, providing recommendations, and creating defects
  • Good experience in resource planning for upcoming projects; active participant in task-allocation planning meetings with the team
  • Expertise in analyzing scenario logs and graphs to find application issues and working with stakeholders to resolve them
  • Extensive performance testing experience across a variety of applications, including Citrix, web services, and REST APIs

TECHNICAL SKILLS

Operating systems: Windows 2000, Windows NT, AIX, UNIX, Red Hat Linux, Solaris

Environment: Web and app servers (WebLogic, WebSphere, IIS, Apache), LDAP, MTX, SQL Server, MQ Series (IBM and MS)

Databases: MS SQL Server, Oracle, IBM DB2

Languages: Java, JSP, HTML, DHTML, Visual Basic, Oracle, C, C++, SQL, XML, .NET, C#, ASP

Testing tools: LoadRunner, JMeter, Performance Center, QTP

Monitoring tools: Splunk, HP SiteScope, CA Wily Introscope, New Relic, Dynatrace, AppDynamics, NMON, LPAR2RRD

Methodologies: RUP, Performance Engineering, CMM, TQM, Quality Assurance

PROFESSIONAL EXPERIENCE

Confidential - Washington DC

Lead Performance Engineer

Responsibilities:

  • Supervised and created comprehensive test plans for Performance testing and Regression testing
  • Verified and validated test data through VuGen/LoadRunner scripts, running smoke tests before performance test runs to confirm data integrity and system scalability
  • Supervised the system test plan and test case walkthroughs with application partners
  • Oversaw routine generation of defect metrics and communicated them to upper management on a timely basis
  • Performed extensive hands-on testing of the claim payments system for routine enhancements supporting the processing of 150 million claims annually
  • Performed load and stress testing of COVID-19-related procedure code and diagnosis code changes across different applications
  • Organized and conducted daily touch-point meetings during the test cycle and conducted post-deployment project reviews
  • Worked with microservice-based architectures; automated REST APIs using REST Assured (Java), Python, and Postman, and SOAP APIs using SoapUI (see the REST Assured sketch after this list)
  • Involved in Google Cloud environment setup
  • Analyzed Requirements Specification; created Requirement Traceability Matrix and contributed to developing the test strategy
  • Tracked all bugs using Quality Center and reviewed the status of test execution by tracing defects to requirements
  • Reported and tracked the test results for test cases using Quality Center
  • Executed black box, volume, system, stress, user acceptance, load, performance, and regression testing
  • Led stress testing efforts using the StresStimulus tool for on-premises architecture
  • Developed Web Vuser scripts to capture end-user activities for web-based applications using LoadRunner
  • Used correlation and parameterization to enhance Web Vusers
  • Generated volume, time, and stress graphs to support performance tuning of the web, application, and database servers
  • Used LoadRunner Analysis to develop Microsoft Word reports so higher management could easily understand LoadRunner results
  • Identified system resource usage and network and application bottlenecks using LoadRunner Analysis reports
  • Used LoadRunner to run successful tests of the application with 100, 200, and 500 virtual users in production prior to the application going live
  • Performed a stress test with a 100k-record file against a stored procedure in JMeter by adding a JDBC Connection Configuration to the thread group along with a JDBC Request sampler (a plain-JDBC sketch of this call follows this list)
  • Provided ongoing updates on completed and anticipated testing activities, problems encountered, and test results to each application's project status pages throughout the testing cycle
  • Checked mainframe systems for performance degradation caused by running batch jobs and turned those jobs to low priority as needed
  • Used LPAR2RRD to view memory and CPU consumption by the WAS pool, the UDB pool, and the servers residing in those pools.
  • Assisted the monitoring support team in the installation and integration of various applications to AppDynamics.
  • Checked AppDynamics to investigate application slowness and drilled down to whether it was caused by a SQL query or by queues processing messages slowly.
  • Assisted the monitoring team in onboarding data to Splunk using Splunk forwarders, syslog, and API integration with DB Connect
  • Troubleshot, modified, and created Splunk reports, alerts, and dashboards, including dashboards with tokens/input filters
  • Used JavaMelody to check the behavior of Java-based application servers.
  • Used Grafana to monitor user behavior, application behavior, and errors during load testing to gather more metrics on application performance under load
  • Ran DB2 queries to check record entries and to retrieve data to use as performance test data
  • Worked with Development / Configuration / Architects to troubleshoot performance issues we encountered (WAS / LDAP / CICS / MQ Series / EIAM / DB2 / AIX LPARs).
  • Dramatically improved quality by increasing performance bug identification prior to production releases.
  • Prepared and delivered periodic status reports on the progress of individual projects to customers and internal program and project managers
  • Assisted in the planning and development of testing schedule, test plans, and test cases for UAT testing
  • Validated data against the backend database with complex SQL queries in SQL Developer
  • Implemented extensive SQL queries to verify the data integrity
  • Used SQL queries and executed PL/SQL (procedures, functions, etc.) to validate the data
  • Created indexes and hints to improve database performance based on SQL tuning
  • Involved in a pilot project where HBase was used as a big data store for real-time access to the data
  • Prepared daily and weekly Execution Status Report.
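
A minimal REST Assured sketch (Java) of the API automation described above; the host, endpoint path, and response field are hypothetical stand-ins, not details from the actual project:

```java
// Hypothetical endpoint and field names, for illustration only.
import static io.restassured.RestAssured.given;
import static org.hamcrest.Matchers.equalTo;

import org.junit.jupiter.api.Test;

public class ClaimStatusApiTest {

    @Test
    public void claimStatusReturnsOkWithExpectedPayload() {
        given()
            .baseUri("https://api.example.com")    // hypothetical host
            .basePath("/claims/v1")
        .when()
            .get("/claims/{id}/status", "12345")   // hypothetical resource
        .then()
            .statusCode(200)
            .body("status", equalTo("PROCESSED")); // hypothetical field
    }
}
```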
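
Likewise, a plain-JDBC sketch of the stored-procedure call that the JMeter JDBC Connection Configuration and JDBC Request sampler exercised under load; the DB2 URL, credentials, and procedure name are assumptions:

```java
// Requires the IBM DB2 JDBC driver on the classpath.
import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.DriverManager;

public class StoredProcCheck {
    public static void main(String[] args) throws Exception {
        // Hypothetical DB2 connection details.
        String url = "jdbc:db2://dbhost:50000/CLAIMS";
        try (Connection conn = DriverManager.getConnection(url, "user", "pass");
             // Hypothetical procedure name standing in for the one under test.
             CallableStatement cs = conn.prepareCall("{call PROCESS_CLAIM(?)}")) {
            cs.setString(1, "12345"); // one record from the 100k test file
            cs.execute();
        }
    }
}
```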

Environment: LoadRunner, JMeter, Wily, J2EE, .NET, WRM, IBM WebSphere Console, Quality Center, Performance Center (ALM), Google Cloud, Docker, Kubernetes, Swagger, LPAR2RRD, IBM DB2, Mainframe, Dynatrace, AppDynamics, Grafana, JavaMelody, Splunk, Postman, SoapUI

Confidential - Chandler, AZ

Performance Engineer

Responsibilities:

  • Supervised and created comprehensive test plans for Performance testing and Regression testing
  • Verified and validated test data through VuGen/LoadRunner scripts, running smoke tests before performance test runs to confirm data integrity and system scalability
  • Developed and executed test scripts for performance testing using HP LoadRunner and JMeter
  • Set up GitLab and Jenkins pipelines for Continuous Integration (CI) and Continuous Delivery (CD) projects
  • Developed, maintained, and enhanced the microservice CI/CD process
  • Supervised the system test plan and test case walkthroughs with application partners
  • Oversaw routine generation of defect metrics and communicated them to upper management on a timely basis
  • Performed extensive hands-on testing of the claim payments system for routine enhancements supporting the processing of 150 million claims annually
  • Organized and conducted daily touch-point meetings during the test cycle and conducted post-deployment project reviews
  • Analyzed Requirements Specification; created Requirement Traceability Matrix and contributed to developing the test strategy
  • Tracked all bugs using Quality Center and reviewed the status of test execution by tracing defects to requirements
  • Reported and tracked the test results for test cases using Quality Center
  • Executed black box, volume, system, stress, user acceptance, load, performance, and regression testing
  • Developed Web Vuser scripts to capture end-user activities for web-based applications using LoadRunner
  • Used correlation and parameterization to enhance Web Vusers
  • Generated volume, time, and stress graphs to support performance tuning of the web, application, and database servers
  • Used LoadRunner Analysis to develop Microsoft Word reports so higher management could easily understand LoadRunner results
  • Identified system resource usage and network and application bottlenecks using LoadRunner Analysis reports
  • Used LoadRunner to run successful tests of the application with 100, 200, and 500 virtual users in production prior to the application going live
  • Provided ongoing updates on completed and anticipated testing activities, problems encountered, and test results to each application's project status pages throughout the testing cycle
  • Prepared and delivered periodic status reports on the progress of individual projects to customers and internal program and project managers
  • Assisted in the planning and development of testing schedule, test plans, and test cases for UAT testing
  • Performed end-to-end performance engineering using Chrome Audits, PageSpeed, IE developer tools, and Fiddler.
  • Worked on Kafka Backup Index; in-depth knowledge of Apache Cassandra
  • Validated data against the backend database with complex SQL queries in SQL Developer
  • Implemented extensive SQL queries to verify the data integrity
  • Used SQL queries and executed PL/SQL (procedures, functions, etc.) to validate the data (a JDBC-based validation sketch follows this list)
  • Prepared daily and weekly Execution Status Report.
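
A minimal JDBC sketch (Java) of the backend data validation described above; the Oracle connection details and the table/column names are hypothetical:

```java
// Flags payment rows with no matching claim (hypothetical schema).
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class DataIntegrityCheck {
    public static void main(String[] args) throws Exception {
        // Hypothetical Oracle connection details.
        String url = "jdbc:oracle:thin:@//dbhost:1521/ORCL";
        try (Connection conn = DriverManager.getConnection(url, "user", "pass");
             PreparedStatement ps = conn.prepareStatement(
                 "SELECT COUNT(*) FROM payments p "
               + "LEFT JOIN claims c ON c.claim_id = p.claim_id "
               + "WHERE c.claim_id IS NULL");
             ResultSet rs = ps.executeQuery()) {
            rs.next();
            int orphans = rs.getInt(1);
            System.out.println(orphans == 0
                ? "PASS: no orphaned payment rows"
                : "FAIL: " + orphans + " orphaned payment rows");
        }
    }
}
```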

Environment: LoadRunner, JMeter, GitLab, Apache Cassandra, Controller, Wily Introscope, Dynatrace, PageSpeed, Fiddler, Chrome Audits, Oracle, MS SQL Server, F5 Load Balancer, Java, .NET, ALM, Web, Windows 2000/XP, AIX, Linux, Windows Server 2012 R2

Confidential - Boston, MA

QA Performance Tester

Responsibilities:

  • Gather Business requirements and create Service Level agreements based on business-critical transactions
  • Create Test Plan, Test Scripts, Test Strategy and Capacity Planning documents
  • Setup and lead Daily/weekly team meetings related to Project schedule, test status, test window and release management
  • Identify bottlenecks in the system during the performance testing and create root cause analysis report
  • Used tools such as Fiddler, BlazeMeter, Badboy, and the Firebug extension in developing the test scripts.
  • Extensively used JMeter components such as BeanShell pre/post-processors and wrote JavaScript code as the requirements demanded (a BeanShell sketch follows this list)
  • As a software QA, generated multifunctional test conditions and scripted and executed manual and automated test procedures, both data- and functionality-driven.
  • Interact with performance engineers in creating scripts based on design analysis
  • Perform system analysis of application performance for various web and desktop applications based on throughput, bandwidth, and response times.
  • Perform data validation from initial state to end state with the help of subject matter experts; monitor the data process and observe the system for available space and major failures during the load simulation process
  • Set up scenarios and run load testing, duration testing, capacity testing, and stress testing
  • Evaluate performance of ETL jobs in a disaster recovery (DR) environment during the data extraction/loading and batch execution phase.
  • To understand and troubleshoot response-time degradation of each internal job at a granular level, jobs are set up sequentially in Control-M and logs are captured for further investigation.
  • Create and modify shell scripts to execute batch jobs in streams or in parallel.
  • Monitor the heat map of UNIX boxes using NMON/Wily while using DB trace to analyze traffic flow between the source and target databases
  • Designed and developed tools such as a Job Monitoring Dashboard, a Tech Support tool, and a Junk Character Cleanup utility (hosted in IBM Bluemix), submitted as assets to IBM Light House
  • Analyze results in Excel for various test intervals and create the test report with the findings
  • Perform capacity planning based on the application's breaking point for different types of server load simulation (a sample Vuser-sizing calculation follows this list)
  • Use SharePoint to store all testing artifacts (test cases, test plans, performance summary reports, performance test metrics, defect tracking) and to maintain a calendar for reserving testing time slots for any given environment
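
A minimal BeanShell PreProcessor sketch (Java syntax) of the JMeter scripting described above; the variable name and ID scheme are hypothetical. In JMeter, vars, ctx, and log are pre-bound by the BeanShell element:

```java
// Builds a unique order ID per thread/iteration and stores it
// for later samplers to reference as ${orderId}.
String threadNum = String.valueOf(ctx.getThreadNum()); // current Vuser thread
String orderId = "ORD-" + threadNum + "-" + System.currentTimeMillis();
vars.put("orderId", orderId);
log.info("Generated order id: " + orderId);
```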
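
And a small worked example of the Vuser sizing behind the capacity planning above, using Little's Law (N = X × (R + Z)); all figures are illustrative assumptions, not results from an actual test:

```java
// Estimates the concurrent Vusers needed to hit a target throughput.
public class VuserSizing {
    public static void main(String[] args) {
        double targetTps = 50.0;       // X: target transactions/sec (assumed)
        double responseTimeSec = 1.2;  // R: average response time (assumed)
        double thinkTimeSec = 3.0;     // Z: scripted think time (assumed)

        // Little's Law: N = X * (R + Z)
        double vusers = targetTps * (responseTimeSec + thinkTimeSec);
        System.out.printf("Vusers required: %.0f%n", Math.ceil(vusers)); // 210
    }
}
```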

Environment: LoadRunner, JMeter, BlazeMeter, Performance Center, Oracle, MS SQL Server, Wily Introscope, WebLogic, F5 Load Balancer, Java, .NET, Quality Center, Web, Windows 2000/XP, AIX, IBM Light House, Amazon AWS

Confidential

QA Tester

Responsibilities:

  • Created Performance Test Plans in support of end-to-end system performance evaluation and analysis.
  • Prepared Test cases, Test design, and Test strategies based on the performance test plans
  • Performed Functional, Integration, System, Regression, Back end and Acceptance Testing.
  • Created and executed SQL queries to perform Back-End database validations.
  • Examined system behavior and performance using LoadRunner.
  • Involved in configuring LoadRunner, recording VuGen scripts for various scenarios, and analyzing the results.
  • Used manual and automatic correlation techniques, set up runtime settings, created scenarios, and analyzed the results to find bottlenecks and root causes.
  • Worked closely with software developers and took an active role in ensuring that the software components meet the highest quality standards.
  • Compared and analyzed actual versus expected results and reported all deviations.
  • Produced regular Project Status Reports to senior management.

Environment: Load Runner, Oracle, SQL Server, Windows, UNIX, XML, IIS, Java & MS-Office.
