
Performance Test Lead Resume


Minneapolis

PROFESSIONAL SUMMARY:

  • Professional with more than 13 years of IT experience in Performance Engineering of applications using Performance Testing, Application Monitoring, and tuning.
  • Gather nonfunctional requirements, understand functional specification document for application development, enhancements and prepare test plan document for performance engineering.
  • Responsible for performance testing project activities including requirements gathering, planning, designing scripts and creating test data, executing load tests, and reporting observations & recommendations.
  • Responsible for the E2E performance testing lifecycle, monitoring, performance analysis, performance tuning, and root cause analysis of bottlenecks.
  • Coordinate and collaborate with other developers and testers to develop, test, and implement application features or requirements.
  • Onsite & offshore performance engineering team coordination.
  • Perform proofs of concept using LoadRunner and JMeter. Responsible for the development of performance test frameworks.
  • Complete Performance Engineering using test scripting/execution skills utilizing Performance Center, LoadRunner, StormRunner & JMeter across various protocols.
  • Enhancing the scripts with transactions and verification checks, and implementing code for error handling.
  • Creating different scenarios based on the load patterns. Upload JMeter scripts to Bitbucket and execute jobs in Jenkins.
  • Improve performance of Web Applications by identifying potential bottlenecks and system performance.
  • Monitoring server Utilizations, Heap Memory, Garbage collections and problems using Dynatrace One Agent.
  • Responsible for performing Memory and Heap dump analysis. Monitor databases like Oracle and SQL Server and provide query tuning using execution plans.
  • Monitoring and analyzing Application servers like Apache Tomcat, IBM WebSphere using Sysdig, Kibana Tools.
  • Monitor Web, Application and Database servers using HP Open view, Performance manager and Grafana.
  • Tuning the JVMs and providing recommendations for optimal performance.
  • Responsible for providing detailed Dynatrace analysis report and tuning recommendations.
  • Responsible for providing performance issue logs from Kibana using Elastic search and Logstash.
  • Monitor Production volumes, response times and logs using Splunk.
  • Monitor and execute tests for Microsoft Azure cloud applications using AppInsights.
  • Responsible for using Azure DevOps for activities such as creating stories, tasks, and templates, managing the sprints & schedules, and defining points for each story.
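The load-test reporting workflow summarized above (executing tests, then reporting observations & recommendations) can be illustrated with a small post-processing sketch. This is an illustrative example only, not a deliverable from any project named here; the column names follow JMeter's default CSV (.jtl) result format, and the sample rows are invented.

```python
import csv
import io

# Minimal sketch of post-processing a JMeter results file (.jtl, default CSV
# format). The "elapsed" and "success" columns match JMeter's standard CSV
# output; the sample data below is invented for illustration.
SAMPLE_JTL = """timeStamp,elapsed,label,success
1700000000000,120,Login,true
1700000001000,340,Search,true
1700000002000,95,Login,true
1700000003000,1250,Search,false
1700000004000,210,Checkout,true
"""

def summarize(jtl_text):
    """Return (error_rate_percent, p95_elapsed_ms) for a JTL CSV."""
    rows = list(csv.DictReader(io.StringIO(jtl_text)))
    elapsed = sorted(int(r["elapsed"]) for r in rows)
    errors = sum(1 for r in rows if r["success"] != "true")
    # Nearest-rank 95th percentile of response times.
    idx = max(0, int(round(0.95 * len(elapsed))) - 1)
    return 100.0 * errors / len(rows), elapsed[idx]

error_rate, p95 = summarize(SAMPLE_JTL)
print(f"error rate: {error_rate:.1f}%  p95: {p95} ms")
```

In practice the same summary numbers would come from the JMeter HTML dashboard or BlazeMeter reports; the script only shows the shape of the calculation behind an observations-and-recommendations report.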

TECHNICAL SKILLS:

Load Testing tools: LoadRunner, StormRunner, JMeter (BlazeMeter), Performance Center, SoapUI, Oracle SQL Developer, Putty, Postman, Jira, WinSCP, SQL Query Analyzer, Azure DevOps, Grafana, ELA, Kibana, Splunk, Jenkins, Bitbucket

Protocols: Web, RTE, Web services, Oracle NCA, Java Vuser, Siebel Web, SAP GUI, RDP

Languages: C, Java, .NET, SQL

Databases: Oracle 11g, SQL Server and Cassandra

Performance monitoring tools: SiteScope 11, Perfmon, UCPS, Performance Manager, Wily Introscope, Dynatrace AppMon, Dynatrace OneAgent, Grafana, Splunk, Elasticsearch, Logstash, AppInsights

Operating Systems: Windows series, Unix and Linux

PROFESSIONAL EXPERIENCE:

Confidential, Minneapolis

Performance Test Lead

Responsibilities:

  • Discussing and identifying non-functional requirements with the business team.
  • Review and finalize the critical business processes with the business, project, and support teams.
  • Create the test strategy and review it with the business and all the teams.
  • Coordination with global teams (offshore teams) for project deliverables.
  • Configure the performance monitoring metrics for load and peak test execution.
  • Performance test scripting using HP VuGen and JMeter; enhance LoadRunner scripts with technical code for proper exception handling and business validation.
  • Complete Performance Engineering using test scripting/execution skills utilizing HP Performance Center and LoadRunner in various protocols. Review the LoadRunner scripts and share the comments with the offshore team.
  • Execute the load and peak tests and share the test results with the business and project teams.
  • Configure Dynatrace with application and web servers to monitor utilizations.
  • Set up monitoring in Dynatrace and Grafana for Confidential applications. Analyze PurePaths and provide deep-dive analysis for high response times.
  • Responsible for performing memory & heap dump analysis. Working with databases like Oracle and SQL Server. Monitoring and analyzing application servers like Apache Tomcat and IBM WebSphere using the Kibana tool.
  • Monitoring PurePaths, application metrics, etc. in Dynatrace.
  • Review the test results shared by the offshore team and provide recommendations if there are any performance issues.
  • Provide tuning recommendations and retest once a fix is in place to verify the performance improvement.
  • Arrange meetings with the project team and business team if there are any performance issues.
  • Prepare the final test closure memo document and seek sign-off before the code moves to production.

Environment: LoadRunner, Dynatrace OneAgent, JBoss, Performance Manager, Oracle, SQL Server, Grafana, Microsoft Azure, Splunk, Azure DevOps, Kibana, JMeter, BlazeMeter.

Confidential, Wisconsin

Performance Test Lead

Responsibilities:

  • Gathering user stories and non-functional requirements for performance testing.
  • Monitored high response times, CPU, GC, and method-level metrics in dynaTrace.
  • Monitored QPASA, App Watch, and the application and integration servers.
  • Created JMeter web service scripts and executed tests.
  • Monitored Oracle Enterprise Manager to identify DB contention and long running queries causing high response times.
  • Shared test reports and recommendations with the client.
  • Raised defects and discussed wif development for fixing performance bottlenecks.

Environment: JMeter, dynaTrace, Cassandra, Oracle Enterprise Manager, QPASA, WebLogic 11g, App Watch

Confidential

Performance Test Lead

Responsibilities:

  • The goal is to conduct the performance evaluation in an environment that mirrors production.
  • Responsible for test design and execution for Performance systems and accountable for the following responsibilities:
  • Knowledge transition for the applications in the performance testing scope.
  • Meet QA environment stakeholders and identify applications and owners.
  • Meet with application owners and collect application data. Participating in the performance design review with the implementation team and providing review comments, as and when required.
  • Scheduling, identifying and tracking progress of project milestones.
  • Offshore coordination and delivery/execution as per agreed SLAs.
  • Creating weekly project status reports and delivering them to the customer.
  • Manage project risk and escalate issues to the appropriate level of leadership for resolution.
  • Manage the full execution of the quality assessment process including, but not limited to, test case execution, progress of test cycles, and status and details of testing for Performance, Network & Firewall Testing.
  • Developing the performance quality consulting plan & implementation. Analyzing the requirements & outlining the best possible test options to the customers.
  • Develop/maintain accurate scripts and scenarios to support performance test execution needs.
  • Set up the environment & verify infrastructure upgrades using the scripts developed.
  • Providing support and guidance in different phases of the project by measuring the output Quality Criteria artifacts across all the different work streams.
  • Preparation of risk assessment and capacity management, and advising management regarding the application performance under various load capacities.
  • Create detailed test scenarios for identifying performance bottlenecks and implement the same using HP LoadRunner/Performance Center.
  • Design and execute different types of performance workload scenarios for enterprise applications.
  • Collecting and compiling various usage reports from multiple sources and preparing the final performance analysis report.
  • Monitoring the load balancer to analyze the load sharing between servers and optimize them to handle bulk requests to avoid bottlenecks in the network.
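Workload scenarios like those described above are commonly sized with Little's Law: concurrent users = throughput × (response time + think time). A small sketch of that calculation follows; the function names and all figures are illustrative placeholders, not data from any engagement named here.

```python
# Little's Law sketch for sizing a load-test scenario:
#   concurrent_users = throughput (tx/sec) * (response_time + think_time)
# All numbers below are illustrative placeholders, not project data.

def users_needed(tps, resp_time_s, think_time_s):
    """Concurrent virtual users required to sustain the target throughput."""
    return tps * (resp_time_s + think_time_s)

def pacing(resp_time_s, think_time_s, iterations_per_hour_per_user):
    """Pacing interval (seconds between iteration starts) per virtual user."""
    return max(3600.0 / iterations_per_hour_per_user, resp_time_s + think_time_s)

# Example: a 10 tx/sec target with a 2 s response time and 8 s think time.
print(users_needed(10, 2, 8))   # 100 virtual users
print(pacing(2, 8, 60))         # 60.0 s between iteration starts
```

The same arithmetic underlies the ramp-up and pacing settings configured in a LoadRunner or Performance Center scenario.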

Environment: HP Application Life Cycle Management - HP Quality Center, CA Wily Introscope

Confidential, USA

Performance Test Lead

Responsibilities:

  • Interacting with the client to gather requirements, volumes, application configuration, etc.
  • Created Test Strategy/Plan and approach as per non-functional requirements.
  • Created the workload model using the production volumes.
  • Discussing with the functional team, data migration team, and SMEs for data setup and data creation.
  • Creating SAP GUI scripts using LoadRunner and validating the scripts in ALM PC.
  • Reviewing the LoadRunner scripts based on scripting standards.
  • Monitoring the SAP system components and resource utilizations during test execution.
  • Creating the monitoring setup and identifying key metrics for SAP web/application servers.
  • Monitored ST03, STAD, ST04, SM12, SM66, SDF MON.
  • Preparing & finalizing the test analysis report.
  • Extract the Automatic Workload Repository (AWR) report and analyze the database performance.
  • Tuning the SAP application/web servers if there is any response time deviation.
  • Involved in creating Performance project sign-off document.
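The response-time deviation check mentioned above can be sketched as a simple baseline comparison. This is an illustrative example: the 10% threshold is an assumed tolerance, and the transaction names and timings are invented, not figures from this project.

```python
# Sketch of a response-time deviation check: compare a run's transaction
# timings against a baseline and flag regressions beyond a tolerance.
# The 10% threshold and all names/timings below are illustrative assumptions.

def regressions(baseline, current, threshold=0.10):
    """Return {transaction: % increase} for response times beyond threshold."""
    flagged = {}
    for tx, base_ms in baseline.items():
        cur_ms = current.get(tx)
        if cur_ms is not None and (cur_ms - base_ms) / base_ms > threshold:
            flagged[tx] = round((cur_ms - base_ms) / base_ms * 100, 1)
    return flagged

baseline = {"VA01_create_order": 900, "ME21N_create_po": 1200}
current = {"VA01_create_order": 1100, "ME21N_create_po": 1210}
print(regressions(baseline, current))  # {'VA01_create_order': 22.2}
```

Transactions flagged this way would then drive the tuning and retest cycle before sign-off.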

Environment: SAP Oracle, SAP HANA, OTC, PTP, STP, PME, Performance Center 11.52, QC 11.52, HP OpenView, ST03, STAD, ST04, SM12, SM66, SDF MON

Confidential

Senior Performance Test Engineer

Responsibilities:

  • Involved in gathering requirements for the Episys, Store Line, and Bizerba applications using a transactional volume model and discussing with the business team.
  • Preparation of test strategy, test plan and approach as per non-functional requirements.
  • Discussed with the business team and finalized the business-critical scenarios.
  • Successfully tested Oracle Fusion Middleware components like Oracle Data Integrator (ODI), Service-Oriented Architecture (SOA), Store Line Service Bus (SSB), and Retail Integration Bus (RIB) during end-to-end testing.
  • Created the data management plan and was involved in test plan creation.
  • Created and reviewed integration end-to-end test cases for Episys, Bizerba and Store Line, and uploaded them in Quality Center.
  • Discussed with the development, integration and database teams and resolved critical defects.
  • Created LR scripts using Oracle Web Applications 11i, Web Services and RDP protocols.
  • Reviewed the Virtual User Generator scripts based on Morrisons scripting standards.
  • Analyzed AWR report and provided Recommendations.
  • Analyzed Episys, Bizerba and Store line applications and created proof of concept document.

Environment: Episys, Bizerba, RIB, Store Line, Oracle Fusion Middleware, SOA, ODI, SSB, Performance Center 9.52, Quality Center 9.2, SiteScope 11, Oracle SQL Developer, HP Performance Manager, WebLogic Server 11g

Confidential

Senior Performance Engineer

Responsibilities:

  • Involved in gathering requirements for the Oracle Retail Warehouse Management System (ORWMS) using a transactional volume model and discussing with the business team.
  • Preparation of test strategy, test plan and approach as per non-functional requirements.
  • Discussed with the business team and finalized the business-critical scenarios.
  • Successfully tested Oracle Fusion Middleware components like Oracle Data Integrator (ODI), Service-Oriented Architecture (SOA), Oracle Service Bus (OSB), and Retail Integration Bus (RIB) during end-to-end testing.
  • Created the data management plan and was involved in test plan creation.
  • Created and reviewed integration end-to-end test cases for Oracle Retail Merchandising System (ORMS), Oracle Retail Invoice Matching (OReIM) and Oracle Retail Warehouse Management (ORWMS), and uploaded them in Quality Center.
  • Discussed with the development, integration and database teams and resolved critical defects.
  • Created LR scripts using Oracle Web Applications 11i and RTE protocols.
  • Reviewed the LoadRunner scripts based on Morrisons scripting standards.
  • Monitored all the components and resource utilizations during test execution.
  • Prepared & finalized the test analysis report.
  • Created Performance Test completion report after project signoff.
  • Created Project Hand Over document after project signoff.
  • Involved in creating Performance Test Quality Gate document.
  • Analyzed RMS and WMS applications and created proof of concept document.

Confidential

Performance Test Engineer

Responsibilities:

  • Involved in gathering performance requirements and prepared the proof of concept.
  • Preparation of test strategy, test plan and approach as per non-functional requirements.
  • Assistance in validating the volumetric requirements based on the transaction volume model.
  • Tested the ASDA application with the JMeter performance testing tool and prepared a proof of concept report.
  • Involved in end-to-end monitoring setup.
  • Create LoadRunner scripts for the required business scenarios using the Web (HTML/HTTP) protocol.
  • Reviewed the LoadRunner scripts based on Wal-Mart scripting standards and provided review comments.
  • Built performance test scenarios to reflect expected live usage of the system, using the pre-defined sets of business transactions to uncover and fix performance and scalability issues.
  • Tuning the software elements in collaboration with the development teams as appropriate.
  • Monitoring the web, app and database layers with Perfmon counters.
  • Analyze graphs for web, App and DB server system parameters.
  • Created Performance Test completion report after project signoff.
  • Created Project Hand Over document after project signoff.
  • Involved in creating Performance Test Quality Gate document.

Environment: Apache Tomcat 2.26, Oracle 10g, JBoss 4.0.5, Windows XP, VuGen 9.52, Performance Center 9.52, JMeter, Badboy

Confidential

Senior Technical Associate

Responsibilities:

  • Performance and resilience testing for application servers, middleware, and databases.
  • Create realistic workload scenarios for Load, Resilience, Operations Readiness, platform regression and performance regression tests, executed using LoadRunner.
  • Involved in the preparation of the test data and test environment prior to starting tests.
  • Monitor the application statistics using UCPS.
  • Analyze the performance test results, document preliminary and summary reports, and identify bottlenecks.
  • Analyzed backend call responses.
  • Interact with the clients and other teams, attending the conference calls.
  • Prepared the test track report and the risk & issue register.
  • Extracted the Statspack or AWR report and analyzed the database performance.
  • Provide measurements for future releases and assure the application meets non-functional requirements on performance.
  • Participated in project review meetings & discussions.

Environment: WebLogic 8.1, Cyclone, HUB, STAA, Zeus LB, SUN V890, IBM MQ, LoadRunner and Java scripts, Quality Center, UCPS.

Confidential

Senior Technical Associate

Responsibilities:

  • Performed the pre-test and post-test activities for all the tests.
  • Monitored the WebLogic console, CPU, memory, space utilizations, TPS, response times & different graphs, etc. in the LR online monitors.
  • Analyzed the graphs & reports based on the target requirements.
  • Executed different types of tests like Sanity, Load, Stress and Benchmark tests.
  • Identified the bottlenecks, provided the recommendations and tuned the application.
  • Prepared & finalized the test analysis report.

Confidential

Technical Associate

Responsibilities:

  • Overall coordination and monitoring of the performance-related activities for CCM performance testing.
  • Resolving outstanding issues and queries, if any.
  • Liaising with designers during issues encountered while pumping XMLs.
  • Review deliverables and conduct team meetings.
  • Discussed with the technical team to confirm the rate at which to pump the messages into different queues.
  • Configured the harness tool with different order IDs, queue names, etc.
  • Identified the bottlenecks, provided the recommendations and tuned the application.
  • Prepared & finalized the test analysis report.

Environment: CCM Harness Tool, UCPS, Weblogic console

Confidential

Technical Associate

Responsibilities:

  • Understanding of application architecture.
  • Performed the pre-test and post-test activities for all the tests.
  • Monitored the WebLogic console, CPU, memory, space utilizations, TPS, response times & different graphs, etc. in the LR online monitors.
  • Creation of performance test scripts through LoadRunner.
  • Involved in creation of the workload synthesis.
  • Analyzed the graphs & reports based on the target requirements.

Environment: IBM MQ, WebLogic 8.1, Cyclone, Zeus LB, SUN V890, Stubs and Java scripts, LoadRunner 8.0

Confidential

Technical Associate

Responsibilities:

  • Involved in the analysis of volumetrics during the Environment Preparation and Setup phase to derive and build the test cases/scripts/test data required, and in the provision of the test data, the test environment, and the test harnesses procured/built and configured during this stage.
  • Built performance scripts for the Siebel Web protocol and configured the DLLs for auto-correlation.
  • Built performance scenarios (Load, Stress, Soak) to reflect expected live usage of the system, using the pre-defined sets of business transactions.
  • Performed informal test execution to prove the readiness of the test environment and the performance test tool scripts.
  • Measure the performance of the solution/application under test and report against pre-determined performance test requirements.
  • Determine whether or not the solution/application under test can reliably achieve the predicted transaction throughput levels whilst maintaining an acceptable level of performance.
  • Involved in the preparation of the test plan and specification based on the volumetric information.
  • Undertook a performance tuning exercise by running baseline tests to identify areas that are bottlenecked or underperforming; analyzed the loading profile and server statistics and identified the bottlenecks.
  • Monitoring and verification of test results - including collection and analysis of test results, response times, business transaction throughputs, and server statistics.
  • Identified the bottlenecks on the Siebel application server side.
  • Reporting of test results - including reporting against objectives and requirements.

Environment: WebLogic 8.1, Cyclone, Zeus LB, CSM, SUN V890, Siebel

Confidential

Technical Associate

Responsibilities:

  • Understanding of application architecture.
  • Generated and enhanced the scripts through LoadRunner.
  • Created load test scenarios based on the scheduling pattern (ramp up, duration, ramp down).
  • Created the Click Stream workflows for performance scenarios.
  • Monitored all the components and resource utilizations during test execution.
  • Prepared & finalized the test analysis report.

Environment: Siebel 7.7, Oracle 10g, WebLogic
