
Lead Performance Engineer Resume

Jersey City, NJ

SUMMARY

  • Expertise in full life cycle performance engineering, from requirements through release and capacity planning. End-to-end performance analyst specializing in monitoring and tuning.
  • Work closely with stakeholders, BAs, architects, developers, and DBAs to understand an application's architecture, performance, and capacity.
  • Tested and monitored applications deployed on Google Cloud Platform (GCP) and in containerized environments such as Google Kubernetes Engine (GKE) and OpenShift.
  • Perform root cause analysis to detect memory leaks and bottlenecks in servers and enterprise applications, and propose resolutions for performance tuning.
  • Proficient in executing different types of performance tests (load, capacity, and endurance) with large numbers of concurrent users, using tools such as LoadRunner, Performance Center, JMeter, and NetStorm.
  • Plan load by analyzing task distribution diagrams and user profiles.
  • Worked extensively with Web (HTTP/HTML), XML, and SOAP protocols in non-UI web services (SOA) testing.
  • Found performance degradation issues such as out-of-memory errors, memory leaks, and transaction rollbacks, and tuned thread pool utilization and JDBC connection pool sizes.
  • Expertise in reviewing and documenting system test plans and test strategies.
  • Experience with test management tools such as HP Quality Center for organizing and managing all phases of the application testing process, including specifying testing requirements, planning tests, and executing tests.
  • Experience in creating and executing various LoadRunner scenarios.
  • Extensive use of parameterization, correlation, checkpoints, and error handling in VuGen scripts to emulate real-world load conditions.
  • Experience in logging, tracking, and prioritizing defects and enhancement requests after baselining requirements.
  • Experience in writing complex SQL queries using joins.
  • Ability to learn new technologies and challenging concepts quickly and implement them.
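The load-planning approach above (deriving concurrent user counts from task distribution and target throughput) is commonly based on Little's Law; a minimal sketch, with all numbers hypothetical:

```python
import math

# Little's Law for load planning:
#   concurrent users = throughput (tx/sec) * (response time + think time)
# All figures below are hypothetical, for illustration only.

def users_needed(target_tps: float, resp_time_s: float, think_time_s: float) -> int:
    """Concurrent Vusers required to sustain target_tps, rounded up."""
    return math.ceil(target_tps * (resp_time_s + think_time_s))

# e.g. 50 tx/sec at 2 s average response time and 8 s think time
print(users_needed(50, 2.0, 8.0))  # 500
```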

TECHNICAL SKILLS

Testing Tools: HP LoadRunner, Performance Center & ALM, Diagnostics, OVPM, QTP, Rational Requisite Pro, Wily Introscope, JMeter, Silk Performer, Perfmon, Fiddler, NetStorm, NetDiagnostics

Defect Tracking Tool: HP Quality Center, Clear Quest

Programming languages: C, C++, JAVA, SQL, VB Script, .NET, HTML

Browsers: Mozilla Firefox, Internet Explorer (IE), Safari, Google Chrome

RDBMS/Databases: Oracle, MS Access, SQL Server, DB2, MongoDB

Tools: MS Visio, MS Office, Oracle SQL Developer, IBM Data Studio

Methodologies: Waterfall, RUP, Agile, Scrum

Operating Systems: Windows 2000/XP/VISTA/7, UNIX

PROFESSIONAL EXPERIENCE

Confidential

Lead Performance Engineer

Responsibilities:

  • Involved in effort estimation, onshore-offshore team coordination, and gathering non-functional requirements (NFRs) and converting them into a performance test plan.
  • Developed test report templates, a project estimation strategy, departmental technical procedures, and user guides.
  • Analyzed performance requirements, translated them into detailed test cases, and created performance test scripts.
  • Responsible for the creation, customization, development, programming, and maintenance of performance test scripts.
  • Tested APIs and applications over protocols such as Web (HTTP/HTML).
  • Performed calibration, load, stress, endurance, configuration (GC tuning), spike, and failover tests.
  • Monitored system/pod/container/microservice resources such as CPU usage, memory, and I/O stats using NetDiagnostics.
  • Analyzed logs, thread dumps, and heap dumps to identify bottlenecks.
  • Responsible for evaluation, analysis, and documentation of assigned performance testing.
  • Attended daily scrum calls.
  • Engaged with application development teams, testing organizations, and production support to ensure delivered performance test results met their requirements.

Environment: NetStorm, NetDiagnostics, Google Cloud Platform (GCP), Google Kubernetes Engine (GKE), OpenShift, Splunk, Apigee, Akamai, Postman, Jira, Datadog.

Lead Monitoring/Performance Engineer

Confidential

Responsibilities:

  • Gathered and analyzed business and performance procedures and set up performance monitors.
  • Worked with engineering and DevOps to collect monitoring requirements and SLAs for alert and dashboard creation.
  • Worked with DevOps and the product team to keep NetDiagnostics versions updated for new features and bug fixes.
  • Created favorites (dashboards) with monitors for resource utilization, request/response, error rates, number of instances/pods/containers, cluster/node health, database, application, etc.
  • Real-time monitoring and analysis:
  • Monitored live traffic during high-sale days, observing whether apps were behaving normally; if not, alerted the respective teams and collected data on the source of the problem.
  • Monitored production systems/applications using NetDiagnostics and Splunk.
  • Reporting:
  • Provided RCA using an APM tool.
  • Produced periodic reports on production environment health during load tests.
  • Performed release-to-release trend analysis.
  • Produced hourly system status reports during the holiday season, covering system health, request/response, error rates, etc.

Environment: NetDiagnostics, Google Cloud Platform (GCP), Google Kubernetes Engine (GKE), Splunk

Confidential, Jersey City, NJ

Sr. Performance Tester

Responsibilities:

  • Responsible for creating test strategy and test procedure documents; managed test scripts, analyzed requirements, and created test scenarios and test conditions.
  • Created, maintained, and enhanced Web & Mobile (HTTP/HTML) Vuser scripts in VuGen according to the test specifications/requirements.
  • Used various C and LoadRunner functions to make the scripts robust and reusable.
  • Created and executed scenarios in Performance Center according to the test plan.
  • Used Web Page Breakdown graphs to break down transaction response time problems.
  • Generated graphs and analyzed reports to monitor application performance across the different performance testing scenarios.
  • Analyzed average CPU usage, memory, hits/sec, throughput, response time, and transactions per second for each scenario.
  • Provided recommendations and scalability solutions based on test findings and system behavior observed under load and stress.
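The per-scenario metrics above (throughput, TPS, average response time) are derived from raw transaction samples; a minimal sketch, using made-up sample data in place of LoadRunner raw results:

```python
# Aggregate raw transaction samples into per-scenario summary metrics.
# The sample data below is hypothetical, standing in for tool output.

samples = [  # (timestamp_s, response_time_s)
    (0.2, 1.1), (0.9, 0.8), (1.5, 1.4), (2.1, 0.9), (2.8, 1.2), (3.6, 1.0),
]

# Test duration spans from the first to the last observed transaction.
duration = max(t for t, _ in samples) - min(t for t, _ in samples)

tps = len(samples) / duration                          # transactions per second
avg_rt = sum(rt for _, rt in samples) / len(samples)   # average response time

print(f"TPS: {tps:.2f}, Avg RT: {avg_rt:.2f} s")
```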

Environment: Windows XP, HP LoadRunner, HP Performance Center/ALM, Oracle SQL Developer, Oracle, Wily Introscope.

Confidential, Warren, NJ

Sr. Performance Analyst

Responsibilities:

  • Conducted walkthrough meetings of the test artifacts with SMEs for the migration from the legacy application to ATG, as well as for new functionality and enhancements.
  • Worked closely with various teams, especially the architecture and application teams, and isolated configuration issues and performance bottlenecks during performance testing.
  • Analyzed load patterns to determine user volume and transaction density, and created test scenarios accordingly to emulate real-world conditions.
  • Performed cable-pull tests to check overall application performance in case of a data center failure, which resulted in additional performance improvements.
  • Executed multiple load and capacity tests to reach expected production capacity for additional usage due to pre-orders and BAU future growth.
  • Monitored CPU, heap, hogging threads, locks/contention, and logs during tests for errors and exceptions.
  • Analyzed transaction response times (RT) and calculated per-API request rates for critical transactions such as Credit and Orders.
  • Executed latency tests across the data centers to isolate the impact of latency during migrations.
  • Ran different scenarios/tests to determine JVM capacity against the SLA.
  • Involved in root cause analysis of problems in the proposed architecture to improve performance and resolve capacity-related issues.
  • Strategized with the various stakeholders to ensure that coordination and validation of environment delivery did not impact current pipelined releases and changes planned through the year.

Environment: Windows XP, HP LoadRunner, HP ALM & Performance Center 11.52 & 12.0, Oracle, Wily Introscope, Fiddler, Weblogic, ATG

Confidential

Sr. Performance Analyst\Engineer

Responsibilities:

  • Worked on Web (HTTP/HTML) and web services protocols.
  • Used automatic and manual correlation to capture dynamic values.
  • Inserted checkpoints, error handling, and stress points (rendezvous points to load-test specific transactions), and parameterized the scripts.
  • Managed virtual users across various servers (web, app, and DB nodes) to ensure load balancing.
  • Checked application and system logs during tests for errors, exceptions, and app pool refreshes.
  • Used IP spoofing to test load balancing issues.
  • Utilized virtualization best practices to ensure cost savings in capacity planning.
  • Ran different tests to achieve the targeted transaction rates per the SLA and determine container capacity.
  • Ran capacity and endurance tests to assess resource availability, detect leaks and DB locks, and observe application behavior.
  • Monitored container resources such as CPU, memory, and disk availability on all servers during tests.
  • Analyzed resource metrics to find performance bottlenecks.
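The correlation technique above (capturing a dynamic value from one response and replaying it in the next request) can be sketched outside VuGen as follows; the response body and parameter names are made up for illustration, and this is not actual LoadRunner code:

```python
import re

# Hypothetical server response containing a dynamic session token,
# analogous to what a VuGen correlation would capture at runtime.
response_body = '<input type="hidden" name="sessionId" value="AB12CD34">'

# "Correlate": extract the dynamic value using left/right boundaries.
match = re.search(r'name="sessionId" value="([^"]+)"', response_body)
session_id = match.group(1)

# Reuse the captured value in the next (parameterized) request.
next_request = f"/checkout?sessionId={session_id}"
print(next_request)  # /checkout?sessionId=AB12CD34
```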

Environment: Windows XP, HP LoadRunner, HP Performance Center 11.52, Silk Performer, Oracle SQL Developer, Perfmon, Oracle, .Net, IIS, Fiddler

Confidential, Jersey City, NJ

Sr. Performance Analyst

Responsibilities:

  • Worked closely with infrastructure, application, database, and capacity planning architects to define the technical requirements and scope of the data center migration project.
  • Tracked all dependencies and implementations required prior to the migration of data center hardware and software for better capacity planning.
  • Worked with Web (HTTP/HTML), web services (SOAP, WSDL, XML), Oracle NCA, and Oracle 2-tier protocols in VuGen.
  • Coordinated with the payments/developer team for testing batches and transactions in an enterprise-wide environment.
  • Responsible for testing messages from MQ by checking queue depths and pending messages.
  • Monitored JVMs during tests for any unusual activity, utilization (CPU and memory), exceptions, errors, memory leaks, and the number of sessions/instances per server/JVM via the console.
  • Performed performance regressions of successive builds, memory leak detection, bug-fix verification, and database contention/locking analysis.
  • Analyzed response times (avg, min, max, and 90th percentile) for critical transactions such as login, lookups, submissions, inquiries, batch creation, and batch processing.
  • Used HP Diagnostics and Wily Introscope to monitor graphs such as GC, heap, thread status, and context switching, and to collect stack traces to pinpoint issues.
  • Used Oracle SQL Developer to run SQL queries with joins and sub-queries for data creation and validation, and to verify optimal performance and scalability under real-time workloads.
  • Analyzed AWR reports to identify long-running queries, query execution counts, DB CPU, memory, idle and active session counts, and any database deadlocks.
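The response-time statistics above (avg, min, max, 90th percentile) can be computed from raw timings as follows; a minimal sketch using the nearest-rank percentile method, with hypothetical timings:

```python
import math

def percentile(values, pct):
    """Nearest-rank percentile: smallest value >= pct% of the sample."""
    ordered = sorted(values)
    rank = math.ceil(pct / 100 * len(ordered))
    return ordered[rank - 1]

# Hypothetical transaction response times in seconds.
response_times = [0.8, 1.1, 0.9, 2.4, 1.0, 1.3, 0.7, 1.9, 1.2, 3.1]

print("min:", min(response_times))
print("avg:", sum(response_times) / len(response_times))
print("max:", max(response_times))
print("p90:", percentile(response_times, 90))
```

Note that reporting tools differ on percentile interpolation; the nearest-rank method here is one common choice.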

Environment: Windows XP, HP LoadRunner, HP Performance Center & ALM, HP Diagnostics, HP OVPM, Oracle SQL Developer, Perfmon, Oracle 9i & 10g, WAS 6 & 7, Wily Introscope, Fiddler.

Confidential, Jacksonville, FL

Performance Tester

Responsibilities:

  • Derived test objectives from business, technical, and software requirements documents in order to create the test plan.
  • Created and executed test cases from business requirement documents and FDDs; created a detailed test strategy and assisted with the test plan.
  • Planned the load by analyzing the task distribution diagram and user profile diagram to understand usage and decide on the scripting pattern.
  • Involved in performance testing of the web-based application using LoadRunner and JMeter.
  • Used correlation to parameterize dynamic values such as session IDs in VuGen.
  • Analyzed response times and run times for critical transaction programs and interfaces.
  • Analyzed average CPU usage, response time, and transactions per second for each scenario.
  • Reviewed and studied the company's data model and documented the entity relationships (ER diagrams) between the various database tables.
  • Extracted data from DB2 and validated benefits for the different LOBs (lines of business).
  • Used HP Quality Center for logging, tracking, and prioritizing defects and enhancement requests.
  • Generated and analyzed test metrics to evaluate the progress of the testing effort.
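Extraction-and-validation queries of the kind described above typically rely on joins; a minimal, self-contained sketch using in-memory SQLite in place of DB2, with made-up tables and rows:

```python
import sqlite3

# In-memory SQLite stands in for DB2; schema and data are hypothetical.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE members (id INTEGER PRIMARY KEY, lob TEXT);
    CREATE TABLE benefits (member_id INTEGER, benefit TEXT);
    INSERT INTO members VALUES (1, 'Dental'), (2, 'Medical');
    INSERT INTO benefits VALUES (1, 'Cleaning'), (2, 'Checkup'), (2, 'X-Ray');
""")

# Join members to their benefits and count benefits per line of business.
rows = con.execute("""
    SELECT m.lob, COUNT(b.benefit)
    FROM members m
    JOIN benefits b ON b.member_id = m.id
    GROUP BY m.lob
    ORDER BY m.lob
""").fetchall()
print(rows)  # [('Dental', 1), ('Medical', 2)]
```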

Environment: Windows XP, CICS, DB2, IBM Data Studio 2.2, Quest Central, SQL, UNIX, Java, Agile, HP Quality Center, Screen Recorder, LoadRunner, Apache JMeter.
