
Performance Test Lead Resume


Clinton, NJ

SUMMARY:

  • 11 years of extensive experience in performance testing, with hands-on exposure to all phases of the software development life cycle using load testing tools such as HP LoadRunner, Microsoft Visual Studio, JMeter, and SOASTA.
  • Expert in non-functional requirements (NFR) gathering and analysis, performance test plan creation, performance test script development, performance test execution, end-to-end performance evaluation and engineering, performance monitoring and analysis of web/app/DB tiers, and reporting performance test results and defects to all stakeholders.
  • Integrated performance test scenarios (ALM) into a continuous integration/continuous deployment system using Jenkins for benchmark and regression load tests.
  • Strong understanding of WebSphere Application Server architecture, design, capacity planning, performance tuning, troubleshooting, and clustering for high availability and disaster recovery.
  • Configured JVM settings, heap size, and garbage collection policy depending on application requirements; responsible for addressing JVM hang issues, JVM crashes, slow performance, out-of-memory issues, and hung threads, and for monitoring applications and environments using Dynatrace.
  • Experience in managing test teams, test effort and size estimation, creating test strategy, methodologies and test plans, defining the entry and exit criteria for each of the test phases in the testing life cycle.
  • Analyzed the test results like TPS, Hits/Sec, Transaction response time, CPU utilization etc., by using Load Runner analysis, various monitoring tools and prepare test reports.
  • Strong experience in Troubleshooting on issues such as out of memory, memory leaks, hung sessions and session replication.
  • Experience in working with different operating systems Windows, UNIX and Linux environment with shell scripting.
  • Good knowledge of creating horizontal and vertical clustering in WebSphere Application Server for execution environments and long-running applications.
  • Experience with testing at different levels (unit, functional, integration, system, load and performance testing).
  • Lead Capacity Planning efforts in support of application throughout its life cycle process.
  • Established processes around collection and reporting of capacity/performance metrics.
  • Experience in using automation tools such as QuickTest Professional, SoapUI, Selenium, Test Director, JMeter, and LoadRunner, and in designing and implementing automation frameworks.
  • Strong in analyzing business specifications and developing test plans, test strategies, test scripts, and test cases, and executing them.
  • Authorized to work in the US for any employer.
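The test-result metrics cited above (TPS, response times, percentiles) can be derived directly from raw transaction timings. A minimal sketch, using hypothetical sample data rather than a real LoadRunner export:

```python
def percentile(sorted_times, pct):
    """Nearest-rank percentile of an ascending list of response times."""
    if not sorted_times:
        raise ValueError("no samples")
    k = max(0, int(round(pct / 100.0 * len(sorted_times))) - 1)
    return sorted_times[k]

def summarize(samples, duration_s):
    """samples: transaction response times (s); duration_s: test window length (s)."""
    times = sorted(samples)
    return {
        "tps": len(times) / duration_s,   # transactions per second over the window
        "avg": sum(times) / len(times),   # mean response time
        "p90": percentile(times, 90),     # 90th-percentile response time
        "p95": percentile(times, 95),
    }

# Hypothetical sample: 10 transaction timings captured over a 5-second window
report = summarize([0.2, 0.3, 0.25, 0.4, 0.35, 0.5, 0.3, 0.28, 0.33, 0.9], 5.0)
```

Tools such as the LoadRunner Analysis module compute these same aggregates; the point here is only what the headline numbers mean.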

WORK EXPERIENCE:

Performance Test Lead

Confidential, Clinton, NJ

Roles & Responsibilities:

  • Worked with the Business Analysts to determine Business Requirements and set standards for Performance Evaluation in Agile methodology.
  • Generated test scripts in LoadRunner; customized scripts with required logic, correlation, parameterization, pacing, logging options, and preferences.
  • Led a team of performance engineers to test, monitor, and identify system/application bottlenecks.
  • Responsible for monitoring the system under Test and analyzing results for identifying performance bottlenecks.
  • Managed releases by participating in performance test requirements meetings and documentation, test planning sessions with stakeholders, daily status reports to teams, and exit review meetings.
  • Hands-on experience with AWS Cloud, creating JMeter pipelines using a Bitbucket application repo for all microservices and APIs.
  • Experience on AWS migration project for data transformation from DB2 to AWS cloud.
  • Created Jenkins jobs to execute API tests via pipeline.
  • Monitoring of performance counters (CPU & Memory Utilization of servers, Throughput, Hits per sec etc.) during testing.
  • Hands-on experience using Microsoft Visual Studio to develop web performance scripts and Coded UI scripts.
  • Created and analyzed heap and thread dumps, helping the dev team fine-tune application performance.
  • Analyzed Online Monitor Graphs like Runtime Graphs, Transaction Graphs, Web Resource Graphs, and System Resource Graphs.
  • Analyzed all the various performance metrics involved in the test run like Web resources, CPU, Memory, Request Analysis, DB Connection Pool, and Thread Pool etc.
  • Used Dynatrace and Wily Introscope Extensively to monitor all the Tiers for Determining any performance Bottlenecks.
  • Preparation of Load Generators and test execution, tracking metrics such as TPS, Response Time, Transaction Graphs, Run Time Graphs and Resource Graphs and Throughput, Hits per second, Error statistics.
  • Analyzed the results of the Load test using the Load Runner Analysis tool, looking at the online monitors and the graphs and identified the bottlenecks in the system.
  • Extensively used Splunk to monitor the logs and created the SPLUNK production dashboards for live metrics updates.
  • Troubleshooting on issues such as out of memory, memory leaks, hung sessions and session replication.
  • Involved in monitoring middleware application server performance metrics such as thread count, JVM heap size, and queue size.
  • Pulled reports for DB analysis and shared them with the DBA team for more insight; also used SQL Profiler to monitor the DB while testing the application.
  • Configured Web, Application, and Database server performance monitoring setup using Load Runner Controller, Dynatrace, Splunk & HP diagnostics.
  • Analyzed graphs and reports to check where performance delays occurred, network or client delays, CPU performance, I/O delays, database locking, or other issues at the database server.
  • Worked on throughput, hits per second, network delays, latency, and capacity measurements and reliability tests on multi-layer architectures.
  • Analyzed, interpreted, and summarized meaningful and relevant results in a complete Performance Test Report. Helped establish standards and best practices for system deployments, upgrades, maintenance, and administration.
  • Acquired and maintained load test data, data usage pattern, and triage data by writing SQL, Splunk query and automation script in performance test environment.
  • Investigated, analyzed and made recommendations to management regarding technology improvements, upgrades and modifications.
  • Parameterization to test the application with different sets of data and error handling in the scripts.
  • Conducted application performance monitoring by executing UNIX/Linux/CentOS/RHEL commands (vmstat, prstat, iostat, etc.) using PuTTY & WinSCP.
  • Conducted performance troubleshooting of database connectivity issues during software performance testing with developers & DBA.
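The OS-level monitoring described above (vmstat, iostat over PuTTY/ssh) reduces to reading a few columns from the command output. A minimal sketch that parses a captured `vmstat` sample, assuming the standard Linux column layout and hypothetical values:

```python
def parse_vmstat(output):
    """Parse the last sample line of `vmstat` output (Linux column layout assumed).

    Returns free memory (KB) and CPU busy percentage (100 - idle).
    """
    # Data lines start with the run-queue count, a number; header lines do not.
    lines = [l for l in output.strip().splitlines()
             if l.split() and l.split()[0].isdigit()]
    fields = lines[-1].split()
    free_kb = int(fields[3])    # 'free' column
    idle = int(fields[14])      # 'id' column
    return {"free_kb": free_kb, "cpu_busy_pct": 100 - idle}

# Captured sample (hypothetical values), as collected over ssh during a test run
SAMPLE = """\
procs -----------memory---------- ---swap-- -----io---- -system-- ------cpu-----
 r  b   swpd   free   buff  cache   si   so    bi    bo   in   cs us sy id wa st
 2  0      0 812356 102400 904512    0    0     5    12  110  220 35 10 50  5  0
"""
stats = parse_vmstat(SAMPLE)
if stats["cpu_busy_pct"] > 80:   # flag a potential CPU bottleneck on the host
    print("WARN: CPU saturated")
```

In practice the output would come from the remote host during the load test; column positions can vary across vmstat versions, so the indices here are an assumption.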

Environment: HP LoadRunner 12.53, JMeter 2.6, Performance Center, Microsoft Visual Studio (2010, 2012, 2015), IOmeter, Dynatrace 6.3, Java, Oracle, SharePoint, IBM Mainframe, Splunk, JavaScript, JSP Servlets, JDBC, C, VBScript, TSL, XML, HTML, MS Office, SQL, PL/SQL, Siebel, WSDL, SOA, Web Services, WebSphere, Unix and Windows.

Performance Engineer

Confidential, Hartford, CT

Roles & Responsibilities:

  • Provide expertise on HP Load Runner tool and provide expertise in providing performance-testing solutions on PEGA applications.
  • Collaborate with development, QA, DevOps, Infrastructure team regarding performance test plan preparation, execution and result analysis.
  • Wrote load test scripts, analyzed results, and communicated findings to the project teams.
  • Communicated performance test findings to other departments in the organization and verified their alignment with the application's performance needs.
  • Experienced in administering Unix/Linux Shell scripts to monitor installed J2EE applications and to get information from the logs and database in the required format and other daily activities.
  • Developed and deployed test Load scripts to do end to end performance testing using Load Runner.
  • Act as a lead in providing application design guidance and consultation, utilizing a thorough understanding of applicable technology, tools and existing designs.
  • Identified memory leaks in different components; analyzed and reported protocol-to-protocol response times, web page breakdowns, and component sizes.
  • Analyzed highly complex business requirements and designs and wrote technical specifications to design or redesign complex computer platforms and applications.
  • Configured dashlets in Dynatrace with all performance counters to capture server-side metrics, and PurePaths to capture client-side metrics.
  • Prepared Test Cases, Vugen scripts, Load Test, Test Data, execute test, validate results, manage defects and report results.
  • Added performance measurements for UNIX, Oracle, and WebLogic servers in the LoadRunner Controller and monitored online transaction response times, web hits, TCP/IP connections, throughput, CPU, memory, heap sizes, various HTTP requests, etc.
  • Determined the appropriate web interface protocols before generating the test scripts; analyzed sniffer traces for network bottlenecks.
  • Performed backend testing on Oracle, executing various DDL and DML statements.
  • Developed test scripts in JMeter using HTTP and JDBC protocols.
  • Participate in Weekly Meetings with the management team and Walkthroughs.
  • Detected defects and classified them based on the severity in Quality Center.
  • Provided Screenshots to identify & reproduce the bugs in QC.
  • Interacted with the development team to fix the defects as per the defect report.
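The VuGen and JMeter scripting described above boils down to a common pattern: virtual users iterating over parameterized test data at a fixed pacing while response times are recorded. A minimal thread-based sketch of that pattern, with a hypothetical stand-in transaction instead of a real HTTP call:

```python
import threading
import time

def run_vuser(transaction, data_rows, iterations, pacing_s, results, lock):
    """One virtual user: parameterized data, fixed pacing between iterations."""
    for i in range(iterations):
        row = data_rows[i % len(data_rows)]   # parameterization: cycle test data
        start = time.perf_counter()
        transaction(row)                      # the timed unit of work
        elapsed = time.perf_counter() - start
        with lock:
            results.append(elapsed)
        time.sleep(max(0.0, pacing_s - elapsed))  # pacing: fixed iteration cadence

def run_load_test(transaction, data_rows, vusers=5, iterations=4, pacing_s=0.01):
    """Run `vusers` concurrent virtual users and collect response times."""
    results, lock = [], threading.Lock()
    threads = [
        threading.Thread(target=run_vuser,
                         args=(transaction, data_rows, iterations,
                               pacing_s, results, lock))
        for _ in range(vusers)
    ]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results

# Hypothetical transaction standing in for a request to the system under test
def fake_login(row):
    time.sleep(0.001)

timings = run_load_test(fake_login, [{"user": "u1"}, {"user": "u2"}])
```

Real tools add correlation, think time, ramp-up schedules, and distributed load generators on top of this core loop.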

Environment: LoadRunner 12.1, JMeter, Performance Center, Windows 2000/NT, Citrix 9.2 OEM, RUP, Test Director, Introscope, Oracle 10g, DB2, WebSphere, SOA, Struts, EJB, IIS, XML/SOAP, T-SQL

Performance Test Lead

Confidential, Berkeley Heights, NJ

Roles & Responsibilities:

  • Configured CPU and memory for WAS servers and IBM HTTP Server based on findings from performance monitoring tools.
  • Worked on tuning the JVM parameters as per the application requirements.
  • Configured and administered JDBC connection pools/data sources on WebSphere Application Server.
  • Used Tivoli Performance Monitoring tool to monitor application resources (Enterprise Beans, Servlets) and WAS runtime resources (JVM memory, application server thread pools, database connection pool).
  • Troubleshooting on issues such as out of memory, memory leaks, hung sessions and session replication.
  • Analyzed app server performance through Compuware Dynatrace and the HP Diagnostics/Analysis tools; traced performance issues in the code and at the OS and network levels.
  • Wrote scripts for accessing remote Admin Servers, cleaning up logs during back-ups.
  • Involved in assisting QA team for Load and Stress testing of J2EE applications.
  • Performed routine management of Web Sphere Environment like monitoring Disk space, CPU Utilization.
  • Configured and set up Secure Sockets Layer (SSL) for data encryption and client authentication.
  • Used the WebSphere Application Server key management utility (iKeyman) for managing keys and certificates.
  • Troubleshot both application and JVM errors; analyzed heap/core dumps and error logs, and configured the JVM for optimum performance.
  • Performed performance tuning and recommended configuration changes according to application requirements, increasing/decreasing JVM resources and connections.
  • Decommission of unused Middleware Server instances from Cloud by checking with Application Teams.
  • Responsible for addressing Garbage Collection Tuning, JVM Hung Issues, JVM Crashes, Slow Performance issues, Out of Memory issues.
  • Worked on Severity 1 tickets on Production environments and on various Application and environment issues like Application slowness, Server errors, updating applications.
  • Configured the JDBC Providers and Data Sources to support backend database applications for Oracle and SQL Server.
  • Used Monitoring tools Dynatrace, BMC and QPASA for Applications, servers and MQ.
  • Created and configured MQ Objects like Queue Managers, Remote queues, Local Queues, Queue Aliases, Channels, Clusters, Transmission Queues, Performance Events, Triggers, and Processes.
  • Created groups for TC servers to speed up the process of performance monitoring by checking related performance metrics.
  • Administered Data sources, JMS resources, JVM Settings, Garbage collection, Fine tune the application performance using Heap size, Database connection pools, Web container and thread pools, Created Heap dumps and Core dumps & analyzed dumps using Heap Analyzer.
  • Monitored different graphs like transaction response time and analyzed server performance.
  • Involved in application deployment in production and monitored the services when they went Live.
  • Hands-on experience in database tuning using OMEGAMON and PE report tools, tracking status, hits per second, throughput, Windows resources, database server resources, etc.
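The garbage-collection and out-of-memory work above usually starts from one signal: heap occupancy measured right after each collection. If that post-GC floor keeps rising, the GC cannot reclaim some objects, which is the classic leak symptom. A minimal sketch using a hypothetical, simplified GC log format (real verbose-GC output varies by JVM and version):

```python
import re

# Hypothetical, simplified GC log: heap occupancy after each collection (MB)
GC_LOG = """\
[GC pause] heap after: 210MB
[GC pause] heap after: 248MB
[GC pause] heap after: 290MB
[GC pause] heap after: 333MB
"""

def heap_after_gc(log_text):
    """Extract post-GC heap occupancy values, in MB."""
    return [int(m) for m in re.findall(r"heap after:\s*(\d+)MB", log_text)]

def leak_suspected(samples, min_samples=3):
    """Crude leak heuristic: post-GC heap grows monotonically across samples.

    A rising floor after collections suggests objects the GC cannot reclaim.
    """
    if len(samples) < min_samples:
        return False
    return all(b > a for a, b in zip(samples, samples[1:]))

samples = heap_after_gc(GC_LOG)
```

A monotonic check is deliberately crude; heap dump analysis (as with Heap Analyzer, mentioned above) is what confirms which objects are actually retained.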

Environment: LoadRunner 11.52, 11.x and 9.x, Performance Center, IBM WebSphere Application Server 6.x/7.x/8.5, IBM HTTP Server 6.1/7.x/8.5, IBM Rational Tools, SiteScope, vFabric Hyperic, GFMon, Mule Management Console (MMC), Splunk, Quality Center, HP Diagnostics Tool, Web, Windows 2000/XP, HP-UX, AIX.

Performance Tester

Confidential, Pittsburgh, PA

Roles &Responsibilities:

  • Collaborated on writing test plans and choosing strategies; assisted with setting up test environments and preparing data, tools, and hardware for testing projects.
  • Participated in code reviews; maintained script code libraries; tested new code introduced into a function library; ensured that test scripts followed standards for design, coding, and documentation.
  • Involved in Gathering Requirements, Business Processes and Peak volume necessary for Performance Testing.
  • Verified that web application performance met the requirements for page response times, numbers of concurrent users, and system resource usage.
  • Used Performance Center for managing performance testing activities.
  • Python is a widely used general-purpose, high-level programming language; its design philosophy emphasizes code readability, and its syntax allows programmers to express concepts in fewer lines of code than languages such as C++ or Java.
  • Developed scripts for .net & Java applications using Web (HTTP/HTML) protocol.
  • Developed LoadRunner scripts in VuGen and created different scenarios in Performance Center as per the requirements.
  • Reviewing functional and technical design documentation to determine performance test requirements.
  • Analyzing and interpreting performance test results and evaluate impact on production infrastructure and make comparisons against previous benchmarks.
  • Coordinating with Business, Development and Support teams as required for performance test plan preparation and execution.
  • Proactively identified problems, performed workflow analysis, and recommended quality improvements.
  • Managing activities with onshore and offshore team members to ensure quality deliverable on multiple projects.
  • Involved in creating and updating the Performance test plan based on the inputs from various sources like Business users, Business Analysts, Development team etc.
  • Worked with various business owners and users on arriving at SLAs for the AUT.
  • Responsible for scripting load test scenarios from scratch using a variety of protocols, such as Web (HTTP/HTML), Mobile HTTP/HTML, and Web Services.
  • Responsible for creating load test scenarios and scheduling load tests from Performance Center.
  • Responsible for creating the list of application monitors that needed to be added from Performance Center.
  • Executed various load tests such as stress test, endurance test, throughput test, capacity test.
  • Involved in the write-up of performance test results and presented them to management and other audiences.
  • Debug, troubleshoot, and work with team members to find and fix software defects.
  • Used Performance Center Analysis for generating a variety of graphs including Merging, overlaying of the existing baseline results.
  • Facilitated various performance-tweaking exercises and worked with developers to resolve performance-related bugs.
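The baseline comparisons described above (merging and overlaying prior results in the Analysis tool) amount to checking each transaction's current response time against its baseline within a tolerance. A minimal sketch with hypothetical transaction names and 90th-percentile values:

```python
def compare_to_baseline(baseline, current, tolerance_pct=10.0):
    """Flag transactions whose current response time exceeds the baseline by
    more than tolerance_pct. Both inputs map transaction name -> time (s).
    """
    regressions = {}
    for name, base in baseline.items():
        cur = current.get(name)
        if cur is None:
            continue                      # transaction absent from this run
        delta_pct = (cur - base) / base * 100.0
        if delta_pct > tolerance_pct:     # slower than baseline beyond tolerance
            regressions[name] = round(delta_pct, 1)
    return regressions

# Hypothetical baseline vs. current-run 90th-percentile response times
baseline = {"login": 0.80, "search": 1.20, "checkout": 2.00}
current  = {"login": 0.85, "search": 1.50, "checkout": 1.90}
flagged = compare_to_baseline(baseline, current)
```

The tolerance would normally come from the SLAs agreed with business owners, as noted above; 10% here is only an illustrative default.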

Environment: LoadRunner, HP ALM, Splunk, QuickTest Pro, LDAP, Oracle, MS SQL Server, WebLogic, WebSphere, Java, Test Director, J2EE Diagnostic Tool, JMeter, Web, Windows 2000/XP, Solaris, AIX, IE, Netscape, Firefox.
