
Performance Test Lead / Sr. Performance Tester Resume

Farmington, MI

SUMMARY:

  • Over 8 years of experience in the IT industry, specializing in performance engineering processes and methodologies.
  • Experienced in the Software Development Life Cycle (SDLC): requirement gathering and analysis, planning, design, development, testing and implementation.
  • Expertise in analyzing business, technical, functional and non-functional requirements.
  • Experience in analyzing performance bottlenecks such as high CPU usage and memory leaks.
  • Analyzed cross results and cross scenarios, overlaying and merging graphs in LoadRunner Analysis.
  • Experience in isolating problematic components using Web Page Diagnostics.
  • Excellent knowledge of programming languages such as C, C++ and Java, used to debug and execute LoadRunner scripts.
  • Developed SQL queries to validate database data.
  • Used HP tools: Quality Center (QC), Service Center, SiteScope, Performance Center and LoadRunner.
  • Monitored CPU, memory, ASP requests, network, web connections and throughput while running baseline, performance, load, stress and soak tests.
  • Excellent skills in installing and maintaining LoadRunner software.
  • Installed and configured Dynatrace agents on application servers to collect server statistics and report them to the Dynatrace client.
  • Skilled in debugging and adjusting scripts by running them within VuGen with run-time settings logs set to display all messages.
  • Exposure to the various LoadRunner functions used in the Virtual User Generator (VuGen) for scripting.
  • Performed IP spoofing using LoadRunner to investigate load-balancing issues.
  • Proficient in designing and implementing scenarios and loading LoadRunner scripts into the Controller.
  • Experience enhancing VuGen scripts with automatic and manual correlation, parameterization and synchronization techniques (see the sketch after this list).
  • Configured run-time settings for HTTP iterations.
  • Simulated modem speeds to bring test scenarios closer to real-world conditions.
  • Analyzed LoadRunner metrics and results from other performance monitoring tools during and after performance testing of application and database servers, and generated graphs and reports.
  • Strong functional and technical experience with Oracle Financials.
  • Experienced working on Oracle E-Business Suite version 12.1.2.
  • Expert in working on Oracle E-Business applications using the Oracle NCA and Oracle Web Applications 11i protocols.
  • Collaborated with functional team members on functional design documentation in Oracle ERP.
  • Well versed in the behavior of online monitors, techniques for fixing monitoring issues, and monitoring Vuser status.
  • Experienced with Foglight, Wily Introscope, HP SiteScope and Dynatrace monitoring tools.
  • Experienced in using database, network, application server and WebLogic monitors during execution to identify bottlenecks, bandwidth and infrastructure problems, and to establish scalability and reliability benchmarks.
  • Hands-on experience with multiple versions of LoadRunner and Performance Center.
  • Used the Selenium framework with Selenium WebDriver and Selenium RC for automation, and supported continuous integration using the TestNG and JUnit frameworks.
  • Automated test scripts and scheduled them to run automatically using Jenkins.
  • Experienced working in Agile methodology environments.
  • Strong judgment, analytical, communication and documentation skills across all phases of the QA process.
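
The following is a minimal VuGen (C) sketch of the correlation and parameterization techniques listed above; the URL, transaction name, parameter names ({UserName}, {Password}) and correlation boundaries are hypothetical placeholders rather than details from any specific engagement.

```c
Action()
{
    // Capture the dynamic token returned by the login page (manual correlation);
    // the left/right boundaries below are illustrative placeholders.
    web_reg_save_param("SessionToken",
                       "LB=name=\"token\" value=\"",
                       "RB=\"",
                       "Ord=1",
                       LAST);

    lr_start_transaction("01_Login");

    // {UserName} and {Password} are resolved from a parameter file (parameterization).
    web_submit_data("login",
                    "Action=https://example.com/login",
                    "Method=POST",
                    "Mode=HTML",
                    ITEMDATA,
                    "Name=username", "Value={UserName}", ENDITEM,
                    "Name=password", "Value={Password}", ENDITEM,
                    LAST);

    lr_end_transaction("01_Login", LR_AUTO);

    // Replay the correlated token in a follow-up request.
    web_url("home",
            "URL=https://example.com/home?token={SessionToken}",
            "Mode=HTML",
            LAST);

    return 0;
}
```

The captured {SessionToken} value replaces the recorded hard-coded value in later requests, which is the essence of manual correlation.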

TECHNICAL SKILLS:

Operating Systems: Solaris, UNIX, Linux, Windows NT/2000/XP/2003/Vista

Languages: C, C++, JAVA/J2EE, SQL

Databases: Oracle, DB2, SQL Server, MS-ACCESS, MySQL

Web Related: XML, HTML, CSS, JavaScript, jQuery, Knockout.js

Testing Tools: LoadRunner 7.5/7.6/7.8/8.0/9.1/9.5/11/11.50/11.52/12.02, Performance Center 9.5/11/11.52/12, Quality Center, Service Center, TOAD, SoapUI

Web/Application Servers: Apache Tomcat, WebLogic, WebSphere

Monitoring Tools: Wily Introscope, HP SiteScope, AppDynamics, Dynatrace, PerfMon

PROFESSIONAL EXPERIENCE:

Confidential, Farmington MI

Performance Test Lead/Sr Performance Tester

Environment: LoadRunner 11, JMeter, Performance Center 11, ALM, Quality Center, SoapUI 4.1, Java, HTML, XML, JavaScript, WebSphere, DB2, Wily Introscope, HP SiteScope, Dynatrace APM, PerfMon, Wireshark, HttpWatch, Selenium (automation), Jenkins, GitHub and JIRA (bug tracking)

Responsibilities:

  • Analyzed business and functional requirements and design review documents to develop the test plan.
  • Interacted with the Business Analyst and application teams to discuss the performance requirements and load test strategy.
  • Involved in walkthroughs and meetings with the project team to discuss related issues.
  • Responsible for developing Test Scripts for Performance, positive, negative and edge cases.
  • Created customized LoadRunner VuGen scripts at the API level with manual correlation, user-defined functions, development libraries (classes and methods) and error handling (see the sketch after this list).
  • Developed clearly defined performance test plans to ensure the test scenarios developed by the performance group would accomplish load-testing objectives.
  • Generated Vusers in LoadRunner for load and performance testing.
  • Coordinated creation of stress environments to conduct stress/load testing.
  • Applied strong Oracle Financials functional and technical experience.
  • Worked on Oracle E-Business Suite version 12.1.2.
  • Worked on Oracle E-Business applications using the Oracle NCA and Oracle Web Applications 11i protocols.
  • Collaborated with functional team members on functional design documentation in Oracle ERP.
  • Created LoadRunner scenarios and scheduled virtual users to generate realistic load on the servers.
  • Designed performance test suites by creating Vuser scripts and workload scenarios, setting transactions and rendezvous points, and assembling them into suites using LoadRunner.
  • Served as a technical resource for modeling, simulation and analysis tools.
  • Generated Vuser scripts and executed performance tests using LoadRunner.
  • Developed Vuser scripts and enhanced them by parameterizing constant values in LoadRunner.
  • Ran web services in SoapUI for load and performance testing.
  • Prepared large data sets for the parameterized values in the scripts for multiple scenarios.
  • Customized scripts in LoadRunner according to business specifications.
  • Used manual and automatic correlation to handle dynamically changing values.
  • Used LoadRunner monitors to measure transaction response time, network delay and throughput.
  • Used Wily Introscope and HP SiteScope to monitor and collect metrics on production and test servers, and to check transaction response times for individual queries.
  • Used JMeter to test performance on both static and dynamic web resources.
  • Performed load and performance testing of web (HTTP/HTTPS) and SOAP/REST services.
  • Used Performance Monitor for monitoring the required counters for application servers, web servers and database servers.
  • Captured network packets using the Wireshark and HttpWatch tools.
  • Monitored system resources of application servers and databases using Dynatrace during the test and identified the performance bottlenecks.
  • Archived Dynatrace sessions after each test for future review.
  • Automated functional requirements using Selenium and supported continuous integration using the TestNG and JUnit frameworks.
  • Built a Selenium framework from scratch and set up the new environment.
  • Built a Selenium framework for REST and SOAP web services.
  • Automated test scripts and scheduled them to run automatically using Jenkins via GitHub.
  • Used JIRA for defect/bug tracking and participated in sprints.
  • Worked in Agile methodology environments.
  • Participated in defect meetings to discuss bottlenecks and long-running queries.
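
A brief sketch, in the same VuGen (C) style, of a user-defined function with error handling of the kind described above; the endpoint, the {OrderId} parameter and the confirmation text check are illustrative assumptions, and the helper would be called from Action() once per iteration.

```c
// Hypothetical helper: wraps the request in a transaction and fails it
// explicitly when the expected confirmation text is missing.
int submit_order(const char *order_id)
{
    int found;

    // Text check registered before the request (illustrative).
    web_reg_find("Text=Order Confirmed",
                 "SaveCount=OrderConfirmedCount",
                 LAST);

    lr_start_transaction("02_Submit_Order");

    web_url("submit_order",
            "URL=https://example.com/orders/{OrderId}",   // hypothetical endpoint
            "Mode=HTML",
            LAST);

    found = atoi(lr_eval_string("{OrderConfirmedCount}"));
    if (found == 0) {
        lr_error_message("Order %s was not confirmed", order_id);
        lr_end_transaction("02_Submit_Order", LR_FAIL);
        return -1;
    }

    lr_end_transaction("02_Submit_Order", LR_PASS);
    return 0;
}
```
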
Confidential, NH

Performance Test Lead

Environment: LoadRunner 11.0/11.5, HP ALM-PC 11.0/11.52, Foglight, J2EE, Siebel, .NET, Web Services, TIBCO, XML, HTML, Oracle, WebSphere (WAS), MS IIS Server, DataPower, F5.

Responsibilities:

  • Responsible for test design, defect tracking, reporting and reviews of test execution.
  • Managed multiple stakeholders in an onsite-offshore setup and was involved in all performance engineering activities.
  • Participated in requirements and design reviews to identify test scenarios/cases to be executed for Performance and Load Testing.
  • Designed the performance test environment for accurate projection by capturing details of the production environment.
  • Based on production volumes captured from the business, designed load, performance, stress, volume and long-duration/soak test scenarios.
  • Analyzed application test results and environment behavior under high loads and optimized server configurations.
  • Tested the performance of J2EE, J2SE and SOA applications on Apache Tomcat, WebSphere Application Server, F5 and IBM DataPower appliances.
  • Analyzed CPU utilization, memory usage, thread usage, garbage collection and DB connections to verify application performance.
  • Analyzed the network connections and logs to troubleshoot any network issues.
  • Used monitoring tools such as Wily, Dynatrace, HP Performance Center 12 and HP Diagnostics.
  • Used LoadRunner Analysis to analyze performance test results.
  • Ensured a sufficient level of stakeholder participation in all phases of the performance testing life cycle.
  • Ensured appropriate stakeholder signoff was obtained, where required, on test artifacts and exceptions.
Confidential, CT

Performance Testing Analyst

Environment: Java, JavaScript, VBScript, C, C++, HTML, LoadRunner 9.5, Wily Introscope, Dynatrace APM, Performance Center 9.5, Service Center, Velocity, SoapUI 3.1.

Responsibilities:

  • Interacted with the Business Analyst and application teams to discuss the performance requirements and load test strategy.
  • Developed the performance Test Plans and Load Test Strategies.
  • Developed and executed scripts for Mobile applications.
  • Prepared workload profiles through meetings with the project teams and business stakeholders.
  • Participated in various phases of the product review, recommendation and evaluation process, working closely with Architects, Business Leads, Project Managers, Business Analysts and Windows/UNIX infrastructure SMEs to determine business impact based on the number of customers and transactions.
  • Developed Vuser scripts using Web (HTTP/HTML), Ajax (Click and Script) and Web Services.
  • Verified SOAP message delivery to the web services and validated the XML-formatted responses using SoapUI (see the sketch after this list).
  • Performance tested middleware applications developed in a SOAP environment.
  • Responsible for scripting all load testing scenarios for various sub-systems using a variety of protocols including Web (HTTP/HTML), Web (Click and Script), SAP GUI, SAP-Web, Citrix ICA, IMAP, RDP and Mobile.
  • Created customized LoadRunner VuGen scripts at the API level with manual correlation, user-defined functions, development libraries (classes and methods) and error handling.
  • Enhanced Vuser scripts by adding correlations, parameters, conditional controls and checking/validation functions.
  • Monitored graphs such as transaction response time, hits per second, throughput, Windows resources and database server resources, and analyzed server performance.
  • Found performance degradation issues such as out-of-memory errors, memory leaks and transaction rollbacks, and improved thread pool utilization and JDBC connection pool sizing.
  • Good experience using Dynatrace to pinpoint performance issues.
  • Used Web Page Diagnostics to analyze web page responses.
  • Analyzed load patterns and created test scenarios to emulate real-life stress conditions.
  • Conducted meetings with developers, the application team and the business team to analyze defects and evaluate test executions.
  • Involved in decision making with management for final application releases.
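
As a companion to the SOAP verification bullets above, here is a minimal VuGen (C) sketch of a SOAP call issued with web_custom_request; the service URL, SOAPAction value, envelope payload and {OrderId} parameter are hypothetical.

```c
Action()
{
    // Validate that the response contains the expected status element.
    web_reg_find("Text=<OrderStatus>", LAST);

    // SOAPAction header; the value is an assumption for illustration.
    web_add_header("SOAPAction", "\"GetOrderStatus\"");

    lr_start_transaction("03_GetOrderStatus");

    web_custom_request("GetOrderStatus",
                       "URL=https://example.com/services/OrderService",
                       "Method=POST",
                       "EncType=text/xml; charset=utf-8",
                       "Body=<soapenv:Envelope xmlns:soapenv=\"http://schemas.xmlsoap.org/soap/envelope/\">"
                            "<soapenv:Body><GetOrderStatus><OrderId>{OrderId}</OrderId>"
                            "</GetOrderStatus></soapenv:Body></soapenv:Envelope>",
                       LAST);

    lr_end_transaction("03_GetOrderStatus", LR_AUTO);

    return 0;
}
```
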
Confidential

Performance Tester

Environment: Java, HTML, IBM WebSphere, XML, SQL, Windows XP, UNIX, Linux, LoadRunner, Performance Center 9.5, SharePoint, F5 load balancer, TOAD.

Responsibilities:

  • Involved in end-to-end testing of the application, from requirement gathering to report generation.
  • Involved in walkthroughs and meetings with the performance team to discuss related issues.
  • Generated Vusers in LoadRunner for load and performance testing.
  • Also responsible for working on projects related to the insurance, finance and banking sectors.
  • Responsible for test lab setup, including installation of the load testing infrastructure such as Controller machines and load generators, and the IP address reservations needed for IP spoofing.
  • Involved in writing SQL queries to test data integrity of the backend database.
  • Created LoadRunner scenarios and scheduled virtual users to generate realistic load on the servers.
  • Generated Vuser scripts and executed performance tests using LoadRunner.
  • Developed Vuser scripts and enhanced them by parameterizing constant values in LoadRunner.
  • Prepared large data sets for the parameterized values in the scripts for multiple scenarios.
  • Customized scripts in LoadRunner according to business specifications.
  • Used manual and automatic correlation to handle dynamically changing values.
  • Conducted performance testing by creating virtual users and scenarios in LoadRunner.
  • Designed performance test suites by creating Vuser scripts and workload scenarios, setting transactions and rendezvous points, and assembling them into suites using LoadRunner.
  • Set up a clustered environment for application and database servers, performed large-scale scalability testing and set up the load balancer.
  • Used LoadRunner monitors to measure transaction response time, network delay and throughput.
  • Monitored throughput, Windows resources, network data and hits per second.
  • Collaborated with the team to install and configure Wily Introscope.
  • Conducted memory leak testing for all applications using LoadRunner and Wily Introscope.
  • Used Wily Introscope to monitor and collect metrics on production and test servers.
  • Involved in testing database applications using TOAD.
  • Participated in defect meetings to discuss bottlenecks and long-running queries.
Confidential

Performance Tester

Environment: Windows Server, SQL, C, Java, WinRunner, LoadRunner, Quality Center.

Responsibilities:

  • Involved in business meetings, analyzed requirements and designed the required documents.
  • Wrote test plans incorporating performance testing objectives, the test environment, user profiles, risks, test scenarios, the tools used, schedules, monitors, analysis and presentation of results.
  • Wrote LoadRunner scripts and enhanced them with C functions.
  • Created flexible LoadRunner scripts that allowed fast configuration changes during testing.
  • Parameterized users, stored dynamic content with LoadRunner functions and used client-side secure certificates.
  • Configured run-time settings for HTTP iterations.
  • Created scenarios for concurrent (rendezvous) and sequential users (see the sketch after this list).
  • Created single-user, baseline and smoke test scenarios.
  • Introduced random pacing between iterations to achieve the desired transactions per hour.
  • Responsible for analyzing application and component behavior under heavier loads and optimizing server configurations.
  • Worked closely with Production Managers, Technical Managers and Business Managers in planning, scheduling, developing, and executing performance tests.
  • Interacted with developers during testing for identifying memory leaks, fixing bugs and for optimizing server settings at web, app and database levels.
  • Used LoadRunner tools for testing and analysis.
  • Generated reports after the test by comparing the current test results with the previous results and checking the improvement in the performance.
  • Involved in fine-tuning and suggested options for improving performance.
  • Helped project teams identify bottlenecks and suggested options for avoiding them.
  • Used Quality Center to upload documents and log defects.
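
A short VuGen (C) sketch of the rendezvous (concurrency) point and randomized think time mentioned above; the rendezvous name, transaction name and URL are illustrative, and iteration pacing itself is configured in the run-time settings rather than in the script.

```c
Action()
{
    // All Vusers wait at this rendezvous so the checkout request fires concurrently.
    lr_rendezvous("checkout_peak");

    lr_start_transaction("04_Checkout");

    web_url("checkout",
            "URL=https://example.com/checkout",   // hypothetical URL
            "Mode=HTML",
            LAST);

    lr_end_transaction("04_Checkout", LR_AUTO);

    // Randomized think time of roughly 5-14 seconds; run-time settings can scale or cap this.
    lr_think_time(5 + rand() % 10);

    return 0;
}
```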
