
Performance Test Lead Resume

Bloomfield/Windsor, Connecticut

OBJECTIVE:

Seeking a challenging role in software performance, manual, and automation QA (web services) testing using HP LoadRunner (VuGen) and Fiddler, JMeter (primarily for API calls), Badboy and BlazeMeter (for recording and test execution in the BlazeMeter cloud), ALM/Performance Center, GitLab, GitHub, Jenkins, SevOne, Jira, CA Agile Central, Microsoft SQL Server Management Studio, Microsoft Azure Portal, IWS Scheduler, Perfmon, etc.

SUMMARY:

  • Follow up and coordinate with the offshore team to facilitate test activities.
  • Discuss future release scope for performance testing with clients and stakeholders.
  • Strong knowledge of all phases of the Software Testing Life Cycle (STLC) and Software Development Life Cycle (SDLC).
  • Participated in gathering non-functional requirements (NFRs) by interacting with clients, business analysts, project teams, etc.
  • Participated in putting together business requirement documents and writing test plans.
  • Extensive knowledge of writing test cases, designing and developing test scripts, and analyzing and presenting test results.
  • Participated in end-to-end testing to identify system dependencies.
  • Experience testing various web-based and client/server applications across technologies, platforms, and domains using performance testing (web services).
  • Experience across the software development lifecycle, developing and automating tests using LoadRunner and Fiddler, Performance Center, JMeter, and Badboy and BlazeMeter (for recording).
  • Experience in non-functional testing: load testing, stress testing, and soak testing.
  • Monitored mostly at the hardware level (RAM/memory and disk) to identify bottlenecks in the system under test.
  • LoadRunner (VuGen, Controller, Analysis): workload modeling, scripting in VuGen, correlation, parameterization, content checks, inserting rendezvous points, applying functions to VuGen scripts, and baseline and bandwidth testing using Fiddler; a representative VuGen sketch appears after this summary.
  • Extensive knowledge of load, stress, spike, scalability, and volume testing.
  • Knowledge of single-user baseline and bandwidth analysis using the Fiddler web debugging tool.
  • Significant exposure to test tools such as LoadRunner, Fiddler, JMeter and Badboy, and Performance Center.
  • Experience using monitoring tools such as New Relic, Dynatrace, JVisualVM, JConsole, and YourKit Java Profiler to drill down into performance issues, including concurrency and component issues.
  • Experience with DBMSs (MySQL, MS SQL, Oracle, DB2) for database testing using JMeter and for retrieving parameters for application enhancement.
  • Experience in application server environments (WebLogic 12c, JBoss, and WebSphere), investigating thread contention through monitoring tools such as Dynatrace, New Relic, JConsole, JVisualVM, and YourKit Java Profiler.
  • Tested web services by validating XML, JSON, and WSDL using LoadRunner, JMeter, and SoapUI (invoking the executable from LoadRunner); also used JMeter to create load tests and JDBC test plans and to add assertions to JDBC test plans.
  • Worked in distributed Linux and Windows environments and performed end-to-end testing.
  • Experience preparing, analyzing, and executing performance test plans, test cases, and workload models, focusing on business-critical and frequently used application scenarios.
  • Experience in end-to-end testing to determine whether systems are dependent or not.
  • Monitored at the hardware level (RAM/memory and disk) to identify performance issues.
  • Identified the test environment, identified and analyzed performance acceptance criteria, and planned and designed the workload model (how to spread load across scripts according to functionality) using Little's Law.
  • Configured the test environment with the help of system administrators and developers and launched the application in VuGen.
  • Recorded the application in Badboy and BlazeMeter, then imported it into JMeter in JMX format.
  • Executed JMeter scripts in the BlazeMeter cloud and investigated timeline report KPIs, engine health, errors, logs, etc.
  • Performed performance and load tests using LoadRunner and JMeter; generated scripts in VuGen using the HTTP/HTML, Web Services, and SAP GUI protocols.
  • Experience creating virtual user scripts, defining user behavior, and conducting load test scenarios; inserted transactions, rendezvous points, content checks, correlations, parameterization, and comments into Vuser scripts to better understand load conditions.
  • Experience using LoadRunner to analyze application performance under various load and stress conditions.
  • Extensive experience in web application testing, including system, functional (web services), integration, and regression testing.
  • Familiar with most phases of the SDLC and STLC across Agile, Waterfall, and V-Model methodologies.
  • Monitored mostly at the internal level to identify bottlenecks in the system under test.
  • Experience in webpage diagnostics, comparing different metrics to determine bottlenecks in applications.
  • Experience designing and executing test scenarios in the Controller.
  • Experience identifying performance bottlenecks in websites under heavy load.
  • Experience using the Microsoft Azure Portal, IWS Scheduler, and Perfmon (on Windows machines) for result capture and application monitoring.
  • Used ALM, Jira, and CA Agile Central for defect management and result sharing.
  • Used JMeter to initiate API calls from a network location.
  • Used the console server/command prompt to execute batch jobs from blob to staging.
  • Used Jenkins, GitLab, and SevOne to compile server metrics and share them with DBAs, PVS team members, stakeholders, etc. for feedback.
  • Used the SSIS server/command prompt to execute batch jobs from staging to live tables.
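
A representative VuGen (C) Action combining the techniques above (transactions, a content check, correlation, parameterization, and a rendezvous point) is sketched below. This is a minimal, hypothetical example: the host, parameter names, and correlation boundaries are placeholders, not taken from any engagement described in this resume.

Action()
{
    /* Content check: verify the landing page actually rendered */
    web_reg_find("Text=Welcome", LAST);

    /* Correlation: capture a dynamic token from the next response
       (left/right boundaries below are placeholders) */
    web_reg_save_param("SessionToken",
                       "LB=name=\"token\" value=\"",
                       "RB=\"",
                       LAST);

    lr_start_transaction("01_Login");

    /* Parameterization: {UserName} and {Password} come from a VuGen data file */
    web_submit_data("login",
        "Action=https://app.example.com/login",
        "Method=POST",
        "Mode=HTML",
        ITEMDATA,
        "Name=user", "Value={UserName}", ENDITEM,
        "Name=pass", "Value={Password}", ENDITEM,
        LAST);

    lr_end_transaction("01_Login", LR_AUTO);

    lr_think_time(5);

    /* Rendezvous: hold Vusers here so the next step fires under concurrent load */
    lr_rendezvous("checkout");

    lr_start_transaction("02_Checkout");

    web_url("checkout",
        "URL=https://app.example.com/checkout?token={SessionToken}",
        "Resource=0",
        "Mode=HTML",
        LAST);

    lr_end_transaction("02_Checkout", LR_AUTO);

    return 0;
}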

TECHNICAL SKILLS:

TESTING TOOLS/SOFTWARE: LoadRunner, Fiddler, Badboy, BlazeMeter, ALM/Performance Center, Jenkins, TeamQuest, GitLab, GitHub, SevOne, OATS (Oracle Application Testing Suite), JMeter.

OPERATING SYSTEMS: Windows, UNIX (Solaris), Linux (Red Hat Enterprise, Ubuntu LTS)

PROGRAMMING AND SCRIPTING: C, Java, XML, JavaScript, JSON

DATABASES AND TOOLS: MS SQL Server (Management Studio), IWS Scheduler, IBM DB2, MongoDB, Oracle, MySQL.

WEB TECHNOLOGIES/PROTOCOLS: Web services, WSDL, SOAP, HTTP/HTML, SAP GUI, XML, Java, CSS.

EXPERIENCE:

Confidential, Bloomfield/Windsor, Connecticut

Performance Test Lead

Responsibilities:

  • Attended assessment/estimation meetings with stakeholders and team members to determine whether an engagement carried enough risk to warrant PVS testing, following requests for PVS testing from business owners.
  • Collected requirements from application owners to build test strategies (a test strategy draws on many inputs).
  • Developed questionnaires and test strategies and worked with DBAs and network administrators to compare the test and production environments.
  • Developed and enhanced scripts by simulating all user activities before applying correlation, parameterization, content checks, etc.
  • Designed scenarios and executed shakedown, volume, scalability, stress, endurance, and resiliency tests in ALM/Performance Center/Controller.
  • Used Dynatrace, New Relic, TeamQuest, YourKit Java Profiler, etc. for monitoring and metrics collection.
  • Experience in analysis, including merging various graphs and webpage diagnostics. Attended daily scrum meetings to provide daily project status reports to the Scrum Master.
  • Gave weekly status reports to the PVS team/manager about engagements at hand and worked with the offshore team to keep them updated with onshore status reports about engagements.
  • Sent the offshore team all needed information about engagements they were involved in and coordinated with them for timely delivery.
  • Attended Confidential meetings and gave input on which engagements should receive the highest or lowest points in CA Agile Central (Rally).
  • Created user stories in CA Agile Central with all needed information.
  • Logged defects, whenever identified, in CA Agile Central after manually navigating through the applications to be scripted.
  • Prepared and sent agent-installation requests to DBAs through the project lead so agents could be installed on the servers targeted by the tests.
  • Prepared and sent window requests in a specified format through the project lead to DBAs/network administrators to allocate windows for test executions.
  • Scripted new engagements and enhanced older scripts using VuGen before test executions.
  • Executed JMeter scripts in the BlazeMeter cloud and investigated timeline report KPIs, engine health, errors, logs, and percentile, average, minimum, and maximum response times.
  • Dug into artifacts in ALM/Performance Center using run IDs to diagnose issues after failed test executions.
  • Prepared LoadRunner reports using ALM/Analysis and shared them with stakeholders, copying team members as needed.
  • Used Jenkins to compile and prepare reports on all servers targeted during testing after test executions.
  • Used GitLab and Jenkins to compile server metrics (TeamQuest reports) after test executions and shared them with stakeholders, performance engineers, and DBAs for review and feedback. Used SevOne to compile IBM server metrics after executions and shared them with stakeholders for review and feedback.
  • Prepared hand-off reports in a specified format for stakeholders, the performance team, DBAs, etc. for feedback and approval before going to production.

Confidential, Charlotte, North Carolina

Technical Test Lead

Responsibilities:

  • Followed up and coordinated with the offshore team to facilitate test activities.
  • Discussed future release scope for performance testing with clients and stakeholders.
  • Measured the time taken for files to be processed from network/SFTP/local systems to the different blob stages, using an API (an interface), console logs, and SSIS server logs.
  • Engaged in client discussions on site and provided the necessary results to the offshore team.
  • Engaged in offshore discussions and provided the needed input to the client as required.
  • Engaged in NFR gathering from the client, discussed with the development team, and presented results to the client for sign-off before test execution.
  • Participated in reviewing business flows prior to test execution.
  • Participated in workload modeling.
  • Monitored performance counters/metrics for web server, database server, and network parameters.
  • Provided daily reports to stakeholders, including activities completed/in progress, pending action items, open issues, etc.
  • Participated in presenting results to the client after test execution and result gathering.
  • Provided recommendations to stakeholders, including upgrading CPU and memory to better sizing in production for the respective servers.
  • Used Microsoft SQL Server Management Studio to determine the number of records processed by batch jobs.
  • Used console and SSIS server logs to determine batch job processing time in the different blob stages and staging.
  • Used the Microsoft Azure Portal to monitor server and application health.
  • Used Perfmon to monitor Windows resources and imported collected metrics into analysis graphs. Performance tested a SAP GUI application with LoadRunner using different transaction codes and the SAP GUI protocol in VuGen: added transactions, rendezvous points, parameterization, conversions, and verification points/functions; replayed SAP GUI scripts; identified SAP GUI objects and control IDs; created scenarios in the Controller; and investigated performance metrics.
  • Load tested web services (SOAP and REST) using LoadRunner, SoapUI, and Postman; handled web services scripts with the HTTP/HTML protocol, the Web Services protocol (from a WSDL file or by recording in VuGen), and by importing web services XML files into LoadRunner. POST, PUT, GET, and DELETE were used frequently in those scripts; see the sketch after this list.
  • Utilized JMeter to initiate API calls to place files in the blob from a network location.
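
For the REST calls described above, web services requests in VuGen are often hand-coded with web_custom_request. The sketch below is a minimal, hypothetical example, assuming a placeholder endpoint, headers, and JSON body; it posts a file-ingestion request of the general kind used to push a file toward the blob stages and then polls its status.

Action()
{
    /* Content check on the JSON response (placeholder expected text) */
    web_reg_find("Text=\"status\":\"ACCEPTED\"", LAST);

    /* Correlation: capture an id returned by the service for the later GET */
    web_reg_save_param("RequestId", "LB=\"requestId\":\"", "RB=\"", LAST);

    web_add_header("Content-Type", "application/json");

    lr_start_transaction("01_SubmitFile");

    /* POST a hypothetical file-ingestion request (REST, JSON payload);
       {FileName} is a hypothetical VuGen parameter */
    web_custom_request("submit_file",
        "URL=https://api.example.com/v1/files",
        "Method=POST",
        "RecContentType=application/json",
        "Body={\"fileName\":\"{FileName}\",\"source\":\"SFTP\"}",
        LAST);

    lr_end_transaction("01_SubmitFile", LR_AUTO);

    lr_start_transaction("02_PollStatus");

    /* GET the processing status using the correlated id */
    web_custom_request("poll_status",
        "URL=https://api.example.com/v1/files/{RequestId}",
        "Method=GET",
        "RecContentType=application/json",
        LAST);

    lr_end_transaction("02_PollStatus", LR_AUTO);

    return 0;
}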

Environment: Java, .NET, Microsoft SQL Server Management Studio, Microsoft Azure Portal, MS Excel, PowerPoint, Windows, IWS Scheduler, etc. Data formats included XML, JSON, HTML, and plain text.

Confidential, Austin, TX

Performance Test Analyst

Responsibilities:

  • Followed up and coordinated with the offshore team to facilitate test activities.
  • Discussed future release scope for performance testing with clients and stakeholders. Measured the time taken for files to be processed from network/SFTP/local systems to the different blob stages, using an API (an interface), console logs, and SSIS server logs.
  • Engaged in client discussions on site and provided the necessary results to the offshore team.
  • Engaged in offshore discussions and provided the needed input to the client as required. Engaged in NFR gathering from the client, discussed with the development team, and presented results to the client for sign-off before test execution.
  • Participated in reviewing business flows prior to test execution.
  • Participated in workload modeling.
  • Monitored performance counters/metrics for web server, database server, and network parameters.
  • Provided daily reports to stakeholders, including activities completed/in progress, pending action items, open issues, etc.
  • Engaged in analysis of performance findings from the offshore team.
  • Looked into logs from the Eppic server for error identification and ran SQL queries in SQL Developer for more insight.
  • Assisted the QA team in deploying new builds for performance testing whenever possible.
  • Engaged in webpage diagnostics, comparing different metrics to determine bottlenecks in the applications and whether the metrics were in sync.

Confidential, New York City, New York

Performance Tester

Responsibilities:

  • Involved in planning meetings with the development, test, systems, and database management teams to determine the test environment, identify the test architecture, estimate effort, and define test strategies for the applications.
  • Worked extensively with business users to identify mission-critical business transactions/modules in the application and document the test cases.
  • Experience in LoadRunner setup, installation, administration, and troubleshooting.
  • Analyzed and offered solutions for critical software and infrastructure performance bottlenecks.
  • Worked closely with developers, architects, network/environment support, and DBAs to identify bottlenecks and suggest tuning opportunities to improve codebase performance.
  • Involved in configuration optimization (web/app/database servers) and architecture redesign to reduce resource consumption and shorten network latency.
  • Developed and maintained load tests for performance testing of web applications.
  • Monitored resource utilization during tests to determine the impact of load on the system infrastructure.
  • Provided defect tracking and reporting, analysis, and troubleshooting of infrastructure and application issues.
  • Created documentation for all phases of testing and maintained well-organized records of test results.
  • Worked with DBA teams to analyze AWR reports and identify high-response-time, load-consuming SQL statements.
  • Responsible for preparing performance test results and presenting them to management and the required stakeholders.
  • Experienced in creating and reviewing system test plans and in developing and executing in-depth test cases and test scenarios based on the application's features and the client's requirements.
  • Identified systems under test, including application servers, web servers, database servers, and client machines, and coordinated with the different teams to set up the test lab environment, including installation, code deployment, troubleshooting, and configuration on Windows/WebSphere application servers and client machines, software installation, and instance connections to the databases.
  • Worked extensively on creating automated scripts for identified mission-critical flows using VuGen and on designing and executing performance tests using LoadRunner Controller, JMeter/BlazeMeter, and Performance Center; a status-handling sketch for such flows follows this list.
  • Experience in webpage diagnostics, comparing different metrics to determine bottlenecks in the applications under test and whether the metrics were in sync.
  • Excellent communication, documentation, team problem-solving, and analytical skills in a quality-conscious, multitasking environment.
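
For mission-critical flows like those scripted above, transaction status is often set explicitly rather than left to LR_AUTO. The sketch below is a minimal, hypothetical VuGen (C) illustration; the URL and transaction name are placeholders.

Action()
{
    int http_code;

    lr_start_transaction("03_SubmitOrder");

    web_url("submit_order",
        "URL=https://app.example.com/orders/submit",
        "Resource=0",
        "Mode=HTML",
        LAST);

    /* Read the HTTP status of the last request and fail the transaction
       explicitly on anything other than 200 */
    http_code = web_get_int_property(HTTP_INFO_RETURN_CODE);

    if (http_code == 200) {
        lr_end_transaction("03_SubmitOrder", LR_PASS);
    } else {
        lr_error_message("Submit failed with HTTP status %d", http_code);
        lr_end_transaction("03_SubmitOrder", LR_FAIL);
    }

    return 0;
}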

Confidential, Arlington, Virginia

Performance Tester

Responsibilities:

  • Participated many times in gathering non-functional requirements (NFRs) by interacting with clients, business analysts, and project teams.
  • Workload modeling; scripting in VuGen with correlation, parameterization, content checks, rendezvous points, and applied functions; and baseline and bandwidth testing using Fiddler.
  • Extensive knowledge of load, stress, spike, stability, scalability, and volume testing.
  • Experience in end-to-end testing to determine whether systems are dependent or not.
  • Monitored mostly at the hardware level (RAM/memory and disk) to identify bottlenecks in the system under test.
  • Extensive knowledge of single-user baseline and bandwidth analysis using the Fiddler web debugging tool: reviewing statistics in the request panel and raw data in the response pane, applying GZIP encoding to reduce bytes (compared with the uncompressed state), and recommending compression to developers.
  • Completed performance requirement analysis with the client and developers and developed the performance test plan. Extensively used LoadRunner's Web (HTTP/HTML) and Web Services protocols.
  • Developed, maintained, and upgraded automated test scripts and architectures for testing application load behavior under LoadRunner.
  • Participated in preparing the performance test plan and developing performance test strategies, test cases, and test scripts based on requirements. Inserted rendezvous points to create intense load on the server and thereby measure server performance. Completed performance measurements for Oracle and WebSphere servers in the LoadRunner Controller and monitored online transaction response time, hits/sec, TCP/IP connections, throughput, CPU utilization, memory utilization, various HTTP requests, etc.
  • Identified the test environment, identified and analyzed performance acceptance criteria, and planned and designed the test workload model (spreading load across scripts according to functionality) using Little's Law; a worked example follows this list.
  • Assisted in configuring the test environment, attended walkthrough meetings with stakeholders, and wrote test cases according to the modules and features of the application.
  • Assisted in test environment configuration with the consent of system administrators and developers and launched the application in VuGen. Monitored mostly at the hardware level (RAM/memory and disk) to identify bottlenecks in the system under test.
  • Tested web services using LoadRunner and Fiddler and JMeter and Badboy, validating requests and responses. Created load test scripts by capturing the client's business processes using VuGen (LoadRunner 12.02, 12.50, 12.53) and enhanced the scripts with correlation, parameterization, content checks, and rendezvous points. Developed baseline, load, stress, and endurance tests of the application by creating virtual users in the Controller. Parameterized in JMeter by adding a CSV Data Set Config; correlated in JMeter using the Regular Expression Extractor and Regular Expression Tester; tested databases in JMeter using JDBC Connection Configurations and JDBC Requests; debugged scripts using the Debug Sampler, Debug PostProcessor, and log viewer; and generated JMeter reports. Used JMeter and Badboy to load test SOAP and REST APIs, validating responses with assertions and adding SOAP/XML-RPC Request samplers.
  • Monitored tests using tools such as New Relic and Dynatrace to identify performance bottlenecks (concurrency and component issues) in the application under heavy load with LoadRunner, and checked compatibility on Internet Explorer browsers.
  • Analyzed tests using LoadRunner Analysis, generated reports, and offered recommendations.
  • Attended functional and technical presentation meetings, walkthrough meetings, and subject matter expert meetings, and provided feedback to managers.
  • Prepared and presented daily, weekly, and monthly (high-level) status reports to the different project stakeholders. Worked in application server environments (WebLogic 12c, JBoss) to investigate thread contention through monitoring tools.
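
As a worked illustration of the Little's Law sizing referenced above (the figures below are hypothetical, not taken from this engagement): with a target throughput X of 3,600 transactions per hour (1 transaction per second) and an average iteration time of 4 s response time (R) plus 5 s think time (Z), the required number of concurrent Vusers N is

\[
N = X \times (R + Z) = 1\ \tfrac{\text{tx}}{\text{s}} \times (4\ \text{s} + 5\ \text{s}) = 9\ \text{Vusers}.
\]

The same relationship is applied per script when spreading load across functionalities: each script receives a Vuser share proportional to its slice of the overall transaction mix.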

Environment: LoadRunner, JMeter, Performance Center, WebLogic, DB2, Oracle, Windows, and UNIX.

Confidential, Reston, VA

Performance Tester

Responsibilities:

  • Participated many times in gathering non-functional requirements (NFRs) by interacting with clients, business analysts, and project teams.
  • Created, customized, and edited LoadRunner scripts for identified business processes.
  • Created load testing scenarios that simulate real-life application usage and business processes.
  • Created scripts using LoadRunner protocols such as HTTP/HTML and Web Services and executed scripts using virtual users.
  • Experience in end-to-end testing to determine whether systems are dependent or not.
  • Monitored mostly at the hardware level (RAM/memory and disk) to identify bottlenecks in the system under test.
  • Using LoadRunner 12.50 and 12.53, analyzed results and generated various reports for upper management.
  • Worked with the development, network, and other project teams to tune the test environment as necessary.
  • Worked extensively in VuGen and used the Controller to perform load and stress tests. Defined performance test strategies, performance test cases, and load scripts; documented issues; and retested software fixes to ensure the issues were resolved.
  • Identified bottlenecks in application performance, conducted analysis by monitoring available graphs, and provided recommendations to supervisors to improve performance.
  • Monitored and analyzed server performance under various virtual user loads.
  • Created scripts using JMeter and Badboy and HP LoadRunner and Fiddler to load test and generate reports for management. Inserted JMeter thread groups, listeners, assertions, samplers, and timers (including synchronizing timers); handled dynamic values; and data-drove tests in JMeter. Identified the test environment, identified and analyzed performance acceptance criteria, and planned and designed the test workload model (spreading load across scripts according to functionality) using Little's Law.
  • Assisted in configuring the test environment with the consent of the test lead, launched the application in VuGen, captured the client's business processes, and enhanced the scripts before moving to the Controller or Performance Center. Customized Vuser scripts in LoadRunner with parameterization, correlation, and rendezvous points and created scenarios using the LoadRunner Controller.
  • Worked in application server environments (WebLogic 12c, JBoss) to investigate thread contention and perform thread dump analysis.

Environment: LoadRunner, WebSphere, Performance Center, DB2, Oracle, Windows, and Linux.

Confidential, Atlanta, GA

Performance Tester

Responsibilities:

  • Analyzed the business and technical requirements, created the test scenarios, and developed the test plan and test cases.
  • Attended functional spec, test strategy, test plan, and test case reviews to ensure they met business requirements, and estimated the effort required to build the automation scripts. Configured the test environment with the consent of system administrators and developers, launched the application in VuGen, captured the client's business processes, and enhanced the scripts before moving to the Controller or Performance Center.
  • Completed performance testing by creating virtual users and analyzing the reports in LoadRunner 12.02.
  • Created Vuser scripts in VuGen and inserted transactions and think time within the virtual user scripts to emulate heavy user load on the server.
  • Used parameterization and correlation in the VuGen scripts to reproduce real-time load conditions in LoadRunner.
  • Performed scalability tests encompassing hundreds of users, broken down into stress, workload generation, reliability, and configuration tests using LoadRunner.

Environment: Java, MySQL Server, Oracle, LoadRunner, MS Excel, PowerPoint, Windows, Linux.
