Performance Test Analyst Lead Resume
San Francisco, CA
SUMMARY:
- 9+ years of in-depth experience as a Senior Performance Tester with strong expertise in Performance/Load, Stress and Endurance Testing using HP Performance Center/LoadRunner
- Experienced in Banking, Healthcare, Insurance, Financial and Retail domains, with a unique combination of skills for solving complex performance challenges and implementing solutions that work
- Strong process and documentation skills for performance testing like Test planning, Test strategy and Test result reporting.
- Executed Performance tests - load, capacity and stress test using HP LoadRunner and Microsoft Visual Studio Load Test.
- Execution of automated test scripts using Mercury/HP tools (TestDirector/Quality Center, LoadRunner and QTP) and JMeter, based on business/functional specifications.
- Extensive experience in automated testing of Web based and Client/Server applications with proficiency in Load, stress and endurance testing. Good experience in agile methodology.
- Extensive knowledge of the Performance Test Life Cycle and Software Testing Life Cycle, with strong knowledge of the Software Development Life Cycle.
- Experience with an end-to-end management solution that integrates network, server, application and business transaction monitoring using HP BSM
- Hands-on experience using monitoring tools such as HP SiteScope, HP Diagnostics, Performance Center, Wily Introscope, Dynatrace, AppDynamics, Transaction Viewer and Splunk to keep track of test performance and identify bottlenecks.
- Worked on various protocols: Web (HTTP/HTML), Web Services, MQ Client/Server, Winsock, Citrix, Web/Winsock, FLEX, Dual Protocol and Oracle.
- Hands-on experience in developing test plan, strategy, and metrics for ERP systems with extensive knowledge of software testing life cycle.
- Proficient in using test automation tools and developing test scripts for web and client/server applications
- Ability to interact with developers and product analysts regarding testing status and defect & change tracking using Quality Center
- Experienced with the HP suite: Quality Center, LoadRunner and Performance Center.
- Used monitoring tools such as Wily Introscope, SiteScope, HP Performance Manager and HP Diagnostics to keep track of test performance and identify bottlenecks.
- Well versed in all functionality of Virtual User Generator, including correlating statements and configuring run-time settings for HTTP, iterations and simulated modem speeds to bring the test scenario closer to real-world conditions (a minimal VuGen sketch follows this summary)
- Heavily involved in Performance Testing, Functional Testing and Regression Testing using automated testing tools including LoadRunner, Performance Center, QuickTest Pro, Quality Center and ClearQuest.
- Good experience in implementing automation frameworks and automation test plans, and in developing automation scripts using QTP and VBScript.
- Knowledge of HP Quality Center/IBM ClearQuest administration, data structures & reporting.
- Proficient in creating and enhancing scripts, executing tests and analyzing performance results using LoadRunner, Wily Introscope, SiteScope, Splunk, HP Diagnostics, GUI dashboards and Performance Center.
- Good understanding of the web services principles and technology
- Created and performed System Integration Tests against System Architecture Requirements Specifications
- System testing skills include Black Box, Smoke, Regression, Integration and User Acceptance Testing
- Experienced in monitoring CPU, memory, network, web connections and throughput while running Baseline, Performance, Load, Stress and Soak tests
- Expertise in tracking defects using tools such as Quality Center and ClearQuest
- Excellent ability to understand complex scenarios and business problems, and transfer the knowledge to other users/developers in the most comprehensible manner
- Extensive knowledge of studying existing infrastructure landscapes, cloud product matching, designing cloud architectures, proofs of concept, design improvements, cost estimation and implementation of AWS cloud infrastructure, and recommending application migrations to public vs. private cloud.
- In-depth AWS knowledge including EC2, VPC (NAT, VPC Peering and VPN), Identity and Access Management (IAM), EC2 Container Service, Elastic Beanstalk, Lambda, S3, CloudFront, Glacier, RDS, DynamoDB, ElastiCache, Redshift, Direct Connect, Route 53, CloudWatch, CloudFormation, CloudTrail, OpsWorks, Amazon Elastic MapReduce (EMR), AWS IoT, SNS, API Gateway, SES, SQS, SWF.
- AWS Certified Solutions Architect (Certificate - AWS-ASA-32347)
- Experience in scrum methodologies as a Certified Scrum Master in project management.
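Illustrative sketch (not project code): a minimal VuGen-style Action in C showing the correlation and parameterization workflow referenced above. The host app.example.com, the boundary strings and the parameter names SessionId, UserName and Password are placeholder assumptions for illustration, not details from any engagement described here.

Action()
{
    /* Capture a dynamically generated session token from the next response
       (boundaries are placeholders for whatever the server actually returns). */
    web_reg_save_param("SessionId",
        "LB=sessionId=\"",
        "RB=\"",
        "Search=Body",
        LAST);

    lr_start_transaction("Login");

    /* {UserName} and {Password} would come from a data file via the Parameter List. */
    web_submit_data("login",
        "Action=https://app.example.com/login",
        "Method=POST",
        "Mode=HTML",
        ITEMDATA,
        "Name=username", "Value={UserName}", ENDITEM,
        "Name=password", "Value={Password}", ENDITEM,
        LAST);

    lr_end_transaction("Login", LR_AUTO);

    lr_start_transaction("ViewAccount");

    /* Reuse the correlated token on a subsequent request. */
    web_url("account",
        "URL=https://app.example.com/account?session={SessionId}",
        "Resource=0",
        "Mode=HTML",
        LAST);

    lr_end_transaction("ViewAccount", LR_AUTO);

    lr_think_time(5);   /* think time/pacing are normally tuned in Run-time Settings */

    return 0;
}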
TECHNICAL SKILLS:
Test Automation Tools: LoadRunner 7.5/7.8/8.1/9.x/11.0/12.01/12.53, Silk Performer 7.5/7.6, SOAtest 5.5.3 and 6.2, SoapUI 3.6.1, ClearQuest, AppScan, QuickTest Pro 6.5/8.0/10.0, MS Visual SourceSafe, HP QC/ALM 9.x/10.0/11/12, QTP 9.x/10
Databases: Oracle 10g/8i/8.0/7.0, IBM DB2 UDB 7.0, MS SQL Server 6.5/7.0/2000, MS Access 7.0/97/2000
Reporting Tools: Developer 2000 (Forms 4.5/5.0, Reports 2.5/3.0), Crystal Reports, MS Access Reports
Programming: C, Java, BDL (Benchmark Description Language), SQL, PL/SQL (Stored Procedures, Functions, Triggers, Cursors), XML, HTML 4.0, Visual Basic 6.0/5.0, Unix Shell Scripting, SQL*Plus
Test Monitoring Tools: SiteScope, Wily Introscope, Splunk, HP Diagnostics, Dashboard (project specific), Dynatrace
Web Server/Application Server: IHS 4.x/7.x, IIS, Tomcat, Java Web Server 1.2, Microsoft Personal Web Server, WebLogic Server 5.x, WebSphere 4.x/7.x
Operating Systems: Sun Solaris 2.6/2.7, HP-UX, IBM AIX 4.2/4.3, MS-DOS 6.22, Windows 3.x/95/98/2000/2003, Windows NT 4.0, Windows XP, SCO Unix, HP 9000
Documentation: MS Word 2000/03/07/10, Visio 5.0, TestDirector 7.2/7.6
PROFESSIONAL EXPERIENCE:
Confidential, San Francisco, CA
Performance Test Analyst Lead
Responsibilities:- Participated in all phases of planning, such as defining requirements, defining the types of tests to be performed, and scenario creation.
- Participated in meetings with stakeholders, including business analysts, developers, managers, supervisors and executive officers, to understand the product and the testing phases more thoroughly.
- Performance tested Web applications, Siebel Web applications, Web (HTTP/HTML), Ajax TruClient, Web Services and Windows Sockets using HP LoadRunner 12.50/12.53 (an illustrative Web Services sketch appears at the end of this section).
- Worked on different protocols such as Web (HTTP/HTML), Ajax TruClient, Web Services and Windows Sockets
- Developed VuGen scripts for load testing with 800 users to find bottlenecks in the server and deadlocks in the database
- Worked with an end-to-end management solution that integrates network, server, application and business transaction monitoring using HP BSM
- Generated scripts in Virtual User Generator, which included Parameterization of the required values.
- Generated, validated, and tested reports produced by the product quality testing division that was reviewed by the business team.
- Configured and used Windows Performance Monitor (Perfmon) to monitor and analyze server performance, generating reports ranging from CPU utilization and memory usage to load average.
- Executed Load, Stress and Endurance tests to simulate the process with more than 800 virtual users.
- Analyzed scalability, throughput and load testing metrics against test servers.
- Debugged issues that occurred during load tests, such as a load-balancer issue where load was not evenly distributed among the servers
- Analyzed test results and prepared detailed Performance Test Reports, including recommendations for process improvement.
- Executed regression cycles of the test cases to ensure the product quality and performance after each sprint of the code changes.
- Performed bug tracking, identified defects and managed them in HP Quality Center.
- Extensively operated HP Performance Center remotely to meet tight project deadlines by coordinating among team members.
- Worked with offshore team and coordinated with them and reviewed their work before uploading in Test Management Tool - HP ALM.
- Summarized the test results for complete Performance Test Report.
Environment: LoadRunner 12.50/12.53, Quality Center, HP Performance Center, Transaction Viewer, CA Wily Introscope, Dynatrace, MI Application Servers, Web Servers, WebLogic, MySQL, Toad, Message Queue Servers, F5 Node, HTML, XML, .NET Application, Visual Studio.
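Illustrative sketch (not project code): a hedged example of driving a SOAP-style Web Services call from a VuGen script with web_custom_request, as referenced in this section. The endpoint, SOAPAction value, envelope and AccountId parameter are placeholders assumed for illustration, not the actual service tested in this role.

Action()
{
    /* Add the SOAPAction header expected by the (placeholder) service. */
    web_add_header("SOAPAction", "\"GetBalance\"");

    lr_start_transaction("GetAccountBalance");

    web_custom_request("GetAccountBalance",
        "URL=https://services.example.com/account/soap",
        "Method=POST",
        "EncType=text/xml; charset=utf-8",
        "Mode=HTTP",
        "Body=<soapenv:Envelope xmlns:soapenv=\"http://schemas.xmlsoap.org/soap/envelope/\">"
             "<soapenv:Body>"
             "<GetBalance><accountId>{AccountId}</accountId></GetBalance>"
             "</soapenv:Body></soapenv:Envelope>",
        LAST);

    lr_end_transaction("GetAccountBalance", LR_AUTO);

    return 0;
}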
Confidential, King of Prussia, PA
Performance Test Analyst Lead
Responsibilities:- Nonfunctional requirements analysis.
- Understand application architecture and identify critical business scenarios and prepare business flow documents.
- Workload modeling from the details shared by the client.
- Prepare detailed test plan and traversal flow document.
- Set up JMeter before scripting.
- Test script development and enhancement using JMeter.
- Test data setup.
- Create scenarios with different business flows and their respective load distribution using JMeter.
- Monitor various performance parameters such as CPU and memory utilization of pods with Hawkular Metrics
- Analyze the test results and prepare preliminary reports and executive summary reports.
- Daily status reporting.
- Logging and tracking the defects in JIRA.
- Attend Defect meetings and address defects logged by Performance team.
- Re-test the defects and update JIRA defects with new results.
- Give suggestions to the Application team based on result analysis
- Understand application architecture and identify critical business scenarios and prepare business flow documents.
- Prepare detailed test plan and traversal flow document (File processing flow as well as online application flow)
- Test script development and enhancement using VuGen.
- Test data setup.
- Create test files using LoadRunner scripts and drop them on different servers for processing, either through a LoadRunner scenario or manually using WinSCP.
- Monitor the file processing using Informatica DX consoles, TOAD and Online application.
- Create scripts and scenarios for the online application and execute them using LoadRunner (see the sketch at the end of this section).
- Monitor various performance parameters like CPU and Memory utilization using Wily Introscope.
- Used Dynatrace, Splunk for Performance bottleneck analysis.
- Analyze the test results and prepare preliminary reports and executive summary reports.
- Daily status reporting.
- Logging and tracking the defects in HP Quality Center.
- Attend Defect meetings and address defects logged by Performance team.
- Re-test the defects and update QC defects with new results.
- Give suggestions to the Application team based on result analysis
- Allocating work and mentoring team members.
Environment: Windows 2007, Unix, Linux, WinSCP, LoadRunner 12.02, Wily Introscope, Informatica DX consoles, TOAD, HTTP Watch, HP Quality Center, Dynatrace, Splunk, JMeter 3.1, JIRA, Hawkular Metrics, Kibana.
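Illustrative sketch (not project code): the text-check pattern commonly used in VuGen to decide online-transaction pass/fail status, as referenced in the online application flow above. The expected text, URL, parameter name and transaction name are assumed placeholders.

Action()
{
    int found;

    /* Register a check for the expected confirmation text in the next response. */
    web_reg_find("Text=Order Confirmed",
                 "SaveCount=confirm_count",
                 LAST);

    lr_start_transaction("SubmitOrder");

    web_url("submit_order",
        "URL=https://app.example.com/order/submit",
        "Resource=0",
        "Mode=HTML",
        LAST);

    /* Decide transaction status from the saved match count. */
    found = atoi(lr_eval_string("{confirm_count}"));

    if (found > 0)
        lr_end_transaction("SubmitOrder", LR_PASS);
    else {
        lr_error_message("Confirmation text not found; failing transaction.");
        lr_end_transaction("SubmitOrder", LR_FAIL);
    }

    return 0;
}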
Confidential, GA
Performance Test Analyst Lead
Responsibilities:- Worked closely with development, engineering, architecture, network and IT to define Performance requirements for Brand new and existing Applications.
- Reviewed the Architecture and Performance requirements with the Business Users
- Involved in gathering business requirements, studying the application and collecting information from developers and the business
- Played a major role in helping the business understand the load, stress criteria and helped them identify the critical scenarios on an application from an end user perspective
- Extensively used LoadRunner Web (HTTP/HTTPS), Web Services and database protocols for testing client/server and standalone applications
- Used manual and automated correlation to parameterize dynamically changing values
- Performed Load, Stress, Endurance, Capacity, Configuration, Baseline, Benchmark, Soak and Failover testing using LoadRunner 11.0/12.0
- Performed service virtualization using tools like CA LISA
- Developed the Performance Test Plan & Scope documents and Performance Test Cases
- Created virtual users in LoadRunner for Performance testing and analyzed the reports based on the different scenarios
- Involved in Defining the test scenarios and making sure that scripts are working according to planned scenario
- Monitored the software Performance on different Windows and UNIX environments
- Monitored various performance parameters of database servers, application servers and web servers, e.g. CPU usage, total sessions created, current open session count and number of open users on the database
- Monitored resources to identify performance bottlenecks and tune the JVM
- Analyzed various graphs generated by LoadRunner Analysis including Database Monitors, Network Monitor graphs, User Graphs, Error Graphs, Transaction graphs and Web Server Resource Graphs
- Pulled AWR Reports for DB Analysis and shared with DBA team for more insight also used SQL Profiler to monitor the DB while testing the Application
- Analyzed the Windows Resource utilization viz. CPU and Memory impact on the application and server when there is a change in load and environment
- Analyzed database server DB connections, table indexes and deadlock issues, and resolved them by applying proper indexes
- Extensively used Wily Introscope and SiteScope to monitor the applications and environment.
- Extensively used Splunk to monitor the JBoss logs of the applications under test.
- Responsible for creating graphs for the web server, application server and database server using Analysis, and then generating HTML reports from those analyses
- Responsible for making defect status report and project status report every week.
Environment: LoadRunner 11.0/12.0, Quality Center, CA LISA, QTP, C, C++, HTML, Unix Scripting, Oracle 10g, DB2, Web Servers, Database Nodes, Performance Center, Wily Introscope, HP Diagnostics Server, HP SiteScope.
Confidential, Houston, TX
Performance Engineer Lead
Responsibilities:- Worked as part of the performance team responsible for pre-production performance testing of credit card middleware applications; every major code or infrastructure change must be certified by the performance team under production-like workload before being promoted to production.
- Develop the LoadRunner scripts for the identified Business scenarios.
- Create LoadRunner scenario and execute Load Testing in Controller for each release testing.
- Configure and Monitor the system resources during the test execution
- Analyze the collected monitoring logs to identify performance bottlenecks in the application
- Preparation of Test Execution Summary Reports with recommendation for each release.
- Working closely with Application development, and business team to understand the requirements.
- Validate failover/recovery scenarios and measure any impact to response times or interruptions during execution of these scenarios.
- Find processing ceilings (e.g. maximum transactions per second, concurrent users) to prepare the organization for high-volume card transaction days such as Black Friday and the holiday season (see the rendezvous sketch at the end of this section).
- Designing test approaches and manage the end-to-end quality life cycle of technology sponsored projects.
- Design and Develop automated scripts using LoadRunner based on business use cases for the application.
- Planned the load by analyzing the task distribution diagram, transaction profile and user profile, and executed performance testing using LoadRunner
- Design scenario in LoadRunner to evaluate the performance of the application.
- Generate the Data and setup the data
- Execute different kinds of performance tests like Load test, stress, Volume and Endurance tests. Analyze the results using LoadRunner.
- Verify the impact of any infrastructure changes (maintenance releases, hardware and OS upgrades, etc.) under production-like workloads.
- Monitor application and web server metrics using CA Wily Introscope and IBM MainView.
- Prepare quick reports and peak-load reports such as step test and Tandem reports.
Environment: LoadRunner 9.52, HP Performance Center, Quality Center, HP ALM 12, JVM, AppDynamics, Dynatrace, UNIX, vmstat, nmon, netstat, MS Visual Studio .NET, MS Visio, MS Visual SourceSafe, Application Servers, Tomcat Servers, WebLogic, Web Servers, Oracle 11g, Toad, SQL Developer, Message Queue Servers.
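Illustrative sketch (not project code): a hedged example of the rendezvous-based spike step used when probing processing ceilings, as referenced above. The endpoint, JSON body and the CardToken and Amount parameters are placeholders; the rendezvous release policy itself is configured in the Controller scenario, not in the script.

Action()
{
    /* Hold Vusers here until the configured percentage arrives,
       then release them together to create a synchronized spike. */
    lr_rendezvous("peak_authorization");

    lr_start_transaction("AuthorizeCard");

    web_custom_request("authorize",
        "URL=https://middleware.example.com/authorize",
        "Method=POST",
        "EncType=application/json",
        "Mode=HTTP",
        "Body={\"cardToken\":\"{CardToken}\",\"amount\":\"{Amount}\"}",
        LAST);

    lr_end_transaction("AuthorizeCard", LR_AUTO);

    return 0;
}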
Confidential, Irving, TX
Performance Engineer
Responsibilities:- Gathered business requirement, studied the application and collected the information from Project Managers, Developers, and Functional Test Leads.
- Responsible for all phases, planning, developing scripts, execution of Performance Center scenarios and analysis in Agile environment
- Developed LoadRunner test scripts per test specifications and requirements.
- Developed load test scripts using LoadRunner for the entire site and handled parameterization, pacing and correlation.
- Worked closely with client interfaces, developers, project managers and management during development.
- Enhanced Vuser scripts by introducing timer blocks and parameterizing user IDs to run the scripts for multiple users.
- Responsible for testing Web, Web Services and Ajax TruClient requests.
- Extensively monitored all the applications using HP Performance Center and SiteScope
- Responsible for testing databases, Web Services and messaging queue (ActiveMQ, AppWatch & IBM MQ) requests.
- Developed load test scripts in VuGen for the Walmart Canada POS application using the Windows Sockets protocol (see the Winsock sketch at the end of this section).
- Developed scripts for MQ applications using Web Services and Java Vuser, and tested the databases over JDBC connections using Java Vuser scripts and ODBC scripts.
- Developed scripts to change controller Flag settings using RTE Protocol.
- Worked closely with client interface developers and project managers to identify and resolve performance issues and bottlenecks.
- Extensively monitored all the applications using HP Performance Center, Java Mission Control, Java VisualVM, JConsole, Grafana and HP Performance Manager.
- Created various Vuser scripts based on the critical transactions used by real-time users, using LoadRunner VuGen.
- Verify that new or upgraded applications meet specified performance requirements.
Environment: Web HTML/HTTP, Web Services, Windows Sockets, ODBC Protocol, RTE Protocol, Java Vuser, LoadRunner 12.01, HP ALM 12.01, Java Mission Control, Java VisualVM, JConsole, Grafana, ActiveMQ, AppWatch, POS.
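Illustrative sketch (not project code): a hedged Windows Sockets (lrs_*) example in the style of the POS scripting referenced above. The host, port and message layout are placeholders; in a recorded Winsock script the send/receive buffers normally live in data.ws, and lrs_startup/lrs_cleanup would appear in vuser_init/vuser_end.

Action()
{
    /* Build an illustrative request; the message format is a placeholder,
       not the real POS wire format. */
    char *request = lr_eval_string("SALE|{TerminalId}|{Amount}\r\n");

    lr_start_transaction("POS_Sale");

    /* Open a TCP connection to the (placeholder) POS switch. */
    lrs_create_socket("socket0", "TCP",
                      "LocalHost=0",
                      "RemoteHost=pos-switch.example.com:9050",
                      LrsLastArg);

    /* Override the data.ws buffer with the prepared request, then send and
       wait for the response buffer. */
    lrs_set_send_buffer("socket0", request, strlen(request));
    lrs_send("socket0", "buf0", LrsLastArg);
    lrs_receive("socket0", "buf1", LrsLastArg);

    lrs_close_socket("socket0");

    lr_end_transaction("POS_Sale", LR_AUTO);

    return 0;
}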
Confidential, Houston, TX
Performance Engineer
Responsibilities:- Gathered NFR documents and business requirements, studied the application and collected information from Business Analysts, Project Managers, Solution Architects, Developers and SMEs.
- Responsible for all phases, planning, developing scripts, execution of Performance Center scenarios and analysis in Agile environment.
- Created and enhanced automated scripts using JMeter and LoadRunner.
- Created and executed load test scenarios using JMeter and LoadRunner.
- Identified functional scenarios for regression and automated approximately 2000 test cases for all four products of OpenSci: SciMap, SciEnable, SciCustomer, SciSupplier.
- Responsible for testing Web, Web Services and Ajax TruClient requests.
- Created performance scripts in JMeter, VSTS and VuGen.
- Performed database testing using SQL queries to compare data accuracy of the backend for reports.
- Configured WebDriver plugins with JMeter and worked with TechOps teams to set up new test case scenarios, including setting up the performance test environment, and conducted performance tests using JMeter for all four products.
- Manage and review log files for errors.
- Diligently teaming with the infrastructure, network, database, application and business intelligence teams to guarantee high data quality and availability.
- Collaborated with application teams and DevOps teams to implement and update builds; responsible for JMeter updates, patches through Maven and version upgrades when required for every release.
- Executed cloud-based performance tests using BlazeMeter.
- Used Performance Center to execute Tests and Dynatrace for analyzing root-cause of performance bottlenecks.
- Configured an automation framework with Confluence and JIRA so that scripts can be executed from test cases or test cycles and pass/fail status is updated automatically in JIRA.
Environment: Java, Windows 7, DevOps, HP LoadRunner 12.x, JUnit, Performance Center, JMeter (2.x-3.x), BlazeMeter, Eclipse, TestNG, Maven, JIRA, QC.
