Sr. Performance Analyst Resume
San Jose, CA
SUMMARY
- 8 years of experience in Software Testing, including Performance Testing using LoadRunner.
- Hands-on experience preparing performance test scripts using protocols such as Web (HTTP/HTML), Web Services, SAP GUI, SAP Web, Flex and RTE.
- Involved in all performance test phases: Discovery, Planning, Design and Execution.
- Good knowledge of troubleshooting LoadRunner scripts.
- Good understanding of various performance test methodologies.
- Involved in performance test requirements gathering and in preparation of the performance test plan.
- Experience with the LoadRunner components VuGen, Controller and Analysis.
- Experience performing Baseline, Load, Stress and Endurance Testing.
- Debugged scripts using C and LoadRunner-specific functions.
- Experience with Performance Center.
- Hands-on experience preparing performance test summary reports.
- Experience defining user behavior and running load scenarios.
- Experience adding counters and analyzing results.
- Drilled down into test execution results to find the root cause of performance bottlenecks.
- Flexible and versatile; able to adapt to any new environment and work on any new project.
TECHNICAL SKILLS
Performance Testing Tools: LoadRunner, JMeter
Performance Monitoring Tools: SiteScope
Web Technologies: HTML, SQL
XML & Web Services Testing: SoapUI
Defect Tracking Tools: HP ALM (Application Lifecycle Management), QC, Mantis
Operating Systems: Windows
Languages: C, C++, Core Java.
DBMS: Oracle 10g
PROFESSIONAL EXPERIENCE
Safeway, San Jose, CA
Sr. Performance Analyst
Responsibilities:
- Developed scripts in VuGen using the HTTP/HTML protocol.
- Enhanced scripts with parameterization, correlation and customized code.
- Designed scenarios for test runs using the LoadRunner Controller.
- Involved in test execution for the project.
- Prepared reports of client-side metrics.
Environment: Windows 2003/NT, SAP Web, Web HTTP/HTML, HP LoadRunner, Performance Center, HP SiteScope.
Confidential, Irving, TX
Sr. Performance Analyst
Responsibilities:
- Generated, validated, and tested reports produced by the product quality testing division for review by senior management.
- Participated in all phases of planning, such as defining requirements, defining the types of tests to be performed, and scenario creation.
- Participated in meetings with stakeholders such as business analysts, developers, managers, supervisors, and executive officers to understand the product and the testing phases more thoroughly.
- Worked with protocols such as Web (HTTP/HTML), Ajax TruClient, Web Services and Windows Sockets.
- Developed VuGen scripts for load testing with 1,000 Vusers to find bottlenecks in the server and deadlocks in the database.
- Tested performance of Web Portal using HP Load Runner 12.01
- Generated scripts in Virtual User Generator, which included Parameterization of the required values.
- Ran scripts via the LoadRunner Controller to perform stress, volume, and component testing.
- Configured and used the SiteScope performance monitor to monitor and analyze server performance, generating reports covering CPU utilization, memory usage, load average, etc.
- Executed Load, Stress and Endurance tests simulating more than 1,000 virtual users.
- Analyzed scalability, throughput and load testing metrics against test servers.
- Monitored portal performance using various LoadRunner operating system, network, middleware, and firewall monitors.
- Highly involved in verifying test results with various manual unit, integration, smoke, and sanity testing documents
- Executed regression cycles of the test cases in order to ensure the product quality and performance after each stage of the code changes
- Performed bug tracking and identified defects.
- Worked extensively with HP Performance Center to meet tight project deadlines remotely, coordinating among colleagues.
- Performed analysis and reported required data to my managers.
Environment: HP LoadRunner 12.01, Performance Center 12.01, HP SiteScope, IBM Rational Performance Tester, HP Diagnostics, SQL, JIRA, Perfmon, C/C++, Java, PHP.
Confidential, Phoenix, AZ
Sr. Performance Test Engineer
Responsibilities:
- Gathered the requirements and compiled them into Test Plan
- Followed Agile methodologies (Scrum).
- Responsible for implementing LoadRunner, Performance Center and JMeter based infrastructure, including architecting the load testing infrastructure and integrating hardware and software with LoadRunner.
- Prepared test cases, VuGen scripts, load tests and test data; executed tests, validated results, managed defects and reported results.
- Prepared complex scripts in IBM RPT using Web HTTP/HTML protocols.
- Created usage pattern analyses from production logs to generate realistic performance load.
- Installed SiteScope, and configured monitors for analysis.
- Used SiteScope to get metrics from servers.
- Extensive backend knowledge of Oracle 10g/11g, SAP, SOA, Java, J2EE and .NET application servers.
- Gathered and finalized specifications, defining business and functional requirements for BI reporting by conducting workshops, completing BI report specifications, and guiding disposition of reports between ECC and BI. Set up the QA environment, installing LoadRunner, Silk Performer, QTP and batch processes.
- Performed complex usage pattern analysis.
- Used Performance Center to define performance requirements such as SLAs in tests.
- Interfaced with developers, project managers, and management in the development, execution and reporting of test automation results.
- Identified and eliminated performance bottlenecks during the development lifecycle.
- Produced accurate, regular project status reports for senior management to ensure on-time project launch.
- Performed Black Box, White Box, Performance testing, Regression, and Validation testing during the testing life cycle of the product release.
- Participated in Integration, System, Smoke and User Acceptance Testing.
- Wrote User Acceptance Test (UAT) Plans and User Acceptance Test Cases.
- Verified that new or upgraded applications met specified performance requirements.
- Identified long-running queries and optimized them to improve performance, diagnosing degraded performance by examining resources such as Available Bytes and Private Bytes.
- Inserted GUI, bitmap and database checkpoints to check the functionality and data integrity.
- Involved in updating the whole Oracle based application on UNIX platform.
- Independently developed LoadRunner test scripts according to test specifications/requirements.
Environment: Windows 2003/NT, UNIX, SAP Web, Web HTTP/HTML, LoadRunner, IBM Rational Performance Tester, JUnit, Oracle 10g, XML/SOAP, BMC, Java, Performance Center, JavaScript.
Confidential, Irving, TX
Performance Analyst
Responsibilities:
- Followed Agile Scrum methodology
- Prepared Test Plan based on the requirements.
- Developed Scenarios in Controller based on the User Load and Transaction Volume
- Developed and executed formal test plans to ensure the delivery of quality software applications.
- Enhanced test scripts using Web HTTP, ODBC and Java protocols in LoadRunner.
- Executed stress/load/rendezvous scenarios and regression testing for various operations, produced detailed test analysis reports, and performed disaster recovery testing.
- Added performance measurements for UNIX, Oracle and WebLogic servers in the LoadRunner Controller, and monitored online transaction response times, Web hits, TCP/IP connections, throughput, CPU, memory, heap sizes, various HTTP requests, etc. Monitored Oracle database V$ session and system table stats.
- Used SiteScope performance monitors and LoadRunner graphs to analyze results.
- Extensively worked on UNIX and executed various programs in the C shell.
- Determined Web interface protocols before generating the test scripts.
- Analyzed sniffer traces for network bottlenecks.
- Analyzed average CPU usage, response time and TPS for each scenario.
- Performed backend testing on Oracle, executing various DDL and DML statements.
- Developed various reports and metrics to measure and track testing effort.
- Created Test Matrix, Traceability Matrix and performed Gap Analysis.
- Participated in weekly meetings and walkthroughs with the management team.
- Interacted with analysts, system staff and developers.
- Detected defects and classified them based on the severity in Quality Center.
- Provided Screenshots to identify & reproduce the bugs in QC.
- Interacted with the development team to fix the defects as per the defect report
- Used LoadRunner to execute multi-user performance tests, using online monitors, real-time output messages and other features of the LoadRunner Controller.
- Identified real-world scenarios and day-in-the-life performance tests.
- Performed complex usage pattern analysis.
- Developed complex C libraries and utility functions for code reusability and modularity.
- Independently developed LoadRunner test scripts according to test specifications/requirements.
- Developed baseline scenarios, scripts and measurements
- Worked with Software Development in creating, executing, and documenting automated test scripts.
Environment: Documentum, Performance Center, LoadRunner, Windows 2000/NT, Citrix 9.2 OEM, RUP, TestDirector, Introscope, Oracle 10g, DB2, WebSphere, SOA, Struts, EJB, IIS, XML/SOAP, T-SQL.
Confidential, Cleveland, OH
QA Test Engineer
Responsibilities:
- Involved in writing Test Plan for the testing effort of the module.
- Developed and executed Test Cases and Test Scripts using QTP based on the requirement documents and managed it using Quality Center.
- Prepared data for data-driven testing using the Data Driver Wizard in WinRunner as required by corporate customers.
- Conducted functionality testing manually prior to automated testing
- Maintained Requirement Traceability Matrix (RTM) to make sure that test cases were written for all the requirements.
- Performed Back End Testing using SQL.
- Involved in Configuration Testing.
- Actively participated in walkthroughs and team meetings.
- Conducted Security, Performance, and Regression testing during the various phases of the development.
- Investigated software bugs and interacted with developers to resolve technical issues using Quality Center.
- Identified test cases to be run for regression testing and conducted regression testing as new builds were made.
- Administered Quality Center for bug tracking and reporting, generating customized graphs and reports.
Environment: QTP, Quality Center, Windows.
Confidential, Austin, TX
QA Test Engineer
Responsibilities:
- Involved in writing Test Plan for the testing effort of the module.
- Formulated detailed Test Plan, Test Scripts and Test Cases after analyzing business rationale.
- Involved in documenting test cases for Functionality, Security, and Performance Testing
- Performed Backend Testing of the Oracle Database manually to ensure Order details and requests were correctly inserted.
- Performed Functionality Testing Manually.
- Analyzed software test plans, co-ordinated automated and manual test cases.
- Performed Manual Functional and Regression Testing.
- Conducted backend-testing on the Oracle database using SQL queries to ensure integrity and consistency of the data.
- Performed Configuration Testing on various hardware platforms.
- Experienced in Backend Testing using SQL Queries.
- Maintained the Test Matrix and Requirements Traceability Matrix and performed gap analysis on them.
- Used Quality Center to analyze, track and report defects.
- Worked on uploading all the Test cases to the Quality Center for the current and prior releases.
Environment: Windows, Oracle, UNIX, IIS, Quality Center.
Confidential
Performance Test Engineer
Responsibilities:
- Created automation scripts in Load Runner for the identified workflows.
- Enhanced scripts with parameterization, correlation and customized code.
- Gathered performance statistics such as concurrent users, break point, response time, and throughput.
- Conducted smoke tests to verify that the automation could run end to end and to validate basic automation functionality.
- Evaluated end-user transactions performed within the Application.
- Measured client side statistics like response time, throughput, hits per second etc.
- Published client and server side performance statistics.
- Provided a performance report describing the observations during stress test.
- Analyzed the behavior of the Remedy application under load and identified performance bottlenecks.
- Produced the final report after completing the proposed load tests.
- Effectively communicated with the onsite team and clients in daily and weekly status meetings.
- Formulated and provided recommendations for performance improvement based on the results from the test.
Confidential
Test Engineer
Responsibilities:
- Created automation scripts in Load Runner for the identified workflows.
- Enhanced scripts with parameterization, correlation and customized code.
- Gathered performance statistics such as concurrent users, break point, response time, and throughput.
- Conducted smoke tests to verify that the automation could run end to end and to validate basic automation functionality.
- Evaluated end-user transactions performed within the Application.
- Measured client side statistics like response time, throughput, hits per second etc.
- Published client and server side performance statistics.
- Provided a performance report describing the observations during stress test.
- Analyzed the behavior of the Remedy application under load and identified performance bottlenecks.
- Produced the final report after completing the proposed load tests.
- Effectively communicated with the onsite team and clients in daily and weekly status meetings.
- Formulated and provided recommendations for performance improvement based on the results from the test.
Confidential
Performance Test Engineer
Responsibilities:
- Designed performance test scenarios from business scenarios.
- Prepared test scripts using LoadRunner.
- Developed and reviewed performance scripts, including correlation, parameterization and custom functions written in C.
- Designed load models based on load testing requirements and volumetric analysis.
- Conducted peer reviews of scripts created by others.
- Set up test data for performance testing.
- Set up runs as required for performance, stress or stability testing.
- Analyzed results from runs and prepared scorecards.
- Analyzed potential performance issues and provided analysis reports.
