Sr. Performance Test Engineer Resume
Clemmons, NC
SUMMARY:
- 8+ years of extensive hands-on experience with tools such as HP LoadRunner, HP ALM, HP Performance Center, HP UFT, HP SiteScope, HP Diagnostics and JMeter
- Experience with CA APM
- Experience with Selenium, Jenkins, Jira and Cucumber
- Experience with Gatling
- Experience testing AWS (Amazon Web Services)
- Experience with Shell Scripting
- Expertise in analyzing Business Requirements, Design Specifications and Use Cases to prepare Test Plans, Test Cases and Test Scripts.
- Expertise in Manual Testing, Web Based Testing and Client/Server Testing.
- Experienced in defining Testing Methodologies, Designing Test Plans and Test Cases.
- Experienced in verifying and validating web-based applications and documentation against software development standards for effective QA implementation in all phases of the Software Development Life Cycle (SDLC).
- Experienced in Waterfall and Agile development methodologies
- Exposure to the Test Automation life cycle and automated script maintenance
- Extensive experience with HP Quality Center/ALM for managing Test Cases and scenarios and for defect tracking and reporting.
- Knowledge of defect management tools such as ALM/Quality Center.
- Strong experience in data manipulation using SQL to retrieve data from relational databases.
- Experienced in writing SQL for various RDBMSs such as Oracle, MySQL and SQL Server.
- Experienced in automated testing using HP UFT/QuickTest Professional (QTP).
- Expertise in developing automated test scripts using VBScript in UFT/QuickTest Professional (QTP).
- Proficient in protocols such as Web, Citrix, RTE, PeopleSoft, JDBC, Siebel, SAP and Web Services for performance testing using LoadRunner, JMeter and ALM Performance Center
- Well experienced in using monitoring tools such as CA APM and HP SiteScope
- Proficient in complex ‘C’ programming and VBScript
- Extensively experienced in creating Web Services scripts in LoadRunner by scanning WSDL files and recording the client (see the sketch at the end of this list)
- Experienced in using HP Quality Center/ALM for gathering requirements, planning and scheduling tests, analyzing results and managing defects and issues
- Adept at tracking and reviewing defects using HP ALM/Quality Center
- Well experienced in preparation of Test Plans, Test Scenarios, Test Cases and Test Data from requirements and Use Cases
- Conversant in defining performance test strategy, performance test cases, load scripts and documenting the issues
- Experienced in developing Performance Test Plan, executing Load Testing, analyzing the results and generating Load Testing reports using LoadRunner
- Skilled in conducting load testing, scenario creation and execution, and measuring Throughput, Hits per Second, Response Time and Transaction Time using LoadRunner Analysis
- Experienced in using monitoring tools such as Task Manager, Process Explorer, Performance Monitor, Resource Monitor and Data Warehouse Monitor on Windows systems, JConsole to monitor Java-based applications, and System Monitor and topas on UNIX systems
- Very proficient in interacting with Oracle, SQL Server and DB2 databases using SQL
- Ability to work in multi-platform environments such as Windows and UNIX with a clear understanding of file systems, environment variables and the File Transfer Protocol (FTP)
- Effective time management skills and a consistent ability to meet client deadlines
- Excellent interpersonal abilities; a self-starter with good communication skills, problem-solving skills, analytical skills and leadership qualities
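
A minimal sketch of the kind of Web Services script mentioned in the summary, in the form VuGen emits after importing a WSDL; the service, operation, argument and parameter names below are hypothetical placeholders, and the exact arguments vary with the LoadRunner version and the imported service:

    Action()
    {
        // Hypothetical SOAP call generated from a scanned WSDL; the
        // service/operation names are illustrative, not from a real project.
        web_service_call("StepName=GetAccountBalance_101",
            "SOAPMethod=AccountService|AccountServiceSoap|GetAccountBalance",
            "ResponseParam=response",
            "Service=AccountService",
            "Snapshot=t1.inf",
            BEGIN_ARGUMENTS,
            "accountId=12345",
            END_ARGUMENTS,
            BEGIN_RESULT,
            "GetAccountBalanceResult=balance",
            END_RESULT,
            LAST);

        return 0;
    }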
TECHNICAL SKILLS:
Testing Tools: ALM/Quality Center, LoadRunner, Performance Center, JMeter
Operating Systems: MS-DOS, UNIX, Windows
Mainframe: MVS, CICS
Monitoring Tools: SiteScope, JConsole and MS SCOM
Languages: Java, VB.Net, VBScript, C++, SQL and PL/SQL
RDBMS: SQL Server, Oracle, Sybase and MS-Access
Web Technologies: HTML, DHTML, XML, SOAP, JSP, ASP, PHP and Java Applets
Documentation Tools: MS Word, MS Excel and Text Editor
Web Servers: Apache, WebLogic, WebSphere and IIS
PROFESSIONAL EXPERIENCE:
Confidential, Clemmons, NC
Sr. Performance Test Engineer
Responsibilities:
- Lead Test Engineer for entire Line of Business (LOB) Performance Testing
- Onshore-based LOB lead in a large-scale enterprise lift-and-shift acquisition and migration program;
- Led an offshore team based in India providing around-the-clock support;
- Gathering NFRs and Use Cases from business teams;
- Working on multiple projects and technologies at the same time to make sure project deliverables are attained within their timeframes;
- Technologies include Microsoft, Oracle, Siebel, WebSphere, Mobile, Mainframe, SAP, Business Intelligence;
- Test requirement gathering, Test Planning, scripting and modification using HP LoadRunner and Performance Center, Test scheduling, Test Analysis and Reporting;
- Presenting test results to the business and helping to explain them;
- Suggesting performance modifications, next steps and the path forward in order to be prepared for production data;
- Capacity planning for 5-to-10 year increases in the user base for the existing application and infrastructure
- Instrumented Application Performance Monitoring (APM) using AppDynamics
Environment: HP Performance Center 12.53, HP LoadRunner, JMeter, VuGen, AppDynamics, APM, TIBCO EMS, Enterprise Service Bus, Mainframe, Mobile, SOA, Web Services, Fiddler, MS SQL Server Management Studio.
Confidential, Arlington, VA
Performance Test Engineer
Responsibilities:
- Identified and eliminated performance bottlenecks during the development lifecycle
- Accurately produced regular project status reports to senior management to ensure on-time project launch
- Verified that new or upgraded applications meet specified performance requirements
- Identified queries that took too long and optimized those queries to improve performance
- Enhanced Vuser scripts by adding logic for loops, numeric/string conversions, file I/O, rendezvous points, cookies, text/image/context checks, blocks, correlation and parameterization (see the sketch after this list)
- Changed runtime settings such as pacing, think time, log settings, browser emulation and timeout settings in LoadRunner VuGen and the Controller to simulate real-world scenarios
- Worked with engagement leads to design benchmarks that measure the degree to which an application meets its performance requirements
- Created test harnesses and post processing software required to execute the benchmark and summarize performance and resource demand statistics
- Quickly identified shortfalls in CPU, disk I/O bandwidth, memory and network I/O bandwidth on Linux, Windows and Solaris
- Used performance counters to assist in root cause analysis of performance defects (both production and during benchmarking) for Linux, Windows, Solaris and Java
- Documented user workflows by coordinating with business stakeholders
- Developed Load Test plans and Load Test strategies
- Created various scenarios in the LoadRunner Controller for performing Baseline, Benchmark, Load, Stress and Endurance tests
- Performed Baseline Tests with one user and five iterations and Benchmark Tests under a load of 100 users using the LoadRunner Controller
- Used Schedule by Scenario in the Controller to change the Ramp Up, Duration and Ramp Down settings
- Helped in performance tuning of the application
- Analyzed the Transaction Summary Report and graphs generated in a LoadRunner Analysis session
- Created templates in Analysis sessions and analyzed web page diagnostics to determine whether the server or the network was the bottleneck
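
A minimal sketch of the Vuser enhancements described above: correlation of a dynamic value, a text check and parameterized form data in a LoadRunner (C) action. The boundaries, parameter names and URL are hypothetical placeholders:

    Action()
    {
        // Capture a dynamic session id from the next response (hypothetical
        // left/right boundaries) so later requests can be correlated.
        web_reg_save_param("SessionID", "LB=sessionid=", "RB=\"", "Ord=1", LAST);

        // Text check: verify the landing page actually rendered.
        web_reg_find("Text=Welcome", LAST);

        lr_start_transaction("Login");

        // {UserName}/{Password} come from a VuGen parameter file
        // (data parameterization).
        web_submit_data("login",
            "Action=http://example.test/login",    // placeholder URL
            "Method=POST",
            ITEMDATA,
            "Name=username", "Value={UserName}", ENDITEM,
            "Name=password", "Value={Password}", ENDITEM,
            LAST);

        lr_end_transaction("Login", LR_AUTO);

        // Think time is scaled or ignored per the runtime settings.
        lr_think_time(5);

        return 0;
    }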
Environment: UFT (formerly QTP), Windows, Mainframes, UNIX, Java, Oracle, PL/SQL, SQL, Quality Center (now ALM), LoadRunner
Confidential, Timonium, MD
Performance Test Engineer
Responsibilities:
- Identified areas for Test Automation and provided design prototypes for the Automation suite based on tool.
- Involved in Test case preparation based on the business requirements.
- Reported on testing progress, results and defect resolution, and documented final test results in order to execute the Testing Engagement.
- Executed Data mining of a large shared Test environment to set up Test scenarios.
- Evaluated users’ needs through performing System configurations, Test planning and reporting.
- Created and loaded Test data sets to validate system or unit functionality.
- Performed the Integration, System, and Regression testing of Software for both Manual and Automated Test execution.
- Developed Test Scenarios, Test Cases and Test Data and mapped the Test Cases against Requirements in HP ALM.
- Used HP ALM as a Test planning and Defect management tool.
- Wrote SQL scripts to validate the data in the database on the back end & Master files.
- Scheduled VBScripts in the Windows Task Scheduler to trigger stored UFT jobs at desired times.
- Used the descriptive programming approach of UFT to handle dynamic objects.
- Integrated UFT with ALM and ran automated test scripts stored in ALM by invoking UFT in the background.
- Reported defects out of UFT.
- Developed Vuser scripts for the Web (HTTP/HTML), Citrix, Oracle and Web Services protocols based on the user workflow
- Performance tested a complex SOA-based application using the LoadRunner Web Services protocol to imitate real user activity
- Developed complex ‘C’ libraries and utility functions for code reusability and modularity (see the sketch after this list)
- Independently developed LoadRunner test scripts according to test specifications/requirements
- Used LoadRunner to execute multi-user performance tests, using online monitors, real-time output messages and other features of the LoadRunner Controller
- Performed in-depth analysis to isolate points of failure in the application
- Involved in performing load tests using LoadRunner on Oracle applications using the Citrix client
- Responsible for generating reports on these load testing scenarios, including documenting factors such as User Think Time, Page Views per Second and the number of virtual users for later analysis
- Performed validations to verify that the product design satisfies and fits the intended usage
- Executed stress tests with a load of Vusers to find the breaking point of the application
- Monitored metrics such as Response Time and Throughput and server resources such as CPU Utilization, Available Bytes and Process Bytes using LoadRunner monitors for the IIS and WebLogic servers
- Monitored the WebLogic server using Foglight, a performance monitoring tool from Quest Software
- Investigated and troubleshot performance problems in a lab environment, including analysis of performance problems in a production environment
- Interfaced with developers, project managers and upper management in the development, execution and reporting of test automation results
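
A minimal sketch of the kind of reusable ‘C’ utility function described above: a helper that passes or fails a transaction based on the save count of a prior web_reg_find. The function name, expected text and URL are hypothetical placeholders:

    // Reusable utility: end a transaction with pass/fail status depending
    // on whether the expected text was found in the response.
    void end_tx_checked(const char *tx_name, const char *count_param)
    {
        char buf[64];

        sprintf(buf, "{%s}", count_param);

        if (atoi(lr_eval_string(buf)) > 0)
            lr_end_transaction(tx_name, LR_PASS);
        else
            lr_end_transaction(tx_name, LR_FAIL);
    }

    Action()
    {
        // SaveCount records how many times the text occurs in the response.
        web_reg_find("Text=Order Confirmed", "SaveCount=conf_count", LAST);

        lr_start_transaction("SubmitOrder");
        web_url("submitOrder", "URL=http://example.test/order", LAST);  // placeholder URL
        end_tx_checked("SubmitOrder", "conf_count");

        return 0;
    }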
Environment: HP ALM (formerly Quality Center), LoadRunner, JMeter, CA APM, SiteScope, VB, HTML, XML, MS Office, SQL, PL/SQL, Oracle, UNIX and Windows, UFT (formerly QTP)
Confidential, Herndon, VA
Performance Test Engineer
Responsibilities:
- Responsible for setting up JMeter on AWS for cloud load testing
- Responsible for designing, creating and executing performance benchmarks and identifying root causes of performance defects
- Identified areas for Test Automation and provided design prototypes for the Automation suite based on tool.
- Responsible for creating and executing automated tests when needed using Selenium and Cucumber
- Tracked defects using Jira
- Involved in Test case preparation based on the business requirements.
- Reported on testing progress, results and defect resolution, and documented final test results in order to execute the Testing Engagement.
- Executed Data mining of a large shared Test environment to set up Test scenarios.
- Evaluated users’ needs through performing System configurations, Test planning and reporting
- Created and loaded Test data sets to validate system or unit functionality.
- Performed the Integration, System, and Regression testing of Software for both Manual and Automated Test execution.
- Developed Test Scenarios, Test Cases and Test Data and mapped the Test Cases against Requirements in HP ALM.
- Used HP ALM as a Test planning and Defect management tool
- Wrote SQL scripts to validate the data in the database on the back end & Master files
- Scheduled VBScripts in the Windows Task Scheduler to trigger stored UFT jobs at desired times
- Used the descriptive programming approach of UFT to handle dynamic objects.
- Integrated UFT with ALM and ran automated test scripts stored in ALM by invoking UFT in the background.
- Reported defects out of UFT.
- Developed Vuser scripts for the Web (HTTP/HTML), Citrix, Oracle and Web Services protocols based on the user workflow
- Performance tested a complex SOA-based application using the LoadRunner Web Services protocol to imitate real user activity
- Identified real-world scenarios and Day-in-the-Life performance tests
- Performed complex usage pattern analysis (see the sketch after this list)
- Developed complex ‘C’ libraries and utility functions for code reusability and modularity
- Independently developed LoadRunner test scripts according to test specifications/requirements
- Used LoadRunner to execute multi-user performance tests, using online monitors, real-time output messages and other features of the LoadRunner Controller
- Performed in-depth analysis to isolate points of failure in the application
- Involved in performing load tests using LoadRunner on Oracle applications using the Citrix client
- Responsible for generating reports on these load testing scenarios, including documenting factors such as User Think Time, Page Views per Second and the number of virtual users for later analysis
- Responsible for browser compatibility testing as well as HTML 4.0 compliance testing
- Performed validations to verify that the product design satisfies and fits the intended usage
- Executed stress tests with a load of Vusers to find the breaking point of the application
- Monitored metrics such as Response Time and Throughput and server resources such as CPU Utilization, Available Bytes and Process Bytes using LoadRunner monitors for the IIS and WebLogic servers
- Monitored the WebLogic server using Foglight, a performance monitoring tool from Quest Software
- Participated in walkthroughs with the client and the development team and attended defect reporting meetings
- Assisted in the production of testing and capacity certification reports
- Investigated and troubleshot performance problems in a lab environment, including analysis of performance problems in a production environment
- Interfaced with developers, project managers and upper management in the development, execution and reporting of test automation results
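
A minimal sketch of how usage pattern analysis can be folded back into a Vuser: actions weighted to match an observed transaction mix. The weights, transaction names and URLs are hypothetical placeholders, not figures from this engagement:

    Action()
    {
        // Hypothetical mix from usage pattern analysis:
        // ~60% browse, ~30% search, ~10% checkout.
        // (rand() would typically be seeded once in vuser_init.)
        int roll = rand() % 100;

        if (roll < 60) {
            lr_start_transaction("Browse");
            web_url("browse", "URL=http://example.test/browse", LAST);  // placeholder URL
            lr_end_transaction("Browse", LR_AUTO);
        } else if (roll < 90) {
            lr_start_transaction("Search");
            web_url("search", "URL=http://example.test/search?q=term", LAST);
            lr_end_transaction("Search", LR_AUTO);
        } else {
            lr_start_transaction("Checkout");
            web_url("checkout", "URL=http://example.test/checkout", LAST);
            lr_end_transaction("Checkout", LR_AUTO);
        }

        lr_think_time(8);
        return 0;
    }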
Environment: HP ALM (formerly Quality Center), LoadRunner, JMeter, CA APM, SiteScope, VB, HTML, XML, MS Office, SQL, PL/SQL, Oracle, UNIX and Windows, UFT (formerly QTP)
Confidential, Bellevue, WA
Performance Tester
Responsibilities:
- Analyzed server resources such as Total Processing Times, Available Bytes, Process Bytes and Heap Usages to look for performance bottlenecks
- Analyzed server resources such as Available Bytes and Process Bytes for memory leaks
- Used web page diagnostics to drill into the server responses and detect large Time to First Buffer values for certain .jsp responses, which were reported to the development team, who provided the fix before project sign-off
- Prepared the close-out document and executive-level summary
- Worked with complex system interfaces to create Use Cases.
- Interacted with Developers for analysis of the Application.
- Performed Quality Center administration as well, setting up users and user groups with the required access privileges.
- Created Requirements and Test plans within Quality Center to aid the team.
- Wrote complex SQL queries to test data integrity for all actions from the UI.
- Partially automated web scenarios using QuickTest Professional.
- Inserted Checkpoints for validation in the scripts generated using QTP
- Ensured the integrity of work performed by Performance Testers
- Translated raw data into graphs and charts that answered questions posed in the engagement objectives
- Assisted the Engagement Lead in assembling test plans and result documents
- Assembled test design documents with in-depth details
- Ensured the integrity of all information presented to the client
- Performed performance tests in the cloud using SOASTA CloudTest. Used SOASTA mobile testing to record and customize complex motions, gestures and context with high precision
- Customized Quality Center for requirements analysis and defect reporting
- Performed defect reporting and defect tracking using Quality Center
- Wrote SQL queries to test the database and to perform validations
- Analyzed test results by monitoring the scenarios and regularly checking the output logs at different levels
- Identified critical showstoppers and reported them to the developers during the testing phase
Environment: QTP, Windows, UNIX, Java, Oracle, PL/SQL, SQL, Quality Center, JMeter and LoadRunner