Performance Engineer Resume
MA
SUMMARY
Over 7 years of diversified experience as a Performance Engineer, including requirement analysis, manual and automated testing, and quality assurance of client/server and web-based applications.
- Experience in implementing customized testing processes that involve requirements review, test planning and strategizing, test design, test development, test execution, defect reporting, project tracking and issue/risk management.
- Conversant with all phases of the project life cycle, including requirement gathering, analysis, design, development, implementation, testing, software quality standards, configuration management, change management and quality procedures.
- Experience in using the Web/HTTP, AJAX Click & Script, Winsock, Oracle NCA and 2-Tier protocols.
- Created test scripts and test conditions to perform end-to-end testing of the application and ensure requirements traceability, using Quality Center for automation testing.
- Good technical experience in writing system test plans, test cases, test scripts, automated scripts and documentation, and in setting up test lab environments.
- Advanced programming skills in enhancing LoadRunner VuGen scripts for dynamic navigation (see the sketch after this list).
- Proficient in developing and executing test procedures, test plans and test scripts, verification, test result validation, usability and stress testing, and ensuring that the software meets the system requirements.
- Implemented performance testing methodology and worked closely on planning and coordination.
- Involved in preparing the Test Matrix and Traceability Matrix and performing Gap Analysis.
- Experience in using monitoring tools such as Wily Introscope, JProbe, JProfiler and HP Diagnostics to track test performance and identify bottlenecks.
- Knowledge of Oracle and SQL database queries and data manipulation using SQL.
- Well organized, creative and a Team Player with proven ability to complete given tasks on time.
- Experience with bug tracking systems and technically sound.
- Responsible for weekly status meetings showing progress and future testing efforts.
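Below is a minimal, illustrative VuGen (C) sketch of the dynamic-navigation enhancement referenced above; the URLs, boundary strings and the NextPageLink parameter are hypothetical placeholders, not taken from any actual engagement.

```c
Action()
{
    // Capture the session-specific link from the landing page response;
    // the left/right boundaries below are illustrative placeholders.
    web_reg_save_param("NextPageLink",
                       "LB=<a href=\"",
                       "RB=\" id=\"continue\"",
                       "NotFound=warning",
                       LAST);

    web_url("LandingPage",
            "URL=http://example.com/portal/home",
            "Resource=0",
            "Mode=HTML",
            LAST);

    // Navigate to whatever link the server generated for this session,
    // instead of the hard-coded URL captured at recording time.
    web_url("DynamicNavigation",
            "URL=http://example.com{NextPageLink}",
            "Resource=0",
            "Mode=HTML",
            LAST);

    lr_output_message("Navigated to %s", lr_eval_string("{NextPageLink}"));

    return 0;
}
```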
TECHNICAL SUMMARY
- Operating Systems: AIX, HP-UX, Solaris, Windows XP, 2003, 2000, Vista, Windows NT and Linux
- Languages: C, Java/J2EE, .NET, VBScript, XML, UNIX shell scripting
- Databases: Oracle 9i/10g, DB2, SQL Server, MS-ACCESS, MySQL
- GUI: VB 6.0/5.0, JSP, Java Applets, ASP, HTML
- Web Related: DHTML, XML, VBScript, JavaScript, Applets, JAVA, JDBC, Servlets and JSP
- Testing Tools: LoadRunner, Quality Center, Quick Test Pro, Performance Center
- Web / Application Servers: Apache 2.x, Tomcat, WebLogic, WebSphere and IIS.
- Other: JMeter, Silk, Wily Introscope, SiteScope, JProbe, JProfiler
PROFESSIONAL EXPERIENCE
Confidential, MA Jan 2011 - Present
Performance Engineer
Responsibilities
- Gathered business requirements, studied the application and collected information from analysts.
- Created LoadRunner scenarios and scheduled virtual users to generate realistic load on the server.
- Involved in developing the test plan strategy and building the test client and test environment.
- Configured and used SiteScope performance monitors to analyze server performance, generating reports on CPU utilization, memory usage, load average, etc.
- Conducted all tests in the Controller by creating 100, 200, 400 virtual users for load.
- Inserted transactions and checkpoints into Mercury LoadRunner Web VuGen scripts, and parameterized and correlated the scripts (see the sketch after this list).
- Analyzed Mercury LoadRunner online graphs and reports to determine where performance delays occurred: network or client delays, CPU performance, I/O delays, database locking, or other issues at the database server.
- Performed Data Driven and Security Testing.
- Added and monitored WebLogic, application servers and Windows servers during performance testing using SiteScope.
- Used monitoring tools to analyze the heap memory statistics for identifying memory leaks.
- Debugged and enhanced the performance test scripts using the C language.
- Developed and maintained the overall test methodology and strategy; documented test plans and test cases, and edited and executed test cases and test scripts.
- Involved in conducting stress tests and volume tests against the application using LoadRunner.
- Used Quality Center to invoke the scripts, performed the initial baseline testing, organized all the scripts systematically and generated reports.
- Extensively used Quality Center for test planning, maintaining test cases and test scripts, test execution and bug reporting.
- Used Virtual User Generator to create VuGen scripts for the web protocol; ensured that quality issues were appropriately identified, analyzed, documented, tracked and resolved in Quality Center.
- Involved in defect tracking, reporting and coordination with various groups from initial finding of defects to final resolution.
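A hedged sketch of the transaction, checkpoint and parameterization work described above, in LoadRunner's C-based VuGen syntax; the transaction name, checkpoint text, URL and the {UserName}/{Password} parameters are illustrative assumptions.

```c
Action()
{
    // Text checkpoint: registered before the request it verifies.
    web_reg_find("Text=Welcome",
                 "Fail=NotFound",
                 LAST);

    lr_start_transaction("Login");

    // {UserName} and {Password} are drawn from a parameter file so each
    // Vuser iteration submits different, data-driven credentials.
    web_submit_data("SubmitLogin",
                    "Action=http://example.com/app/login",
                    "Method=POST",
                    "Mode=HTML",
                    ITEMDATA,
                    "Name=username", "Value={UserName}", ENDITEM,
                    "Name=password", "Value={Password}", ENDITEM,
                    LAST);

    // LR_AUTO marks the transaction failed if the request or checkpoint fails.
    lr_end_transaction("Login", LR_AUTO);

    return 0;
}
```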
Environment: Performance Center, VuGen, Java, WebLogic, Web Services, XML, LoadRunner, Wily, Quality Center, Oracle, SiteScope, JProbe, JProfiler.
Confidential, AZ Mar 2008 - Dec 2010
Performance Engineer
Responsibilities
- Responsible for designing scenarios for performance testing, generating scripts and handling correlation and parameterization using LoadRunner and Silk Performer.
- Extensively used the Web (HTML/HTTP), Web Services, Siebel, SAP and Oracle NCA protocols in LoadRunner.
- Executed scenarios using the Controller and analyzed results using the LoadRunner Analysis tool.
- Identified bottlenecks in the network, database and application servers using LoadRunner monitors.
- Used Keynote for cloud based load testing.
- Validated web services using SoapUI.
- Analyzed Throughput Graph, Hits/Second graph, Transactions per second graph and Rendezvous graphs using LR Analysis tool.
- Enhanced Vuser scripts with correlation, parameterization, transaction points, rendezvous points and various LoadRunner functions.
- Responsible for conducting performance benchmark tests.
- Developed Vuser scripts and enhanced the basic scripts by parameterizing constant values using LoadRunner.
- Identified critical process for testing by monitoring the business activity.
- Extensively used SiteScope and Introscope for monitoring during load tests.
- Prepared load test analysis reports (% disk, CPU utilization, throughput, % page breakdowns, response times, network monitors, web server monitor counters, captures, system performance counters and database performance counters).
- Configured SiteScope to monitor Siebel and Oracle Servers.
- Conducted regression testing and BVT (Build Verification Test) using Silk Test and Silk Performer.
- Parameterized unique IDs, stored dynamic content in variables and passed the values to web submits under the HTTP protocol (see the sketch after this list).
- Performed load testing with 100/250/500/750/1050 Vusers.
- Performed load testing from various locations using Keynote.
- Designed manual and goal-oriented scenarios using the LoadRunner Controller module.
- Involved in analyzing the Runtime, System Resources and Transactions graphs.
- Responsible for generating reports using LoadRunner.
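An illustrative VuGen (C) fragment of the correlation and rendezvous enhancements listed above; the boundaries, form fields, URLs and the {OrderID}/{Quantity} parameters are assumptions made for the sketch, not captured from the actual applications.

```c
Action()
{
    // Capture the server-generated order ID from the response;
    // the boundaries are illustrative.
    web_reg_save_param("OrderID",
                       "LB=name=\"orderId\" value=\"",
                       "RB=\"",
                       LAST);

    web_url("CreateOrderForm",
            "URL=http://example.com/orders/new",
            "Resource=0",
            "Mode=HTML",
            LAST);

    // Hold Vusers here so the submission hits the server as a synchronized spike.
    lr_rendezvous("submit_order");

    lr_start_transaction("Submit_Order");

    // Pass the correlated value back in the form POST instead of the
    // hard-coded ID recorded at scripting time.
    web_submit_data("SubmitOrder",
                    "Action=http://example.com/orders/submit",
                    "Method=POST",
                    "Mode=HTML",
                    ITEMDATA,
                    "Name=orderId", "Value={OrderID}", ENDITEM,
                    "Name=quantity", "Value={Quantity}", ENDITEM,
                    LAST);

    lr_end_transaction("Submit_Order", LR_AUTO);

    return 0;
}
```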
Environment: LoadRunner, Performance Center, Wily Introscope, Siebel, Quality Center, SiteScope, Java, JavaScript, Keynote, WebSphere, WebLogic, HTML, Oracle, SQL, Dynatrace, ClearQuest, UNIX
Confidential, MN Jun 2006 - Feb 2008
Performance Analyst
Responsibilities
- Defined performance goals and objectives based on client requirements and inputs.
- Extensively worked with the Web, Citrix, Click and Script, and Oracle protocols in LoadRunner.
- Ensured the compatibility of all application platform components, configurations and their upgrade levels with production, and made the necessary changes to the lab environment to match production.
- Responsible for developing and executing performance and volume tests.
- Developed test scenarios to properly load/stress the system in a lab environment and monitored/debugged performance and stability problems.
- Partnered with the software development organization to analyze system components and performance and identify needed changes in the application design.
- Involved in analyzing, interpreting and summarizing meaningful and relevant results in a complete Performance Test Report.
- Developed and deployed load test scripts to perform end-to-end performance testing using LoadRunner.
- Implemented and maintained an effective performance test environment.
- Identified and eliminated performance bottlenecks during the development lifecycle.
- Produced regular project status reports for senior management to ensure on-time project launch.
- Conducted duration, stress and baseline tests.
- Used the Controller to launch 300, 500 and 700 concurrent users to generate load.
- Identified queries that were taking too long and optimized them to improve performance.
- Worked closely with software developers and took an active role in ensuring that the software components met the highest quality standards.
- Provided support to the development team in identifying real-world use cases and appropriate workflows.
- Performed in-depth analysis to isolate points of failure in the application.
- Assisted in the production of testing and capacity certification reports.
- Investigated and troubleshot performance problems in the lab environment, including analysis of performance problems in the production environment.
- Worked closely with clients.
- Interfaced with developers, project managers and management in the development, execution and reporting of test performance results.
Environment: LoadRunner, Quick Test Pro, LDAP, Oracle, MS SQL Server, WebLogic, WebSphere, Load Balancer, Java, Test Director, J2EE Diagnostic Tool, JMeter, Web, Windows 2000/XP, Solaris, AIX, IE, Netscape, Firefox
Confidential, CT Jun 2005 - Apr 2006
Performance Tester
Responsibilities
- Changed the user loads by monitoring the number of hits on different web pages on the website
- Changed run-time settings such as pacing, think time, log settings, browser emulation and timeout settings in LoadRunner VuGen and the Controller.
- Used Schedule by Scenario in the Controller to change the ramp-up, duration and ramp-down settings.
- Monitored the metrics such as response times, throughput and server resources such as Total Processor Time, Available Bytes and Process Bytes by using LoadRunner Monitors
- Scheduled Load Tests to run overnight due to unavailability of environment during the day time
- Analyzed the Transaction Summary Report and graphs generated in a LoadRunner Analysis session
- Identified the load balancing issues on the servers
- Analyzed server resources such as Total Processor Time to look for performance bottlenecks.
- Analyzed the server resources such as Available Bytes and Process Bytes for Memory Leaks
- Reported defects using Rational ClearQuest and smoke tested the application after the fix.
- Performed capacity testing on the enhancements using Schedule by Group in the Controller.
- Worked with the .NET Diagnostics tool to uncover the layers, classes and methods responsible for slow transaction response times.
- Worked closely with software developers and took an active role in ensuring that the software components met the highest quality standards.
- Wrote and executed SQL queries to validate test results.
- Used Virtual User Generator to create VuGen scripts for the web protocol and ensured that quality issues were appropriately identified, analyzed, documented, tracked and resolved in Quality Center (see the sketch below).
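A small VuGen (C) sketch of the think-time and log-setting adjustments mentioned above (pacing and most log settings are otherwise controlled from the Run-Time Settings and the Controller); the URL, the {SearchTerm} parameter and the values shown are illustrative.

```c
Action()
{
    // Turn on extended parameter logging for this portion of the script only,
    // so parameter substitution can be debugged without flooding the logs.
    lr_set_debug_message(LR_MSG_CLASS_EXTENDED_LOG | LR_MSG_CLASS_PARAMETERS,
                         LR_SWITCH_ON);

    lr_start_transaction("Search");

    web_url("Search",
            "URL=http://example.com/search?q={SearchTerm}",
            "Resource=0",
            "Mode=HTML",
            LAST);

    lr_end_transaction("Search", LR_AUTO);

    // Emulate a user reading the results page; honoured or scaled according
    // to the think-time run-time settings.
    lr_think_time(5);

    // Restore normal logging before the next step.
    lr_set_debug_message(LR_MSG_CLASS_EXTENDED_LOG | LR_MSG_CLASS_PARAMETERS,
                         LR_SWITCH_OFF);

    return 0;
}
```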
Environment: LoadRunner, .NET Diagnostics, Rational ClearQuest 7.0, SQL, SQL Server 2000, MS Office, MS Visio, IIS, .NET and Windows
EDUCATION
Master's in Mechanical Engineering