Performance Engineer Resume
San Diego, CA
Summary
Senior Quality Assurance / Performance Engineer with over 8 years of experience. Extensive experience using the automated testing tool HP LoadRunner. Experienced in planning QA strategy and setting up test environments; a self-starter and motivated team player with leadership abilities and excellent communication and interpersonal skills.
Expertise
- 8 years of extensive experience in software testing of web-based, client-server applications and web services. Actively participated in all stages of the software development and testing lifecycle. Highly proficient in performance testing using LoadRunner as well as manual testing.
- Extensive experience in preparing Load Test plans and Strategies
- Developed and deployed load test scripts for end-to-end performance testing using LoadRunner.
- Involved in developing load and performance test scripts using the Web (HTTP/HTML), Web Services, Citrix, Click and Script, SAP, and Oracle NCA protocols.
- Expert in writing user-defined functions and code in LoadRunner scripts using the C language (see the sketch following this list).
- Involved in writing various functions to create realistic load tests on the applications.
- Performed parameterization, load generator (host) assignment, and scenario scheduling operations.
- Expert in the various recording modes and runtime settings and their purposes for different types of script generation.
- Hands-on experience configuring ramp-up and ramp-down schedules.
- Familiarity with runtime settings, recording options, and general options in LoadRunner.
- Experienced in performing IP spoofing in LoadRunner to address load-balancing issues.
- Experience with performance tuning of servers.
- Performed benchmark, baseline, stress, endurance, network, and component testing.
- Proficient in debugging and executing LoadRunner scripts.
- Expert in writing SQL Queries in the scripts to query the database
- Good understanding of Finance, Insurance, and Healthcare domains.
- Ability to work in multi-platform environments such as Windows and UNIX, with a clear understanding of file systems, environment variables, and the File Transfer Protocol (FTP).
- Excellent interpersonal abilities; a self-starter with good communication, problem-solving, and analytical skills and leadership qualities.
- Knowledgeable in the Capability Maturity Model (CMM) and IEEE standards.
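A minimal sketch of the kind of LoadRunner Vuser script described above, combining a transaction, a parameterized request, and a user-defined C validation function; the URL, parameter names, and helper name are illustrative assumptions rather than details of any specific engagement.

    // Hypothetical user-defined C helper: validates a captured order total.
    int check_total(char *total_str)
    {
        int total = atoi(total_str);              // convert captured text to an integer
        if (total <= 0) {
            lr_error_message("Invalid order total: %s", total_str);
            return -1;
        }
        return 0;
    }

    Action()
    {
        lr_start_transaction("submit_order");

        // {CustomerID} is resolved from a parameter file at run time
        web_url("submit_order",
                "URL=http://example.com/orders?cust={CustomerID}",
                "Resource=0",
                "Mode=HTML",
                LAST);

        // Apply the user-defined check to a captured value
        check_total(lr_eval_string("{OrderTotal}"));

        lr_end_transaction("submit_order", LR_AUTO);
        return 0;
    }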
Technologies
Automation Testing Tools: LoadRunner, Performance Center, SiteScope, Wily Introscope, Quick Test Pro (QTP), WinRunner
Defects Tracking Tools: Quality Center, Test Director, Bugzilla, Clear Quest
Languages: C, C++, C#, JAVA, SQL, PL/SQL, HTML, DHTML, XML, JavaScript, VB Script, Unix Shell, TSL
Operating Systems: Windows 95/98/2000/XP/Vista, Windows NT, UNIX, Linux, Solaris
Experience
Sr. Performance Engineer
Confidential, San Diego, CA  July 2010 – Present
Responsibilities:
- Defining the performance goals and objectives based on the client requirements and inputs
- Extensively worked with the Web, Web Services, Citrix, Click and Script, SAP, and Oracle NCA protocols in LoadRunner.
- Developed LoadRunner test scripts according to test specifications/requirements.
- Executed multi-user performance tests using online monitors, real-time output messages, and other features of the LoadRunner Controller.
- Analyzed, interpreted, and summarized relevant results in a complete Performance Test Report.
- Developed and implemented load and stress tests with LoadRunner, and presented performance statistics to application teams.
- Carried out stress testing for specific transactions by introducing rendezvous points in the script (see the sketch below).
- Conducted performance testing under no, medium, and full load and analyzed the system's response.
- Extensively worked on performance monitoring and analyzed the response time, memory leak, hits/sec, and throughput graphs.
- Analyzed various graphs generated by LoadRunner Analysis and communicated bottlenecks to the System Administrators.
- Analyzed the LoadRunner reports to calculate Response time and Transactions Per Second
- Researched the usability of JMeter for the product's performance testing.
- Conducted stress testing using JMeter and memory profiling using tools like JProbe.
- Designed and developed UNIX shell scripts.
- Responsible for monitoring system resources such as CPU usage, % of memory occupied, vmstat, and iostat.
- Monitored netstat to check connectivity, load balancing, and network traffic in each of the JVMs using UNIX shell scripting.
- Responsible for collecting JVM heap usage and garbage collection frequency in WebSphere during test execution.
- Captured Java threads and Exceptions in the application logs for the analysis.
- Collected metrics using Wily Introscope, writing and maintaining PBDs.
- Tracked network traffic by monitoring F5 BIG-IP default graphs.
- Developed Shell Scripts to facilitate batch testing in UNIX environment.
- Extensively used UNIX commands for fetching and checking the Log files.
- Involved in walkthroughs and meetings with the performance team to discuss related issues.
- Participated in defects meeting to discuss the bottlenecks and long running queries
Environment: LoadRunner, Performance Test Center, Quality Center, Java, JProbe, J2EE, JSP, Servlets, EJB, HTML, iPlanet, WebSphere, XML, SQL, UNIX, Windows XP, Sun Solaris, CA Wily Introscope.
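A minimal sketch of the rendezvous-based stress testing mentioned above: all Vusers wait at the rendezvous point until the Controller releases them together, concentrating load on one transaction. The rendezvous, transaction, and URL names are illustrative assumptions.

    Action()
    {
        // Vusers block here until released as a group by the Controller
        lr_rendezvous("checkout_spike");

        lr_start_transaction("checkout");
        web_url("checkout",
                "URL=http://example.com/checkout",
                "Resource=0",
                "Mode=HTML",
                LAST);
        lr_end_transaction("checkout", LR_AUTO);

        return 0;
    }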
Senior Performance Tester
Confidential, Columbus, GA  March 2010 – June 2010
Responsibilities:
- Developed performance test plans and managed tasks for performance testing of business applications
- Performed stress testing using various features of LoadRunner; recorded and debugged scripts with multiple actions using the VuGen module.
- Designed scenarios based on important transactions and user feedback to simulate realistic load on the system.
- Extensively worked with the Web and Web Services protocols in LoadRunner.
- Measured response times of important user actions using the start and end transaction functions.
- Executed scenarios with different network bandwidths and browser agents, and compared results to recommend bandwidth for store locations.
- Monitored hardware capacity to ensure the necessary resources are available for all tests.
- Provided support to the development team in identifying real world use cases and appropriate workflows
- Used JMeter for performance testing with high volumes of users.
- Performed in-depth analysis to isolate points of failure in the application
- Assisted in production of testing and capacity certification reports.
- Responsible for setting up monitors for different tiers.
- Developed "C" libraries to customize the scripts.
- Worked in shared environment tested different application
- Identified and eliminated performance bottlenecks during the development lifecycle.
- Accurately produced regular project status reports to senior management to ensure on-time project launch.
- Worked with database administrators to index the database and improve application performance.
- Set up WebLogic, DB2, and Tuxedo monitors in Performance Center to monitor system performance and identify bottlenecks.
- Analyzed the system resource graphs, network monitor graphs and error graphs to identify transaction performance time, network problems and scenario results respectively.
- Prepared data files for scripts by extracting data from database tables.
Environment: LoadRunner, Performance Center, Quality Center, Java, J2EE, WebSphere, WebLogic, DB2, WinSQL, SharePoint
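A minimal sketch of the kind of reusable C helper referred to above: a small function kept in a shared header and included by several scripts so that Vusers do not fire in lock-step. The file name, function name, and timing band are illustrative assumptions.

    /* loadtest_utils.h - hypothetical shared helper for LoadRunner scripts */

    // Pause for a random think time within [min_secs, max_secs]
    void random_think(int min_secs, int max_secs)
    {
        int wait = min_secs + rand() % (max_secs - min_secs + 1);
        lr_log_message("Thinking for %d seconds", wait);
        lr_think_time(wait);
    }

A script would then call, for example, random_think(3, 8); between business steps.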
Performance Analyst
Confidential, Columbus, OH  April 2008 – February 2010
Responsibilities:
- Gathered and Analyzed business and technical requirements for performance testing.
- Extensively worked on the Web (HTTP/HTML), Web Services, and Oracle NCA protocols.
- Developed Performance test plan, test cases and scripts using LoadRunner.
- Created scenarios for Concurrent (Rendezvous) and Sequential users.
- Developed enhanced LoadRunner scripts with C functions, parameterized cookies, and dynamic content stored in LoadRunner parameters, using client-side secure certificates.
- Modified runtime settings such as pacing, think time, log settings, browser emulation, and VuGen and Controller timeout settings in LoadRunner.
- Responsible for creating performance test plans detailing requirements for benchmark, load, stress, and failover testing.
- Involved in designing load, stress, and failover testing scenarios based on SLAs for various systems and future load projections.
- Well versed in using LoadRunner's PeopleSoft Enterprise, Web/HTTP, SAP, and Web Services protocols.
- Responsible for scheduling and monitoring scenarios and analyzing results to identify performance bottlenecks.
- Used the Web Services protocol to transfer SOAP messages from one environment to another (see the sketch below).
- Involved in executing scenarios and monitoring server response times, throughput, hits/sec, and transactions/sec.
- Identified performance issues involving load balancer configuration and settings, memory leaks, deadlock conditions, database connectivity, and hardware profiling.
- Acted as coordinator for performance testing activities with the client as well as with offshore team to provide maximum testing support.
- Responsible for communicating the performance bottlenecks to the QA Manager and opening defects using Quality Center.
Environment: LoadRunner, Quality Center, Oracle, IBM WebSphere, AS/400, SAP, SOAP, Perl, XML, WSDL, Java, JSP, HTML, UNIX, and Windows XP.
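A minimal sketch of sending a SOAP message from a LoadRunner script as mentioned above, using web_custom_request to POST an XML envelope; the endpoint, SOAPAction header, and message body are placeholder assumptions.

    Action()
    {
        web_add_header("SOAPAction", "\"getQuote\"");
        web_add_header("Content-Type", "text/xml; charset=utf-8");

        lr_start_transaction("get_quote");
        web_custom_request("get_quote",
                           "URL=http://example.com/QuoteService",
                           "Method=POST",
                           "Resource=0",
                           // {Symbol} is resolved from a parameter file at run time
                           "Body=<soapenv:Envelope xmlns:soapenv=\"http://schemas.xmlsoap.org/soap/envelope/\">"
                                "<soapenv:Body><getQuote><symbol>{Symbol}</symbol></getQuote></soapenv:Body>"
                                "</soapenv:Envelope>",
                           LAST);
        lr_end_transaction("get_quote", LR_AUTO);
        return 0;
    }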
Performance Engineer
Confidential, New York City, NY  Sept 2006 – March 2008
Responsibilities:
- Developed clearly defined test plans to ensure accomplishment of load-testing objectives.
- Independently developed LoadRunner test scripts according to test specifications/requirements.
- Extensively worked with the Web and Web Services protocols in LoadRunner.
- Validated the scripts to make sure they executed correctly and met the scenario description.
- Responsible for analyzing application and components behavior with heavier loads and optimizing server configurations.
- Developed LoadRunner scripts using the Virtual User Generator for single-user, baseline, and soak scenarios, storing dynamically varying object IDs in parameters and validating correct downloads of HTML pages by checking content in the page source (see the correlation sketch below).
- Configured and set up monitors in SiteScope.
- Involved in analyzing, interpreting and summarizing meaningful and relevant results in a complete Performance Test Report.
- Work closely with software developers and take an active role in ensuring that the software components meet the highest quality standards
- Used the Virtual User Generator to create VuGen scripts for the Web protocol. Ensured that quality issues were appropriately identified, analyzed, documented, tracked, and resolved in Quality Center.
- Developed and deployed load test scripts for end-to-end performance testing using LoadRunner.
- Implemented and maintained an effective performance test environment.
- Identified and eliminated performance bottlenecks during the development lifecycle.
- Accurately produced regular project status reports to senior management to ensure on-time project launch.
- Conducted duration, stress, and baseline tests.
- Verified that new or upgraded applications met specified performance requirements.
- Used Performance Center to execute tests and maintain scripts.
- Identified queries that were taking too long and optimized them to improve performance.
- Provided support to the development team in identifying real-world use cases and appropriate workflows.
- Performed in-depth analysis to isolate points of failure in the application.
- Assisted in the production of testing and capacity certification reports.
- Investigated and troubleshot performance problems in a lab environment, including analysis of performance problems in production.
- Created Test Schedules.
- Worked closely with clients.
- Interfaced with developers, project managers, and management in the development, execution, and reporting of performance test results.
Environment: LoadRunner, Quality Center, .NET, Rational ClearCase, ClearQuest, Windows, Oracle, XML, Winsock, WebLogic, Apache, UNIX, Solaris
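A minimal sketch of the correlation technique mentioned above: capturing a dynamically varying object ID with web_reg_save_param, verifying page content with web_reg_find, and reusing the captured value on a follow-up request. The boundaries, URLs, and parameter names are illustrative assumptions.

    Action()
    {
        // Capture a dynamically generated ID from the next server response
        web_reg_save_param("ObjectID", "LB=objectId=", "RB=\"", LAST);

        // Validate that the page downloaded correctly by checking its content
        web_reg_find("Text=Account Summary", LAST);

        web_url("summary",
                "URL=http://example.com/account/summary",
                "Resource=0",
                "Mode=HTML",
                LAST);

        // Reuse the captured ID on the follow-up request
        web_url("detail",
                "URL=http://example.com/account/detail?objectId={ObjectID}",
                "Resource=0",
                "Mode=HTML",
                LAST);
        return 0;
    }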
Performance Tester
Confidential, Houston, TX  March 2003 – Aug 2006
Responsibilities:
- Involved in gathering business requirements, studying the application, and collecting information from developers and the business.
- Created Vuser scripts that contain tasks performed by each Vuser, tasks performed by Vusers as a whole, and tasks measured as transactions.
- Developed Vuser scripts in the Web, SAP, and Citrix protocols.
- Designed tests for Benchmark and Stress testing.
- Parameterized large and complex test data to accurately depict production trends.
- Validated the scripts to make sure they executed correctly and met the scenario description.
- Created single-user, baseline, and soak test scenarios. Random pacing between iterations was introduced to achieve the desired transactions per hour (see the sketch below).
- Added performance measurements for Oracle, WebLogic, and IIS in Performance Center.
- Analyzed results using the LoadRunner Analysis tool and reviewed Oracle database connections, sessions, and WebLogic log files.
- Responsible for analyzing application and components behavior with heavier loads and optimizing server configurations
- Maintained test matrix and bug database and generated monthly reports.
- Interacted with developers during testing for identifying memory leaks, fixing bugs and for optimizing server settings at web, app and database levels.
- Used LoadRunner tool for testing and monitoring. Actively participated in enhancement meetings focused on making the website more intuitive and interesting.
Environment: Manual Testing, LoadRunner, Quality Center, SQL Server, XML, Java, JavaScript, Apache, IE, and Netscape.
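A minimal sketch of combining parameterized test data with randomized waits between iterations, as described above; pacing is normally configured in the runtime settings, so the in-script wait is only an approximation, and the URL and parameter name are illustrative assumptions.

    Action()
    {
        lr_start_transaction("search_policy");

        // {PolicyNumber} comes from a data file prepared from database extracts
        web_url("search_policy",
                "URL=http://example.com/policy/search?number={PolicyNumber}",
                "Resource=0",
                "Mode=HTML",
                LAST);

        lr_end_transaction("search_policy", LR_AUTO);

        // Random 30-60 second wait so Vusers do not iterate in lock-step
        lr_think_time(30 + rand() % 31);
        return 0;
    }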
Education
Master's in Electrical Engineering