
Performance Engineer Resume

AZ

SUMMARY

  • 7+ years of extensive experience in end-to-end performance testing.
  • Hands-on experience in all phases of the project development lifecycle and the Software Development Life Cycle (SDLC), from inception and transformation through execution, including design, development, and implementation.
  • Proficient in testing application compatibility on UNIX and Windows platforms.
  • Experience using automated tools such as Performance Center, QTP, and Quality Center.
  • Performance testing experience with Web/GUI, SOA, J2EE, .NET, PeopleSoft, mainframe, Oracle, and legacy applications.
  • Experience with various LoadRunner protocols, including HTTP/HTML, Web (Click & Script), Web Services, WinSock, Flex, Oracle NCA, Siebel (Web), and RTE.
  • Extensive experience with baseline, benchmark, average-load, stress, endurance, and capacity performance testing.
  • Worked closely with the developers and business customers to understand the business requirements and establish the Performance Acceptance Criteria.
  • Advanced programming skills for enhancing LoadRunner VuGen scripts to handle dynamic navigation.
  • Configured and used the SiteScope performance monitor to monitor and analyze server performance, generating reports on metrics from CPU utilization and memory usage to load average.
  • Expertise in designing load scenarios that replicate real-life production traffic in Performance Center and the Controller.
  • Experience using monitoring tools such as Wily Introscope and HP Diagnostics to track test performance and identify bottlenecks on back-end servers.
  • Worked closely with developers to identify the bottlenecks and help tune them.
  • Developed comprehensive analysis and test results report for development team and Management.
  • Have an ability to handle multiple projects with competing priorities.
  • Individual with good analytical, interpersonal, and problem-solving skills.
  • Strong process and documentation skills for performance testing/engineering.

TECHNICAL SKILLS

Operating Systems: UNIX, Linux, Windows XP, Windows 2003, Windows NT

Testing Tools: LoadRunner, QuickTest Professional (QTP), Performance Center, JMeter

Languages: C, C++, Java/J2EE, VBScript, XML, Shell Scripting

Databases: Oracle, DB2, SQL Server, MS Access, MySQL

Monitoring Tools: Wily Introscope, SiteScope

Web / Application Servers: Apache, Tomcat, WebLogic, WebSphere, IIS

Methodologies: RUP, Agile, Waterfall

Project Management / Analysis: MS Project, MS Visio, ClearQuest, Rational RequisitePro, UML

PROFESSIONAL EXPERIENCE

Confidential

Performance Engineer

Responsibilities:

  • Responsible for performance and load testing using LoadRunner.
  • Interacted with business teams and developers to understand business requirements, write test cases, and verify bug fixes.
  • Performed technical writing and worked with the product development group throughout testing.
  • Created and maintained traceability matrices. Used this matrix in performing impact analysis for changing requirements.
  • Responsible for setting up the performance test lab and testing environment for LoadRunner.
  • Developed the testing strategy, test plan, and workflow for the major business functions under test.
  • Analyzed the results of scenarios run in different environments to compare performance under the same user load.
  • Determined the maximum number of concurrent users the application server could handle in each environment.
  • Developed Vuser scripts using the LoadRunner Web (HTTP/HTML), Oracle, Microsoft .NET, Web (Click and Script), and Windows Sockets protocols.
  • Used C to enhance LoadRunner scripts to handle exceptions (see the sketch after this list).
  • Developed virtual user scripts for different environments using multi-protocol scripts.
  • Created test scenarios to run the same user load in each environment to assess the performance of the Oracle middle-tier application server across three environments.
  • Analyzed Transaction Profile diagrams to identify the business process that needs load testing
  • Parameterized test scripts to send realistic data to the server and avoid data caching
  • Performed system performance & load benchmark measurements for capacity, scalability and breakpoints.
  • Monitored metrics such as response times and throughput, and server resources such as CPU utilization, available bytes, and process bytes, using LoadRunner monitors.
  • Identified Performance Tuning issues that impacted application performance
  • Continuously worked with developers on problems such as server caching, queuing, database indexing, hardware capacity, insufficient memory, memory leaks, etc.
  • Prepared load test reports by analyzing results in LoadRunner Analysis.
  • Met with clients to discuss performance test plans and present performance test results.
  • Prepared weekly test status reports for the management team.
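The following is a minimal VuGen-style C sketch of the kind of script enhancement described above (correlation of a dynamic value plus C-level exception handling). The transaction name, URL, boundaries, and the {UserID}, {Password}, and SessionToken parameters are illustrative assumptions, not taken from an actual project script, and the failure branch assumes the "Continue on Error" run-time setting is enabled.

    Action()
    {
        int status;

        /* Capture a dynamic session token from the next server response */
        web_reg_save_param("SessionToken",
                           "LB=name=\"token\" value=\"",
                           "RB=\"",
                           "NotFound=warning",
                           LAST);

        lr_start_transaction("Login");

        status = web_submit_data("login",
                                 "Action=http://example.com/login",   /* placeholder URL */
                                 "Method=POST",
                                 "Mode=HTML",
                                 ITEMDATA,
                                 /* {UserID} and {Password} come from a parameter file,
                                    which also avoids server-side data caching */
                                 "Name=userid",   "Value={UserID}",   ENDITEM,
                                 "Name=password", "Value={Password}", ENDITEM,
                                 LAST);

        /* An unset parameter evaluates to its own name, so this detects a
           failed correlation as well as a failed request */
        if (status != LR_PASS ||
            strcmp(lr_eval_string("{SessionToken}"), "{SessionToken}") == 0)
        {
            lr_error_message("Login failed or token not captured for user %s",
                             lr_eval_string("{UserID}"));
            lr_end_transaction("Login", LR_FAIL);
            lr_exit(LR_EXIT_ITERATION_AND_CONTINUE, LR_FAIL);
        }

        lr_end_transaction("Login", LR_PASS);
        return 0;
    }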

Environment: LoadRunner, QTP, XML, SOAPUI, Wily, Dynatrace, Excel, Business Objects, SQL Server, Windows XP, Telnet, WebSphere, Lotus Notes, UNIX, Apache, Tomcat and Oracle.

Confidential, AZ

Sr. Performance Engineer

Responsibilities:

  • Involved in gathering business requirements, studying the application and collecting the information from developers and business.
  • Developed Vuser scripts using the Performance Center Web (HTTP/HTML) and Siebel protocols based on user workflows for the above applications.
  • Prepared VuGen scripts for Java, .NET, and Siebel applications.
  • Designed and executed scenarios for baseline, stress/capacity, and stability tests.
  • Parameterized unique IDs, stored dynamic content in correlation variables, and passed the values to web submits under the HTTP protocol (see the sketch after this list).
  • Identified performance bottlenecks and involved in performance tuning process for improving the application response time.
  • Analyzed the performance test results and explained them to the agencies.
  • Recommended necessary server upgrades for better performance results.
  • Used the Schedule by Scenario option in the Controller to adjust ramp-up/ramp-down settings.
  • Responsible for monitoring metrics such as response times and throughput, and server resources such as total processor time, available bytes, and process bytes, using Performance Center monitors.
  • Worked as an independent consultant for performance testing and coordinated with multiple vendors.
  • Worked closely with Development team to discuss the Design and Testing aspects of the applications to design the Test plans.
  • Reviewed BRD, SRS to prepare Performance acceptance criteria and Test Plan.
  • Worked on web, client-server, mainframe, SOA, J2EE, and legacy applications.
  • Worked with protocols such as HTTP/HTML, Web (Click & Script), Web Services, WinSock, Flex, Oracle NCA, PeopleSoft, Siebel (Web), and RTE.
  • Designed performance test suites by creating Vuser scripts and workload scenarios, setting transactions and rendezvous points, and assembling them into suites using LoadRunner.
  • Used Wily Introscope to monitor server metrics and performed in-depth analysis to isolate points of failure in the application.
  • Involved in testing batch processes in PeopleSoft by capturing time taken for processing large volumes of data. Data for batch testing was created using Performance Center.
  • Monitored system resources such as CPU usage, percentage of memory occupied, and I/O statistics using UNIX commands such as top, vmstat, svmon, and netstat.
  • Analyzed memory-leak issues, data-driven issues, etc.
  • Analyzed JVM heap and GC logs in WebSphere during test execution.
  • Conducted result analysis and communicated technical issues with developers and architects.
  • Involved in defect tracking, reporting and coordination with various groups from initial finding of defects to final resolution.
  • Created comprehensive test results report.
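A brief VuGen-style C sketch of the unique-ID parameterization and rendezvous approach referenced above. The ID format, the {VuserID}, {Iteration}, and {AccountNumber} parameters, the URL, and the form field names are illustrative assumptions.

    Action()
    {
        char unique_id[64];

        /* Build a unique order ID from the Vuser ID and iteration number
           (assumes parameters of type "Vuser ID" and "Iteration Number"
           named VuserID and Iteration have been defined in the script) */
        sprintf(unique_id, "ORD-%s-%s",
                lr_eval_string("{VuserID}"),
                lr_eval_string("{Iteration}"));
        lr_save_string(unique_id, "OrderID");

        /* Hold all Vusers here so the submit step runs under concurrent load */
        lr_rendezvous("submit_order");

        lr_start_transaction("Submit_Order");
        web_submit_data("order",
                        "Action=http://example.com/order",   /* placeholder URL */
                        "Method=POST",
                        "Mode=HTML",
                        ITEMDATA,
                        "Name=orderId", "Value={OrderID}",       ENDITEM,
                        "Name=account", "Value={AccountNumber}", ENDITEM,   /* from a data file */
                        LAST);
        lr_end_transaction("Submit_Order", LR_AUTO);

        return 0;
    }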

Environment: Performance Center, Wily, CVS, Excel, Oracle, WebLogic, F5 Load Balancer, Java, JMeter, Quality Center, J2EE Diagnostic Tool, Web, SOAPUI, Apache, Tomcat.

Confidential, IL

Sr. Performance Engineer

Responsibilities:

  • Defined performance goals and objectives based on client requirements and inputs.
  • Worked extensively with the Web, Web Services, and Siebel protocols in LoadRunner (see the Web Services sketch after this list).
  • Executed scenarios using the Controller and analyzed results using the Performance Center Analysis tool.
  • Extensively used the Web (HTTP/HTML), Web Services, SAP, and Oracle NCA protocols in LoadRunner.
  • Ensured the compatibility of all application platform components, configurations, and their upgrade levels with production, and made the changes necessary for the lab environment to match production.
  • Responsible for developing and executing performance and volume tests.
  • Developed test scenarios to properly load/stress the system in a lab environment and to monitor and debug performance and stability problems.
  • Configured and set up monitors in SiteScope.
  • Worked extensively on analyzing reports.
  • Set up HP Performance Center monitor resources to identify performance bottlenecks, analyzed test results, reported findings to the clients, and provided recommendations for performance improvements as needed.
  • Used the Virtual User Generator (VuGen) to create scripts for the Web protocol. Ensured that quality issues were appropriately identified, analyzed, documented, tracked, and resolved in Quality Center.
  • Developed and deployed test Load scripts to do end to end performance testing using Load Runner.
  • Implemented and maintained an effective performance test environment.
  • Identified and eliminated performance bottlenecks during the development lifecycle.
  • Produced accurate, regular project status reports for senior management to ensure on-time project launch.
  • Conducted duration, stress, and baseline tests.
  • Worked closely with software developers and took an active role in ensuring that software components met the highest quality standards.
  • Provided support to the development team in identifying real-world use cases and appropriate workflows.
  • Performed in-depth analysis to isolate points of failure in the application.
  • Assisted in the production of testing and capacity certification reports.
  • Investigated and troubleshot performance problems in the lab environment, including analysis of performance problems in the production environment.
  • Interfaced with developers, project managers, and management in the development, execution, and reporting of performance test results.
  • Tuned systems for optimal performance and characterized them across multiple platform and configuration combinations.
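A small illustrative sketch of how a Web Services call can be scripted with VuGen's C API using web_custom_request, roughly the style of scripting described above. The endpoint URL, SOAPAction value, SOAP payload, content check, and the {Symbol} parameter are placeholder assumptions.

    Action()
    {
        /* Fail the step if the expected status element is missing from the response */
        web_reg_find("Text=<statusCode>OK</statusCode>",
                     "Fail=NotFound",
                     LAST);

        web_add_header("SOAPAction", "\"getQuote\"");

        lr_start_transaction("WS_GetQuote");

        web_custom_request("getQuote",
                           "URL=http://example.com/services/QuoteService",   /* placeholder endpoint */
                           "Method=POST",
                           "Resource=0",
                           "EncType=text/xml; charset=utf-8",
                           "Mode=HTTP",
                           "Body=<soapenv:Envelope xmlns:soapenv=\"http://schemas.xmlsoap.org/soap/envelope/\">"
                                "<soapenv:Body><getQuote><symbol>{Symbol}</symbol></getQuote></soapenv:Body>"
                                "</soapenv:Envelope>",
                           LAST);

        lr_end_transaction("WS_GetQuote", LR_AUTO);
        return 0;
    }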

Environment: LoadRunner, Wily, J2EE Performance tuning, E-Commerce, Rational ClearCase, ClearQuest, Oracle, COM, XML, SAP, WinSock, WebLogic, Apache, Tomcat, UNIX, Solaris.

Confidential, AZ

Performance Engineer

Responsibilities:

  • Developed Single User, Base Line and Soak test scenarios.
  • Independently developed Load Runner test scripts according to test specifications/requirements.
  • Developed Vuser scripts using the Web (HTTP/HTML) and Web Services protocols.
  • Introduced random pacing between iterations to achieve the desired number of transactions per hour (see the sketch after this list).
  • Created spikes by adding hundreds of users during baseline tests and monitored server recovery.
  • Interacted with developers during testing for identifying memory leaks, fixing bugs and for optimizing server settings at web, app and database levels.
  • Conducted meetings and walkthroughs with users, developers and Business Analysts to gather information about business process.
  • Developed detailed testing methodologies, test matrices, test cases, and test scripts.
  • Worked extensively with LoadRunner, creating scripts for prioritized/critical scenarios and distributing the peak load according to a production-like distribution ratio.
  • Generated automation scripts using QTP for regression testing, with additional scripts generated for each version.
  • Interacted with Stakeholders during testing, isolated bottlenecks at different levels and suggested Tune-up methodologies.
  • Performed Functional/Regression testing and Performance/Stress/Load testing.
  • Developed scenarios for regression/functional and performance testing covering more than 90% of the critical scenarios for the application.
  • Analyzed average CPU usage, response times, number of transactions, throughput, HTTP hits, and average page times for probable scenarios, and created Performance Explorer graphs to analyze CPU and memory utilization across different load tests.
  • Conducted weekly walkthroughs on the application progress for meeting schedules and provided the QA status update periodically to the management.
  • Created reports on performance test results for senior management. Developed various reports and metrics to measure and track the testing effort.
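A minimal sketch of randomized pacing implemented in VuGen C script code, as an alternative to the Run-Time Settings pacing option. The 60-90 second window, transaction name, URL, and {SearchTerm} parameter are illustrative assumptions chosen to approximate a target hourly transaction rate.

    Action()
    {
        int pacing;

        lr_start_transaction("Search");
        web_url("search",
                "URL=http://example.com/search?q={SearchTerm}",   /* placeholder URL */
                "Resource=0",
                "Mode=HTML",
                LAST);
        lr_end_transaction("Search", LR_AUTO);

        /* Random pause of 60-90 seconds before the next iteration, so each
           Vuser completes roughly 40-60 iterations per hour; assumes think
           time is not ignored in the Run-Time Settings */
        pacing = 60 + rand() % 31;
        lr_think_time(pacing);

        return 0;
    }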

Environment: SQL, UNIX, LoadRunner, QTP, Performance tuning, .NET, IIS, SQL Server.
