
Sr Performance Engineer Resume


Dallas, TX

SUMMARY

  • 8+ years of Quality Assurance experience with strong expertise in Performance/Load & Stress Testing using HP Performance Center/LoadRunner.
  • Experienced in designing multiple LoadRunner (VuGen) scripts with different protocols for load testing a variety of applications, including Web (HTTP/HTML), AJAX, Web Services, LDAP, Citrix ICA, FLEX, Oracle NCA, Java over HTTP Vuser, and JMS (a minimal VuGen scripting sketch follows this summary).
  • Significant experience in Load Testing various applications including Java, J2EE, .NET, COM/DCOM, SQL Server implementations.
  • Well experienced in executing Baseline, Benchmark, Performance, Stress and Memory Leak tests.
  • Experience in writing Test Plans and developing Test Scenarios, Test Cases, and Test Procedures with reference to Business Requirement Documents (BRD), Functional Specifications, and Technical Specifications to meet functional SLAs.
  • As a Performance Tester, accountable for supporting all performance testing activities, load and stress testing processes, methodologies, and tools relating to the client's application portfolio.
  • Accountable for consulting and advising all teams, and assisting with the development and maintenance of reporting metrics to determine the effectiveness of Quality Assurance efforts across the client's applications.
  • Extensive experience in Quality Assurance methodologies and strategies with a strong understanding of the Software Development Life Cycle (SDLC).
  • Hands-on experience in all phases of the project development lifecycle and SDLC, from inception through execution, including design, development, and implementation.
  • Strong experience writing SQL queries for back-end testing, UNIX commands for verifying log files, shell scripts to bounce/maintain QA servers, database refreshes for QA environments, and XML API testing.
  • Experienced in monitoring CPU, memory, network, web connections, and throughput while running Baseline, Performance, Load, Stress, and Soak tests.
  • Heavily involved in Performance Testing, Functional Testing, and Regression Testing using automated testing tools including LoadRunner, HP Performance Center, QuickTest Pro, Quality Center, WinRunner, and TestDirector.
  • Experienced in supporting on-site and offshore teams to provide 24/7 Performance Testing services for internal client applications.
  • Well-versed in software development following Agile methodologies.
  • Well-versed in implementing best practices for VuGen scripting, Performance Testing, and performance test analysis and reporting.
  • Installed and set up Performance Center and multiple LoadRunner agents.
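
A minimal sketch, assuming a Web (HTTP/HTML) VuGen script; the transaction name, check text, and URL are hypothetical and only illustrate how a timed transaction with a page-content check is typically scripted:

    // Minimal VuGen Web (HTTP/HTML) action - names and URL are hypothetical.
    Action()
    {
        // Register a content check; it applies to the next request and
        // confirms the correct page was returned.
        web_reg_find("Text=Account Summary", LAST);

        // Everything between start and end is reported as one transaction
        // response time in the Analysis component.
        lr_start_transaction("T01_Open_Account_Summary");

        web_url("AccountSummary",
                "URL=https://app.example.com/account/summary",
                "Resource=0",
                "Mode=HTML",
                LAST);

        lr_end_transaction("T01_Open_Account_Summary", LR_AUTO);

        return 0;
    }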

TECHNICAL SKILLS

Testing Tools: LoadRunner 8.x/9.x/11.x, SilkTest 7.x, Silk Performer 7.x, QTP 9.x/10, Quality Center 8.x/9.x/11.x, Selenium

LoadRunner Protocols: Web (HTTP/HTML), Web Services, Citrix ICA, FLEX, Winsock, AJAX, Oracle NCA

Scripting: JavaScript, BDL, Shell, VB

Programming Languages: C, C++, C#, Java, HTML

Web Server/App Server: MS IIS, Apache, HIS, WebLogic, WebSphere

Database: Oracle, DB2, SQL Server

Service Oriented Architecture (SOA): Web Services, XML, SoapUI, WSDL

PROFESSIONAL EXPERIENCE

Confidential, Dallas, TX

Sr Performance Engineer

Responsibilities:

  • Created Test Plan/Strategy, which includes Testing Resources, Testing Strategy, Risks and testing of end-to-end scenarios.
  • Prepared test estimations and presented them to senior management for approval.
  • Created automated scripts with timers for recording response times and checks to confirm retrieval of the correct page.
  • Involved in performance testing of server load and scalability by creating multiple Virtual Users with the LoadRunner Virtual User Generator (VuGen) component.
  • Designed multiple LoadRunner (VuGen) scripts with different protocols such as Web, Flex, AJAX, TruClient, Citrix, and Web Services for load testing GUI and other applications.
  • Created detailed test status reports, performance capacity reports, web trend analysis reports, and graphical charts for upper management using the LoadRunner Analysis component.
  • Created, executed, and monitored various manual and goal-oriented scenarios of the application with the LoadRunner Controller.
  • Ran full formal performance tests including Load, Peak, Breakpoint, Burst, Longevity, and Failover.
  • Effectively used all components of LoadRunner 11.52, including the Controller and Performance Center, and wrote LoadRunner 11.52 functions efficiently.
  • Identified system/application bottlenecks and worked with Bottom-line to facilitate tuning of the application/environment, optimizing capacity and improving performance under peak workloads simulated with LoadRunner.
  • Configured web, application, and database server performance monitoring using the LoadRunner Controller, Wily Introscope, Splunk, and HP Diagnostics.
  • Created Vusers to emulate concurrent users, inserted rendezvous points in the Vuser scripts, and executed the scripts in both goal-oriented and manual scenarios using LoadRunner (an illustrative correlation and rendezvous sketch follows this list).
  • Configured the LoadRunner Controller for running the tests and verified that the LoadRunner scripts worked as expected on different load generator machines.
  • Added monitoring parameters (CPU, memory) to the LoadRunner Controller, and used SiteScope for monitoring database and application servers.
  • Monitored the various machines during load tests and informed the corresponding teams of any issues.
  • Used SoapUI for load testing different APIs.
  • Extensively used UNIX commands for debugging, and used, modified, and ran shell scripts for daily reports and data collection.
  • Responsible for analyzing results such as CPU usage, memory usage, garbage collection/heap size, server response times, database response times, active/idle threads, size of WebLogic queues, etc.
  • Monitored UNIX logs for different types of exceptions during load tests, both manually and using the Failbox tool.
  • Extensively used SQL queries for database testing, verifying back-end records after updating, modifying, and deleting records from the front end, and vice versa.
  • Monitored the Oracle/PL/SQL database while running load tests for CPU utilization, storage IOPS, storage KBs, I/O wait percentage, AWR reports, etc., and identified issues within the database.
  • Monitored Alert Count, Connections, Pending Messages, Incoming Message Rate, Outgoing Message Rate, and Message Memory Percent (%) using the RTView performance dashboard.
  • Used SVN to copy JAR/WAR files from a remote repository to a local machine and used them for LoadRunner script generation.
  • Identified bottlenecks for a clustered environment relating to Indexes, Connection Pools, Garbage collections, Memory heap size and fixed them by changing configurations with the help of DB team.
  • Responsible for Order Management lifecycle, inventory management and Service provisioning for Triple Play services during System Test cycle.
  • Used Quality Center for complete defect management and reporting.
  • Responsible for providing on-call support for the Production environment.
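
The correlation and rendezvous work noted above can be illustrated with the following sketch; the boundaries, parameter names, and URLs are hypothetical, not taken from the client engagement:

    // Hypothetical correlation + rendezvous sketch (Web HTTP/HTML protocol).
    Action()
    {
        // Capture a dynamic session token from the next response using
        // left/right boundaries (manual correlation).
        web_reg_save_param("SessionId",
                           "LB=name=\"sessionId\" value=\"",
                           "RB=\"",
                           "NotFound=ERROR",
                           LAST);

        web_url("Login",
                "URL=https://app.example.com/login",
                "Resource=0",
                "Mode=HTML",
                LAST);

        // Hold all Vusers here so the next request fires concurrently.
        lr_rendezvous("submit_order");

        lr_start_transaction("T02_Submit_Order");

        web_submit_data("SubmitOrder",
                        "Action=https://app.example.com/order/submit",
                        "Method=POST",
                        "Mode=HTML",
                        ITEMDATA,
                        "Name=sessionId", "Value={SessionId}", ENDITEM,  // correlated value
                        "Name=qty",       "Value=1",           ENDITEM,
                        LAST);

        lr_end_transaction("T02_Submit_Order", LR_AUTO);

        return 0;
    }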

Environment: J2EE, Java, WebSphere, SQL Server, WebLogic, XML, VuGen, Java JRE 1.7, LoadRunner 11.52, Web Services, SoapUI, Wily Introscope

Confidential, Atlanta, GA

Senior Performance Engineer

Responsibilities:

  • Gathering and analyzing business and technical requirements for Performance Testing purposes.
  • Coordinating with Functional Teams to identify the Business Processes to be Performance tested.
  • Communicated with cross-functional teams and gathered the status updates needed to move on to the next step.
  • Arranged daily stand-up meetings with the offshore team and planned accordingly with the development and functional teams.
  • Experienced in using SoapUI for testing Web services.
  • Extensively used Web (HTTP/HTML), Web Services and J2EE.
  • Utilized WSDLs and related files to perform Web services integration testing using SoapUI and Performance Center (an illustrative SOAP request sketch follows this list).
  • Creating various scenarios to do the performance testing according to scope of the project.
  • Created, executed, and monitored various manual and goal-oriented scenarios of the application with Performance Center.
  • Used Performance Center goal-oriented scenarios when the transactions-per-second rate needed to be determined.
  • Prepared numerous scenarios to verify the accuracy of pricing validation procedures.
  • Performed manual user ramp-up and ramp-down scenarios in Performance Center to determine how many users could execute a transaction at a time.
  • Worked on the Controller to simulate realistic real-time scenarios using accurate run-time settings, IP Spoofing, WAN Emulation, etc.
  • Also scheduled performance tests to run at various times of the day using LoadRunner.
  • Involved in performance testing of server load and scalability by creating multiple Virtual Users with the Performance Center Virtual User Generator component.
  • Executed performance test scenarios, analyzed results and reported findings to the relevant parties.
  • Monitored different graphs like Average Transaction Response Time, Network Data, Hits per Second, Throughput, and Windows resources like CPU Usage available and committed bytes for memory.
  • Used HP Diagnostics to obtain Performance data for problem solving, trend analysis, and capacity planning.
  • Identified problematic areas and recommended solutions to the developers.
  • Tuned systems for optimal performance.
  • Verified that new or upgraded applications met specified performance requirements.
  • Understood each change thoroughly and, when necessary, contacted the related business analysts, developers, and other SMEs.
  • Tracked defects in Quality Center and communicated them to the development/business teams.
  • Coordinated with BA/Dev teams and User Support Group to help resolve the defects quickly and escalate the potential issues when necessary.
  • Coordinate with Off-Shore QA team.
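
Web Services load testing of the kind described above can be sketched as a raw SOAP POST from a VuGen script; the endpoint, SOAPAction header, and message body below are hypothetical and only show the general shape of such a call:

    // Hypothetical SOAP call sketch using web_custom_request (Web protocol).
    Action()
    {
        // Check for a success element in the SOAP response.
        web_reg_find("Text=<OrderStatus>OK</OrderStatus>", LAST);

        web_add_header("SOAPAction", "urn:getOrderStatus");
        web_add_header("Content-Type", "text/xml; charset=UTF-8");

        lr_start_transaction("T03_GetOrderStatus");

        web_custom_request("GetOrderStatus",
            "URL=https://services.example.com/OrderService",
            "Method=POST",
            "Resource=0",
            "Body=<soapenv:Envelope xmlns:soapenv=\"http://schemas.xmlsoap.org/soap/envelope/\">"
                 "<soapenv:Body><getOrderStatus><orderId>{OrderId}</orderId>"
                 "</getOrderStatus></soapenv:Body></soapenv:Envelope>",
            LAST);

        lr_end_transaction("T03_GetOrderStatus", LR_AUTO);

        return 0;
    }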

Environment: Java, J2EE 1.4, Web Services, SQL Server, HP ALM Performance Center 11.0, HP ALM Quality Center 11.0, SoapUI, HP Diagnostics Server, HP SiteScope

Confidential, Buffalo Grove, IL.

Performance Tester

Responsibilities:

  • Participated in all meetings planned for a particular release, including design reviews and test execution timeline discussions, and obtained the necessary technical requirements.
  • Met with the project team to work on the project's business volume metrics.
  • Gathering and analyzing business and technical requirements for Performance Testing purposes.
  • Configured all necessary hardware and software to support Performance Center.
  • Planned, developed, and tested scripts.
  • Independently developed Performance Center VuGen scripts according to test specifications/requirements to validate against performance SLAs.
  • Enhanced Vuser scripts with correlation, parameterization, transaction points, rendezvous points, and various LoadRunner functions.
  • Parameterized the Performance Center scripts to access data sheets based on environment, such as QA, UAT, and Production (an environment-based parameterization sketch follows this list).
  • Created automated scripts for API WSDLs/Portal application using VuGen in Performance Center 9.52 (Web Services protocol and front-end Web HTTP/HTML protocol for the Portal) for regression scenarios.
  • Used Performance Center to create scenarios and set up monitors to track the load generators for performance testing.
  • Performed correlation by capturing the dynamic values and parameterized the data dependencies that are always part of the business process.
  • Conducted several Load tests such as 1 Hour peak production load, Reliability and Stress tests to identify the performance issues.
  • Ran the scripts with multiple users using the Controller in Performance Center for GUI/API regression and load testing.
  • Involved in determining scalability and bottleneck testing of applications.
  • Identifying bottlenecks in Network, Database and Application Servers using Performance Center Monitors.
  • Monitored Average Transaction Response Time, Network Data, Hits per Second, Throughput, and Windows resources like CPU Usage available and committed bytes for memory.
  • Analyzed Throughput Graph, Hits/Second Graph, Transactions per second Graph and Rendezvous Graphs using LR Analysis Tool.
  • Analyzed, interpreted, and summarized meaningful and relevant results in a complete Performance Test Analysis report.
  • Analyzed results and provided Developers, System Analysts, Application Architects and Microsoft Personnel with information resulting in performance tuning the Application.
  • Developed and implemented load and stress tests with HP Performance Center, presented performance statistics to application teams, and provided recommendations on how and where performance could be improved.
  • Verified that new or upgraded applications met specified performance requirements.
  • Worked on the quality process; prepared monthly Quality Reports and Benchmarking Reports.
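
A minimal sketch of the environment-based parameterization mentioned above; the parameter names and data file names are hypothetical, and the mapping of a parameter to a per-environment data sheet is normally configured in VuGen's Parameter List rather than in code:

    // Hypothetical parameter usage. "UserName"/"Password" are mapped in the
    // Parameter List to a data file per environment (e.g. users_qa.dat,
    // users_uat.dat, users_prod.dat - file names assumed).
    Action()
    {
        // Log which data row and environment this Vuser picked up.
        lr_output_message("Running as %s in env %s",
                          lr_eval_string("{UserName}"),
                          lr_eval_string("{Env}"));

        lr_start_transaction("T04_Login");

        web_submit_data("Login",
                        "Action=https://app.example.com/login",
                        "Method=POST",
                        "Mode=HTML",
                        ITEMDATA,
                        "Name=user", "Value={UserName}", ENDITEM,
                        "Name=pwd",  "Value={Password}", ENDITEM,
                        LAST);

        lr_end_transaction("T04_Login", LR_AUTO);

        return 0;
    }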

Environment: .NET, C#, ASP.NET, HP-UX, HIS 4.0/7.0, Web Services, DB2, LoadRunner 9.52/HP Performance Center 11.0, HP ALM Quality Center 11.0, HP Diagnostics Server, HP SiteScope

Confidential, Tallahassee, FL

LoadRunner Tester

Responsibilities:

  • Participated with the Business Support Group in design, review and requirement analysis meetings.
  • Reviewed the test scripts prepared by the team and the Business User Support group.
  • Worked on process development and metrics evaluation for Feature Testing, which involved tracking feature movement for the entire Feature Testing cycle; also worked on metrics and process development for the Regression Testing cycle.
  • Performed Functional testing, Regression testing and End-to-End Testing.
  • Created performance scripts using LoadRunner's VuGen.
  • Developed UAT test scripts and maintained them in HP Quality Center.
  • Parameterized the LoadRunner scripts to access data sheets based on environment, such as QA, UAT, and Production.
  • Used the LoadRunner Controller to generate load and monitor the performance of the application under load.
  • Worked on logging, recording, tracking, and maintaining defects using Bugzilla.
  • Generated reports from Bugzilla on a daily basis.
  • Entered defects in Quality Center and participated in the defect-review calls every day.
  • Analyzed the results using online monitors and graphs to identify bottlenecks in the server resources using LoadRunner and JMeter.
  • Developed and executed SQL scripts to verify the analytics data and the corresponding data displayed in the UI (a scripted UI-value check sketch follows this list).
  • Aided performance tuning, and managed and resolved technical issues.
  • Prepared testing status reports and reviewed the status with client and project management.
  • Followed Quality process and standard operating procedures.
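
The UI side of the data verification described above can be scripted as a check that fails the transaction when the expected value is missing; the sketch below is illustrative, with a hypothetical page, parameter names, and expected value, and the matching back-end SQL comparison is done separately:

    // Hypothetical UI-side check: fail the transaction if the expected value
    // (parameter {ExpectedTotal}) does not appear in the page.
    Action()
    {
        web_reg_find("Text={ExpectedTotal}",
                     "SaveCount=FoundCount",
                     LAST);

        lr_start_transaction("T05_View_Analytics");

        web_url("AnalyticsPage",
                "URL=https://app.example.com/analytics",
                "Resource=0",
                "Mode=HTML",
                LAST);

        if (atoi(lr_eval_string("{FoundCount}")) > 0) {
            lr_end_transaction("T05_View_Analytics", LR_PASS);
        } else {
            lr_error_message("Expected total %s not shown in UI",
                             lr_eval_string("{ExpectedTotal}"));
            lr_end_transaction("T05_View_Analytics", LR_FAIL);
        }

        return 0;
    }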

Environment: Java, Oracle DB, Oracle Apps (AP, AR, GL), Web Services, XML, Windows 2000, Citrix Client, SiteMinder, LoadRunner 9.52, Quality Center 9.5, JMeter

Confidential, Dallas, TX

LoadRunner Tester

Responsibilities:

  • Had meetings with different teams for performance test scope.
  • Participated in the team meetings to discuss the issues arising out of testing.
  • Worked on requirement analysis and test strategy documentation for the required system and functional testing efforts across all test scenarios, including positive and negative tests.
  • Identified the test requirements based on application business requirements.
  • Generated Test Cases for each specification in Requirement Specification Document corresponding to each module.
  • Created Test Data for the test cases for Functional and Automated testing.
  • Documented standards, guidelines, and strategic plans to develop a robust Performance test environment and streamlined the existing Performance testing Process.
  • Led the development, documentation, and maintenance of LoadRunner scripts along with enhanced standards, procedures, and processes.
  • Took the lead in creating Performance Center LoadRunner scripts using the Web HTTP, Web Services, and Oracle NCA protocols.
  • Ran tests using LoadRunner and monitored the system under load in conjunction with capacity planning.
  • Troubleshot, tracked and escalated issues related to test execution.
  • Executed Baseline, Benchmark, Stress, and Memory Leak test scenarios for internal and customer-facing applications based on each application's workload model (a think-time and pacing sketch follows this list).
  • Ensured daily production support and administration of portal operations, including end-user performance monitoring and identification of bottlenecks before and after production migrations and maintenance upgrades.
  • Generated weekly status reports and reported to management.
  • Shared application performance analysis with Technical Directors and business teams to support Go/No-Go decisions.
  • Communicated project status to Leadership.
  • Worked in Internal and External Audits.
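
For the workload-model-driven runs mentioned above, the Baseline, Benchmark, Stress, and Memory Leak variants differ mostly in Controller and run-time settings (Vuser counts, ramp-up, pacing, duration); on the script side the main lever is think time, as in this hypothetical sketch:

    // Hypothetical think-time sketch. Scenario intensity is driven by the
    // Controller (Vuser count, ramp-up, pacing, duration); the script only
    // models the user's dwell time between steps.
    Action()
    {
        lr_start_transaction("T06_Search");

        web_url("Search",
                "URL=https://app.example.com/search?q=router",
                "Resource=0",
                "Mode=HTML",
                LAST);

        lr_end_transaction("T06_Search", LR_AUTO);

        // Simulated think time in seconds; stress runs often reduce or
        // ignore this via run-time settings to push more load per Vuser.
        lr_think_time(10);

        return 0;
    }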

Environment: JavaScript, Windows NT/2000, Oracle 8i/9i, SQL Server 2000/2005, LoadRunner 9.5, Quality Center 9.0

Confidential, Cambridge, MA

LoadRunner Analyst

Responsibilities:

  • Worked in writing and reviewing the test cases.
  • Prepared, reviewed and executed the tests plan and test cases based on requirements.
  • Analyzed and developed the Test Plan, Test Cases, Test Scripts, Expected Test Results, and Test Procedures from the functional requirements for each module.
  • Utilized testing tools to manage the testing process.
  • Verified EDI test files to communicate with external vendors.
  • Good exposure to GUI, business, functionality, manual, white box, black box, and system testing, including integration, performance, stress, load, and regression testing of Web and client/server based applications, and UAT using automated testing tools like LoadRunner.
  • Reviewed LoadRunner coding standards and the best-practices process for performance testing projects.
  • Designed test objectives, planned the tests, created Vusers and scenarios, executed and monitored the scenarios, and analyzed the test results for performance testing using LoadRunner (a Vuser script structure sketch follows this list).
  • Used Quality Center to check out the latest versions of the build for testing purposes, and checked in updated test cases and test documentation periodically.
  • Planned, designed, executed, and evaluated performance tests of web applications and services and ensured optimal application performance using LoadRunner.
  • Conducted extensive security testing, including alternative user identification and authentication, using manual testing.
  • Performed back-end testing by extensively using SQL commands to verify database integrity.
  • Analyzed results and created reports for the load tests performed using LoadRunner/Performance Center.
  • Monitored different graphs, such as transaction response time, and analyzed server performance status, hits per second, throughput, Windows resources, database server resources, etc.
  • Evaluated test results to identify performance issues and bottlenecks.
  • Identified performance issues with the servers and made recommendations for their performance improvement.
  • Maintained bug lists for critical issues using QC.
  • Logged and tracked defects identified during testing cycle.
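
Creating Vusers as described above follows the standard VuGen script layout (init, action, and end sections, kept in separate files by the tool); the sketch below is generic, with hypothetical transaction names and URLs:

    // Standard Vuser script structure: vuser_init runs once per Vuser
    // (e.g. login), Action iterates under load, vuser_end runs once at
    // the end (e.g. logout). Names and URLs are hypothetical.

    vuser_init()
    {
        web_url("Login",
                "URL=https://app.example.com/login",
                "Resource=0",
                "Mode=HTML",
                LAST);
        return 0;
    }

    Action()
    {
        lr_start_transaction("T07_Browse_Catalog");

        web_url("Catalog",
                "URL=https://app.example.com/catalog",
                "Resource=0",
                "Mode=HTML",
                LAST);

        lr_end_transaction("T07_Browse_Catalog", LR_AUTO);
        return 0;
    }

    vuser_end()
    {
        web_url("Logout",
                "URL=https://app.example.com/logout",
                "Resource=0",
                "Mode=HTML",
                LAST);
        return 0;
    }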

Environment: Java, Windows NT/2000, SQL, MS SQL Server, WebSphere 6.0, LoadRunner 9.0, Quality Center 8.0, Bugzilla
