
Sr Performance Engineer Resume


Nashville, TN

SUMMARY

  • 6+ years of strong expertise in Performance, Load and Stress Testing using HP LoadRunner/Performance Center.
  • Extensive experience in automated testing of Web-based and Client/Server applications with proficiency in Load and Performance Testing.
  • Worked on both Web-based (.NET/Java) and Native/Hybrid mobile applications.
  • Experience in working with Onsite/Offshore model projects.
  • Good experience in working with both Agile and Waterfall methodologies. Implemented a shift-left strategy in Performance testing.
  • Experience in log file analysis for finding performance bottlenecks using tools such as Splunk and GC Viewer.
  • Good working knowledge of APM tools such as HP Diagnostics and Dynatrace for diagnosing complex performance problems.
  • Performed monitoring of the application and database servers during test runs using tools like Dynatrace, HP Diagnostics and SiteScope.
  • Executed performance tests (load, capacity and stress tests) using HP LoadRunner and JMeter.
  • Familiar with major APM tool components such as Business Process Monitor and Application Diagnostics.
  • Independently manage, configure and resolve tickets within SLA.
  • Adheres to standard operating procedures / work instructions.
  • Execution of automated test scripts using Mercury tools (Test Director/Quality Center, LoadRunner, and QTP) and JMeter, based on business/functional specifications.
  • Experience in automating the web tests using VSTS.
  • Experience in working with Network Virtualization tools like Shunra to simulate load from different geographical locations.
  • Extensively used HttpWatch and browser Inspect Element to identify bottlenecks on the UI/client side of the application.
  • Expertise in various monitoring tools like HP SiteScope and HP Diagnostics to keep track of test performance and identify bottlenecks.
  • Expertise in SQL queries to perform Backend testing.
  • Strong in identifying database bottlenecks like missing indexes and deadlocks using SQL profiling.
  • Experience in monitoring Web Servers and Application Servers such as Microsoft IIS, WebLogic and WebSphere, and Database Servers such as SQL Server and Oracle, during Performance Tests with and without firewalls.
  • Expertise in Web Services and experienced in using SOAP UI for testing in an SOA environment.
  • Participated in Integration, System, Smoke and User Acceptance Testing.
  • Well versed with all functionality of Virtual User Generator, including correlating statements and configuring run-time settings for HTTP, iterations and simulated modem speeds to bring the testing scenario closer to the real world.
  • Strong knowledge of using single and multiple protocols in LoadRunner VuGen such as Web HTTP/HTML, Web Services, Ajax TruClient, Web Click and Script, Citrix ICA, ODBC and Oracle NCA (an illustrative VuGen correlation/parameterization sketch follows this list).
  • Good understanding of object-oriented methodologies, the software development life cycle (SDLC) and software testing methodologies.
  • Excellent ability to understand complex scenarios and business problems, and transfer the knowledge to other users/developers in the most comprehensible manner.
  • Good knowledge of Object Oriented Programming; experienced with C and C#, HTML, XML and CSS.
  • Experience in Structured Query Language (SQL), Joins, PL/SQL stored procedures & Triggers.
  • Worked with the Orasi Software Inc. team on installing and setting up the latest patches and upgrades to LoadRunner.
  • Experience in working with mobile applications, which includes both Native and Hybrid applications.
  • Good working experience on using Fiddler and Wireshark to capture the Network traffic from the mobile client to the servers.
  • Experienced working with developers in White Box Testing and debugging code for better performance results.
  • Works closely with solution architects, developers and DBAs to identify and resolve bottlenecks.
  • Quick learner with respect to the latest technologies, best practices and systems.
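
Illustrative LoadRunner VuGen sketch (referenced in the VuGen bullets above): a minimal C Action showing correlation and parameterization. The URLs, boundaries and parameter names (p_username, p_password, c_session_id) are placeholders, not details from any actual engagement.

  Action()
  {
      // Register a correlation before the request whose response carries the dynamic value.
      web_reg_save_param_ex(
          "ParamName=c_session_id",
          "LB=sessionId=\"",
          "RB=\"",
          SEARCH_FILTERS,
          "Scope=Body",
          LAST);

      lr_start_transaction("01_Login");

      // Parameterized login: {p_username}/{p_password} come from the script's parameter list.
      web_submit_data("login",
          "Action=https://app.example.com/login",
          "Method=POST",
          "Mode=HTML",
          ITEMDATA,
          "Name=username", "Value={p_username}", ENDITEM,
          "Name=password", "Value={p_password}", ENDITEM,
          LAST);

      lr_end_transaction("01_Login", LR_AUTO);

      // Reuse the correlated session id in a later step.
      web_url("home",
          "URL=https://app.example.com/home?sessionId={c_session_id}",
          "Mode=HTML",
          LAST);

      lr_think_time(5);   // think time and pacing are normally driven by run-time settings
      return 0;
  }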

TECHNICAL SKILLS

Testing Tools: LoadRunner, QTP, Performance Center, QC, JMeter

LoadRunner Protocols: Web HTTP/HTML, Web Services, TruClient, Java Vuser, Oracle NCA

Scripting: JavaScript, VBScript, Shell

Programming Languages: C, VB.net, ASP.net, HTML

Web/Application Servers: MS IIS, Apache, WebSphere, WebLogic

Database: Oracle, Db2, SQL Server, Denodo

Service Oriented Architecture (SOA): Web Services, XML, SOAPUI, WSDL, WCF

Monitoring tools: HP Diagnostics, HP SiteScope, Dynatrace, HP BPM, CA Wily Introscope, Splunk

PROFESSIONAL EXPERIENCE

Confidential, Nashville TN

Sr Performance Engineer

Responsibilities:

  • Working closely with Business Analysts, Product Owners, Developers and Managers to gather Application Requirements and Business Processes in order to prepare Performance Test Plans, Test Strategies, and High-Level & Low-Level Performance Scenarios documents.
  • Analysed the Business Requirement Document (BRD), developed detailed Test Plans and prepared Test Cases.
  • Deriving the Production-like workload for the identified business processes in order to reduce the risk of overloading servers and of application unavailability during peak hours (a worked sizing sketch follows this list).
  • Communicating with cross-functional teams and getting the various updates needed to move on to the next step.
  • Creating the performance Test Plan, along with estimates based on the scope of the project, and sharing it with the project team.
  • Preparing the test cases for the Key business scenarios.
  • Performing the scalability exercise to compare the Production and Performance environments and scale the environment accordingly.
  • Engage in Discussions with the team to understand the criteria of the performance testing and provide the recommendations on what type of executions need to be performed. Ex: Load Test, Stress Test, Volume Test & Endurance test.
  • Running a smoke test each time the latest code/build is deployed.
  • Developing Load Runner scripts using a variety of protocols ranging from WEB (http/html), Mobile (http/html), Web services and Ajax TruClient (IE).
  • Script enhancements by correlation, parameterization, transaction points, rendezvous points and various Load Runner functions.
  • Designed and created a number of LR Scripts for Applications deployed on a variety of platforms including Java J2EE, .NET, Ajax and SOA applications.
  • Validate the script by providing the inputs and checking in the UI or Back-end Database.
  • Communicating with QA/DEV or TDM (Test Data Management Team) to create the Test data required for all the executions.
  • Independently developed Performance Center VUGen scripts according to test specifications/requirements to validate against Performance SLA.
  • As an Admin for PC ALM, setting up the scenarios for executions & providing user access to the ALM were the key responsibilities. Conducting baseline test for all the applications was part of Admin role.
  • Analysing throughput graphs, hits/second graphs, transactions per second graphs and rendezvous graphs using Load Runner Analysis tool.
  • Responsible for monitoring performance of the applications and database servers during the test run using tools like HP Diagnostics, Site Scope & Dynatrace.
  • GC monitoring to understand if there are any memory leaks during the Capacity & Endurance Tests.
  • Collaboratively worked with the Network/Security team to get the ports opened between the Load Generators and the application, and between Performance Center and the Controller.
  • Good understanding of SOA requirements and SOA Reference Architecture.
  • Maintained amicable relationships with developers and other stakeholders to better triage and narrow down bugs.
  • Working on both Web-based (.NET/Java) and Native/Hybrid mobile applications.
  • Using Fiddler to capture the network traffic from the iOS mobile app and converting the traffic using VuGen.
  • Working closely with solution architects, developers and DBAs to identify and resolve bottlenecks.
  • Using SoapUI for capturing required Headers/Parameters to perform Web Service testing.
  • Using Shunra NV to generate different network bandwidths depending on the location.
  • Raise & Track Defects using QC ALM (Application Lifecycle Management).
  • Interacting with developers for identifying memory leaks, fixing bugs and for optimizing server settings at web, app and database levels.
  • Capturing performance metrics, comparing them against business expectations and providing recommendations for better performance.
  • Presenting artifacts to the stakeholders using PPT and Test Report documentation.
  • Extensively worked with JIRA & ALM on tracking project timelines.
  • Experience in working with native iOS applications using Fiddler and Wireshark tools.
  • Responsible for conducting time-to-time Knowledge Transfer meetings and imparting domain knowledge within the team.
  • Managed team up to 5 members in size, worked in capacity of Test Lead for Manual, Performance testing POC and Performance testing for an independent module.
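
A worked workload-sizing sketch (referenced in the workload bullet above), using Little's Law to convert a target throughput into an approximate Vuser count; the peak-hour figures below are illustrative placeholders, not production numbers.

  #include <stdio.h>

  /*
   * Little's Law for a closed workload:
   *   concurrent Vusers = throughput (tx/sec) * (response time + think time + pacing)
   */
  int main(void)
  {
      double peak_tx_per_hour = 18000.0;                    /* assumed peak-hour volume        */
      double tps              = peak_tx_per_hour / 3600.0;  /* = 5 tx/sec                      */
      double resp_time_sec    = 2.0;                        /* target average response time    */
      double think_time_sec   = 8.0;                        /* modelled user think time        */
      double pacing_sec       = 10.0;                       /* fixed pacing between iterations */

      double vusers = tps * (resp_time_sec + think_time_sec + pacing_sec);

      printf("Target load: %.2f tx/sec -> approx %.0f concurrent Vusers\n", tps, vusers);
      return 0;
  }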

Environment: SQL Server 2008 R2, HP LoadRunner 12.53/12.02 (Web HTTP/HTML, Web Services, Oracle Applications, TruClient), Performance Center ALM, SOAP UI, QC ALM, JIRA, HP Diagnostics, Dynatrace, Perfmon, Shunra (Network Virtualization). Protocols: Enterprise AJAX TruClient, Web HTTP/HTML, Web Services, Siebel.

Confidential, Dallas, TX

Sr Performance Test Engineer

Responsibilities:

  • Gathered business requirements, studied the application and collected information from Business Analysts, Project Managers, Solution Architects, Developers and SMEs.
  • Responsible for all phases (planning, developing scripts, execution of Performance Center scenarios and analysis) in an Agile environment.
  • Responsible for monitoring the application's performance under load using the key Web Server monitors, Web Application Server monitors for WebSphere, IIS 5.0, Apache monitors and NT Performance Monitors.
  • Involved in writing Test Plans by incorporating Performance Testing Objectives, Testing Environment, User Profiles, Risks, Test Scenarios, Explanation about the Tools used, Schedules and Analysis, Monitors and Presentation of results.
  • Worked with App-Dev, Production, Technical and Business Managers in planning, scheduling, developing, and executing performance tests.
  • Resolved issues related to HP Quality Center/ALM and was responsible for its administration and support.
  • Creating GUI, Bitmap, Database and Synchronization verification points in WinRunner.
  • Developed Load Runner test scripts according to test specifications/ requirements.
  • Responsible for testing Web, Web Services and Ajax TruClient request.
  • Extensively monitored all the applications using HP Performance Center and SiteScope.
  • Created various Vuser scripts based on the critical transactions used by real-time users, using VuGen of LoadRunner. Identified and eliminated performance bottlenecks during the performance lifecycle.
  • Verify that new or upgraded applications meet specified performance requirements.
  • Used HP Diagnostics to obtain Performance data for problem solving, trend analysis, and capacity planning.
  • Planned, designed, executed and evaluated performance tests of web applications and services and ensured optimal application performance using LoadRunner.
  • Using Performance Center, performed manual ramp-up and ramp-down scenarios to determine how many users can perform a transaction at a time.
  • Setting run-time parameters (Think time, pace time, Replay options etc.), ramp up and load distribution.
  • Worked on executing Baseline, Benchmark, Stress, Memory leak test scenarios for internal and customer facing applications based on application’s workload model.
  • Responsible for preparing System test environment (UNIX OS) using FTP.
  • Ensured daily production support and administration of portal operations like end user performance, identifying the bottlenecks before and after production migration and maintenance upgrades.
  • Implemented monitoring solutions with Graphite.
  • Actively involved in automating the Regression Testing process of the application, based on the existing manual testing scenarios, using C#, ADO.NET and VSTS.
  • Implemented SQL automation from the VSTS tool using ADO.NET connections.
  • Extensively used Unix commands for debugging and used, modified & ran Shell Scripts for daily reports and data collection.
  • Worked closely with software developers and took an active role in ensuring that the software components met the highest quality standards.
  • Deployed Graphite + Tasseo for real-time metrics collection and dashboards (a minimal Graphite feed sketch follows this list).
  • Configured the LoadRunner Controller for running the tests and verified that the LoadRunner scripts work as expected on different Load Generator machines.
  • Monitoring the various machines during load tests and informing the corresponding teams in case of issues.
  • Database testing using SQL queries to compare data accuracy of backend for reports.
  • Monitored different graphs like transaction response time and analysed server performance status, hits per second, throughput, windows resources and database server resources etc.
  • Developed load test scripts using VuGen to make them flexible and useful for Regression testing.
  • Analysed the results and Created Analysis Report through Performance Center Analysis, prepared and submitted Exit Report with Recommendations.
  • Analysed the system resource graphs, network monitor graphs and error graphs to identify transaction performance, network problems and scenario results respectively.
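
A minimal sketch of pushing a custom metric to Graphite's plaintext listener (referenced in the Graphite bullets above). It assumes the default plaintext protocol ("metric.path value timestamp\n" on TCP port 2003); the host name and metric path are placeholders.

  #include <stdio.h>
  #include <string.h>
  #include <time.h>
  #include <unistd.h>
  #include <netdb.h>
  #include <sys/socket.h>

  int main(void)
  {
      struct addrinfo hints = {0}, *res;
      char line[128];
      int fd;

      hints.ai_family   = AF_UNSPEC;
      hints.ai_socktype = SOCK_STREAM;

      /* graphite.example.com is a placeholder host; 2003 is Graphite's default plaintext port. */
      if (getaddrinfo("graphite.example.com", "2003", &hints, &res) != 0)
          return 1;

      fd = socket(res->ai_family, res->ai_socktype, res->ai_protocol);
      if (fd < 0 || connect(fd, res->ai_addr, res->ai_addrlen) != 0)
          return 1;

      /* One data point in the plaintext format: <metric path> <value> <unix timestamp>\n */
      snprintf(line, sizeof(line), "perf.checkout.resp_time_ms 850 %ld\n", (long)time(NULL));
      write(fd, line, strlen(line));

      close(fd);
      freeaddrinfo(res);
      return 0;
  }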

Environment: SQL Server 2012, C#, HTML, HP ALM Quality Center, SOAP UI, HP LoadRunner (Web HTTP/HTML, Web Services, AJAX TruClient, JavaScript Vuser, Java over HTTP Vuser), HP Performance Center 12.x.

Confidential, Atlanta, GA

Sr. Performance Analyst

Responsibilities:

  • Responsible for all phases, planning, developing scripts, execution of performance center scenarios and analysis.
  • Responsible for performance testing Oracle Retail MFP application.
  • Developed Load Runner test scripts according to test specifications/ requirements.
  • Developed Test plan, Traceability metrics mapping with Requirements and Test Cases.
  • Developed Load Test Scripts by using LoadRunner for entire site and did the Parameterization, Pacing, and correlation.
  • Performed monitoring of the application and database servers during test runs using tools like AppDynamics and SiteScope.
  • Arranging daily stand-up meetings with the offshore team and planning accordingly with the development and functional teams.
  • Analyze and monitor customer Problem Reports (PRs).
  • Collaborate with infrastructure functions required in escalations management.
  • Evaluate escalation performance metrics.
  • Used Ramp Up/Ramp Down, Rendezvous point, Start and End Transaction, Parameterization, Correlation features of Load Runner.
  • Experience in using SOAP UI for testing the Web services.
  • Extensively used Web (HTTP/HTML), Web Services, J2EE.
  • Utilized WSDLs and files to perform web services (integration testing) using SOAP UI and Performance Center.
  • Worked closely with clients and interfaced with developers, project managers and management during development.
  • Enhanced Vuser scripts by introducing timer blocks and by parameterizing user IDs to run the script for multiple users.
  • Developed and tested web services applications using SOAP UI as well as Load Runner by using WSDL Files.
  • Responsible for testing Web, Web Services and Java Vuser request.
  • Responsible for testing backend Oracle database.
  • Extensively monitored all the applications using HP Performance Center and Zabbix.
  • Extensively used Splunk in analysing log files.
  • Created various Vuser scripts based on the critical transactions used by real-time users, using VuGen of LoadRunner.
  • Accurately produce regular project status reports to senior management to ensure on-time project launch.
  • Verify that new or upgraded applications meet specified performance requirements.
  • Customized LoadRunner scripts in C, e.g., string manipulation and use of C library functions within the LoadRunner scripts (see the illustrative string-handling sketch after this list).
  • Performed correlation by capturing the dynamic values and parameterizing the data dependencies that are always a part of the business process.
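
Illustrative VuGen string-handling sketch (referenced in the C-customization bullet above): splitting a correlated, comma-separated list and saving the first value back into a LoadRunner parameter. The parameter names c_order_list and p_first_order are hypothetical placeholders.

  Action()
  {
      char buffer[256];
      char *first;

      // Copy the correlated value into a local C buffer before tokenizing it.
      strncpy(buffer, lr_eval_string("{c_order_list}"), sizeof(buffer) - 1);
      buffer[sizeof(buffer) - 1] = '\0';

      first = (char *)strtok(buffer, ",");
      if (first != NULL) {
          // Save the first id back as a parameter for use in later requests.
          lr_save_string(first, "p_first_order");
          lr_output_message("Using order id: %s", lr_eval_string("{p_first_order}"));
      } else {
          lr_error_message("No order ids returned in the correlated list");
      }
      return 0;
  }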

Environment: VB.NET, C#, Microsoft .NET applications, Oracle 10g, Java, Web Services, SQL Server 2008 R2, HP LoadRunner (Web HTTP/HTML, Web Services, Oracle NCA, Microsoft ADO.NET), Quality Center, SQL Server Management Studio, SOAP, Microsoft SQL Server, Enterprise AJAX TruClient, Java over HTTP Vuser.

Confidential, Herndon, VA

Performance Tester

Responsibilities:

  • Responsible for generating the key Virtual user scripts using the Load Runner VUGen utility for web (HTTP/HTML), LDAP and WINSOCK Protocols.
  • Used SoapUI Pro to perform Web Service tests.
  • Uploaded and configured the WADL file in the SOAPUI and JMeter applications to test the web services application.
  • Developed automated web services scripts from the API document to verify RESTful web service calls using XML and JSON formats (see the illustrative REST-call sketch after this list).
  • Responsible for monitoring the application's performance under load using the key Web Server monitors, Web Application Server monitors for WebSphere, IIS 5.0, Apache monitors and NT Performance Monitors.
  • Responsible for Running the LoadRunner scenarios for the Vuser using LoadRunner Controller and monitoring the server response times, throughput, Hits/sec, Trans/sec Transaction Response under load, Web Server Monitors, App server monitors, system monitors such as java processes and a host of other Performance metrics.
  • Implemented IP spoofing techniques to simulate unique users' requests while running the tests.
  • Made many enhancements to the recorded scripts by correlating, parameterizing, inserting debugging messages, string manipulation, and any other script enhancements as and when needed.
  • Configured various WebSphere monitors for WAS applications to figure out which of the several servlets/JSPs caused the problem.
  • Created quantifiable load with test-scenarios for various applications (both standalone and integration) using LoadRunner's Controller.
  • Used various servers and ran SQL queries in SQL Server 7.0 on the back end to ensure the proper transaction of data during various tests.

Environment: Windows NT, Unix, Linux, Oracle 8i & 9i, SQL, MS SQL Server, Oracle Finance, .NET, Web Services, XML, HP LoadRunner (LDAP, WinSock, Web HTTP/HTML, Oracle NCA, Microsoft ADO.NET), Test Manager.

Confidential, Chicago, IL

Performance Tester

Responsibilities:

  • Responsible for Requirements gathering for the new enhancement in Defect Tracking system.
  • Designing the Test Architecture and the Scenarios for the Automation.
  • Created and documented the Test Scenarios for each functional area mentioned in Test Plans to develop the test scripts (automated scripts).
  • Responsible for Test case purpose writing, checkpoints and Test case steps writing.
  • Validated the test cases based on the functional specifications.
  • Modified the test cases, which were not following the specification flow.
  • Participated in internal and external reviews of the Test Description document.
  • Monitoring the various machines during load tests and informing the corresponding teams in case of issues.
  • Developed Loadrunner Scripts in Web, Web services and Database protocols.
  • Responsible for Database testing to verify records in the backend after updating, modification and deletion of records from the front end, and vice versa.
  • Handled day-to-day activities to client and was involved in meetings and calls with offshore management.
  • Executing system test cases, regression test cases.
  • Complete defect management and reporting.
  • Participated in regular meetings with developers for reviews, walkthroughs and bug triage.
  • Reported bugs using Quality Center Bug Tracking system and verified fixes with every deployment.
  • Using Test Director for complete defect management and reporting.
  • Responsible for weekly status, updated showing the Progress of the automation testing effort.

Environment: Windows 2000/XP Professional, Oracle 9.2.05, Oracle 9i, UNIX, Quality Center 8.2, Test Director, .NET, IBM WebSphere and XML.
