
Senior Performance Engineer Resume


Dearborn, MI

SUMMARY

  • 7+ years of work experience in IT with a focus on testing web-based and Client-Server applications. Strong expertise in performance testing using LoadRunner, Visual Studio and JMeter. Extensive hands-on experience in software testing/QA activities.
  • Highly proficient in creating and enhancing scripts with various performance testing tools, executing them through the Controller and analyzing performance results.
  • Functional knowledge of the Kronos application and expertise in performance testing Kronos as well as other web-based applications.
  • Well versed in recording scripts, correlation, parameterization and validations.
  • Exposure to third party Performance monitoring tools like Dynatrace, Wily Introscope, SiteScope, etc.
  • Highly experienced in documentation and generating business reports after monitoring the results.
  • Extensively used all the components of LoadRunner: VuGen, Controller and Analysis.
  • Proficient in designing Test Objectives and Test Cases.
  • Experience in analyzing network traffic, the ISO/OSI network model and network monitoring metrics.
  • Efficiently executed load tests in cloud environments (MS Azure, Amazon EC2).
  • Good working experience in Agile Scrum, Waterfall and Iterative methodologies.
  • Expertise in using HP Performance center.
  • Hands on experience with SQL server, Oracle database and web based applications
  • Experienced in testing applications manually as well as using the defect tracking tools like Quality Center/ALM.
  • Experience in analysis, design, implementation, execution, maintenance, support and documentation for system testing.
  • Worked on both performance testing and performance engineering, and involved in troubleshooting across various Unix, Linux and Windows server environments.
  • Extracted IBM Tivoli and SCOM reports to monitor server’s performance during the load test.
  • Hands on experience with functional, performance, stress, regression, user acceptance and integration/system testing.
  • Expert in end-to-end, API-level and web-based testing.
  • Extensive experience in programming languages C, C++, Java, Python, JSF, VB, Ruby, C#.
  • Extensively worked with relational databases SQL Server and DB2.
  • Developed multiple LoadRunner scripts using various protocols such as Web-HTTP/HTML, Web Services, Flex, SAP, AJAX, TruClient and .Net.
  • Knowledge in automation tools QTP and Selenium web driver.
  • Full life cycle experience in Quality Assurance which includes various kinds of tests including Black Box System, Integration, Functional, Non-Functional, Data Migration, GUI, Regression, End to End, User Acceptance, Smoke Testing of both Client-Server and/or Web-Based applications.

TECHNICAL SKILLS

Languages: C, C++, Java, JavaScript, Python, Ruby, VB, JSF, C#

Testing Tools: Load Runner, VSTS, QTP/UFT, JMeter, HP ALM, Perfecto

Monitoring tools: Perfmon, Dynatrace, Wily Introscope

Database tools: SQuirreL SQL, TOAD, SQL Database Studio, MySQL Workbench, DB2

Reporting tools: BlazeMeter, MeterPlus, QAPlus

Protocols: Flex, TCP/IP, HTTP/HTML, FTP, AJAX, Citrix, RESTFUL webservices

Webservices Tools: SOAP UI, TestMaker, HTTPMaster

Defect Management Tools: Mercury Quality Center, Test Link, Test Director, JProfiler, Fiddler, Firebug, Rational Clear Quest, HP ALM, Bugzilla

PROFESSIONAL EXPERIENCE

Senior Performance Engineer

Confidential, Dearborn, MI

Responsibilities:

  • Worked across the full project lifecycle, from planning and requirement gathering through project-closing documentation.
  • Used LoadRunner versions 12.53 and 12.55 for testing a Kronos-based application.
  • Involved in project planning, requirement gathering and constructing a user load model to mimic the production business scenario as closely as possible.
  • Because Kronos is a rich application making multiple Flex, HTML and REST API calls, scripting required the LoadRunner Flex protocol.
  • Due to limited Vuser licensing for the Flex protocol, rescripted the test scenarios in the Web protocol, which required a lot of development in Java, JavaScript and C++.
  • Worked with business team in establishing benchmarks/SLA based on the production analysis.
  • Involved in developing critical test scenarios based on the requirements and provided required suggestions as when required.
  • Extensively used the C, C++, Java and JScript programming languages for script development and enhancements.
  • Organized L&P Testing stand-up meeting to communicate with the business, application and database team.
  • Worked in developing and running scripts during business hours and non-business hours.
  • Conducted a load test after each iteration was functionally tested.
  • Performed regression testing on every build to check whether it impacted any of the transaction response times.
  • Actively monitored web, application and database server behavior during the load test.
  • Extensively used SQL Server Management Studio 2005 for creating and executing queries.
  • Extensively created list of performance monitoring metrics to analyze the performance of the application.
  • Extracted IBM Tivoli reports on web, application, report, user, DB and BGP servers' performance every 15 seconds during the test to check whether CPU, memory, pages swapped out, etc. were exceeding permissible limits.
  • Extracted SCOM reports to measure the hardware utilization on the database server.
  • Monitored all the windows server’s hardware utilization during the test using Perfmon.
  • Analyzed the reports of Throughput, Transactions/sec, Average response time, Vusers/sec, etc.
  • Collected and analyzed logs after every test to identify root-cause of a problem.
  • Provided performance and capacity assessments and deployment risks using Key Performance Indicator attributes (capacity, performance, scalability, availability, reliability, etc.).
  • Gained complete functional knowledge of the Kronos application.
  • Identified the bottlenecks and involved in resolving those bottlenecks, improving performance of the application.
  • Played key role in successful plant launches in 7 different cities.
  • Developed performance strategy document and presented it to the appropriate audience.
  • Documented load test results in a clear, well-presented manner so that the business could readily understand them.

Environment: C, C++, JAVA, Jscript, MS SQL, HTML, IIS Server, MS Office Suite, Kronos, LoadRunner 12.53, 12.55, Flex, Web, Webservices, RestAPI, MS Office, SharePoint, IBM Tivoli, SCOM.

Senior Performance Lead

Confidential, Lansing, MI

Responsibilities:

  • Involved in restructuring project plan documentation.
  • Worked on Visual Studio performance testing of a .NET-based web application.
  • Worked with business team in establishing benchmarks to use during deployment.
  • Involved in developing Critical test scenarios based on the requirements and system analysis document.
  • Created and executed scripts using VSTS and MS Azure.
  • Executed scripts in cloud environment using MS Azure.
  • Installed the Controller and Agents on different machines.
  • Frequently wrote shell and Perl scripts to keep pace with changes in application functionality.
  • Extensively worked on preparation of test data.
  • Used C# programming language for scripts development/ enhancements.
  • Involved in daily communication with Developers, Testers, Business Analyst, DBA and System Admin.
  • Prepared the load test environment to do performance testing.
  • Worked in developing and running scripts during business hours and non-business hours.
  • Designed Load test scenarios for 10 different tests.
  • Used Fiddler to capture all the traffic on the application.
  • Used Linux Server environment extensively.
  • Used MS Office Suite (Word, Excel, Outlook, PowerPoint, Access, Visio, MS Project).
  • Extensively used SQL Server Management Studio 2005 for creating and executing queries.
  • Highly involved in performance monitoring of the Database, web and application servers.
  • Extensively created list of performance monitoring metrics to analyze the performance of the application.
  • Analyzed server metrics alongside the performance test reports to check server impacts on performance.
  • Analyzed reports of throughput, transactions/sec, average response time, Vusers/sec, etc.
  • Worked on complete performance monitoring reports analysis after each test execution and documented all the findings.
  • Used TFS to create tickets for existing bugs and issues, worked on assigned tickets to resolve them, and discussed existing tickets in stand-up meetings.
  • Provided performance and capacity assessments and deployment risks using Key Performance Indicator attributes (capacity, performance, scalability, availability, reliability, etc.).
  • Extensively worked on capturing client-server, app server and DB server logs to diagnose abnormal system behavior.
  • Added DNS host names.
  • Identified the bottlenecks and involved in resolving those bottlenecks, improving performance of the application.
  • Involved in creating the final test analysis report for the client and customers.

Environment: VSTS, Fiddler 4, Oracle server, C#, MS Office Suite, DNS, Linux scripting, MQ, MS SQL Server Management Studio 2005, Asp.Net, Sharepoint, Team Foundation Server.

Senior Performance Engineer

Confidential, Florence, KY

Responsibilities:

  • Understood the overall 3-tier application architecture by collaborating with the application development team.
  • Gathered non-functional requirements from the client and analyzed them.
  • Involved in creating the Test Plan, which included project scope, hardware and software requirements, detailed scenarios, etc.
  • Worked with business team in establishing benchmarks to use during deployment.
  • Developed load scripts based on the gathered non-functional requirements and validated results against performance SLAs.
  • Performed load tests by generating load from multiple virtual users to determine how the application would behave under peak-hour load.
  • Performed stress tests to find the application's breaking point and took the steps necessary to prevent server crashes.
  • Define runtime settings and recording settings as per the client requirements.
  • Conducted load, stress, endurance and fail-over tests.
  • Used SQL server to write complex SQL queries.
  • Enhanced scripts with parameterization, correlation, validations, etc. and efficiently executed load tests in cloud environment.
  • Executed LoadRunner scripts using the HP Controller, defining ramp-up and ramp-down of virtual users to reproduce realistic production load patterns.
  • Installed and Configured Apache Tomcat server.
  • Used the SoapUI tool heavily for web service testing.
  • Worked with manual scenarios to manage load tests by specifying the number of virtual users.
  • Monitored scenario execution continuously for errors and fixed them before they could compromise results.
  • Implemented transactions in VuGen scripts to find the time consumed for each transaction.
  • Effectively used all the components of HP Performance Center and wrote LoadRunner functions efficiently.
  • Created automated scripts by including timers for recording response times, and test checks for confirming the retrieval of the correct page.
  • Worked with development and database administration team to evaluate the bottlenecks and work on how to handle them to improve the performance of the application.
  • Monitor performance of the web, application and database server performance continuously using third party performance monitoring/ tuning tool Dynatrace.
  • Generated performance graphs, session reports, performance capacity reports and other documentation required for validation and analysis using the LoadRunner Analysis component.
  • Monitored various machines during load tests and informed the corresponding teams of issues until they were fixed by the responsible team.

Environment: HP- Load Runner 12.5, HP Dynatrace, EC2, WebSphere, SQL, HTML, XML, SOA Environment, C++, protocol: TruClient, SOAP UI, web HTTP, Ajax, Ajax TruClient, MQ, SQL Server

Senior Performance Tester

Confidential, FL

Responsibilities:

  • Involved in meetings conducted with clients to gather application, tool-based and other requirements related to our application.
  • Design test plan/ test objectives as per business requirements.
  • Active role in test metrics generation for web applications.
  • Involved in the project from initial kick-off calls to signoff.
  • Efficiently worked with application team in developing the architecture of the application.
  • Designed and executed performance test plans and test cases.
  • Identified business-critical scenarios that play a crucial role in determining volume intensity and the heaviest DB activity.
  • Worked with the business team to establish benchmarks to use during deployment.
  • Involved in load testing of various modules and software applications using LoadRunner.
  • Created a variety of load testing scripts for data seeding purposes.
  • Developed scripts simulating virtual load using the LoadRunner Virtual User Generator (VuGen).
  • Enhanced the scripts by including transactions for each Virtual User activity.
  • Parameterized values supplied by the virtual user so the script could run with different Vuser credentials.
  • Used correlation to handle the dynamic values generated by the server.
  • Created customized LoadRunner VuGen scripts using C, C++ programming at API level with manual correlation, user defined functions, development libraries (classes and methods), and error handling.
  • Evaluated whether load was being distributed evenly across all the web servers.
  • Used Linux and Unix commands extensively to verify that everything on the server was working correctly.
  • Used Jira for tracking and logging defects.
  • Responsible for setting recording settings and run time settings before each run of the script.
  • Used Performance center for scheduling and execution of load tests.
  • Responsible for conducting smoke, load, stress, API testing.
  • Created high level test strategy documentation and detailed test requirements/results documents.
  • Customized LoadRunner scripts in C, for example string-manipulation routines for the scripts.
  • Used ramp-up and ramp-down in VUser simulation to generate real time scenario.
  • Created test cases and enhanced them as and when required.
  • Worked with the database team on submission and processing of data requests.
  • Installed Wily Introscope to monitor the application's performance.
  • Used Websphere application server heavily to perform business transactions.
  • Involved in WebSphere setup and configuration
  • Used Perfmon to monitor CPU/memory usage, page faults, average disk read queue length, etc.
  • Maintained strong relationships with application developers and the database team, which helped fix bottlenecks at early stages.
  • Participated in regular meetings with developers for reviews and walk-throughs.
  • Involved in defect tracking, reporting and coordination with various groups from initial finding of defects to final resolution.

Environment: Testing tools: HP-Load Runner 11.5, Quality Center, HTML, XML, C, SQL, IntroScope, Java, LINUX/UNIX, JIRA, Ajax truclient, IBM Websphere Application Server, Oracle

Performance Tester

Confidential, Richmond, VA

Responsibilities:

  • Understanding the Business requirement, Functional and Performance Specification documents.
  • Prepared performance test scenarios using functional documents and technical documents.
  • Development of Test Scripts and Monitoring methodologies.
  • Assisted the team in creating test plans and discovering new test paths to improve the current test coverage.
  • Executing Load Test scripts for different QA Environments and identifying memory leakages.
  • Tested application in Citrix server using Citrix protocol and monitored Citrix server through PerfMon.
  • Performed Load Test, Longevity test and Stress Test.
  • Analyzed average CPU usage, response time and TPS for each scenario.
  • Used the defect tracking tool Clear Quest to track the most critical bugs in the application.
  • Used ramp-up and ramp-down in VUser simulation to generate real time scenario.
  • Monitored application and servers performance continuously.
  • Enhanced and modified the scripts according to the test case scenarios.
  • Coordinated with offshore teams on project issues and executions.
  • Attended Defect Meetings and Status Meetings to resolve the bugs.

Environment: LoadRunner, Oracle, Agile, Test Director, Citrix, QTP, RDP, UNIX, C#, C++, Clear Quest, WSDL, JBOSS, MS SQL Server, Unix, HTML, XML, Oracle

Performance Tester

Confidential, Dayton, OH

Responsibilities:

  • Independently developed PerformanceCenter test scripts according to test specifications/requirements.
  • Designed performance test suites by creating Web (GUI/HTTP/HTML), Web service and Click & Script test scripts, workload scenarios, setting transactions. Extensively used VUGen to create Load Test Scripts.
  • Identified system/application bottlenecks and facilitated tuning of the application and environment to optimize capacity and improve the application's performance under peak workloads simulated with the Mercury Interactive LoadRunner tool.
  • Created Vusers to emulate concurrent users, inserting Rendezvous points in the Vuser scripts and executed the Vuser Scripts in various scenarios which were both goal oriented and manual using Load Runner.
  • Correlated and Parameterized test scripts to capture Dynamic data & input various test data as per business requirements.
  • Executed multi-user performance tests in Performance Center, using online monitors, real-time output messages and other features of the LoadRunner Controller/Performance Center.
  • Analyzed, interpreted and summarized meaningful, relevant results in a complete Performance Test Report.
  • Developed and implemented load and stress tests with Mercury Performance Center, presented performance statistics to application teams, and provided recommendations on how and where performance could be improved.
  • Monitoring the various machines during load tests and informing the corresponding teams in case of issues.
  • Created detailed test status reports, performance capacity reports, web trend analysis reports, and graphical charts for upper management using Load Runner analysis component.
  • Performed root cause analysis and bottleneck analysis/problem isolation after performance monitoring using Wily Introscope.
  • Expertise in Capacity Planning, Data Modeling and Root Cause Analysis
  • Expertise in statistical and mathematical analysis and interpretation of data.
  • Performed in-depth analysis to isolate points of failure in the application.
  • Assisted in the production of testing and capacity reports.
  • Monitored and administered hardware capacity to ensure the necessary resources were available for all tests.
  • Created user stories and set up e-mail alerts to the respective focal points to follow up on work items in TFS.
  • Creating and executing test cases in Test Manager.
  • Logged bugs and tracked them to closure in TFS.
  • Generating Test execution Status and Bug status based on TFS statistics.
  • Used Soap UI Pro to perform Web Service Performance test.

Environment: Windows XP, ORACLE, Unix, VB Script, Load Runner, JBOSS, ALM, Quality Center, Introscope, App Dynamics, Microsoft Office, VB script, JIRA, SOAP UI

Test Engineer

Confidential

Responsibilities:

  • Analysis of system & functional requirements with the clients & team members.
  • Involved in preparing Test Objectives and Test Case Creation.
  • Involved in testing the new functionalities based on test cases and coordinated with the development team to fix issues.
  • Conducted GUI Testing, Smoke, functionality, System and Regression testing.
  • Updating, Executing and Maintenance of manual scripts in Mercury Quality Center.
  • Tracked and logged defects through JIRA.
  • Defects Management and Re-testing of defects.
  • Worked on Unix/ Linux shell scripting
  • Performed regression testing and analyzed results.
  • Participated in Review meetings with project team.
  • Analyzed CRs with the client and prepared test cases for the CRs.
  • Tested critical bug fixes and coordinated with developers to release the fixes within tight timelines.
  • Developed test scripts for bugs found in exploratory testing.
  • Generated bug reports and conducted regression testing to verify the changes.

Environment: J2EE, Windows XP, HP ALM, Agile, JIRA, TestComplete, Quality Center,HTML, SQL/PLSQL, ETL, Unix/Linux, Microsoft Excel, MS Word, Internet Explorer, Microsoft outlook, PowerPoint and Visio
