Senior Performance Engineer Resume

Alpharetta, GA

SUMMARY

  • Over 8 years of diverse experience in Quality Assurance (performance/functional) testing, with expertise in requirements gathering, analysis, design, application testing, and quality assurance of web-based, client-server, and mainframe applications.
  • Specialized in performance testing applications using load-testing tools such as LoadRunner and Performance Center.
  • Expertise in test planning, test case design, test environment setup, test data setup, defect management, and configuration management. Strong expertise in developing test plans and test cases based on the User Requirements and System Requirements documents.
  • Extensive experience in analyzing Business Requirements, Functional, Performance, and Technical Specification documents and Design Documents, and breaking them down into detailed test cases.
  • Experience in creating test approaches and work plans, and designing test scenarios and test cases for performance test requirements.
  • Highly proficient in HP testing tools (Performance Center, HP ALM, LoadRunner, UFT, WinRunner, TestDirector, and HP Quality Center).
  • Full life-cycle performance engineering from requirements through release and capacity planning, in addition to custom performance test bed and monitoring framework development and test automation.
  • End-to-end performance test analyst specializing in monitoring and tuning enterprise-class middleware servers and applications for optimal performance; also competent in front-end client, back-end database, and network performance tuning.
  • Worked with various LoadRunner protocols, including Web (HTTP/HTML), Web Services, Flex, Citrix, SAP GUI/Web, .NET, and Java RMI.
  • Conducted load and performance testing using LoadRunner, creating rendezvous points to simulate heavy concurrent user load and transaction points to measure application response times (see the sketch after this list).
  • Created various scenarios in Performance Center for baseline, benchmark, stress, and scalability tests.
  • Extensively used performance monitors to analyze system bottlenecks, GC heap size, CPU utilization, thread details, pool size, and session details.
  • Analyzed load balancer settings to perform IP spoofing during load tests.
  • Executed baseline performance tests for each release to verify performance changes in significant business transactions.
  • Developed test plans, test cases, test scripts and procedures, traceability matrix, and test result reports for manual and automated testing.
  • Performed User Acceptance Testing in the final phase of software development process to check the functionality of the software.
  • Coordinated meetings for determining production readiness of applications.
  • Knowledge of performance tuning activities.
  • Developed and managed test data and the test environment, and documented and tracked problem reports.
  • Reviewed deliverables such as test reports and test analyses (weekly status reports, work breakdown structures, etc.).
  • Strong Experience in Retail and Point of Sale testing.
  • Good experience in UNIX and Shell Scripting.
  • Wrote and executed SQL DDL and DML statements to check the validity and integrity of data in various databases for back-end testing.
  • Extensive work experience using Bug Tracking Tools like Quality Center, Rally, Jira, Remedy.
  • Expertise in analyzing performance test results and presenting them to management and other business stakeholders.
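
A minimal VuGen Web (HTTP/HTML) sketch of the transaction and rendezvous points mentioned above; the transaction name and URL are illustrative placeholders, not details from any specific engagement.

    Action()
    {
        // Hold virtual users at the rendezvous point until the scenario
        // releases them together, simulating a spike of concurrent load.
        lr_rendezvous("submit_order");

        // Bracket the business step with a transaction to measure response time.
        lr_start_transaction("submit_order");

        web_url("submit_order",
            "URL=https://example.com/store/checkout",   // hypothetical URL
            "Mode=HTML",
            LAST);

        // LR_AUTO lets LoadRunner mark the transaction pass/fail automatically.
        lr_end_transaction("submit_order", LR_AUTO);

        return 0;
    }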

TECHNICAL SKILLS:

Operating Systems: Windows 95/98/NT/XP/Vista/7, MS-DOS, UNIX, HP-UX, Sun Solaris, Linux, VMware, AIX

Programming Languages: C, C++, JAVA/J2EE, .Net (ASP, C# and VB), Visual Basic, HTML, DHTML, XML

Databases: Oracle, DB2, SQL, MS SQL Server 2005/2008, MS Access, Sybase

Front-End GUI: Visual Basic, Developer 2000 (Forms & Reports), PowerPoint

Testing Tools: LoadRunner 9.x/8.1/7.x, ALM, Performance Center 11.x/9.x/8.1, Selenium, WinRunner 7.x/6.5, Quality Center 8.1, Microsoft VSTS, TestDirector 8/7.x/6, QuickTest Pro 9.x, Selenium IDE, WebDriver 2.

Test Management Tools: HP Quality Center 9.2, Test Director, PVCS, VSTS, Saffron, Rational Clear Quest, Tivoli

Monitoring Tools: HP SiteScope 10.x, PerfMon, DebugDiag, Process Explorer, Fiddler, CA Wily Introscope, Splunk 5.0, Foglight 5.6, Dynatrace.

Scripting Languages: TSL, VBScript, JavaScript, Shell Script, Perl, Bash

Others: Visual Studio, TCP/IP, Erwin Data Modeler 7.2, Product Studio, Microsoft Visio, Microsoft Visual Team Explorer

PROFESSIONAL EXPERIENCE

Confidential, Alpharetta, GA

Senior Performance Engineer

Roles & Responsibilities:

  • Lead for the quality assurance effort for the entire project.
  • Responsible for setting up an Agile ecosystem.
  • Updated the status of QA tasks during the daily Scrum meetings within the Agile methodology.
  • Created and adhered to process methodologies, policies, and procedures for testing activities for both new project work and maintenance/support work (e.g., Test Strategy, Test Plan, Test Summary, metric reporting).
  • Created and updated the manual, automation, and performance test plans based on inputs from business users, business analysts, and the development team.
  • Estimated application SLAs after brainstorming with business owners.
  • Created and executed the test flow along with the users during UAT for the Biometrics project.
  • Used Rally for test suite creation, writing and executing test cases, and tracking defects.
  • Worked on the Point of Sale (POS) application.
  • Tested a Big Data project for the consumer insight platform.
  • Created the Requirements Traceability Matrix to track test coverage.
  • Suggested and implemented key process improvements for the Testing Center of Excellence.
  • Created the scripts using the VuGen Web (HTTP/HTML) protocol.
  • Enhanced the scripts with parameterization, correlation, and error-handling functions (see the sketch after this list).
  • Reviewed the test cases, automation scripts, and VuGen scripts created by other team members and provided suggestions for improvements and enhancements.
  • Created and executed performance, load, and stress tests using the Controller.
  • Analyzed the results using LoadRunner Analysis to troubleshoot bottlenecks.
  • Used Wily Introscope to pinpoint the exact methods and classes causing poor application response times.
  • Participated in project team reviews for "Lessons Learned" sessions.
  • Analyzed performance bottlenecks using LoadRunner monitors, CA Wily Introscope, and HP SiteScope.
  • Worked on SOA-based applications.
  • Involved in identifying the performance bottlenecks arising from the performance tests, worked with various teams to resolve the issues, and developed performance testing best practices.
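
A minimal VuGen sketch of the parameterization, correlation, and error-handling techniques referenced above; the boundaries, endpoint, and parameter names are illustrative assumptions rather than the actual application details.

    Action()
    {
        int found;

        // Correlation: capture the dynamic token returned by the landing page
        // (left/right boundaries are illustrative).
        web_reg_save_param("SessionToken",
            "LB=name=\"csrf\" value=\"", "RB=\"",
            "NotFound=warning",
            LAST);

        web_url("order_form",
            "URL=https://example.com/orders/new",   // hypothetical URL
            "Mode=HTML",
            LAST);

        // Register a text check against the next server response.
        web_reg_find("Text=Order Confirmed", "SaveCount=confirm_count", LAST);

        lr_start_transaction("create_order");

        // Parameterization: {Username} comes from a VuGen data file parameter;
        // {SessionToken} was correlated above.
        web_submit_data("create_order",
            "Action=https://example.com/orders/create",   // hypothetical endpoint
            "Method=POST",
            "Mode=HTML",
            ITEMDATA,
            "Name=user", "Value={Username}",     ENDITEM,
            "Name=csrf", "Value={SessionToken}", ENDITEM,
            LAST);

        // Error handling: fail the transaction if the confirmation text is missing.
        found = atoi(lr_eval_string("{confirm_count}"));
        if (found == 0) {
            lr_error_message("Order confirmation not found for user %s",
                lr_eval_string("{Username}"));
            lr_end_transaction("create_order", LR_FAIL);
            return 0;
        }

        lr_end_transaction("create_order", LR_AUTO);
        return 0;
    }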

Environment: Java, Hive, HTML, XML, Big Data, WebSphere 8.x, Windows 7, Rally, QTP 11.x, HP ALM, Performance Center, Web Services.

Confidential, Irving, TX

Senior Performance Engineer

Roles & Responsibilities:

  • Responsible for setting up the Test lab environment including Installation, Troubleshooting, Configuration and connectivity.
  • Involved in creating and updating the Performance test plan based on the inputs from various sources like Business users, Business Analysts, Development team etc.
  • Worked with various business owners and users on arriving at SLAs for the application under test (AUT).
  • Responsible for creating the test harness for the load testing activity.
  • Used Fiddler to capture browser traffic and convert it into VuGen requests (see the sketch after this list).
  • Used Wireshark to capture network traffic and convert it into a VuGen script.
  • Responsible for scripting the load test scenarios from scratch using a variety of protocols such as Web (HTTP/HTML), Mobile HTML, and Web Services.
  • Responsible for creating load test scenarios and scheduling the load tests from Performance Center.
  • Responsible for creating the list of application monitors to be added from Performance Center.
  • Used various correlation and parameterization techniques as part of the scripting process; performed load, scalability, volume, and performance testing across a variety of applications.
  • Used the Foglight tool to take transaction traces and for analysis during and after load tests.
  • Used Splunk to trace the logs.
  • Analyzed performance bottlenecks using Performance Center, Splunk, and Foglight.
  • Wrote up the performance test results and presented them to management and other audiences.
  • Debugged, troubleshot, and worked with team members to find and fix software defects.
  • Used Performance Center Analysis to generate a variety of graphs, including merging and overlaying with the existing baseline results.
  • Involved in developing the framework, test cases, and test steps, and associating them with the corresponding requirements.
  • Updated the test framework, which is based on the Page Object design pattern, for any new business logic and web elements.
  • Facilitated various performance tuning exercises and worked with developers to resolve performance-related bugs.
  • Utilized SQL and validation tools to evaluate test results.
  • Trained production support associates on the latest Point of Sale functions.
  • Managed regression test scripts and maintained end-user documentation to facilitate hardware and software deployments.
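
A minimal VuGen sketch of replaying a Fiddler-captured request as a custom request, as mentioned above; the URL, header, and JSON body are illustrative placeholders standing in for the captured traffic.

    Action()
    {
        lr_start_transaction("mobile_login");

        // Re-add the content type observed in the Fiddler capture before replay.
        web_add_header("Content-Type", "application/json");

        // Replay the captured POST as a custom request; {Username} and {Pin}
        // are hypothetical VuGen parameters standing in for captured values.
        web_custom_request("mobile_login",
            "URL=https://example.com/api/v1/login",   // hypothetical endpoint
            "Method=POST",
            "Resource=0",
            "Body={\"user\":\"{Username}\",\"pin\":\"{Pin}\"}",
            LAST);

        lr_end_transaction("mobile_login", LR_AUTO);
        return 0;
    }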

Environment: Web Services, Java, MongoDB, .NET Framework, SQL Server. Tools: HP ALM 11.0, Performance Center 9.5, Foglight 5.6.x, Splunk 5.0, Wireshark, Fiddler, RabbitMQ.

Confidential, Richardson, TX

Senior Performance Engineer

Roles & Responsibilities:

  • Responsible for setting up the Test lab environment including Installation, Troubleshooting, Configuration and connectivity.
  • Involved in creating and updating the Performance test plan based on the inputs from various sources like Business users, Business Analysts, Development team etc.
  • Worked with various business owners and users on arriving at SLAs for the application under test (AUT).
  • Responsible for creating the test harness for the load testing activity.
  • Responsible for scripting the load test scenarios from scratch using a variety of protocols such as Web (HTTP/HTML) and Web Services.
  • Responsible for creating load test scenarios and scheduling the load tests from Performance Center.
  • Responsible for creating the list of application monitors to be added from Performance Center.
  • Involved in monitoring test execution and system resource monitoring of different host machines.
  • Used various correlation and parameterization techniques as part of the scripting process; performed load, scalability, volume, and performance testing across a variety of applications.
  • Used the DebugDiag tool to take periodic dumps and crash dumps for analysis during and after load tests.
  • Used Process Explorer to check whether any processes other than the load testing processes were running while load test execution was in progress.
  • Involved in capacity planning of the application infrastructure based on the business estimates.
  • Analyzed performance bottlenecks using LoadRunner monitors, CA Wily Introscope, and HP SiteScope.
  • Worked on SOA-based applications.
  • Involved in identifying the performance bottlenecks arising from the performance tests, worked with various teams to resolve the issues, and developed performance testing best practices.
  • Wrote up the performance test results and presented them to management and other audiences.
  • Debugged, troubleshot, and worked with team members to find and fix software defects.
  • Used LoadRunner Analysis to generate a variety of graphs, including merging and overlaying with the existing baseline results.
  • Facilitated various performance tuning exercises and worked with developers to resolve performance-related bugs.

Environment: LoadRunner 9.5/9.0/8.1/8.0/7.5, HP ALM 11.0, HP Performance Center 8.1, Java/J2EE, Mainframe, Web Services, JavaScript, IIS 6.0/5.0, COM+, CA Wily Introscope 7.x, Dynatrace, Tivoli, Oracle 10g/9i, DB2, TOAD.

Confidential, Beaverton, OR

Performance/Functional Test Engineer

Roles & Responsibilities:

  • Prepared the load test plan based on the business requirements document, and prepared test cases and test procedures.
  • Developed test plans, automation of new feature test suites, across the product line and multiple operating systems.
  • Involved in weekly meetings to update the process flow charts.
  • Created Vuser scripts for multiple protocols, including Web (HTTP/HTML) and Web Services.
  • Created load scenarios, scheduled the virtual users, and parameterized Vuser scripts to generate realistic load on the server (see the sketch after this list).
  • Designed and developed automated performance, stability, scalability, and load tests.
  • Developed all the Vuser scripts using LoadRunner VuGen.
  • Involved in running the scenarios for the generated Vuser scripts in the Performance Center Controller.
  • Analyzed performance bottlenecks using Performance Center Analysis.
  • Executed load scenarios with different load options to check the impact of the application on the network and servers.
  • Involved in monitoring test execution and system resource monitoring of different host machines.
  • Executed SQL queries to check the data table updates after test execution.
  • Forecast system load levels and developed scripts that stress the system to those levels.
  • Performed load and performance testing on the WebSphere application server using Performance Center.
  • Debugged, troubleshot, and worked with team members to find and fix software defects.
  • Executed tests and created results using Analysis.
  • Created reports that describe detected defects and possible causes.
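
A minimal VuGen sketch of the parameterized, paced load generation described above; the search URL, parameter name, and think time are illustrative assumptions.

    Action()
    {
        lr_start_transaction("search_catalog");

        // {SearchTerm} is drawn from a VuGen data file, so each iteration
        // issues a different, realistic request (hypothetical parameter).
        web_url("search",
            "URL=https://example.com/catalog/search?q={SearchTerm}",   // hypothetical URL
            "Mode=HTML",
            LAST);

        lr_end_transaction("search_catalog", LR_AUTO);

        // Think time models the user pause between steps; the Controller /
        // Performance Center scenario can scale or randomize it to reach the
        // forecast load level.
        lr_think_time(5);

        return 0;
    }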

Environment: Performance Center 9.5, Quality Center, IBM WebSphere, Java, VBScript, SQL*Plus, SQL*Loader, Oracle, Oracle NCA, Windows 2000, UNIX, SAP Web, Web HTTP/HTML, LoadRunner, Oracle 10g, and XML/SOAP.

Confidential, Thousand Oaks, CA

Performance Test Analyst

Responsibilities:

  • Involved in planning and coordination effort throughout QA life cycle.
  • Responsible for designing load test strategies based on the environment activity.
  • Responsible for estimating the concurrent users for the load test based on the transactions/sec observed in production (see the sketch after this list).
  • Responsible for creating test scripts using LoadRunner VuGen.
  • Performed correlation and parameterization using VuGen.
  • Responsible for creating load test scenarios based on the load testing scripts to be run.
  • Responsible for identifying the counters to be added to the scenario for data collection from an analysis perspective.
  • Monitored the Available Bytes graphs to get an indication of any memory leak present in the system.
  • Responsible for migrating the stored procedures and other database objects needed by the load test database.
  • Involved in compatibility testing of the application with different platforms using QuickTest Pro.
  • Worked with the DBAs to make sure the databases were re-pointed to the original environments once the load test environment was no longer needed.
  • Responsible for load testing coordination with the various other projects involved in the load testing activity.
  • Responsible for generating the LoadRunner Analysis files based on the LoadRunner results files produced by the load tests.
  • Responsible for filtering the Analysis file data based on the required durations.
  • Generated detailed test status reports, performance reports, web trend analysis reports, and graphical charts for upper management.
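
A back-of-the-envelope sketch of the concurrent-user estimate mentioned above, using Little's Law (users ≈ throughput × (response time + think time)); the figures are illustrative, not actual production numbers.

    #include <stdio.h>

    int main(void)
    {
        /* Illustrative figures only, not actual production numbers. */
        double tps        = 40.0;   /* observed transactions per second in production */
        double resp_time  = 2.0;    /* average response time per transaction, seconds */
        double think_time = 13.0;   /* assumed user think time between transactions, seconds */

        /* Little's Law: concurrent users = throughput * (response time + think time). */
        double vusers = tps * (resp_time + think_time);

        printf("Estimated concurrent virtual users: %.0f\n", vusers);
        return 0;
    }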

Environment: LoadRunner 7.x, Java/J2EE, WebLogic, WinRunner 7.5, QuickTest Pro, Oracle 8i, HTML, DHTML, JavaScript, LDAP, IBM AIX 4.2, Windows NT, Dynatrace.

Confidential, Bethesda, MD

Performance Testing Engineer

Responsibilities:

  • Worked closely with the business; involved in writing test plans for the functional and performance testing.
  • Involved in weekly meetings to report test activities.
  • Tested the web-based applications.
  • Executed automated test scripts using QTP for regression testing.
  • Attended the requirements reviews, design reviews, and code reviews to understand the overall process flow and source-to-target mappings.
  • Wrote SQL queries to retrieve data from the database.
  • Tested various Informatica transformations.
  • Created the Automated test scripts using WinRunner for testing the compatibility of the application with different platforms.
  • Executed SQL queries to check the data table updates after test execution.
  • Executed the test scripts and saved the results in TestDirector.
  • Created the test data for interpreting positive/negative results during functional testing.
  • Used the TestDirector as bug-tracking tool to centralize the bugs and also to follow up the bug status.
  • Involved in Regression testing after each build release of the application.
  • Involved in Functional, System, Integration, Performance, Load, Stress and Regression testing during various phases of the development using WinRunner, LoadRunner and TestDirector.
  • Performed installation testing by deploying the application on the application servers that were built.
  • Developed all the Vuser scripts using LoadRunner 7.0 VuGen.
  • Involved in running the scenarios for the generated Vuser scripts in the LoadRunner 7.0 Controller.
  • Analyzed the performance bottlenecks using LoadRunner 7.0 Analysis.
  • System testing consisted of individual test cases plus integration, regression, and performance evaluation stages.
  • Executed load scenarios with different load options to check the impact of the application on the network and servers.
  • Involved in monitoring test execution and system resource monitoring of different host machines.
  • Involved in executing the performance, load, and stress tests, analyzing the reports, and documenting the test results.

Environment: LoadRunner 7.0, QTP, WinRunner 7.0, TestDirector 7.0, Microsoft IIS, ASP, XML, TOAD, COM+, Oracle 8i, Windows NT/2000.

Confidential

QA Analyst

Responsibilities:

  • Reviewed and analyzed functional requirement specifications, workflow documents, and Use Cases.
  • Performed System Testing, Integration Testing, Functional and Regression Testing.
  • Created Data Driven Tests to validate test scenario with different sets of data using parameterization.
  • Closely worked with the Business Analyst, Developers and UI team to resolve the requirement issues, deployment issues, change management etc., during the course of the QA testing and actively participated in Review meetings and walkthroughs.
  • Interacted with developers in resolving the defects found in the application during testing.
  • Worked closely with team members to ensure status and schedules were communicated.
  • Extensively performed manual testing activities using Quality Center.
  • Participated in peer reviews of functional specifications, application previews, and test plans/test cases.
  • Created and executed SQL queries to fetch data from the Oracle database to validate data.
  • Developed automated test scripts for regression testing, based on the requirement documents, using Quick Test Professional.
  • Conducted functional testing by inserting Standard checkpoints and synchronization points in test scripts using Quick Test Professional.
  • Interacted with business analysts and developers to resolve the technical issues so as to meet the client's requirement for a better quality software product.

Environment: HP Quality Center, TOAD, HTML, XML, PL/SQL, Oracle, and Windows 2000.

Confidential

QA Engineer

Responsibilities:

  • Performed User Interface Tests, Business Function tests.
  • Developed test cases, Test procedures and shell procedures in SQA Basic script language.
  • Created SQA repository using SQA administrator.
  • Developed software requirements, test plans, test procedures, and test cases using SQA Manager.
  • Used SQA Robot to record and play back shell procedures, test procedures, and test cases.
  • Test cases included clipboard and file existence checks, region image comparisons, and object property verifications.
  • Analyzed test results in the SQA Log Viewer and with the text and object comparators.
  • Test Procedures and cases were developed in SQA-basic language.
  • Generated detailed reports in SQA Manager for developers and testing team.
  • Performed Business Functionality, User Interface and Performance tests.

Environment: Oracle, Visual Basic, Windows 95/NT, SQA Suite
