
Performance Engineer Resume

Clinton, New Jersey

PROFESSIONAL SUMMARY:

  • 10+ years of diversified IT experience in performance testing and engineering, and manual and automated testing of client/server, mobile and web-based applications.
  • Extensive experience in performance testing and engineering across Banking, Health Insurance, Retail, Utilities & Energy, and recently the Pharma and Agribusiness industries.
  • Exposure to all stages and methodologies of the SDLC, including Agile/Scrum, the V-Model and the waterfall model.
  • Experienced in testing a wide array of applications developed with .NET and J2EE technologies.
  • Involved in gathering performance requirements and developing the Performance Test Plan (PTP).
  • Developed and executed test plans, test cases and test strategies.
  • Implemented workload models to size application and project demand for the resources required to meet business needs.
  • Proficient in performance, load, stress, concurrency, endurance, functional and regression testing of client/server and web-based applications using tools such as LoadRunner, Performance Center, JMeter and HP ALM.
  • Extensive knowledge of creating performance test scripts using LoadRunner VuGen and NeoLoad.
  • Hands-on experience in LoadRunner scripting with the Web (HTTP/HTML), Oracle NCA, SAP GUI, SAP-Web, Citrix ICA, Web Services and Ajax TruClient protocols.
  • Extensively used automatic and manual correlation, parameterization and content-check features.
  • Responsible for analyzing throughput, hits per second, transactions per second and rendezvous graphs using the LoadRunner Analysis tool.
  • Used Dynatrace for application performance monitoring; created dashlets and PurePaths to monitor client-side and server-side metrics.
  • Analyzed CPU utilization, memory usage, garbage collection and DB connections to verify application performance using monitoring tools such as AppDynamics, Wily Introscope and Perfmon.
  • Experience working with the AWS, Azure and Salesforce clouds for performance testing.
  • Experience with Jenkins for CI/CD implementation; worked with the DevOps team to create pipelines for performance testing.
  • Monitored database for deadlocks, sessions, connections, long running SQL queries.
  • Expertise in analyzing, interpreting and summarizing meaningful and relevant results in a complete Performance Test Report.
  • Hands-on experience using JMeter to develop user-interface scenarios along with microservices, using the HTTP protocol and web services.
  • Experience working with JProfiler and ANTS Profiler to identify time spent in various methods and DB calls.
  • Experience working with service virtualization tools such as CA LISA and Parasoft to virtualize/stub upstream and downstream services across applications.
  • Good exposure to project management tools like ALM and JIRA.
  • Experience using tools like TFS and Subversion.
  • Excellent ability to understand complex scenarios and business problems and transfer the knowledge to other users/developers in the most comprehensible manner.
  • Experience coordinating onshore and offshore resources, mentoring team members, identifying and managing risks, estimating work, sharing resources between projects, tracking project progress and delivering projects on deadline.
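The workload modeling mentioned above can be sketched with Little's Law, which relates throughput, response time and concurrency. The TPS target and timings below are hypothetical placeholders, not figures from any specific engagement.

```python
# Sketch of workload-model sizing with Little's Law:
#   concurrent users = transactions/sec * (response time + think time)
# All numbers below are hypothetical examples.
import math

def required_vusers(tps: float, resp_time_s: float, think_time_s: float) -> int:
    """Virtual users needed to sustain `tps` transactions per second."""
    return math.ceil(tps * (resp_time_s + think_time_s))

# Example: 50 TPS target, 2 s average response time, 8 s think time
print(required_vusers(50, 2.0, 8.0))  # 500
```

The same arithmetic works in reverse for validating whether a scripted scenario's virtual-user count and pacing can physically reach the target throughput.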

WORK EXPERIENCE:

Performance Engineer

Confidential, Clinton, New Jersey

Responsibilities:

  • Worked as a performance test lead and onsite coordinator on this project; involved in performance-testing effort planning, understanding requirements, resource management, scheduling meetings and coordinating performance-testing activities between the client and offshore resources. Handled multiple projects.
  • Collaboratively worked with quality assurance team members, developers, infrastructure teams to perform Performance testing.
  • Gathered performance business requirements from the project stakeholders.
  • Identified the performance-critical business scenarios to be tested with the business and functional teams, based on SLAs and benchmarks.
  • Prepared the performance test plan and workload profile by analyzing existing production system statistics, and baselined the performance test load profile and SLAs.
  • Understood the application architecture and critical business middleware services.
  • Developed Vuser scripts for several different protocols such as Web (HTTP/HTML), Web Services and Ajax TruClient.
  • Enhanced scripts with correlation, parameterization and LR functions and user defined functions.
  • Created and implemented test scenarios with HP ALM Performance Center 12.60 for dry-run, baseline, benchmark, load and break tests.
  • Maintained repository for scripts, scenarios, test data, documentation, results, detailed analysis and reports.
  • Hands-on experience using Dynatrace for application performance monitoring; created dashlets and PurePaths in Dynatrace to monitor client- and server-side metrics.
  • Experience working with Splunk log analysis to monitor application server and DB log files; created regex extractions in Splunk to parse the log files.
  • Experience working on the AWS cloud for application performance testing; used CodeBuild and Docker images for auto-scaling of VMs.
  • Used CloudWatch to obtain metrics for performance counters such as CPU, memory, disk I/O, network utilization and deadlocks.
  • Validated performance parameters such as CPU and memory utilization for the application server and DB server.
  • Worked with the Dev and DBA teams to resolve performance bottlenecks related to code and long-running SQL queries.
  • Experience working with Jenkins CI/CD pipelines used for performance testing.
  • Used JMeter to develop user-interface and microservices scenarios.
  • Used JIRA to create user stories, sub-tasks and enablers for various sprints.
  • Used ALM to raise various performance testing defects.
  • Created performance analysis reports for certification and sign-off of major releases.
  • Measured response times for the application's critical business transactions/NFRs.
  • Delivered detailed performance test analysis reports and recommendations, and shared them with project stakeholders.
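The per-transaction analysis behind a performance test report can be sketched as below: average and 90th-percentile response times, flagged against an SLA. The sample timings and the 3-second SLA are hypothetical examples, not data from this project.

```python
# Sketch of response-time analysis for a performance test report:
# average and nearest-rank 90th percentile per transaction, with an
# SLA pass/fail flag. Sample data and the 3 s SLA are hypothetical.
import math
import statistics

def percentile(samples, pct):
    """Nearest-rank percentile of a list of response times (seconds)."""
    ordered = sorted(samples)
    rank = math.ceil(pct / 100 * len(ordered))
    return ordered[max(rank, 1) - 1]

def summarize(times_by_txn, sla_s=3.0):
    """Build a per-transaction summary with an SLA pass/fail flag."""
    report = {}
    for txn, times in times_by_txn.items():
        p90 = percentile(times, 90)
        report[txn] = {
            "avg": round(statistics.mean(times), 2),
            "p90": p90,
            "sla_pass": p90 <= sla_s,
        }
    return report

samples = {
    "Login":    [0.8, 1.1, 0.9, 2.5, 1.0],
    "Checkout": [2.9, 3.4, 3.1, 2.8, 3.6],
}
print(summarize(samples))
```

Reporting the 90th percentile rather than only the mean is standard practice, since averages hide the slow tail that users actually notice.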

Environment: LoadRunner 12.60, ALM, AWS, Jenkins, Dynatrace, Splunk, CloudWatch, JMeter, Windows, .NET 4.5.1, SQL Server 2008, SSRS, SSIS, XML, Oracle 11g, MS IIS 7.5, jQuery, PL/SQL, Web Services, HttpWatch, PuTTY.

Performance Engineer

Confidential, Jersey City, New Jersey

Responsibilities:

  • Analyzed business requirements, created scenarios/conditions and test plans from the documents and finally created detailed test cases that would completely satisfy the objectives set in software certification process.
  • Developed detailed and overall test plans using the business specifications.
  • Reviewed specified business documents, developed test cases in HP ALM based on use cases and requirements, and executed test scripts to verify actual results against expected results.
  • Wrote test cases and performed manual testing (GUI, functionality, system, integration and regression), including positive/negative testing for system validations.
  • Enhanced scripts with correlation, parameterization and LR functions and user defined functions.
  • Created and implemented test scenarios with HP ALM Performance Center for dry-run, baseline, benchmark, load and break tests.
  • Conducted Regression, Integration, Functional Testing, Performance Testing.
  • Used UFT GUI virtual user scripts to generate load and executed UFT scripts on virtual machines from Performance Center.
  • Experienced in executing modularized, reusable automated scripts using UFT and VBScript.
  • Instrumented Windows and UNIX resource monitors, database resource monitors and web resource monitors such as MS IIS monitors, and used graphs such as average transaction response time and error statistics to monitor the system under test.
  • Maintained repository for scripts, scenarios, test data, documentation, results, detailed analysis and reports.
  • Designed appropriately sized environments to generate load against the application under test.
  • Generated detailed summary reports and merged, filtered and correlated graphs to represent the apparent bottlenecks in the application.
  • Worked with network administrators, database administrators and integration engineers to identify problems in the application/server architecture and functionality.
  • Using LoadRunner analysis data, identified risks and suggested solutions; tuned the application/server architecture for optimal system performance.
  • Delivered detailed performance test analysis reports and a recommendations report based on system performance and diagnosis.
  • Tracked defects and prepared status summary reports with details of executed, passed and failed test cases, and reported defects through HP ALM.
  • Involved in peer Reviews and Team Walkthroughs for the project as per Test methodology.
  • Responsible for keeping the test schedule working directly with software engineers to ensure clear communications on requirements and defect reports.
  • Participated actively in Requirements and Design Reviews.
  • Implemented the whole QA life cycle, from planning and creating tests to executing them and reporting and tracking defects in HP ALM.

Environment: LoadRunner 11.52, UFT 12.01, HP ALM, VBScript, C, .NET, AppDynamics, JIRA/Confluence.

Performance Test Lead

Confidential, St. Louis, MO

Responsibilities:

  • Identified the Performance critical business scenarios that need to be tested.
  • Recommended and described the performance test strategies to be applied.
  • Enhanced scripts with correlation, parameterization and LR functions and user defined functions.
  • Created and implemented test scenarios with a controller for dry-run, baseline, benchmark, load and break tests.
  • Ramped load in increments of 10 virtual users, starting from 5 virtual users (20 iterations) up to 250 virtual users, until CPU utilization reached 100%.
  • Used AppDynamics to monitor systems under expected load, including server CPU and memory.
  • Involved in performance tuning of class objects, paging issues and database queries.
  • Investigated a garbage collection overhead issue and resolved it with appropriate recommendations.
  • Maintained repository for scripts, scenarios, test data, documentation, results, detailed analysis and reports.
  • Validated performance parameters such as CPU and memory utilization for the application server and DB server.
  • Measured response times for the application's critical business transactions/NFRs.
  • Submitted weekly reports to the QA Manager.
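The stepped ramp-up described above can be sketched as a simple schedule generator. The start, increment and ceiling mirror the figures in the bullet; step duration would be a scenario setting and is omitted here.

```python
# Sketch of the stepped ramp-up profile described above: start at 5
# virtual users and add 10 per step up to the 250-user ceiling. (From 5
# in steps of 10, the highest level under the ceiling is 245.)

def ramp_steps(start=5, increment=10, ceiling=250):
    """Return the virtual-user level for each step of the ramp-up."""
    levels = []
    vusers = start
    while vusers <= ceiling:
        levels.append(vusers)
        vusers += increment
    return levels

steps = ramp_steps()
print(steps[:3], steps[-1])  # [5, 15, 25] 245
```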

Environment: LoadRunner 11.00, HP ALM Performance Center 11.5, AppDynamics, Java/J2EE, WAS 7.0, Oracle 10g, Apache Tomcat 5.5, WebLogic 10.0, SiteScope, TIBCO BC/BW, SQL Server 2008/2005.

Performance Tester

Confidential, Chicago, IL

Responsibilities:

  • Understood the business process and was involved in preparing the performance test plan used for load testing.
  • Performance tested applications using HP LoadRunner and HP ALM Performance Center.
  • Developed test scripts after the application was deployed in the testing environment.
  • Responsible for developing Vuser scripts for several different protocols such as Web (HTTP/HTML), Web Services and Ajax TruClient.
  • Recorded and enhanced Vuser scripts by inserting transaction points and rendezvous points into the scripts in LoadRunner to create intense user load on the server and measure server performance under load.
  • Worked on the Salesforce cloud for performance testing.
  • Monitored graphs such as transaction response time, and analyzed server performance status, hits per second, throughput, Windows resources and database server resources.
  • Extensively used LoadRunner to design and execute load and performance tests and stress tests.
  • Used HP ALM Performance Center 11.5 to execute load tests and stress tests.
  • Designed test scenarios and set up the monitoring agents.
  • Executed and monitored the tests using monitoring tools like AppDynamics.
  • Executed Warm up Tests, Baseline Tests, and Regression Tests for each assigned application.
  • Created test reports, and maintained and disseminated documentation as required.
  • Tested critical bug fixes and coordinated with developers to release fixes on tight timelines.
  • Participated in defect conference calls to discuss issues during project development.
  • Worked on the latest versions of LoadRunner 11.5 and NeoLoad 5.0.
  • Worked closely with developers to debug issues and understand the functionality and flow of the application.

Environment: HP LoadRunner 11.00, HP ALM Performance Center 11.00, NeoLoad 5.0.

Performance Engineer

Confidential, Augusta, ME

Responsibilities:

  • Responsible for analyzing application and component behavior under heavier loads and optimizing server configurations.
  • Developed data-driven test automation scripts and extensively used HP ALM Performance Center to design and execute load, performance and stress tests.
  • Designed and executed weekly load BVTs as well as performance tests focused on application performance, availability and capacity, with the goal of ensuring an optimal user experience.
  • Used the Web HTTP/HTML protocol in LoadRunner to capture web-based applications.
  • Analyzed test results to identify trends in performance and reliability of web applications.
  • Used Manual and Automated Correlation to Parameterize Dynamically changing Parameters.
  • Developed Vuser scripts and enhanced basic scripts by parameterizing constant values using LoadRunner.
  • Created and maintained test scripts and test cases for performance/stress testing based on high-level functional documents, using Visual Studio Team System (VSTS) 2008 and Team Foundation Server (TFS).
  • Extensively used Quality Center for test planning, maintaining test cases and test scripts for test execution, and bug reporting.
  • Performed problem-solving and root-cause analysis for system functionality and testing challenges using the LoadRunner Analysis tool.
  • Developed daily status reports and published them to the Development, Configuration, DBA and Network teams.
  • Worked closely with developers to debug issues and understand the functionality and flow of the application.
  • Mentored QA team members and trained the QA team on the LoadRunner Analysis tool and HP Performance Center, documenting results in Team Foundation Server (TFS).
  • Created different scenarios to simulate baseline and breakpoint tests with mixed test scripts.
  • Performed correlation and parameterization on scripts to ensure they ran successfully during replay; monitored activities through the LoadRunner Controller.
  • Responsible for the development of Vuser scripts for several different protocols such as Web (HTTP/HTML), Web Services and SoapUI.
  • Recorded and enhanced Vuser scripts by inserting transaction points and rendezvous points into the scripts in LoadRunner to create intense user load on the server and measure server performance under load.
  • Developed test scenarios and performed test runs using HP Performance Center 11.0.
  • Used the Scheduler to schedule scenarios for user ramp-up/ramp-down in LoadRunner.
  • Worked on JMeter; installed JMeter and the additional JAR files needed that are not included in the base JMeter installation.
  • Used LoadRunner Analysis to create graphs and reports from load test results, correlating system information to identify both bottlenecks and performance issues.
  • Involved in individually running each workflow for single-user, throughput and scalability tests, which were then tuned for optimal performance.
  • Analyzed software and hardware components using LoadRunner Analysis graphs.
  • Interacted with developers, business analysts and other stakeholders to resolve defects and set their priorities.
  • Responsible for test summary reports and defect reports.
  • Prepared load test analysis reports covering CPU utilization, throughput, response times, web server monitor counters, system resource performance counters and database performance counters.
  • Developed and implemented load and stress tests with Mercury LoadRunner, presented performance statistics to application teams, and provided recommendations on how and where performance could be improved.
  • Extensively worked with all the HIPAA EDI transactions.

Environment: ALM Performance Center 11, HP LoadRunner, Java, WebSphere, JMeter.
