
SOA Operational Support & Performance Engineering Resume


Mason, OH

SUMMARY

  • 9+ years of IT experience as a Performance Test Engineer across domains such as insurance, finance, pharma, and healthcare.
  • Expertise in the HP LoadRunner/Performance Center tools for test development, execution, and analysis.
  • Expertise in using APM tools such as Dynatrace and AppDynamics to monitor test environments during test execution.
  • Detected performance defects in production and provided root cause analysis using Dynatrace AppMon.
  • Expertise in testing and monitoring web-based client/server systems on multiple platforms, including .NET, Java, and SQL.
  • Developed scripts to meet load-testing requirements according to the agreed-upon SLAs (Service Level Agreements).
  • Excellent working knowledge of developing and implementing complex test plans, test cases, and test scripts using automated test solutions for client/server and web-based applications.
  • Expert in developing workload models for performance testing using Splunk traces for production APIs.
  • Enhanced Vuser scripts in LoadRunner with parameterization, correlation, transaction points, and checkpoints.
  • Excellent knowledge of software development and testing life cycles, SQA methodology and test process documentation.
  • Knowledge of developing and executing scripts using Selenium WebDriver and the TestNG framework.
  • Experience in preparing test data by retrieving data from relational databases (Oracle, MS SQL Server).
  • Set up load/performance testing environments using LoadRunner.
  • Experienced in different types of performance testing, such as load, stair-step, stress, and endurance tests, depending on project requirements.
  • Worked on analyzing throughput, hits per second, network delays, latency, and capacity measurements, as well as reliability tests, on multi-layer architectures.
  • Experience using network proxy tools such as Fiddler and Wireshark.
  • Experience using Grafana to monitor message flow from producer to consumer through Kafka and the health of Kafka in production.
  • Worked extensively on Kafka to set up topic names, the number of partitions required, the replication factor, and heartbeat settings.
  • Developed Vuser scripts using the Web (HTTP/HTML), Ajax TruClient, and Web Services protocols.
  • Expert in finding performance bottlenecks on both the client side and the server side and making recommendations for performance profiling and tuning.
  • Expert in assessing application performance and scalability against critical parameters such as number of users, response times, hits per second (HPS), and throughput using LoadRunner.
  • Experience working with JMeter for load/performance testing.
  • Proven ability to identify network bottlenecks using Vuser graphs.
  • Expert in analyzing results using the HP LoadRunner Analysis tool; analyzed Oracle database connections, sessions, and log files.
  • Added performance measurements for Oracle, WebLogic, and IIS in LoadRunner Performance Center.
  • Performed backend testing by creating SQL queries in Oracle and SQL Server.
  • Comfortable with various industry-leading operating systems (Windows NT/98/2000/XP/Vista/7/10, Windows Server 2012, and UNIX).
  • Experience in Installation and Configuration of Software and Hardware in testing environment.
  • Excellent interpersonal skills; a self-starter with strong communication, presentation, problem-solving, analytical, and leadership skills.
  • Experience in coordinating on-shore and off-shore resources.
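
As an illustration of the workload-modeling experience above (deriving targets from production traffic and estimating Vusers via Little's Law, N = X × (R + Z)), here is a minimal sketch; all API names, volumes, and timings are hypothetical, not taken from any engagement:

```python
import math

def workload_model(hourly_volume, avg_response_s, think_time_s=0.0):
    """Convert a peak-hour call volume into a target TPS and an
    estimated Vuser count via Little's Law: N = X * (R + Z)."""
    tps = hourly_volume / 3600.0                  # target transactions per second
    vusers = tps * (avg_response_s + think_time_s)
    return round(tps, 2), math.ceil(vusers)

# Hypothetical peak-hour counts, e.g. pulled from Splunk traces.
production_peaks = {
    "getMember":    36000,
    "searchClaims":  9000,
}

for api, volume in production_peaks.items():
    tps, vusers = workload_model(volume, avg_response_s=0.5, think_time_s=2.0)
    print(f"{api}: target {tps} TPS, ~{vusers} Vusers")
```

The same arithmetic also works in reverse for sanity-checking a finished scenario: multiply Vusers by iterations per hour and compare against the production volume.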

TECHNICAL SKILLS

Testing Tools: HP LoadRunner, JMeter, HP Quality Center, Performance Center, HP ALM, Dynatrace, AppDynamics, Fiddler, Wireshark, SoapUI, Postman, Grafana, Apache Kafka

Languages: Java, J2EE, SQL, PL/SQL, C, C++

Build Tools: Ant, Maven, Jenkins, Hudson

Database: MySQL, DB2, Oracle, SQL Server, Sybase and MongoDB

Web Technologies: HTML, VBScript, JavaScript

Environment: Windows 95/98/NT/2000/XP/7/10, Windows Server 2003/2012, UNIX

Communication: MS Outlook 2016, MS Office, and Skype for Business

Others: Notepad++, TestNG, SQL Developer, MobaXterm, Putty

PL/SQL: Oracle PL/SQL

Browsers: IE 8/9/10/11, Firefox, Chrome, Safari

PROFESSIONAL EXPERIENCE

Confidential, Mason, OH

SOA Operational Support & Performance Engineering

Responsibilities:

  • Involved in gathering nonfunctional requirements (NFRs) for API-based applications across Confidential, applicable to the Member domain as well as other domains (Benefits, Claims, etc.) as needed.
  • Hands-on experience creating custom scripts for both REST and SOAP APIs using Micro Focus LoadRunner 12.53.
  • Discussed and developed the approach for creating test data for the planned performance tests.
  • Actively engaged in the SPE intake process with individual scrum teams and logged defects in Jira for every release.
  • Gathered production logs from the Splunk data analytics tool and designed/developed system load models to simulate production behavior in the lower environments.
  • Executed sanity tests using Postman to validate the scripts and the test environment.
  • Responsible for performing Docker container-level performance tests, including feasibility analysis, test environment setup, performance scripting, performance execution, and tuning/analysis.
  • Monitored the Docker environment for CPU, memory, and I/O usage using UCP and CA Wily.
  • Analyzed Docker container behavior and proposed the appropriate Java options (ADD OPTS).
  • Performance tested multiple producer and consumer APIs connected to a Kafka cluster with multiple brokers.
  • Good knowledge of partitioning Kafka messages and setting up replication factors in a Kafka cluster.
  • Used Grafana to analyze the load on Kafka from multiple producers; verified that the messages sent by producers matched those received by consumers by validating the updated count in the database.
  • Used Grafana to monitor targeted topic metrics (messages in per second, bytes in, bytes out) and validated total messages produced at the producer end against total messages updated at the consumer database end.
  • Collected heap dumps and thread dumps from the servers and analyzed them using IBM thread and heap analysis tools.
  • Analyzed heap issues in both production and test environments and provided the necessary JVM options.
  • Raised appropriate flags with product owners based on performance test results, highlighting possible risks.
  • Ran tests until all performance issues were resolved and provided a signoff summary report to all stakeholders and the DevOps team.
  • Worked with support teams to analyze severe alerts and provided appropriate recommendations/fixes to resolve issues.
  • Proactively monitored production systems and took appropriate actions to avoid failures.
  • Hands-on experience with OEM to analyze top activity, monitor long-running SQL queries, and review AWR reports during/after test execution; also monitored DB parameters (CPU, memory, disk, and deadlocks).
  • Experienced in DB tuning of backend systems, with the help of DBAs, for APIs engaged in performance testing.
  • Hands-on experience writing SQL queries in SQL Developer and Toad to pull the test data required to support performance testing.
  • Strong knowledge of Jira and the Agile development process for Docker, APIGEE, and microservices architecture.
  • Communicated and coordinated with an offshore team, guiding them in understanding the test requirements and test executions.
  • Involved in daily/weekly status meetings.
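
The Kafka producer/consumer validation described above can be sketched as a simple count reconciliation: totals produced per topic checked against rows landed in the consumer-side database. Topic names and counts here are hypothetical:

```python
def reconcile(produced_counts, consumed_counts):
    """Return (topic, produced, consumed) for every topic where the
    consumer-side count does not match the producer-side count."""
    mismatches = []
    for topic, produced in produced_counts.items():
        consumed = consumed_counts.get(topic, 0)   # missing topic counts as 0
        if consumed != produced:
            mismatches.append((topic, produced, consumed))
    return mismatches

# Hypothetical totals, e.g. from Grafana panels and a consumer DB query.
produced = {"member-updates": 120000, "claims-events": 45000}
consumed = {"member-updates": 120000, "claims-events": 44987}

for topic, p, c in reconcile(produced, consumed):
    print(f"{topic}: produced {p}, consumed {c} ({p - c} missing)")
```

In practice the produced side would come from broker/topic metrics and the consumed side from a `SELECT COUNT(*)` against the target table for the test window.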

Environment: MS Office, SoapUI, CA Wily Introscope, Postman, HP ALM, LoadRunner 12.53, SQL Developer, PuTTY, Splunk, AppDynamics, Oracle Enterprise Manager, Eclipse Memory Analyzer

Confidential, Northbrook, IL

Performance Engineer Test Lead

Responsibilities:

  • Involved in requirement-gathering sessions with project stakeholders, business analysts, and developers.
  • Tested multiple infrastructure projects successfully.
  • Prepared a test approach document covering workload, business-critical scenarios, SLAs, and performance counters, providing a detailed list of conditions under which the system would be tested.
  • Performed scalability exercises to compare production and performance infrastructure for the AUT.
  • Developed complex scripts using the web HTTP/HTML, Web Services, and TruClient protocols, with the necessary enhancements in C.
  • Validated web services using SoapUI, then converted them to LoadRunner scripts using the Web Services protocol.
  • Initiated the load testing process for new microservices with REST APIs.
  • Parameterized the large, complex test data required for multiple test executions to accurately reflect production trends.
  • Created scenarios in Performance Center with the scripts developed for the identified business scenarios.
  • Developed an automated process to trigger tests, send automated email alerts, and deliver results to the client by configuring Jenkins and GitHub with ALM Performance Center.
  • Derived the type of test to be performed based on project requirements, such as load, stair-step, and endurance tests.
  • Extensively used Dynatrace for monitoring the application to debug client side and server-side bottlenecks.
  • Worked closely with application team to troubleshoot issues and identify root cause of failures to fix the bottlenecks and fine tune the application until it meets the performance and business acceptance criteria.
  • Analyzed throughput, hits per second, transactions per second, and rendezvous graphs using the LoadRunner Analysis tool.
  • Prepared release reports with all artifacts and recommendations for stakeholders to improve application performance.
  • Presented results on the various testing activities performed throughout the release and the improvements in transaction response times after fixing performance-related issues.
  • Held weekly meetings with the production support team regarding performance-degrading transactions in production.
  • Documented defects and tracked them to resolution using defect management tools such as JIRA and Quality Center.
  • Analyzed cross results, cross scenarios, overlay graphs, and merged graphs and reports to determine where performance delays occurred: network or client delays, CPU performance, I/O delays, database locking, or other issues at the database server.
  • Responsible for creating the system performance graph from the results of the incremental (user load) stress tests to determine the scalability of the system, the response time at the desired throughput, and the system break point (knee of the curve).
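
Locating the "knee of the curve" from incremental stress-test results can be sketched as finding the load level where adding users stops yielding a proportional throughput gain. The data points and threshold below are hypothetical:

```python
def find_knee(points, gain_threshold=0.1):
    """points: list of (vusers, throughput) sorted by vusers.
    Returns the first vuser level after which marginal throughput per
    added user falls below gain_threshold of the initial ratio."""
    base_rate = points[0][1] / points[0][0]        # throughput per user at low load
    for (u0, t0), (u1, t1) in zip(points, points[1:]):
        marginal = (t1 - t0) / (u1 - u0)           # extra TPS per extra user
        if marginal < gain_threshold * base_rate:
            return u0                              # last level before saturation
    return points[-1][0]                           # never saturated in this run

# Hypothetical (vusers, TPS) pairs from an incremental stress test.
results = [(50, 100), (100, 200), (200, 390), (400, 410), (800, 405)]
print(find_knee(results))  # prints 200
```

Past the knee, response times typically climb while throughput flattens or degrades, which is why the signoff report reads the two graphs together.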

Environment: LoadRunner 12.53, Performance Center 12.53, Dynatrace 7.2, Windows 2012, Oracle 12c

Confidential, Scottsdale, AZ

Performance Test Engineer

Responsibilities:

  • Assisted the team lead in the preparation of the Test Plan and Test Strategy documents.
  • Developed scripts using a number of protocols, including web HTTP/HTML, Oracle, Web Services, and Ajax.
  • Executed scripts using JMeter and SoapUI to perform web services testing.
  • Worked with JMeter to simulate load on the servers and check performance at different load levels.
  • Responsible for scheduling load tests using HP Performance Center involving a variety of load scenario combinations.
  • Responsible for coordinating with the various other teams involved to monitor load testing activity.
  • Responsible for generating load test results and publishing them to the NAS drive.
  • Analyzed performance bottlenecks using Dynatrace and HP SiteScope.
  • Analyzed server resources and interacted with developers to identify memory leaks, fix bugs, and optimize server settings at the web and app levels.
  • Worked closely with the development team, logging defects in Quality Center with screenshots and classifying them by severity.
  • Responsible for monitoring Oracle database performance (indexes, sessions, connections, poorly written SQL queries, and deadlocks) for each component of the application.
  • Analyzed, interpreted, and summarized meaningful and relevant results in a complete performance test report.
  • Participated in daily standup meetings with management to report day-to-day activities and updates.

Environment: LoadRunner 11.5, JMeter 2.10, SoapUI, Oracle 10g/11i, Quality Center 11, HP ALM, Agile, Performance Center, Dynatrace 7.1, HP SiteScope.

Confidential

Performance Tester

Responsibilities:

  • Attended periodic meetings on changes to application requirements to document and implement test change procedures.
  • Developed test scripts using LoadRunner to automate regression testing and verify the web application's expected behavior at different stages by inserting various checkpoints.
  • Validated the scripts to ensure they matched the requirements.
  • Developed a workload model to generate the required volume by adjusting pacing between iterations to achieve the desired transactions per hour.
  • Conducted load, stress, and performance testing using LoadRunner.
  • Extensively used the LoadRunner web HTTP/HTML protocol for scripting and developed complex logic to customize scripts according to requirements.
  • Successfully completed User Acceptance Testing (UAT) on each release of the project based on end-user requirements.
  • Determined the source of bottlenecks by correlating performance data with end-user loads and response times.
  • Used Performance Monitor to analyze % CPU usage and memory usage for each scenario.
  • Analyzed Mercury LoadRunner online graphs and reports to determine where performance delays occurred: network or client delays, CPU performance, I/O delays, database locking, or other issues at the database server.
  • Used test results to provide summary reports, response times, and monitored averages.
  • Coordinated with onshore to give updates on scripting and test executions.
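
The pacing adjustment mentioned above follows directly from the target volume: given a desired transactions-per-hour figure and a Vuser count, the interval between iteration starts is fixed arithmetic. The numbers below are illustrative:

```python
def pacing_seconds(target_tph, vusers, iterations_per_pass=1):
    """Seconds each Vuser should wait between iteration starts so that
    vusers * (3600 / pacing) * iterations_per_pass == target_tph."""
    per_vuser_tph = target_tph / vusers            # load each Vuser must carry
    return 3600.0 * iterations_per_pass / per_vuser_tph

# e.g. 7,200 transactions/hour spread over 100 Vusers
print(pacing_seconds(target_tph=7200, vusers=100))  # prints 50.0
```

The computed value would then be entered as a fixed "start new iteration every N seconds" pacing in the script's runtime settings; the iteration duration must stay below N or the achieved rate falls short of the target.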

Environment: LoadRunner 9.1/11, Quality Center 9.2, Windows XP Professional, SQL Server 2008, IE 9.
