
QA Performance Lead Resume


Chicago, IL

SUMMARY

  • Over 8 years of experience in Quality Assurance (manual/automated) testing, with expertise in requirements gathering, analysis, design, and testing of web-based, client/server, and mainframe applications, including thorough test execution and analytical reporting of results to stakeholders.
  • Experience in LoadRunner, JMeter, Performance Center, BlazeMeter, and SOASTA CloudTest: creating scripts, monitoring runtime transactions, and analyzing test results.
  • Very good knowledge of Core Java, including method and constructor overloading and overriding, and OOP concepts.
  • Experience in both Agile and Waterfall Software Development methodologies.
  • Involved in scripting, test design, test development, and implementation of test procedures.
  • Extensive experience with the LoadRunner tool for monitoring and testing of web-based as well as client/server systems on multiple platforms such as .NET, Java, and SQL.
  • Developed scripts to meet load-testing requirements according to the agreed-upon SLA (Service Level Agreement).
  • In-depth knowledge of the most common protocols (Web, Web Services), queue testing, database testing, and cloud testing.
  • Worked on various LoadRunner protocols such as Web (HTTP/HTML), TruClient-Web, Web Services, SAP-Web, and Java.
  • Excellent working knowledge in developing and implementing complex test plans, test cases, and test scripts using automated test solutions for client/server and web-based applications.
  • Generated web, database, and client/server Vuser scripts; efficient in debugging and fixing script errors using VuGen.
  • Good experience applying think time, pacing, parameterization, correlation, and assertions using JMeter.
  • Experience working closely with the DevOps team to set up CI/CD pipelines and perform EMS, MQ, and AMQ performance testing.
  • Very strong in custom coding, error handling, file handling, etc.
  • Good experience working with monitoring tools such as HP SiteScope, HP Diagnostics, Dynatrace, Foglight, Perfmon, AppDynamics, Grafana, and Datadog.
  • Good experience working on mobile/device applications; scripts were created by connecting the mobile device to the system's Fiddler proxy.
  • Experienced and knowledgeable in different types of load testing, stress testing, volume testing, endurance testing, regression and user acceptance testing.
  • Expert in finding performance bottlenecks on both the client side and the server side.
  • Analyzed CPU utilization, memory usage, garbage collection, and DB connections to verify application performance.
  • Developed a VuGen script-formatting tool using the Python programming language.
  • Developed a WebLogic log parser using the Python programming language.
  • Good knowledge of the JVM, garbage collection, heap dumps, and thread dumps on Windows and Linux servers using JVisualVM and jmap, respectively.
  • Extensive experience in Unit, Functional, Integration, Regression, User Acceptance, System Level Load and Stress Testing.
  • Good knowledge of Linux commands to stop and start services, change property files for services, and delete log files after performance tests using PuTTY and WinSCP.
  • Experience in database testing using SQL stored procedures.
  • Good experience working in Cloud infrastructure.
  • Good knowledge of working in AWS environments with EKS (Kubernetes) and MSK clusters.
  • Good knowledge of how Docker and containers work and how auto scaling should be tested.
  • Good experience in manual and automation of Web Services (SOAP/REST) testing with SOAPUI, WSDL, XML, JSON.
  • Experience understanding code written in different programming languages (Java, JavaScript, VBScript, .NET, SQL) and databases such as Oracle and SQL Server.
  • Expert in analyzing results using the HP LoadRunner Analysis tool; analyzed Oracle database connections, sessions, and log files.
  • Proficient in manual testing as well as automated testing for client/server, Internet, Intranet and web applications using automated tools.
  • Proficient in the use of defect tracking system and analyzing test results.
  • Experience in analyzing Business and Development Specifications, Architecture, Use Cases, and Detail Design to develop test requirements, procedures and test cases.
  • Excellent analytical skills with good communication and self-organizing skills, assertive and a committed team player.
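The WebLogic log parser mentioned in the summary can be sketched roughly as below. This is a minimal illustration, not the actual tool: it assumes the common WebLogic server-log layout in which each record starts with `####` followed by angle-bracketed fields (timestamp, severity, subsystem, ...); field order and count vary by WebLogic version.

```python
import re

# Each WebLogic server-log record header looks like (assumed format):
#   ####<timestamp> <severity> <subsystem> <machine> <server> <message>
FIELD_RE = re.compile(r"<([^>]*)>")

def parse_record(line):
    """Return a dict of the leading fields of a WebLogic log record,
    or None for continuation lines (stack traces etc.)."""
    if not line.startswith("####"):
        return None
    fields = FIELD_RE.findall(line)
    if len(fields) < 3:
        return None
    return {"timestamp": fields[0], "severity": fields[1],
            "subsystem": fields[2], "rest": fields[3:]}

def count_by_severity(lines):
    """Tally record counts per severity level (Error, Notice, ...)."""
    counts = {}
    for line in lines:
        rec = parse_record(line)
        if rec:
            counts[rec["severity"]] = counts.get(rec["severity"], 0) + 1
    return counts

sample = [
    "####<Jun 4, 2019 10:15:32 AM EDT> <Error> <HTTP> <host1> <ms1> <request failed>",
    "####<Jun 4, 2019 10:15:33 AM EDT> <Notice> <WebLogicServer> <host1> <ms1> <started>",
    "stack trace continuation line (no #### prefix, skipped)",
]
print(count_by_severity(sample))  # {'Error': 1, 'Notice': 1}
```

In practice such a tally of `Error`/`Warning` records per test window is a quick first signal when triaging a load-test run.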

TECHNICAL SKILLS

Languages: Core Java, .NET, C, C++, HTML, XML, SQL.

Testing Tools: LoadRunner, JMeter, JUnit, ClearCase, SoapUI, HP ALM, Quality Center, HP Performance Center, HP SiteScope, PuTTY.

Cloud Testing Tool: HP StormRunner Load (cloud testing)

Monitoring Tools: AppDynamics, Dynatrace, HP Diagnostics, Datadog, Kibana, Splunk

Web/Application Servers: Tomcat, IIS, WebSphere, MQ Series, BEA WebLogic, Java Web Server, messaging.

Web Technologies: HTML, CSS, jQuery, VBScript, JavaScript, J2EE, JMS, JDBC, Swing, and Mainframe

Network Proxy Tools: Fiddler

Defect Tracking Tools: Quality Center, Performance Center, JIRA, TFS

Database: DB2, Oracle 8i/9i, MS SQL Server 2008/2012, MS Access

Operating Systems: Windows, macOS, Linux/UNIX, AIX.

Methodologies: Waterfall, Agile (SCRUM).

Browsers: IE 8/9/10/11, Firefox, Chrome, Safari, Edge.

PROFESSIONAL EXPERIENCE

Confidential - Chicago, IL

QA Performance Lead

Responsibilities:

  • Responsible for validating the performance of the Confidential UI application, which is accessed by thousands of customers across the world, as well as the performance of the Customer Technology Platform (CTP), the central common application layer of Confidential that interacts with almost all components of United’s IT infrastructure across multiple internal subsystems.
  • Discussed the approach for performance testing, outlined performance acceptance criteria in line with business requirements, and defined the strategy for carrying out all performance engineering activities for the application.
  • Involved in validating performance of newly developed React Pages for the user facing frontend UI application.
  • Working in AWS cloud environment with EKS and MSK clusters.
  • Scheduled load tests through TeamCity.
  • Responsible for creating test scripts in LoadRunner using various protocols, including Web HTTP/HTML, Web Services, Citrix, etc.
  • Responsible for developing automation scripts to validate Web Services, Web, Mobile applications, Queue Testing and Cloud Testing using HP ALM Performance Center and HP LoadRunner (12.55).
  • Responsible for performing Stress tests, Endurance tests and failover tests against the application using LoadRunner.
  • Planned and generated Vuser scripts with VuGen and enhanced them with correlation, parameterization and functions.
  • Expertise in developing scripts using the Fiddler proxy tool, which captures HTTP and HTTPS traffic.
  • Worked with JMeter to create scripts to push messages to MQ.
  • Experience examining pod resources and autoscaling in the Datadog monitoring tool, with good knowledge of containers, pods, nodes, and clusters in the AWS environment.
  • Performance tested the queue-processing rate of TIBCO EMS topics and queues to measure how fast email and message notifications are delivered to end users.
  • Used the Pincher/AppDynamics tools to monitor the queues (queue depth, in-total and out-total messages) and JConsole to monitor resource utilization on the consumer (message processor) machine.
  • Analyzed the Garbage Collection (GC), Heap Memory usage, CPU utilization using JConsole to identify the performance bottlenecks of application.
  • Worked with the application development teams to debug and resolve any defects/issues that arise out of the testing process using JIRA and TFS tools.
  • Participate in requirements gathering and analysis, test strategy documentation, design analysis using SharePoint, JIRA and HP ALM.
  • Executed load, stress, and endurance tests with LoadRunner, provided statistics to application teams, and made recommendations on issues that impact performance.
  • Compare load test results with Baseline results to determine system degradation and match with SLAs.
  • Used AppDynamics as APM tool to identify the issues, Memory and CPU usage for the load test and helped Project team by analyzing Heap behavior, throughputs and pauses in Garbage collections as well as tracking down memory leaks to improve the application stability.
  • Worked on performance testing of mobile applications.
  • Involved in Web as well as API level testing.
  • Analyzing Throughput Graph, Hits/Second graph, Transactions/second graph and Rendezvous graphs using LoadRunner Analysis tool.
  • Shared detailed analysis reports, including client-side and server-side metrics, with the respective stakeholders and reviewed the results with the project team to help identify bottlenecks before the GO/NO-GO decision and avoid production issues.
  • Involved in performance troubleshooting to drill down into application performance issues and inform the GO/NO-GO decision.
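The queue-processing-rate measurement described above boils down to simple arithmetic over the counters that tools like JConsole or AppDynamics expose. A minimal sketch, with illustrative numbers not tied to any specific TIBCO EMS setup:

```python
# Given periodic samples of a queue's cumulative "out total" counter,
# derive messages/second and estimate backlog drain time.

def processing_rate(samples):
    """samples: list of (elapsed_seconds, out_total) tuples.
    Returns average messages processed per second."""
    (t0, c0), (t1, c1) = samples[0], samples[-1]
    return (c1 - c0) / (t1 - t0)

def drain_time(queue_depth, rate):
    """Seconds needed to clear the current backlog at the measured rate."""
    return queue_depth / rate

# Example: out-total grew from 0 to 6,000 messages over 120 s of test.
rate = processing_rate([(0, 0), (60, 3100), (120, 6000)])
print(rate)                    # 50.0 messages/s
print(drain_time(1500, rate))  # 30.0 s to drain a 1,500-message backlog
```

Comparing the measured drain time against the notification-delivery SLA is what turns the raw queue-depth graph into a pass/fail result.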

Confidential

Senior QA Performance Engineer

Responsibilities:

  • Conducted meetings across all areas of the product organization to identify, prioritize, and mitigate risks to the responsiveness and scalability of our offerings.
  • Followed the Agile (Scrum) process; performance validation followed a “Work Done & Ready to Go” approach from release to release and, specifically, sprint by sprint.
  • Worked with developers to create Grafana dashboards useful to the development teams.
  • Organized status meetings with stakeholders for performance testing to ensure the processes and content of all performance testing artifacts are documented, maintained, and transitioned to the client teams per the client’s retention and transition policy.
  • Created Performance Test plans and Test Case Design Documents with the input from developers and functional testers.
  • Splunk: data onboarding, dashboard and alert creation, scripted inputs.
  • Used monitoring tools like Grafana, Performance Center, Wily Introscope, Dynatrace, HP Diagnostics, Splunk, App Dynamics to monitor and collect metrics on production and test servers. Also used for checking transaction response time for each query.
  • Integrated Performance Testing with various applications as well as within a cloud environment.
  • Used HP LoadRunner 12.55/12.53 and JMeter 3.0/3.1 for performance testing.
  • Extensively used the LoadRunner Virtual User Generator to script and customize the performance test harness with protocols such as Web (HTTP/HTML), Web Services, Siebel Web, and RTE.
  • Generated and associated different IP addresses to Virtual users to emulate real time scenarios for load balancing issues using IP Spoofing.
  • Executed Baseline, Load, Stress, and Endurance Testing using HP Performance Center.
  • After test execution, collaborated with the development, solution engineering, technical architecture, and release management teams in the client organization to analyze performance results, identify fixes for the findings, and effectively identify potential bottlenecks.
  • Reported various performance analysis graphs and reports collected from performance tools and discussed bottlenecks such as memory leaks, JVM heap, CPU utilization, network time, page refresh time, and page rendering time.
  • Worked on setting up monitoring for MuleSoft ESB and extracted the logs to Splunk for troubleshooting.
  • JVM performance tuning: GC and heap analysis, thread dumps, heap dumps, memory leaks, connection leaks, core dumps.
  • Application server, database, network, and WebLogic monitors were utilized during execution to identify bottlenecks, bandwidth problems, and infrastructure, scalability, and reliability benchmarks.
  • Configured and used Dynatrace for performance monitoring; troubleshot bottlenecks found during performance testing, analyzed response times, and profiled the application to locate performance issues.
  • Set up user profiles and configured and added application servers in Dynatrace.
  • Added headers to the script and monitored the script using the Dynatrace client.
  • Knowledge of Java Virtual Machine internals including class loading, threads, synchronization, and garbage collection.
  • Profiled slow-performing areas of the application and system resources, identifying bottlenecks and opportunities for performance improvement using the Wily Introscope tool.
  • Performed back-end testing to check data and application integrity by writing SQL queries.
  • Conducted application profiling and JVM tuning for all builds delivered per each agile sprint.
  • Reviewed and profiled Java application code across several dozen J2EE applications to optimize performance and eliminate risks to stability, capacity, and high availability.
  • Conducted application performance profiling at the method-call level and set priorities for code performance optimization.
  • Rooted out inefficient SQL calls and indexing issues for the DBA group.
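The GC and heap analysis above typically starts from the JVM's GC log. A minimal sketch of pause extraction, assuming classic HotSpot `-verbose:gc` lines that end in ", &lt;seconds&gt; secs]" (formats differ across JVM versions and collectors, so the regex is illustrative):

```python
import re

# Matches the trailing ", 0.0200000 secs]" of a classic HotSpot GC line.
PAUSE_RE = re.compile(r", ([0-9.]+) secs\]")

def gc_pauses(lines):
    """Return all stop-the-world pause durations (seconds) found."""
    return [float(m.group(1)) for line in lines
            for m in PAUSE_RE.finditer(line)]

def gc_overhead(pauses, wall_clock_secs):
    """Fraction of wall-clock time spent paused in GC."""
    return sum(pauses) / wall_clock_secs

log = [
    "[GC (Allocation Failure)  512M->128M(1024M), 0.0200000 secs]",
    "[Full GC (Ergonomics)  900M->300M(1024M), 0.1800000 secs]",
]
pauses = gc_pauses(log)
print(max(pauses))              # 0.18 -> the longest pause, often the SLA killer
print(gc_overhead(pauses, 100)) # ~0.002 -> about 0.2% of a 100 s window in GC
```

The two derived numbers, worst single pause and total GC overhead, are usually what gets reported back to developers when deciding whether heap sizing or collector tuning is needed.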

Confidential

QA Performance Tester

Responsibilities:

  • Prepared the test plan and test specifications based on the Functional Requirement Specifications and System Design Specifications.
  • Responsible for creating test scripts in LoadRunner using various protocols, including Web HTTP/HTML, Web Services, Citrix, etc.
  • Planned and generated Vuser scripts with VuGen and enhanced them with correlation, parameterization and functions.
  • Worked with Dynatrace to get response times at the server level.
  • Created a RESTful stub service to prevent requests from reaching production during performance tests.
  • Used Linux commands to stop and start services, change property files for services, and delete log files after performance tests using PuTTY and WinSCP.
  • Hands-on experience executing SQL queries.
  • Expert in developing workload models for performance testing.
  • Good experience in using JIRA.
  • Good experience using Graylog.
  • Worked on HDFS, Kafka, and ODS services.
  • Parameterized large and complex test data to accurately depict production trends.
  • Scheduled scenarios using the LoadRunner Controller and analyzed the results using the Analysis tool.
  • Monitoring the servers and logging the metrics using the monitoring tools.
  • Identified disk usage, CPU, and memory for web and database servers, and how the servers were being loaded, using SiteScope.
  • Worked in association with the DBAs in making sure that the databases are re-pointed to the original environments once we are done with the environment for the load test in question.
  • Created test cases based on the requirements and the test conditions in Mercury Quality Center and identified test data in order to match with requirements.
  • Executed SQL Queries for backend testing of the application to ensure business rules are enforced, and data integrity is maintained.
  • Performed usability and navigation testing of web pages and forms.
  • Analyzed cross results, cross scenarios, and overlay graphs, and merged different graphs.
  • Responsible for database rollback after load tests were completed.
  • Independently executed test scenarios and analyzed execution statistics by monitoring the online graphs.
  • Coordinated with Technical Teams to monitor Database Query, CPU Utilization and Memory.
  • Worked closely with the Development team in the performance tuning efforts of the various sub systems.
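The workload-model development mentioned above usually rests on Little's Law, which relates concurrency, throughput, and per-iteration time. A minimal sketch with illustrative target numbers:

```python
# Little's Law for load-test sizing: N = X * T, where N is concurrent
# Vusers, X is throughput (tx/s), and T is the time one iteration takes.
# All figures below are examples, not values from a real engagement.

def required_vusers(tx_per_hour, resp_secs, think_secs):
    """Vusers needed to hit a target transaction rate."""
    throughput = tx_per_hour / 3600.0          # tx/s
    return throughput * (resp_secs + think_secs)

def pacing_for(vusers, tx_per_hour):
    """Iteration pacing (seconds) so that `vusers` users together
    generate `tx_per_hour` transactions."""
    return vusers * 3600.0 / tx_per_hour

# Target: 18,000 tx/hour with 2 s response time and 8 s think time.
print(required_vusers(18000, 2, 8))  # 50.0 Vusers
print(pacing_for(50, 18000))         # 10.0 s pacing per iteration
```

Pacing set this way keeps the arrival rate fixed even when response times fluctuate, which is why it is generally preferred over pure think time for rate-driven scenarios.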

Confidential

QA Performance Tester

Responsibilities:

  • Participated in all phases of the Software Testing Life Cycle (STLC), including requirements review, test documentation, application testing, and defect reporting.
  • Reviewed the Test Basis, designed and documented Test Strategies, Test Plan, Test Cases and executed test cases.
  • Responsible for writing and maintaining Selenium WebDriver scripts for regression and functional testing using a data-driven framework.
  • Developed test code in Java language using Eclipse and TestNG framework.
  • Responsible for identifying test cases for manual and Automation with Selenium WebDriver for Smoke Test, Functional and Regression Tests.
  • Involved in testing the application utilizing the Agile methodology.
  • Extensively used Selenium (XPath and CSS locators) to test the web application.
  • Identified, Reported and Tracked Defects using Quality Center test management tool.
  • Logged defects using TFS and worked with the business analyst and developers to follow the defect cycle.
  • Performed Backend testing by checking the updated data in the Database using SQL queries.
  • Performed SOA / web services testing using SoapUI.
  • Performed testing on Web Services using WSDL and SOAPUI to check the communication between different services.
  • Added test cases using Groovy Script in SOAPUI tool to test the SOA architecture web services.
  • Used the Java language and the TestNG framework for scripting; integrated with the continuous integration tool Jenkins for running tests automatically.
  • Used Ant to build and run the Selenium automation framework; once a run completed, the framework sent the automation reports over email.
  • Involved in designing and building automation frameworks in support of continuous integration in a test-driven development (TDD) environment.
  • Developed automation test scripts from scratch using a data-driven automation framework in VBScript for regression tests with HP QTP.
  • Inserted object data verification checkpoints in the Quick Test Professional (QTP) automation testing tool.
  • Reported software defects in Jira and interacted with the developers to resolve technical issues.
  • Followed Agile testing methodology, participated in daily SCRUM meetings and testing each SPRINT deliverables.
  • Reviewed the defects using HP Application Life Cycle Management (ALM) bug tracking tool.
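The SoapUI-style web-service checks above come down to posting a SOAP envelope and asserting on fields in the response. A minimal standard-library sketch; the `getOrder` operation and `example.com/orders` namespace are hypothetical stand-ins, not a real service:

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
SVC_NS = "http://example.com/orders"  # hypothetical service namespace

def build_request(order_id):
    """Build a SOAP 1.1 envelope for a hypothetical getOrder operation."""
    env = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(env, f"{{{SOAP_NS}}}Body")
    op = ET.SubElement(body, f"{{{SVC_NS}}}getOrder")
    ET.SubElement(op, f"{{{SVC_NS}}}orderId").text = str(order_id)
    return ET.tostring(env, encoding="unicode")

def extract_status(response_xml):
    """Pull the <status> value out of a SOAP response body."""
    root = ET.fromstring(response_xml)
    node = root.find(f".//{{{SVC_NS}}}status")
    return node.text if node is not None else None

request = build_request(42)
# A canned response, standing in for what a real endpoint would return:
response = (
    f'<e:Envelope xmlns:e="{SOAP_NS}"><e:Body>'
    f'<o:getOrderResponse xmlns:o="{SVC_NS}"><o:status>SHIPPED</o:status>'
    f"</o:getOrderResponse></e:Body></e:Envelope>"
)
print("getOrder" in request)    # True
print(extract_status(response)) # SHIPPED
```

In SoapUI the same assertion would be a Groovy-script or XPath-match step; the principle, namespace-aware extraction of one response field, is identical.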
