Sr. Performance Tester Resume
New York, NY
PROFESSIONAL SUMMARY:
- 9+ years of experience with tools such as LoadRunner, Microsoft Visual Studio, and JMeter in testing stand-alone, client/server, and web-based applications.
- Exposed to all phases of Software Development Life Cycle (SDLC) and Software Testing Life Cycle (STLC).
- Hands-on experience with various LoadRunner protocols such as HTTP, Web Services, Ajax TruClient, Siebel, SAP GUI, and FTP.
- Developed Test Plan, Test Strategies, Test Cases, Test Scripts, Test Matrix and Test Summary Report.
- Good understanding of Software Development Life Cycle (SDLC) and QA testing methodologies.
- Hands-on experience with end-to-end performance testing frameworks in both on-premises and cloud environments; worked on Salesforce, Microsoft Azure, and AWS.
- Hands-on experience with capacity planning for applications under test, covering application, web, and database servers.
- Hands-on experience with Apache Tomcat and IIS application servers for tuning method-level performance issues.
- Experience working in Agile environments where CI/CD pipelines are integrated with AWS and testing tools.
- Hands-on experience with Dynatrace and AppDynamics for application performance monitoring.
- Hands-on experience with Splunk log analysis to debug app-, DB-, and core-tier log files.
- Created dashboards and wrote regex-based searches in Splunk.
- Hands-on experience with LoadRunner VuGen for developing test scripts.
- Customized Vuser scripts with parameterization, correlation, transaction points, and rendezvous points, and configured run-time settings.
- Good exposure in creating Test Scripts, Test Plans and coordinating with development team to fix Bugs.
- Flexible with work environments; good communication and presentation skills.
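The parameterization-and-correlation workflow described above can be sketched outside VuGen; a minimal Python illustration, where the response body, token name, and user list are all invented for the example:

```python
import re

# A hypothetical server response containing a dynamic session token --
# the kind of value VuGen's correlation (web_reg_save_param) captures.
response_body = '<input type="hidden" name="csrf_token" value="a1b2c3d4">'

# "Correlate": capture the dynamic value with a boundary-based regex.
match = re.search(r'name="csrf_token" value="([^"]+)"', response_body)
token = match.group(1)

# "Parameterize": substitute per-user data instead of a recorded constant.
usernames = ["vuser_01", "vuser_02", "vuser_03"]
requests = [
    {"user": name, "csrf_token": token}  # each Vuser replays with its own data
    for name in usernames
]

print(token)           # a1b2c3d4
print(len(requests))   # 3
```

The same capture-then-reuse pattern underlies correlation in any scripting tool, not just VuGen.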
TECHNICAL SKILLS:
Tools & Languages: LoadRunner, ALM, Test Director, JMeter, Bugzilla, JIRA, IBM ClearQuest, Windows Mobile, Visual Basic/VBScript, HTML, XML, ASP, .NET, MS Office, Windows 2000/7, Ubuntu, UNIX, T-SQL, PL/SQL Developer, Oracle, TOAD.
Frameworks: .NET Framework 4.0/3.5/3.0/2.0/1.1
Databases: MS SQL Server 2005/2008, Oracle.
Operating Systems: Windows Server 2003, Windows XP/Vista/2000, UNIX, Linux.
Testing Tools: Load Runner, JMeter, Visual Studio
APM Tools: Dynatrace, AppDynamics, PerfMon.
PROFESSIONAL EXPERIENCE:
Confidential, New York, NY
Sr. Performance Tester
Responsibilities:
- Coordinated with business team to get the performance requirements for the Load Testing, Stress Testing and Capacity Planning.
- Developed Performance Test plan and Test Case Design Document with the input from developers and functional testers.
- Created automation test scripts with Unified Functional Testing (UFT) and customized the scripts by adding checkpoints for error handling.
- Utilized LoadRunner and Performance Center for conducting performance tests. Extensively used the LoadRunner Virtual User Generator to develop scripts and customize the performance test harness using the Web protocol.
- Utilized LoadRunner Controller to execute multiple scenarios.
- Used test results to provide summary reports and response times; analyzed average response time, throughput, and hits-per-second graphs.
- Extensive familiarity with protocols like Web (HTTP/ HTML), Web services and Citrix. Parameterized scripts to emulate realistic load.
- Performed load and stress tests on the application and servers by configuring LoadRunner to simulate hundreds of virtual users, and provided key metrics to management.
- Experience with SOASTA cloud performance testing, including setting up the cloud environment and installing all components necessary for performance testing.
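The analysis step described above (average response time, percentiles, throughput) reduces to simple arithmetic over the raw transaction samples; a minimal sketch with invented timings, not actual test data:

```python
# Hypothetical transaction response times (seconds) from a 60-second run.
samples = [0.8, 1.2, 0.9, 1.5, 1.1, 0.7, 2.0, 1.3, 0.95, 1.05]
test_duration_s = 60.0

# Average response time, the headline figure in a summary report.
avg_response = sum(samples) / len(samples)

# 90th percentile (nearest-rank method), a common SLA figure in LR reports.
ranked = sorted(samples)
p90 = ranked[max(0, int(round(0.9 * len(ranked))) - 1)]

# Throughput here means completed transactions per second over the run.
throughput = len(samples) / test_duration_s

print(f"avg={avg_response:.3f}s p90={p90}s tps={throughput:.3f}")
```

Tools like the LoadRunner Analysis module compute the same statistics, just aggregated per transaction name and time interval.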
Environment: LoadRunner/Performance Center 11.00, WebSphere, Shunra, WebLogic, SiteScope, SoapUI, Java, .Net, Report Server, Oracle Database, Cisco F5, SSL, Windows XP, JDK, SQL, Navigator.
Confidential, Clinton, NJ
Test Engineer/Lead
Responsibilities:
- Experience with Scalability, Load, Baseline, Stress and Endurance Testing.
- Hands-on experience with LoadRunner for developing performance test scripts using the Web HTTP/HTML and Ajax TruClient protocols.
- Hands-on experience with JMeter for developing test scripts; experience using BeanShell processors.
- Experience using BlazeMeter to generate test scripts as .jmx files and enhancing them later in JMeter.
- Hands-on experience with Dynatrace for performance monitoring, using PurePaths and dashlets to capture server-side and client-side performance metrics.
- Hands-on experience with testing microservices through DevOps pipelines on the AWS cloud.
- Good understanding of docker and Kubernetes for cloud performance testing.
- Experience with CloudWatch on the AWS cloud.
- Experience with DynamoDB on AWS cloud.
- Experience using Splunk Log Analysis for identifying bottlenecks in Core and Web Tier Log Files.
- Experience writing regex queries using Splunk Log Analysis.
- Hands-on experience with VuGen for developing test scripts using various protocols such as HTTP, Web Services, and Ajax TruClient.
- Co-ordinated with the development team for method level performance bottlenecks.
- Co-ordinated with the database team for fixing database related performance issues.
- Experience with JProfiler for debugging performance bottlenecks.
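The regex-based log searches mentioned above follow the same field-extraction pattern as Splunk's `rex` command; a minimal Python equivalent on an invented log line (the field names and thresholds are illustrative, not from a real system):

```python
import re

# A hypothetical app-tier log line of the kind searched in Splunk.
log_line = "2021-03-04 10:15:32 INFO  OrderService respTime=1842ms status=500"

# Roughly equivalent to: ... | rex "respTime=(?<resp_ms>\d+)ms status=(?<status>\d+)"
pattern = re.compile(r"respTime=(?P<resp_ms>\d+)ms status=(?P<status>\d+)")
m = pattern.search(log_line)

resp_ms = int(m.group("resp_ms"))
status = m.group("status")

# Flag slow or failing requests -- the usual bottleneck triage step.
is_bottleneck = resp_ms > 1000 or status.startswith("5")
print(resp_ms, status, is_bottleneck)
```

In Splunk itself, the named capture groups become searchable fields that can feed dashboards directly.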
Environment: LoadRunner/Performance Center 12.60, JMeter 5.2.1, Unix, Dynatrace, Toad, Oracle 11g, DynamoDB, CloudWatch, QC/ALM, C, JProfiler, HttpWatch, JIRA, Splunk.
Confidential, Berkeley Heights, New Jersey
Performance Engineer
Responsibilities:
- Configured the performance test environment in the Azure cloud using Windows PowerShell scripts and installed all necessary components on the IIS application server.
- Experience developing performance test scripts for the Claims, Policy, and Billing modules. Created performance testing artifacts such as the performance test strategy, performance test plan, and script design documents.
- Experience working with LoadRunner and Microsoft Visual Studio for web performance testing of UI and service components.
- Hands-on experience with JMeter, developing test scripts and executing them in non-GUI mode.
- Used HttpWatch to record the response times of various transactions and reported them to the development team.
- Hands-on experience with AppDynamics for monitoring web applications under test.
- Hands on experience with Microsoft test manager to create test cases and organize them into test plans and suites.
- Facilitated testing discussions and planning sessions with test leads from the other tracks of the project (Customer Experience Management, Communication Services, Release Management, Automation Testing) to ensure optimal performance test coverage.
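Non-GUI JMeter runs like those mentioned above write results to a .jtl file; a small sketch of post-processing one follows. The sample rows are invented, but the header columns match JMeter's default CSV .jtl layout:

```python
import csv
import io

# Invented sample data; columns follow JMeter's default CSV .jtl header.
jtl_data = """timeStamp,elapsed,label,responseCode,success
1617000000000,820,Login,200,true
1617000001000,1530,Search,200,true
1617000002000,95,Search,500,false
1617000003000,1110,Checkout,200,true
"""

rows = list(csv.DictReader(io.StringIO(jtl_data)))

# Error rate: fraction of samples JMeter marked as failed.
error_rate = sum(r["success"] == "false" for r in rows) / len(rows)
# Average elapsed time in milliseconds across all samples.
avg_elapsed = sum(int(r["elapsed"]) for r in rows) / len(rows)

print(f"samples={len(rows)} error_rate={error_rate:.0%} avg={avg_elapsed:.2f}ms")
```

The same parsing works on a real results file by replacing the `io.StringIO` wrapper with `open("results.jtl")`.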
Environment: Microsoft Visual Studio Ultimate 12.0, LoadRunner/Performance Center, Unix, WebLOAD, WebSphere, WebLogic, Microsoft Azure, SQL Server 2012, C, C++, ANTS Profiler, HttpWatch, Subversion, Team Tracker, TeamCity, Team Foundation Server.
Performance Engineer
Confidential, Saint Louis, MO
Responsibilities:
- Designed test plans covering scope, test strategies, test scenarios, and the types of tests to be executed.
- Involved in gathering business requirements, studying the application, and collecting information from developers and the business.
- Developed NeoLoad and LoadRunner scripts using the Flex, Web, Web Services, Ajax TruClient, and Citrix protocols.
- Extensively worked with NeoLoad to develop load test scripts for applications based on the Web and Web Services protocols.
- Designed tests for Benchmark, Stress and Endurance Testing.
- Parameterized large and complex test data to accurately depict production trends.
- Validated the scripts to make sure they executed correctly and met the scenario descriptions.
- Analyzed results using the NeoLoad analysis tool and examined Oracle database connections, sessions, and WebLogic log files.
- Responsible for analyzing application and component behavior under heavier loads and optimizing server configurations.
- Maintained test matrix and bug database and generated monthly reports.
- Interacted with developers and architects during testing to identify memory leaks, fix bugs, and optimize server settings at the web, app, and database levels.
- Used LoadRunner for testing and monitoring; actively participated in enhancement meetings focused on making the website more intuitive and interesting.
- Worked closely with software developers and took an active role in ensuring that software components met the highest quality standards.
- Provided support to the development team in identifying real-world use cases and appropriate workflows.
- Worked with business & Infrastructure teams and developed business & Infrastructure workload models for several applications.
- Measured performance metrics such as response time and throughput for web systems optimization; built out servers based on system performance cycles and metrics.
- Performed in-depth analysis to isolate points of failure in the application.
- Assisted in production testing and capacity certification reports.
- Configured offline and online J2EE/.NET diagnostics through Performance Center.
- Used Dynatrace to monitor server metrics and performed in-depth analysis to isolate points of failure in the application.
- Monitored system resources such as CPU usage and memory utilization using UNIX commands such as top, vmstat, iostat, svmon, and netstat.
- Analyzed the JVM heap and GC logs in WebSphere during test execution.
- Extensive experience monitoring PurePaths for the 3-tier architecture using Dynatrace.
- Conducted result analysis and communicated technical issues with developers and architects
- Involved in defect tracking, reporting and coordination with various groups from initial finding of defects to final resolution. Worked in an agile environment.
- Created comprehensive test results report.
- Managed Performance Center (ALM) servers: monitored server status, edited server information, and checked server performance.
- Managed timeslots, viewed user reservations, and monitored availability of time and resources.
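Workload models like those described above are typically derived with Little's Law (concurrent users = throughput x session time); a sketch with assumed business volumes and session times, invented for illustration:

```python
# Assumed business volumes: transactions per hour agreed with the business team.
hourly_volumes = {"Login": 3600, "Quote": 1800, "Bind": 600}

# Assumed average session time per transaction (response + think time), seconds.
session_time_s = {"Login": 10, "Quote": 25, "Bind": 40}

def required_vusers(tx_per_hour: int, session_s: float) -> float:
    """Little's Law: N = X * R, with X converted to transactions/second."""
    return (tx_per_hour / 3600.0) * session_s

# Vusers needed per business process to sustain the target hourly volume.
vusers = {name: required_vusers(volume, session_time_s[name])
          for name, volume in hourly_volumes.items()}

print(vusers)
```

Summing the per-process figures gives the scenario's total Vuser count, which then drives pacing and ramp-up settings in the test tool.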
Environment: NeoLoad 7, LoadRunner 12.55, ALM, Dynatrace, IIS, WebLogic, Oracle, Jenkins, JIRA, SOAtest, Java, Windows, Linux, LAN, WAN, .Net.
Performance Engineer/Performance Center Admin
Confidential - Cincinnati, OH
Responsibilities:
- Prepared Test Strategies, Test Plan and Test Cases as per the business requirements and Use Cases.
- Involved in load testing of various modules and software applications using LoadRunner 11.
- Developed load test scripts using the LoadRunner Virtual User Generator (VuGen) and enhanced them by adding transactions, parameterizing constant values, and correlating dynamic values.
- Worked extensively as a Performance Center administrator, installing and migrating from PC 9.52 to ALM 11.10 and then to ALM 11.5; provided access to new users and added new projects.
- Raised several tickets with HP and fixed various issues related to ALM and LoadRunner; applied patches to keep the tools up to date.
- Developed Vuser scripts in the Web (HTTP/HTML), Web Services, ODBC, Oracle 2-Tier, Winsock, and Click & Script protocols.
- Enhanced LoadRunner scripts to test new builds of the application.
- Developed Vuser scripts and enhanced the basic scripts by parameterizing constant values.
- Conducted Web services testing using SOAP UI.
- Customized scripts in LoadRunner according to business specifications.
- Carried out stress testing by introducing rendezvous points in the scripts.
- Conducted testing on the servers using LoadRunner and Performance Center to establish the load capacity of the servers.
- Used LoadRunner to analyze the response times of various business transactions and module login times under load, and developed reports and graphs to present the test results.
- Used SiteScope and Dynatrace to monitor the load tests and to identify bottlenecks.
- Used Dynatrace extensively to monitor application performance across the 3-tier architecture and analyzed metrics such as JVM heap size and expensive SQL queries.
- Installed Dynatrace probes on the application and database servers; configured the complete dashboard for easier monitoring and analysis.
- Monitored the performance of the Web and Database (SQL) servers during Stress test execution.
- Extensively used rstat to understand the daemons running on the UNIX/Linux servers when an application bottleneck was encountered.
- Defined transactions to measure server performance under load by creating rendezvous points to simulate heavy load on the server
- Used LoadRunner to check the performance of various reports and queries under load.
- Analyzed load test results using the LoadRunner Analysis tool, examined the online monitors and graphs, and identified bottlenecks in the system.
- Reported and entered bugs in Quality Center
- Tested for browser compatibility.
- Developed high-level and detailed test plans, reviewed them with the team, brought customer-level experience to the team, and identified business-critical functionality.
- Updated test matrices, test plans, and documentation at each major release and performed Regression testing using automated script.
- Managed/Updated Shared object repository from time to time using Object Repository Manager.
- Used environment variables as global variables to pass the values between actions.
- Carried out the manual testing of different interfaces.
- Provided Test Estimates for various phases of the project.
- Reported and tracked defects in the Quality Center bug tracking system.
- Automated test cases using QuickTest Professional.
- Performed QA process management through automation, identified functional changes vs. business impact, and trained the QA team through cross-business training.
- Worked closely with software developers and took an active role in ensuring that software components met the highest quality standards.
- Supported Production team to understand and execute the processes.
- Created application documentation to assist in the support and training of users.
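The rendezvous points used above to generate an instantaneous load spike can be modelled with a thread barrier; a minimal Python analogy (the Vuser count and the stand-in "request" are arbitrary placeholders, not real load generation):

```python
import threading

VUSERS = 5
results = []
results_lock = threading.Lock()

# All Vusers block at the rendezvous, then proceed simultaneously --
# the same idea as lr_rendezvous() in a VuGen script.
rendezvous = threading.Barrier(VUSERS)

def vuser(user_id: int) -> None:
    # ... per-user setup (login, navigation) would happen here ...
    rendezvous.wait()            # block until all VUSERS threads arrive
    with results_lock:
        results.append(user_id)  # stands in for the load-generating request

threads = [threading.Thread(target=vuser, args=(i,)) for i in range(VUSERS)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(sorted(results))  # [0, 1, 2, 3, 4]
```

Releasing all users at once is what turns a steady load profile into a stress spike at the chosen transaction.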
Environment: LoadRunner 9.52/11.1, Performance Center 11, Quality Center, SiteScope, Wily Introscope, Dynatrace, QTP, SQL Server, SQL Profiler, Windows, UNIX, WebLogic, WebSphere, XML
Performance Engineer
Confidential - Columbus, OH
Responsibilities:
- Participated in requirement analysis and prepared performance test documents.
- Involved in specifying the functional as well as security requirements for performance testing.
- Performed load, stress, benchmark profile, failover, and failback tests against supported configurations.
- Performed Benchmark test against non-clustered and clustered application configurations.
- Worked with the LoadRunner Winsock, Web (HTTP/HTML), Web Services, and Oracle protocols.
- Identified system capacity, scalability, and stability under steady load as well as under peak load.
- Used LoadRunner to simulate multiple Vuser scenarios; defined rendezvous points to create intense load on the server and thereby measure server performance under load.
- Verified the connectivity from Controller to the Load Generator. Utilized the IP address of Load Generators to add them to the Controller.
- Analyzed Transaction Profile diagrams to identify the business process that needs load testing
- Parameterized test scripts to send realistic data to the server and avoid data caching
- Performed system performance & load benchmark measurements for capacity, scalability and breakpoints.
- Monitored the performance of the Oracle middle-tier Forms server in different environments to correlate with the user load.
- Provided statistics on buffers, workload processing, CPU and memory utilization, database activity, system errors, buffer swaps, and table locks.
- Analyzed Online Monitor Graphs like Runtime Graphs, Transaction Graphs, Web Resource Graphs and System Resource Graphs.
- Developed reports and presented them to management.
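Capacity and breakpoint identification as described above can be automated from a stepped load-vs-throughput curve: the breakpoint is the first load level where throughput stops growing meaningfully. A sketch on invented data points, with an arbitrary 5% flattening threshold:

```python
# Invented (load level, throughput in tps) pairs from stepped load tests.
curve = [(10, 95), (20, 190), (30, 280), (40, 300), (50, 302), (60, 298)]

def find_breakpoint(points, min_gain=0.05):
    """Return the first load level where throughput gains fall below min_gain."""
    for (load_a, tps_a), (load_b, tps_b) in zip(points, points[1:]):
        if tps_b < tps_a * (1 + min_gain):
            return load_b  # throughput has flattened: saturation reached
    return None  # no saturation observed in the tested range

print(find_breakpoint(curve))  # 50
```

Past the breakpoint, added load typically only increases response times and queueing, which is why capacity certifications stop there.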
Environment: LoadRunner, QTP, Quality Center, Java, XML, nmon, Oracle, Business Objects, SQL Server 2000/2005, Windows XP, Telnet, WebSphere, Lotus Notes and UNIX