Performance Test Engineer/Lead Resume
Jersey City, New Jersey
SUMMARY
- Over 5 years of experience with tools such as LoadRunner, JMeter and VSTS for front-end and back-end performance testing of stand-alone, client/server and web-based applications.
- Good understanding of application architecture for developing performance testing frameworks.
- Hands-on experience with various LoadRunner protocols such as HTTP, Web Services, Ajax TruClient, Siebel, SAP GUI and FTP.
- Developed test plans, test strategies, test cases, test scripts, test matrices and test summary reports.
- Good understanding of Software Development Life Cycle (SDLC) and QA testing methodologies.
- Hands-on experience building end-to-end performance testing frameworks in both on-premises and cloud environments; worked on Salesforce, Microsoft Azure and AWS.
- Hands-on experience with capacity planning for applications under test; performed capacity planning for application, web and database servers.
- Hands-on experience with Apache Tomcat and IIS application servers for performance tuning and method-level exception analysis.
- Experience working in Agile environments where CI/CD pipelines were integrated with AWS and testing tools.
- Experience with Bitbucket and the AWS cloud to create performance testing pipelines.
- Hands-on experience with Dynatrace and AppDynamics for application performance monitoring.
- Hands-on experience with Splunk log analysis to debug application, database and core tier log files.
- Created dashboards and wrote regex functions in Splunk.
- Hands-on experience with LoadRunner VuGen for developing test scripts.
- Customized Vuser scripts with parameterization, correlation, transaction points and rendezvous points, and configured run-time settings (a minimal sketch follows this list).
- Good exposure to creating test scripts and test plans and coordinating with development teams to fix bugs.
- Flexible across work environments, with good communication and presentation skills.
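A minimal VuGen-style sketch of the script customizations described above (parameterization, correlation, transaction points and a rendezvous point); the URL, parameter names and boundary strings are hypothetical placeholders, not taken from a specific project:

```c
/* Illustrative VuGen Action(): hypothetical login-and-order flow. */
Action()
{
    /* Correlation: capture a session token from the login response. */
    web_reg_save_param("SessionToken",
                       "LB=token=\"", "RB=\"",      /* hypothetical boundaries */
                       "NotFound=ERROR",
                       LAST);

    lr_start_transaction("01_Login");
    web_submit_data("login",
                    "Action=https://app.example.com/login",   /* hypothetical URL */
                    "Method=POST",
                    ITEMDATA,
                    "Name=user", "Value={UserName}", ENDITEM,  /* parameterized from data file */
                    "Name=pass", "Value={Password}", ENDITEM,
                    LAST);
    lr_end_transaction("01_Login", LR_AUTO);

    /* Rendezvous point: synchronize Vusers before the measured step. */
    lr_rendezvous("place_order");

    lr_start_transaction("02_PlaceOrder");
    web_url("placeOrder",
            "URL=https://app.example.com/order?token={SessionToken}",
            LAST);
    lr_end_transaction("02_PlaceOrder", LR_AUTO);

    return 0;
}
```

Run-time settings such as think time, pacing and logging level are configured in the VuGen/Controller run-time settings rather than in the script body itself.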
TECHNICAL SKILLS
Technologies: LoadRunner, ALM, Test Director, JMeter, Bugzilla, JIRA, IBM ClearQuest, Windows Mobile, Visual Basic and VBScript, HTML, XML, ASP, .NET, MS Office, Windows 2000/7, Ubuntu Linux, T-SQL, PL/SQL Developer, Oracle, TOAD.
Frameworks: Java, .NET Framework 4.0/3.5/3.0/2.0/1.1
Databases: MS SQL Server 2005/2008, Oracle.
Operating Systems: Windows Server 2003/2007, Windows XP, Vista, 2000, Unix, Linux.
Testing Tools: LoadRunner, JMeter, Visual Studio
APM Tools: Dynatrace, AppDynamics, PerfMon.
PROFESSIONAL EXPERIENCE
Confidential, Jersey City, New Jersey
Performance Test Engineer/Lead
Responsibilities:
- Experience in implementing Proof of Concept, Baseline, Scalability, Regression, Stress and Endurance performance testing.
- Experience in Resilience performance testing.
- Hands-on experience with LoadRunner for developing performance test scripts using the Web HTTP/HTML and Ajax TruClient protocols.
- Developed JMeter test scripts for microservices; performance tested Spring Boot services behind the Apigee layer.
- Hands-on experience with JMeter for developing test scripts, including the use of BeanShell processors.
- Experience using BlazeMeter to develop test scripts as .jmx files and enhance them later in JMeter.
- Hands-on experience with Dynatrace for performance monitoring, using PurePaths and dashlets to capture server-side and client-side performance metrics.
- Hands-on experience testing microservices through DevOps pipelines on the AWS cloud.
- Good understanding of Docker and Kubernetes for cloud performance testing.
- Experience with CloudWatch on the AWS cloud.
- Experience with DynamoDB on the AWS cloud and with Oracle databases.
- Experience using Splunk log analysis to identify bottlenecks in core and web tier log files.
- Experience writing regex queries in Splunk for log analysis.
- Hands-on experience with VuGen for developing test scripts using protocols such as HTTP, Web Services and Ajax TruClient (a sketch follows this list).
- Coordinated with the development team on method-level performance bottlenecks.
- Coordinated with the database team to fix database-related performance issues.
- Experience with JProfiler for debugging performance bottlenecks.
- Experience with JIRA for project management. Created user stories for performance testing.
- Experience with ALM for creating and tracking performance testing defects.
- Used SoapUI, Postman and Swagger for functional validation of the web services.
- Used HttpWatch and Fiddler to capture web service call details.
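A hedged sketch of the kind of Web Services/REST step scripted in VuGen for the microservices noted above; the endpoint, headers and JSON payload below are hypothetical placeholders:

```c
/* Illustrative REST call from a VuGen Web HTTP/HTML script.
   The endpoint, API key header and payload fields are assumptions for illustration. */
Action()
{
    /* Correlate the order id out of the JSON response using plain text boundaries. */
    web_reg_save_param("OrderId",
                       "LB=\"orderId\":\"", "RB=\"",
                       "NotFound=ERROR",
                       LAST);

    web_add_header("Content-Type", "application/json");
    web_add_header("x-api-key", "{ApiKey}");           /* parameterized, hypothetical header */

    lr_start_transaction("01_CreateOrder");
    web_custom_request("createOrder",
                       "URL=https://api.example.com/orders",   /* hypothetical endpoint */
                       "Method=POST",
                       "Resource=0",
                       "EncType=application/json",
                       "Body={\"customerId\":\"{CustomerId}\",\"quantity\":1}",
                       LAST);
    lr_end_transaction("01_CreateOrder", LR_AUTO);

    lr_output_message("Created order: %s", lr_eval_string("{OrderId}"));
    return 0;
}
```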
Environment: LoadRunner/Performance Center 12.60, JMeter 5.2.1, Unix, Dynatrace, TOAD, Oracle 11g, DynamoDB, CloudWatch, QC/ALM, C, JProfiler, HttpWatch, JIRA, Splunk.
Confidential, Whitehouse, New Jersey
Performance Engineer
Responsibilities:
- Configured the performance test environment in the Azure cloud using Windows PowerShell scripts and installed all necessary components on the IIS application server.
- Experience developing performance test scripts for the Claims, Policy and Billing modules; created performance testing artifacts such as the performance test strategy, performance test plan and script design documents.
- Experience working with LoadRunner and Microsoft Visual Studio for web performance testing of UI and service components.
- Hands-on experience developing JMeter test scripts and executing them in non-GUI mode.
- Experience in microservices performance testing; used JMeter to test independent and cascaded microservices.
- Used HttpWatch to record the response times of various transactions and report them to the development team.
- Hands-on experience with AppDynamics for monitoring web applications under test.
- Monitored application and database log files at runtime using PuTTY.
- Resolved performance bottlenecks such as out-of-memory exceptions and deadlocks.
- Experience in thread dump and heap dump analysis to resolve performance bottlenecks.
- Used Fiddler and HttpWatch to capture service call details.
- Used TortoiseSVN for version control.
- Used TFS and Team Tracker for project management.
Environment: Microsoft Visual Studio Ultimate 12.0, LoadRunner/Performance Center, Unix, WebLoad, WebSphere, WebLogic, Microsoft Azure, SQL Server 2012, C, C++, ANTS Profiler, HttpWatch, Subversion, Team Tracker, TeamCity, Team Foundation Server.
Confidential, Easton, PA
Performance Tester
Responsibilities:
- Executed load, stress and regression performance tests for web applications under test.
- Developed the performance test plan and test case design document with input from developers and functional testers.
- Created automated test scripts with Unified Functional Testing and customized them with various checkpoints for error handling.
- Utilized LoadRunner and Performance Center for conducting performance tests; extensively used the Virtual User Generator to develop scripts and customize the performance test harness for the Web protocol.
- Utilized LoadRunner Controller and Performance Center to execute multiple scenarios.
- Used test results to provide summary reports, response times and monitored averages; analyzed average response time, throughput and hits-per-second graphs.
- Extensive familiarity with protocols such as Web (HTTP/HTML), Web Services and Citrix; parameterized scripts to emulate realistic load.
- Performed load and stress tests on the application and servers by configuring LoadRunner to simulate hundreds of virtual users, and provided key metrics to management.
- Experience with SOASTA cloud performance testing, setting up the cloud environment and installing all necessary components for performance testing.
- Used ALM for creating and tracking performance testing defects.
Environment: LoadRunner/Performance Center 11.00, WebSphere, Shunra, WebLogic, SiteScope, SoapUI, Java, .NET, Report Server, Oracle Database, Cisco F5, SSL, Windows XP, JDK, SQL Navigator.