- Around 8 years of experience in IT, specializing in performance testing of client/server, web, and mobile applications across domains such as Finance and Banking.
- Expertise in end-to-end performance testing, including non-functional requirement gathering, performance test planning, scripting, execution, test monitoring, results analysis, and reporting.
- Expertise in Virtual User Generator (VuGen) scripting for performance/load testing across multiple protocols (Web HTTP/HTML, Web Services, Ajax TruClient, Mobile, Citrix).
- Extensively involved in enhancing Vuser scripts with correlation, parameterization, think time, pacing, error handling, user data validation, checkpoints, and custom code based on the application under test.
- Expertise in HP LoadRunner, Performance Center, and JMeter.
- Skilled in debugging and adjusting scripts by running them in VuGen with Run-time Settings logging set to display all messages.
- Excellent experience in executing different types of Performance tests like Load, Smoke, Baseline, Stress, Volume, Scalability, Endurance, and other support tests.
- Well versed in software development methodologies such as Waterfall and Agile.
- Interacted with clients and business teams to prepare workload models (WLM), test plans, test scripts, and data setup for test scenarios that map to real-life usage.
- Experience in application monitoring using tools such as CA APM, Dynatrace, and Splunk.
- Analyzed server logs for errors using Splunk source and destination filters over appropriate time frames.
- Analyzed production logs using Splunk to identify performance-testing volumes for REST APIs.
- Diagnosed application performance bottlenecks using Dynatrace and monitored application health using Splunk.
- Good exposure to CI/CD concepts and experience using Jenkins for continuous testing.
- Knowledge of performance troubleshooting (thread dumps, heap dumps, and transaction traceability).
- Strong working knowledge of Unix/Linux commands and ability to write shell scripts.
- Strong knowledge of defining and deploying monitoring, metrics, and logging systems on AWS.
- Tested applications migrated to the cloud (Azure and AWS).
- Skilled in troubleshooting and debugging scripts and execution issues.
- Experience with CA Rally for executing projects using Agile methodologies.
- Highly motivated and dedicated; thrive in fast-paced, challenging endeavors as a strong team player with excellent analytical and communication skills.
- Actively participated in walkthroughs, inspections, reviews, and user group meetings for quality assurance.
- Expertise in problem solving and defect tracking/reporting using defect tracking tools (JIRA, Quality Center).
- Experience in leading onsite/offshore performance teams.
- Solid presentation skills, including experience with PowerPoint, Word, and Excel.
- Excellent verbal, written, and analytical skills with the ability to work in a team as well as individually in a fast-paced, dynamic environment.
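As an illustration of the shell-based log analysis mentioned in the bullets above, the sketch below counts application errors per hour. The log path, format (ISO timestamp followed by a level), and messages are all hypothetical, fabricated purely for the example:

```shell
#!/bin/sh
# Minimal sketch: tally ERROR entries per hour in an application log.
# The file, its format, and its contents are illustrative only.
cat > /tmp/app.log <<'EOF'
2023-01-01T10:01:02 ERROR Connection timed out
2023-01-01T10:15:30 INFO Request served
2023-01-01T11:05:11 ERROR Connection refused
2023-01-01T11:07:45 ERROR Pool exhausted
EOF

# Keep ERROR lines, trim each timestamp to its hour bucket (first 13
# characters, e.g. 2023-01-01T10), then count occurrences per bucket.
grep ' ERROR ' /tmp/app.log | cut -c1-13 | sort | uniq -c
```

The same hourly-bucketing idea carries over to Splunk searches; the shell pipeline is just a dependency-free way to show the technique.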
Performance Test Tools: Micro Focus LoadRunner, Performance Center, Apache JMeter, BlazeMeter, Gatling
Test Management / Defect Tracking Tools: HP ALM/Quality Center, Test Director, Bugzilla
Monitoring Tool: CA APM, Dynatrace, Splunk
Productivity Tools: MS Office Suite 2007, 2003, 2000
Web Server: WebSphere, WebLogic, Tomcat
Databases: Oracle 10g/11g, DB2, MS SQL Server, MS Access, Teradata
Operating Systems: Windows 98/NT/2000/XP/Vista/7/Server 2003, UNIX, Linux, Solaris
Browsers: MS Internet Explorer, Netscape, Mozilla, Chrome
Methodologies: Agile, Waterfall
Performance Test Engineer
- Involved in decision-making and planning with application architects, BSAs, and the project manager to define performance-testing tools, testing approaches, and capacity planning per product-owner expectations; analyzed performance requirements and design documents.
- Wrote LoadRunner scripts, enhanced them with C functions, parameterized users, stored dynamic content using LoadRunner functions, and used client-side secure certificates.
- Responsible for developing Performance Test Scripts using Gatling.
- Developed test scenarios to properly load/stress the system in a lab environment and monitored/debugged performance and stability problems.
- Created test plans incorporating performance testing objectives, the testing environment, user profiles, risks, test scenarios, explanations of the tools used, schedules, analysis, monitors, and presentation of results.
- Extensively worked with the Web/HTTP and SOAP/REST web service protocols in LoadRunner and used Virtual User Generator (VuGen) to build scripts, ensuring that quality issues were appropriately identified, analyzed, documented, tracked, and resolved.
- Created single-user, baseline, and soak test scenarios; introduced random pacing between iterations to achieve the desired transactions per hour.
- Executed performance tests of database queries and reported results using the open-source tool JMeter.
- Implemented real user monitoring and user experience testing with Dynatrace application performance monitoring (APM) where end-user experience monitoring was a high priority.
- Worked with the SDLC team to troubleshoot root causes of issues related to DB and application servers using Dynatrace.
- Collected performance monitoring statistics and coordinated with tech architects and business analysts to analyze performance bottlenecks and provide recommendations to improve application performance.
- Used Dynatrace to measure website performance in the test environment and capture performance metrics of key product features.
- Used BlazeMeter to execute performance tests in the cloud.
- Used JIRA for Project Planning, Sprint Planning and defect tracking for different Projects.
- Responsible for analyzing application and components behavior with heavier loads and optimizing server configurations.
- Interacted with developers during testing for identifying memory leaks, fixing bugs and for optimizing server settings at web, app and database levels.
- Used CA APM for performance test monitoring to measure CPU usage (%), JVM heap memory usage, average response times, and database metrics.
- Involved in documenting the installation of LoadRunner on AWS.
- Involved in documenting the administrator perspective for integrating LoadRunner with AWS.
- Involved in performance troubleshooting to drill down into application performance issues and support Go/No-Go decisions.
- Created reports/graphs using LoadRunner Analysis and Splunk.
- Followed Agile & Scrum Methodology in this project.
- Responsible for analyzing, interpreting, and summarizing meaningful, relevant results in a complete performance test report presented to the project manager and business executives.
- Used Splunk to verify that messages were being triggered at the back end.
- Used CA APM to monitor graphs such as JVM heap, GC, thread status, Java process utilization, JVM exceptions, collection leaks, and context switches/sec to pinpoint issues.
- Collected JVM heap and garbage collection frequency on the DCT server during tests.
- Worked on performance testing reports and made recommendations for application performance improvement.
- Identified problems, prioritized them, and communicated bugs to developers using the bug tracking tool Quality Center.
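The pacing technique described above (pacing between iterations to reach a desired transactions-per-hour rate) reduces to simple arithmetic. A minimal sketch with made-up numbers, not figures from any actual engagement:

```shell
#!/bin/sh
# Sketch: derive a per-iteration pacing interval from a target load.
# TARGET_TPH and VUSERS are illustrative values, not real test data.
TARGET_TPH=3600   # desired transactions per hour across all users
VUSERS=20         # concurrent virtual users

# Each vuser must start TARGET_TPH / VUSERS iterations per hour, so the
# pacing interval in seconds is 3600 / (TARGET_TPH / VUSERS):
PACING=$(( 3600 * VUSERS / TARGET_TPH ))
echo "pacing interval: ${PACING}s"   # 20s between iteration starts
```

In practice a tool's pacing setting would randomize around this value (e.g. 15-25s) so iteration starts do not synchronize across vusers.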
Performance Test Engineer
- Managed a team of Performance Testing resources both onsite and offshore.
- Defined performance goals and objectives based on client requirements and inputs.
- Extensively worked with the Web and Web Services protocols in LoadRunner.
- Developed Performance test suites using LoadRunner.
- Ensured the compatibility of all application platform components, configurations, and upgrade levels with production, and made necessary changes to the lab environment to match production.
- Responsible for creating rules and alerts for production in Dynatrace.
- Independently developed LoadRunner scripts in VuGen using the Web (HTTP/HTML) and Web Services protocols.
- Enhanced scripts by building complex correlations, parameterizing static values, and using Groovy/Java code in JMeter and C code in LoadRunner to replicate business scenarios.
- Responsible for developing and executing performance and volume tests.
- Developed test scenarios to properly load/stress the system in a lab environment and monitored/debugged performance and stability problems.
- Produced documentation for Performance team on setup of JMeter test environment, and assisted with research on distributed testing best practices.
- Set up and maintained SiteScope monitors and used Wily Introscope for in-depth Java JMX diagnostics and analysis.
- Parameterized unique IDs, stored dynamic content in variables, and paired the values to maintain uniqueness.
- Focused on monitoring production traffic using Splunk, writing queries with a wide variety of combinations to find errors, exceptions, and HTTP failures.
- Monitored the database for session, connection pool, and memory issues.
- Accurately produced regular project status reports to senior management to ensure on-time project launch.
- Reviewed Business Requirement Documents and Functional Specification Documents.
- Prepared detailed test plans, test coverage matrices, and test templates.
- Gained good experience in coordinating/facilitating internal (System Level) testing sessions as well as external (User Acceptance) testing sessions.
- Prepared test cases.
- Reviewed test cases.
- Executed test cases.
- Maintained the traceability matrix.
- Used Test Director for defect tracking.
- Performed ad-hoc testing to find uncovered bugs.
- Generated detailed bug reports, pass/fail reports, and comparison charts.
- Participated in daily status meetings.
- Worked with development teams to ensure testing issues were resolved based on defect reports.
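The production-traffic error analysis described above (querying for HTTP failures by status code) can be sketched outside Splunk with a plain shell pipeline. The access-log sample below is fabricated in common-log layout purely for illustration:

```shell
#!/bin/sh
# Sketch: count HTTP 4xx/5xx responses per status code from an access log.
# The file and its entries are hypothetical examples.
cat > /tmp/access.log <<'EOF'
10.0.0.1 - - [01/Jan/2023:10:00:01 +0000] "GET /api/orders HTTP/1.1" 200 512
10.0.0.2 - - [01/Jan/2023:10:00:02 +0000] "POST /api/orders HTTP/1.1" 500 87
10.0.0.3 - - [01/Jan/2023:10:00:03 +0000] "GET /api/users HTTP/1.1" 404 44
10.0.0.4 - - [01/Jan/2023:10:00:04 +0000] "POST /api/orders HTTP/1.1" 500 87
EOF

# In this layout the status code is field 9; keep only 4xx/5xx codes
# and tally each one in an associative array.
awk '$9 ~ /^[45]/ { fail[$9]++ } END { for (s in fail) print s, fail[s] }' /tmp/access.log
```

A Splunk search would express the same idea declaratively (filter on status, then aggregate by status code); the awk version simply makes the aggregation explicit.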