Automation & Performance Engineer Resume
New York, NY
SUMMARY:
- Over 9 years of IT experience as a Performance and Automation Test Engineer, solving challenging performance problems in high-traffic multi-tier architectures
- Use Selenium WebDriver 2.33 for testing application functionality
- Worked on load testing tools such as JMeter, HP LoadRunner 12.01 and HP Performance Center
- Complete understanding of SDLC, QA testing methodologies and performance testing lifecycle
- Work closely with the development team and the client business team to understand the system architecture, system component interactions, application load patterns and performance SLAs
- Prepare test plans for performance testing activities and detailed test designs for the various load tests to be conducted, based on the requirements gathered
- Experience creating VuGen scripts in protocols such as Web (HTTP/HTML), Web Services (SOAP, RESTful services) and Ajax TruClient, and testing stored procedures and messaging queues using the Java protocol
- Experience in scripting complex flows handling dynamic values and data inserts/updates
- Experience in designing the test scenarios to conduct baseline tests, scalability, spike, fail-over and stress tests
- Worked on measuring and tuning front-end/browser-side performance using HttpWatch and JMeter WebDriver samplers
- Working knowledge of functional testing of SOAP services using SoapUI and of REST clients such as the Advanced REST Client Chrome extension and Postman
- Perform CPU and memory monitoring and extract web, application and database server metrics using Dynatrace
- Good understanding of application architecture, including load balancing, clustering, JVM-level insight, connection pooling, etc.
- Prepare detailed test reports, dashboards and summary reports highlighting the load tests conducted and the performance gains achieved during the engagement
- Monitor message queues in GEMS
- End-to-End performance testing experience starting from Requirements gathering to Report Preparation.
- Working knowledge of test management tools such as HP ALM and APM tools such as AppDynamics, Dynatrace and OEM
- Intermediate-level experience with Excel, MS Word, Unix commands, TOAD, SQL Developer, SQL queries and MongoDB
- Work collaboratively with the Capacity Planning team to obtain performance expectations for the scenarios to be executed, and provide results to validate current forecast models and inform future infrastructure architecture planning
- Served as a performance lead and a strong team player
- Lead staff in the preparation and conduct of testing for new or revised applications/systems, and ensure tests are successfully completed and documented
- Provide up-to-date information on project status, quality metrics, issues, and risks.
- Lead the development of a consistent testing methodology. Ensure appropriate standards and practices are documented, maintained and applied to all the projects and services supplied by the discipline.
- Review all commitments made by team members to ensure they can be realistically achieved and follow up to make sure they are met
- Good analytical, interpersonal and communication skills; result-oriented, committed, hard-working quick learner with a drive to learn new technologies and take on challenging tasks
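The capacity-planning and workload-model bullets above lean on Little's Law (concurrent users = throughput × per-iteration time). A minimal Python sketch of that sizing arithmetic; the function names are illustrative, not from any tool:

```python
def required_vusers(target_tps: float, avg_response_s: float, pacing_s: float = 0.0) -> float:
    """Little's Law: concurrent users = throughput * (response time + pacing)."""
    return target_tps * (avg_response_s + pacing_s)

def achievable_tps(vusers: int, avg_response_s: float, pacing_s: float = 0.0) -> float:
    """Inverse form: throughput a fixed Vuser pool can sustain."""
    return vusers / (avg_response_s + pacing_s)

# Example: 50 TPS target, 1.5 s average response time, 0.5 s pacing
print(required_vusers(50, 1.5, 0.5))  # -> 100.0
```

The same two lines answer both scenario-design questions: how many Vusers a target TPS needs, and what TPS a licensed Vuser pool can deliver.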
TECHNICAL SKILLS:
Testing Types: Performance Testing and Functional Testing
Testing Tools: HP LoadRunner, JMeter, Selenium, plus POCs on SOASTA and Gatling
Monitoring Tools: Dynatrace, HP SiteScope, HttpWatch, JProfiler, Grafana, BlazeMeter Loadosophia
Test Repository Tools: Git, HP Performance Center, JIRA
Integration Tools: Jenkins
Other Tools: SoapUI
Operating Systems: Windows, UNIX and LINUX
PROFESSIONAL EXPERIENCE:
Confidential, New York NY
Automation & Performance Engineer
Responsibilities:
- Work in a Site Reliability team building tests into a CI/CD pipeline model
- Familiarity with Agile methodology
- Participate in sprint planning for testing the newly developed piece of code developed in Agile methodology
- Work in all phases of testing, from manually testing application functionality to automating test cases using Selenium WebDriver
- Run regression tests for the whole suite of test cases
- Load test each new RESTful web service using JMeter
- Monitor response times and other metrics in Grafana and BlazeMeter Loadosophia
- Monitor front-end performance through the JMeter WebDriver sampler
- Use APM tools such as AppDynamics to drill down into issues
Environment: On-premise Unix environment, AWS cloud environment, Selenium WebDriver 2.33.0, JMeter 3.0, Loadosophia, SOASTA, Jenkins, JIRA, GitHub, MongoDB, Slack, HipChat
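Load-test runs like the JMeter work above are usually summarized as averages and percentiles per transaction label. A hedged Python sketch, assuming a CSV (JTL) export with `label` and `elapsed` columns and using nearest-rank percentiles as one common convention:

```python
import csv
import io
import math

def percentile(values, pct):
    """Nearest-rank percentile (one common convention for response times)."""
    ordered = sorted(values)
    rank = max(1, math.ceil(pct / 100 * len(ordered)))
    return ordered[rank - 1]

def summarize_jtl(text):
    """Aggregate elapsed times per label from JMeter-style CSV results."""
    by_label = {}
    for row in csv.DictReader(io.StringIO(text)):
        by_label.setdefault(row["label"], []).append(int(row["elapsed"]))
    return {label: {"avg": sum(ts) / len(ts), "p90": percentile(ts, 90)}
            for label, ts in by_label.items()}

sample = ("timeStamp,elapsed,label,success\n"
          "1,120,GET /orders,true\n"
          "1,180,GET /orders,true\n"
          "1,90,GET /orders,true\n")
print(summarize_jtl(sample))  # -> {'GET /orders': {'avg': 130.0, 'p90': 180}}
```

The exact JTL column set is configurable in JMeter, so the header here is an assumption; the aggregation logic is the point.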
Confidential, Johns Creek GA
Senior Performance Engineer
Responsibilities:
- Participated in identifying, understanding and documenting business requirements, keeping in mind the application's needs based on the project scope and SDLC methodology.
- Worked with development managers to translate the business requirements into functional requirements (FRD) and non-functional requirements.
- Developed onsite/offshore resource planning
- Developed week-wise resource allocation based on the functional level of the test strategy.
- Monitored daily test execution and prepared contingency plans to mitigate identified risks.
- Created Vuser scripts that emulated typical business transactions and user actions using the Virtual User Generator.
- Developed Vuser scripts by recording test flows and adding parameterization, correlation and custom code as needed to enhance them.
- Worked with web services, including SOAP and RESTful services (JSON web APIs)
- Created scripts for testing messaging queues using the Java protocol and monitored the queues using TIBCO GEMS
- Tested business intelligence tools with the TruClient protocol
- Created scripts for testing Oracle stored procedures using the Java protocol
- Parameterized Vuser scripts using random, sequential and unique options in LoadRunner VuGen 11.5 and 12.01.
- Conducted various tests in Performance Center
- Verified database SQL queries in SQL Developer
- Executed test scenarios and monitored web, application and database servers' CPU usage, JVM heap, throughput, hit ratio, TPS, deadlocks, I/O and garbage collection
- Observed PurePaths, web requests, connection pool sizes and tagged web requests in Dynatrace during load tests
- Measured response times for user-facing web service calls and the time spent in each internal call between components.
- Led all performance test reviews
- Participated in walkthroughs and meetings with the performance team to discuss related issues.
- Reviewed all performance test artifacts for comments and approvals
- Analysed various graphs and generated reports using the Analysis tool.
- Prepared a final executive summary report at the end of the engagement with recommendations
- Published performance analysis reports and conducted review sessions with all stakeholders.
Environment: Unix servers, web services, HP Performance Centre 11.5 & 12.01, HP ALM, HP LoadRunner 11.5 & 12.01, Dynatrace 6.0, SQL Developer, MS Office
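Correlation, as used in the VuGen scripting above, means capturing a server-generated dynamic value between left/right text boundaries so it can be replayed in later requests. The same boundary-based extraction can be sketched in Python; the function name mirrors VuGen's `web_reg_save_param` but is purely illustrative:

```python
import re

def save_param(body, left, right):
    """Capture the first value between left/right boundaries, analogous to
    the LB/RB arguments of VuGen's web_reg_save_param."""
    match = re.search(re.escape(left) + r"(.*?)" + re.escape(right), body, re.DOTALL)
    return match.group(1) if match else None

html = '<input type="hidden" name="sessionId" value="A1B2C3">'
token = save_param(html, 'name="sessionId" value="', '"')
print(token)  # -> A1B2C3
```

Escaping the boundaries and using a non-greedy group keeps the capture tight even when the response contains several similar fields.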
Confidential, NJ
Performance Test Engineer
Responsibilities:
- Worked daily with clients, project teams, development teams and business analysts to understand business needs and translate them into performance testing terms
- Educated teams on performance testing processes, methodology, performance and scalability, and the science of testing
- Assisted with troubleshooting and system tuning
- Coordinated with the offshore team on test script creation and test execution
- Generated daily status reports for management detailing test execution
- Identified business critical transactions of the application from production logs
- Prepared Work Load Model for every performance test conducted for the application.
- Analysed and documented the required effort before every project execution, in terms of resources needed, script complexity, scenario design challenges and overall man-hours
- Worked on script creation using LoadRunner and on test data preparation
- Created Vuser scripts in VuGen and enhanced them with correlation, parameterization and various run-time settings such as run logic, think times, iterations, pacing, logging options and preferences.
- Validated the Environment readiness for Performance Testing by doing Smoke Test Execution in Controller.
- Performed user migration testing during the migration of customers from DNBi Supply Management to the newly built customer application.
- Ran SQL queries in TOAD to extract data and verify the integrity of the database
- Performed Load & Stress Test and Endurance Test executions in Controller
- As part of capacity management team, have tested all the DNB applications for multiple server deployments, DB upgrade and migrations.
- Monitored application, web and database servers using shell scripts (vmstat) in XShell.
- Analysed % CPU usage, memory and requests per second for each scenario using monitoring tools such as nmon, vmstat, sar and top
- Analysed performance test results using the collected results and log files.
- Analysed server memory counters such as Available Bytes and Process Bytes to detect memory leaks
- Identified performance issues using AWR reports and provided SQL tuning recommendations.
- Prepared executive summary reports for senior management review
- Prepared daily and weekly status reports
Environment: Linux servers, HP Performance Centre 11.5, HP LoadRunner 11.5, XShell, TOAD, OEM, MS Office
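Server monitoring with vmstat, as described above, reduces to sampling the CPU columns of its text output. A sketch in Python, assuming the standard Linux vmstat column layout in which `us` and `sy` are the 13th and 14th fields (this varies by platform and flags):

```python
def cpu_busy_from_vmstat(output):
    """Average CPU busy % (user + system) from vmstat text output,
    assuming the standard Linux layout: us and sy are fields 13 and 14."""
    samples = []
    for line in output.splitlines():
        fields = line.split()
        # Data rows start with the numeric run-queue count; headers do not.
        if len(fields) >= 15 and fields[0].isdigit():
            us, sy = int(fields[12]), int(fields[13])
            samples.append(us + sy)
    return sum(samples) / len(samples) if samples else 0.0

sample = """procs -----------memory---------- ---swap-- -----io---- -system-- ------cpu-----
 r  b   swpd   free   buff  cache   si   so    bi    bo   in   cs us sy id wa st
 2  0      0 812340 123400 456700    0    0     5    12  300  500 20 10 65  5  0
 1  0      0 810200 123410 456800    0    0     3     8  280  480 30 10 55  5  0"""
print(cpu_busy_from_vmstat(sample))  # -> 35.0
```

Piping `vmstat 5` output through a parser like this is one way shell-script monitoring in XShell turns raw samples into a per-scenario CPU figure.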
Confidential, NY
Performance Test Engineer
Responsibilities:
- Participated in identifying, understanding and documenting business requirements, keeping in mind the application's needs based on the project scope and SDLC methodology.
- Worked with development managers to translate the business requirements into functional requirements (FRD) and non-functional requirements.
- Developed onsite/offshore resource planning
- Prepared a work breakdown structure (WBS) for the test life cycle, depicting the plan for environment availability and setup, identifying project-specific resources (hardware, software, drivers, utilities, etc.) and data requirements for system- and interface-level testing.
- Developed week-wise resource allocation based on the functional level of the test strategy.
- Monitored daily test execution and prepared contingency plans to mitigate identified risks.
- Created Vuser scripts that emulated typical business transactions and user actions using the Virtual User Generator.
- Developed Vuser scripts by recording test cases and adding checkpoints, parameterization, correlation and custom code as needed to enhance them.
- Developed Vuser scripts in the Web and Web Services protocols.
- Parameterized Vuser scripts using random, sequential and unique options in LoadRunner VuGen.
- Conducted various tests, including performance, capacity, longevity and simulation tests
- Executed test scenarios and monitored web, application and database servers' CPU usage, JVM heap, throughput, hit ratio, TPS and garbage collection
- Measured response times for user-facing web service calls and the time spent in each internal call between components.
- Led all performance test reviews
- Participated in walkthroughs and meetings with the performance team to discuss related issues.
- Reviewed all performance test artifacts for comments and approval
- Analysed various graphs and generated reports using the Analysis tool.
- Prepared a final executive summary report at the end of the engagement with recommendations
- Published performance analysis reports and conducted review sessions with all stakeholders.
Environment: Windows servers, HP LoadRunner 9.52, Windows Server 2003, Quality Center 10, web services, Perfmon, MS Office
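Metrics such as TPS and hit ratio, monitored in the test executions above, can be derived by bucketing transaction completions into whole-second bins. A sketch in Python over a hypothetical `(epoch_seconds, passed)` event list (the input format is an assumption for illustration):

```python
from collections import Counter

def tps_profile(events):
    """Peak transactions-per-second and pass rate from a list of
    (epoch_seconds, passed) pairs -- a hypothetical transaction log format."""
    per_second = Counter(int(ts) for ts, _ in events)
    peak = max(per_second.values()) if per_second else 0
    passed = sum(1 for _, ok in events if ok)
    pass_rate = passed / len(events) if events else 0.0
    return {"peak_tps": peak, "pass_rate": pass_rate}

events = [(100.1, True), (100.7, True), (100.9, False), (101.2, True)]
print(tps_profile(events))  # -> {'peak_tps': 3, 'pass_rate': 0.75}
```

The same bucketing generalizes to any window size; controller tools do this internally when they plot TPS over the test duration.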
Confidential
Performance Test Engineer
Responsibilities:
- Worked in an agile development environment with frequently changing requirements and feature sets.
- Created and analysed the relevant testing environment so that the application under test met production requirements
- Interacted with business analysts and developers in requirements analysis, design reviews, testing and documentation for applications developed in an agile environment.
- Educated clients on the need to performance test applications before deploying them to production, and categorized different load test scenarios.
- Assisted the project manager with deliverables, maintaining project plans and schedules.
- Defined the performance test strategy and developed the performance test plan, including all necessary details of performance testing (objectives, scope, resources, scenario details, test cases and schedules).
- Performed web log analysis to deduce the workload, understand peak workload use cases and peak connected sessions with their timings, and establish the need to performance test each case
- Involved in Creating Test Scripts and Developed custom scripts for different workflows, user roles and business transactions
- Created basic scenarios for execution in the Controller
- Responsible for executing load tests
- Measured performance metrics (response times, throughput, hits/sec, etc.) and monitored resource demand metrics (% CPU, memory, etc.)
- Prepared load test summary reports based on the performance metrics, and summarized result findings into meaningful charts and graphs
- Added new graphs to Analysis reports, compared results with SLAs, merged graphs to compare results and exported HTML reports.
- Coordinated with developers, database administrators while running the tests to monitor the server performance thus helped in identifying the performance bottlenecks.
Environment: Windows servers, Quality Center/ALM, HP LoadRunner, Oracle, UNIX, SQL.
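The web log analysis described above (deducing peak workload and sessions from production access logs) amounts to bucketing requests by hour. A sketch in Python, assuming Common Log Format timestamps such as `[12/Mar/2016:14:05:01 -0500]`; other log formats would need a different pattern:

```python
from collections import Counter
import re

def peak_hours(log_text, top=3):
    """Count requests per hour bucket from access-log lines, assuming the
    Common Log Format timestamp, e.g. [12/Mar/2016:14:05:01 -0500]."""
    hours = Counter()
    for line in log_text.splitlines():
        match = re.search(r"\[(\d{2}/\w{3}/\d{4}:\d{2})", line)
        if match:
            hours[match.group(1)] += 1
    return hours.most_common(top)

log = ('10.0.0.1 - - [12/Mar/2016:14:05:01 -0500] "GET /search HTTP/1.1" 200 512\n'
       '10.0.0.2 - - [12/Mar/2016:14:17:44 -0500] "GET /cart HTTP/1.1" 200 233\n'
       '10.0.0.3 - - [12/Mar/2016:09:01:10 -0500] "GET /home HTTP/1.1" 200 120\n')
print(peak_hours(log, top=1))  # -> [('12/Mar/2016:14', 2)]
```

The busiest buckets identified this way become the target load levels in the workload model, and the URLs inside them become the business-critical transactions to script.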