Performance Lead / DevOps Resume
Bentonville, AR
SUMMARY
- 10+ years of experience as a Performance/Automation Test Lead testing large-scale systems for large companies.
- Expert in load and stress testing of large-scale, multi-user applications using HP LoadRunner, Performance Center, JMeter, BlazeMeter, and Gatling.
- Experience with Kafka, IBM MQ, ActiveMQ, RabbitMQ, and Redis.
- Experience with shell scripting on Unix/Linux.
- Experience with performance monitoring tools: HP SiteScope, Dynatrace, Prometheus, and Grafana.
- Knowledge and understanding of Kubernetes and containers.
- Hands-on experience using profiling tools.
- Hands-on performance analysis: bottleneck identification, CPU utilization, and memory utilization.
- Hands-on knowledge of Splunk.
- Experience in web service development using Spring Boot and Java.
- Experience developing Spark scripts in Scala.
- Experience with SQL and NoSQL databases (SQL Server, Oracle, and Cassandra).
- Experience with automated testing tools: Cucumber, JMeter, and Selenium.
- Excellent working knowledge of developing and implementing complex test plans, test cases, and test scripts using automated test solutions for client/server and web-based applications.
- Excellent working knowledge of VuGen scripting protocols, including Web HTTP/HTML, Web Services, Ajax TruClient, Mobile, Flex, Oracle ADF, SharePoint, WebSocket, and many others.
- Experience with test management and defect tracking tools: HP Quality Center, HP ALM, JIRA, VSTS, Bug Base, and Bugzilla.
- Experience in Software Development Life Cycle and Testing Methodologies.
- Familiar with mobile operating systems and performance testing of mobile applications.
TECHNICAL SKILLS
Programming Languages: C#/.NET, ASP.NET, C, Java, SQL, and PL/SQL
Software: Oracle, Microsoft SQL Server, Microsoft Access, Microsoft Office SharePoint 2007, IBM Rational Functional Tester, IBM Rational AppScan (security testing tool), Web Performance Trainer (performance testing tool), Microsoft Visual SourceSafe, Quest Toad, Microsoft Windows 2000, Microsoft Windows XP, Microsoft Project, Microsoft Office
PROFESSIONAL EXPERIENCE
Confidential, Bentonville, AR
Performance Lead / DevOps
Responsibilities:
- Performance-test and tune Kubernetes for optimal performance.
- Performance-test and tune Redis for optimal performance.
- Performance-test and tune RabbitMQ for optimal performance.
- Performance-test IBM MQ using LoadRunner and JMeter (see the JMS sketch at the end of this section).
- Set up the Kubernetes environment.
- Install Prometheus monitoring.
- Install Grafana monitoring.
- Deploy RabbitMQ in high-availability (HA) mode.
- Deploy Redis with Sentinel for HA.
- Develop microservices using Spring Boot (see the sketch after this list).
- Create SQL scripts for data purges.
- Use Liquibase for database changes.
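A minimal sketch of the kind of Spring Boot microservice described above, assuming a plain REST service; the application class and the /status endpoint are hypothetical names:

```java
// Minimal Spring Boot REST microservice sketch; class and endpoint names are hypothetical.
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

import java.util.Map;

@SpringBootApplication
@RestController
public class StatusServiceApplication {

    public static void main(String[] args) {
        SpringApplication.run(StatusServiceApplication.class, args);
    }

    // Simple status endpoint of the kind exercised under load.
    @GetMapping("/status")
    public Map<String, String> status() {
        return Map.of("status", "UP");
    }
}
```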
Tools used: LoadRunner, JMeter, RabbitMQ PerfTest, IntelliJ IDEA, STS, CircleCI, Prometheus, Grafana, GitHub, Cassandra, Dynatrace, Jira
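A hedged sketch of driving IBM MQ load over JMS with the IBM MQ classes for JMS, as a standalone driver or the body of a JMeter JSR223 sampler might do; the host, queue manager, channel, and queue names are hypothetical placeholders:

```java
// Sketch: sending a burst of test messages to IBM MQ over JMS.
// All connection details below are hypothetical.
import com.ibm.mq.jms.MQConnectionFactory;
import com.ibm.msg.client.wmq.WMQConstants;

import javax.jms.Connection;
import javax.jms.MessageProducer;
import javax.jms.Queue;
import javax.jms.Session;
import javax.jms.TextMessage;

public class MqLoadDriver {

    public static void main(String[] args) throws Exception {
        MQConnectionFactory factory = new MQConnectionFactory();
        factory.setHostName("mq.example.com");        // hypothetical host
        factory.setPort(1414);
        factory.setQueueManager("QM1");               // hypothetical queue manager
        factory.setChannel("DEV.APP.SVRCONN");        // hypothetical channel
        factory.setTransportType(WMQConstants.WMQ_CM_CLIENT);

        try (Connection conn = factory.createConnection()) {
            Session session = conn.createSession(false, Session.AUTO_ACKNOWLEDGE);
            Queue queue = session.createQueue("DEV.QUEUE.1"); // hypothetical queue
            MessageProducer producer = session.createProducer(queue);
            for (int i = 0; i < 1000; i++) {                  // fixed message burst
                TextMessage msg = session.createTextMessage("payload-" + i);
                producer.send(msg);
            }
        }
    }
}
```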
Confidential, Bentonville, AR
Performance / Automation Test Lead
Responsibilities:
- Creating the performance test strategy, design, planning, workload modelling, and elicitation of non-functional requirements for testing.
- Create performance scripts using JMeter and Scala.
- Performance results monitoring, analysis, and reporting.
- Java heap memory analysis using Dynatrace and Java Mission Control.
- Build and enhance performance/test environments.
- Creating automation scripts using Selenium and Cucumber (see the step-definition sketch at the end of this section).
- Fully automate the resiliency testing process using JMeter and Concord.
- Fully automate the performance testing process using Concord and JMeter.
- Developing Spark scripts in Scala.
- Developing REST web services using Spring Boot.
- Writing UNIX shell scripts.
- Code deployment using OneOps and Looper.
- Writing Ansible scripts for Concord.
- Writing Cassandra CQL queries.
- Understanding basic Kafka messaging concepts.
Tools used: LoadRunner, JMeter, Gatling, IntelliJ IDEA, Spark, Jenkins, cloud configuration management, GitHub, Cassandra, Concord, Java Mission Control, Dynatrace, OneOps, Grafana, Automaton, Eris, VSTS, Jira
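A brief sketch of a Cucumber step definition backed by Selenium WebDriver, of the kind referenced above; the step wording, page URL, and element locators are hypothetical:

```java
// Cucumber step definitions driving Selenium WebDriver; names and locators are hypothetical.
import io.cucumber.java.en.Given;
import io.cucumber.java.en.Then;
import io.cucumber.java.en.When;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;

import static org.junit.jupiter.api.Assertions.assertTrue;

public class LoginSteps {

    private WebDriver driver;

    @Given("the user is on the login page")
    public void openLoginPage() {
        driver = new ChromeDriver();
        driver.get("https://app.example.com/login"); // hypothetical URL
    }

    @When("the user signs in with valid credentials")
    public void signIn() {
        driver.findElement(By.id("username")).sendKeys("testuser"); // hypothetical locator
        driver.findElement(By.id("password")).sendKeys("secret");   // hypothetical locator
        driver.findElement(By.id("submit")).click();
    }

    @Then("the dashboard is displayed")
    public void verifyDashboard() {
        assertTrue(driver.getCurrentUrl().contains("/dashboard"));
        driver.quit();
    }
}
```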
Confidential
Performance Test Specialist
Responsibilities:
- Meet with developers, architects, and business analysts to review and gather information such as architectural design, user functionalities, batch dependencies, etc.
- Participate in application architecture design meetings and provide recommendations to avoid future performance problems.
- Create workload models, performance thresholds, and estimates; develop and implement the performance strategy.
- Create performance scripts using JMeter, LoadRunner, and BlazeMeter with the HTTP, Web Services, and other protocols (as applicable) to emulate the application.
- Use SiteScope, Dynatrace, Perfmon, and other APM tools to monitor the application and analyze the root cause of performance issues.
- Involved in root cause analysis and performance troubleshooting.
- Create automation scripts using Selenium IDE and Eclipse (see the sketch after this list).
- Lead the performance team: conduct script reviews, resolve technical issues, and run training sessions for new performance testers.
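A short sketch of the kind of JUnit test that Selenium IDE exports for refinement in Eclipse; the URL, locator, and expected page title are hypothetical:

```java
// JUnit 5 test refined from a Selenium IDE export; all test data is hypothetical.
import org.junit.jupiter.api.AfterEach;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;

import static org.junit.jupiter.api.Assertions.assertEquals;

class SearchPageTest {

    private WebDriver driver;

    @BeforeEach
    void setUp() {
        driver = new ChromeDriver();
    }

    @Test
    void searchReturnsResultsPage() {
        driver.get("https://app.example.com");             // hypothetical URL
        driver.findElement(By.name("q")).sendKeys("order status");
        driver.findElement(By.name("q")).submit();
        assertEquals("Search Results", driver.getTitle()); // hypothetical title
    }

    @AfterEach
    void tearDown() {
        driver.quit();
    }
}
```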
Confidential
Test Analyst
Responsibilities:
- Determine the flow of transactions to capture as part of performance testing.
- Perform different types of tests: web, API, and unit testing.
- Create a comprehensive performance test plan and/or test strategy document.
- Assist the project team in the creation and review of service level agreements (SLAs) for various functionalities.
- Set up production monitoring criteria.
- Determine acceptance criteria for completing the capacity planning and performance testing phases; determine monitoring requirements and set up monitoring.
- Create performance test data.
- Create performance scripts using JMeter and LoadRunner with the HTTP, Web Services, and other protocols (as applicable) to emulate the application.
- Ensure all scripts are appropriately correlated and parameterized, with checkpoints and think time added.
- Write Java BeanShell scripts (see the sketch after this list).
- Review scripts with Performance Team and/or business team.
- Determine and validate system functions and user patterns.
- Build usage models based on these inputs.
- Setup Performance Test users and performance test data.
- Validate and configure connectivity and functionality.
- Run baseline or benchmark tests under light load to validate the correctness of the automated test scripts.
- Run performance tests targeted at the applications from either an external cloud location or from inside the internal network.
- Run tests directly against individual components to determine their performance and capacity requirements, eliminating network latency and external dependencies.
- Run scheduled tests such as user experience tests, endurance tests, stress tests, etc.
- Execute batch processing jobs to evaluate performance and resource consumption.
- Performance-test execution involves running every test script and collecting results for all KPIs and metrics in the test plan.
- Analyze the results after each test and determine whether acceptance criteria are met and determine if tuning is required.
- This analysis may or may not include formal reporting.
- If necessary, perform ad hoc testing focused on a particular component for troubleshooting or tuning purposes.
- Run retests after every tuning change to determine the impact on performance.
- Publish an informal report after each interim performance test.
- Perform root cause analysis.
- Publish a formal final report after all performance criteria have been met.
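A small sketch of a JMeter BeanShell pre-processor of the kind mentioned above (BeanShell uses Java syntax, and JMeter exposes the vars, ctx, and log objects to the script); the variable names are hypothetical:

```java
// JMeter BeanShell pre-processor sketch; variable names are hypothetical.
import java.util.UUID;

// Generate a unique correlation id per iteration and expose it to the sampler as ${txnId}.
String txnId = UUID.randomUUID().toString();
vars.put("txnId", txnId);

// Combine a parameterized user id (e.g., from a CSV Data Set Config) with the
// current thread number to keep test data unique per virtual user.
String userId = vars.get("userId");
int threadNum = ctx.getThreadNum();
vars.put("uniqueUser", userId + "-" + threadNum);

log.info("Prepared txnId=" + txnId + " for user " + userId);
```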
Environment:
- OS: Windows Server 2007
- DB: Oracle, SQL Server 2005-2008
- Languages: C#, Java
- Web Server: Apache
- Bug Tracking System: Quality Center
- Source Control: StarTeam
Confidential
Software Quality Engineer
Responsibilities:
- Plan testing activities.
- Requirements study, project plan, functional diagrams, flow charts, and logical/physical data models.
- Perform testing tasks requiring planning, scheduling, and testing to ensure that developed products meet design specifications and are within total quality management standards.
- Defect tracking using BugBase.NET and other defect tracking tools.
- Communicate with software engineers on development issues.
- Requirements review.
- Write and execute test cases in ApTest Manager.
- Participate in design meetings.
- Perform testing measurements to support shipment decisions.
- Performance testing (using the Web Performance Trainer tool).
- Security testing (using IBM Rational AppScan).
- Automation testing (using IBM Rational Functional Tester).
- Creating reports and graphs in ApTest Manager.
- Developing test scenarios and test cases for integration and system testing.
- Using SQL queries to check the output of various reports (see the JDBC sketch after this list).
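A brief JDBC sketch of validating report output with a SQL query, as referenced above; the connection string, credentials, table, and column names are hypothetical:

```java
// Check a report figure against the database via JDBC; all names below are hypothetical.
// Assumes the Microsoft SQL Server JDBC driver is on the classpath.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class ReportCheck {

    public static void main(String[] args) throws Exception {
        String url = "jdbc:sqlserver://db.example.com;databaseName=reports"; // hypothetical
        try (Connection conn = DriverManager.getConnection(url, "qa_user", "secret");
             PreparedStatement ps = conn.prepareStatement(
                     "SELECT COUNT(*) FROM daily_sales WHERE report_date = ?")) {
            ps.setString(1, "2009-06-30"); // hypothetical report date
            try (ResultSet rs = ps.executeQuery()) {
                rs.next();
                // Compare this row count against the figure shown in the report UI.
                System.out.println("Rows for report date: " + rs.getInt(1));
            }
        }
    }
}
```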
Environment:
- OS: Windows Server 2003, Linux
- DB: Oracle, SQL Server 2005-2008
- Languages: C#, Java
- Web Server: Apache
- Bug Tracking System: BugBase.NET, Bugzilla, BugHost
- Source Control: Microsoft Visual SourceSafe (VSS)
Confidential
Software Developer
Responsibilities:
- Develop, test, and modify software to improve the efficiency of internal operating systems.
- Maintain necessary documentation.
- Perform product design, bug verification, and release testing.