Load / Performance Test Engineer / Lead Resume
East Lansing, MI
SUMMARY
- Highly skilled IT professional with over eight years of experience as a software test analyst and performance engineer, with extensive use of automated tools such as JMeter, HP LoadRunner, and Rational Performance Tester to test web-based, cloud, and client/server applications and web services on Windows and UNIX/Linux platforms.
- Background includes extensive experience gathering and interpreting performance test requirements, analyzing business use case scenarios, performing integration, functional, regression, compatibility, mobile, real-time, and ad-hoc testing, and performance engineering of SOA-based and API applications.
- Exceptional experience preparing load test plans and QA strategies, setting up environments, and working with database schemas. Knowledge of IBM WebSphere Portal V6.x/V5.x administration.
- Experience in installation, configuration, deployment using XMLAccess, performance tuning, and web content management.
- Working experience with multiple protocols such as HTTP, Web Services, Click & Script, Citrix, JMS, Mobile, and JDBC.
- Strong knowledge of web technologies including J2EE, Java, JSP, ASP.NET, and AJAX. Hands-on experience with service virtualization, profiling and tuning web applications, and analyzing JVM heap, thread, CPU, and memory utilization.
- Hands-on experience in application monitoring, database tuning, performance debugging, and identifying the root cause of system bottlenecks.
- Solid understanding of complex architectures and the derivation of key performance indicators for performance tests.
- Exposure to the full Software Development Life Cycle (SDLC), with hands-on experience in performance engineering: scripting and executing load, volume, baseline, and capacity tests, working in Performance Center, writing VuGen scripts, and configuring the Controller and Load Generators.
- Adept at analyzing performance test results, preparing database test plans, using connection pooling for database testing and tuning, and collaborating with architects and developers.
- Experience testing iOS and Android apps on a variety of devices and optimizing mobile application performance.
- Hands-on experience with network virtualization, Bluetooth, and WLAN setup and testing.
- Experience in functional, integration, compatibility, and mobile performance testing.
- Experience with performance delivery methods, tools, and techniques.
- Experience leading and working with both offshore and onsite teams.
- Expertise in industry software solutions, strategy, client operations, and delivery.
TECHNICAL SKILLS
Testing Tools: LoadRunner, Rational Performance Tester, JMeter, Performance Center/ALM, Quick Test Pro, Quality Center, HP Diagnostic, JMX, SoapUI, Rational Suites, ClearQuest, IBM MQ, Putty and JIRA.
Mobile Technologies: iPhone Tester/iPad peek, Device Anywhere, MobiReady, WebOS, SOASTA.
Profiling / APM Tools: JVM, DynaTrace Client, Wily Introscope/CA LISA (ITKO), Standard Set, JConsole, AppDynamics.
Languages: VBScript, C, UNIX Shell, XML, SQL, HTML, CSS, SOAP/REST Services, JSON, Java.
Databases: Oracle, H2, Apache-Cassandra, SQL Server, MySQL, DB2 and Rapid SQL.
Operating Systems: Windows, Linux, MS DOS, Fedora, Red Hat and Solaris.
Methodologies: Waterfall, Iterative, Agile/Scrum-Sprint, V-Model and RUP.
Servers: WebSphere, JBoss, Apache, WebLogic, J2EE, WS_FTP Pro, IIS.
Others: Microsoft Office 2013, Microsoft Project, TFS, VMware, Fiddler, RabbitMQ, SkyTap, BlazeMeter, PerfAnalyzer, PeopleSoft PeopleTools, Liferay, Firebug, Drupal, Oracle RAC, YSlow and Shunra Network Virtualization.
Industries: Retail, Financial, Insurance, Travel, Mobile Banking and Corporate Software Technology & Operations.
PROFESSIONAL EXPERIENCE
Confidential, East Lansing, MI
Load / Performance Test Engineer / Lead
Responsibilities:
- Develop test strategies to ensure product quality.
- Responsible for developing and maintaining the benchmarking regimen for the cross-asset server and for continuous improvement of the test harness.
- Create non-functional, performance, and volume testing strategies, performance test plans, and requirements for applications, middleware, and databases.
- Develop and design a high-level test framework to support client-side and Agile performance testing. Proactively work with developers to address quality early in the SDLC.
- Communicate with Scrum team members to discuss test reports and follow up on new stories in the JIRA/Agile automation backlog dashboard.
- Involved in WebSphere Process Server 6.x administration and setting up monitoring tools.
- Write and execute load, volume, capacity, and performance tests for a Java-based platform.
- Build performance tests for web and cloud-based applications on Linux and deploy them in the SkyTap environment.
- Create load tests for the Profile web service and database application across multiple high-profile projects in a Java implementation. Create thread groups and requests and run them against the Connection Service.
- Set up data for the Cassandra service and run data-driven tests with JMeter.
- Monitor application and multi-web-server metrics and analyze PerfMon metrics.
- Analyze memory load, CPU, threads, response codes, and network I/O load to triage performance bottleneck issues, using commands such as top, perfmon, wget, sar, and vmstat.
- Use an FTP program to upload files for offline testing and to examine server log files to uncover bugs.
- Run SQL query optimization for JDBC tests in JMeter, capture the impact of performance issues, and share results.
- Perform test data management/automation and data-driven testing with JMeter.
- Build test automation for UI and WCF/REST services and create reusable, shareable components using JMeter on the Linux platform.
- Monitor CPU usage of the server and the Oracle cluster and Cassandra database using shell scripts, and analyze performance metrics to determine root causes.
- Work with the Agile/Scrum team to resolve issues and interact with the database in the most efficient and productive way.
- Responsible for testing application servers, performing manual navigation using DynaTrace, and using JMeter to simulate test load.
- Monitor response times in the PurePath dashboard using the DynaTrace client. Provide expertise on performance testing tools and capabilities by mentoring and coaching junior team members.
- Create test reports of results and metrics. Participate in defect meetings to discuss bottlenecks, attend daily stand-up Scrum and weekly status meetings, and send weekly status reports to the manager.
Environment: Spring Boot, JMeter, DynaTrace, JMX, RabbitMQ, IBM WebSphere, Panda, Oracle Linux 6, F5, Java 8, WebLogic, JIRA, Apache Solr, Cassandra, ETL, SkyTap and Selenium (WebDriver)
Confidential, Columbus, OH
Performance Test Engineer
Responsibilities:
- Collaborated with the PINNACLE QA team on application performance; analyzed requirements, confirmed they were captured correctly, and interpreted performance test requirements.
- Tested application performance across workflows to ensure that the application can perform satisfactorily in a production environment.
- Worked as a web admin for regression testing and system enhancements. Coordinated with the PINNACLE QA team and collected all evaluations for the application.
- Responsible for script creation (VuGen), execution (Controller), and analysis/reporting. Performed baseline, load, and stress testing using LoadRunner and presented performance statistics to the team and the senior test lead.
- Created, correlated, and parameterized automated test scripts using the Web/HTTP, AJAX Click and Script, and Web Services protocols in LoadRunner (the correlation and parameterization pattern is sketched after this section).
- Developed test scripts in LoadRunner VuGen and executed them using the LoadRunner Controller. Measured response times of key user actions using the start- and end-transaction functions. Worked extensively on performance monitoring and analyzed response time, memory leak, hits/sec, and throughput graphs.
- Analyzed various graphs generated by LoadRunner Analysis and communicated bottlenecks to the system administrator.
- Uploaded scripts to ALM Performance Center, created timeslots and test schedules, and maintained scripts. Used Performance Center for scripts in the ALM project and submitted defects.
- Identified mobile performance issues on the device side, such as thread, CPU, battery, and memory utilization and mobile content, and consulted with the team on network virtualization.
- Assisted with regression testing ahead of the project deadline in areas of expertise such as iPad devices and mobile application performance.
- Installed and configured the application with the DynaTrace profiling tool. Added headers to scripts, monitored the server, and performed manual navigation using the DynaTrace client.
- Performed benchmark, capacity, and load tests using JMeter for JDBC connections and the proxy server. Developed continuous data-driven testing using CSV files and counters in JMeter.
- Performed tuning of a WebSphere Portal deployment involving various systems and web server utilization.
- Monitored hardware capacity to ensure the necessary resources were available for testing.
- Worked closely with software engineering team members to tune and optimize product stability and performance.
- Provided support to the offshore team and led them in analyzing reports. Communicated with team members to discuss test reports, participated in defect meetings, attended daily conference and weekly status meetings, and sent weekly status reports to the manager.
Environment: LoadRunner, ALM Performance Center, JMeter, JVM, QC, C++, Java, Linux, WebSphere Portal, Oracle DB, IBM MQ, DynaTrace, WebLogic and TeaLeaf.
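Illustrative sketch (not project code): a minimal LoadRunner VuGen (C) fragment showing the correlation and parameterization pattern referenced above. The endpoint, correlation boundaries, and parameter names (UserName, Password, SessionId) are hypothetical placeholders.

    Action()
    {
        /* Register the correlation before the request that returns the value */
        web_reg_save_param("SessionId",
                           "LB=sessionid=",
                           "RB=&",
                           "NotFound=warning",
                           LAST);

        lr_start_transaction("Login");

        web_submit_data("login",
                        "Action=http://app.example.com/login",
                        "Method=POST",
                        ITEMDATA,
                        "Name=user", "Value={UserName}", ENDITEM,  /* from the parameter file */
                        "Name=pass", "Value={Password}", ENDITEM,
                        LAST);

        lr_end_transaction("Login", LR_AUTO);

        /* Reuse the captured value in a later request */
        web_url("account",
                "URL=http://app.example.com/account?sessionid={SessionId}",
                LAST);

        lr_think_time(5);
        return 0;
    }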
Confidential, Minneapolis, MN
Performance Test Engineer
Responsibilities:
- Worked with business, product manager, developer, and UAT audiences. Gathered requirements and developed test plans to ensure load-testing objectives were met.
- Worked extensively on VuGen scripts in the Web, Mobile, and Web Services (SOAP) protocols in LoadRunner; simulated virtual users and transactions and emulated user think time.
- Developed LoadRunner scripts using VuGen for single-user, baseline, and soak scenarios, storing dynamically varying object IDs in parameters and validating correct download of the HTML page by checking its content (a content-check sketch follows this section).
- Configured the LoadRunner Controller and Load Generators and executed performance tests for multiple cycles of test scripts. Developed and implemented load and stress tests with LoadRunner and presented performance statistics to the application teams.
- Assisted with application tuning and infrastructure capacity requirements to support high-volume peak periods of traffic.
- Created different scenarios for each application and executed them in ALM. Wrote SQL queries, identified which queries were taking too long, optimized them through database performance tuning, and worked with the database administrator to index the database to improve application performance.
- Uploaded scripts, created timeslots and scenarios, maintained scripts, and ran load tests in Performance Center. Analyzed test results including response time, transactions per second, and throughput graphs. Monitored the application server through Analysis.
- Analyzed various graphs from LoadRunner Analysis and communicated bottleneck issues to the system administrator.
- Installed JMeter, set up the environment, and configured the HTTP Header Manager.
- Performed baseline, stress, and high-volume user tests using JMeter; monitored load test performance on the system and measured database response time, HTTP requests, login, and the proxy server.
- Created a load test for the "Add Cart" online shopping web service and established a JDBC connection pool for the secure service using JMeter.
- Installed and configured the application with profiling tools such as VisualVM and JConsole, and monitored Linux resources during load tests, finding bottlenecks and resolving issues on Linux servers using different monitors.
- Monitored resource utilization such as CPU usage and percentage of memory occupied (via vmstat and iostat), JVM threads, system processing time, and latency on Linux.
- Responsible for collecting JVM heap usage and garbage collection frequency in WebSphere during test execution.
- Communicated with team members to discuss test reports. Involved in walk-through and meetings with the Performance Test team.
Environment: LoadRunner, ALM Performance Center, QC, JMeter, Device Anywhere, J2EE, JProbe, JSP, C++, Bugzilla, JVM, Linux, WebSphere, IBM HTTP Server, Oracle, Wily Introscope.
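Illustrative sketch (not project code): a minimal VuGen (C) fragment showing the HTML content validation mentioned above, failing the transaction when the expected text is missing. The URL and search text are hypothetical placeholders.

    Action()
    {
        /* Register the text check before the request it applies to */
        web_reg_find("Text=Order Summary",
                     "SaveCount=FoundCount",
                     LAST);

        lr_start_transaction("Load_Order_Page");

        web_url("orders",
                "URL=http://app.example.com/orders",
                LAST);

        /* FoundCount holds how many times the text appeared */
        if (atoi(lr_eval_string("{FoundCount}")) == 0) {
            lr_error_message("Expected content not found on orders page");
            lr_end_transaction("Load_Order_Page", LR_FAIL);
        } else {
            lr_end_transaction("Load_Order_Page", LR_PASS);
        }

        return 0;
    }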
Confidential, McLean, VA
Performance Engineer
Responsibilities:
- Communicated with business, product managers, and developers. Involved in preparing the test plan, business use case scenarios, test data, and QA project estimates.
- Scripted in the Oracle (2-tier), Web (HTTP/HTML), and Web Services (SOA) protocols (a SOAP request sketch follows this section).
- Ensured the environment setup, test data, and resources were in place to complete the QA cycle. Wrote VuGen scripts for complex business scenarios and configured and ran tests using LoadRunner.
- Configured and used testing and monitoring tools; analyzed data, created reports and recommendations, and communicated test results.
- Recommended JVM tuning by adjusting settings such as heap memory, garbage collection (GC), and other optimization parameters.
- Identified functionality and performance issues, including deadlock conditions, database connectivity problems, and system crashes under load.
- Planned and executed performance, load, volume, and capacity testing. Uploaded scripts, created test schedules and scenarios, and ran load tests in Performance Center.
- Identified root causes of system bottlenecks. Analyzed performance test results and communicated reports and recommendations.
- Worked with scripts in the ALM project and submitted defects. Performed functional and performance verification of web services against web applications and JDBC data sources using SoapUI.
- Coordinated with the systems engineering team and improved the performance test architecture. Wrote scripts in UNIX shell, C, XML, and HTML.
- Worked with file manipulation for log analysis; measured CPU and memory on UNIX and Linux systems using commands such as perfmon, sar, top, and vmstat.
- Communicated daily and weekly status reports to management and project stakeholders. Worked in a team environment and closely with developers.
Environment: J2EE, WebLogic, JVM, VMware, UNIX, Oracle 10g, LoadRunner, Performance Center, Quality Center/ALM, SoapUI, ClearQuest, Doors, DB2, Rapid SQL, SQL Server.
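Illustrative sketch (not project code): a minimal VuGen (C) fragment posting a SOAP request with web_custom_request, as in the web services scripting referenced above. The service URL, SOAPAction value, and message body are hypothetical placeholders.

    Action()
    {
        /* SOAP 1.1 requests carry the operation in a SOAPAction header */
        web_add_header("SOAPAction", "\"GetAccount\"");

        lr_start_transaction("GetAccount_SOAP");

        web_custom_request("GetAccount",
            "URL=http://services.example.com/AccountService",
            "Method=POST",
            "EncType=text/xml; charset=utf-8",
            "Body=<soapenv:Envelope xmlns:soapenv=\"http://schemas.xmlsoap.org/soap/envelope/\">"
                 "<soapenv:Body><GetAccount><id>{AccountId}</id></GetAccount></soapenv:Body>"
                 "</soapenv:Envelope>",
            LAST);

        lr_end_transaction("GetAccount_SOAP", LR_AUTO);
        return 0;
    }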
Confidential, Columbus, OH
Performance Tester / Software Test Engineer
Responsibilities:
- Planned, designed, executed, and evaluated performance tests of web applications and services and ensured optimal application performance using LoadRunner.
- Performed load and stress tests using LoadRunner and communicated performance reports.
- Provided recommendations to the application owner on steps to meet performance goals.
- Maintained automated test environments and scripts for performance tests. Analyzed average response time, TPS, throughput, and web page breakdown graphs for each scenario.
- Worked with the development team to ensure testing issues were resolved on the basis of defect reports. Monitored the application server and database server. Analyzed server access logs to debug application performance issues.
- Responsible for script creation (VuGen), execution (Controller), and analysis/reporting. Created, correlated, and parameterized scripts using the Web and Web Services (with and without a WSDL) protocols (a status-check sketch follows this list).
- Modified run-time settings during execution to emulate a real user by changing browser versions, think time, pacing, content caching, action blocks, etc.
- Monitored PerfMon counters and Windows resources such as CPU, memory, and threads.
- Verified "online system and functionality" tests on various platforms. Performed end-to-end and web testing and enhanced scripts using Quick Test Pro.
- Developed code for functional and regression testing using data-driven automation techniques and non-record/playback frameworks.
- Managed requirements using Quality Center. Performed smoke, integration, functional, performance, regression, and backend testing. Participated in QA team meetings and walk-throughs for weekly QA testing reviews.
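Illustrative sketch (not project code): a minimal VuGen (C) fragment that checks the HTTP status code of the last response and passes or fails the transaction accordingly, in the spirit of the verification described above. The URL and transaction name are hypothetical placeholders.

    Action()
    {
        int status;

        lr_start_transaction("Search");

        web_url("search",
                "URL=http://app.example.com/search?q=laptops",
                LAST);

        /* HTTP_INFO_RETURN_CODE holds the status of the last response */
        status = web_get_int_property(HTTP_INFO_RETURN_CODE);
        if (status != 200) {
            lr_error_message("Search returned HTTP %d", status);
            lr_end_transaction("Search", LR_FAIL);
        } else {
            lr_end_transaction("Search", LR_PASS);
        }

        return 0;
    }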
Confidential, Bellevue, WA
QA Test Analyst
Responsibilities:
- Wrote and executed performance and load scripts for benchmarking the base WebSphere Portal using HP/Mercury LoadRunner 8.1.
- Used LoadRunner to measure transaction response time, network delay, and throughput, and measured CPU and memory on UNIX and Linux systems.
- Participated in all aspects of the product life cycle, such as web releases, peer reviews, requirements, designs, and test procedures.
- Assigned the tasks of analyzing test data, recording test performance, tracking software defects, and monitoring the installation and configuration of equipment in the system.
- Recorded and debugged test scripts using multiple actions in LoadRunner.
- Developed and parameterized Vuser scripts for web and database virtual users using LoadRunner.
- Tested the application successfully with up to 10,000 concurrent users (a rendezvous sketch for synchronizing concurrent users follows this section).
- Interacted with business analysts and developers on bug fixes and problem resolution and gathered data for the test cases. Performed web testing to verify broken links, web objects, frames, tables, and HTML content using QTP.
- Designed and developed Confidential web services. Performed functional, integration, and regression testing using Quick Test Pro.
- Performed UI, compatibility, and usability testing with different handsets and their features, and created automated mobile tests using the Mobile Cloud for QTP.
- Worked in mobile ticketing technology for the distribution of travel vouchers and coupons. Worked with third-party tools for managing flight and hotel itineraries.
- Maintained and executed test cases and test scripts using Quality Center. Performed database testing and wrote SQL queries for MS SQL Server 2000. Worked closely with the test lead during the Software Testing Life Cycle (STLC).
- Reviewed and Summarized Business Requirement Specification (BRS), Software Requirement Specification (SRS) and User Requirement Document (URD).
- Developed Test Plan and Test Approach artifacts with resource requirements and time estimates, and verified content with team members.
- Designed and developed Test Scenarios, Test Cases, and Test steps for various Business Services/methods covering both Positive and Negative testing requirements.
- Involved in the project from the requirement analysis phase till the completion of UAT.
- Tested and cross-verified reports using SQL queries. Wrote SQL queries to test the database by retrieving, editing, and inserting data.
- Developed SQL scripts and designs related to application testing and wrote stored procedures to ensure database integrity. Utilized ClearQuest for tracking defects and generating reports. Communicated daily and weekly status reports to the team lead. Involved in walk-throughs and participated in defect meetings.
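Illustrative sketch (not project code): a minimal VuGen (C) fragment using a rendezvous point so that Vusers ramped up by the Controller hit a step simultaneously, the standard way to synchronize high-concurrency spikes such as the 10,000-user test noted above. The rendezvous, transaction, and form names are hypothetical placeholders.

    Action()
    {
        /* All Vusers wait here until the Controller releases them together */
        lr_rendezvous("checkout_spike");

        lr_start_transaction("Checkout");

        web_submit_data("checkout",
                        "Action=http://shop.example.com/checkout",
                        "Method=POST",
                        ITEMDATA,
                        "Name=cartId", "Value={CartId}", ENDITEM,
                        LAST);

        lr_end_transaction("Checkout", LR_AUTO);
        return 0;
    }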