Performance Test Engineer Resume
Detroit, MI
SUMMARY
- Over 7 years of experience in Software Quality Assurance and performance testing, including analysis and design of manual and automated tests for client-server and web-based applications.
- Extensive knowledge of the performance test life cycle for developing, maintaining, and supporting web and client-server applications.
- Working experience with Agile and Waterfall methodologies and with onsite/offshore business processes.
- Proficient in using test automation tools and testing scripts for web and client-server applications.
- Expertise in script parameterization and manual correlation.
- Expertise in major test automation and management tools such as HP LoadRunner, Performance Center, Quick Test Professional, JMeter, and Quality Center.
- Strong knowledge of SoapUI testing, with experience in functional web service tests and mocks for web services testing.
- Experience with the Dynatrace APM tool for user-experience analysis and for identifying and resolving application performance issues.
- Involved in integration, regression, and system testing for enhancements.
- Expertise in manual and automated correlation to parameterize dynamically changing values.
- Experienced in monitoring CPU, memory, network, hits/sec, and throughput while running baseline, performance, load, stress, and soak tests.
- Experience with AppDynamics, an application performance management tool, for detecting infrastructure bottlenecks and for real-time analytics.
- Expertise in tracking defects using tools such as Quality Center and ClearQuest.
- Experienced in detecting bottlenecks such as very high CPU usage and memory leaks and communicating with developers to optimize the code; working knowledge of TeamQuest and BMC.
- Strong SQL skills to select, manipulate, and summarize metrics captured from various databases.
- Responsible for ensuring that applications are properly planned to meet performance and scalability requirements; able to understand complex scenarios and problems.
- Expert in using diagnostic tools to monitor availability and performance risks; knowledge of pivot tables.
- Strong process and documentation skills for performance testing/engineering.
- Quick learner with respect to the latest technologies and best practices.
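The monitoring bullets above mention tracking hits/sec and throughput during test runs. As a minimal illustration of how those metrics are derived from raw transaction samples (the sample data and function name are invented for this sketch, not taken from any specific tool):

```python
def summarize(samples, window_seconds):
    """Summarize raw load-test samples over a measurement window.

    samples: list of (response_time_seconds, bytes_received) tuples,
    one per completed transaction in the window.
    """
    hits_per_sec = len(samples) / window_seconds
    throughput_bps = sum(b for _, b in samples) / window_seconds
    avg_response = sum(t for t, _ in samples) / len(samples)
    return hits_per_sec, throughput_bps, avg_response
```

Tools like LoadRunner Analysis report these same aggregates per interval; the sketch only shows the arithmetic behind them.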
TECHNICAL SKILLS
Automated Testing Tools: HP LoadRunner 8.0/9.5/11.0/11.5, Performance Center, JMeter, Quick Test Pro 9.2/9.5/11.0/11.5; monitoring tools such as SiteScope, Introscope, Dynatrace, and Perfmon.
Monitoring Tools: Splunk, OPNET, OpTier, ExtraHop, HP BAC, AppDynamics.
Operating Systems: MS Windows 98/2000/NT/XP/Vista/7, UNIX, Linux, and Solaris.
Productivity Tools: MS Office Suite, Microsoft Project
Bug Tracking Tools: Quality Center/Test Director, Bugzilla, JIRA, and ClearQuest
Scripting Languages: JavaScript, VBScript, UNIX shell scripting, and TSL
Web and Programming Languages: C, C++, C#, Java, SQL and PL/SQL, HTTP, TCP/IP, AWT, Swing, HTML, DHTML, CSS, XML, XSL, XSLT, WSDL, and web services
Servers and Protocols: WebSphere, WebLogic, Tomcat, JVM, and HTTP/HTML
RDBMS: DB2, SQL Server 2005/2008, MS Access, Oracle 10g
Version Control: ClearCase 7.1/8.0, Perforce, and TFS 12.0
PROFESSIONAL EXPERIENCE
Confidential, Detroit, MI
Performance Test Engineer
Responsibilities:
- Read and understood the performance-testing business requirements.
- Prepared performance test plans for each build.
- Verified the performance-testing checklist for each build.
- Executed sanity tests for each build release and sent the results to management.
- Ran SQL queries on the database server and exported the data into Excel sheets for load testing.
- Performed APM tuning of applications in development and of critical production applications, and provided third-level performance support.
- Utilized diagnostic and monitoring tools to measure, detect, isolate, and resolve performance issues found during application-development performance testing, including measuring, monitoring, and capturing required infrastructure and application performance metrics, logs, and reports.
- Advised management of any risks to availability, performance, or capacity.
- Responsible for establishing and implementing processes for reporting on application availability and end-user experience for critical applications.
- Set up the ramp-up, duration, and ramp-down for each script in the Controller.
- Ran multiple scripts in the Controller for long durations (24 to 72 hours with 2,000 to 3,000 users) and short durations (10 to 12 hours with 2,000 to 3,000 users), prepared hourly and daily reports in Excel or JRockit, and sent them to management.
- Periodically verified database tables by running SQL queries to compare results against script runs.
- Responsible for HP BAC and OPNET monitoring/alerting on end-user experience; verified log files of various servers to identify server bottlenecks and maintained application scorecards.
- Coordinated with the security team to resolve firewall issues when logging into Remote Desktop.
- Performed various load tests with different loads by setting up scenarios and adjusting run-time settings according to the given workload, utilizing diagnostic and monitoring tools.
- Attended meetings such as Agile stand-ups, weekly status meetings, and build-wise performance-testing meetings, and raised issues regarding application performance.
- Coordinated with performance team members, developers, manual testers, network team members, DBAs, and configuration team members to discuss performance issues.
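The bullets above mention running SQL queries on the database server and exporting the results to spreadsheets for load-test parameterization. A minimal sketch of that step, assuming a SQLite database and hypothetical table/column names (a real setup would use the Oracle/DB2 driver and the project's own schema):

```python
import csv
import sqlite3

def export_test_data(db_path, out_path):
    """Pull parameter data from a database and write it to a CSV file
    that a load-testing tool can consume as a parameter data file."""
    conn = sqlite3.connect(db_path)
    try:
        # Hypothetical table and columns used only for illustration.
        rows = conn.execute(
            "SELECT username, account_id FROM test_accounts").fetchall()
    finally:
        conn.close()
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["username", "account_id"])  # header row
        writer.writerows(rows)
    return len(rows)
```

LoadRunner and JMeter both read delimited parameter files of this shape (VuGen .dat files, JMeter CSV Data Set Config), so the export format transfers directly.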
Environment: LoadRunner, Performance Center, Quality Center, SoapUI, LoadRunner Analysis, HP Diagnostics, JavaScript, SQL, Java, J2EE, JSP, XML/XSLT, Oracle.
Confidential, Washington, DC.
Performance Tester
Responsibilities:
- Read and understood the performance requirements.
- Developed load test plans and load test strategies, and created user workflow diagrams in MS Visio.
- Installed LoadRunner 11.0 on Windows operating systems.
- Created test plans according to NFRs to meet business requirements.
- Developed Vuser scripts using VuGen 11.0 with the Web (HTTP/HTML) protocol based on the test case requirements.
- Responsible for implementing JMeter-based infrastructure, including architecting the load-testing infrastructure and hardware/software integration with LoadRunner.
- Wrote SQL queries with DML and DDL commands to fetch data for parameterizing the scripts in the Controller.
- Connected one Controller to multiple load generators in Performance Center.
- Adjusted screen resolutions to match the script before recording.
- Involved in developing various Vuser scripts using LoadRunner based on the user workflows.
- Independently develop JMeter test scripts according to test specifications/requirements.
- Worked with Security team to resolve firewall issues logging into Remote Desktop.
- Managed scripts through synchronization on the windows and performed error checking to make sure the scripts passed through all the screens.
- Performed various load tests with different loads by setting up scenarios and adjusting run-time settings according to the test case requirements.
- Responsible for all phases: planning, developing scripts, and executing scenarios in Performance Center.
- Analyzed the Transaction Summary report and graphs generated in the LoadRunner Analysis module.
- Analyzed CPU, memory, and heap usage using SiteScope, Perfmon, and IBM SA.
- Enhanced Vuser scripts by introducing timer blocks and by parameterizing the scripts to run for multiple users.
- Used the web_reg_find function to search for text and verify that the desired pages were returned during replay in Virtual User Generator.
- Used the web_reg_save_param function to correlate the scripts and capture dynamic data.
- Changed run-time settings such as pacing, think time, log settings, browser emulation, and timeout settings in VuGen and the Controller to simulate the scenarios.
- Actively attended meetings with other project team members to facilitate performance testing related questions and/or issues.
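The correlation bullets above refer to LoadRunner's web_reg_find and web_reg_save_param calls. As an illustrative VuGen action (C): the parameter name, boundaries, and URLs are hypothetical, and the fragment runs only inside LoadRunner, not as a standalone program.

```c
Action()
{
    /* Register a text check evaluated against the next server response */
    web_reg_find("Text=Welcome", LAST);

    /* Capture a dynamically generated session token from the next response;
       the left/right boundaries are hypothetical and must match the real HTML */
    web_reg_save_param("SessionToken",
                       "LB=name=\"sessionId\" value=\"",
                       "RB=\"",
                       "NotFound=warning",
                       LAST);

    web_url("login",
            "URL=http://example.com/login",
            "Mode=HTML",
            LAST);

    /* Replay the captured value in a later request instead of the recorded one */
    web_url("account",
            "URL=http://example.com/account?session={SessionToken}",
            "Mode=HTML",
            LAST);

    return 0;
}
```

Registration functions (the web_reg_* family) must be called before the request whose response they inspect, which is why both appear ahead of the first web_url.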
Environment: LoadRunner, Performance Center, Quality Center, SoapUI, Test Director, IBM SA, LoadRunner Analysis, HP Diagnostics, VBScript, .NET, JavaScript, SQL, Java, J2EE, JSP, IIS, XML/XSLT, Oracle, ERP, CRM.
Confidential, Scottsdale, AZ
QA Analyst
Responsibilities:
- Reviewed the Requirements documents and created the Test plan and test cases for functional testing.
- Conducted functional testing for ordering module.
- Conducted end-to-end UAT testing for the system.
- Interacted with the Developers and design team for defects and problem resolution
- Worked with transactions and validated the data by using SQL.
- Generated bug reports and test-case coverage reports for status meetings, and was involved in resource planning for test case coverage.
- Involved in Bug Review meetings and participated in weekly meetings with management team.
- Provided testing results and weekly status reports to the QA Manager
- Verified the Web screens against the mockups provided by client.
- Tested the SoapUI Web services.
- Performed regression testing by repeated execution of test cases in each build by logging defects in Defects tab in Quality Center.
- Validated the data on screens against the database
Environment: Java, Quality Center, MS Office, SoapUI, Oracle 10g, WebSphere 6.1.
Confidential
QA Tester
Responsibilities:
- Prepared the Iteration Test Plan, Test Cases, and Requirements Traceability Matrix (RTM) for the above project.
- Analyzed the user requirements by interacting with system architect, developers and business users.
- Analyzed the Functional requirements
- Conducted Integration and Regression Testing
- Performed manual testing on the criteria of the product qualification rules, locking and pricing to ensure the Application is stabilized
- Prepared Master and Detailed Test Plan.
- Assist in the setup of the Testing Environment
- Created test scripts for functionality, boundary, regression Testing.
- Prepared Test Data for system and Interface testing as per the specifications.
- Created Data Driven Tests to validate the same scenarios with different sets of test data
- Tracked defects in Mercury Quality Center and interacted with developers to resolve technical issues according to priority levels.
- Executed test cases on each build of the application and verified the actual results against requirements.
- Attended various meetings with the developers, clients, and the management team to discuss major defects found during testing, enhancement issues, and future design modifications.
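One of the bullets above describes data-driven tests that validate the same scenario with different sets of test data. A minimal sketch of that pattern, with a hypothetical scenario function and invented test data (a real suite would read the rows from a prepared .csv file):

```python
import csv
import io

# In practice these rows would come from a test-data file; invented here.
TEST_DATA = """amount,rate,expected_price
100,0.05,105.0
200,0.10,220.0
"""

def price_with_rate(amount, rate):
    # Hypothetical scenario under test: apply a pricing rate to an amount.
    return amount * (1 + rate)

def run_data_driven_tests(data_csv):
    """Run the same validation for every data row; return failing rows."""
    failures = []
    for row in csv.DictReader(io.StringIO(data_csv)):
        actual = price_with_rate(float(row["amount"]), float(row["rate"]))
        if abs(actual - float(row["expected_price"])) > 1e-9:
            failures.append(row)
    return failures
```

The same idea underlies QTP/UFT data tables and JMeter CSV Data Set Config: the script stays fixed while the data file supplies the variations.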
Environment: Quality Center, J2EE, Java, Servlets, JSP, EJB, WebSphere 5.x, Windows XP, Oracle 9.0, and SQL.
