
Performance Test Lead Resume


Lanham, MD

SUMMARY

  • Experienced in the entire QA Life Cycle, including designing, developing, and executing the QA process and documenting test plans, test cases, test procedures, and test scripts.
  • Highly experienced with Software Development Life Cycle (SDLC) and various testing methodologies.
  • Experienced in Agile/Scrum and Waterfall Development Methodology.
  • Developed, implemented, and performed QA Manual and Automated test scripts.
  • Experienced in performing smoke testing, sanity testing, integration testing, functional testing, regression testing, system testing, and acceptance testing on Win/Web based and mainframe applications.
  • Experienced on IBM/Unisys mainframe systems for IRS tax return applications.
  • Working experience with test management, defect tracking, and automation tools (IBM RQM, Quality Center/ALM, QuickTest Pro/UFT, LoadRunner, and Performance Center).
  • Evaluated Non - Functional Requirements documents and identified performance test needs; types of tests, scripting scenarios and volume projections.
  • Developed Performance test plan/strategy documents covering short- and long-term test objectives, types of tests required, workload matrix, scenarios, etc.
  • Experienced in peer review of test cases and preparing test reports.
  • Testing experience with multiple platforms, web servers/application servers, databases, web technologies, object oriented programming languages.
  • Expert knowledge in LoadRunner scripting using Web HTTP/HTML, Web Service and Ajax True Client protocols.
  • Experienced in bug reporting and tracking using test management tools such as HP Quality Center/ALM and IBM Collaborative Lifecycle Management (CLM).
  • Experienced in developing and executing test scripts using automation tools such as QTP/UFT, LoadRunner, Performance Center, JMeter, and Selenium.
  • Good knowledge of analyzing LoadRunner results and working closely with cross-functional technical teams to fix performance bottlenecks.
  • Coordinated with testing services and tools team to install Dynatrace agents on the Performance environments to identify the bottlenecks.
  • Working knowledge of analyzing heap dumps and thread dumps.
  • Experience in on-site / offshore model and communications.
  • Experienced in building performance test strategies and frameworks based on the planned scope of performance testing.
  • Developed test execution scenarios for various types of tests such as load, stress, endurance, and run tests.
  • Performed heap and thread analysis, database deadlock detection, and resource contention detection.
  • Documented test results and developed custom summary reports to satisfy both technical and non-technical stakeholders.
  • Used DynaTrace to diagnose and troubleshoot application performance issues.
  • Expert in the Windows Typeperf and Perfmon utilities: created custom config files, collected Windows resource statistics remotely, and generated reports with PAL.
  • Experienced in using the vmstat, sar, and topas utilities and System Monitor to measure Unix system performance under load.
  • Experienced in identifying memory leaks and Java heap/garbage collection issues in WebSphere application servers.
  • Extensive experience creating and executing batch files and UNIX shell scripts.
  • Knowledge of testing front-end web technologies based on JavaScript, DHTML, CSS, and HTML.
  • Proficient in identifying performance bottlenecks and end-to-end performance measures (server response time, throughput, network latency, etc.).
  • Experienced in load testing, scenario creation and execution, measured throughput, hits per second, response time, and transaction time using LoadRunner Analysis.
  • Experienced in using MS SharePoint for business collaboration.
  • Extensive experience with the MS Office suite for creating and analyzing reports and graphs.
  • Experience with writing SQL/PL-SQL scripts for test automation.
  • Expert in creating SQL queries against Oracle and MS SQL server.
  • Experienced in Service Oriented Architecture (SOA) testing and Web Services testing.
  • Self-motivated, able to work independently with moderate/minimal guidance.
  • Ability to work with minimum documentation and minimum supervision.
  • Proven ability to meet deadlines; able to form, build, and manage effective teams.
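For illustration only, the kind of result analysis described above (throughput, average and 90th-percentile response time, as reported by LoadRunner Analysis) can be sketched in Python. The timing samples below are invented; a real run would read them from the tool's raw results export.

```python
import math

def percentile(samples, pct):
    """Nearest-rank percentile of a list of response times (seconds)."""
    ordered = sorted(samples)
    rank = max(1, math.ceil(pct * len(ordered) / 100))
    return ordered[rank - 1]

def summarize(samples, duration_s):
    """Summary statistics a load-test report typically includes."""
    return {
        "avg_s": sum(samples) / len(samples),
        "p90_s": percentile(samples, 90),
        "throughput_tps": len(samples) / duration_s,  # transactions per second
    }

# Invented sample: 10 transaction timings captured over a 5-second window.
timings = [0.8, 1.1, 0.9, 2.4, 1.0, 0.7, 1.3, 0.9, 3.1, 1.2]
stats = summarize(timings, duration_s=5)
```

The 90th percentile is usually the headline SLA number because it reflects what most users experience while ignoring the worst outliers.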

PROFESSIONAL EXPERIENCE

Confidential, Lanham, MD

Performance Test Lead / QA Test Engineer

Environment: IBM RQM V.6.0.6, VuGen V.12.50, Performance Center V.12.50, Riverbed OPNET AppInternals Xpert (AIX) V.10, SoapUI, Java, JavaScript, Internet Explorer, Chrome, WebSphere, JBoss EAP V.6.0, MS Office 365, Oracle 12c, Red Hat Linux V.7, Unisys Mainframe, IBM Mainframe (JCL, TSO/ISPF, File Manager), DB2, IBM COBOL, IBM Rational Functional Tester, IBM Rational Performance Tester, DevOps, Microsoft Visio, Microsoft Project, C, C#, C++, VB.NET, and MicroFocus InfoConnect.

Responsibilities:

  • Performed Performance Testing for Foreign Accounts Tax Compliance Act (FATCA) applications- Financial Institution (FI), Qualified Intermediary (QI) and International Compliance Management Model (ICMM).
  • Reviewed and analyzed documents for Performance Testing, including Unified Work Requests (UWR), Interface Control Diagrams (ICD), Business System Reports (BSR), Design Specification Reports (DSR), Computer Operator Handbooks (COH), and Solution Architecture (SA).
  • Evaluated Non-Functional Requirement documents and identified performance test needs: types of tests, scripting scenarios, volume projections, etc.
  • Developed test plan, test strategies, test cases and test procedures from system/software requirement specifications and business requirements.
  • Created test scenario, test data, and traversal flow for test scripting and provided traversal flow to scripting team to develop the LoadRunner script.
  • Provided test scenario and test data for load test and capacity test to scripting team to execute the test.
  • Coordinated with the EOPs SA and AppInternals Xpert (AIX) team to install AIX agents on the servers and provided server metrics to the AIX team to create the AIX dashboard.
  • Monitored load and capacity tests using the Riverbed OPNET AppInternals Xpert (AIX) V.10 monitoring tool to observe server CPU, memory, and heap utilization, analyzed the AIX results, and provided recommendations to stakeholders.
  • Developed a Performance test plan/strategy document that includes short- and long-term test objectives, types of tests required, workload matrix, scenarios, etc.
  • Monitored average response times and error logs using LoadRunner, analyzed the LoadRunner test results, and provided the analysis to the internal team and stakeholders.
  • Planned and executed performance tests for large-scale applications handling high volumes of transactions and users with stringent SLAs.
  • Tested applications hosted on the AWS cloud, on premises, and on mainframes.
  • Prepared test plans, test cases, and test data for performance testing of UIs, APIs, batch processes, databases, and ETL.
  • Performed tests using tools such as LoadRunner, Performance Center, and JMeter; monitored applications and infrastructure using tools such as CloudWatch and Splunk.
  • Working experience in SOAP/REST API Web Services testing.
  • Experience in the functional and nonfunctional testing of APIs and web services using REST protocols.
  • Experience and demonstrated ability running tests using the SoapUI.
  • Worked with Operations, DBAs and Developers to help solve performance issues in the application, network, and environment and provided analysis of performance test results, bottlenecks, sizing guidelines, including deadlock conditions, database connectivity problems, and system crashes under load, and recommended improvements.
  • Prepared Test Readiness Review (TRR) Checklist, TRR Agenda and TRR Findings Memorandum, developed and updated Work Breakdown Structure (WBS) for performance testing, updated Weekly Test Status Report (TSR) Snapshot/Narrative.
  • Participated in walkthroughs/meetings/reviews/coordination efforts. Provided input/participated in weekly status meetings.
  • Developed project folder checklist and created project folders in the SharePoint site and DOCIT location and uploaded all the project documents in the SharePoint and DOCIT.
  • Created test project, tester task and test plan, test cases and link test cases to requirements, create test scripts and generate test case execution records (TCERs), and created Requirements Traceability Verification Matrix (RTVM) using IBM RQM V6.0.6.
  • Created daily recaps, test logs, and KISAM logs for each test execution and sent the daily recap to stakeholders.
  • Created and used SQL queries to monitor ICMM file submissions in the backend, validate the submissions, and capture processing times for each submission, and analyzed processing times across thousands of file submissions.
  • Prepared the end-of-test status report and lessons learned, and participated in closeout activities.
  • Performed tasks associated to Systems Acceptance Testing (SAT), working with team members to test BMFDOCS changes in support of the “Tax Reform” initiative.
  • Reviewed data contained in the Unisys Test Input files, identified specific data elements and Processes within the test input files, and annotated that information into File Tracker documents.
  • Reviewed Functional Specification Packages (FSP) and Unified Work Requests (UWR) to determine the test cases required to support tax code changes.
  • Perform and assist with quality assurance and software testing activities including unit testing, integration testing, functional testing, performance testing, and load testing to ensure the quality of products and processes.
  • Document test cases into Requirement Traceability Verification Matrix (RTVM) documents, converting that information into viable testing blocks within the Unisys Test Input files.
  • Conducted Peer Reviews of team members' work in support of UWR requirements: reviewed FSP and UWR changes for completeness/accuracy, ensured RTVMs reflected the necessary test cases, and verified the information was transposed correctly into the Unisys Test Input files.
  • Setup the Unisys Test Input files using Input Record layouts to process into the Generalized Mainline Framework (GMF) runs and review the outputs (TRANS and WORK records) for expected test results. Additionally, access the Error Resolution System (ERS) for the records expected to fail, and modify them to complete the process and verify the corrections were successfully processed.
  • Improve efficiency and production of testers by developing Process Improvements, standardizing Test Input files, and ensuring all Processes are represented by a test block. Capturing that information into complete “canned” RTVM documents.
  • Performed Systems Acceptance Testing (SAT) for Individual Master File (IMF), CADE2 Individual Tax Processing Engine (ITPE) and Business Master File (BMF) application.
  • Reviewed and analyzed requirements (Epic Scenarios) and developed test cases and the RTVM for the IMF ITPE application.
  • Working experience using flat files and/or VSAM files.
  • Experience accessing and manipulating various file structures (i.e. Sequential and VSAM) within software applications.
  • Experience conducting meetings with team members on application development activities.
  • Working experience on analyzing, designing, and testing the use and implementation of data archival and storage management systems.
  • Experience working on communicating project status to management.
  • Created data, test procedures and predetermined results.
  • Process test data and review output (using the expected results).
  • Provided individual status in daily scrum calls for assigned work items.
  • Participated in Agile Ceremonies (Daily Scrum, End of Sprint Review, Interval/Sprint Planning and Sprint Retrospective) activities for ITPE project.
  • Reviewed and analyzed documents, e.g. Program Requirement Package (PRP), Control Record Layout (CRL), Computer Operator Handbook (COH), 6209, and Appendix, for CADE 2 ITPE Systems Acceptance Testing (SAT).
  • Working experience on creating Work Request Analysis Form (WRAF) for CADE 2 ITPE project.
  • Working experience on creating, modifying SCRS (Service Center Replacement System) transactions input files using IBM File Manager.
  • Working experiences on mainframe file data, Job Control Language (JCL), run logs for IMF ITPE application.
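A minimal sketch of the SQL-based monitoring described in this role (flagging file submissions whose backend processing time breaches an SLA). The table, column names, 60-second threshold, and data are all invented for illustration; the real ICMM schema differs, and SQLite stands in for the actual database.

```python
import sqlite3

# In-memory stand-in for the backend database being monitored.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE submissions (
        file_id      TEXT PRIMARY KEY,
        received_at  REAL,   -- epoch seconds when the file arrived
        processed_at REAL    -- epoch seconds when processing finished
    )
""")
conn.executemany(
    "INSERT INTO submissions VALUES (?, ?, ?)",
    [("F001", 0.0, 12.5), ("F002", 1.0, 95.0), ("F003", 2.0, 30.0)],
)

# Flag any submission whose processing time exceeds a hypothetical 60 s SLA.
slow = conn.execute("""
    SELECT file_id, processed_at - received_at AS elapsed_s
    FROM submissions
    WHERE processed_at - received_at > 60
    ORDER BY elapsed_s DESC
""").fetchall()
```

Run repeatedly during a test window, a query like this turns raw submission rows into a per-file processing-time report for the daily recap.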

Confidential, Baltimore, MD

Software QA Analyst

Environment: HP ALM, VuGen 12.0, Performance Center 12.0, DynaTrace, UFT 12.0, SoapUI, Java, JavaScript, VBScript, HTML, Internet Explorer, Chrome, WebSphere, iPlanet, Aternity 7.0, MS Office, Oracle 11g, Windows Server 2010, Solaris, Windows 7.

Responsibilities:

  • Developed VuGen Script, used manual correlation and auto correlation technique, set-up run-time setting, created scenario, analyzed the results to find out bottlenecks and root cause using HP VuGen and Performance Center.
  • Planned, created, modified, and executed performance test scripts as needed.
  • Executed various kinds of tests based on requirements from business (baseline, load, and soak tests).
  • Experience in performing manual testing, system testing, integration testing, module testing, sanity testing and regression testing on Win/Web based applications.
  • Develop and execute test strategies, test cases and test scripts for Web applications, Client/server applications and Web services.
  • Participate in planning all testing activities accordingly to ensure deliverables are met on time; including test planning, test execution, defect tracking, and reporting.
  • Work closely with senior test team personnel, development, and business analysts to develop test cases.
  • Ensure adherence to QA process best practices including defect management, test case execution and metrics reporting, requirements traceability, component-based test design.
  • Execute manual test case procedures utilizing HP Quality Center/ALM.
  • Work in an agile team comprised of application developers, QA engineers, product owners and architects.
  • Performed Service oriented architecture (SOA) testing and Web Services testing using SOAPUI.
  • Used DynaTrace to diagnose and troubleshoot application performance issues.
  • Experience working in the agile development methodology.
  • Helped and mentored other testers in testing and Best Practices.
  • Designed, defined and documented unit and application test plans.
  • Transformed test plans into test scripts and executes those scripts.
  • Involved in test optimization through systemic improvements in process and quality.
  • Worked with operations, DBAs and developers to help solve performance issues in the application, network, and environment.
  • Provided analysis of performance test results, bottlenecks, sizing guidelines, and recommend improvements.
  • Identify functionality and performance bottlenecks, including deadlock conditions, database connectivity problems, and system crashes under load.
  • Generated automated test scripts in VBScript using HP QuickTest Professional.
  • Developed QuickTest Pro VBScript test scripts in programming mode; administered and maintained the scripts.
  • Created automation test framework using UFT and maintained test scripts.
  • Developed automation script using UFT and executed in UAT environment for end user performance testing using Aternity monitoring tool.
  • Used InfoVista network monitoring tool for network utilization.
  • Used ALM/QC to organize and manage all phases of the software testing process, including planning tests, executing tests, and tracking defects.
  • Created and performed maintenance of object repository using UFT for functional, sanity, and regression testing.
  • Experience in working with XML and automated testing scripting with VBScript.
  • Developed SQL Queries and procedures to perform database testing.
  • Developed automation scripts using VB script.

Confidential, Seattle,WA

Software Test Engineer

Environment: Quality Center 10.0, LoadRunner 11.0, Performance Center 11.0, MS Office, Oracle 11g, Windows Server 2010, WebLogic, Solaris, Java, C#, .NET, ASP.NET, ADO.NET, SOAP, WCF, WPF, Windows 7, Visual Studio Ultimate 2012, MS SQL Server, .NET Framework, MSMQ, FrontPage, PHP, SQL, and PL/SQL.

Responsibilities:

  • Interfaced with business analysts to define performance requirements.
  • Identified and troubleshot performance bottlenecks using LoadRunner.
  • Proficient in LoadRunner VuGen scripting, with extensive use of correlation libraries.
  • Wrote custom LoadRunner code, including error handling, to meet client requirements.
  • Executed various kinds of tests based on requirements from business (baseline, load, and soak tests).
  • Created various Scenarios to be run, used rendezvous points to create intense Virtual User (Vuser) load on the server, configured the scenarios before executing them in the LoadRunner.
  • Analyzed the results of the tests that were used to assist in the identification of system defects, bottlenecks and breaking points using LoadRunner.
  • Conducted End-to-End, manual testing of the system and prepared and managed test cases using Quality Center.
  • Used Quality Center/ALM to track, report, and manage defects throughout the test cycle and attended the Defect Status Meeting on a daily basis during the testing cycle.
  • Identified performance issues and performed tuning in collaboration with Dev/DBA partners.
  • Created and executed SQL query scripts for Database Testing.
  • Involved in test case walkthrough and assessment meetings.
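The rendezvous-point technique mentioned above (all virtual users pause, then fire their transaction simultaneously to create a load spike) can be sketched with a thread barrier. This is a conceptual illustration, not LoadRunner itself; the vuser count and the stubbed transaction are invented.

```python
import threading
import time

N_VUSERS = 5
rendezvous = threading.Barrier(N_VUSERS)   # LoadRunner-style rendezvous point
timings = []
lock = threading.Lock()

def transaction():
    # Stand-in for the real request the vuser would send under test.
    time.sleep(0.001)

def vuser():
    rendezvous.wait()                      # block until all vusers arrive
    start = time.perf_counter()
    transaction()                          # released together: load spike
    elapsed = time.perf_counter() - start
    with lock:
        timings.append(elapsed)            # per-vuser transaction time

threads = [threading.Thread(target=vuser) for _ in range(N_VUSERS)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

Without the barrier, vuser start-up jitter spreads the requests out; with it, the server sees true concurrent arrivals, which is what stresses locking and connection pools.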

Confidential, Inwood, New York

Performance Tester

Environment: Oracle 10g, LoadRunner 11.0, Quality Center 10.0, MS Office, MS Excel, Windows Server 2008, Java, J2EE, VBScript, JavaScript, WebSphere, AIX, Performance Center, JIRA.

Responsibilities:

  • Gathered all the performance test requirements from various stakeholders & other business personnel.
  • Collaborate with business analysts, system engineers, network, development, and other teams to understand and validate configuration settings to ensure optimal product performance, scalability and availability for production environments.
  • Understand the application architecture and the scope for performance testing.
  • Allocated work amongst resources both at offshore & onsite.
  • Developed and updated test cases in Quality Center.
  • Tested enhancements and bug fixes for software releases and patches and documented findings using Quality Center.
  • Responsible for definition, documentation, and execution of all performance test plans necessary to ensure the release of a fully validated system.
  • Performed all activities in the performance testing life cycle.
  • Responsible for designing and implementing performance tests for highly scalable web-based systems.
  • Used JIRA to track, report, and manage defects throughout the test cycle and attended defect status meetings on a daily basis during the testing cycle.
  • Worked in an ETL tool to extract data from Oracle, ASCII files, XML, and MS Excel.
  • Worked closely with product management and development engineers to conduct performance, load and stress tests.
  • Created performance scripts & scenarios using LoadRunner.
  • Developed VuGen Script, used manual correlation and auto correlation technique, set-up run-time setting, created scenario, analyzed the results to find out bottlenecks and root cause using HP LoadRunner.
  • Involved in tuning application to improve response times, queues and overall performance using LoadRunner.
  • Created and executed SQL query scripts for Database Testing.
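An illustrative sketch of the extraction step in the ETL work described above: pulling records from an XML feed and an ASCII (CSV) file into one normalized list. The field names and sample data are invented, and the stdlib stands in for the unnamed ETL tool.

```python
import csv
import io
import xml.etree.ElementTree as ET

# Invented source data: one XML feed and one ASCII/CSV file.
xml_feed = "<claims><claim id='A1' amount='120.50'/><claim id='A2' amount='75.00'/></claims>"
ascii_feed = io.StringIO("id,amount\nB1,33.25\n")

records = []

# Extract from XML: each <claim> element becomes one record.
for claim in ET.fromstring(xml_feed):
    records.append({"id": claim.get("id"), "amount": float(claim.get("amount"))})

# Extract from the ASCII file: each CSV row becomes one record.
for row in csv.DictReader(ascii_feed):
    records.append({"id": row["id"], "amount": float(row["amount"])})

total = sum(r["amount"] for r in records)  # control total for reconciliation
```

The control total is the usual validation hook: the same sum computed at the source and the target should match after the load.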

Confidential, Hartford, CT

Software QA Analyst

Environment: QuickTest Pro, Load Runner, Quality Center, Oracle, SQL Server, Visual Basic, Windows 7/XP/Vista, HTML, Internet Explorer.

Responsibilities:

  • Generated test scenarios and procedures based on architecture, use cases, requirements and documentation working closely with Test Lead and System Engineer.
  • Responsible for the development of test data to be used in performing the required tests.
  • Used Quality Center for test-scenarios as per the functionality and business requirements specified in the use-cases and design specifications.
  • Performed complex functional, integration, and regression tests on multiple software products / product areas to validate links, objects, images and text on Web pages using QTP.
  • Created and performed maintenance of object repository using QTP for functional, sanity, and regression testing.
  • Created, enhanced, and maintained high-end test scripts for various functional and regression tests using QTP.
  • Generated automated test scripts in VBScript using HP QuickTest Professional.
  • Developed QuickTest Pro VBScript test scripts in programming mode; administered and maintained the scripts.
  • Created Configuration folders on the Server for QTP Environment Setup.
  • Created Libraries, Test Scripts, and Database Connectivity Scripts using QTP, using different types of actions, such as reusable actions and external actions, within the scripts.
  • Configured settings, record and run settings, options and web event record configurations.
  • Maintained and administered the objects in a common repository.
  • Created automation test framework using QTP and maintained Test Scripts.
  • Created connections between Quality Center and QuickTest Pro.
  • Created automated test labs in Quality Center and selected scripts from QuickTest Pro.
  • Ran QTP automated scripts every night and stored the results in Quality Center.
  • Developed some of the automated test scripts using Descriptive Programming techniques.
  • Generated Data Driven scripts that access the backend database.
  • Created various Scenarios to be run, used rendezvous points to create intense Virtual User (Vuser) load on the server, configured the scenarios before executing them in the LoadRunner.
  • Developed automated test scripts for load, performance, and stress testing using LoadRunner.
  • Created and executed SQL query scripts for database testing.
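The data-driven pattern used in the QTP scripts above (each row of a data table drives one test iteration against the application, with results tallied for the run report) can be sketched in Python. The data table, the `simple_interest` function standing in for the system under test, and the field names are all invented for illustration.

```python
import csv
import io

# Stand-in for a QTP data table: inputs plus the expected outcome per row.
data_table = io.StringIO(
    "principal,rate,years,expected_interest\n"
    "1000,0.05,2,100\n"
    "2000,0.03,1,60\n"
    "500,0.10,3,150\n"
)

def simple_interest(principal, rate, years):
    # Hypothetical application logic under test.
    return principal * rate * years

results = []
for row in csv.DictReader(data_table):
    actual = simple_interest(float(row["principal"]), float(row["rate"]),
                             int(row["years"]))
    results.append(abs(actual - float(row["expected_interest"])) < 1e-9)

passed = sum(results)  # number of iterations matching the expected value
```

Keeping inputs and expected values in the table means new test cases are added as data rows, with no change to the script itself.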
