Performance Test Engineer - Lead Resume
Washington, DC
SUMMARY
- 10+ years of diversified experience in Software Testing, Test Automation, Quality Assurance, and IT/Network/Systems. Currently specialized in Performance Testing.
- Microsoft Certified Professional/Six Sigma Greenbelt. Domain experience includes HealthCare, Hospitality, Finance and Telecommunications.
- Developed Performance Load/Stress Test scripts in CloudTest Lite and LoadRunner, executed the tests, and analyzed the results for internal QA in the Healthcare domain.
- Tested Portal Authentication based on OAM 11g and Web 2.0 applications built with technologies such as Ajax, Adobe Flash, and interactive Web application frameworks such as Microsoft Silverlight.
- Highly experienced in writing Test Plans, Test Strategy, Test Scripts, and Test Scenarios from System/Software Requirement specifications according to Business Requirements.
- Utilized QTP and Selenium for automating test scripts.
- Specialized in developing Test Scripts using CloudTest Lite, Performance Center, and LoadRunner.
- Experienced in testing Web-Based/E-Commerce and Client/Server applications on different operating system environments, including Linux/UNIX and Windows NT/2003/2008/XP.
- Involved in all phases of Software Development Life Cycle (SDLC).
- Experienced in SDLC Methodologies including Agile, Waterfall, Iteration, RUP (Rational Unified Process), and Spiral, and Testing Methodologies such as Functional Decomposition, Keyword-Driven, and Test Plan-Driven.
- Experienced in developing Test Automation Framework, User Defined Functions Library with reusable VBScript functions.
- Proficient in Performance Testing and Automation on client/server applications, web-based applications and databases.
- Extensively experienced in GUI, Smoke, Integration, Functional, System, User Acceptance, Regression, Load/Stress, Performance, Black box, Back end, Positive and Negative testing.
- Over 7 years working in a leadership capacity within teams of 3-7 people.
- Extensive experience in bug tracking tools including TFS, JIRA, HP ALM 11.52, Quality Center, and monitoring tools including New Relic, HP Diagnostics, SolarWinds, Dell FogLight and SQL Profiler.
- Highly skilled in creating and executing test cases, writing test scripts from requirement documents and functional design documents using HP Quality Center.
- Strong experience in writing and executing complex SQL queries and PL/SQL statements for back-end testing.
- Worked on various technologies including Java/J2EE, Oracle, .NET, Web Methods and very good working knowledge of QTP object model, Java, .NET, Web add-ins and QTP test objects.
- Experienced in writing QTP Test Scripts using Custom Functions and Descriptive Programming, and in enhancing scripts using Parameterization, Synchronization, Correlation, Regular Expressions, Checkpoints, customized exception handling, and VBScript.
- Experienced in installing and configuring LoadRunner; performed stress testing of applications for various scenarios and analyzed the results to improve efficiency and scalability.
- Strong understanding of both wired and wireless networking technologies, as well as troubleshooting with regard to both wired and wireless networking issues.
- Experience supporting Microsoft Office applications.
- Knowledge of desktop support best practices.
- Experience with basic PBX/Telecom technologies, including Adds/Moves/Changes.
- Experience and understanding of basic low-voltage cabling termination and troubleshooting for voice and data networks.
- Strong understanding of Payment Card Industry (PCI) best practices with regard to Information Technology.
- Strong technical skills including PC and Networks, knowledge of Windows Server, Windows XP and Windows 7.
- Adaptable, working either as a team member or independently.
- Ability to proactively identify and resolve problems with minimal supervision.
TECHNICAL SKILLS
Testing Tools: CloudTest Lite, LoadRunner 12.02/Performance Center 11.52, ALM 11.52, UFT 12.02, Selenium
Bug Tracking Tools: ALM 11.52, JIRA, Quality Center, Pivotal Tracker, TFS
Languages: SQL; familiar with JavaScript, Ruby, C++, C#, Python, Perl
Web languages: HTML, XML
Scripting Languages: JavaScript, VBScript
Operating Systems: Windows NT/XP/7, Windows Server 2003 and 2008, Mac OS X, MS-DOS, and Linux/UNIX
Microsoft Suite: Visio, PowerPoint, Word & Excel
Software: Microsoft Office, Exchange, Backup Exec, SAP, Delphi, Dometic, Xeta, HSI
Databases: Oracle, SQL Server, MySQL, DB2, NoSQL (Couchbase, MongoDB, DynamoDB); ETL: Informatica
PROFESSIONAL EXPERIENCE
Confidential - Washington, DC
Performance Test Engineer - Lead
Responsibilities:
- Led implementation of an Enterprise Performance Testing Center comprising SOASTA CloudTest Lite, LoadRunner 12.02, an F5 Networks load balancer, SolarWinds server and application monitoring, and the Dell Foglight SQL DB performance analyzer.
- Mentored and trained new members as necessary.
- Developed Performance Load/Stress Test Plans and Performance Business Scenario Plans for QA testing.
- Developed, debugged, troubleshot, and executed performance test scripts and project files using enterprise Performance Stress Testing tools for testing interactive and batch processes.
- Analyzed Performance Test results to assist with identifying bottlenecks and recommended appropriate performance tuning measures such as adjusting heap size, thread pool size, garbage collection schemes, and F5 Load Balancer settings.
- Accountable for day-to-day performance test (PT) execution.
- Managed the performance test execution team.
- Prepared and led formal reviews of PT results with appropriate stakeholders.
- Approved the overall plan and activities.
- Worked with other leads to define weekly and future workload forecasts.
- Drove process improvements in the execution of performance test scenarios.
- Reviewed all test scenarios to be executed and conducted scenario review meetings.
- Reviewed all application release notes.
- Managed the UAT environment.
- Communicated and coordinated efforts to implement changes in the UAT environment.
- Ensured the proper usage and configuration of UAT monitoring tools.
- Performed Risk Analysis and recommended mitigation strategies.
- Managed the issue resolution process.
- Reviewed and participated in prioritizing issues.
- Worked with the appropriate subject matter experts to drive resolution of issues.
- Responsible for escalation of unresolved issues.
- Participated in monitoring issue resolution.
Environment: ASP.NET, CloudTest Lite, LoadRunner 12.02, UFT 12.02, Win2008, Win2003, Oracle OAM, Java, SQL server, F5 Networks, SolarWinds, Dell Foglight, JIRA/Confluence, TFS, Selenium
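The load/stress execution described in this role follows one core pattern regardless of the tool: spin up concurrent virtual users, record per-request response times, and report a percentile. A minimal, tool-agnostic Python sketch of that pattern, with a hypothetical `send_request` stub standing in for a real HTTP call:

```python
import random
import threading
import time

def send_request():
    """Hypothetical stand-in for a real HTTP call to the system under test."""
    time.sleep(random.uniform(0.01, 0.05))  # simulated server latency

def virtual_user(iterations, timings, lock):
    """One virtual user: issue requests in a loop and record response times."""
    for _ in range(iterations):
        start = time.perf_counter()
        send_request()
        elapsed = time.perf_counter() - start
        with lock:
            timings.append(elapsed)

def run_load_test(users=10, iterations=5):
    """Run concurrent virtual users and collect all response times."""
    timings, lock = [], threading.Lock()
    threads = [threading.Thread(target=virtual_user, args=(iterations, timings, lock))
               for _ in range(users)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    timings.sort()
    # 90th-percentile response time, a common pass/fail metric in load testing
    p90 = timings[int(0.9 * (len(timings) - 1))]
    return len(timings), p90
```

Tools like LoadRunner and CloudTest add scenario scheduling, protocol recording, and monitoring on top of this basic loop, but the measure-and-report-percentiles core is the same.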
Confidential - Sterling, VA
Sr. Performance Test Engineer
Responsibilities:
- Led implementation of an Enterprise-level Testing Center for Automation and Performance Testing comprising SOASTA CloudTest Lite and the Cucumber automation framework.
- Mentored and trained new members as necessary.
- Developed Performance Load/Stress Test Plans and Performance Business Scenario Plans for QA testing.
- Developed, debugged, troubleshot, and executed automated performance test scripts and project files using enterprise Performance Stress Testing tools for testing interactive and batch processes.
- Helped facilitate implementation of Ruby/Capybara/Selenium/Jenkins/Sauce Labs Automation Framework.
- Performed internal Load/Stress tests on QA Servers in an Agile environment.
- Contributed to developing qualitative and effective Test Scenarios, Test Cases and Test Plan in JIRA and PivotalTracker.
- Actively involved in the entire SDLC and STLC of the application.
- Carried the test life cycle and bug life cycle through to a potentially shippable product increment within the estimated timeframe.
- Created and executed Test Cases and analyzed results including Black box and Regression testing.
- Maintained the repository for regression test cases and updated the same based on change requests initiated by the users.
- Prepared and manipulated test data files, uploaded the test data through FTP, and executed the test cases.
- Actively participated in build deployment on different platforms (operating systems, multiple browsers).
Environment: CloudTest Lite, Cucumber, Capybara, Selenium WebDriver, Jenkins, JIRA, Zephyr, PivotalTracker, PaperTrail, GitHub, OS X, Win2008, MySQL server, VSphere VMs, Dynamo DB
Confidential - Fairfax, VA
Performance Test Engineer/Sr. Consultant
Responsibilities:
- Led the team in transitioning to SOASTA cloud-based performance tool CloudTest Lite in addition to LoadRunner/Performance Center. Mentored and trained new members as necessary.
- Conducted geographically distributed Cloud-based performance testing once a year for up to 20K users on Production systems while coordinating with critical support teams and monitoring critical resources. Included limited Mobile Testing on iPad and Android based tablets.
- Developed Performance Load/Stress Test Plans in LoadRunner using WEB/HTTP and TruClient protocols and Performance Business Scenario Plans for QA testing.
- Developed Performance Load/Stress Test scripts in JMeter, executed the tests and analyzed the results for internal QA in the Healthcare domain.
- Tested Portal Authentication based on OAM 11g and Web 2.0 applications built with technologies such as Ajax, Adobe Flash, Adobe Flex, and interactive Web application frameworks such as Microsoft Silverlight.
- Analyzed Performance Test results to assist with identifying bottlenecks and recommended appropriate performance tuning measures such as adjusting heap size, thread pool size, garbage collection schemes, and cookie persistence.
- Monitored metrics from New Relic and HP Diagnostics tools, including CPU and Memory utilization, transaction response times and SQL query processing times.
- Leveraged Lockheed Martin’s Enterprise Testing Center (ETC), a dedicated Performance Testing platform consisting of a Virtual Data Center, Performance Center (LoadRunner), ALM, HP Diagnostics, and CloudTest.
- Developed, debugged, troubleshot, and executed automated performance test scripts and project files using enterprise Performance Stress Testing tools for testing interactive and batch processes.
- Performed both internal and cloud-based Load/Stress tests on QA and Production servers up to 20,000+ users in an agile environment.
- Contributed to developing qualitative and effective Test Scenarios, Test Cases and Test Plan.
- Actively involved in the entire SDLC and STLC of the application.
- Carried the test life cycle and bug life cycle through to a potentially shippable product increment within the estimated timeframe.
- Used SQL Profiler to identify database-related performance issues.
- Created and executed Test Cases and analyzed results including Black box and Regression testing with the help of JIRA and QTP.
- Used Table Checkpoints to verify information in tables and Image Checkpoints to verify property values of images using Quick Test Professional.
- Automated test cases through QTP for regression testing and Build Acceptance testing.
- Created descriptive programming and custom function library for regression testing through VB Script in QTP.
- Used QTP to perform data driven testing and to parameterize data.
- Maintained the repository for regression test cases and updated the same based on change requests initiated by the users.
- Used Rendezvous Points in LoadRunner to generate controlled peak load on the server, stressing it and measuring its performance.
- Created Virtual User Scripts, defined User Behavior, ran Load Test Scenario, monitored the Performance, and analyzed Results using CloudTest Lite and LoadRunner.
- Prepared and manipulated test data files, uploaded the test data through FTP, and executed the test cases.
- Actively participated in build deployment on different operating systems.
Environment: CloudTest Lite, Performance Center, ALM 11.52, LoadRunner 11.52, QTP 11, JIRA, Win2008, SQL server, ASP.net, IBM Websphere Java (JVMs), F5 Networks, HAProxy
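The rendezvous-point technique used in this role holds virtual users at a marker until all have arrived, then releases them simultaneously to create a synchronized peak. Outside LoadRunner the same idea maps onto a barrier primitive; a minimal Python sketch (user arrival delays are simulated, not a real protocol):

```python
import threading
import time

ARRIVALS = []          # order in which users reached the rendezvous point
RELEASE_TIMES = []     # when each user was released past it
LOCK = threading.Lock()

def virtual_user(user_id, barrier):
    """Simulate a vuser: do some setup, then wait at the rendezvous point."""
    time.sleep(0.01 * user_id)          # users arrive at different times
    with LOCK:
        ARRIVALS.append(user_id)
    barrier.wait()                      # rendezvous: block until all users arrive
    with LOCK:
        RELEASE_TIMES.append(time.perf_counter())

def run_rendezvous(users=5):
    """Release all virtual users at once and report the release-time spread."""
    barrier = threading.Barrier(users)
    threads = [threading.Thread(target=virtual_user, args=(i, barrier))
               for i in range(users)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    # A small spread means the users hit the server in a tight burst - the peak
    return max(RELEASE_TIMES) - min(RELEASE_TIMES)
```

Because the barrier releases everyone in one burst, the server sees the worst-case concurrency even though the users arrived at different times, which is exactly what the rendezvous point is for.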
Confidential, Washington, DC
Test Automation Engineer
Responsibilities:
- Reviewed requirement documentation (Business, Use case, Functional); analyzed the user requirement specifications and designed the test cases.
- Developed qualitative and effective Test Scenarios, Test Cases and Test Plan.
- Actively involved in the entire SDLC and STLC of the application.
- Attended the walkthroughs and status meetings.
- Carried the test life cycle and bug life cycle through to a potentially shippable product increment within the estimated timeframe.
- Used Agile QA methodology for the entire product life cycle.
- Updated the Risk Matrix after conclusion of the severity and priority of the defects.
- Performed Build Acceptance testing once a build arrived in the test environment.
- Reviewed the Report module with positive and negative test cases and performed negative testing.
- Performed Compatibility testing with Virtual Machines, as a few modules were desktop applications.
- Ensured that all necessary documentation is appropriately organized for each build.
- Manually executed test cases for each testing cycle and conducted functional, integration, system and regression testing.
- Actively involved in Bug Life Cycle and reported bug through Quality Center.
- Created and executed Test Cases and analyzed results, including Black box and Regression testing, with the help of HP Quality Center and Quick Test Pro.
- Performed data validation and testing extensively by using SQL queries including multi-table joins, and sub-queries.
- Used Table Checkpoints to verify information in tables and Image Checkpoints to verify property values of images using Quick Test Professional.
- Automated test cases through QTP for regression testing and Build Acceptance testing.
- Created descriptive programming scripts for regression testing through VBScript in QTP.
- Used QTP to perform data driven testing and to parameterize data.
- Maintained the repository for regression test cases and updated the same based on change requests initiated by the users.
- Used Rendezvous Points in LoadRunner to generate controlled peak load on the server, stressing it and measuring its performance.
- Created Virtual User Scripts, defined User Behavior, ran Load Test Scenario, monitored the Performance, and analyzed Results using LoadRunner.
- Prepared and manipulated test data files, uploaded the test data through FTP, and executed the test cases.
- Performed User Acceptance Testing (UAT) on GUI screens to ensure that developers met user expectations.
- Actively participated in build deployment on different operating systems.
Environment: Quick Test Professional, Load Runner, Quality Center, WinNT, Oracle OAM 10g, SQL Server, Oracle
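The data-driven testing and parameterization used in this role reduce to one pattern: read rows from an external data sheet, drive the action under test with each row, and compare against the expected column. A tool-agnostic Python sketch of that pattern; the discount calculator and its sample data are invented purely for illustration:

```python
import csv
import io

# Hypothetical system under test: a discount calculator
def apply_discount(price, percent):
    """Return the price after applying a percentage discount."""
    return round(price * (1 - percent / 100), 2)

# Test data table, analogous to the external Excel/CSV data sheet
# a data-driven QTP script would iterate over
DATA_SHEET = """price,percent,expected
100,10,90.0
250,20,200.0
99.99,0,99.99
"""

def run_data_driven_test():
    """Run one test iteration per data row, collecting a pass/fail per row."""
    results = []
    for row in csv.DictReader(io.StringIO(DATA_SHEET)):
        actual = apply_discount(float(row["price"]), float(row["percent"]))
        results.append(actual == float(row["expected"]))
    return results
```

Keeping the data outside the script is the point: testers add coverage by adding rows, with no change to the automation code.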
Confidential, Chevy Chase, MD
Quality Assurance Analyst
Responsibilities:
- Reviewed Business and Functional requirement documents; designed and developed test scenarios for Manual Testing of all the modules and performed the tests.
- Designed, developed Test Cases, Test Scripts and scenarios for UI, Functionality, Regression, Integration, and System testing.
- Responsible for creating traceability matrix to ensure all requirements are covered by test cases.
- Extensively used test management tools like Quality Center to carry out test case documentation, execution, and defect logging.
- Developed and exported test cases to Quality Center and mapped all test cases to the business requirements/rules.
- Responsible for defect management, including defect logging, defect tracking, defect triaging, and defect closure.
- Wrote SQL queries to perform Backend Testing.
- Documented test results and SQL outputs and provided daily execution status to team lead.
- Created SQL queries to retrieve data from database to validate the input data.
- Created and executed automated test Scripts using QTP for Functional, Integration, and Regression testing.
- Scripted test suites in QTP for functionality testing of web pages.
- Involved in performing End-to-End testing and Integration Testing of the application.
- Performed hands-on Performance testing using LoadRunner.
- Used LoadRunner Controller to perform Load Test, Longevity test and Stress Test.
- Used Team Foundation Server (TFS) to write test cases, map requirements to test cases, and log defects.
- Conducted Browser compatibility testing using different browsers.
- Participated in the Bug review meetings and Bug Triage meetings.
- Assisted the team lead with daily QA tasks. Attended project meetings, release meetings, requirements review meeting and QA status meetings.
- Assisted Users and Business Analyst in completing User Acceptance Test.
Environment: SharePoint, .Net, SQL Server Management Studio, Quality Center, QTP, LoadRunner, Team Foundation Server (TFS), PVCS, Windows XP, MS Office
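Back-end testing with SQL joins and subqueries, as in this role, typically means checking that derived values stored by the application agree with their source rows. An in-memory SQLite sketch of that kind of validation query; all table names and data here are invented for illustration:

```python
import sqlite3

def validate_order_totals():
    """Join orders to customers and flag any stored total that disagrees
    with the sum of its line items."""
    con = sqlite3.connect(":memory:")
    cur = con.cursor()
    # Minimal invented schema and data standing in for the application DB
    cur.executescript("""
        CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
        CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL);
        CREATE TABLE order_items (order_id INTEGER, price REAL, qty INTEGER);
        INSERT INTO customers VALUES (1, 'Acme'), (2, 'Globex');
        INSERT INTO orders VALUES (10, 1, 30.0), (11, 2, 5.0);
        INSERT INTO order_items VALUES (10, 10.0, 3), (11, 2.5, 2);
    """)
    # Multi-table join plus a correlated subquery: any row returned is a defect
    cur.execute("""
        SELECT c.name, o.id
        FROM orders o
        JOIN customers c ON c.id = o.customer_id
        WHERE o.total <> (SELECT SUM(price * qty)
                          FROM order_items i WHERE i.order_id = o.id)
    """)
    mismatches = cur.fetchall()
    con.close()
    return mismatches
```

A query written so that "no rows" means "pass" makes the back-end check easy to automate: the test simply asserts the result set is empty.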
Confidential, Richmond, VA
Software Tester
Responsibilities:
- Involved in preparing the Test Plans, Test Scripts, Test strategy document for the CSPM project.
- Linked Test Cases to Application Functional Requirements ensuring traceability throughout the testing process.
- Analyzed user requirements, attended Change Request meetings to document changes and implemented procedures to test changes.
- Uploaded all test cases into TestDirector and executed them there.
- Debugged the test cases, verified the test results, and reported the defects using TestDirector.
- Tracked bugs using TestDirector and performed regression testing of the entire application once the bugs were fixed.
- Created high-level test cases for the source to target mappings in ETL.
- Executed the Test cases using PL/SQL developer and verified the results.
- Worked on SQL joins, cardinalities, loops, aliases, views and aggregate conditions.
- Performed System, Integration and Regression testing on a web-based application using QTP.
- Used HP Quick Test Pro to create new scripts and maintain existing repository.
- Generated VBScript test scripts using QTP for some of the applications; the Master Object Repository was maintained centrally and changes were made to it for every version.
- Worked extensively with developers to resolve the errors and bugs.
- Tested the source to target mappings and verified the data.
- Documented all the mappings and reports that were tested.
- Created a QA summary document at the final stage of the project.
- Generated test scripts for different users logging into the system with different user roles.
Environment: Informatica PowerCenter, TestDirector, Oracle, DB2, Erwin, QTP, LoadRunner, Windows, UNIX
Confidential, Arlington, VA
Responsibilities:
- Executed the planning, installation, and maintenance of hotel technology comprising hardware, software, POS systems, and telecommunications.
- Supported:
- Microsoft Windows XP Professional and Server 2003/2008 with Exchange, across a network of servers, desktops, and laptops.
- Nortel Options 61c/Meridian/Elite Innovations Telecommunications and Voicemail.
- Micros/HSI Restaurant and Starbucks Point-of-Sale.
- Xeta Call Accounting.
- Dometic Minibar/El Safe.
- Janus Reader Board System.
- Exinda Layer-7 Performance Tuning Appliance and Elfiq Load-Balancer.
- Security Surveillance and Saflok/Onity Key Systems.
- 100% Cisco Wireless/Wired infrastructure and Guest Portals.
- PrintMe and GBCBlue Business Center Applications.
Environment: Windows XP Professional/Exchange, UNIX