Sr. Performance Engineer, Test Resume
Warrenton, VA
SUMMARY:
- Extensive experience as a Software Quality Assurance Engineer, with more than 10 years of proven expertise in database management, designing and developing test strategies, and maintaining Financial, Banking, Health Care, and Insurance applications; diversified capability in manual and automated testing of web and client/server applications in UNIX/Windows environments, plus experience in SQL Server Database Administration (DBA).
- Experienced in managing a technical team on large and complex projects, setting up SQA environments and infrastructures, defining the software quality strategy, directing SQA activities, leading and mentoring junior SQA staff.
- Proficient in software development methodologies (Waterfall, Iterative, and Agile/Scrum), the Software Development Life Cycle (SDLC), and CMMI.
- Tested applications manually and with automated testing tools, covering the integration and implementation of large-scale systems for intranet- and internet-based applications.
- Supported day-to-day operations like applying updates, fixes and analyzing production support issues.
- Extensive experience reviewing and understanding business rules and testing requirements, and writing detailed Test Plans, Test Strategies, Test Designs, Test Data Designs, Test Cases, Test Scripts, and Test Matrices.
- Strong working knowledge of designing and implementing SQA test strategy plans and manual and automated test solutions for client/server and web-based applications using the HP and IBM Rational test suites (Quick Test Pro/UFT, WinRunner, LoadRunner, Test Director/Quality Center/ALM, Rational Quality Manager (RQM), and Rational Team Concert (RTC)).
- Proven expertise in the review, design, development, and implementation of a software quality assurance infrastructure.
- Performed various types of testing including Smoke, Functional, Negative, Boundary, Integration, Regression, Database, Web Services, System, User Interface (UI), 508, Security, Black Box, End-to-End, User Acceptance, Performance, Load and Stress.
- Ability to handle multiple tasks, work under pressure and get things done.
- Exceptionally efficient and reliable, with excellent verbal and written communication skills.
TECHNICAL SKILLS:
Testing Tools: WinRunner, LoadRunner 9.0/10.0/11.0, Quick Test Pro 9.5/10.0/11.0/UFT, Quality Center 9.0/10.0/11.0/ALM, ALM PC, Rational ClearCase, ClearQuest, Test Manager, Rational Team Concert (RTC), Rational Quality Manager (RQM), JAWS 5.0/8.0, InFocus, ACC Verify, soapUI, SnagIt
Scripting Tools: TSL, VBScript, JavaScript.
Operating Systems: Windows 98/NT/2000/XP/7, UNIX/REFLECTIONS, MS-DOS, MVS/ESA, Linux, and Mac OS 9.0/10.0/11.0.
Databases: Oracle 10g/9i/8i, MySQL, SQL Server 2000/2005/2008 R2, MS Access, DB2, Sybase, TOAD for Oracle, SQL Developer.
Application Servers: BEA WebLogic, Sun Application Server.
Web Servers: IBM WebSphere, Microsoft IIS, Apache, PHP, ASP, Sun Java System Web Server, IE and Netscape.
Languages: Java, Perl, PHP, TSL, C, C++, Visual Basic, SQL, PL/SQL, HTML, DHTML, XHTML, COBOL, Business Objects.
Version Managers: Visual SourceSafe, CVS, PVCS, StarTeam, Serena ChangeMan Version Manager, SharePoint, JIRA, IBM Rational products.
Utilities: MS Word, MS Excel, WordPerfect, MS Applications, MS PowerPoint, MS Visio, MS Project, Lotus Notes, Remedy Client 6/7, Serena TeamTrack 6.6, MicroStrategy, MATLAB, LabVIEW, AutoCAD
PROFESSIONAL EXPERIENCE:
Confidential, Warrenton, VA
Sr. Performance Engineer, Test
Responsibilities:
- Supporting ITF performance testing for the Healthcare Quality Information Systems (HCQIS) Infrastructure and Data Center Support (HIDS) contract for CMS.
- Conducting load, performance, stability, stress, and breakpoint tests for the PQRS GPRO, PQRS Submission, PQRS EIDM, PQRS SEVT, PQIP, CROWNWeb, QMARS, Composite Portal, HQR, AXWAY, DDST, FIVS, and ESRD applications.
- Creating the Performance Test Plan, test design, test data, script workflow, and script dictionary based on the ITF engagement document.
- Working closely with ADO, Middleware, DBA and OPNET team to define the testing needs.
- Creating performance test cases and test scripts based on the requirements described in the ITF engagement document (a sample Vuser script sketch appears at the end of this position).
- Performing a smoke/shakeout test before running any performance test scenario.
- Running performance test scenarios using LR Controller and Performance Center (ALM PC) to validate database entries, log entries, and test data based on ITF requirements.
- Using HP LoadRunner Analysis, Riverbed OPNET analysis, HTTP Watch, and other test tools to validate performance requirements.
- Logging any defects found in performance testing using the defect management tool QC/ALM.
- Producing the LoadRunner Analysis report and interim and final test summary reports after each performance test is completed.
- Conducting maintenance-weekend smoke tests for all applications and for the Agent, Controller, and Performance Center servers.
- Attending weekly ITF staff meetings, weekly team meetings, application walkthroughs, and engagement form and script review meetings with the development team, ITF testers, and the QA manager; attending technical/troubleshooting meetings with the DBA, Middleware, and OPNET teams; mentoring junior testers and helping other ITF team members as needed.
Environment/Tools: HP LoadRunner (LR), QC/ALM, HP Performance Center (ALM PC), Windows XP/7, Windows Server 2003 R2 & 2008 R2, CA LISA, PL/SQL Developer, HTTP Watch, Riverbed Performance Management, UNIX, SharePoint, Exceed, SSH Tectia, Oracle WebLogic.
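For illustration, a minimal LoadRunner Vuser script sketch in C of the kind of transaction-based scripting described for this engagement; the transaction name and URL below are hypothetical placeholders, not actual ITF endpoints:

    Action()
    {
        /* Mark the start of the transaction so its response time is measured */
        lr_start_transaction("PQRS_Login");

        /* Hypothetical URL; real ITF application endpoints are not shown here */
        web_url("login_page",
                "URL=https://example.test.gov/login",
                "Resource=0",
                "RecContentType=text/html",
                "Mode=HTML",
                LAST);

        lr_end_transaction("PQRS_Login", LR_AUTO);

        /* Think time between iterations to model realistic user pacing */
        lr_think_time(5);

        return 0;
    }

In an actual run, scripts like this are parameterized and executed from the Controller/Performance Center scenario against the workload defined in the ITF engagement document.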
Confidential, Chantilly, VA
Functional Analyst, Lead
Responsibilities:
- Used the Agile Process for all aspects of project work.
- Worked closely with developers, test team members and product managers to define the testing needs.
- Met with business analysts and technical leads to understand product specifics to create corresponding test strategies and scenarios.
- Created the Unit Test Plan (UTP) based on business and technical requirements document.
- Executed various test types within the Agile Process: Smoke, Functional, Boundary, Integration, Regression, Database, System, End to End, Web services, Connectivity, Security, Automated.
- Ran automated load and performance tests using SoapUI and LoadRunner to validate requirements and presented reports to management (see the web-service request sketch at the end of this position).
- Assisted with the design of test scenarios to ensure that tests can be executed against the requirements based on technical stories and user stories.
- Executed test scenarios to validate database entries, log entries, and test data based on requirements.
- Attended technical Peer Review Meetings to identify what is included within a release using user stories presented by the business analysts.
- Attended Release Scope Finalization Meetings to identify Acceptance Criteria Matrix.
- Used SoapUI, Advanced Rest Client, HermesJMS, and various test tools to validate system requirements.
- Logged and distributed defects found in testing using a defect management tool (JIRA/RTC).
- Created the Test Evaluation Summary (TES) after functional testing was complete.
- Supported SQA testers (IV&V) as needed.
- Contributed to an atmosphere of cross-functional teamwork within the project lifecycle.
Environment/Tools: Windows XP/7, Unix, Advanced Rest Client, HTML Tool, SoapUI, LoadRunner, PuTTY, Xming, Notepad++, WinSCP, HermesJMS, SharePoint, Rational Team Concert (RTC)/JIRA, Rational Quality Manager (RQM), Rational Requirements Composer (RRC), MongoDB, Oracle WebLogic
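As an illustration of the web-service load testing mentioned above, a brief LoadRunner C sketch posting a SOAP-style request; the endpoint, payload, and expected response text are hypothetical, not the project's actual services:

    Action()
    {
        /* Register a check: the response must contain the success element below
           (a placeholder value) or the transaction is flagged as failed */
        web_reg_find("Text=<status>OK</status>", LAST);

        web_add_header("Content-Type", "text/xml; charset=UTF-8");

        lr_start_transaction("Service_GetAccount");

        /* Hypothetical SOAP endpoint and request body */
        web_custom_request("GetAccount",
                "URL=https://example.service.local/account",
                "Method=POST",
                "RecContentType=text/xml",
                "Body=<soapenv:Envelope xmlns:soapenv=\"http://schemas.xmlsoap.org/soap/envelope/\">"
                     "<soapenv:Body><GetAccount><id>12345</id></GetAccount></soapenv:Body>"
                     "</soapenv:Envelope>",
                LAST);

        lr_end_transaction("Service_GetAccount", LR_AUTO);

        return 0;
    }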
Confidential, Washington, DC
Sr. Software Test Engineer
Responsibilities:
- Developed Test Plans and Test Approach artifacts with resource requirements and time estimates.
- Analyzed business requirements, converted them into testable scenarios, and consulted with development staff regarding system functionality.
- Developed Test Cases/Test Scripts based on user stories and maintained traceability between requirements and test cases.
- Involved in the preparation and execution of test cases based on user stories for the CRM UD application.
- Performed full Smoke, Functional, Integration, Regression, System, Web Services, and Security testing in the Test, Pre-Prod, and Prod environments.
- Used SoapUI for web services testing of the CRM UD application.
- Used TOAD/SQL Developer to ensure that tables in the database are updated correctly.
- Frequently used HP Quick Test Pro (QTP) for Regression testing.
- Tracked, analyzed, and documented defects using Rational Team Concert (RTC).
- Responsible for providing support for acceptance testing in the following ways: Creating test data, understanding and selecting specific test scripts to be used in the acceptance testing, executing tests, interpreting test results, logging test issues, and communicating test issues and/or defects to the development team.
- Performed load, performance, and stress testing using HP LoadRunner for CRM UD applications.
- Responsible for supporting the development team in discussing possible defects.
- Responsible for verifying that all defects have been corrected in subsequent test executions.
- Involved in the project from the requirements analysis phase until the completion of UAT.
- Trained new testers about the environment, application and testing process.
- Effectively interpreted and communicated test results to the responsible engineer(s).
- The projects followed the Agile/Scrum software development methodology, where developers, users, and testers worked together to create stories and document requirements through interviews and analysis.
- Attended Daily Scrum meeting, Story planning meeting, Sprint review meeting, Iteration Planning Meeting, Retrospective meeting and weekly Project Status Meeting with Development Team, Testers and QA Manager.
Environment: Windows NT, Windows XP/7, Rational Team Concert (RTC), Rational Quality Manager (RQM), Rational Requirements Composer (RRC), Quick Test Pro (QTP), LoadRunner, SoapUI, AIDE, SharePoint, TOAD, SQL Developer, Oracle 9i, J2EE, JavaScript, C++, UNIX.
Confidential, Suitland, MD
Sr. Software Tester
Responsibilities:
- Investigated design documents, business requirements, systems requirement specifications and applied information to the testing process.
- Responsible for test plan and test scenario development and maintenance.
- Identified and established test requirements and documented them using Quality Center for requirements management.
- Developed test design, test cases and mapped them with requirements using Quality Center.
- Involved in preparation of Test data design, and Test data generation for the project on the basis of business requirements.
- Conducted manual testing, in a timely and accurate manner, for web-based/client-server applications that handled high-volume transactions daily.
- Performed back-end testing to verify that all changes made to data tables were updated in the database, using SQL Developer and TOAD.
- Managed the testing process, logged and tracked defects using Quality Center.
- Applied Software Quality Assurance methodologies and best practices by capturing data and ensuring that test results were filed in Quality Center.
- Created automated test scripts using QTP for regression testing after every new build, bug fix, and modification was completed.
- Used HP Load Runner for Performance, Load, Stress Testing.
- Identified and implemented improvements in existing development lifecycle and processes.
- Worked on multiple applications which involve various front ends, middle tier and database.
- Coordinated with developers to resolve issues and fix defects and close them.
- Performed Smoke, GUI, Functional, Positive, Negative, Black Box, End-to-end, System, Integration, 508 compliance, Boundary, Database, UAT and Security Testing.
- Performed User Acceptance Testing for a production release.
- Worked as valuable member of the IV&V team.
- Managed all QC, QTP and Load Runner licenses.
- Created test status report and final test summary report for the client for all applications.
- Attended reviews and walkthroughs to better understand the requirement documents.
- Attended weekly Project Status, Defect tracking, peer review and Technical Test meetings with Development Team, QA Manager, and worked closely with them to define Test Scope, Gap Analysis, Risk, Dependency and Constraints.
Environment: Java, JSP, TIBCO, SOA, JavaScript, VBScript, PeopleSoft, HTML, XML, Windows, MS Access, J2EE, UNIX, SQL Server, SQL, Oracle SQL Developer, TOAD, PVCS, TeamTrack, Quality Center, Quick Test Pro, LoadRunner, VAX, VMS, Lotus Notes, SnagIt, Remedy, Serena ChangeMan Version Manager, TextPad, JAWS, Primavera Web, DOORS, Peer Review Database, Change Request Database, Metrics Data Storage Repository Database
Confidential, Rockville, MD
Web Application Tester
Responsibilities:
- Analyzed and interpreted design specifications, business requirements and end-user feedback and applied information to the testing process.
- Converted business requirements and specifications into test plans, test cases, and test scripts.
- Maintained traceability between test documentation and requirements.
- Executed test cases, reported progress and issues in a timely and accurate manner.
- Used bug tracking systems (Quality Center) for tracking defects and change requests.
- Used HP Quick Test Pro (QTP) to create new scripts and maintain the existing repository.
- Performed User Interface, Functional, Integration, System, End-to-end and Regression testing.
- Documented test execution results and created defect tracking records for problem resolution.
- Collaborated with the Support and Development team to research and test production issues.
- Identified opportunities for continuous improvement.
- Tested applications developed on the Microsoft .NET and/or J2EE technology platforms.
Environment: Java, JSP, HTML, XML, SQL, Windows, Unix, C, C++, Visual Basic, Quick Test Pro, Application server side based on J2EE framework, UML, .Net Application, Quality Center/Test Director.
Confidential, New York, NY
SQA Analyst
Responsibilities:
- Involved in the complete testing life cycle, including writing test plans and test cases, collecting test data for various projects, and documenting test results.
- Created test plans for integration system testing. Assisted the Software Quality Assurance team in creating the test environment.
- Prepared test plans and test scenarios from Functional Requirements Documents (FRDs), supporting the testing procedures through sign-offs with business analysts and project managers.
- Extensively used SQL Analyzer to perform database testing.
- Performed regression testing on web-based applications.
- Generated VB test scripts using QTP for some of the applications.
- Maintained the master object repository in the central repository and updated it for every version.
- Responsible for developing test scripts using Load Runner.
- Performed volume and stress testing using Load Runner.
- Used the Controller to perform load, longevity, and stress tests (see the stress-scenario script sketch at the end of this position).
- Performed average CPU usage, response time, and Transactions per Second (TPS) analysis for each scenario.
- Performed smoke and mini regression testing in production.
- Participated in test environment setup for application testing.
- Performed GUI, integration, regression, system and UAT testing.
- Created test logs and documented the test summary using bug tracking systems (Quality Center/Test Director) and sent them to developers.
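A short LoadRunner C sketch of how a rendezvous point and transaction timing might be combined for the load, longevity, and stress runs described above; the names and URL are illustrative only:

    Action()
    {
        /* All virtual users wait here and fire the next request simultaneously,
           producing the spike used in stress runs */
        lr_rendezvous("search_spike");

        lr_start_transaction("Search");

        /* {pQuery} is a parameterized value supplied from test data */
        web_url("search",
                "URL=https://example.app.local/search?q={pQuery}",
                "Resource=0",
                "Mode=HTML",
                LAST);

        lr_end_transaction("Search", LR_AUTO);

        /* Think time between iterations keeps longevity (soak) runs realistic */
        lr_think_time(10);

        return 0;
    }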