
QA System Test Engineer Resume


New Jersey, NJ

Expertise Summary:

• Over nine years of IT experience in QA testing of web-based and client/server applications in the Financial, Banking, and Telecom domains.
• Experience in creation and execution of Test Plans, Test Scripts and Test Cases using both Manual and Automated testing techniques.
• Experience in performing System testing, Integration testing, Performance testing, Functional testing, Regression testing and User Acceptance Testing.
• Experienced in analyzing Business Requirements and Specifications. Worked with Development team and Business Analysts to analyze the test scenarios and ensure that test requirements are correct and complete.
• Sound knowledge of and exposure to various relational databases such as Oracle and SQL Server.
• Experience with QA methodology and QA validations to ensure quality.
• Experience in creating test scripts for automation testing using the Mercury Interactive tools QuickTest Pro, WinRunner, and LoadRunner.
• Extensive working knowledge in UNIX/Linux operating systems.
• Good knowledge of DBMS and RDBMS concepts for large Oracle databases.
• Good knowledge of UNIX and shell scripting.
• Strong communication and interpersonal skills to maintain effective work relationships with all levels of personnel. Clear understanding of business rules and ability to work as part of a team.

TECHNICAL SKILLS

Operating Systems: Windows 98/NT/2000/XP, UNIX
Testing Tools: Mercury QuickTest Pro 8.2/9.2, LoadRunner, Rational RequisitePro (ReqPro), ClearQuest Test Manager (CQTM), DOORS
Bug Tracking Tools: Quality Center 8.2/9.2, Test Director, PVCS Tracker, ClearQuest
Languages: C, C++, SQL
Databases: Oracle, SQL Server, DB2, MS Access, Sybase, Rapid SQL
Web Technology: JavaScript, VBScript, ASP, JSP, Servlets, XML, HTML
Version Control: ClearCase

Work Experience:

Confidential, April 2011 – Present
QA System Test Engineer

Responsibilities:

  • Analyzed system design specifications and developed Test Plans, Test Scenarios, and Test Cases to cover overall quality assurance testing.
  • Executed test cases and reported bugs using Quality Center.
  • Tested the entire process, including reports and back-end data validation.
  • Participated in test plan and test case review meetings.
  • Performed back-end testing by writing SQL statements such as inner joins, outer joins, and self joins, using Rapid SQL and SQL Developer (see the sketch after this list).
  • Performed backend/SQL testing of inserts, updates, and functions.
  • Used basic UNIX commands while testing on UNIX.
  • Performed and validated positive, negative, system, and integration testing.
  • Developed SQL queries to retrieve or create test data from different Oracle test databases.
  • Used Rational Suite to develop and execute manual test cases for functional and regression testing of various modules of the application.
  • Performed different kinds of tests, such as system, integration, regression, user acceptance, smoke, ad hoc, load/performance, volume, and stress tests.
  • Performed database testing by writing queries in Oracle and Sybase.
  • Connected remotely to UNIX servers using PuTTY and transferred files via FTP across different test environments using Hummingbird and the command prompt.
  • Ran batch jobs using the Ascential DataStage Director client to validate, schedule, run, and monitor jobs.
  • Generated reports in the front end and validated them in the backend to ensure data integrity and validate business rules.
  • Reviewed error log files on the UNIX box when orders failed to load into SQL tables.
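
A minimal sketch of the kind of outer-join comparison used for the back-end validation above. The table and column names (stg_order, ord_order, order_id, status) are hypothetical placeholders rather than the actual schemas tested; the query only illustrates finding feed rows that never loaded or loaded with a mismatched value.

    -- Staged rows missing from the target table, or loaded with a different status (hypothetical names)
    SELECT s.order_id,
           s.status AS staged_status,
           o.status AS loaded_status
    FROM   stg_order s
    LEFT OUTER JOIN ord_order o
           ON o.order_id = s.order_id
    WHERE  o.order_id IS NULL
       OR  o.status <> s.status;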

Environment: Sybase, Rapid SQL, DOORS, Quality Center, Rational ClearQuest, UNIX, Rational ClearCase.

Confidential, VA June 2010 – March 2011
QA Test Engineer

Responsibilities:

  • Wrote the Quality Test Plan based on the Business Plan; the qualified Idea, Concept, Definition, and Development phase plans; and the Test Strategy, Business Requirements, Solution Design, and Interface Design documents.
  • Created detailed test cases for each test phase to ensure complete coverage; test cases incorporated both positive and negative test conditions. Executed test cases from Quality Center.
  • Maintained a Traceability Matrix in Mercury Quality Center to trace requirements to test cases and ensure complete test coverage.
  • Prepared SQL scripts for test data preparation for back-end functionality testing (a sketch follows this list).
  • Effectively coordinated between end-users, development and testing teams.
  • Performed Integration testing, System testing and Regression testing
  • Wrote SQL scripts to verify data in the DB.
  • Performed Back end testing by writing SQL statements using SQL Plus and SQL Developer.
  • Involved in System Testing, Regression Testing and Integration Testing.
  • Participated in requirements walkthroughs with users to gain a better understanding.
  • Used Rational ClearQuest for bug tracking, followed up with the development team to verify bug fixes, and updated bug status for the third-party application.
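
A minimal sketch of a test data preparation script of the sort referenced above. The table, columns, and values (qa_customer, customer_id, status) are hypothetical, shown only to illustrate seeding a known record before a back-end functional test and resetting it between regression runs.

    -- Seed a known customer record for a functional test (hypothetical schema)
    INSERT INTO qa_customer (customer_id, first_name, last_name, status)
    VALUES (900001, 'Test', 'User', 'ACTIVE');

    -- Reset the record between regression runs
    UPDATE qa_customer
    SET    status = 'ACTIVE'
    WHERE  customer_id = 900001;

    COMMIT;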

Environment: Java 1.5, J2EE, JSP 1.2, Servlets 2, JavaScript, Ajax, ExtJS, Oracle BPM 10gR3, WebLogic Server 10gR3, Aptana Studio 2.0, SQL Developer 2.1.1, IBM Rational Team Concert (RTC) 2.0, CSS, Oracle 10g, Windows XP, IBM Rational ClearCase, RequisitePro (ReqPro), ClearQuest Test Manager (CQTM)

Confidential, Ashburn, VA Oct 2008 – May 2010
QA Software Tester

Responsibilities:

  • Wrote test plans and test cases, executed test cases for SQL/backend testing, and tracked defects in Quality Center based on the Business Requirements, Functional Requirements, Business Workflows, A&D documents, and ICD documents.
  • Participated in BREQ and FREQ meetings to keep track of new requirements for the project.
  • Reviewed the A&D documents with the tech lead, database developers, and test team for a better understanding of the requirements.
  • Performed System Testing, Integration System Testing (IST), End-to-End (E2E) testing, D2D testing, Environment Shakeout testing, Implementation Shakeout testing, regression testing, UAT, and production testing per the needs of the application, and recorded and tracked issues/defects in Quality Center.
  • Participated in end-to-end testing, flowing orders from order entry to billing.
  • Validated flat files coming from downstream systems and mocked up feed files using the vi editor.
  • Performed back-end testing by writing SQL statements such as inner joins, outer joins, and self joins, using TOAD and SQL Developer.
  • Tracked and reported on testing activities, including test case execution status, the status of any defects opened during execution, and test results.
  • Tested the data extraction procedures designed to extract data into flat feed files (see the sketch after this list).
  • Connected remotely to UNIX servers using PuTTY and transferred files via FTP across different test environments using Hummingbird and the command prompt.
  • Ran, validated, scheduled, and monitored batch jobs.
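
The extract testing noted above typically comes down to reconciling what the extraction procedure selects against what lands in the flat feed file. A hedged sketch with hypothetical table and column names (billing_order, extract_date, status); the database-side count would be compared with the record count of the generated feed file (for example, via wc -l on UNIX).

    -- Rows eligible for the daily extract (hypothetical schema);
    -- compared against the number of records written to the flat feed file
    SELECT COUNT(*)
    FROM   billing_order
    WHERE  extract_date = TO_DATE('2010-01-15', 'YYYY-MM-DD')
      AND  status = 'READY';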

Environment: Windows 2000/XP, Oracle 9i/10g, Outlook, UNIX, Hummingbird, PuTTY, TOAD, SQL Developer, XML, Quality Center 9.2, IE 6.0/7.x, Java, Web Services, JCL, COBOL, DB2.

Confidential, Feb 2007 – Sep 2008
QA Analyst

Responsibilities:

  • Wrote test plans and test cases; wrote and executed test scripts.
  • Participated in System Testing, Regression Testing, and Interface Testing.
  • Performed operations on the front end and checked whether all tables involved were updated correctly per the specifications. Verified data with the TOAD interface tool by running SQL queries against the database (Oracle 8i & 9i).
  • Participated in functional, integration, regression, system, and end-to-end testing.
  • Interacted with the development team in discussing and resolving various application defects.
  • Wrote SQL statements manually to validate data in the database using SQL*Plus.
  • Used Mercury Quality Center for bug tracking, followed up with the development team to verify bug fixes, and updated bug status.
  • Performed system testing in different browser versions: IE 4.0, IE 5.0, Netscape 4.08, and Netscape 4.73.

Environment: Quality Center, Oracle, SQL, HTML, UNIX, Windows NT/XP, Mainframe, MS Office Tools, MS Visio, QTP

Confidential, MD Oct 2006 – Jan 2007
QA Test Engineer

Responsibilities:

  • Developed test plans and test cases to test screens and workflows for quality assurance; involved in both manual and automated testing.
  • Conducted end-user training sessions (120+ total users) on full use of the application.
  • Developed and executed real-world scenarios based on the client's business requirements to ensure functional/non-functional integrity and define feature boundaries.
  • Performed various types of testing, including smoke, regression, security, user acceptance, system, OS compatibility, and browser compatibility (cross-browser) testing for each build manually, and reported defects using the bug tracking tool Test Director.
  • Responsible for developing and executing WinRunner scripts for Arbitron's web-based application.
  • Implemented the performance test plan, executed performance and stress tests against different project releases using Mercury's LoadRunner tool, and sent analysis reports to the project team.
  • Worked with team members in executing various functional test cases and performing black-box testing.

Environment: JAVA, Oracle 9i, UNIX, WinRunner, Test Director, and LoadRunner.

Confidential, NY Jan 2006 – Sep 2006
QA Test Engineer

Responsibilities:

  • Involved in various phases of the project, including gathering requirements, analysis, and functionality documentation, and created ER diagrams to represent the logical data flow of the application.
  • Interacted with the client's personnel to define business processes and software usage.
  • Created change requests for developers by defining the required features and functionality, and worked closely with technical staff to validate business requirements.
  • Assisted in writing test plans and testing the functionality of the web-based application.
  • Extensively used Microsoft Excel to create test scripts using Excel functions, and logged and tracked defects using an issue log.
  • Responsible for designing test cases and test scripts, executing test scripts for each release, and tracking and reporting defects using the bug tracking tool Issue View.
  • Wrote test plans and test cases for the application and business processes, and performed usability and integration testing.
  • Attended several walkthrough meetings with the Business Analysts, Project Manager, and developers, and provided feedback accordingly.

Environment: Oracle, HTML, Microsoft Tools, Windows 2000, UNIX, MS SQL Server

Confidential, Elkhart, IN Sep 2003 – Dec 2005
QA Analyst

Responsibilities:

  • Interacted with software developers to understand the application's functionality and navigational flow.
  • Took part in preparing the project plan and user acceptance testing. Gained a detailed understanding of the business functionality and took responsibility for preparing the Functionality Test Plan used by the testing team during their testing process.
  • Created test scripts using WinRunner tool and performed interface, functionality, and regression testing on new builds of the software.
  • Utilized TestDirector for tracking requirements and communicating them to the team during the test process and integrated this capability with e-mail, ensuring that all the communication about a requirement is in a single location.
  • Identified functionality and performance issues, including: deadlock conditions, database connectivity problems, and system crashes under load using LoadRunner.
  • Utilized LoadRunner Analysis tool to analyze the response times of the business transactions under load.

Environment: ASP.NET, SQL Server 2000, WinRunner, LoadRunner, Test Director, HP-UX, Oracle 10g/9i, and Windows 2000.

Confidential, McLean, VA Jul 2002 – Aug 2003
QA Analyst

Responsibilities:

  • Wrote test plans and procedures, and created and executed test scripts for the client's web-based application.
  • Subsequently created and executed automated test scripts using the WinRunner tool.
  • Used Test Director to track, analyze, and document defects.
  • Reported defects using TestDirector, verified fixes, and closed bugs during regression testing.
  • Manually conducted smoke testing, performance testing, data-driven testing, screen navigation testing, and application integration testing.
  • Coordinated effectively between the development and testing teams.
  • Wrote and executed SQL statements to retrieve data from the backend.
  • Executed test scenarios and scripts and reviewed product functionality.

Environment: Java, SQL Server, Oracle, HTML, JavaScript, JSP, UNIX, Windows NT, WinRunner, Test Director and LoadRunner.

Education:

Associate Degree in Computer Information Systems.
