QA Tester Resume
Seattle, Washington
SUMMARY
- About 3 years of experience in the technology industry in Software Quality Assurance and software testing across diverse projects, clients, and industries.
- Experienced in analysis, design, development, and software testing.
- Strong experience and knowledge of manual and automated testing, white-box and black-box testing, system testing, Agile testing, functional testing, regression testing, and load testing. Versatile in a wide range of manual and automated testing tools.
- Excellent analytical, problem-solving, and documentation skills. Team player with excellent interpersonal and communication skills.
- Diversified experience in Software Quality Assurance and Software Testing with in-depth knowledge of the Software Development Life Cycle (SDLC) and a thorough understanding of its phases, such as requirements gathering, analysis/design, development, and testing.
- Experience in preparing Test Plans and Requirement Traceability Matrices (RTM), writing test cases, and developing and maintaining test scripts for manual and automated testing environments.
- Performed manual and automated testing on Windows and UNIX platforms.
- Performed testing at different test levels.
- Worked within the Agile testing methodology.
- Prepared test data for manual script execution.
- Experience in all facets of the QA life cycle and SDLC, with timely delivery.
- Experience in black-box, positive, negative, data-driven, unit, integration, and system testing.
- Performed back-end testing manually and with automated testing tools by executing SQL queries (an illustrative query appears after this summary).
- Experienced in Regression and Functional Testing.
- Experienced in reporting bugs using bug-tracking tools such as HP Quality Center.
- Performed Smoke testing, Sanity testing, Usability testing and Security testing of various applications.
- Extensive knowledge of test matrices and traceability matrices, and experience performing gap analysis.
- Inserted checkpoints in QTP to verify the availability of data in the application.
- Expertise in Performance, Stress, and Volume Testing of client/server and web-based applications using different tools.
- Used Test Director and HP Quality Center for reporting bugs and communicating with developers, product support, and test team members.
- Conducted User Acceptance Testing (UAT) to ensure the correct business logic.
- Ability to quickly understand and interpret business requirements.
- Good documentation and process management skills, with the ability to effectively understand business requirements and develop a quality product.
- Excellent verbal and written communication skills. Exceptional ability to learn new concepts quickly.
- Possess excellent professional skills for working independently and as a team member to deliver results efficiently while ensuring a healthy environment among team members.
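For illustration, a minimal sketch of the kind of back-end validation query referenced above; the table and column names (customer_order, order_status, total_amount) are hypothetical placeholders, since the actual schemas are confidential:

    -- Verify that an order submitted through the UI was persisted
    -- with the expected status and amount.
    SELECT order_id, order_status, total_amount
    FROM customer_order
    WHERE order_id = 1001234
      AND order_status = 'SUBMITTED'
      AND total_amount = 149.99;
    -- Expected result: exactly one row; zero rows indicates a defect.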
TECHNICAL SKILLS
Programming Languages: SQL, C/C++, Java, XML, Visual Basic
Operating Systems: Windows, Linux, UNIX
Testing Tools: HP Quality Center, JIRA, Toad, QTP, LoadRunner, Rational ClearQuest, Rational Functional Tester, WinRunner, DOORS, Eclipse, JTAF, SoapUI, REST
Scripting Languages: VBScript, Unix shell scripting, Perl, TSL
Databases: Oracle, MS SQL Server, DB2, MS Access
Development Methodologies: Waterfall, V-Model, Spiral Model, Agile, Scrum
Defect Tracking Tools: Quality Center, JIRA, Product Studio, Redmine, Rational ClearQuest
Other Core Competencies: Requirement analysis, test planning, test execution, and test results analysis.
PROFESSIONAL EXPERIENCE
Confidential, Seattle, Washington
QA Tester
Responsibilities:
- Involved in writing test cases and test scripts to verify requirements and design based on defined testing standards.
- Responsible for defect identification, documentation, communication, and tracking in accordance with related procedures in the available tools.
- Performed Scalability, Performance, and Load testing using LoadRunner: creating virtual user scripts, defining user behavior, running load test scenarios, monitoring performance, and analyzing results.
- Coordinated and Collaborated with QA team, Architects, Project Managers, Business Analysts, Leads, and Developers to ensure that the Quality Assurance deliverables and approach are comprehensive and meet required objectives.
- Ensured test traceability from high-level requirements to lower-level requirements to test cases.
- Identified and executed positive and negative test cases and test scenarios and the data sets required to support them.
- Analyzed requirement documents and used them to develop a suitable test plan.
- Developed automation test scripts for performing regression testing on the application using QTP.
- Tested the interface between the database and the application.
- Verified the application's functionality on different configurations with QTP.
- Handled dynamic objects using regular expressions in QTP.
- Maintained various versions of test scripts.
- Performed Sanity and Smoke testing.
- Worked within the Agile testing methodology.
- Performed back-end testing using Database Checkpoints in QTP (a sample checkpoint query appears below this role).
- Used LoadRunner for Performance, Stress, and Load Testing.
Environment: QTP, UFT, LoadRunner, MS SQL Server, Windows 2000
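A sketch of the style of query attached to a QTP Database Checkpoint in this role (QTP compares the returned result set against stored expected values); the user_account table and its columns are assumed names, not the actual confidential schema:

    -- Confirm that the count of active accounts in the database matches
    -- the figure displayed in the application under test.
    SELECT COUNT(*) AS active_users
    FROM user_account
    WHERE account_status = 'ACTIVE';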
Confidential, Kent, WA
QA Tester
Responsibilities:
- Worked with SMEs to understand business requirements and prioritized Test cases based on customer needs and Business Requirements.
- Prepared Test Plans using Enterprise Quality Center.
- Analyzed change requests and created/updated test cases for maximum test coverage.
- Coordinated the testing effort, test deliverables, status reporting, and defect escalation to management and the development team.
- Produced Requirements Traceability Matrices mapping requirements to test cases using Quality Center.
- Logged and tracked defects in QC and JIRA.
- Participated in the technical review of test automation project deliverables.
- Developed and executed test cases with different test sets for different objectives based on the business/functional requirements.
- Facilitated test case reviews with the development and business teams.
- Responsible for creating and maintaining the regression suites used to regression-test new releases.
- Performed various types of testing, such as Functional, Negative, Regression, and User Acceptance Testing, within the Agile testing methodology.
- Executed functional tests on a web-based UI application with various test data.
- Performed User Acceptance Testing for different payment systems.
- Performed integration testing during different testing phases.
- Queried the Oracle database using SQL to validate business processes and conducted thorough back-end testing.
- Used Quality Center to identify, log, report, and prioritize defects.
- Reviewed extensive SQL queries with complex multi-table joins and nested queries (a representative example follows this role's responsibilities).
- Documented defects in Enterprise Quality Center and helped developers identify the modules and code sections to be fixed.
- Produced Requirements Traceability Matrices for test case traceability.
- Worked with development, QA, and automation engineers to research, design, and develop complex test automation units.
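A representative example of the joined and nested SQL reviewed and executed for back-end validation in this role; the orders, customers, and payments tables are illustrative stand-ins for the confidential schema:

    -- Flag orders marked PAID in the application that have no settled
    -- payment record; every returned row is a potential defect.
    SELECT o.order_id, o.order_total, c.customer_name
    FROM orders o
    JOIN customers c ON c.customer_id = o.customer_id
    WHERE o.order_status = 'PAID'
      AND o.order_id NOT IN (SELECT p.order_id
                             FROM payments p
                             WHERE p.payment_status = 'SETTLED');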
Confidential, Seattle, WA
QA Analyst
Responsibilities:
- Wrote SQL queries in SQL Server and Oracle databases to perform back-end testing.
- Drafted test strategies, test plans, and test cases based on functional specifications.
- Executed test cases manually and verified actual results against expected results.
- Performed manual testing in the initial stages, followed by automated testing.
- Performed back-end testing using complex SQL queries, run manually and through Unix shell scripts.
- Validated some of the reports generated during the process using complex SQL queries (an illustrative reconciliation query appears at the end of this role).
- Performed negative and positive testing manually, along with functionality testing.
- Interacted with developers and the Business on a regular basis for enhancing the requirements.
- Recorded requirements artifacts, types, attributes, and traceability criteria in the Requirements Management Plan document and categorized requirements using the FURPS model (Functional, Usability, Reliability, Performance, Supportability).
- Was involved in User Acceptance Testing and interfaced with the automation team to ensure the automated scripts complied with system test requirements.
- Performed data integration testing on the application.
- Involved in writing scripts for the load tests as per the requirements.
- Documented and reported defects within the established process and tracking systems using Mercury Quality Center.
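An illustrative reconciliation query of the type used for report validation in this role; the sales_line table and region_code column are assumed names, since the actual reports and schemas are confidential:

    -- Recompute per-region totals from the source table; each figure
    -- should match the corresponding total on the generated report.
    SELECT region_code, SUM(line_amount) AS source_total
    FROM sales_line
    GROUP BY region_code
    ORDER BY region_code;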