- 5+ years of diverse professional experience in quality assurance and testing across a wide variety of environments.
- Performed various testing types, including functionality, compatibility, reliability, usability, installation, stress/load, backend, data-driven, security, and regression testing, along with related methodologies.
- Tested web application, client-server, and database projects on Windows and UNIX operating systems.
- Worked on all phases of the SDLC, with strong knowledge of Agile, waterfall, and iterative models.
- Strong in analyzing business requirements, business system specifications, business system requirement specifications, use cases, and storyboard requirements.
- Knowledge of QA principles, ISO standards, and the SEI SW-CMM and SE-CMM models.
- Familiar with object-oriented programming, UNIX scripting, and languages such as C++, Java, and SQL.
- Quick learner, equally strong in logical and analytical thinking, with in-depth knowledge of timeline analysis and of reviewing test cases across projects in different domains.
- Generated QA metrics reports for each project, including daily, weekly, and monthly status reports.
- Effective and proficient in team meetings, BA meetings, and addressing developer concerns. Proven ability to work independently and as a team member to accomplish goals and objectives within timelines.
- Followed a risk-based testing methodology under time constraints.
Testing Tools: HP ALM, QuickTest Professional (QTP), Silk Test, Rational ClearCase, Requisite Pro, Quality Center, ClearQuest, Bugzilla, Visual Intercept, Microsoft Office
Operating Systems: Windows, MS DOS, HP-UX, Sun Solaris, Linux
Database Systems: Oracle, Sybase ASE, MS SQL Server, MS Access, DB2
Web Servers: Apache, Tomcat, Windows IIS, BEA WebLogic, IBM WebSphere
Version Control: MS Visual SourceSafe, Rational ClearCase
ETL Tools: Informatica, DTS, FTP, Sunopsis
Software: MS Office Suite, MS Project, MS Visio, MS Works, Financial Software Products
- Baselined all requirements and gathered related documents from BAs and developers.
- Analyzed business requirements and identified the scenarios.
- Identified the functionalities of the AUT (Application Under Test) to be automated.
- Analyzed positive and negative flows using the application's use cases.
- Involved in preparing the automation test plan document.
- Created and maintained the requirements traceability matrix in Quality Center.
- Developed VBScript code in QTP for all modules that needed to be automated.
- Enhanced scripts by inserting verification points, regular expressions, and synchronization points.
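The verification and synchronization techniques above can be sketched language-agnostically; the following is an illustrative Python analogue (not the project's QTP VBScript), where the invoice-number format and all names are hypothetical:

```python
import re
import time

def wait_for(condition, timeout=10.0, interval=0.5):
    """Synchronization point: poll until condition() is truthy or the timeout expires."""
    deadline = time.time() + timeout
    while time.time() < deadline:
        result = condition()
        if result:
            return result
        time.sleep(interval)
    raise TimeoutError("condition not met within %.1fs" % timeout)

def verify_matches(pattern, actual):
    """Verification point: assert the actual value matches a regular expression."""
    if re.fullmatch(pattern, actual) is None:
        raise AssertionError(f"{actual!r} does not match /{pattern}/")
    return True

# Hypothetical invoice-number format: INV-<year>-<6-digit sequence>
verify_matches(r"INV-\d{4}-\d{6}", "INV-2024-000123")
wait_for(lambda: "ready", timeout=1.0)  # returns as soon as the condition is truthy
```

In QTP the equivalents are built-in checkpoints and `Sync`/`WaitProperty` calls; the sketch only shows the underlying pattern.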
- Analyzed backend DB2 tables and mapped them to the invoice structure.
- Modified DB2 table content as per the analysis.
- Performed backend testing in DB2 to verify validations from the application.
- Worked with a keyword-driven test automation framework in QTP.
- Tested web services using the SOAPscope testing tool.
- Parameterized data for data-driven testing in order to implement retesting with multiple sets of data.
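A minimal sketch of the data-driven pattern described above (Python for illustration; the field names and login rule are hypothetical): the same test logic is replayed over multiple parameterized data sets.

```python
# Data-driven testing: one test procedure, many data sets.
# Field names and the login rule below are hypothetical.
test_data = [
    {"username": "alice", "password": "s3cret", "expect_login": True},
    {"username": "alice", "password": "wrong",  "expect_login": False},
    {"username": "",      "password": "s3cret", "expect_login": False},
]

def attempt_login(username, password):
    """Stand-in for the application under test."""
    return username == "alice" and password == "s3cret"

def run_data_driven(rows):
    """Run the same check against every data set; True means that row passed."""
    results = []
    for row in rows:
        actual = attempt_login(row["username"], row["password"])
        results.append(actual == row["expect_login"])
    return results

print(run_data_driven(test_data))  # [True, True, True]
```

In QTP this corresponds to binding the script to a Data Table and iterating over its rows.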
- Conducted regression testing on every new build after bug fixes using QTP; logged defects in Quality Center and followed up until they were fixed.
- Performed backend testing using SQL queries to compare data against the application.
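A minimal illustration of this kind of backend check, assuming a hypothetical invoice table and values captured from the application's UI (sqlite3 stands in for the project's DB2/SQL Server databases):

```python
import sqlite3

# Hypothetical schema: compare what the application shows against what the DB stores.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE invoices (id INTEGER PRIMARY KEY, amount REAL)")
conn.executemany("INSERT INTO invoices (id, amount) VALUES (?, ?)",
                 [(1, 120.50), (2, 75.00)])

ui_values = {1: 120.50, 2: 75.00}  # values captured from the application under test

def backend_matches(conn, ui_values):
    """True when the database rows agree with the values shown by the application."""
    rows = conn.execute("SELECT id, amount FROM invoices").fetchall()
    db_values = {row_id: amount for row_id, amount in rows}
    return db_values == ui_values

print(backend_matches(conn, ui_values))  # True when DB and UI agree
```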
- Exported manual test cases from Excel to HP ALM.
- Executed automated test cases from Quality Center.
- Prepared a requirements traceability matrix to verify that all business requirements were mapped to test cases.
- Used Object Spy to capture application objects and stored them in the Object Repository.
- Updated testing status to the business team, developers, and QA peers as necessary.
- Worked closely with the defect management coordinator and performed defect analysis for each sign-off of the testing life cycle.
Environment: DB2, VBScript, SQL, Scrum, Quality Center, HP ALM, SQL Server, SOAP
- Analyzed hardware and software test requirements and the test data required to set up the E2E environment.
- Involved in vendor management and stakeholder meetings to work on business requirement documents and technical specifications.
- Interacted with the product team to fill gaps across different phases of the project.
- Created detailed cost breakdowns, projections, and estimates for the E2E environment with the help of the team lead.
- Created and maintained detailed project implementation plans, timelines, and deliverables lists.
- Prepared a requirements traceability matrix to track requirements-to-test-case coverage.
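A traceability matrix of this kind reduces to a mapping from requirement IDs to the test cases that cover them; a minimal sketch (illustrative Python; all IDs are hypothetical, and in practice the matrix lived in Quality Center/HP ALM):

```python
# Requirements traceability: requirement ID -> covering test cases (hypothetical IDs).
rtm = {
    "REQ-001": ["TC-01", "TC-02"],
    "REQ-002": ["TC-03"],
    "REQ-003": [],  # not yet covered
}

def uncovered_requirements(rtm):
    """Requirement IDs that no test case traces to."""
    return sorted(req for req, cases in rtm.items() if not cases)

def coverage_pct(rtm):
    """Percentage of requirements with at least one covering test case."""
    covered = sum(1 for cases in rtm.values() if cases)
    return round(100.0 * covered / len(rtm), 1)

print(uncovered_requirements(rtm))  # ['REQ-003']
print(coverage_pct(rtm))            # 66.7
```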
- Conducted meetings with application teams, BSAs, and DBAs to set up test data, and worked as coordinator for the system integration.
- Drafted and executed test cases aligned with test scenarios using a risk-based testing approach.
- Involved in data and transaction mapping for the order flow process.
- Documented the order process flows for test execution.
- Created test data to test connectivity, functionality, and proper flow of information across applications.
- Involved in offshore resource meetings and knowledge transfer sessions.
- Designed and executed test cases and scripts to support Smoke and Sanity testing
- Executed and maintained the regression test suite as each build was migrated to production.
- Generated test reports and presented the QA status report along with defect status reports.
- Executed and maintained automated test scripts to support regression testing against builds in UAT.
- Worked on performance, integration, system, black-box, data validation, and white-box testing.
- Involved in sessions with BAs, SMEs, and development teams to accomplish the in-scope deliverables.
- Conducted daily peer review meetings to discuss the outcomes of executed test scripts, decide the further road map, and maintain a progressive environment.
- Scheduled walkthroughs for the project owners and management team to provide detailed statistics on the testing effort.
- Designed various test artifacts: test plan, test metrics, RTMs, and test cases.
- Performed Smoke Test, Functional, Regression, Performance, data validations and compatibility tests for the AUT.
- Facilitated requirement gathering and JAD sessions with BAs, SMEs, and the development and testing teams.
- Created RTMs mapping the test cases to the provided use cases to track requirements coverage.
- Created and maintained FRMs for every functionality by analyzing the risks involved.
- Executed test cases, analyzed test results, and maintained test metrics.
- Performed Functional and Regression testing of the online community application using QTP.
- Provided statistics to analyze latency (transaction response time), throughput, pages per second served, and system resource graphs for key counters such as CPU utilization, memory, and thread usage.
- Documented the results from each test run and conducted in-depth analysis of transaction response times and the performance of each server.
- Generated detailed performance reports including graphs and tables for various performance object counters and transaction response times.
- Determined baseline performance metrics and bottlenecks for specific business workflows, to be measured against future changes, across different applications and URLs using LoadRunner.
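Baselining response-time metrics of this kind reduces to summary statistics over collected samples; a minimal sketch (illustrative Python with made-up sample data, not actual LoadRunner output):

```python
# Hypothetical transaction response times in seconds, as collected during a run.
samples = [0.21, 0.25, 0.23, 0.95, 0.22, 0.27, 0.24, 0.26, 0.23, 0.25]

def percentile(values, pct):
    """Nearest-rank percentile of a list of numbers."""
    ordered = sorted(values)
    rank = max(1, round(pct / 100.0 * len(ordered)))
    return ordered[rank - 1]

# Baseline to compare future runs against: average, 90th percentile, worst case.
baseline = {
    "avg": round(sum(samples) / len(samples), 3),
    "p90": percentile(samples, 90),
    "max": max(samples),
}
print(baseline)  # {'avg': 0.311, 'p90': 0.27, 'max': 0.95}
```

The single 0.95 s outlier shows why the 90th percentile is tracked alongside the average.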
- Wrote SQL queries to validate and update the data stored in the databases.
- Involved in cross-browser testing across different versions of IE and Mozilla Firefox.
- Maintained and created HTML reports and posted results in ALM.
- Used HP ALM to manage and coordinate test planning, execution, the PR process, and defect management.
- Logged bugs and followed up with the development team until issues were resolved and retested.
- Created and maintained the daily Test Summary Report.
- Collaborated with different members of the team to document and validate functional requirements
- Was involved in sprint grooming and requirements analysis sessions
- Participated in reviewing all High Level and Low Level Documents
- Member of the core team for project related decisions
- Involved in system testing of all features, both manual and automated using QTP.
- Performed stress testing and load testing of features using LoadRunner.
- Reviewed and validated the test plans, test strategy docs, configuration matrix, and test cases
- Prepared various metrics and reports on defect analysis
- Involved in defect-triage meetings to discuss bugs.
- Was part of knowledge transfer and training sessions with team members, particularly familiarizing new hires with test-related processes.
- Involved in designing SQL queries to generate defect reports by vendor, severity, component, and submitter.
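Defect reports grouped by a chosen attribute reduce to simple GROUP BY queries; a minimal sketch against a hypothetical defect table (sqlite3 used for illustration; all table, column, and sample values are made up):

```python
import sqlite3

# Hypothetical defect-tracking table mirroring the grouped reports described above.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE defects (
    id INTEGER PRIMARY KEY, vendor TEXT, severity TEXT,
    component TEXT, submitter TEXT)""")
conn.executemany(
    "INSERT INTO defects (vendor, severity, component, submitter) VALUES (?,?,?,?)",
    [("acme", "high", "login",  "qa1"),
     ("acme", "low",  "search", "qa2"),
     ("beta", "high", "login",  "qa1")])

def defects_by(conn, column):
    """Count defects grouped by the given column (e.g. severity, vendor, component)."""
    query = f"SELECT {column}, COUNT(*) FROM defects GROUP BY {column} ORDER BY {column}"
    return conn.execute(query).fetchall()

print(defects_by(conn, "severity"))  # [('high', 2), ('low', 1)]
print(defects_by(conn, "vendor"))    # [('acme', 2), ('beta', 1)]
```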
Environment: Windows Server Operating Systems, Linux, Localized Operating Systems, MS SQL, QTP, Quality Center