QA Tester Resume
TX
SUMMARY
- 6+ years of experience in the IT industry with strong knowledge and experience in Analysis, Development, Testing, and Troubleshooting in a variety of challenging environments.
- Proven experience in translating requirements into specification documents, designing automated Test Scripts, and formulating Test Plans and Test Cases for both manual and automated testing.
- Experience in creating Test Plans, Test Scenarios, Test Cases, and Test Scripts, and in analyzing Test Results.
- Proficient in Oracle programming: SQL, PL/SQL, database triggers, stored procedures, functions, constraints, packages, indexes, clustering, and materialized views. Efficient in optimizing, debugging, and testing SQL queries.
- Expertise in developing UNIX shell scripts (Korn). Invoked the Korn shell to run scripts developed using features such as redirection, file descriptors, and command grouping.
- Hands-on experience with Mercury tools: WinRunner, QTP, LoadRunner.
- Expertise with Test Management tools: Test Director, Quality Center, ALM.
- Design, development, and coding with Oracle 10g/9.2/8.x/7.x and SQL Server 2000/2005/2008 for data validation.
- Implemented Software Development Life Cycle (SDLC) methodologies, including Agile, Waterfall, and Spiral, for the testing life cycle to follow the Confidential process in the application.
- Extensive experience in different testing methods such as System, Functionality, Integration, End-to-End Regression, GUI, Web Application, and Black Box testing.
- Collaborated with Developers during Unit testing (including QA presence during code reviews to ensure conformance to group standards).
- Experienced in environments requiring direct customer and client interaction during specification, test plan design, implementation, and QA phases.
- Strong analytical and communication skills; flexible in adapting to new technologies.
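The Korn shell features named above (redirection, file descriptors, and command grouping) can be sketched in a minimal script; the file names and log contents here are hypothetical, not from any project described in this resume:

```shell
#!/bin/ksh
# Illustrative log-splitting sketch: command grouping, redirection,
# and an extra file descriptor. All names and contents are made up.

printf 'INFO start\nERROR disk full\nINFO done\n' > app.log

# Open fd 3 for error lines, then group two commands so their
# combined stdout is redirected to info.txt in one place.
exec 3> errors.txt
{
  grep '^INFO'  app.log
  grep '^ERROR' app.log >&3
} > info.txt
exec 3>&-   # close fd 3

wc -l < info.txt    # 2 INFO lines
wc -l < errors.txt  # 1 ERROR line
```

Grouping the commands means one redirection covers both, while the dedicated descriptor routes error lines to a separate file.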
TECHNICAL SKILLS
Testing Tools: WinRunner, LoadRunner, Rational Suite, TestDirector, Quality Center, QTP 8.2/9.2/9.5/10.0, Rational Robot, Rational TestManager, ALM
Languages: XML, C/C++/C#, SQL, PL/SQL, HTML, VB.NET, ASP, UNIX shell scripting.
Protocols: TCP/IP, UDP, FCIP, ISCSI, IFCP, HTTP, SNMP
Operating Systems: UNIX (Sun OS, HP), Linux, Windows XP/NT, DOS
Databases: Oracle 9i/8i, IBM DB2 8.x/7.x, MS SQL Server 6.5/7.0/2000/2005/2008, TOAD
Data Modeling: ERwin 4.1/3.5.2, ER/Studio, Dimensional Data Modeling, Star Schema Modeling, Snowflake Modeling, FACT and Dimension Tables, Physical and Logical Data Modeling, Visio.
Data Warehousing: Informatica PowerCenter 7.1.1/6.2.1/5.1 (Repository Manager, Source Analyzer, Warehouse Designer, Workflow Manager, Workflow Monitor), data cleansing, ERwin, OLAP, OLTP, SQL*Plus, SQL*Loader, DataStage.
PROFESSIONAL EXPERIENCE
Confidential, TX
QA Tester
Responsibilities:
- Reviewed and analyzed the Business Requirement Document (BRD) and created detailed test scenarios to validate the application functionality.
- Tested the application using Agile methodology.
- Tested the entire application, performing positive and negative, business functionality, integration, regression, and end-to-end testing, and supported User Acceptance Testing (UAT).
- Managed application and Salesforce integration for lead management by MC.
- Used ALM for defect reporting and tracking, and for managing the entire test documentation: requirements management, test case design, test execution, and generation of the required reports. Participated in weekly status meetings and interacted with the development team to discuss technical issues.
- Attended and organized various meetings, such as requirement reviews, defect triage, and Go/No-Go meetings, and contributed effectively to successful application delivery within deadlines.
- Created automated functional regression test scripts using QTP; scheduled, organized, and executed the test scripts in the ALM-QTP integrated environment and documented the test results.
- Used ALM extensively for writing and executing test cases, maintaining test results, reporting and tracking defects, and generating detailed test reports.
- Tested cross-browser functionality in Internet Explorer, Chrome, and Firefox.
- Validated the online application's compatibility on mobile devices such as iPhone, iPad, and other PDAs. Tested app download, installation, accessibility, navigation, convenience, and end-to-end functionality of the application.
- Involved in writing SQL queries to retrieve data from the database.
- Performed SQL validation to verify the data extracts and record counts in the database tables.
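The record-count validation described above can be sketched as a small shell check. In practice both counts would come from SQL (`SELECT COUNT(*)`) against the database tables; here flat-file extracts stand in, and every file name and row is hypothetical:

```shell
#!/bin/ksh
# Sketch of a record-count reconciliation between a source extract
# and a target export. Names and contents are illustrative only.

printf 'id,name\n1,a\n2,b\n3,c\n' > source_extract.csv
printf 'id,name\n1,a\n2,b\n3,c\n' > target_export.csv

# Count data rows, skipping the header line of each file.
src=$(tail -n +2 source_extract.csv | wc -l)
tgt=$(tail -n +2 target_export.csv | wc -l)

if [ "$src" -eq "$tgt" ]; then
  echo "PASS: row counts match"
else
  echo "FAIL: source=$src target=$tgt"
fi
```

The same pass/fail shape applies when the two counts come from source and target SQL queries instead of files.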
Environment: Java, .NET, JSP, Adobe Flex, ALM, SQL, QTP 9.2, VBScript, UNIX shell scripts, SQL Server 2008, IE, Chrome, Firefox
Confidential, NY
QA Tester
Responsibilities:
- Tested the Application using Agile methodology.
- Involved in System Testing for both Automated and Manual Testing.
- Attended Sprint planning, story sizing and daily scrum meetings.
- Conducted Story Acceptance meetings with the Product Owner, including the Business Analyst, UNIX team, and development team.
- Participated in walkthroughs with the Team Lead, Business Analyst, UNIX team, Product Owner, and development team to discuss outstanding defects and scope change requests.
- Monitored and coordinated daily with the offshore QA team.
- Peer-reviewed test cases written by the offshore team.
- Uploaded and executed the Functional Test cases into HP Quality Center.
- Executed the test cases in several phases as the developers fixed the bugs.
- Wrote and updated UAT scripts for the users.
- Sent out the QA Go/No-Go form before moving to the UAT/Production environment.
- Supported users in UAT Testing.
- Conducted meetings with the Product Owner, BAs, UNIX team, and development team to walk through each UAT defect logged by users.
- Used an Incident Report (IR Tracker) for the bug tracking purposes.
- Performed manual functional testing on the interface.
- Performed Back End Testing by executing SQL statements.
- Created and reported defects using ALM.
- Performed web testing, GUI testing, end-to-end testing, production assurance, system testing, and User Acceptance Testing (UAT) as part of quality activities such as bug tracking and control.
- Conducted extensive Regression testing as a part of Release test.
- Automated the application using Quick Test Pro.
Environment: Java, JSP, Portlets, Adobe Flex, Quality Center, SQL, QTP 9.2, VBScript, UNIX shell scripts, JavaScript, TOAD
Confidential, NY
QA Tester
Responsibilities:
- Developed Test plans and Detailed Test cases based on business requirements.
- Analyzed the process to ensure that business requirements are met and that standards imposed by DW operations are adhered to.
- Tested the Application using Agile methodology.
- Attended Sprint Planning, story sizing, and daily Scrum meetings.
- Attended the Requirements reviews, Design reviews, Code reviews to understand the overall process flow and Source to Target Mappings.
- Closely worked with Business Analyst and DSO team to better understand the Requirements.
- Monitored and coordinated daily with the offshore QA team.
- Peer-reviewed test cases written by the offshore team.
- Conducted regression testing after each engine run.
- Used TOAD to test various ETL process flow graphs against business rules and technical requirements.
- Involved in writing SQL queries to retrieve data from the database.
- Performed SQL validation to verify the data extracts and record counts in the database tables.
- Validated tests by cross-checking backend data with SQL queries.
- Used QC (Quality Center) for test management and defect tracking
- Worked very closely with the Dev team to analyze at the backend level and to describe the defects.
Environment: JAVA, SQL, QTP(9.2), QC(Quality Center), TOAD, Client/Server Application
Confidential, VA
QA Tester
Responsibilities:
- Responsible for development, documentation, and maintenance of the test data
- Reviewed business/system requirements, change requests, and design specifications to identify gaps and ensure requirements/business rules are clear, consistent, and testable.
- Developed, designed, and prioritized test scenarios and test cases that provide efficient coverage of requirements consistent with an acceptable level of risk.
- Created RTM and participated in the Test Case Peer Review to ensure test cases appropriately translate and map to requirements.
- Identified test data required to complete test scenarios and interacted with the automation test team, development team, users to ensure test conditions are in sync.
- Developed SQL Queries to perform database testing.
- Performed extensive manual and XML API testing.
- Involved in functional and Regression Testing, modified and documented scripts for Regression testing.
- Created Project, Requirements, Test Plans and Tests in Quality Center.
- Responsible for liaising with downstream systems to verify that they are not adversely affected by the changes.
- Responsible for data conversion testing efforts.
- Performed Backend Testing by using SQL statements for the purpose of database integrity and for data validation of parameters.
- Identified defects and tracked the status of the defects by coordinating with the development team using Quality Center.
- Actively participated in JAD sessions and defect review meetings, and created defect/test metrics and dashboard reports.
- Coordinated with developers to fix the bugs and conducted Functional (Positive and Negative), and Integration Testing.
- Used Doors for requirements tracking and use case processes.
Environment: Windows, UNIX, Oracle 9i/10g, WebLogic, J2EE, Quality Center, DOORS, QTP, ETL, Backend, ILOG JRules, Java, ClearQuest, ClearCase, SQL*Plus, Rapid SQL, MS Excel
Confidential, NJ
QA Tester
Responsibilities:
- Analyzed and reviewed the user/business requirements and functional specification documents.
- Involved in System Testing for both Automated and Manual Testing.
- Developed test plans and test strategies for manual testing.
- Automated the Application by using Quick Test Pro.
- Studied the entity-relationship diagrams to build SQL queries for database testing.
- Worked with Query Analyzer and TOAD to write SQL queries.
- Compared the Extracted data with Business Objects Reports
- Created Project, Requirements, Test Plans and Tests in Quality Center.
- Developed automated test scripts from manual test cases for Regression testing based on the requirement documents using Quick Test Professional.
- Worked extensively with VBScript to customize and enhance the scripts.
- Used Quality Center for defect reporting and tracking.
- Developed the Test Automation Methodology for automated testing of the entire application by using Quick Test Pro.
- Performed back-end testing on the Oracle database by writing SQL queries.
- Performed load balancing testing using LoadRunner.
- Created scripts using VuGen in LoadRunner.
- Performed manual backend testing to verify integration and validity of data in both OLTP and OLAP systems.
- Tested the extraction of data from different OLTP source systems, and its modification and loading into the target staging/target database.
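The extract-versus-report comparison done in this role can be sketched as a sorted diff. This is a hypothetical stand-in (the real comparison was against Business Objects report output, and all file names and rows here are invented):

```shell
#!/bin/ksh
# Illustrative reconciliation of a database extract against a report
# export: sort both datasets, then diff them. Row order may differ
# between the two sources, so sorting first makes the diff meaningful.

printf '1,a\n3,c\n2,b\n' > db_extract.csv
printf '2,b\n1,a\n3,c\n' > report_export.csv

sort db_extract.csv > db_sorted.txt
sort report_export.csv > report_sorted.txt

# diff exits 0 only when the sorted datasets match line for line.
if diff db_sorted.txt report_sorted.txt > /dev/null; then
  echo "MATCH: extract agrees with report"
else
  echo "MISMATCH: investigate differing rows"
fi
```

Sorting first means only genuine content differences, not ordering differences, surface in the diff.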
Environment: Quality Center 9.2/10, QTP 9.2/9.5/10.0, LoadRunner 8.2, VBScript, Oracle, PL/SQL, TOAD, IBM Mainframes, Terminal Emulators, DB2, PVCS, C#, Java, J2EE, WebLogic Server 8.1, SQL Server, Informatica PowerCenter
Confidential, NY
QA Analyst
Responsibilities:
- Involved in collecting requirements and interacting with the customer to ensure that the application meets business requirements.
- Created, executed and analyzed Test Cases based on functional requirement.
- Conducted Smoke testing for every restart and system enhancements.
- Developed automated Test Scripts in QTP.
- Enhanced the QTP scripts by using different checkpoints.
- Used TOAD to test various ETL process flow graphs against business rules and technical requirements.
- Wrote SQL queries to retrieve data from the database.
- Executed Test cases using both Manual and Automated testing method.
- Tested that the application does not break when multiple users access it concurrently.
- Performed Functionality, GUI and Browser, Integration and System Testing, User acceptance testing.
- If multiple versions of an invoice were generated, tested to make sure that the website displayed the latest version.
- Used SQL queries to make sure that the data entered from front-end screens was inserted properly into the right tables.
- Verified that the invoice amount was calculated correctly and that the export data functions worked correctly.
- Parameterized the scripts in QTP to test the multiple sets of data.
- Tested the extraction of data from different OLTP source systems, and its modification and loading into the target staging/target database.
- Conducted Regression testing after every new build.
- Validated tests by cross-checking backend data with SQL queries.
- Conducted backend testing using TOAD.
Environment: Quality Center 9.2/10.0, QTP 9.2/9.5/10.0, ASP.NET, C#, Visual Basic, VBScript, SQL Server, Oracle, Business Objects, Informatica