
Software Quality Engineer Resume

Portland, Oregon

SUMMARY:

  • Experience in defining detailed test plans, including organization, participants, schedule, test and application coverage scope.
  • Strong analytical experience in developing detailed business level test scenarios and individual test events and scripts based on functional and business requirements.
  • Experience in developing, reviewing and managing requirements traceability (requirements, application components, test cases, test case execution results).
  • Contributed to and developed Test Plans and Test Cases for various business applications; complete familiarity with all phases of the SDLC.
  • Experienced in supporting test cycles, compiling Test Status Reports, participating in Defect Status and Project Status meetings, and interacting with Project Managers & Operations Teams.
  • Proficient in: Functional, Integration, System, UAT, Load & Performance and Regression Testing.
  • Expertise in writing SQL queries in the SQL*Plus command line, Oracle SQL Developer, MS SQL Server Management Studio, and TOAD for back-end testing of various databases, such as Oracle 11g, MS SQL Server 2005/2008, Facets & Redis.
  • Used QC/ALM and Rally for maintaining Test Plans, Test Cases, test execution, defect tracking and management, and bug reporting; familiar with other features of Confidential QC/ALM and its administration.
  • Developed automated Test Scripts using QTP/UFT, Microsoft Visual Studio Coded UI and automated various Business Flows for E-Commerce activities.
  • Conducted Performance Testing using LoadRunner for E-Commerce applications and generated summary reports for Transactions time, Resource monitoring and Applications health check for different kinds of user profiles.
  • Worked with a variety of Operating Systems including Windows 7 and Windows Server 2008 R2 Enterprise.
  • Excellent communication skills; able to work as part of a team and independently.
  • Versatile team player with excellent interpersonal and technical documentation skills; able to handle multiple projects simultaneously.

TECHNICAL SKILLS:

Test Management & Defect Tracking Tools: Confidential Application Life Cycle Management (Confidential ALM), Unified Functional Testing (UFT), Confidential LoadRunner, Microsoft Visual Studio Coded UI, Rally

Database: Microsoft SQL Server 2008, Oracle 11g, Facets & Redis.

Office Application: MS Office Professional Plus 2013 (Word, Excel, Visio, Lync, OneNote, Outlook).

Operating system: Windows 7, Windows 8, Windows Server 2008.

Languages & Frameworks: .NET, C#, C++, Java, XML, J2EE, VBScript, SQL.

EMPLOYMENT HISTORY:

Confidential, Portland, Oregon

Software Quality Engineer

Responsibilities:

  • Actively contributed to sprint planning meetings by creating and breaking down QA user stories (US) into tasks and estimating completion time for those tasks.
  • Participated in daily scrum calls, updating Microsoft OneNote with the work plan for the current day, the work completed the day before, and any impediments.
  • Participated in triage meetings and provided feedback on reported defects by describing and demonstrating steps to reproduce (STR) to developers as well as the whole team.
  • Attended backlog grooming for product backlog items: user stories for the current sprint as well as partial or incomplete user stories that needed to be assigned to future sprints.
  • Worked closely with the QA Lead to outline the strategy and process for bringing automation in line with the existing functional test effort.
  • Responsible for the design and development of automated test scripts using Microsoft Visual Studio Coded UI for Windows (client-server/desktop) applications.
  • Created, executed and maintained automated scripts for desktop application using Microsoft Visual Studio Coded UI.
  • Followed a keyword-driven framework, developing C# scripts with Microsoft Visual Studio's Coded UI for better maintainability and code reusability in test automation.
  • Designed and generated keyword-driven automation test scripts in C# to address areas such as regression testing, positive (happy-path) testing, and usability, in preparation for implementation.
  • Extended the keyword-driven framework with C# Coded UI scripts using ODBC to perform back-end (database) validation.
  • Participated in product design reviews to provide input on functional requirements, product designs, and potential problems.
  • Worked closely with the QA Lead to design, modify, and execute test plans, test scripts, and test cases, and documented both manual and automated tests in detail.
  • Maintained detailed documentation of test-case execution and defect retests.
  • Demonstrated extensive expertise in quality assurance and testing phases, including integration (regression) testing, back-end testing using MS SQL Server Management Studio 2012, Citrix login for the Facets database UI, and deployment-readiness testing.
  • Executed back-end testing to validate data between multiple databases (the Redis database against the Facets database).
  • Proven expertise in the analysis of defect status, defect tracking, logging & reporting.
  • Proficient in writing very detailed test cases (to cover maximum functionality) and in generating and maintaining test scripts per business specifications and user and functional requirements.
  • Extensively used Rally to manage, execute, and track test plans, test cases, test results, team and individual task status, testing status, and sizing and estimation.
  • Sent out acceptance requests for completed user stories and defects, both owned and not owned.
  • Participated in product feature training meetings for the business.
  • Attended code review meetings to get an overview of product features and initiate test preparation for those features.
  • Attended retrospective meetings to rate and reflect on the team's approach and accomplishments in the current sprint, as well as approaches for improvement in future sprints.

Environment: .NET (4.5), C#, Microsoft Visual Studio 2013 (Coded UI), Rally, Microsoft SQL Server Management Studio 2012, TOAD, Citrix, Facets Database, Redis Database.
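As an illustrative sketch only: the keyword-driven pattern described in the bullets above separates reusable actions from the test cases that invoke them. The original scripts were C# with Visual Studio Coded UI; this minimal Python stand-in (with invented keyword and step names) shows the dispatch idea, not the actual implementation.

```python
# Minimal keyword-driven sketch: keywords are registered once, test cases
# are plain data rows, so new tests need no new code. All names hypothetical.
ACTIONS = {}

def keyword(name):
    """Register a function under a keyword name."""
    def wrap(fn):
        ACTIONS[name] = fn
        return fn
    return wrap

@keyword("open_screen")
def open_screen(target):
    return f"opened {target}"          # stands in for a Coded UI action

@keyword("enter_text")
def enter_text(field, value):
    return f"{field}={value}"          # stands in for typing into a control

def run(steps):
    """Execute a test case expressed as (keyword, args) rows."""
    return [ACTIONS[kw](*args) for kw, args in steps]

test_case = [
    ("open_screen", ("login",)),
    ("enter_text", ("username", "qa_user")),
]
results = run(test_case)
```

Because the test case is just data, maintenance stays in one place: a UI change touches only the keyword implementation, not every script that uses it.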

Confidential, Florida

QA Analyst

Responsibilities:

  • Analyzed Business Requirement Specification (BRS), Software Requirement Specification (SRS) and User Requirement Document (URD).
  • Developed Test Plan and Test Approach artifact with resource requirements and time estimates.
  • Designed and developed Test Scenario, Test Cases, and Test Steps for various Business methods covering both positive and negative testing requirements.
  • Followed Agile methodology on these projects: developers, users, and testers worked together to create stories (individual features of a project) and document requirements through interviews and analysis.
  • Wrote story narratives, functional & non-functional system requirements, Test scripts.
  • Performed Manual Testing of web-based and client-server applications and used Confidential ALM as the test management tool.
  • Worked in ALM to create and document Test Plans and Test Cases and to register expected results.
  • Extensively used SQL*Plus and Oracle SQL Developer to access and manipulate the Oracle 10g/11g database and to perform back-end validation.
  • Developed automated Test Scripts by writing VBScript to perform Functional Testing, and Regression Testing of the application implementing data-driven framework, hybrid framework.
  • Performed Functional and Regression testing following data-driven framework using Confidential UFT.
  • Involved in writing SQL queries, Database Checkpoints to verify data quality and calculations.
  • Involved in daily meetings and walkthroughs with various teams as required to better understand business requirements, software specifications, and the development process flow at various stages.
  • Utilized ALM to track, report, and manage defects throughout the test cycle and attended Defect Status Meetings on a daily basis during the testing cycle.
  • Attended weekly Project Status Meeting with Development team and QA Manager, and worked closely with QA Manager to define Test Scope, Gap Analysis, Risk, Dependency, and Constraints.
  • Participated in Task Forces to implement changes and accomplish important initiatives throughout Testing Operations.
  • Worked with end users, business users, and the project team to resolve issues.
  • Communicated with Application Developers, Project Manager and other Team Members on Application testing status.

Environment: .NET, Java/J2EE, VBScript, Application Lifecycle Management, Unified Functional Testing, Oracle 10g, SQL Server 2008, SQL, PL/SQL, TOAD, IBM WebSphere, and Windows Server 2012.
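The back-end validation work described above boils down to querying the database directly and comparing against what the application reports. A minimal sketch of that pattern follows; sqlite3 stands in for the Oracle/SQL Server back ends, and the table, columns, and values are invented for illustration.

```python
import sqlite3

# Back-end validation sketch: query the database of record and compare
# against values captured from the UI. All schema/data here is hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE claims (claim_id TEXT PRIMARY KEY, status TEXT)")
conn.executemany("INSERT INTO claims VALUES (?, ?)",
                 [("C100", "PAID"), ("C101", "DENIED")])

def db_status(claim_id):
    """Fetch the authoritative status for a claim from the back end."""
    row = conn.execute(
        "SELECT status FROM claims WHERE claim_id = ?", (claim_id,)
    ).fetchone()
    return row[0] if row else None

# Values the front end displayed (captured during manual/automated testing).
ui_reported = {"C100": "PAID", "C101": "DENIED"}
mismatches = {cid: (ui, db_status(cid))
              for cid, ui in ui_reported.items() if db_status(cid) != ui}
assert not mismatches  # any entry here is a front-end/back-end defect
```

The same query-and-compare loop applies whether the queries run in SQL*Plus, SQL Developer, or from an automation script.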

Confidential, Beltsville, Maryland

Software Test Engineer

Responsibilities:

  • Analyzed software requirements, workflows and designs to define detailed test suites, test cases, test data and procedures.
  • Wrote test plans, documented bugs, and communicated with Development, Operations, and Product Management.
  • Developed test cases, test scripts, and test scenarios from approved requirement and design documents.
  • Assisted in Project Management, by accurately reporting testing progress and status over time.
  • Tracked defect disposition, test defect resolution along with regression testing to ensure system stability.
  • Worked very closely with the Application Design Team and provided practical feedback to the Designers.
  • Worked with the QA Manager in developing Responsibility Matrices for the Team Members periodically.
  • Developed automated Test Scripts by writing VBScript to perform Functional Testing, and Regression Testing of the application.
  • Performed Functional and Regression testing using Confidential QTP.
  • Developed custom function/sub-routine libraries to support automated testing solutions.
  • Created both keyword-driven and data-driven frameworks to reduce maintenance time for automated scripts and the object repository.
  • Created XML files (data-driven framework) for test input data for better script optimization and enhancement.
  • Performed functional decomposition of requirements for developing test cases.
  • Worked with users and Business Analysts to define and design test scenarios and test data.
  • Wrote SQL queries to test Data Integrity, Referential Integrity and performed Database Testing for the Application.
  • Used Quality Center as the test repository and used it for executing the test cases and scripts and logging & generating various reports and graphs for further analysis.
  • Used Quality Center for bug tracking and reporting, also followed up with the development team to verify bug fixes and update bug status.
  • Designed Performance test scenarios using LoadRunner, Executed Stress test and analyzed the results.
  • Used the Rendezvous concept in LoadRunner to generate peak load on the server, stressing it and measuring its performance; performed dynamic parameterization and correlation.
  • Used the LoadRunner Controller to create goal-oriented scenarios, moderated load tests, and identified system capacity. Conducted load and reliability testing on website workflows to identify and report performance bottlenecks.
  • Used LoadRunner Analysis to analyze test results and locate bottlenecks causing performance degradation.
  • Attended periodic meetings, teleconferences and led discussions on problem resolution.

Environment: ASP, ASP.NET, VB.NET, C/C++, HTML, DHTML, VB, SQL Server, TSQL, TCP/IP, Windows Professional, IIS, MS Exchange Server, Confidential LoadRunner, Quality Center, Quick Test Professional and IE.
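The rendezvous idea mentioned above — virtual users pausing at a synchronization point so a transaction fires from all of them at the same instant — can be illustrated without LoadRunner. This is a pure-Python sketch of the concept (threads and a barrier standing in for virtual users and the rendezvous point), not LoadRunner script code.

```python
import threading

# Rendezvous sketch: each "virtual user" blocks at the barrier; once all
# have arrived, they are released together, producing a burst of peak load.
N_USERS = 5
rendezvous = threading.Barrier(N_USERS)
hits = []
lock = threading.Lock()

def virtual_user(uid):
    rendezvous.wait()          # all users released simultaneously
    with lock:
        hits.append(uid)       # stands in for the timed transaction

threads = [threading.Thread(target=virtual_user, args=(i,))
           for i in range(N_USERS)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

In LoadRunner the same effect is configured declaratively (rendezvous points plus a goal-oriented scenario in the Controller); the barrier here only demonstrates why the load arrives as a spike rather than trickling in.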

Confidential, McLean, Virginia

Software Quality Assurance Specialist

Responsibilities:

  • Reviewed Business Requirements Documents and the Technical Specification.
  • Involved in Planning, coordinating, developing Test Plans, Test Procedures, and Test cases based on the Requirements and Design Documents.
  • Conducted Functional, System, Integration, Regression, Performance, UAT, and Smoke Testing of Customer Care and Billing application (Ensemble) with specific focus on CSM (Customer Service Management).
  • Developed Test Scripts by writing VBScript and conducted Regression testing for various application modules using QTP.
  • Tested the application manually by executing test cases prior to automation.
  • Used QTP to create Test scripts using VBScript for both System Testing and Regression Testing.
  • Developed system-level, detailed, and overall Test Plans using the business specifications.
  • Conducted system integration testing of the application manually for the different modules.
  • Developed test plans to outline the scope, approach, resources, and schedule of testing.
  • Designed manual test cases based on functional requirements and technical documents, and executed test cases during System, Regression and User Acceptance testing.
  • Executed SQL queries to validate the front-end data with the database (back-end). Used DML to manipulate the data wherever applicable to verify the functionality.
  • Performed end-to-end testing covering the complete application environment in a situation that mimics real-world use, such as interacting with a database, other applications, or a back-end system.
  • Evaluated and documented the application’s ability to meet the established requirements.
  • Involved in the team meetings with representatives from Development, Database Management, Configuration Management, and Requirements Management to identify and correct defects.
  • Tested many functional areas including uploading of published texts into the database, classification, search-result relevance estimation and sorting, the Graphical User Interface, and information sharing.
  • Performed testing of a web application developed for customer access: developed test cases for functionality testing based on functional specifications and feature descriptions, executed test scripts, and verified bug fixes.
  • Actively assisted developers in solving issues by reproducing reported defects on demand.
  • Responsible for automated Smoke and Regression Testing using QTP.
  • Developed Data Driven Automation Test Framework and enhanced test scripts using Parameterization, Optional Steps, Synchronization points and verification points.
  • Effectively handled dynamic values while enhancing scripts using Regular Expressions.
  • Created a data-driven automation framework and used external XML-based files to store test data for easy maintenance of automation scripts.
  • Created shared object repositories and associated with test scripts for robust automation test suites.
  • Reported test automation results to management accordingly.

Environment: Quick Test Professional, Confidential LoadRunner, TestDirector, SQL, Java, J2EE, HTTP, XML, IIS, and SQL Server 2005/2008.
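Two techniques recur in the bullets above: externalizing test data into XML files, and using regular expressions to absorb dynamic values in expected results. A minimal Python sketch of both follows; the original framework was QTP/VBScript, and the XML shape, field names, and patterns here are invented for illustration.

```python
import re
import xml.etree.ElementTree as ET

# Data-driven sketch: test inputs and expected patterns live in external
# XML, so data changes need no script edits. Shape is hypothetical.
TEST_DATA = """
<tests>
  <case user="qa_user" expected="Order #\\d{6} confirmed"/>
  <case user="guest"   expected="Please log in"/>
</tests>
"""

def load_cases(xml_text):
    """Read (user, expected-pattern) rows from the XML test-data file."""
    return [(c.get("user"), c.get("expected"))
            for c in ET.fromstring(xml_text).findall("case")]

def verify(actual, expected_pattern):
    """Regex match absorbs dynamic values like generated order numbers."""
    return re.fullmatch(expected_pattern, actual) is not None

cases = load_cases(TEST_DATA)
assert verify("Order #483920 confirmed", cases[0][1])   # dynamic order id
assert verify("Please log in", cases[1][1])             # literal match
```

The regex in the first row is the point: a freshly generated order number would fail a literal comparison on every run, but `\d{6}` accepts any six-digit value while still catching a malformed message.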
