
Senior QA Analyst Resume

Irving, TX

SUMMARY

  • 7+ years of diversified IT experience in Quality Assurance, validation, and testing of client-server and web-based applications.
  • Experienced in testing financial, billing, and telecom applications.
  • Experienced in Oracle 9i/10g RDBMS, PL/SQL programming, and SQL*Plus.
  • Expertise in Sun Solaris UNIX, basic shell scripting, dimensional modeling, and data warehouse concepts.
  • Actively involved in all phases of the testing life cycle, including functional, regression, integration, system, and user acceptance testing.
  • Specialized in analyzing functional specifications; writing detailed test plans and test cases; executing test scenarios; comparing actual results against expected results; creating test data; automating test cases for system and regression testing; and analyzing bug-tracking reports.
  • Proficient in manual and automated testing using Win Runner and Quick Test Pro, with experience in Test Director, Quality Center, and the Rational suite.
  • Sound knowledge of SDLC methodologies and SEI CMMI practices.
  • Excellent interpersonal, analytical, and communication skills.
  • Extensive experience and expertise in GUI application testing, including functional, integration, regression, and system testing.
  • Experienced on Agile and iterative projects.
  • Good experience in test strategy and test plan creation, development, and management.
  • Excellent experience in the development of web-based and client-server applications.
  • Good experience testing web and Adobe Flex applications.
  • Excellent knowledge of QA methodologies, QA processes, and the software life cycle.
  • Excellent knowledge of requirements gathering and functional specification reviews.
  • Well versed in software testing methodologies, development of test cases, and design and execution of strategic test plans and test scenarios.

TECHNICAL SKILLS

Tools: Win Runner 8.2/7.0, Load Runner 9.5, Mercury Quality Center 11.0, Quick Test Pro 8/9.2, Requisite Pro 7, Test Manager, Clear Quest 7.0, Rational Functional Tester, Subversion, Rational Robot, BOXI Reports, Microsoft Visual SourceSafe 6.0.

CM Tools: PVCS Version Manager 8/7, ClearCase 7.0

GUI/Languages: TOAD 11.0, SQL*Plus, TSL, HTML, Java, JScript, VBScript, ASP, XML, Developer 2000 (Forms 6i, Reports 6i), Visual Basic 5.0/6.0, Informatica 7.1, Business Objects 5.1, MS SharePoint 2010, C/C++, C#.NET, ADO.NET, ASP.NET.

Web Server: WebLogic 5.0, MS IIS 5.0/6, JBoss, Apache.

Databases: Oracle 10g/9i/8i, MS-Access, MS SQL Server 6.0/7/2000, DBMS.

Software: MS Office 2003/2000, MS Word, MS Excel, MS PowerPoint

Operating Systems: Windows XP/2000/2003/Vista (Beta)/Vista, MS DOS 6, Sun Solaris UNIX 10, RH Linux ES3

PROFESSIONAL EXPERIENCE

Confidential, Irving, TX

Senior QA Analyst

Responsibilities:

  • Developed in-depth knowledge of the requirements and the end-to-end business flow, which helped in formulating detailed test specifications, including test plans and test scenarios based on the requirements.
  • Worked with Business Analysts regularly to develop test cases from the business requirements and ensured complete test coverage.
  • Reviewed functional specifications and architecture design
  • Coordinated with the offshore team on a daily basis and was responsible for handoffs to the offshore team to ensure 24x7 testing cycles.
  • Created QTP driver scripts that execute all or selected manual test cases maintained in Excel sheets (a sketch of this pattern follows this list).
  • Automated testing of the legacy contracts using QTP.
  • Analyzed use cases and detail system designs for creating testing scenarios
  • Developed traceability matrix to map Business and Functional requirements with test cases
  • Defined test strategy and test plans based on product requirements and functional specifications
  • Designed, wrote, and executed test scripts for System, Integration, Regression, and Performance testing.
  • Was responsible for Interface Testing and Security Testing
  • Played an integral role in E2Ei/VRD testing, which often required long hours and weekend work.
  • As part of end-to-end testing, worked on more than 10 applications, including SFDC (Salesforce.com), PQ (ProQuest), CAMEO, Premisys, vPrice, OM+, eVal, ECS, and Provisioning Controller.
  • Had good working knowledge of all the applications that were part of E2Ei.
  • Collected test metrics and reported them in a systematic fashion
  • Managed the automation of the test cases for Regression Testing and GUI verification
  • Responsible for automating new applicable tests into the automation framework.
  • Responsible for generating test data and test coverage analysis.
  • Coordinated testing activity wherever required for both onsite and offshore team
  • Participated in daily scrums to effectively communicate progress and roadblocks in order to deal with constantly changing requirements.
  • Responsible for uploading test plans, test scripts into Quality Center
  • Involved in tracking and reviewing defects using the Quality Center
  • Designed and scripted detailed test cases in Quality Center for regression scenarios
  • Prepared detailed Test Summary Reports
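
A minimal sketch of the QTP driver-script pattern referenced above, assuming the manual test cases live in an Excel sheet with TestCaseID and Execute columns; the sheet path, column names, and test functions are illustrative placeholders, not the actual project assets:

    ' Hypothetical QTP (VBScript) driver: import the manual test case sheet and
    ' run only the cases flagged for execution.
    DataTable.ImportSheet "C:\QA\ManualTestCases.xls", "TestCases", "Global"

    Dim rowCount, i, testId, runFlag
    rowCount = DataTable.GetSheet("Global").GetRowCount

    For i = 1 To rowCount
        DataTable.GetSheet("Global").SetCurrentRow i
        testId  = DataTable.Value("TestCaseID", "Global")
        runFlag = UCase(DataTable.Value("Execute", "Global"))

        If runFlag = "Y" Then
            ' Each test case ID maps to a reusable automation function.
            Select Case testId
                Case "TC_LOGIN"        : RunLoginTest
                Case "TC_CREATE_ORDER" : RunCreateOrderTest
                Case Else
                    Reporter.ReportEvent micWarning, testId, "No automation mapped for this case"
            End Select
        End If
    Next

    Sub RunLoginTest()
        ' Placeholder for the real automated login steps
        Reporter.ReportEvent micPass, "TC_LOGIN", "Executed"
    End Sub

    Sub RunCreateOrderTest()
        ' Placeholder for the real automated order-creation steps
        Reporter.ReportEvent micPass, "TC_CREATE_ORDER", "Executed"
    End Sub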

Environment: Windows 7, Java, SQL, Quick Test Pro 10, Quality Center 11.0, HTML, MS Project, UML, MS Office Suite, GCT Tool

Confidential, Centennial, CO

SQA Analyst

Responsibilities:

  • Reviewed the Business Requirement Specification document and the technical specifications of the application.
  • Involved in project coordination and implemented QA methodology.
  • Developed traceability matrix to map Business and Functional requirements with test cases
  • Interacted with developers, discussing the specifications provided by the analysts as well as changes and discrepancies in the application.
  • Defined test strategies and test plans based on product requirements and functional specifications.
  • Designed, wrote, and executed test scripts for System, Integration, Regression, and Performance testing.
  • Worked with SQL queries for data manipulation.
  • Wrote test cases for more than 50 screens in accordance with the business logic.
  • Designed and scripted detailed test cases in Quality Center for regression scenarios
  • Prepared test strategy and test plan documents for new enhancements.
  • Used TOAD for querying and testing the data retrieved from the development database.
  • Produced detailed UAT and System test cases that are still in use.
  • Collected test metrics and reported them in a systematic fashion.
  • Performed manual testing, functional testing, performance testing, and regression testing.
  • Managed the automation of the test cases for Regression Testing, GUI and Functionality testing
  • Responsible for automating new applicable tests into the automation framework
  • Tested various links of Home Page, text-hyperlinks, image-hyperlinks and Web based applications.
  • Responsible for generating test data and test coverage analysis
  • Coordinated testing activity wherever required for both onsite and offshore teams.
  • Managed the defect management process in Quality Center and interacted with developers to resolve technical issues and incidents.
  • Participated in daily scrums to effectively communicate the progress and road blocks in order to deal with the constantly changing requirements.
  • Responsible for uploading test plans, test scripts into Quality Center
  • Involved in tracking, reviewing, analyzing and comparing defects using the Quality Center
  • Developed SQL queries and executed them against the database to perform backend testing (see the backend-query sketch after this list).
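
A simplified sketch of the backend-testing approach described above, assuming a QTP/VBScript check that runs a SQL query through ADODB and compares the database count against a value observed in the application; the connection string, table, and expected value are placeholders:

    ' Hypothetical backend check: execute a SQL query via ADODB and compare the
    ' result with an expected value. Connection details are placeholders.
    Dim conn, rs, sql, actualCount, expectedCount
    expectedCount = 125   ' value observed in the application UI or report

    Set conn = CreateObject("ADODB.Connection")
    conn.Open "Provider=OraOLEDB.Oracle;Data Source=DEVDB;User Id=qa_user;Password=****;"

    sql = "SELECT COUNT(*) AS CNT FROM customer_orders WHERE status = 'OPEN'"
    Set rs = conn.Execute(sql)
    actualCount = rs.Fields("CNT").Value

    ' Reporter is the QTP reporting object; outside QTP, WScript.Echo could be used instead.
    If actualCount = expectedCount Then
        Reporter.ReportEvent micPass, "Backend count check", "Expected and actual match: " & actualCount
    Else
        Reporter.ReportEvent micFail, "Backend count check", "Expected " & expectedCount & ", got " & actualCount
    End If

    rs.Close
    conn.Close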

Environment: Windows 7, Oracle, UNIX, Java, JavaScript, SQL, Quick Test Pro, Quality Center 11.0, Bug Tracker, .NET, ASP.NET, HTML, MS Project, UML, Visio, MS Office Suite, Manual and Automation Testing

Confidential, Atlanta, GA

SQA Analyst

Responsibilities:

  • Involved in black-box, functional, integration, load, regression, and system testing for modules developed in Ext JS and Adobe ColdFusion.
  • Developed Test Plan, Test Strategy and Test Cases using QC
  • Conducted the Test Plan and Test Case Review meetings with all the project members.
  • Worked as the Test Director administrator, responsible for creating users and groups and assigning permissions.
  • Wrote and executed manual test scripts based upon existing product functionality, specifications, requirements and other sources as needed.
  • Involved in identifying Test Data for executing the Regression Test Cases
  • Developed Baseline Scripts for testing the future releases of the application
  • Tested the functionality of applications using QTP and automated the application for regression testing.
  • Developed the DAT (Data Driven Automation Test) architecture, a data-driven methodology implemented with portable scripts and function libraries. The framework is built around modularization and reusability.
  • The DAT scripting approach separates common actions into script libraries so that the scripts can be reused across different projects within the organization.
  • The DAT framework incorporates a driver function within every library script to achieve portability across applications. The library function scripts operate independently of each other, as do the sub-functions within them, establishing a uniform flow from the driver scripts to the library scripts (a sketch of this driver/library separation follows this list).
  • Mapped the security and functional requirements in Test Director to the scripts to ensure traceability.
  • Created T-SQL statements and stored procedures in SQL Server for various schemas as required by the business requirements.
  • Recommended program improvements or revisions to product management, programmers, and system analysts.
  • Involved in preparing daily status reports.
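
A minimal sketch of the driver/library separation described in the DAT bullets above, assuming a shared VBScript function library loaded by QTP and a single driver entry point that dispatches data-driven steps; all function, keyword, and column names are illustrative:

    ' --- Common function library (e.g. Common_Lib.vbs), shared across projects ---
    ' Each library exposes small, independent functions plus one driver entry point.

    Function LoginToApp(userName, password)
        ' Placeholder for the real login steps against the application under test
        Reporter.ReportEvent micDone, "LoginToApp", "Logged in as " & userName
        LoginToApp = True
    End Function

    Function CreateRecord(recordData)
        ' Placeholder for the real record-creation steps
        Reporter.ReportEvent micDone, "CreateRecord", "Created: " & recordData
        CreateRecord = True
    End Function

    ' Driver function: the only entry point the test scripts call, which keeps
    ' the library portable across applications.
    Function DAT_Driver(keyword, dataValue)
        Select Case UCase(keyword)
            Case "LOGIN"  : DAT_Driver = LoginToApp(dataValue, DataTable("Password", dtGlobalSheet))
            Case "CREATE" : DAT_Driver = CreateRecord(dataValue)
            Case Else
                Reporter.ReportEvent micWarning, "DAT_Driver", "Unknown keyword: " & keyword
                DAT_Driver = False
        End Select
    End Function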

Environment: Windows XP, VBScript, QTP 9.0, WinRunner 7.6, Load Runner 8.0, Test Director 8.0, ColdFusion, Ext JS, Siebel, IIS Web Server, HTML, XML, UNIX, SQL Server, MS Access, Microsoft Visual SourceSafe 6.0.

Confidential, Norwalk, CT

QA Analyst

Responsibilities:

  • Analyzed Business, Functional Requirements and automated test cases for both positive and negative tests.
  • As part of an Agile team, was involved in project planning sessions to estimate the time and effort required to execute the test cases.
  • Built the QC 9.2 architecture for a better understanding of the Capability and Sub-Capability structure.
  • Worked on creating a keyword-driven automation framework for testing the web application (see the keyword-runner sketch after this list).
  • Created QTP driver scripts that execute all or selected manual test cases from Quality Center or Excel sheets.
  • Customized classes, created new methods, and overrode existing methods as required for automation.
  • Developed & updated automation scripts using Quick Test Pro on different functionalities of the application
  • Performed data-driven testing using test data from data files, Excel files, etc.
  • Used Quality Center as a central repository to store manual test cases, automated QTP generated scripts
  • Developed and maintained automated test scripts using Quick Test Pro on different functionalities of the application
  • Identified all the Properties of Object Repository to ensure that each object is recognized by the script using Object Spy in QTP
  • Involved in development of test cases for Java application. Automated test cases for each valid transaction and reported bugs.
  • Performed System Testing, Regression, Integration Testing.
  • Used customized and built-in Exception Handlers to make sure the scripts work in unattended mode
  • Created Classes and Methods in order to work with Custom Objects as well as standard Objects.
  • Wrote automation scripts to perform backend testing by running SQL queries against the Oracle database.
  • Created scripts that dynamically build queries from the mapping document and executed them.
  • Responsible for validating the quality of data in the reporting mart for campaign purposes.
  • Responsible for testing and sending out status reports on a weekly basis.
  • Responsible for updating and maintaining the keyword-driven automation scripts and Quality Center for all defects found during functional and regression testing, and for following up the bug life cycle.
  • Conducted meetings to turnover test cases to Regression team.
  • Performed Integration testing to validate integration between sub modules of the application.
  • Tested the Quality Center 10.0 Beta version to understand how it would affect the testing process.
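
A simplified sketch of the keyword-driven runner referenced above: each data row supplies a keyword, a target object, and a value, and a generic executor maps keywords to QTP web operations. The object repository names, sheet layout, and supported keywords are assumptions, not the actual framework:

    ' Hypothetical keyword executor for a web application under QTP.
    ' Data sheet columns assumed: Keyword, ObjectName, Value.
    Function ExecuteKeyword(keyword, objectName, value)
        Dim pg
        Set pg = Browser("AppBrowser").Page("AppPage")   ' illustrative repository names

        Select Case UCase(keyword)
            Case "ENTERTEXT"
                pg.WebEdit(objectName).Set value
            Case "CLICK"
                pg.WebButton(objectName).Click
            Case "SELECT"
                pg.WebList(objectName).Select value
            Case "VERIFYTEXT"
                If pg.WebElement(objectName).GetROProperty("innertext") = value Then
                    Reporter.ReportEvent micPass, keyword, objectName & " matched expected text"
                Else
                    Reporter.ReportEvent micFail, keyword, objectName & " did not match expected text"
                End If
            Case Else
                Reporter.ReportEvent micWarning, keyword, "Unsupported keyword"
        End Select
    End Function

    ' Driver loop: walk the Global data sheet and execute each keyword row.
    Dim r
    For r = 1 To DataTable.GetSheet(dtGlobalSheet).GetRowCount
        DataTable.GetSheet(dtGlobalSheet).SetCurrentRow r
        ExecuteKeyword DataTable("Keyword", dtGlobalSheet), _
                       DataTable("ObjectName", dtGlobalSheet), _
                       DataTable("Value", dtGlobalSheet)
    Next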

Environment: Quick Test Professional 9.5, Quality Center 9.2/10.0, SQL, Oracle SQL*Plus, Java, J2EE, Exceed 2008 (UNIX Solaris), MS Excel

Confidential, Billerica, MA

Senior QA Analyst

Responsibilities:

  • Set up the QA environment by installing all hardware devices and software applications.
  • Defined and wrote program specifications and test plans.
  • Reviewed use cases and created test cases based on them.
  • Created UAT test specs based on the BRD documents.
  • Involved in database testing and executing SQL test scripts.
  • Wrote T-SQL queries to fetch data from the database and validate it.
  • Followed Scrum methodology throughout the programs.
  • Interacted with the Business Analysts and Developers on the project.
  • Maintained functional and technical documentation.
  • Identified inefficiencies in, and recommended changes to, IT practices, procedures, and the approach to customer service.
  • Tested the database, web services, and smart client applications.
  • Tested capture and document authentication software.
  • Tested facial recognition software.
  • Verified printed cards against the card design document.
  • Verified all security features for printed cards (digital watermarking).
  • Reviewed defects with the project team.
  • Worked on domestic and international programs.
  • Gained experience with configuration management tools such as Tinderbox and Perforce (source control).
  • Developed test coverage reports for the QA Manager.
  • Supported the customer UAT environment (traveled to customer sites for DE and IA DOT UAT support).

Environment: ASP.NET, C#.NET, AJAX, SQL Server 2008, SQL, Agile/Scrum Methodology, Oracle 11g, Web Services.

Confidential

Software QA Analyst

Responsibilities:

  • Conducted Distributed and multi-user testing.
  • Analyzed user requirement document and developed test plans, which includes the objectives, testing strategies, test environment and others.
  • Attended Change Request meetings to document changes and implemented procedures to test changes.
  • Wrote test cases and test scripts for the functionality testing.
  • Closely worked with developers and System Engineers to nail down the technical problems.
  • Tested Application with Development, Staging, and Production Environments.
  • Prepared and authored various user-level and system-level documents.
  • Prepared various system and interface documents for the updated QA process.
  • Coordinated with development and functional teams regarding releases and bug tracking system.
  • Coordinated training for users with latest releases and change requests.

Environment: HTML, DHTML, JavaScript, VBScript, Oracle 8.1, Win Runner, Test Director, Windows XP

Confidential

Software QA Analyst

Responsibilities:

  • Analyzed the functional business requirements and design documents and developed test plans for the different testing stages.
  • Prepared test cases for modules such as Bodily Injury Liability, Property Damage Liability, Personal Injury Protection, Uninsured Motorist Coverage, and Comprehensive Damage.
  • Performed manual testing for the modules mentioned above.
  • Participated in enhancement meetings, investigated software bugs, and helped developers resolve technical issues.
  • Conducted navigational testing by clicking on various hyperlinks and verifying that they redirected to the correct pages.
  • Performed backend testing by executing SQL queries with TOAD for database validation.
  • Involved in User Acceptance Testing (UAT)
  • Extensively used Win Runner for functional and Regression testing
  • Performed regression testing on the application after modifications were made
  • Maintained defect reports using Test Director and sent weekly status reports showing the issues to be resolved across the team.
  • Made schedules for system and integration testing for each release.
  • Prepared detailed Test Summary Reports.
  • Performed load/performance and regression testing using Win Runner.
  • Interacted with developers on spec reviews and technical problems and to resolve critical bugs.

Environment: Test Director, Win Runner, Quality Center, Windows XP, SQL Server, ASP, HTML, TSL, MS Office
