
QA Analyst Resume

St Petersburg, FL


  • 7+ years of experience in Quality Assurance, Requirements Gathering, Software Design, Development and Test Management.
  • Good understanding of software development life cycle (SDLC) models such as Agile, Waterfall, and Verification & Validation, with experience implementing each of them.
  • Expertise in gathering and analyzing requirements and in designing and developing Test Plans, Test Cases and Test Scripts.
  • Worked with testing methods such as Integration testing, executing tests in the appropriate test regions using Quality Center.
  • Hands on experience in manual testing and Automated testing using QTP, Quality Center, Microsoft Product Studio and Test Director.
  • Expertise in testing websites by performing content validation, navigation testing and compatibility testing.
  • Extensively performed System Integration testing, data validation for websites, navigation testing, Unit testing, GUI testing, Regression testing, Functional testing, and Positive and Negative testing.
  • Experience in test case definitions, development and maintenance of test scripts and documentation of all the phases of the QA lifecycle.
  • Experience in SQL, PL/SQL, DML, DDL, and writing complex SQL and test scripts for Backend Testing against relational databases, including Data Warehouse systems.
  • Experience with writing SQL queries for data validation and data integrity testing.
  • Involved in gathering requirements, developing Test Plans, Test Scripts, Test Cases & Test Data using specifications and design documents.
  • Extensively used Rational tools (Test Manager, Clear Case, Rational Functional Tester and Clear Quest).
  • Extensively used Clear Quest and Quality Center for Defect Tracking.
  • Interacted with Business users regarding Business Requirements.
  • Attention to detail and ability to work on tight schedules and on different applications concurrently.
  • Ability to work in a team environment. Strong communication and interpersonal skills. Ability to interact with customers with ease and professionalism.
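The SQL-based data validation and data integrity testing listed above can be sketched as a minimal example. SQLite stands in for the actual Oracle/Teradata/DB2 databases, and the `customers` table and its columns are illustrative assumptions, not from any real project schema:

```python
import sqlite3

# Hypothetical staging table used only for illustration; real projects
# validated Oracle/Teradata warehouse tables with similar queries.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (
        customer_id INTEGER PRIMARY KEY,
        email TEXT,
        created_date TEXT
    );
    INSERT INTO customers VALUES
        (1, 'a@example.com', '2012-01-05'),
        (2, NULL,            '2012-02-10'),
        (3, 'c@example.com', NULL);
""")

# Data-integrity checks: NULLs in a required column and duplicate keys.
null_emails = conn.execute(
    "SELECT COUNT(*) FROM customers WHERE email IS NULL"
).fetchone()[0]
dup_keys = conn.execute(
    "SELECT COUNT(*) FROM (SELECT customer_id FROM customers "
    "GROUP BY customer_id HAVING COUNT(*) > 1)"
).fetchone()[0]

print(null_emails, dup_keys)  # 1 0
```

The same pattern (count the rows that violate a rule and expect zero, or a known number) scales to any required-column or unique-key check.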


ETL Tools:


Databases:

Oracle 8.x/9.x/10g, Teradata, DB2 UDB 7.8 EEE, SQL Server 7.0/2000, MS Access 97/2000.

Reporting Tools:

Business Objects 5.1 / 6.0, MicroStrategy 9.2.1


Programming Languages:

C, C++, Oracle, VB6.0.

Web Technologies:

XML, DHTML, HTML, JSP, JavaScript, VBScript



Version Control Tools:

VSS, Clear Case and AbInitio EME.

Testing Tools:

Quality Center 10, Clear Quest, Test Manager, Rational Requisite Pro.

Other Tools:

TOAD, SQL Assistant.

Operating Systems:

Windows 95/98/2000/NT/XP, UNIX

Confidential, St. Petersburg, FL Apr’ 2012 – Till Date
Role: QA Analyst

  • Involved in team reviews of requirements/design documents for the purpose of creating Test Cases from the Use Case documents and for test planning to validate requirements testability.
  • Obtained a detailed knowledge of the business process being followed in the project environment.
  • Identified system integration requirements, coordinated the collection and verification of business needs.
  • Gathered functional and non-functional requirements
  • Defined test data and test environment requirements
  • Created Requirement Traceability Matrix (RTM)
  • Involved in different types of testing: Functional testing, Regression testing, Build Verification Testing, Acceptance Testing, GUI Testing, sanity check, Validation Testing, Database Testing and Integration Testing.
  • Communicated with team and developers through all phases of testing
  • Extensively involved in analyzing business and functional requirements. Involved in preparing Test strategy, Test Plan, Test Cases and Test data.
  • Worked with development team members, testing team & business people to better understand system functionality in order to improve testing quality control services
  • Prepared test data for positive and negative test scenarios as per application specifications and application requirements and wrote test plans for web distribution
  • Conducted Walkthroughs of the test plans with the development and design team. Understand the requirements and formulate the test specifications including test plan and test scenarios based on requirements.
  • Performed Web Services testing using Quality Center for Manual Testing.
  • Tracked bugs using Quality Center and performed regression testing of the entire application once the bugs were fixed. Tested the website for all functionalities. Organized and managed the planning and execution of Test Cases, and tracked and resolved errors/defects using Quality Center.

Environment: Windows 2000, HTML, XML, Java, ASP.Net, Windows XP, Oracle 9i, Quality Center (Test Director).

Confidential, Holland, MI Jan’ 2010 to Mar’ 2012
Project: Application Maintenance Support
Role: QA Tester

This application involves maintenance and enhancements to a sales application for one of the world’s largest consumer goods companies, which enjoys over 50% market share of Johnson Controls products in the United States. The client monitors its sales force operations using a legacy system, which is supported on an onsite-offshore delivery model.


  • Analyzed user requirements and software requirement specifications documents
  • Implemented iterative testing using Agile Methodologies.
  • Worked on Integration testing based on Claims System.
  • Responsible for Integration testing with a focus on the database, including analyzing and determining technical programming requirements.
  • Participated in full life cycle testing activities throughout all phases of the SDLC
  • Performed a thorough analysis of business requirements, functional requirements and technical requirements to develop robust End-to-End test conditions, cases and scenarios
  • Gathered requirements and developed and ran tests.
  • Extensively verified Extraction, Transformation and Loading (ETL) results by executing SQL queries against Oracle tables for back-end validation.
  • Created test plans, wrote design steps and copied test steps from Quality Center for the regression test cases.
  • Used Quality Center as the defect management tool for adding defects and tracing changes; executed manual test scripts in HP Quality Center and documented results.
  • Performed End-to-End, Functional Testing, Regression Testing.
  • Analyzed testing process by creating reports and graphs using the Quality Center
  • Developed manual test cases for Positive, Negative and Functional testing.
  • Involved in Fail Over, Recovery and Restart Testing
  • Tested the ETL process with data validation both before and after the load.
  • Developed SQL scripts to validate the data loaded into warehouse and Data Mart tables using MicroStrategy.
  • Performed data analysis and data profiling using SQL and Informatica Data Explorer on various source systems, including Oracle and Teradata.
  • Worked with Cross-Functional teams to resolve issues
  • Coordinated the issues with the technical team and raised the defects in QC.
  • Prepared test status report and weekly status report.
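The before/after ETL validation described in the bullets above can be sketched as a source-to-target reconciliation. SQLite stands in for the Oracle/Teradata sources, and the table and column names are illustrative assumptions:

```python
import sqlite3

# Minimal sketch of before/after ETL validation: compare the source
# extract to the loaded target. Table names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_sales (sale_id INTEGER, amount REAL);
    CREATE TABLE tgt_sales (sale_id INTEGER, amount REAL);
    INSERT INTO src_sales VALUES (1, 100.0), (2, 250.5), (3, 75.25);
    -- Simulate the ETL load (a straight copy in this sketch).
    INSERT INTO tgt_sales SELECT * FROM src_sales;
""")

def reconcile(conn):
    """Compare row counts and amount totals between source and target."""
    src = conn.execute("SELECT COUNT(*), SUM(amount) FROM src_sales").fetchone()
    tgt = conn.execute("SELECT COUNT(*), SUM(amount) FROM tgt_sales").fetchone()
    return src == tgt

print(reconcile(conn))  # True
```

Row counts and column totals are the cheapest first-pass checks; a failed reconciliation is then drilled into with row-level minus queries.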

Environment: Windows XP, HP Quality Center 9.2, Informatica Power Center 8.6.1, Teradata V2R6 (MLOAD, FLOAD, FAST EXPORT, BTEQ), Oracle, Java, .NET, MY SQL, MS Office.

Confidential, OH Oct’ 08 – Dec’ 09
Role: QA Analyst


  • Involved in understanding and analyzing requirements to identify key functions and determining the scope of the application.
  • Performed end-to-end Integration testing which covered the entire business process flow from the initial point of sales through to all the downstream systems.

  • Tested complex ETL Mappings and sessions based on business user requirements and business rules to load data from source flat files and stage tables to target tables.

  • Worked with data warehouse developers, who used the Informatica ETL tool to design mappings that move data from source to target databases, in order to understand the functionality.
  • Worked on Functional Testing and Integration Testing based on Customer Accounts.
  • Analyzed enhancement requests raised by the business and provided solutions.
  • Identified problem areas and suggested improvements.
  • Uploaded test cases to Test Director as part of the legacy system Test Plan.
  • Participated in team meetings on a regular basis and was involved in active discussions to improve the quality of SPA through better strategies and testing approaches.
  • Worked with the technical teams and business users to keep them aligned.
  • Involved in Black Box testing (Functionality, GUI Testing).
  • Responsible for Creating, Enhancing, Executing test cases and Tracking defects using Test Director.
  • Regression testing (run regression cycles to test the core logic of the application for every new release of a product).
  • Test case preparation and consolidation and helping team in testing related activities.
  • Verified development of rule-set test scenarios, test cases and expected results, and performed analysis of XML test scripts.
  • Developed SQL scripts to validate the data loaded into warehouse and Data Mart tables using Informatica.
  • Performed stress testing of the web application server, tracking results in Test Director.
  • Facilitated discussions between development, business and third-party teams to resolve issues and open questions.
  • Helped the development team with test results documentation and report creation.
  • Worked closely with different teams to gather requirements for testing.
  • Analyzed test results and logged defects.
  • Ensured the successful completion of test cycles so that the online regions were up as required.
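The analysis of XML test scripts mentioned above can be sketched as a simple structural check: parse a script and verify every case carries an expected result. The XML layout here is an illustrative assumption, not the project's actual rule-set format:

```python
import xml.etree.ElementTree as ET

# Hypothetical test-script layout, used only to illustrate the check.
script = """
<testscript>
  <case id="TC-1"><input>valid-account</input><expected>PASS</expected></case>
  <case id="TC-2"><input>closed-account</input><expected>FAIL</expected></case>
</testscript>
"""

root = ET.fromstring(script)
# Collect case ids whose <expected> element is missing or empty.
missing = [c.get("id") for c in root.findall("case")
           if c.find("expected") is None or not c.find("expected").text]

print(len(missing))  # 0
```

A non-empty `missing` list flags cases that cannot be evaluated and would be sent back for completion before execution.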

Environment: Mercury Test Director 7.0, Quality Center 9.2, Informatica Power Center 8.1, VB Script, HTML, XML, Oracle 8i, Windows XP, MS Office

Confidential, Schaumburg, IL Nov’ 07 – Aug’ 08
Project: Accounting Data Warehouse
Role: QA Analyst

Confidential is required to publish its financial statements and related schedules and reports for calendar years 2005 through the end of the Catch-Up period (2007). The Accounting Data Warehouse (ADW) will be used as a source system during the Catch-Up period and as a source system once Fannie Mae is at an 11-day close (Get Current). ADW is a certified source of data for most consumers in the Mortgage Securities and Loans (MSL) Get Current work streams. Data stored in the ADW is extracted from its various sources and transformed per business requirements. Once the data has been loaded into ADW, it is certified and approved by a Get Current business team before the Get Current work streams consume it.


  • Involved in validating the data that has been populated into Data warehouse Fact and dimensional tables using Ab Initio Data Warehouse tool.
  • Participated in Business Analysis, Technical Requirements and System Design Document Reviews.
  • Worked closely with Business Analysts and Developers in Analysis for existing Information available.
  • Performed smoke test after each and every code drop.
  • Participated in the reviews of Source to Target (S2T) mapping document.
  • Designed and developed Test Scenarios, Test procedures, Test Cases, and Test Data according to Functional Specs and Activities of the Business Stream.
  • Developed Positive and Negative Test Data based on business requirements and Source to Target (S2T) mapping document to make sure all the business rules are covered.
  • Expertise in SQL queries for cross verification of data.
  • Used Mercury Quality Center for creating Test Plans, documenting them and creating test cases and registering the expected results.
  • Reported and tracked defects using Clear Quest.
  • Involved in performing Integration testing, System testing, Functionality testing, UAT and Regression testing.
  • Involved in System testing and debugging during testing phase.
  • Participated in daily status meeting, Coordinated with the developers and Business Analysts to resolve the defects and close them.
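The Source-to-Target (S2T) mapping validation described above can be sketched as a "minus"-style comparison: apply the mapping rule to the source and count the target rows that disagree. The rule and table names are illustrative assumptions, with SQLite standing in for Oracle:

```python
import sqlite3

# Hypothetical S2T rule for illustration: target full_name should equal
# the source first and last names concatenated and uppercased.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_party (id INTEGER, first TEXT, last TEXT);
    CREATE TABLE tgt_party (id INTEGER, full_name TEXT);
    INSERT INTO src_party VALUES (1, 'ada', 'lovelace'), (2, 'alan', 'turing');
    INSERT INTO tgt_party VALUES (1, 'ADA LOVELACE'), (2, 'ALAN TURING');
""")

# Rows where the target disagrees with the rule applied to the source.
# Zero rows means the mapping rule holds for all loaded data.
mismatches = conn.execute("""
    SELECT s.id
    FROM src_party s JOIN tgt_party t ON s.id = t.id
    WHERE t.full_name <> UPPER(s.first || ' ' || s.last)
""").fetchall()

print(len(mismatches))  # 0
```

One such query per S2T mapping rule, each expected to return zero rows, gives direct positive/negative coverage of the mapping document.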

Environment: Ab Initio (GDE 1.13.13, Co>operating system 2.13.7), Oracle 10g, Shell scripts, Perl, Clear Quest 9.0, Mercury Quality Center 9.2, Clear Case, MS Visio, Business Objects, TOAD, Autosys 4.5.

Client: Confidential, New York, NY Apr’ 07 – Sep’ 07

Project: DoubleClick’s Data Migration
Role: QA Analyst

Confidential created an application that uses information extracted from internal ad systems and incorporates the data into one easy-to-view reporting interface. The application itself is a Windows-based software application that can be easily accessed by both internal and external users over the Internet. Users can view, run, print and save reports, all from one centralized interface.

ASPP Interim Reporting: ASPP is the Advertising Server for Programming and Promotions. America Online, Inc. (AOL) is currently involved in a joint venture with DoubleClick, Inc. to maximize revenue by utilizing DoubleClick’s Ad delivery technology. The interim solution requires the DART (Dynamic Advertising Reporting and Targeting) API to provide the required information to AMS for the ASPP Ads in an Order, along with the corresponding Contract and Line Item information. AOL receives the DART Orders and Ads information from DART in order to auto-generate corresponding AMS Contracts and Line Items. This mapping between the DART Orders and AMS Contracts will be maintained and will enable the AMS systems to report on impressions and clicks data for DART Orders and Ads.

Data Migration Phase II:
America Online, Inc. (AOL) is currently integrating its data with DoubleClick, Inc. The existing AOL AMS data will be migrated to the DART system to automate the transformation of data and reduce the manual processes for the transition to the new DoubleClick Ad Serving process. The goal of this project is to provide a comprehensive data extract to DoubleClick in order to execute the Phase II data migration. Data transformation rules are defined in order to provide the data in a format that is aligned with the DoubleClick applications.


  • Reviewed all designs and tested documentation generated during the normal design process.
  • Prepared Test Strategy Document, Test Cases and Test Procedures using Business requirement document and Functional requirements document of the system.
  • Actively involved in specifications, requirements meetings and development of Test Plan and Test scripts.
  • Prepared QCD (Quality Control Documents).
  • Involved in performing Functionality testing, Validation testing, Negative and Positive testing, testing manually on the first release of the application.
  • Involved in testing applications for the Business Continuation Plan.
  • Tested test and component parameter values in Quality Center.
  • Wrote scripts to perform basic Sanity Testing and uploaded Test Scripts in Quality Center.
  • Logged issues in Test Manager and tracked them to verify whether each bug was fixed in the next version.
  • Used Test Manager for creating Test Plans, documenting them and creating test cases and registering the expected results.
  • Communicated with Developers regarding technical issues.
  • Created the log files and enhanced the scripts to direct the error messages to that respective log file while running the scripts.
  • Involved in source Data Analysis along with developers and Analysts.
  • Modified and maintained test cases with changes in application interface and navigation flow.
  • Prioritized and reported defects using Test Director and Mercury Quality Center, to present documents and reports in weekly team meetings.
  • Involved in preparing the documentation and manuals for User Acceptance Testing.
  • Performed End-to-End testing after bug fixes and modifications.
  • Participated in weekly status meeting, Coordinated with the developers and testers to resolve the defects and close them.
  • Actively participated in all phases of testing life cycle including document reviews, inspections, and project status meetings.
  • Used VSS to upload, track and manage scripts and documents.

Environment: Java, Oracle 9i, Sybase, Shell scripts, Perl, Web-Logic 8.2, Test Manager, Rational Requisite Pro, Clear Case, DBArtisan, Autosys 4.5, Remedy, Windows NT, UNIX, MS Excel.

Client: Confidential, INDIA Oct’ 04 – Mar’ 07
Software Engineer

  • Worked with Business Analysts, Architects, Developers and Users to analyze the business requirements, functional specifications, and technical specifications.
  • Formulated detailed Test plans based on the Requirements and created test cases in Test Director for manual testing of the application.
  • Mapped the requirements to the scripts in Test Director.
  • Created, updated and maintained the Traceability Matrix mapping the requirements to test cases for Online Ordering Processing and Billing modules.
  • Developed test scenarios based on the End User requirements, conditions and scripts.
  • Used Test Director to create the test requirements, manual test cases and defects.
  • Analyzed and imported test data from spreadsheets into Test Director using Excel.
  • Performed smoke testing after each daily build.
  • Interacted with developers regarding the priority of bugs and updated the status of bugs once they were fixed.
  • Managed the testing process, logging and tracking defects using Clear Quest.
  • Used SQL queries for performing the Back End Testing.
  • Executed SQL statements against multiple databases and confirmed the results displayed in different types of reports.
  • Performed Functional Testing, System Integration Testing, Performance Testing, Positive and Negative Testing, and Regression Testing.
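The back-end testing described above, executing SQL and confirming the results shown in reports, can be sketched as follows. SQLite stands in for the actual databases, and the table and the reported figure are illustrative assumptions:

```python
import sqlite3

# Hypothetical orders table for illustration; the check compares a
# figure displayed on a report against what the database returns.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER, status TEXT, total REAL);
    INSERT INTO orders VALUES
        (1, 'BILLED', 120.0), (2, 'OPEN', 40.0), (3, 'BILLED', 60.0);
""")

reported_billed_total = 180.0  # value displayed on the billing report (assumed)

db_total = conn.execute(
    "SELECT SUM(total) FROM orders WHERE status = 'BILLED'"
).fetchone()[0]

print(db_total == reported_billed_total)  # True
```

A mismatch here points either at the report query or at the load, and is logged as a defect with both figures attached.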

Environment: JDK 1.2.2, JSP, Servlets, WebLogic 4.5, and Oracle 7.x
Education: Bachelor of Engineering (Information Technology)
