
QA Analyst Resume

Minnetonka, MN


  • 7+ years of experience in the IT industry with an emphasis on Testing/Quality Assurance. Excellent skills in Manual and Automation Testing (QTP).
  • Extensive experience in the Quality Assurance Life Cycle, including setting up test environments, developing Test Strategies, Test Plans, and Test Cases (Manual/Automated), test execution, analyzing results, and generating Defect Reports and Traceability Matrices.
  • Expertise in descriptive scripting in QTP.
  • Extensive experience in testing SOA applications and UI applications written in C#.Net.
  • Working experience in testing applications in Industry Automation Domain
  • Exposure to Energy Automation Domain
  • Tested Asterisk, an open source PBX, in the Telecommunication Domain
  • Expertise in Automated Tools – Quick Test Professional (QTP), Quality Center, and Clear Quest
  • Formulating Test Strategies and Preparation of Test Plan and Test Scenarios
  • Reported and prioritized software bugs in conjunction with Development & QA Managers, interacting with the team and users. Independently led and mentored the QA team and coordinated between offshore and onshore team members with varying skill levels to execute and deliver testing of the business application.
  • Strong working knowledge of programming languages such as C++, VC++, C#.Net, and VB.NET
  • Experience in testing database applications using SQL and PostgreSQL
  • Experience in coding using C# for application development
  • Extensive working experience on Functionality Testing, GUI Testing, Regression Testing, Stress Testing, Integration Testing, User Acceptance Testing, Database Testing, Load Testing and Black Box Testing.
  • Experience in debugging C++ Platform Code in Visual Studio 2005 and 2008
  • Hands on experience in QT programming for internationalization
  • Experience in IBM Rational Purify and PureCoverage for code coverage and memory leak detection
  • Strong experience in conducting root cause analysis and risk analysis.
  • Experience in working on Service Oriented Architecture (SOA)
  • Experience in doing static code analysis.
  • Knowledge in performance testing using LoadRunner
  • Basic knowledge of Python
  • Experience in analyzing non-functional requirements
  • Strong in Design, Development and Execution of Test plans, Test Cases, Test Scenarios and Scripts.
  • Strong working experience in development environment based on Agile and Scrum Methodologies
  • Experience in using bug tracking tools such as Mercury Test Director and Quality Center for requirement analysis and tracking test results.
  • Analyzing bugs, interaction with development team members in fixing the errors.
  • Very good experience with Rational ClearCase, ClearQuest, Rational Suite, and CVS
  • Relationship management with various Project Managers, Analysts and End Users for test planning and execution
  • Strong working experience with onsite and offshore model and excellent coordination experience.
  • Excellent communication and interpersonal skills; clear understanding of business procedures, ability to adapt to new environments quickly, and ability to work both independently and as part of a team.
  • Strong work ethic and able to quickly pick up new and leading technologies.
  • Worked under tight deadlines and capable of effectively handling multiple ongoing projects.

TECHNICAL SKILLS

Testing Tools : Quick Test Professional (QTP) 9.5, Test Director
Code Analysis : Rational Purify, Rational Pure Coverage, Source Monitor, FxCop, Simian
Bug Tracking Tool : Test Director, Rational ARTS Plus, Rational Clear Quest
Languages : C++, C#, Python, VB
Version Control : Rational Clear Case, CVS
Other Tools Used : MS Visio , Microsoft Visual Studio 2008, MS Visual Studio 2005
Operating System : Windows XP

ACADEMIC QUALIFICATION

  • Master's in Computer Applications

Client: Confidential, Minnetonka, MN May 2010 – Nov 2011
Project: Loop Wizard (LoWi)
Title: QA Analyst
Domain: Energy Automation

Loop Wizard is a process/tool (inside or attached to DIGSI) that provides the configuration settings for the “Fault Isolation and Service Restoration” functionality in an easy and convenient way. LoWi is an integral part of the DIGSI 4.x product.

Roles and Responsibilities:

  • Facilitated formal review meetings with developers to report, demonstrate, prioritize and suggest resolution to issues discovered during testing cycles
  • Involved in Requirement Gathering & Analysis and preparation of Functional Requirements Specification (FRS).
  • Defects were tracked, reviewed, analyzed and compared using Quality Center.
  • Worked with the development team on bug issues and on overall understanding of the project.
  • Developed Test Plans, Test Cases Test Scripts, Test Scenarios, Test Data and Traceability Matrix
  • Developed the project plan, tracked time, and assigned tasks to team members
  • Reported to the QA Manager and attended weekly team meetings
  • Performed code reviews and test case reviews.
  • Performed unit testing of the entire software.
  • Performed integration testing of the UI with a sequence of DIGSI devices.
  • Followed agile methodology for the entire life cycle of the project
  • Provided leadership and mentorship to team members to ensure consistent knowledge of the requirements and solutions.
  • Coordinated with the whole testing team to ensure total quality control on all deliverables

Environment: Quality Center 8.0, QTP 9.2, Rational Clear Case, FxCop, Simian, Windows XP, C#.Net

Client: Confidential, Princeton, NJ April 2008 – May 2010
Project: Network Management and Control System (NM&CS)
Title: QA Analyst
Domain: Industry Automation

Description: NM&CS is an industrial automation framework based on a Service Oriented Architecture (SOA). It provides a runtime environment for Services and Components, an API for several extension points, development tools, and much more.

Role and Responsibilities:

  • Designed and developed test cases for state machines of each of the customers.
  • Directed the QA Testing by providing guidance to other test developers in writing test scripts explaining them the test strategy and the test scenarios
  • Involved in Requirements Gathering & Analysis and preparation of Functional Requirements Specification (FRS).
  • Coordinated the review of business and functional design documents with end users and developers.
  • Followed Agile SCRUM methodology for the implementation of projects
  • Managed the test team and delivered high-quality products/projects.
  • Defined and implemented verification and validation processes across the project and managed QA and regression testing environments.
  • Estimated test efforts and schedules and worked with the project managers to support project management
  • Worked closely with the business team to understand the requirements and business processes in order to plan and execute the testing efforts
  • Escalated high-priority issues to top management for quick action.
  • Reviewed test cases written by the team members.
  • Performed user acceptance testing, load testing, and integration testing for customers.
  • Reported bugs in ARTS Plus
  • Helped developers analyze bugs
  • Performed root cause analysis on the problems found in the software.
  • Analyzed memory leak issues in the Alarm Service using Purify.
  • Performed code coverage analysis using PureCoverage
  • Reported synchronization issues found during load testing and integration testing
  • Developed testing framework for non-functional requirements in C++.
  • Designed unit test cases for the module.
  • Developed and tested unit test cases using QTP for functional requirements
  • Held weekly discussions with the Customer Support team regarding the status of Change Requests and the issues found.
  • Participated in Design discussion workshop

Environment: C++, XML, Rational Purify, Rational ClearCase, Rational Pure Coverage, Load Runner, Source Monitor, QTP 9.5, MS-Excel, Test Director, ARTS-Plus and Windows 7/ XP/ 2000

Confidential, February 2008 – April 2008
Project: Multi Language Support
Title: Software Tester
Domain: Energy Automation

Description: The PowerCC UI applications are required to support multiple languages so that users from different countries can use the same product in their native languages. For this purpose, the MLS framework provides a configured language per user. The MLS framework provides multi-language support for UI data as well as for engineering data.

Roles and Responsibilities:

  • Created and developed Test Cases and was involved in preparing Test Plans based on Business Documents, Use Cases, and Technical Documents.
  • Tied test cases in the Quality Center to the Requirements for traceability matrix, imported and exported Test Cases to Quality Center.
  • Executed test cases in Quality Center and documented the defects in the Reporting tool.
  • Performed Smoke Testing, Functional Testing, GUI Testing, Stress Testing, Regression Testing, Integration Testing, Black Box Testing, System Testing & User Acceptance Testing
  • Automated test scripts to test the functionality.
  • Verified the reports developed against the expected results
  • Tested a prototype incorporating MLS support for PowerCC applications using the Qt internationalization feature.
  • Participated in requirement analysis of the software.
  • Performed static code analysis using Source Monitor and code coverage analysis using PureCoverage
  • Performed Unit testing of the module
  • Reviewed unit testing of applications that used VB and MFC to support MLS.
  • Analyzed the test plan written for the National Language Support feature of PowerCC.
  • Coordinated with testing team to convert requirements to test cases.
  • Reviewed Test Cases

Technical Environment: Test Director, Clear Quest, QTP 9.5, Rational Purify, MS Office, Rational Clear Case, Source Monitor, Pure Coverage, Windows XP

Confidential, Bangalore, India November 2007 – January 2008
Project: Siemens Test Automation System
Role: Sr. Software Tester

Domain: Industry Automation

Most applications are GUI-intensive, and it must be possible to test them effectively and reliably. Test automation can provide a solution for performing tests repeatedly without user interaction. Commercial products are expensive and cannot offer solutions for the wide variety of GUI controls, platforms, deployment environments, and customizations. System Test Automation (STAS) is a GUI test automation tool capable of supporting test automation for a wide variety of controls in the Windows environment, and it can be easily customized and extended to suit the requirements of different applications.

Roles and Responsibilities

  • Test Requirement Analysis and Preparation of Test Management Plan
  • Established performance requirements, engagement objectives, and scope for applications.
  • Worked with Performance Engineers to design performance tests that would meet the engagement objectives.
  • Responsible for ensuring the client obtained value from engaging Performance Engineering Services.
  • Created Test Plans and Test Results Documents.
  • Created scripts and enhanced them by adding correlation, error checkpoints, and debug checkpoints using QTP.
  • Created and executed scenarios, analyzed reports, and reported response times and bottlenecks.
  • Performed unit testing of the software by testing Siemens applications.
  • Wrote test scripts for unit testing of the module
  • Reviewed test cases of modules developed by other team members
  • Wrote the user manual for the module

Environment: QTP, VB 6.0, Test Director, Clear Quest, CVS, Source Monitor and Windows XP

Client: Confidential, India
Project: ASTERISK May 2007 – October 2007
Title: Software Tester

Domain: Telecommunication

Description: Confidential is a fully featured VoIP gateway designed for corporate customers and small network operators. Asterisk was a project implementing open source code for commercial use in the Teles VoIP Box as a PBX.

Roles and Responsibilities:

  • Participated in design and requirement analysis
  • Prepared test plan, test cases
  • Executed test cases manually to verify that Asterisk met customer requirements.
  • Performed unit testing of Asterisk
  • Performed integration testing after loading Asterisk into the VoIP Box.
  • Shared knowledge with new team members
  • Served as the key person responsible for product testing
  • Wrote a knowledge-sharing document about scripting for newly joined team members

Environment: Asterisk OpenSource, CVS, Putty and NetBSD 3.0

Confidential, November 2005 – April 2007
Project: TELES.iCharge

Role: Software Tester
Domain: Telecommunication

Description: A service program for rating calls made through the TELES switch. For an operator providing public telephony services, CDRs (Call Detail Records) are the source of connection information. Billing is based on this connection information. Carriers and providers need a reliable and flexible way to convert CDRs to charges. TELES has developed TELES.iCHARGE to convert CDRs to charges.
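The CDR-to-charge conversion described above can be sketched roughly as follows. This is a minimal Python illustration of the general rating idea, not TELES.iCHARGE's actual logic; the field names, rate table, and per-started-minute billing rule are all hypothetical.

```python
# Minimal sketch of rating one CDR (connection record) into a charge.
# The RATES table and CDR field names are hypothetical examples.
from math import ceil

RATES = {  # price per started minute, keyed by destination prefix
    "49": 0.03,
    "1": 0.02,
}

def rate_cdr(cdr):
    """Convert one CDR into a charge using longest-prefix rate lookup."""
    # Longest matching destination prefix wins, as in typical rating engines.
    prefix = max((p for p in RATES if cdr["called"].startswith(p)),
                 key=len, default=None)
    if prefix is None:
        raise ValueError("no rate for destination " + cdr["called"])
    minutes = ceil(cdr["duration_s"] / 60)  # bill every started minute
    return round(minutes * RATES[prefix], 2)

charge = rate_cdr({"called": "4930123456", "duration_s": 125})
print(charge)  # -> 0.09 (3 started minutes at 0.03)
```

In practice a rating engine would also handle tariff time windows, setup fees, and currency precision; the sketch only shows the core prefix-lookup-and-multiply step.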

Roles and Responsibilities:

  • Developed test cases based on business requirements and executed test scripts.
  • Created QTP scripts using VB Scripting and executed them with QTP.
  • Identified and retested incidents.
  • Identified business events/components and test cases that must be tested
  • Assisted in the development and review of process test scripts; expanded these scripts during testing as necessary
  • Executed the test scripts and validated the results
  • Documented actual test results in the Test Case Matrices and Scripts.
  • Wrote test cases and executed UTS
  • Performed load testing
  • Performed database testing of the application using SQL
  • Released versions of the project

Environment: QTP, SQL Server, Test Director, ClearQuest, PostgreSQL and Windows XP

Confidential, October 2004 – November 2005
Project: TELES.iRoute
Role: Software Tester

Domain: Telecommunication

Description: Confidential is a competitive solution that enables VoIP operators to interconnect their networks to incumbents' SS7 networks and then provide portability services to their end users. iRoute is a service program for routing calls made through the TELES switch.

Roles and Responsibilities:

  • Regression testing: reviewed and executed test procedures.
  • Validated data according to the test’s expected results.
  • Communicated test procedures to the receiving or sending system tester
  • Recorded test results in TestDirector.
  • Reported defects in ClearQuest, verified fixes, and performed white-box testing.
  • Worked in Agile methodology.
  • Performed system testing
  • Performed test planning and test execution for the customer.
  • Responsible for writing the system test plan and test scripts from the requirements specification.
  • Performed test execution, reporting, and bug logging.
  • Participated in developing and updating testing documentation according to department standards and procedures.
  • Performed database testing using SQL Server

Environment: PostgreSQL, SQLServer, ClearQuest, Test Director and Windows 2000

Confidential, June 2004 – September 2004
Title: Software Developer

Description: The purpose of the project is to improve the performance of adding and deleting values in a lookup array when the number of objects grows dynamically. In a multithreaded environment, objects have to be stored and then shared or accessed by multiple threads. A shared object is stored at one location only, and the worker threads get the objects, or pointers to the objects, from this central location. In order to have full control of the stored objects, it is common practice to identify such objects via a unique ID created by the central location. The lookup of an object obviously affects performance. Typically, the central location is nothing more than a hash or a binary tree. Both algorithms have problems when a huge number of objects have to be stored.
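The "central location" pattern described above can be sketched as follows. This is a minimal Python illustration (the project itself was implemented in C++); the class and method names are hypothetical, and a plain dictionary stands in for the hash table whose lookup cost the project set out to improve.

```python
# Sketch of a central object store: objects are registered once, assigned
# a unique id, and looked up by that id from multiple worker threads.
import itertools
import threading

class ObjectStore:
    def __init__(self):
        self._ids = itertools.count(1)   # unique id generator
        self._objects = {}               # id -> object (hash-based lookup)
        self._lock = threading.Lock()    # guards access from worker threads

    def add(self, obj):
        """Store an object and return the unique id created for it."""
        with self._lock:
            oid = next(self._ids)
            self._objects[oid] = obj
            return oid

    def get(self, oid):
        """Look up an object by id; returns None if the id is unknown."""
        with self._lock:
            return self._objects.get(oid)

    def remove(self, oid):
        with self._lock:
            self._objects.pop(oid, None)

store = ObjectStore()
oid = store.add({"name": "shared"})
print(store.get(oid))   # -> {'name': 'shared'}
store.remove(oid)
print(store.get(oid))   # -> None
```

The hash-map variant gives average O(1) lookup but degrades under heavy growth (rehashing, memory churn), which matches the problem the project addressed for very large object counts.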

Roles and Responsibilities:

  • Bug fixing and maintenance after the release of the software
  • Performed code optimization
  • Performed test case reviews
  • Involved in discussions with the documentation team
  • Coordinated with the testing team for better output of the product
  • Fixed memory leaks

Environment: C++, CVS and Windows 2000

Confidential, January 2004 – May 2004
Software Developer

Description: The Parsing Performance tool is an internal tool used by the load testing team to parse and filter the CDR XML file collected from Teles.iSwitch. The data was used as input for other software, to route calls and for billing purposes.
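The parse-and-filter step described above can be sketched as follows. This is a minimal Python illustration (the actual tool was written in C++); the element and attribute names are hypothetical. Incremental parsing is used so that large CDR files are not loaded into memory all at once.

```python
# Sketch of streaming a CDR XML file and filtering out unconnected calls.
# Element/attribute names ("cdr", "duration", ...) are hypothetical.
import io
import xml.etree.ElementTree as ET

SAMPLE = b"""<cdrs>
  <cdr caller="100" called="4930111" duration="0"/>
  <cdr caller="101" called="4930222" duration="65"/>
</cdrs>"""

def filter_cdrs(stream, min_duration=1):
    """Yield (caller, called, duration) for connected calls only."""
    # iterparse processes the file incrementally, keeping memory flat
    # even for very large inputs.
    for _event, elem in ET.iterparse(stream, events=("end",)):
        if elem.tag == "cdr":
            duration = int(elem.get("duration", "0"))
            if duration >= min_duration:
                yield (elem.get("caller"), elem.get("called"), duration)
            elem.clear()  # free the already-processed record

records = list(filter_cdrs(io.BytesIO(SAMPLE)))
print(records)  # -> [('101', '4930222', 65)]
```

The filtered tuples would then be handed to the downstream routing and billing software, as the description notes.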

Roles and Responsibilities:

  • Algorithm design of the functionality of the tool
  • Design of object model
  • Development of the tool
  • Unit testing
  • Performance testing with large XML files
  • Coordinated with the testing team for testing with real-time data

Environment: C++, CVS, Windows 2000