
Quality Assurance Lead Resume


Chicago, IL

SUMMARY:

  • 11+ years of overall IT industry experience with emphasis on Software QA Testing of Middleware, Portals, Client-Server and Web-Based Applications, APIs, Databases, IBM Mainframe applications, and mobile apps.
  • Expertise in implementing Software Quality Assurance life-cycle procedures, including developing test plans and test scripts for both manual and automated testing. Basic knowledge of automation tools such as Selenium, QTP, and LoadRunner for enhancing scripts that test the functionality and performance of an application.
  • Excellent working knowledge of the System Development Life Cycle (SDLC), Software Testing Life Cycle (STLC), and Defect Life Cycle.
  • Very good exposure to Agile methodology: participation in Backlog Grooming, Sprint Planning, Sprint/Story Kick-off Meetings, Daily Scrum, Sprint Review Meetings, and Sprint Retrospectives.
  • Experienced in developing test plans and test cases, generating and maintaining automation scripts, setting up test environments, and executing test cases.
  • Design, develop, and execute test scenarios, e-business testing, full life-cycle end-to-end testing, and function verification tests.
  • Assisted the automation team in generating scripts using tools like Selenium and QTP.
  • Strong ability to think functionally and analytically, both from a piece-part perspective and from a sum-of-the-whole perspective.
  • Identify and recommend solutions to management on business, system, and data issues.
  • Experienced in testing various applications: Java, .NET, Client-Server, Web-based, and IBM Mainframe applications.
  • Experience in Jira, HP Quality Center, and ALM: designing test steps, mapping requirements to tests, executing tests manually, defect logging, and defect reporting.
  • Experience in REST API Testing using Postman.
  • Extensively involved in Black-Box Testing - Functionality Testing, GUI Testing, Regression Testing, Stress Testing, System Integration Testing, User Acceptance Testing, Interface Testing (Menu, Sub-menu, etc.), Database Testing, Load Testing, Benchmark Tests, and API Testing.
  • Used Software Configuration Management tools, Bug Tracking tools and Project Management tools.
  • Used data-driven tests and test-code-independent methods effectively, reducing maintenance costs and making the scripts more user friendly.
  • Proficient with all versions of the Windows operating system; some experience with Unix.
  • Used SQL editors such as Toad and DbVisualizer to verify outputs from different database tables.
  • Interacted extensively with Business Analysts, Developers, and End Users on subject-specific matters.
  • Experience in 24x7 production support, ETL executions, and root-cause resolution.
  • Ability to quickly learn new technologies and apply their full range of capabilities.
  • Exceptional verbal and written communication and excellent problem-solving skills.
  • Able to work independently, contribute as a strong team player, and lead a team.
  • Good working knowledge of hours forecasting, resource management, and scheduling.

TECHNICAL SKILLS:

Testing Tools: Quality Center 9.0/9.2, Application Lifecycle Management (ALM) 11, QuickTest Pro 9.2/9.5, Selenium, LoadRunner, SoapUI, SDLT JIRA Instance, Postman.

Languages: C, C++, Java, VB Script, Base SAS, .Net, SQL, PL/SQL

Internet Technologies: VBScript, JavaScript, XML, HTML

Databases: Oracle 8i/9i/10g/11i, MS SQL Server 2000, DB2, Teradata

Database connectivity tools: TOAD, Teradata SQL Assistant, SQL Server Management Studio, SAS EG 4.2

Operating Systems: MS-DOS, Windows, UNIX

Office Tools: MS Word, MS Excel, MS PowerPoint, MS Outlook, MS OneNote

ETL Tools: Informatica 8.6.1/8.1/7.1.2/6.1.x, Business Objects XI R2/6.5.1/6.0/5.1.x/4.0, DataStage 7.1/8.x, Ab Initio, SAS DI (Data Integration), SSIS packages.

PROFESSIONAL EXPERIENCE:

Confidential, Chicago, IL

Quality Assurance lead

Responsibilities:

  • Attend the iteration planning meeting and finalize user stories and estimates for each two-week sprint/iteration.
  • Prepare high-level scenarios based on backlog stories.
  • Analyze system requirement specifications and develop test plans and test cases to cover overall QA system testing.
  • Develop test cases based on the user stories and the functional requirement specification document.
  • Attend the daily stand-up meeting and the end-of-iteration meeting.
  • Involved in preparing the test plan and mapping test cases to requirements.
  • Used Selenium WebDriver with the TestNG framework for the automated regression suite (see the sketch after this list).
  • Validate APIs using Postman and ensure data and control flows work as expected.
  • Validated JSON-formatted data and HTTP status codes such as 200, 201, 400, 415, and 500 in Postman.
  • Involved in back-end testing by executing SQL queries to retrieve and validate data.
  • As part of Android mobile app testing, imported the APK file generated by the Jenkins build from the local machine to the test device and verified application functionality.
  • Logged defects encountered in the application through JIRA during test cycles and conducted fix verification.
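
A minimal sketch of what such a regression check might look like, assuming a hypothetical application URL, element locator, and health endpoint (none of these names come from the project): one TestNG test drives the UI through Selenium WebDriver, and another asserts an HTTP status code in plain Java, mirroring the kind of check otherwise done manually in Postman.

```java
import java.net.HttpURLConnection;
import java.net.URL;

import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.testng.Assert;
import org.testng.annotations.AfterClass;
import org.testng.annotations.BeforeClass;
import org.testng.annotations.Test;

public class RegressionSmokeTest {

    private WebDriver driver;

    @BeforeClass
    public void setUp() {
        // Assumes chromedriver is available on the PATH of the test machine.
        driver = new ChromeDriver();
    }

    @Test
    public void loginPageLoads() {
        // Hypothetical application URL and element locator, for illustration only.
        driver.get("https://app.example.com/login");
        Assert.assertTrue(driver.findElement(By.id("username")).isDisplayed(),
                "Login form should be visible");
    }

    @Test
    public void healthEndpointReturns200() throws Exception {
        // A plain-Java analogue of the status-code checks normally done in Postman.
        HttpURLConnection conn = (HttpURLConnection)
                new URL("https://app.example.com/api/health").openConnection();
        conn.setRequestMethod("GET");
        Assert.assertEquals(conn.getResponseCode(), 200, "Expected HTTP 200 from the API");
    }

    @AfterClass
    public void tearDown() {
        if (driver != null) {
            driver.quit();
        }
    }
}
```

In a real suite, the URL, locators, and expected status codes would come from the application's test data and environment configuration rather than being hard-coded.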

Environment: Windows XP, Android Studio, Jira, Selenium, Java, SQL Server, Postman, Jenkins.

Confidential, Addison, TX

Test Lead

Responsibilities:

  • Supervised preparation and coordination of plans, resources, and software build availability in preparation for testing.
  • Assigned work to other test analysts and established schedule to ensure testing practices are executed and evaluated in an effective way.
  • Created and executed test plans and organized testing procedures among the testers.
  • Wrote test reports based on problems found.
  • Created the Test Summary Report after completing the testing cycle.
  • Implemented a QA tracking and reporting system.
  • Ensured test documentation and results are maintained.
  • Conducted peer reviews of software design and testing documentation.
  • Developed and implemented software verification and validation procedures, processes and templates.
  • Thorough hands-on experience with all levels of testing, viz. Interface, Integration, Regression, Functional, and System testing.
  • Performed data validation using SQL for source and target systems.
  • Used Microsoft Outlook to correspond with management, development, and coworkers on problems or questions regarding testing.
  • Delegated work to other testers, reviewed their work, and worked directly with managers to make testing more efficient in a fast-paced environment.
  • Interacted with the Development team, Team Managers / System Analysts.
  • Drafted a lessons-learned document highlighting processes that could be improved in future releases.

Environment: iSeries, SQL Server, Microsoft Access front end, .Net, Java, SharePoint, MS Office tools, WinSCP, Putty.

Confidential, Addison, TX

Test Lead

Responsibilities:

  • Led a team of three.
  • Involved in Functional, GUI, Regression, Integration, User Acceptance (UAT), Interface (Menu, Sub-menu, etc.), and Database Testing.
  • Verified that users could successfully create claims in the Focus front-end system.
  • Verified that claim details were stored in iSeries back-end files.
  • Verified transmission of claim details to EDI (Electronic Data Interchange) during the batch process.
  • Monitored, analyzed, and reported test results to management.
  • Identified potential bottlenecks and provided recommendations for fixes.
  • Prepared test matrix and planned the tests to execute on various combinations.
  • Also prepared master test plans and test cases for end-user usability and for support.
  • Worked closely with development teams located across the US.

Environment: iSeries, Java, .Net, SharePoint, MS Office tools.

Confidential, Addison, TX

Test Lead

Responsibilities:

  • Analyzed the Requirements from the client and developed Test cases based on functional requirements, general requirements and system specifications.
  • Prepared test data for positive and negative test scenarios for Functional Testing as documented in the test plan.
  • Performed Smoke Testing, GUI Testing, Functional Testing, Backend Testing, System Integration Testing, Sanity Testing, and assisted with User Acceptance Testing (UAT).
  • Prepared Test Cases and Test Plans for the mappings developed through the ETL tool from the requirements.
  • Created Test Cases, traceability matrix based on mapping document and requirements.
  • Tested access privileges for several users based on their roles during web page login.
  • Prepared System Test Results after Test case execution.
  • Prepared Test Scenarios by creating Mock data based on the different test cases.
  • Performed defect tracking and reporting with strong emphasis on root-cause analysis to determine where and why defects were introduced in the development process.
  • Debugged and scheduled ETL jobs/mappings and monitored error logs.
  • Tested reconcile and incremental ETL loads for the project (see the source-to-target check sketched after this list).
  • Tested data migration to ensure that integrity of data was not compromised.
  • Involved in testing the batch programs by using the Autosys tool.
  • Provided the management with weekly QA documents like test metrics, reports, and schedules.
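
As an illustration of the reconcile-load checks referenced above, here is a minimal Java/JDBC sketch that compares source and target row counts; the JDBC URLs, credentials, and table names are hypothetical placeholders rather than details from the project.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class SourceToTargetCountCheck {

    // Hypothetical JDBC URLs and table names, for illustration only.
    // Requires the matching JDBC drivers on the classpath.
    private static final String SOURCE_URL = "jdbc:oracle:thin:@//src-host:1521/SRC";
    private static final String TARGET_URL = "jdbc:sqlserver://tgt-host:1433;databaseName=DW";

    public static void main(String[] args) throws Exception {
        long sourceCount = rowCount(SOURCE_URL, "user", "password",
                "SELECT COUNT(*) FROM claims_stage");
        long targetCount = rowCount(TARGET_URL, "user", "password",
                "SELECT COUNT(*) FROM dw.claims_fact");

        if (sourceCount == targetCount) {
            System.out.println("PASS: source and target counts match (" + sourceCount + ")");
        } else {
            System.out.println("FAIL: source=" + sourceCount + ", target=" + targetCount);
        }
    }

    private static long rowCount(String url, String user, String password, String sql) throws Exception {
        // Run a single COUNT(*) query and return its value.
        try (Connection conn = DriverManager.getConnection(url, user, password);
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery(sql)) {
            rs.next();
            return rs.getLong(1);
        }
    }
}
```

The same pattern extends to column-level reconciliation by replacing the COUNT(*) queries with checksums or MINUS/EXCEPT comparisons on key columns.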

Environment: iSeries, WebSphere, Quality Center 9.2, TOAD, MS Word, Windows XP/Vista.

Confidential, Addison, TX

Test Lead

Responsibilities:

  • Participated in requirements review meetings with the business team to build a robust environment for implementing the test plans.
  • Prepared detailed test plan to set coverage criteria and evaluate coverage.
  • Performed gap analysis to verify that test cases matched the user requirements, and reported defects in HP Quality Center.
  • Performed client acceptance testing, functional testing, integration testing, and system testing.
  • Derived and developed requirements, functional, and regression test cases from use cases and test scenarios.
  • Executed system functional/integration test cases, communicating results to the software development team.
  • Wrote SQL statements manually to validate data in the database using TOAD and tested the entity relationships of the database.
  • Involved in interface testing for different modules using Web Services; validated the output against the UI and client accounts against the database.
  • Imported web service operations using the WSDL URL and validated them in SoapUI (see the sketch after this list).
  • Verified the newly developed ASP.NET web-based portal that lets users access required data elements from the new Bookloss database housed on iSeries.
  • Validated Single Sign On (SSO) secure login process and user management via entitlements.
  • Tested several complex reports generated by Cognos, including dashboards, summary reports, master-detail reports, drill-downs, and scorecards.
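
The SoapUI checks above can be approximated in plain Java; the sketch below posts a hand-built SOAP envelope and prints the HTTP status and response body. The endpoint URL, namespace, and operation name are hypothetical, since the real operations were imported from the service WSDL.

```java
import java.io.InputStream;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.Scanner;

public class SoapOperationCheck {

    public static void main(String[] args) throws Exception {
        // Hypothetical endpoint, namespace, and operation, for illustration only.
        String endpoint = "https://services.example.com/accounts";
        String envelope =
                "<soapenv:Envelope xmlns:soapenv=\"http://schemas.xmlsoap.org/soap/envelope/\" "
              + "xmlns:acc=\"http://example.com/accounts\">"
              + "  <soapenv:Body>"
              + "    <acc:getAccount><acc:accountId>12345</acc:accountId></acc:getAccount>"
              + "  </soapenv:Body>"
              + "</soapenv:Envelope>";

        HttpURLConnection conn = (HttpURLConnection) new URL(endpoint).openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.setRequestProperty("Content-Type", "text/xml; charset=utf-8");
        conn.setRequestProperty("SOAPAction", "getAccount");

        try (OutputStream out = conn.getOutputStream()) {
            out.write(envelope.getBytes(StandardCharsets.UTF_8));
        }

        System.out.println("HTTP status: " + conn.getResponseCode());
        try (InputStream in = conn.getInputStream();
             Scanner scanner = new Scanner(in, StandardCharsets.UTF_8.name()).useDelimiter("\\A")) {
            // Print the raw SOAP response so fields can be compared against the UI and database.
            System.out.println(scanner.hasNext() ? scanner.next() : "");
        }
    }
}
```

The printed response would then be compared field by field against the UI output and the client-account data in the database, as described above.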

Environment: HP Quality Center 9.2/10, J2EE, Web services, SOA, Cognos 7.2/8.1, SOAP UI.

Confidential, Addison, TX

Test Lead

Responsibilities:

  • Involved in testing at all stages of the System Development Life Cycle.
  • Developed Test Plans and Test Specifications per the user's business requirements.
  • Estimated the time frame for all testing types according to the project plan.
  • Developed Test Cases and Test Scripts for functional and regression testing according to the user's functional requirements.
  • Executed functional and regression test scripts manually against expected results.
  • Verified migration of back-end tables from SQL Server 2000, MS Access, and AS400 to the new CHP environment.
  • Verified the ETL process (SSIS package) for AOP Extraction.
  • Verified that the AOP Extraction front-end system was successfully relinked to the new back-end tables in CHP (SQL Server 2008).
  • Analyzed and documented test results and participated in acceptance sign-off process.
  • Identified and tracked bugs and worked diligently with the development team to ensure they were fixed.

Environment: Microsoft Access Front End, SQL Server 2008, iSeries (AS400)

Confidential, Addison, TX

Test Lead

Responsibilities:

  • Analyzed and reviewed system requirements, use cases and other design documents to gain overall understanding of the functionality of the application.
  • Involved in the testing project at every step of the quality cycle, from test planning through execution to defect management.
  • Involved in organizing, improving upon and executing existing tests.
  • Updated test cases based on approved change requests.
  • Involved in interacting with technical developers and business analysts during defect resolution.
  • Involved in the creation of Test strategy, Test plan and Test cases for manual and automated testing from the business requirements to match the project's initiatives.
  • Performed Integration Testing, Functional Testing, Regression Testing, System Testing and assisted with data for User Acceptance Testing (UAT).
  • Executed SQL queries to validate ILOG JRules, which involved validating the rules in the JRules engine and testing all rules with positive, negative, and exception scenarios.
  • Checked the data flow from front end to back end, including testing of the inbound file, the WebSphere Fee Processing System, ILOG JRules, the outbound file, MQ Series, and iSeries components.
  • Worked closely with Development team, Business Analysts, Data Quality and Production Support teams.
  • Used Mercury Quality Center (QC) for organizing Requirements, Test Cases, and Defects.

Environment: iSeries, MQ Messaging, WebSphere, ILOG, DB2

Confidential, Charlotte, NC

Test Lead

Responsibilities:

  • Reviewed the business requirement documents and analyzed the testing scope.
  • Involved in Estimation and Creation of System Test Plan.
  • Reviewed and ensured the timely delivery of Test Scenario and Scripts.
  • Participated in Scenario and Scripts Reviews calls with business and getting Sign off on the same.
  • Validated that the normalization and derivation logic for each data element is as defined in the mapping document.
  • Executed queries in the Query Builder interface of the SAS EG tool to fetch counts and data from the data mart.
  • Validated reports produced from the data mart view in the form of Pivot Tables and BAC Residential Mortgage Reports.
  • Responsible for all test execution for the initiative at offshore and for providing execution status reports to the onshore team.
  • Hosted defect management meetings and provided Test Summary Report, Test Metrics, and SIT updates for each release.
  • Updated or prepared the Requirement Traceability Matrix.
  • Managed transition activities and ensured a smooth knowledge transfer (KT) phase.

Environment: Quality Center 9.2, SAS DI (Data Integration) Studio, SAS SPDS (Scalable Performance Data Server), OLAP Viewer, SAS WRS (Web Report Studio), SAS AMO (Microsoft Add-Ons: Excel, PowerPoint, Word, etc.), Teradata, SQL Server, OLAP Cube Studio, EG (Enterprise Guide).

Confidential, Charlotte, NC

Sr. QA Analyst

Responsibilities:

  • Analyzed Requirements from Business Requirement Document (BRD), Small Project Document (SPD), Mapping Document, and Software requirement specification document.
  • Documented Test Plan including Project Initiative and Data Security.
  • Initiated live meetings with all team members to walk through the test plan and test scripts.
  • Developed test cases to test all the data elements provided in the mapping document for each Product-Report Category.
  • Test cases were reviewed by Technical Analysts prior to SIT entry to ensure that the modeled test cases, which include complex SQL queries, fulfilled the requirements.
  • Requirements gathered from the mapping document and the approved test cases were uploaded to Quality Center from MS Word and Excel, respectively.
  • Documented SIT entry criteria, including the pass-rate metrics required to exit SIT.
  • Performed System Integration Testing on the build comprising SAS metadata.
  • Extensively performed BCO (Business Consumable Objects) dimensional data validation and constraint validations using Query Builder in SAS EG 4.2 (see the sketch after this list).
  • Worked with Quality Center Admin in customizing fields for Users to easily track data elements by their attribute numbers.
  • Assisted Quality Center Admin in creating Users and their access levels.
  • Executed test cases and ensured 100% requirement coverage.
  • Used Quality Center to organize and manage all phases of the software testing process, including planning tests, executing tests, and tracking defects.
  • Every significant test incident was recorded in Quality Center and assigned to a Technical Analyst to determine the validity and severity of the defect.
  • During the execution period, gathered the entire project team on a daily call to keep them aware of SIT status and Defect Tracker updates.
  • Performed Regression testing on enhanced build.
  • Participated in User Acceptance Testing along with the end users.
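
The constraint validations noted above were run as SQL in the SAS EG 4.2 Query Builder; as a rough Java/JDBC analogue (a swap of tooling, not the project's actual setup), the sketch below runs three typical dimensional-data checks: duplicate business keys, null mandatory attributes, and orphan fact rows. The connection details and table/column names are hypothetical.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class DimensionConstraintChecks {

    public static void main(String[] args) throws Exception {
        // Hypothetical JDBC URL and table/column names, for illustration only.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:oracle:thin:@//dwh-host:1521/DWH", "user", "password");
             Statement stmt = conn.createStatement()) {

            // 1. Duplicate-key check: the business key should be unique in the dimension.
            expectZero(stmt, "duplicate account keys",
                    "SELECT COUNT(*) FROM (SELECT account_key FROM dim_account "
                  + "GROUP BY account_key HAVING COUNT(*) > 1)");

            // 2. Not-null check: mandatory attributes should never be null.
            expectZero(stmt, "null product categories",
                    "SELECT COUNT(*) FROM dim_product WHERE category IS NULL");

            // 3. Referential-integrity check: every fact row should resolve to a dimension row.
            expectZero(stmt, "orphan fact rows",
                    "SELECT COUNT(*) FROM fact_balance f WHERE NOT EXISTS "
                  + "(SELECT 1 FROM dim_account d WHERE d.account_key = f.account_key)");
        }
    }

    private static void expectZero(Statement stmt, String checkName, String sql) throws Exception {
        // A check passes only when the violation count is zero.
        try (ResultSet rs = stmt.executeQuery(sql)) {
            rs.next();
            long violations = rs.getLong(1);
            System.out.println((violations == 0 ? "PASS" : "FAIL") + " - " + checkName
                    + " (violations: " + violations + ")");
        }
    }
}
```

Each query expresses one constraint as "count of violations", which keeps the pass/fail rule uniform across checks and easy to report in a test summary.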

Environment: Quality Center 9.2, SAS EG 4.2, SAS DI, Oracle 10g, Teradata, SQL Server, OLAP Cube Studio, Windows 2003 Server.

Confidential, Charlotte, NC

QA Analyst

Responsibilities:

  • Analyzed Business Requirements (User Stories and acceptance Criteria), use cases, screen shots and Software requirement specification documents to create test cases.
  • Participated in Face-to-Face Communications with other members as a part of the Agile Methodology.
  • Involved in Sprint Planning, Daily Stand-up, Sprint Review and the Sprint Retrospective.
  • In Sprint Planning, determined the Sprint scope by working to establish acceptance criteria for User Stories, estimated the level of complexity within each User Story, and detailed a preliminary set of tasks for each User Story.
  • During the daily Stand-up, discussed what was accomplished the previous day, what the focus of the current day's work would be, and any obstacles that needed to be removed.
  • Used Quality center 9.2 for Test Planning and executing the Test Cases.
  • Used SDLT JIRA Instance for Defect Tracking.
  • Engaged in the defect resolution process and looked for automation integration.
  • Held the demo within the Sprint Review and showed off the functional delivery for the Sprint.
  • Performed extensive back-end manual testing using SQL and constraint validations.
  • Sought the Scrum Master's help in providing the team with a Sprint burn-down chart, a project burn-up chart, and test status reports, which served as the key information radiators for the project's overall health.

Environment: Quality Center 9.2, QTP 9.5, SoapUI, UNIX, TIBCO Rendezvous® (RV), TIBCO Enterprise Message Service™ (EMS), TIBCO ActiveMatrix BusinessWorks™ (Java components), Oracle RAC, Informatica ETL.

Confidential, San Antonio, TX

QA Engineer

Responsibilities:

  • Involved in Functionality Testing by creating the Categories, Portal Pages, Users/Groups, Sessions, Roles, Server Settings and finally creating the Web Portals.
  • Involved in testing the usability of the system from end-user perspective.
  • Performed functional and system testing of various aspects of the product, including installation and migration.
  • Worked closely with members of the Development, Product Management, Release Engineering, and Documentation teams to ensure proper product testing and product quality.
  • Developed test specifications and test automation scripts, and performed test execution.
  • Enhanced the existing test cases for Functionality, Module, and Integration Testing.
  • Reviewed various manual test scripts based on scenarios, updated them with respect to the new versions and enhancements, and involved in writing new test cases for the added functionality.
  • Involved in designing, writing, and executing test scenarios, full life-cycle testing, system testing, function verification tests, and application integration tests.
  • Tested the integration of multiple systems, based on business transaction scenarios defined by the functional teams during the design phase.
  • Verified the completeness and accuracy of data flow between various applications through multiple interfaces.
  • Performed black-box testing to identify potential bottlenecks and provide recommendations for fixes.
  • Tested .NET, VB, and COM/DCOM component objects.
  • Performed cross-platform and cross-browser testing (e.g., on Windows 2007 and Windows XP, using servers such as Netscape Directory Server 4.1); see the sketch after this list.
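
A minimal sketch of how such cross-browser runs might be parameterized with TestNG and Selenium WebDriver: the browser name is supplied per <test> block in testng.xml, and the portal URL and expected title are hypothetical placeholders, not details from the project.

```java
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.openqa.selenium.firefox.FirefoxDriver;
import org.testng.Assert;
import org.testng.annotations.AfterMethod;
import org.testng.annotations.BeforeMethod;
import org.testng.annotations.Parameters;
import org.testng.annotations.Test;

public class CrossBrowserPortalTest {

    private WebDriver driver;

    @BeforeMethod
    @Parameters("browser") // e.g. "chrome" and "firefox", one per <test> block in testng.xml
    public void setUp(String browser) {
        // Assumes the matching driver binaries are installed on each test machine.
        if ("firefox".equalsIgnoreCase(browser)) {
            driver = new FirefoxDriver();
        } else {
            driver = new ChromeDriver();
        }
    }

    @Test
    public void portalHomePageLoads() {
        // Hypothetical portal URL and title, for illustration only.
        driver.get("https://portal.example.com");
        Assert.assertTrue(driver.getTitle().contains("Portal"),
                "Home page title should identify the portal");
    }

    @AfterMethod
    public void tearDown() {
        if (driver != null) {
            driver.quit();
        }
    }
}
```

Running the same test class under several <test> blocks keeps one set of test logic while covering each target browser and platform combination.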

Environment: MS Office, HTML/XML, VB/ASP.Net, COM/DCOM, Windows 2007 and Windows XP, Selenium.
