
Quality Assurance Analyst Resume

Las Vegas, Nevada

SUMMARY:

  • 11+ years of experience in the IT industry as a Software Quality Assurance Analyst (SIT, UAT).
  • Demonstrated experience managing large-scale projects through effective planning and management of resources, deliverables, and deadlines without compromising quality.
  • Expertise in Client/Server, e-commerce, Web, Finance, and Mortgage application testing.
  • Experience in using Informatica PowerCenter, Informatica Metadata Manager, and IBM Metadata tools.
  • Experience in Web Services testing using the SoapUI tool.
  • Experience working with the SailPoint IdentityIQ (IIQ) identity and access management tool.
  • Experience in MicroStrategy report testing.
  • Very good experience across the various stages of the Software Development Life Cycle, software methodologies, the Software Testing Life Cycle, and the Defect Life Cycle.
  • Extensive experience with SDLC methodologies including Waterfall, V-Model, and Agile (Scrum).
  • Expertise in preparing Test Plans, Test Strategies, Test Reports, and Test Metrics, and in maintaining a Requirements Traceability Matrix for complete test coverage of requirements.
  • Excellent experience in GUI, Functional, Database, MicroStrategy (MSTR) Reports, Data Migration, Data Warehouse (ETL), and Web Service testing.
  • Experience leading test initiatives including Test Planning, Test Design, Test Execution, Test Estimation, Test Scheduling, Test Environment setup, Test Data preparation, and Test Closure.
  • Experience leading test teams (5-8 resources) by planning, training, and coordinating testing tasks for multiple releases; supporting and mentoring team members; ensuring quality deliverables; and providing timely feedback to Management and the Client.
  • Strong experience performing various types of testing including Agile Testing, Functional Testing, Black Box Testing, Grey Box Testing, End-to-End Testing, System Integration Testing, Browser Compatibility Testing, GUI Testing, Pair Testing, Ad-hoc Testing, Retesting, Smoke Testing, Regression Testing, Usability Testing, User Acceptance Testing, and Security Testing.
  • Strong expertise in SQL concepts and back-end testing using SQL queries (a sample validation query is sketched after this list).
  • Experience in using JIRA, HP QC, HP ALM tools for defect management.
  • Experience in using QTP and UFT automation tools for Regression Testing.
  • Well versed in UNIX commands used to run Autosys jobs.
  • Experience in conducting and actively participating in meetings and teleconferences with the Client, Users, Business Analyst team, Development team, QA team members, and Managers.
  • Excellent communication skills, with the ability to interact with different levels of management including Project Managers, Tech Leads, Developers, Analysts, and other team members regarding status updates and to highlight issues.
  • Expertise in preparing and reviewing Test Artifacts and Weekly and Monthly Status Reports with the Project team, Client, and other stakeholders.
  • Skilled at identifying risks in the product and drawing up mitigation plans accordingly.
  • Proven ability to work efficiently both independently and in team environments. Energetic and persevering self-starter known for exceeding goals and objectives.
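
A minimal sketch of the kind of back-end validation query used in this SQL-based testing; the customer and customer_stg tables and their columns are hypothetical placeholders, not objects from an actual project:

    -- Confirm a record captured through the application front end landed in the
    -- back-end table with the expected values (hypothetical CUSTOMER table).
    SELECT customer_id, first_name, last_name, status
    FROM   customer
    WHERE  customer_id = 1001
      AND  status = 'ACTIVE';

    -- Confirm every staged row made it to the target (no missing records).
    SELECT s.customer_id
    FROM   customer_stg s
    LEFT JOIN customer c ON c.customer_id = s.customer_id
    WHERE  c.customer_id IS NULL;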

TECHNICAL SKILLS:

Testing Tools: HP ALM 11, UFT 12, Quality Center, QuickTest Pro, TestDirector, Spira

Databases: DB2, Oracle, MS SQL Server, Teradata, Netezza, Sybase, ParAccel

Languages: SQL, PL/SQL, HTML, ASP, XML, C, C++

Other Tools: SailPoint IIQ, Informatica PowerCenter 9.6, Informatica Metadata Manager, IBM Metadata, JIRA, VersionOne, DOORS

Scripting Languages: JavaScript, UNIX Shell, VBScript

Utilities: SoapUI, Rapid SQL, Microsoft SQL Server Management Studio, WinSCP, Eclipse, Beyond Compare, TOAD, JDeveloper, SQL Assistant, Aginity, PuTTY, Tectia

Version Control: Rational ClearCase, Microsoft Visual SourceSafe

Reporting Tools: MicroStrategy BI

Operating Systems: Windows XP/7/2000/98/NT, UNIX

PROFESSIONAL EXPERIENCE:

Confidential, Las Vegas, Nevada

Quality Assurance Analyst

Responsibilities:

  • Responsible for the elicitation and analysis of requirements in collaboration with Process Owners, Developers, Vendors and stakeholders, and effectively translating these into functional test scripts/configuration requirements of the target solution.
  • Identifies and establishes the scope and parameters of systems analysis to define outcome criteria and measures of success.
  • Responsible for the ongoing operation and maintenance of the IAM application using the SailPoint IdentityIQ tool.
  • Collaborate with infrastructure specialists, and direct them as appropriate, to deliver and certify/test the target technology environment.
  • Triage incoming post-production issues and develop a plan to deliver resolution within the agreed service level.
  • Track production support tickets, provide dependable updates within the (Jira) ticket, and ensure appropriate assignments and priorities within the agreed SLA.
  • Conduct detailed analysis of business and technical requirements as needed to deliver the best solution. Document the solution within the (Jira) ticket.
  • Contribute as a liaison between level-1 support (helpdesk), the business, project management, product management, and engineering team leaders.
  • Responsible for loading existing non-employee profile information from Allegiant HR systems to populate IdentityIQ, which serves as the authoritative source for Allegiant’s non-employee information (a sample reconciliation query is sketched after this list).
  • Responsible for validating employee and non-employee IdentityIQ cube attributes, covering onboarding, transfers, terminations, and personal information updates.
  • Validating birthright provisioning of rule-driven groups to Active Directory and AIS for employees and non-employees.
  • Prepare test cases and scripts that are relevant to the design and functional requirements while adhering to defined quality and governance standards and procedures.
  • Maintain the scripts within Spira tool.
  • Clearly articulate the required test-planning efforts to inform the project team of the details and timelines associated with testing.
  • Involved in running, monitoring UFT Automation test scripts for regression testing and performing root cause analysis for the failures.
  • Involved in identifying Automation candidates and supporting Automation, performance teams.
  • Providing demos to Product owners during sprint and sprint review meetings.
  • Following and creating new initiatives for continuous process improvement throughout the different phases of the Project.
  • Responsible for performing BAT (Basic Acceptance Testing) on every build delivered by the Dev team.
  • Involved in both SIT and UAT testing and in validating Production issues raised by the NOC team.
  • Responsible for updating and tracking the status on respective User Stories in JIRA tool.
  • Involved in Agile processes such as daily Stand-ups, Test Estimations, Task Identification, Iteration Planning, Product Backlog, Review, and Retrospective meetings.
  • Responsible for tracking defects in JIRA and working closely with the Dev team on fixes/resolution through Triage meetings, and providing root cause analysis.
  • Responsible for communicating and escalating issues to the Project Manager, Dev Manager, QA Manager, and other stakeholders in a timely manner.
  • Responsible for generating the RTM to ensure all features and functionalities were tested and met the Business Users' criteria.
  • Responsible for preparing and sharing Daily Status Reports, Test Summary Reports, and SIT/UAT Sign-Off reports.
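
A minimal sketch of the kind of reconciliation query used when validating identity cube attributes against the HR feed; hr_non_employee_feed and iiq_identity_extract are hypothetical staging/extract tables, not the actual SailPoint IdentityIQ schema:

    -- Non-employee HR records that have no matching IdentityIQ identity.
    SELECT h.person_id, h.first_name, h.last_name
    FROM   hr_non_employee_feed h
    LEFT JOIN iiq_identity_extract i ON i.employee_number = h.person_id
    WHERE  i.employee_number IS NULL;

    -- Spot-check that a key cube attribute matches the authoritative HR value.
    SELECT h.person_id, h.department AS hr_department, i.department AS iiq_department
    FROM   hr_non_employee_feed h
    JOIN   iiq_identity_extract i ON i.employee_number = h.person_id
    WHERE  h.department <> i.department;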

Confidential, McLean, Virginia

UAT Test Lead

Environment: Informatica PowerCenter 9.6, SailPoint 6.4, DB2, SQL Server, Sybase, Oracle, ParAccel, SoapUI, UNIX, UFT, HP ALM 11, VersionOne, DOORS, IE, FF, Informatica Metadata Manager (IMM) 9.6 & 10.1, Informatica Analyst, Rapid SQL, Microsoft SQL Server Management Studio

Roles & Responsibilities:

  • As Test Lead, responsible for overall success of System and User Acceptance Testing (UAT) activities.
  • Interacting with Business Analysts, the Dev Team, the Project Management Team, SMEs, the Infrastructure team, and other stakeholders to understand the requirements using the BRD and SRS documents.
  • Involved in understanding the Mapping documents, prioritizing the Requirements and defining UAT Scope.
  • Responsible for Test Plan, Test Strategy documents preparation and review with Project Team, management and other stakeholders.
  • Interacting with Operations teams regarding Test Data preparation and Test Environment setup activities.
  • Responsible for identifying UAT testers, providing training, and coordinating UAT tasks.
  • Responsible for reviewing team members' work and supporting and monitoring deliverables on a daily basis.
  • Coordinating with the Project Team, Management, and other stakeholders and seeking approval for the Test Artifacts prepared for UAT.
  • Involved in validating the ETL process that loads the identified enterprise sources (HR, AD, Lotus Notes, Databases, OIM, TPI, CMDB, and other supplementary sources) into the Identity Data Warehouse staging database using Informatica PowerCenter.
  • Involved in executing the workflows and validating that data flows into the relevant tables from raw data files as specified in the mapping documents.
  • Involved in validating the ETL process that loads staging, correlated, and consolidated data into the Identity Data Warehouse main database using Informatica PowerCenter.
  • Involved in validating that Main DB data is reflected in SailPoint for both initial and incremental loads.
  • Involved in validating that SailPoint reflects all certification data with respect to the appropriately flagged resources (privileged, SOX, PPI, etc.) and roles (Groups, Manager, Non-Individual).
  • Involved in validating the access recertification process for Active, Transfer, and Terminated accounts using the SailPoint IdentityIQ provisioning tool.
  • Involved in performing Data Completeness and Accuracy checks, Transformation checks, Reload checks, Metadata checks, GUI checks, and Functionality checks for the identified in-scope data sources (a sample completeness/accuracy query is sketched after this list).
  • Involved in performing negative testing by mocking the data, and ensuring all the Errors are logged in corresponding tables.
  • Responsible for task identification, estimations, Test Metrics, preparation of Daily, Weekly, and Monthly status reports, Test Closure, and providing sign-offs.
  • Responsible for maintaining HP ALM for complete Requirements and Test Traceability.
  • Responsible for identifying End to End Business Flow Scenarios, Test Case design, Review, Test Execution and Test Result Documentation.
  • Responsible for tracking defects in ALM and working closely with the Dev team on fixes/resolution through Triage meetings, and providing root cause analysis.
  • Responsible for communicating and escalating issues to the Project Manager and other stakeholders in a timely manner.
  • Responsible for generating the RTM to ensure all features and functionalities were tested and met the Business Users' criteria.
  • Responsible for preparing Test Summary Reports and obtaining UAT Sign-Off in consultation with the Business and Project Management.
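
A minimal sketch of the completeness and accuracy checks referenced above, written against hypothetical src_account and stg_account tables (the names and columns are placeholders, not actual project objects):

    -- Completeness: source and staging record counts should match.
    SELECT 'source'  AS side, COUNT(*) AS record_count FROM src_account
    UNION ALL
    SELECT 'staging' AS side, COUNT(*) AS record_count FROM stg_account;

    -- Accuracy: source rows that are missing or different in staging.
    -- (Use MINUS instead of EXCEPT on Oracle.)
    SELECT account_id, account_name, account_status FROM src_account
    EXCEPT
    SELECT account_id, account_name, account_status FROM stg_account;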

Confidential

Senior Quality Assurance Analyst

Roles & Responsibilities:

  • As Senior Quality Assurance Analyst (UAT), responsible for the overall success of System and User Acceptance Testing (UAT) activities.
  • Interacting with Customers to collect, understand, and prioritize the Requirements for Single Family Operations.
  • Work closely with DBAs to collect the connection and configuration details for the identified DBs.
  • Involved in Test Plan, Test Approach, Test Case design and Test Result documentation.
  • Involved in creating and handling change requests, service requests, and MAC requests required to get access to user groups/accounts and OSSA IDs.
  • Involved in configuring and connecting to each data source via a connector to collect metadata (Oracle, DB2, Sybase, SQL Server, MM Agent BI, Informatica ETL) for the different data sources using Informatica Metadata Manager.
  • Coordinating UAT activities and sign-off with the customer, Project Team, and Manager.
  • Involved in viewing physical table/attribute metadata, performing upstream and downstream impact analysis, and viewing lineage using the IMM tool.
  • Involved in the intake, storage, and provisioning of access to view/edit the Business Glossary and in linking individual business glossaries with metadata on the MDR side using the Informatica Analyst tool (business view of metadata).
  • Attending Daily Standups, project status and issue meetings.
  • Communicate and escalate testing issues to the Project Manager in a timely manner.
  • Involved in creating Custom Models, Custom Resources, and Custom Attributes as per requirements.
  • Log Defects, perform root cause analysis, reproduce steps, coordinate with the ETL Ops Admin and DBA’s and follow up till closure.
  • Involved in creating additional links using Rule-Based Links for custom and non-custom resources, including ParAccel Matrix resources.
  • Involved in loading and validating metadata for DB2, Oracle, Sybase, and SQL Server data sources (a sample catalog-reconciliation query is sketched after this list).
  • Involved in loading and validating metadata for the IBM DataStage data source and for Java and COBOL data sources using AnalytixDS Mapping Manager (AMM).
  • Involved in loading and validating metadata for MicroStrategy and Erwin Model data sources using MM Agent BI connectors and modeling connectors.
  • Involved in preparing weekly, monthly status reports and Test closure documents to keep the Product Managers, Business Analysts, Developers and other stakeholders updated.
  • Supported Production Environment test activities by working closely with the ETL Admins.
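
A minimal sketch of the kind of check used to confirm that loaded metadata matches the source system catalog; mm_harvested_column is a hypothetical extract of the harvested metadata, not an actual Informatica Metadata Manager repository table:

    -- Columns present in the SQL Server source catalog but missing from the
    -- harvested metadata extract.
    SELECT c.TABLE_NAME, c.COLUMN_NAME, c.DATA_TYPE
    FROM   INFORMATION_SCHEMA.COLUMNS c
    LEFT JOIN mm_harvested_column m
           ON  m.table_name  = c.TABLE_NAME
           AND m.column_name = c.COLUMN_NAME
    WHERE  c.TABLE_SCHEMA = 'dbo'
      AND  m.column_name IS NULL;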

Confidential

Senior Quality Assurance Analyst

Roles & Responsibilities:

  • As Sr. Quality Assurance Analyst, responsible for the overall success of System Integration Testing (SIT) activities.
  • Interacting with the Business Analysts, Data Architects, Developers and other stakeholders on a day-to-day basis for gathering and analyzing the IT requirements.
  • Responsible for Test Plan, Test Approach, Test Metrics, status reports preparation and providing sign-offs.
  • Involved in Test Scenarios Creation, Peer review, Test cases design, Execution, Defects logging, performing root cause analysis, Test data preparation, Test Result documentation and maintaining HP ALM for complete Requirements and Test Traceability.
  • Involved in all Agile ceremonies like Daily scrum, Product backlog, Iteration Backlog, Iteration Planning, Task Identification and Estimations, Iteration review, Retrospective meetings.
  • Involved in GUI validation and functionality testing using different black box techniques for project modules including Search, Reports, and Work Order.
  • Involved in security testing, validating authentication for the various roles (internal and external users).
  • Involved in validating that the latest CSS styles (via inspect element) are implemented in the application web pages.
  • Involved in stored procedure testing, verifying the flattened tables' data using decision/branch coverage techniques.
  • Involved in back-end testing (CRUD operations), validating that data entered in the front end is stored in the back-end database for different features (a sample check is sketched after this list).
  • Involved in triggering and processing Autosys jobs in order to create different Loan Events.
  • Validating that LCVA target table data is created as per the logic and mappings given in the design documents.
  • Validating the functionality and generation process of various On-Demand and Scheduled Reports.
  • Validated data coming from different external Web Services through XML responses (SoapUI).
  • Involved in identifying, creating and executing the End to End scenarios.
  • Assisted in UIT and UAT smoke testing and in retesting defects identified during Prod deployments.
  • Involved in running, monitoring UFT Automation test scripts for regression testing and performing root cause analysis for the failures.
  • Involved in identifying Automation candidates and supporting Automation, performance teams.
  • Providing demos to Product owners during sprint and sprint review meetings.
  • Following and creating new initiatives for continuous process improvement throughout the different phases of the Project.
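
A minimal sketch of the CRUD-style back-end check referenced above; the work_order table and its columns are hypothetical placeholders rather than the actual application schema:

    -- After creating a work order in the UI, confirm the row was inserted with
    -- the values entered on the front end.
    SELECT work_order_id, status, created_by, created_date
    FROM   work_order
    WHERE  work_order_id = 50123;

    -- After deleting it in the UI, confirm the row is no longer present.
    SELECT COUNT(*) AS remaining_rows
    FROM   work_order
    WHERE  work_order_id = 50123;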

Confidential, Charlotte, NC

SIT QA Lead

Environment: Netezza, Oracle, SQL Server, QC 11, Java, Quartz, Web Services, MicroStrategy, UNIX, JIRA, Teradata, IBM Metadata Manager Tool

Roles & Responsibilities:

  • As QA Lead, responsible for work allocation and for reviewing the Scenarios, Test Cases, and Test Execution Results reported by QA Engineers.
  • Work closely with the business analysts to gather requirements, create technical and functional specifications for achieving the desired end results.
  • Providing clarifications to the offshore team and tracking status on a daily basis.
  • Involved in Test plan, Test strategy, Test Approach, Test Metrics, weekly status and Monthly status reports preparation and providing sign-offs.
  • Responsible for coordinating across various applications to ensure that all applications have completed relevant test deliverables on time and with high quality.
  • Involved in validating the VAPs applied on the columns according to the Business requirements.
  • Involved in writing queries to validate that the Transformation Rules are applied on the columns (a sample transformation check is sketched after this list).
  • Involved in GUI testing based on roles and authorization.
  • Involved in MicroStrategy Reports testing with front-end and back-end data validation.
  • Responsible for Setting up the Test Environment using the latest Builds delivered by Development Team. Involved in Web service testing.
  • Tracking the Defects and Performing Root Cause Analysis for the defects identified.
  • Working closely with developers in reproducing and fixing defects reported in QC.
  • Successfully implemented Mercury Quality Center for Test Planning, Test Execution, and Defect management with complete requirements traceability.
  • Performed User Acceptance Testing on behalf of End Users at client’s environment.
  • Responsible for updating and tracking the status on respective User Stories in JIRA tool.
  • Involved in Agile processes such as daily Stand-ups, Test Estimations, Task Identification, Iteration Planning, Product Backlog, Review, and Retrospective meetings.
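
A minimal sketch of a transformation-rule check of the kind referenced above; src_customer, tgt_customer, and the status-code mapping are hypothetical examples, not rules from the actual project:

    -- Flag target rows where the derived status code does not match the
    -- expected transformation of the source value.
    SELECT s.customer_id, s.status AS src_status, t.status_cd AS tgt_status
    FROM   src_customer s
    JOIN   tgt_customer t ON t.customer_id = s.customer_id
    WHERE  t.status_cd <>
           CASE s.status
                WHEN 'Active'   THEN 'A'
                WHEN 'Inactive' THEN 'I'
                ELSE 'U'
           END;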

Confidential

Senior Software Engineer

Roles and Responsibilities:

  • Understanding the requirements by analyzing the BRD, HLD, LLD, mapping, and other project documents.
  • Involved in Test scenarios, Test Case design, Test Execution and Test Reports preparation.
  • Tracking the Defects and Performing Root Cause Analysis for the defects identified.
  • Involved in verifying that data is transformed correctly according to the various business requirements and rules. Performed data completeness checks validating that all source data is loaded into the target DB without any data loss or truncation.
  • Involved in valid-value checks, verifying that particular fields contain only values specified in the requirements.
  • Involved in validating that the application appropriately rejects invalid data, replaces it with default values, and reports it (Reload check).
  • Involved in validating data checksums for record count matches, NULL/space-value fields, data integrity, and that duplicate data is not loaded (sample duplicate and NULL checks are sketched after this list).
  • Involved in layout, count, amount, replication, archive, email notification, log file, and load statistics checks.
  • Involved in generating and validating that the MicroStrategy reports are as per requirements.
  • Validated the MSTR report layout, data, prompts, attributes, filters, drill options, formatting, and export functionality.
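
A minimal sketch of the duplicate and NULL/space checks referenced above, against a hypothetical tgt_transaction table (the table and columns are placeholders):

    -- Duplicate check: business keys loaded more than once.
    SELECT transaction_id, COUNT(*) AS occurrences
    FROM   tgt_transaction
    GROUP BY transaction_id
    HAVING COUNT(*) > 1;

    -- NULL / space check on a mandatory column.
    SELECT transaction_id
    FROM   tgt_transaction
    WHERE  account_number IS NULL
       OR  TRIM(account_number) = '';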

Confidential

Roles and Responsibilities:

  • Understanding requirements, analyzing the Mapping Document, and creating SQL queries for testing data consistency in each phase.
  • Involved in Test scenarios, Test Case design, Test Execution and Test Reports preparation.
  • Tracking the Defects and Performing Root Cause Analysis for the defects identified.
  • Involved in validating that duplicate data is not loaded into the Target DB tables and that particular fields in the Target DB contain only values that are available in the source DB.
  • Involved in validating data is loaded into each relevant table in Target DB.
  • Involved in verifying the data is transformed correctly according to various business requirements and rules.
  • Involved in validating the layout and counts between the source and Target databases.

Confidential

Roles and Responsibilities:

  • Involved in understanding requirements, Test scenarios, Test case preparation and Execution.
  • Involved in logging the Defects and Performing Root Cause Analysis.
  • Involved in preparing estimations and communicating with the client on a daily basis.
  • Involved in updating test execution status reports and reporting status to the client.
  • Involved in validating the Business Metadata and Technical Metadata for the data elements listed in the 10 ECRRA Top of the House reports against the IBM Metadata tool.

Confidential

Software Test Engineer

Environment: Manual Testing, DB2, SQL Server, Oracle, QC, QTP, IE, FF

Roles and Responsibilities:

  • Involved in understanding the requirements and clarifications with Business Analysts.
  • Involved in Test Plan, Test Scenario, and Test Case design and review.
  • Involved in creating test data by inserting and updating data in different database tables.
  • Involved in writing and executing SQL queries for backend testing.
  • Involved in test case execution and updating the status on daily basis.
  • Involved in Defect logging, tracking, and root cause analysis.
  • Involved in GUI, functional, system, data and browser compatibility testing.
  • Involved in preparing estimations and communicating with the client.
  • Involved in preparing test execution status reports, defect reports, and status reports required to communicate with Project Management and the client.
  • Ensure testing specifications are properly linked to detailed business requirements and the Traceability Matrix.
  • Ensure all testing defects are assigned, worked and resolved according to the project schedule.
  • Responsible for managing change control of documents.
  • Involved in Status, Defect Triage and other project meetings.
  • Communicate status and issues to the Testing Coordinator.
