Sr. QA Analyst Resume
Columbus, OH
PROFESSIONAL SUMMARY
Over 8 years of IT experience in QA testing, with extensive Web-based, client/server, Web Service, and middleware testing for Public Sector, Healthcare, and Telecom domains.
- Experience in all phases of the Testing Lifecycle (SDLC and STLC), with expertise in Integration, Functional, Input Domain, Load, User Interface, Sanity, Black-box, User Acceptance, Regression, and Re-testing
- Worked closely with business analysts, programmers, and end users in a cross functional team
- Experience in Preparation of Test Plans, Test Scenarios, Test Cases and Test Data from requirements and Use Cases
- Experience with Oracle Enterprise Taxation Management (ETM) V2.2
- Complete software development lifecycle (SDLC) experience from business analysis through development, testing, deployment, documentation, maintenance, and user training, including maintaining project information in SharePoint
- Experience coordinating efforts between development teams and an offshore enterprise test team
- Extensive experience with Mercury Quality Center and TFS
- Expertise in Web-based and SOA architecture-based testing; worked with .NET- and Java-based application testing
- Responsible for ETL batch job execution using IBM Tivoli to load data from the core (ETM) database to Staging and Data Mart tables
- Experience in testing SOAP-based Web Services over XML
- Experienced in software analysis, Requirements Management, Quality Assurance, Configuration Management and Change Management.
- Worked with Waterfall, Agile, and Scrum methodologies
- Database experience includes Oracle & SQL Server.
- Good experience on UNIX and Windows platforms
- Strong experience with MS Office tools (Word, Excel, PowerPoint)
- Hands-on experience with DW/ETL testing and BI reporting tools such as Business Objects and Cognos
- Expertise in testing Data Marts and Data Warehouses: ETL (Informatica, DataStage, and Ab Initio), OLAP (Cognos, Business Objects), Reporting, and Documentation
- Strong OLTP and OLAP testing experience; good knowledge of Star and Snowflake schemas, Fact tables, and Dimension tables
- Experience in Data validation, Data merging, Data cleansing, and Data aggregation activities
- Good Experience in peer review of test cases and Preparing Test Reports
- Worked with Retesting and Regression testing
- Worked with ODS and DDL/DML commands; strong backend database testing experience
- Strong experience with SQL commands (SELECT, JOINs, UNION)
- Strong experience in Testing Procedures/Functions, Packages and Triggers using PL/SQL.
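The backend validation work summarized above typically reduces to row-count reconciliation and MINUS-style diffs between source and target tables. A minimal sketch, using an in-memory SQLite database with hypothetical `src`/`tgt` tables in place of the Oracle/Teradata environments named in this resume:

```python
import sqlite3

# Minimal sketch of SQL-based source-to-target validation.
# Table names and data are hypothetical placeholders.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE src (id INTEGER PRIMARY KEY, amount REAL)")
cur.execute("CREATE TABLE tgt (id INTEGER PRIMARY KEY, amount REAL)")
cur.executemany("INSERT INTO src VALUES (?, ?)", [(1, 10.0), (2, 20.0), (3, 30.0)])
cur.executemany("INSERT INTO tgt VALUES (?, ?)", [(1, 10.0), (2, 99.0)])

# Row-count reconciliation between source and target.
src_count = cur.execute("SELECT COUNT(*) FROM src").fetchone()[0]
tgt_count = cur.execute("SELECT COUNT(*) FROM tgt").fetchone()[0]

# MINUS-style diff (SQLite spells it EXCEPT): rows present in the
# source but missing or mismatched in the target.
mismatches = cur.execute(
    "SELECT id, amount FROM src EXCEPT SELECT id, amount FROM tgt ORDER BY id"
).fetchall()
print(src_count, tgt_count)  # 3 2
print(mismatches)            # [(2, 20.0), (3, 30.0)]
```

On Oracle the diff step would use MINUS rather than EXCEPT; the reconciliation logic is otherwise the same.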
EDUCATION
Master’s Degree in Computer Applications
TECHNICAL SKILLS
Testing Tools: HP Quality Center (QC), TFS, Test Director, Jira
Web Services: SOAP, WSDL
Databases: Oracle 8i/9i/10g, SQL Server, Teradata
ETL Tools: Informatica, DataStage
Reporting Tools: Cognos, Business Objects
Scripting Languages: JavaScript, Shell Script, Perl
Languages: C, C++, Java, .NET, VB
GUI Tools: Erwin, TOAD, Teradata SQL Assistant
Operating Systems: Windows, UNIX, Linux, Solaris
Unit Testing Tools: JUnit, NUnit
Browsers: IE 8.0, Firefox
Internet Technologies: HTML, XML, XML Schemas, JavaScript
PROFESSIONAL EXPERIENCE
Project Title: STARS (State Tax and Revenue System)
Client: Confidential, Columbus, OH Nov 2011 to Present
Sr. QA Analyst
Project: A software system that integrates all tax information into one database and offers the following major functions for all state-administered taxes: Taxpayer Identification, Returns Processing, Taxpayer Accounting, Revenue Accounting, Case Management and Common Application Services (i.e. correspondence and document management)
Environment: Quality Center 9.2, Java, JSP, C++, Web Services (SOAP, XML), EJB, JUnit, WebSphere, Oracle 10g, TOAD, UNIX, DataStage, Cognos
Responsibilities:
- Understanding Functional Requirement Specifications and System Requirement Specifications
- Preparation of Test Scenarios, Test Cases and Test Data
- Provide review comments on Test Scenarios and Test Cases based on Functional Requirement Specifications
- Created Test Plan Document based on Test Strategy Document
- Worked with SOA (Service Oriented Architecture)
- Performed Web Application and Web Services testing
- Testing web services using SOAP UI
- Design and Execution of Test Cases and Test Scripts
- Preparation of Requirement Traceability Matrix, Test Metrics, and Deployment Manual
- Performed Sanity Testing, Data Driven Testing & Ad-hoc Testing when required.
- Performed system testing to ensure the validity of the requirements and mitigation of risks prior to formal acceptance.
- Performed black-box testing and User Acceptance Testing with UAT test scenarios.
- Extensively used SQL queries for data validation and backend testing.
- Functionality, Integration, Interface, and Regression testing
- Found and reported defects, validated fixes, and repeated the process until resolution
- Managed defects using Quality Center
- Involved in Build deployment activities
- Evaluated and suggested improvements to the software development process.
- Coordinate efforts between development teams and offshore enterprise test team
- Monitoring Testing Activities within the team and reporting regular progress to the Test Manager
- Implemented testing approaches to accommodate tight schedules and resource constraints, including risk-based analysis to determine test coverage
- Mentor QA engineers in performing testing activities
- Conducted Project Domain and Internal Project Functionality Training Sessions
- Work closely with the software engineers to ensure successful, high quality releases.
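The SOAP/XML web service testing above centers on building request envelopes and asserting on responses. A minimal sketch of the request side, where the namespace and the `GetTaxpayer` operation are hypothetical placeholders (the actual STARS service contracts are not given in this resume):

```python
import xml.etree.ElementTree as ET

# Build a SOAP 1.1 request envelope for a hypothetical GetTaxpayer operation.
SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
SVC_NS = "http://example.com/stars/taxpayer"  # hypothetical service namespace

ET.register_namespace("soap", SOAP_NS)
envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
op = ET.SubElement(body, f"{{{SVC_NS}}}GetTaxpayer")
ET.SubElement(op, f"{{{SVC_NS}}}TaxpayerId").text = "12345"

request_xml = ET.tostring(envelope, encoding="unicode")

# In practice a tester would POST request_xml to the endpoint (e.g. via
# SOAP UI) and assert on the response; here we parse our own envelope back.
parsed = ET.fromstring(request_xml)
taxpayer_id = parsed.find(f".//{{{SVC_NS}}}TaxpayerId").text
print(taxpayer_id)  # 12345
```

In SOAP UI the same check is expressed as an XPath or contains assertion on the response body.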
Confidential, KY January 2011 to November 2011
Sr. QA Analyst
Project: To provide a comprehensive customer service site, the Kentucky Child Support Web Portal, where custodial parents, non-custodial parents, and general citizens can access child support information and other general information. The scope of this project is to generate Federal reports such as the 157 and 34A, canned reports, and dashboards.
Responsibilities:
- Understanding Functional Requirement Specifications and System Requirement Specifications
- Provide review comments on Test Scenarios and Test Cases based on Functional Requirement Specifications
- Design and Execution of Test Scenarios and Test Cases based on Functional Specification Document, Use Cases, Low Level Design document
- Preparation of Requirement Traceability Matrix, Test Metrics, and Deployment Manual
- Performed Sanity Testing, Data Driven Testing & Ad-hoc Testing when required.
- Performed system testing to ensure the validity of the requirements and mitigation of risks prior to formal acceptance.
- Involved in Build deployment activities and ran UNIX scripts
- Performed User Acceptance Testing with UAT test scenarios.
- Extensively used SQL queries for data validation and backend testing.
- Involved in Data Migration Testing, black-box testing, and grey-box testing
- Worked with SOA (Service Oriented Architecture) and different real-time interfaces
- Functionality, Interface, and Regression testing
- Reported defects, validated fixes, and repeated the process until resolution
- Tracked and managed defects using Quality Center
- Involved in Build deployment activities and participate in design (spec) reviews with the Core team
- Evaluated and suggested improvements to the software development process.
- Coordinate efforts between development teams and offshore enterprise test team
- Mentor QA engineers in performing testing activities
- Monitoring Testing Activities within the team and reporting regular progress to the Test Manager
- Trained new and offshore employees on areas of the product and testing processes and procedures
- Work closely with the software engineers to ensure successful, high quality releases.
Confidential, Chicago, IL October 2010 to December 2010
Sr. QA Tester
Project: The NCPDP provides monthly update files for Provider Groups, Pharmacies, Payment Centers, and the Pharmacy's association to a Provider Group or Payment Center. The files contain information to add, change, or delete demographic information for the NCPDP and/or Payment Center. The Pharmacy's association to the NCPDP and/or Payment Center can be added, changed, or deleted by processing these files.
Responsibilities:
- Understanding of Functional Requirement Specifications and System Requirement Specifications.
- Provided review comments on Test Scenarios and Test Cases based on Functional Requirement Specifications.
- Prepared Test Cases and Test Scenarios based on Functional Specification Document, Use Cases.
- Performed Sanity Testing, Data Driven Testing & Ad-hoc testing as required.
- Performed System Testing to ensure the validity of the requirements and mitigation of risks prior to formal acceptance.
- Performed Interface and End to End Testing
- Performed User Acceptance Testing with UAT test scenarios.
- Extensively used SQL queries for data validation and backend testing.
- Found and reported defects and subsequently validated the fix, repeating the process until done.
- Involved in Build deployment activities and participated in design (spec) reviews with the Core team.
- Worked closely with the software engineers to ensure successful, high quality releases.
Environment: UNIX, TLINE 5.0, Java, J2EE, EJB, WebSphere 6.0, Oracle 10g, Quality Center 9.2 (QC)
Confidential, Chicago, IL March 2010 to September 2010
Sr. Tester
Project: To address the required changes to the Agent Rebate process, procedures, and related systems in order to meet the Mythos 2010 implementation timeframe.
It was determined that Mythos 2010 initiatives for Pricing and Rewards would impact the Agent Rebate Process. To ensure the Agent Channel was not adversely impacted in the short term by the Mythos 2010 launch, new processes and/or system changes needed to occur to support accurate and timely equipment rebate payments to Agents.
Responsibilities:
- Understanding of Business Requirement Specifications and System Requirement Specifications.
- Prepared Test Cases based on Functional Requirement Specification Document, Use Cases, and Low Level Design document.
- Prepared Test Scripts and Test Data based on Functional Requirement Specification Document.
- Performed data migration activities in Informatica from one version to another
- Performed System, Functional, Interface and Integration Testing.
- Performed Regression and Retesting
- Validated data from Source (CARES system) to Target (EDH system)
- Performed source-to-target data validation with field-to-field verification
- Performed end to end ETL testing using Informatica ETL tool.
- Involved in BA walkthroughs and review of the ETL Specification Document
- Created SQL queries to fetch and verify the data from Source and Target (Teradata).
- Conducted ODS data testing; working knowledge of XML
- Loaded the data into Teradata.
- Involved in Build deployment activities and ran UNIX scripts.
- Prepared Test Scenarios, Deployment Manual, and Test Metrics
- Performed Sanity Testing, Data Driven Testing & Ad-hoc testing.
- Verified the output format of the generated file feed and compared it with CARES data
- Checked the control table data, verified report data with source.
- Ran the ETL process from CARES to EDH (Teradata) in UNIX environment.
- Designed the Requirement Traceability Matrix document based on the Functional Requirement document and Test Cases document.
- Provided daily status to Test Lead.
- Logged defects in the bug tracking tool (QC), verified resolved bugs, and reported regular progress to the Test Manager
- Conducted peer review of test cases and provided comments
- Worked closely with the Developer and BA to ensure successful, high quality releases
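The field-to-field source-to-target verification described above checks each target column against the transformed source value from the mapping. A minimal sketch under stated assumptions: SQLite stands in for Oracle/Teradata, the `cares_src`/`edh_tgt` tables and the trim-plus-uppercase rule are hypothetical examples of a mapping transformation:

```python
import sqlite3

# Field-to-field verification against a simple transformation rule
# (TRIM + UPPER). Table names, columns, and data are hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE cares_src (id INTEGER, name TEXT)")
cur.execute("CREATE TABLE edh_tgt (id INTEGER, name TEXT)")
cur.executemany("INSERT INTO cares_src VALUES (?, ?)",
                [(1, "  alice "), (2, "bob"), (3, "carol")])
cur.executemany("INSERT INTO edh_tgt VALUES (?, ?)",
                [(1, "ALICE"), (2, "BOB"), (3, "carol")])

# Join source to target on the key and flag rows where the target field
# does not equal the transformed source field.
bad_fields = cur.execute("""
    SELECT s.id, s.name, t.name
    FROM cares_src s JOIN edh_tgt t ON s.id = t.id
    WHERE t.name <> UPPER(TRIM(s.name))
    ORDER BY s.id
""").fetchall()
print(bad_fields)  # [(3, 'carol', 'carol')]
```

Each flagged row becomes a candidate defect: the source value, the expected transformed value, and the actual target value go into the defect report.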
Environment: Java, EJB, WebLogic Server, Informatica 8.0 and 8.6, Oracle 10g, Teradata, TOAD, UNIX, Quality Center 9.2
Confidential, September 2009 to February 2010
QA Analyst
Project: The Quote-to-Cash (Q2C) program implemented global standardized processes and platforms from the customer quote through cash collection cycle.
Responsibilities:
- Understanding Functional Requirement Specifications and System Requirement Specifications
- Provide review comments on Test Scenarios and Test Cases based on Functional Requirement Specifications
- Design and Execution of Test Scenarios and Test Cases based on Functional Specification Document, Use Cases, Low Level Design document
- Preparation of Requirement Traceability Matrix, Test Metrics, and Deployment Manual
- Performed Sanity Testing, Data Driven Testing & Ad-hoc Testing when required.
- Performed system testing to ensure the validity of the requirements and mitigation of risks prior to formal acceptance.
- Performed User Acceptance Testing with UAT test scenarios.
- Extensively used SQL queries for data validation and backend testing.
- Functionality, Interface, and Regression testing
- Reported defects, validated fixes, and repeated the process until resolution.
- Tracked and managed defects using Quality Center
- Work closely with the software engineers to ensure successful, high quality releases.
Environment: Quality Center 9.2, ASP.NET, C#.NET, SQL Server, TOAD, Visual Studio
Confidential, India January 2008 to August 2009
QA Analyst
Project: The Ecrins2 application provided order-promising capabilities to HP business users. Ecrins2 was a critical application responsible for order promising and allocation management; it therefore had built-in High Availability to ensure 24/7 uptime. High Availability was achieved through server and data replication: all transactions were mirrored on a primary and secondary server, which facilitated inventory reduction and faster response to changes in supply and demand.
Responsibilities:
- Involved in leading and participating in the full software life cycle for testing activities – from test planning, test execution to test monitoring, status reporting, documentation and data validation.
- Reviewed requirements documented (mapping document) by Business Analyst for thorough understanding of the application.
- Validated the data through various stages of data movement from Staging to Data Store to Data Warehouse tables.
- Created Test Cases using the SDLC procedures and reviewed them with the Test lead.
- Executed all Test Cases in the Test Environment, maintained them, and documented the test queries and results for future reference.
- Validated the external data source by connecting through Oracle heterogeneous services and joining them to existing tables and reporting the anomalies in the data.
- Tests included System Testing, Regression and Business Objects reports testing.
- Provided flexible & high quality support to wider Testing strategies on key regular deliverables & ad-hoc reporting issues.
- Extensively tested the Business Objects report by running the SQL queries on the database by reviewing the report requirement documentation.
- Validated the reporting objects in the reporter against the design specification document.
- Validated the data files from source to make sure correct data has been captured to be loaded to target tables.
- Validated the ETL load process to make sure the target tables were populated according to the data mapping, provided it satisfied the transformation rules.
- Validated the Archive process to purge the data that met the defined business rules.
- ETL validation, Data Model Validation.
- Involved in validating the aggregate table based on the rollup process documented in the data mapping.
- Performed data migration activities in Informatica from one version to another
- Retested the modifications, once bugs were fixed after reinstalling the application.
- Reported the bugs through email notifications to developers using Mercury Quality Center.
- Generated Problem Reports for the defects found during execution of the Test Cases and reviewed them with the Developers. Worked with Developers to identify and resolve problems.
- Led and scheduled QA project status meetings and published meeting minutes.
- Developed presentations and shared testing implementation learnings with other testing resources for cross-functional training.
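The aggregate-table validation described above recomputes the rollup from the detail source and diffs it against the stored aggregate. A minimal sketch, again with SQLite standing in for Oracle/DB2 and hypothetical `order_detail`/`order_agg` tables in place of the actual mapping:

```python
import sqlite3

# Validate an aggregate (rollup) table against its detail source, per a
# hypothetical SUM-by-region rollup rule from the data mapping.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE order_detail (region TEXT, amount REAL)")
cur.execute("CREATE TABLE order_agg (region TEXT, total REAL)")
cur.executemany("INSERT INTO order_detail VALUES (?, ?)",
                [("EAST", 10.0), ("EAST", 5.0), ("WEST", 7.0)])
cur.executemany("INSERT INTO order_agg VALUES (?, ?)",
                [("EAST", 15.0), ("WEST", 8.0)])

# Recompute the rollup from detail and keep only regions where the stored
# total disagrees with the recomputed sum.
diffs = cur.execute("""
    SELECT a.region, a.total, SUM(d.amount) AS expected
    FROM order_agg a JOIN order_detail d ON a.region = d.region
    GROUP BY a.region, a.total
    HAVING a.total <> SUM(d.amount)
""").fetchall()
print(diffs)  # [('WEST', 8.0, 7.0)]
```

An empty result means the rollup matches; each returned row shows the stored total alongside the expected recomputed value.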
Environment: Informatica PowerCenter 7.1.4 and 8.1, Oracle 8i, DB2, TOAD, UNIX, Mercury Quality Center
Confidential, India July 2006 to December 2007
Responsibilities:
- Involved in test planning, writing test cases/scripts, test case automation and test execution.
- Documented test scenarios and test cases.
- Prepared ETL and SQL routines/code for performance ETL testing (system and integration testing).
- Verified report data with source, field level data verification.
- Familiar with defect management systems
- Read and understood data model.
- Participated in Full Lifecycle Development on large projects.
- Understood Unit, Functional, System, Performance, Technical and Operational testing and the tools utilized.
- Knowledge of software development life cycle principles and awareness of Agile practices.
- Experienced with data migration testing, data validation testing.
- ETL tool testing & data integrity testing.
Environment: Quality Center 9.2, ASP.NET, C#.NET, SQL Server, TOAD, Visual Studio, Informatica
Confidential, India July 2005 to June 2006
Responsibilities:
- Worked on Informatica Designer tools: Source Analyzer, Warehouse Designer, Mapping Designer, Transformations, Repository Manager, Workflow Manager, and Workflow Monitor.
- Created the Mappings Using Transformations like Expression, Router, Filter, Normalizer, Sequence Generator, Joiner, Lookup, Sorter and Aggregator.
- Derived facets by analysis of the text of an item using entity extraction techniques.
- Tested the Performance of Mapping and was responsible for design, development and Test Cases
- Scheduled the sessions and workflows.
Environment: Informatica 7.1.2 (Power Connect, Power Mart, Power Center, Designer, Workflow Manager, Administrator and Repository Manager) Oracle 9i, PL/SQL, TOAD, Business Objects 6.5, UNIX
Confidential, January 2004 to June 2005
Responsibilities:
- Understanding of Business Requirement Specifications
- Design and Execution of Test Cases, Test Scripts and Test Data
- Report defects and subsequently validating the fix, repeating the process until done.
Environment: ASP.NET, C#.NET, SQL Server
Confidential, India June 2003 to December 2003
Responsibilities:
- Understanding of Business and System Requirement Specifications.
- Design and Execution of Test Cases, Test Scripts and Test Data
- Report defects and subsequently validating the fix, repeating the process until done.
Environment: Test Director, C#.NET, SQL Server