QA Analyst Resume Profile
PROFESSIONAL SUMMARY:
- Over 8 years of experience in information technology as a Software QA professional.
- Strong knowledge of the Software Development Life Cycle (SDLC) and Software Testing Life Cycle (STLC).
- Hands-on experience working in Waterfall, V-Model, and Agile/Scrum methodologies.
- Extensive experience analyzing requirements and functional and technical specifications and converting them into test scenarios and test cases.
- Strong working experience in Black Box, White Box, Functional, System, Integration, End-to-End, Regression, and Automation testing.
- Experienced in web application testing, including Functionality, Usability, Interface, Compatibility, and Security testing.
- Hands-on experience with test and defect management tools such as Quality Center, Microsoft Test Manager (MTM), Team Foundation Server (TFS), and JIRA.
- Extensive knowledge of ETL processes, OLAP, OLTP, N-tier data architecture, and RDBMS.
- Hands-on experience in ETL testing and data validation using ODI (Oracle Data Integrator) and DataStage.
- Experience in data validation and testing of RDBMS database applications in Oracle, SQL Server, DB2, and MS Access.
- Hands-on experience analyzing and creating test data to validate ETL transformation rules and business rules, including positive and negative scenarios and conditions.
- Hands-on experience in BI report testing using OBIEE, Hyperion, Enterprise Performance Management (EPM) 11, and Business Objects.
- Good understanding of the OBIEE repository layers (Physical, Business Model and Mapping, and Presentation) for both stand-alone and integrated analytics implementations.
- Hands-on experience writing simple and complex SQL (DDL, DML, and DCL) queries for back-end testing and data validation.
- Good understanding of data warehouse objects such as fact tables, dimension tables, relationships, unique identifiers, and constraints.
- Expertise in writing simple and complex SQL queries using TOAD and SQL Developer.
- Hands-on experience in test planning, test execution, and reporting using Microsoft Test Manager and Lab Management.
- Expertise in defect tracking, defect management, and reporting using Quality Center, BugTracker, and JIRA.
- Hands-on experience with configuration and version management tools Subversion (SVN) and JIRA.
- Built strong relationships with client, development, and other teams to ensure the highest-quality deliverables and deliver them on time.
Technical Emphasis
Data Warehousing | DataStage 9.1/8.1, ODI 11.1.1/10.0 |
Reporting Tools | OBIEE 11.1.1/10.1, BI Publisher 11.1.1.6/10.1.3.4, EPM 11, SAP BO 6.0 |
Data Modeling | Star-Schema Modeling, Snowflake-Schema Modeling, Fact and Dimension Tables, Pivot Tables, Erwin |
Testing Tools | SOAP UI, Quick Test Pro (QTP), Quality Center, TopTeam, Test Director, Rally, Application Lifecycle Management (ALM), Rational ClearQuest |
RDBMS | Oracle 11i/10g/9i/8i/7.x, MS SQL Server 2008, UDB DB2 9.x, Sybase 12.5, Teradata V2R6, MS Access 7.0 |
Programming | UNIX Shell Scripting (Korn, C, Bourne, Bash), SQL, SQL*Plus, PL/SQL, TOAD, C |
Web Technologies | JavaScript, HTML 4.0, DHTML, Java, JSP, XML, J2EE, Microsoft .NET, and C# |
Environment | UNIX, MVS, HP-UX, IBM AIX 4.2/4.3, Novell NetWare, Win 3.x/95/98, NT 4.0 |
Configuration Management | Subversion (SVN), Rally, JIRA |
Methodologies | Waterfall, V-Model, Agile, Verification and Validation |
PROFESSIONAL EXPERIENCE
Confidential
Sr. ETL/ BI Tester
Safeway is an American supermarket chain that was acquired by Albertsons in early 2014. The merger project merges the Safeway and Albertsons applications and databases and improves Safeway's reporting capabilities using cutting-edge BI tools to support a wide range of business decisions.
Responsibilities:
- Analyzed data models, data mapping, design, conversion, and ETL design documents
- Created, maintained, and tracked the Requirement Traceability Matrix
- Created and executed test cases in Microsoft Test Manager (MTM) and generated status reports for team meetings
- Worked with the project team to triage defects using MTM and VersionOne
- Responsible for coordinating ETL testing across Source, Staging, ODS, EDW, and BI layers
- Wrote SQL queries to test the integrity of data in the database (backend testing)
- Tested database integrity, referential integrity, and constraints during the database migration testing process
- Tested different types of customized reports (drilldown, aggregation) created in OBIEE to meet client requirements
- Created and administered the Physical, Business Model and Mapping, and Presentation layers in an OBIEE repository using the OBIEE Administration Tool
- Validated report layouts in OBIEE based on the Report Layout Document
- Created and executed test cases on Oracle Reports and OBIEE
- Performed backend and database testing on Oracle, MS SQL Server, and MS Access databases
- Analyzed business requirements, transformed data, and mapped source data from the source system to the Teradata Physical Data Model using the Teradata Financial Services Logical Data Model
- Involved in Teradata SQL development, unit testing, and performance tuning
- Used SQL tools for querying Oracle and Teradata SQL Assistant for querying Teradata
- Performed data migration and data distribution testing
- Designed and kept track of the Requirement Traceability Matrix
- Updated Quality Center, loaded test cases, wrote the Test Plan, executed test cases, and generated status reports for team meetings
- Extensively involved in ETL test data creation for all the ETL mapping rules
- Executed SQL statements to test the integration between the application and the database
- Worked with OBIEE to test the Physical Model, Business Model, and Presentation layers and established relationships
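The source-vs-target ETL validation described in the bullets above can be sketched as follows. This is a minimal illustration only, assuming a simple source/target table pair; the table names, columns, and data are hypothetical (the actual work ran against Oracle and Teradata), and SQLite stands in for the real databases.

```python
import sqlite3

# Minimal sketch of source-vs-target ETL validation.
# Table and column names are hypothetical stand-ins.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE src_orders (order_id INTEGER, amount REAL)")
cur.execute("CREATE TABLE tgt_orders (order_id INTEGER, amount REAL)")
rows = [(1, 10.0), (2, 25.5), (3, 7.25)]
cur.executemany("INSERT INTO src_orders VALUES (?, ?)", rows)
cur.executemany("INSERT INTO tgt_orders VALUES (?, ?)", rows)

# 1. Row-count reconciliation between source and target.
src_count = cur.execute("SELECT COUNT(*) FROM src_orders").fetchone()[0]
tgt_count = cur.execute("SELECT COUNT(*) FROM tgt_orders").fetchone()[0]
assert src_count == tgt_count, "row counts differ"

# 2. MINUS-style check: rows present in source but missing from target.
missing = cur.execute(
    "SELECT order_id, amount FROM src_orders "
    "EXCEPT SELECT order_id, amount FROM tgt_orders"
).fetchall()
assert missing == [], f"rows lost in the load: {missing}"
print("ETL validation passed")
```

The same two checks (count reconciliation plus a set-difference query) are the usual first pass before deeper transformation-rule validation.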
Environment: DataStage 8.0, OBIEE 11g, Teradata, SQL, Microsoft Visual Studio, MTM, MS Office, SQL Server, Oracle Data Staging, Version One, Agile, Kanban
Confidential
Role: Sr. QA Engineer/Sr. QA Analyst
Project Description: The Integrated Eligibility System (IES) project's goal is to improve access to programs serving economically disadvantaged people by providing a simple, efficient, seamless, and traceable system for people to access and manage their health coverage, insurance, or aid for the state of PA.
Responsibilities:
- Reviewed and analyzed the requirements documents, design documents, data design/flow, and data mapping documents to identify test scenarios and test data for testing
- Collected and reviewed the business flow and business scenarios from business analysts
- Responsible for creating the Requirement Traceability Matrix (RTM) in Quality Center and generating the report to verify test case vs. requirements coverage
- Developed and maintained Test Approach and Test Plan documents for Functional, System, and Compliance testing and reviewed them with project team members
- Responsible for writing and executing test cases/test scripts for accessibility testing of the web application, per government compliance requirements, using JAWS, AccVerify, and WAVE
- Responsible for performing Section 508 compliance validation testing
- Responsible for working within an Agile development environment and a Scrum software development and testing framework
- Responsible for participating in Scrum calls and providing updates, working closely with the Scrum Master and Product Owner; analyzed the product and sprint backlogs and developed the test matrix
- Developed and generated test progress and defect reports from Quality Center and communicated them to the project team
- Involved in identifying manual test scenarios for developing automation and performance scripts
- Developed a data-driven automation framework using QTP and Quality Center
- Developed automation scripts using a data-driven methodology that applies business rules to validate the components displayed on the website
- Prepared automation test cases and required screenshots for the web application and developed automation scripts focused on key business use cases using QTP
- Customized and enhanced automation scripts based on business rule validation and implemented checkpoints, output values, regular expressions, etc. in QTP
- Responsible for overall software product quality
- Ensured the compatibility of all application platform components, configurations, and their upgrade levels in production, and made necessary changes to the lab environment to match production
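The data-driven pattern behind the automation framework above (a generic script iterating over a data sheet of inputs and expected results, applying a business rule to each row) can be sketched as follows. The original framework used QTP/VBScript with Quality Center data sheets; the business rule, field names, and data below are purely hypothetical illustrations.

```python
# Sketch of the data-driven testing pattern: test data lives in a table,
# one generic script loops over the rows and applies the business rule.

def eligible_for_coverage(age: int, income: float) -> bool:
    """Hypothetical business rule: adults under an income threshold qualify."""
    return age >= 18 and income < 30000

# Each row: (age, income, expected_result) - the "data sheet".
data_rows = [
    (25, 20000, True),
    (17, 10000, False),
    (40, 50000, False),
]

failures = []
for age, income, expected in data_rows:
    actual = eligible_for_coverage(age, income)
    if actual != expected:
        failures.append((age, income, expected, actual))

assert not failures, f"business-rule failures: {failures}"
print(f"{len(data_rows)} data-driven checks passed")
```

Keeping the data separate from the script is what lets one script cover many positive and negative scenarios without modification.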
Environment: MySQL, SQL Developer, TopTeam, Application Lifecycle Management (ALM), SharePoint, JAWS, AccVerify, WAVE, QTP 11.0, Oracle 12c, MS Office, Outlook, JIRA, Confluence, Beyond Compare
Confidential
Role: Sr. QA Analyst/QA Engineer
Project Description: The EIM program at USDA builds a new Enterprise Data Warehouse (EDW) in an Oracle environment for reporting based on user requirements, and migrates the existing data marts from Informix to Oracle. The testing team tested the ETL process and BI reports for the new data mart implementation, the data migration from Informix to Oracle, and the new and migrated OBIEE and EPM 11 reports.
Responsibilities:
- Worked as the single point of contact for all testing activities, including Functional, System, Integration, End-to-End, and UAT testing
- Reviewed and analyzed the requirements documents, design documents, data design/flow, and data mapping documents to identify test scenarios and test data for testing
- Collected and reviewed business scenarios from business users and developed test scenarios and test cases to validate them
- Developed and maintained Test Approach and Test Plan documents for Functional, System, and UAT testing and reviewed them with the project management and business user teams
- Responsible for developing and maintaining user guides to support UAT testing
- Managed resource allocation and task assignments for the different data marts across team members
- Mentored junior test engineers to encourage best practices and thorough testing of the application
- Responsible for exporting requirements and test cases into Quality Center using the Excel add-in
- Responsible for creating the Requirement Traceability Matrix (RTM) in Quality Center and generating the report to verify test case vs. requirements coverage
- Responsible for test case execution and defect management using Quality Center
- Developed and generated test progress and defect reports from Quality Center
- Responsible for testing progress meetings and sending testing status reports to the project management and development teams
- Developed test cases using WSDL and schema files, which define the web service request, response, methods/operations, and endpoint of the web service to be tested
- Tested various Service-Oriented Architectures (SOAs) spanning multiple departments via SOAP/WSDL using SOAP UI Pro
- Worked as an ETL tester responsible for requirements/ETL analysis, ETL testing, and designing the flow and logic for the data warehouse project
- Performed backend database testing by writing SQL and PL/SQL scripts to verify data integrity
- Tested all OBIEE dashboards according to the requirements
- Validated security and navigation to particular screens, dashboards, pages, and reports in OBIEE
- Validated access to data, screens, dashboards, pages, and reports; response times to access the application and particular dashboards, pages, and reports; and capabilities of OBIEE reports
- Validated that summary reports work from charts and tables, table headings are correct, and counts match between summary and detail reports where appropriate
- Tested drilldown, drill-up, and pivot reports generated from OBIEE
- Wrote several complex SQL queries to validate OBIEE reports
- Performed testing based on change requests and defect requests
- Prepared test status reports for each stage and logged any unresolved issues in the issues log
- Used Quality Center for bug reporting; tracked and reported bugs with Quality Center
- Created database checkpoints and wrote SQL queries to validate data
- Created and maintained a shared object library, functions, and data sheets in Quality Center for QTP scripts
- Developed and implemented standard testing practices/approaches to improve the testing process for manual and automation testing using Quality Center 10.0 and QTP 10.0
- Used the defect tracking tool HP Quality Center to trace, assign, verify, and close defects
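The web service assertions described above (validating a WSDL-defined operation's response in SOAP UI Pro) amount to structural and field-level checks on the SOAP envelope. Below is a minimal Python sketch of the same idea; the service, namespace, operation, and field names are hypothetical, and a canned response stands in for the live endpoint.

```python
import xml.etree.ElementTree as ET

# Sketch of SOAP-UI-style response assertions expressed in Python.
SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
SVC_NS = "http://example.org/eligibility"  # hypothetical target namespace

# A canned response of the shape a WSDL-defined operation might return.
response_xml = f"""
<soap:Envelope xmlns:soap="{SOAP_NS}">
  <soap:Body>
    <svc:CheckStatusResponse xmlns:svc="{SVC_NS}">
      <svc:status>APPROVED</svc:status>
      <svc:caseId>12345</svc:caseId>
    </svc:CheckStatusResponse>
  </soap:Body>
</soap:Envelope>
"""

root = ET.fromstring(response_xml)
ns = {"soap": SOAP_NS, "svc": SVC_NS}

# Assertion 1: the body contains the expected operation response element.
body = root.find("soap:Body/svc:CheckStatusResponse", ns)
assert body is not None, "missing CheckStatusResponse element"

# Assertion 2: field-level check, like a SOAP UI XPath-match assertion.
status = body.find("svc:status", ns).text
assert status == "APPROVED", f"unexpected status: {status}"
print("SOAP response assertions passed")
```

In SOAP UI itself these checks are configured as "contains" and XPath-match assertions on each test step rather than written by hand.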
Environment: SQL Developer, Oracle EPM 11.1.1 Web Client, ODI Release 10.1.3, Hyperion 8.5, Informix Client, Quality Center 10.0, QTP 10.0, MS Office 2010, SQL Server 2008, Java, Visual Basic Scripts, SharePoint, PuTTY, WinSCP, Subversion (SVN), JIRA, Web Services, SOAP UI, XML
Confidential
Role: Sr. QA Engineer II/Functional Tester
Project Description: Newly upgraded sales and support web applications of Intel Corp., moved from a legacy system to the .NET framework, allow customers to create and manage their accounts, orders, delivery tracking, payment management, etc. The testing team performed web application testing, including Functional, Security, Compatibility, Usability, Interface, and data validation testing.
Responsibilities:
- Responsible for implementing and monitoring QA from the earliest stages of the SDLC; involved in requirements gathering and analysis
- Responsible for keeping track of issues and solutions from team members and maintaining lessons learned for every release
- Responsible for developing and executing test cases for System and Integration testing
- Tested XML transmissions and verified inbound/outbound XML using SOAP UI 3.0
- Developed and customized XML scripts to access and execute the operations embedded in the WSDL file
- Identified and replaced actual values for the application under test and selected the correct outgoing WSS configuration while running XML scripts in SOAP UI
- Modified endpoints when working on different servers and different versions of the web services
- Performed web-specific testing such as link checking, browser page testing, application testing, and security testing
- Installed and configured SOAP UI to test web services using the WSDL file provided by the development team
- Performed Functional, System, Regression, and UAT testing
- Developed standardized matrices and reports to keep track of team activities
- Created and maintained the Test and Defect control matrix for each testing phase
- Developed and presented Test approach, Test strategy and Test Plan
- Worked with the team in an Agile methodology to develop and execute test cases
- Wrote test cases and created test data for positive, negative, and boundary condition testing
- Provided support to the testing and UAT teams with tools and issues
- Logged and updated issues and risks using the configuration management tool JIRA
- Created documents such as the automation framework, testing cycle flow, defect management guidelines, demos, and presentations
- Resolved an assortment of complex problems related to Quality Assurance
- Maintained releases and cycles in Quality Center
- Wrote and executed SQL queries to get test data to validate standard and customized BI reports
- Tracked, reported, and followed up on defects using Quality Center 10.0
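Validating a BI report figure against backend data, as in the SQL bullet above, usually means recomputing the report's aggregate directly from the base tables and comparing. A minimal sketch follows; the table, columns, data, and the "report value" are hypothetical, with SQLite standing in for the project's Oracle database.

```python
import sqlite3

# Sketch of backend SQL validation of a BI report figure: recompute the
# aggregate straight from the base table and compare with what the report
# shows. All names and values here are hypothetical stand-ins.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE orders (region TEXT, amount REAL)")
cur.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("EAST", 100.0), ("EAST", 50.0), ("WEST", 75.0)],
)

# The figure the BI report displays for the EAST region (hypothetical).
report_value = 150.0

# Recompute the same aggregate directly from the database.
db_value = cur.execute(
    "SELECT SUM(amount) FROM orders WHERE region = ?", ("EAST",)
).fetchone()[0]

assert db_value == report_value, f"report shows {report_value}, DB has {db_value}"
print("BI report figure matches backend data")
```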
Environments: QTP 10.0, Quality Center 10.0, JIRA 8.0, TOAD 10, OBIEE 10g, Business Objects 8.0, Java, J2EE, SharePoint, ClearQuest, WebLogic application server, UNIX, MS Office, Oracle, Web Services, SOAP UI, XML
Confidential
Role: Sr. QA Analyst
Project Description: The migration project of the Department of Health and Human Services (DHHS) of Ohio migrates the old legacy system from the mainframe platform to a new Windows-based platform to make it more convenient, user friendly, and faster. The testing team performed functional testing, data conversion testing, data migration testing, system integration testing, end-to-end testing, and UAT support.
Responsibilities:
- Worked with Business Analysts to collect requirements to create test cases
- Participated in web services requirements and design document walkthroughs; analyzed and reviewed the software requirements, functional specifications, and design documents
- Created test plans and test cases based on web service requirements
- Performed web services testing using SOAP UI Pro: located the WSDL file, created test cases, and ran them for load testing and security testing
- Experienced in Groovy scripting in SOAP UI Pro (web service automation)
- Proficient in setting up test cases and test suites with the required assertions in SOAP UI to validate web services based on business requirements
- Proficient in field and error validation with regard to web services testing
- Updated WSDLs in SOAP UI as needed to retest defects or to test updated service requests
- Experienced in data-driven testing by creating data sources and data loops in SOAP UI Pro
- Performed test execution in SOAP UI and the GUI and recorded the test results in Microsoft Excel
- Conducted test execution/status meetings with project stakeholders
- Supported UAT testing and created defect guideline documents for users using Quality Center
- Conducted meetings and walkthroughs with users and junior team members to discuss the defect management process in Quality Center and ClearQuest
- Created integration test scenarios and tested application areas developed on SOA architecture and web services
- Generated and distributed test progress and defect status reports to the management team
- Facilitated defect review meetings, improved the existing defect management process, and drove the team to work faster by reporting defect status
- Wrote and used SQL queries to set up and validate test data
- Wrote and executed test cases for web app diagnostics, accessibility, and compliance validation based on government-defined processes and procedures using the tools
- Used the defect tracking tool HP Quality Center to trace, assign, verify, and close defects
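The SOAP UI Pro data-driven setup described above (a data source feeding a data loop, with assertions applied on each iteration) can be sketched in Python as below. The service, request shape, and data rows are hypothetical, and a stub function stands in for the deployed endpoint that the real test suites called.

```python
import xml.etree.ElementTree as ET

# Sketch of a SOAP UI Pro data source + data loop + assertions, in Python.
# The simulated service and all field names are hypothetical stand-ins.
REQUEST_TEMPLATE = """
<CheckEligibility>
  <ssn>{ssn}</ssn>
  <program>{program}</program>
</CheckEligibility>
"""

def fake_service(request_xml: str) -> str:
    """Stand-in for the real endpoint: echoes the program back as approved."""
    req = ET.fromstring(request_xml)
    program = req.findtext("program")
    return (
        f"<CheckEligibilityResponse>"
        f"<program>{program}</program><result>OK</result>"
        f"</CheckEligibilityResponse>"
    )

# The "data source": one row per test iteration.
data_source = [
    {"ssn": "111-11-1111", "program": "SNAP"},
    {"ssn": "222-22-2222", "program": "TANF"},
]

for row in data_source:
    request = REQUEST_TEMPLATE.format(**row)
    response = ET.fromstring(fake_service(request))
    # Assertions applied on every loop iteration, as in SOAP UI.
    assert response.findtext("result") == "OK"
    assert response.findtext("program") == row["program"]
print(f"{len(data_source)} data-driven service checks passed")
```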
Environment: Web Services, SOAP UI, XML, WebLogic application server, WSDL, .NET, Microsoft SQL Server 2008, SQL Server Management Studio, MS Office 2007, SharePoint, Subversion, Quality Center 9.6, J2EE, HTML, VB Scripts, JAWS, AccVerify
Confidential
Role: Sr. QA Analyst/Tester
Project Description: Verizon uses an Order Tracking System (OTS), an integrated information system that tracks customer orders for services from initial entry through completion. It consists of different modules: Sales, Order Entry, Credit Check, Consultative Services, Export, Install, Sales Support, Delete Order, Inventory, Shipping, External Orders, and Intl Ship Approval. Each module has specific functionality and multiple users. Throughout the business process, the three major modules, Sales, Order Entry, and Install, were the ones tested by the system testing team.
Responsibilities:
- Responsible for implementing and monitoring QA from the earliest stages of the SDLC; involved in requirements gathering and analysis along with Business Analysts and SMEs
- Developed and executed test plans and test cases based on business requirements of the application
- Performed automated testing for web applications
- Extensively used Quick Test Pro for functional and regression testing to automate multiple modules in the environment
- Tested compatibility of the application with Internet Explorer 6.0 and Netscape Navigator 5.0
- Reported defects found during test cycles, tracked defects, and retested fixed programs; developed and customized Excel reports and graphs using the Dashboard for reporting in QC
- Defined and developed a standard testing approach and timelines based on the given testing time frame
- Wrote and executed SQL queries to get test data to validate BI reports
- Developed, maintained, and executed test automation scripts in QTP using Visual Basic and set up test data needed for scripts to run in different environments
- Customized QTP, installed add-ins, and developed function libraries, recovery scenarios, and environment variables using QTP 9.0
- Tracked, reported, and followed up on defects using Quality Center 9.0
Environments: QTP 8.0, Quality Center 9.0, JIRA 6.0, TOAD 8.0, Oracle Client 11g, UNIX, ASP, VB Scripts, JDBC, Windows NT, SQL, Business Objects 6.0
Confidential
Role: QA Analyst
Project: Worker's Compensation Conversion. The New York state government keeps track of employers' and businesses' workers' compensation requirements based on rules and regulations. The state also provides the State Insurance Fund for the non-profit agencies of the state of New York.
Responsibilities:
- Followed the Scrum approach of the Agile software development methodology
- Developed and maintained functional, integration, platform, security, and regression test cases
- Collected requirements and use cases and developed test scenarios and test cases
- Created testing requirements in Test Director and mapped them to test cases developed in the Test Plan module of Test Director
- Executed the test cases and logged defects in Test Director
- Maintained configuration-related changes in the configuration management tool JIRA
- Worked with the release manager to prepare releases from the testing environment to PROD
- Worked closely with the deployment manager to complete the deployment plan and testing tasks
- Coordinated and executed backend jobs in UNIX and Sybase environments for test environment setup
Environments: Test Director 8.0, SQL Server 2005, Oracle 9i, XML, Java, Visual Basic, HTML, C, SQL Developer