QA/BA Analyst Resume
Coraopolis, PA
SUMMARY:
- Highly accomplished IT professional with nearly 8 years of experience in Software Testing and Quality Assurance, covering both manual and automation testing with HP Quality Center and Selenium.
- Expertise in the full testing life cycle: defining testing methodologies, designing test plans, setting up test environments, creating and executing test cases, defect tracking, test closure, and documentation, with effective QA implementation in all phases of the software development life cycle (SDLC).
- Performed Functional, Non-Functional (load/stress/performance), System Integration (SIT), User Acceptance (UAT), Regression and Automation testing on the front end and back end of applications.
- Active contributor to the design, development, testing, troubleshooting and debugging of processes, as well as post-implementation support, application maintenance and enhancement support to clients for the product/software application.
- Improve quality and reduce defects by promoting quality best practices and implementing effective process improvement methods.
- Experience in implementing Agile testing process and techniques with SCRUM methodology in testing projects.
- Experience includes testing of Client/Server, ETL, SOA and Web applications using manual and automated testing tools such as HP Quality Center.
- Experience in back-end database testing on Oracle, SQL Server and Teradata, including complex query writing.
- Highly proficient in different phases of Testing like System, Module and Integration Testing
- Skilled at performing both manual and automation testing of client-server applications, web-based applications and API tools
- Adept in performing Interface, Functional, GUI, Database, Performance, Regression, Volume, Stress and Security testing
- Good experience in Black Box testing. Well versed in performing Backend Testing to verify data integrity.
- Good working knowledge and exposure to Quality Models - ISO, SEI-CMM, CMMI.
- Successfully researched and implemented manual and automated test tools to suit the client’s best practices and needs.
- Excellent experience conducting system, integration, functional, regression and user acceptance testing, as well as Web services testing using XML.
- Strong experience in analyzing business requirements and designing test scenarios that ensure proper coverage of requirements.
- Strong knowledge of documentation handling: test plans, test cases, test reports, user manuals and guidelines for the automation test process.
- Experience in guiding/mentoring the test team as a Team Coordinator and interacting with clients, business analysts, the automation team and the development team.
- Build and mentor highly motivated project teams focused on achieving project and organizational goals
- Cohesive team player with a fast learning curve and strong analytical, problem-solving, innovation, planning, organizational, communication and interpersonal skills
COMPETENCY MATRIX:
- Software Testing Methodologies
- Project Management
- Software Testing
- Test Plan Estimation & Development
- Test Case Execution & Review
- Testing & Analysis
- Writing & Executing SQL Queries
- Defect Reporting
- Escalation Handling
- Client Interaction
- Software Quality Assurance
- Debugging & Troubleshooting Skills
TECHNICAL PROFICIENCY:
Languages: SQL, XML
Operating Systems: Windows, UNIX, Linux
Databases & Database Tools: MS Access, Oracle, SQL Server, TOAD for Oracle, Netezza, SQL Developer, Teradata 13.10
Testing Tools: HP Quality Center, Test Director, HP Load Runner, BugZilla, JMeter, Selenium
Reporting Tools: Microsoft SSRS
Packages: MS Office (Word, Access, Excel, FrontPage), MS Project, Target Process Management, MS SharePoint, Informatica PowerCenter 6.2, Microsoft TFS
PROFESSIONAL EXPERIENCE:
Confidential, Coraopolis, PA
QA/BA Analyst
Platform used: Windows 7
Framework: JAVA, .NET, ORACLE
Tools used: SQL Server 2005, MS Office, XML, Microsoft TFS, Selenium, JMeter
Responsibilities:
- Preparing test cases to cover all possible scenarios such as configuring clients, making payments, searching for past transactions and printing confirmation receipts.
- Regression testing all scenarios with every release to ensure that all defects have been fixed and current functionality works as intended.
- Analyzing and preparing the FRD to cover all client requirements and ensure that all development and testing is done as per the FRD
- Participating in daily stand ups with the project team to discuss testing activities and updates
- Load testing the application in JMeter with the maximum expected user load to verify that the application does not break and that transactions complete successfully
- Recording defects found in Microsoft TFS and communicating them to the offshore/onsite development team to ensure that they are fixed in the next build
- Running SQL queries in SQL Server to validate that the relevant tables are populated with the correct transaction information
- Automating and executing Selenium test scripts on new builds to configure clients and make payments (illustrated in the sketch after this list)
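Illustrative example: a minimal sketch of the kind of Selenium regression script referenced above, written in Java with Selenium WebDriver. The URL, locators and expected confirmation text are hypothetical placeholders, not the actual client application.

import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;

// Sketch of a payment regression check: log in, submit a payment, verify the confirmation.
// All locators and the base URL are hypothetical placeholders.
public class PaymentSmokeTest {
    public static void main(String[] args) {
        WebDriver driver = new ChromeDriver();
        try {
            driver.get("https://payments.example.com/login");          // hypothetical URL
            driver.findElement(By.id("username")).sendKeys("qa_user"); // hypothetical locators
            driver.findElement(By.id("password")).sendKeys("********");
            driver.findElement(By.id("loginButton")).click();

            driver.findElement(By.linkText("Make Payment")).click();   // hypothetical flow
            driver.findElement(By.id("amount")).sendKeys("125.00");
            driver.findElement(By.id("submitPayment")).click();

            String confirmation = driver.findElement(By.id("confirmationMsg")).getText();
            if (!confirmation.contains("Payment successful")) {
                throw new AssertionError("Expected payment confirmation, got: " + confirmation);
            }
        } finally {
            driver.quit();
        }
    }
}

In practice such scripts run against every new build as part of the regression cycle described above.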
Confidential, North Bergen, NJ
QA Engineer
Platform used: Windows 7, Windows XP
Framework: JAVA, HTML, ORACLE
Tools used: SQL Server 2005, MS Office, XML, TOAD for Oracle, HP ALM Quality Center, ServiceNow
Responsibilities:
- Prepared test plans and test cases in MS Excel and discussed with BA and QA Lead to ensure that all test scenarios are covered as per FRDs
- Executed developed test cases and reviewed the result with QA team for accuracy and completeness at regular intervals.
- Performed regression/smoke testing of the basic modules in every build, such as sales tender (cash, credit, check), returns & exchanges, send sales, gift items, item lookup and customer capture, verifying that all modules work as outlined in the FRD.
- Tested the Inventory modules, such as cartons received, cartons sent to the warehouse as defective returns, inventory adjustments, and mispicks & misships, and verified in the database tables that all modules work as outlined in the FRD.
- Ran SQL queries in Toad to retrieve data from the POS tables for testing purposes (illustrated in the sketch after this list).
- Executed register and store open/close functions and confirmed that the amounts match and the safe is in balance when XStore is both opened and closed.
- Recorded defects found while testing the POS and Inventory modules in Excel and ServiceNow and escalated them to MICROS for immediate resolution.
- Backed up XStore and miStore logs (POS logs) and performed DB backups to ensure data integrity and consistency.
- Provided QA production support, assisting live stores with issues such as processing mispicks and misships, receiving inventory in stores, and tender-transaction problems such as receipts not printing or returns not being processed.
- Monitored the business team during execution of UAT scenarios and assisted with any issues faced while testing the POS and Inventory modules.
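Illustrative example: a minimal sketch, in Java via JDBC, of the kind of back-end validation described above. The actual queries were run interactively in Toad; the connection string, table and column names here are hypothetical placeholders.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

// Sketch of a back-end check: confirm a tender transaction landed in the POS tables.
// Connection details, table and column names are hypothetical placeholders.
public class PosTransactionCheck {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:oracle:thin:@dbhost:1521:POSDB";            // hypothetical connection
        try (Connection conn = DriverManager.getConnection(url, "qa_user", "qa_pass");
             PreparedStatement ps = conn.prepareStatement(
                     "SELECT tender_type, tender_amount FROM trn_tender WHERE trans_id = ?")) {
            ps.setLong(1, 1001234L);                                    // transaction under test
            try (ResultSet rs = ps.executeQuery()) {
                if (!rs.next()) {
                    throw new AssertionError("No tender row found for transaction 1001234");
                }
                System.out.println(rs.getString("tender_type") + " " + rs.getBigDecimal("tender_amount"));
            }
        }
    }
}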
Confidential, Pittsburgh, PA
Quality Engineer
Platform used: Windows 7
Framework: JAVA, C#, HTML, ORACLE 11g
Tools used: MS Excel, MS Word, Oracle, SQL Developer, XML, PL/SQL Developer, Internet Explorer 8, HP ALM Quality Center, MS SharePoint
Responsibilities:
- Prepared test plans and test cases in Excel and imported them to HP ALM Quality Center.
- Triggered outbound events in the form of XMLs sent out to third party integrators like PartnerConnect containing information related to appraisals of properties throughout the US.
- Validated the outbound XMLs by checking the tags/elements/values in them and cross-referencing them against the mapping document prepared by the Business Analyst (illustrated in the sketch after this list).
- Extracted data from database tables in PL/SQL Developer using SQL Joins corresponding to the various outbound events triggered in VISION.
- Managed the offshore team and monitored all development and testing activities being carried out to ensure 100% mapping of business requirements.
- Led daily SCRUM meetings with the offshore project team as SCRUM master to discuss what was completed the previous day and what was planned for the current day.
- Uploaded appraisal reports to the vendor website and generated MISMO XMLs in order to complete appraisal orders in VISION that came in from PartnerConnect and were originally generated by Wells against consumer loans.
- Edited and posted XMLs to the WESB database component in PL/SQL to create new orders in VISION and verified that order confirmation events were sent to the client successfully.
- Recorded defects in HP ALM Quality Center and followed up with the development team to ensure swift resolution of those defects.
- Conducted daily stand-up meetings with the third-party integrators, and met with Wells three times a week, to confirm that all outbound events work and that the information sent in the XMLs is valid and per business requirements.
- Managed all configuration settings in VISION to ensure that all XMLs are generated properly and that all tags/elements are mapped to their respective outbound events and triggered successfully.
- Demonstrated the business features by executing scenarios during UAT (user acceptance testing).
- Executed the developed test cases, maintained test data for all executed test cases, and reviewed the results of the entire QA team with the Project Manager and AVP for accuracy and completeness at regular intervals.
- Mapped test cases back to business requirements by creating the Requirement Traceability Matrix in HP ALM Quality Center.
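Illustrative example: a minimal sketch, assuming a DOM/XPath approach in Java, of validating an outbound XML against expected element values. The file name, element names and expected values are hypothetical stand-ins for the fields defined in the Business Analyst's mapping document.

import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.xpath.XPath;
import javax.xml.xpath.XPathFactory;
import org.w3c.dom.Document;
import java.io.File;

// Sketch of an outbound-XML check: confirm key elements carry the values defined
// in the mapping document. Element names and values are hypothetical placeholders.
public class OutboundXmlCheck {
    public static void main(String[] args) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new File("order_completed_event.xml"));   // hypothetical outbound event
        XPath xpath = XPathFactory.newInstance().newXPath();

        String orderId = xpath.evaluate("/AppraisalEvent/OrderId", doc);
        String status  = xpath.evaluate("/AppraisalEvent/Status", doc);

        if (orderId.isEmpty() || !"COMPLETED".equals(status)) {
            throw new AssertionError("Mapping mismatch: OrderId=" + orderId + ", Status=" + status);
        }
        System.out.println("Outbound event matches the mapping document for order " + orderId);
    }
}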
Confidential, New York, NY
Senior SQA Engineer
Platform used: Windows 7, Windows Server 2008, Linux VM
Framework: JAVA, C#, ORACLE 11g
Tools used: MS Excel, MS Word, Oracle, SQL Developer, XML, Load Runner, BugZilla, Internet Explorer 8, VM Server, Teradata, JMeter
Responsibilities:
- Prepared test plans, test cases and test strategies.
- Extracted data from database tables using Oracle and SQL Developer corresponding to different workloads for use in testing the forecast and prediction features.
- Successfully prepared and executed tests based on business/user requirements, system specifications and screen mock-ups.
- Set up the QA and customer environments on various Linux VMs for testing the builds
- Conducted bi-weekly status update meetings with the BezNext team abroad to report on testing activities and the progress made each week.
- Installed the product builds and executed regression/performance testing based on test cases and test scripts created in Excel and Load Runner.
- Created and executed test scripts in LoadRunner VuGen and generated reports in LoadRunner Analysis for the different user loads applied to the BEZNext application.
- Recorded defects in BugZilla, escalated them to the developers for immediate fixes, and executed another regression cycle to verify the fixes.
- Defect management: documented issues/defects, generated reports, analyzed defects and handled escalations.
- Performed Web services testing using XML (illustrated in the sketch after this list).
- Coordinated monthly meetings with the CTO to report the status of overall progress for that month and addressed any issues being faced by QA team for carrying out their daily activities.
- Demonstrated the business features by executing scenarios during UAT (user acceptance testing).
- Executed the developed test cases, maintained test data for all executed test cases, and reviewed the results of the entire QA team with the test lead/test manager for accuracy, completeness and usability at regular intervals.
- Mapped test cases back to business requirements by creating the Requirement Traceability Matrix
- Participated in the mandatory yearly audit with the team by preparing & collecting from various departments the required ISO 9001:2008 documents such as the Quality System Manual, Quality Policy and other business process/technical documents.
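Illustrative example: a minimal sketch of an XML web-service check in Java using HttpURLConnection, posting a request and asserting on the response. The endpoint, payload and expected response element are hypothetical placeholders, not the actual BEZNext interface.

import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.Scanner;

// Sketch of a web-service test: POST an XML request, then verify the response status and body.
// The endpoint and payload are hypothetical placeholders.
public class ForecastServiceTest {
    public static void main(String[] args) throws Exception {
        String request =
                "<ForecastRequest><workloadId>WL-100</workloadId></ForecastRequest>";   // hypothetical payload
        HttpURLConnection conn = (HttpURLConnection)
                new URL("http://qa-server:8080/forecast").openConnection();             // hypothetical endpoint
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "text/xml");
        conn.setDoOutput(true);
        try (OutputStream out = conn.getOutputStream()) {
            out.write(request.getBytes(StandardCharsets.UTF_8));
        }

        String body;
        try (Scanner sc = new Scanner(conn.getInputStream(), "UTF-8")) {
            body = sc.useDelimiter("\\A").hasNext() ? sc.next() : "";
        }
        if (conn.getResponseCode() != 200 || !body.contains("<ForecastResponse>")) {
            throw new AssertionError("Unexpected response: " + conn.getResponseCode() + " " + body);
        }
        System.out.println("Web service responded with a valid forecast payload.");
    }
}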
Confidential, Reston, VA
Senior SQA Engineer
Platform used: Windows XP, Windows 7, Windows Server 2003
Framework: .NET, JAVA, PHP
Tools used: Target Process Management, MS Excel, HP Quality Center, MS Word, SQL Server, SQL, API Testing, MS SharePoint, XML, Agile, SCRUM
Responsibilities:
- Created and documented Project Test Plan with Test Lead after reviewing Business Requirement and other necessary documents.
- Prepared test plans, test strategies and streamlined QA processes
- Managed QA resources and monitored the progress of testing activities being carried out to ensure 100% mapping of business requirements
- Performed both API and Web based testing to ensure that the application works in both modes
- Conducted and led daily SCRUM meetings with stakeholders and the project team as SCRUM master to discuss what was achieved the previous day and what was planned for the current day.
- Carried out testing for Sprint releases every month, divided into two Sprints 15 days apart.
- Validated and uploaded test data created into the application to ensure that managers are able to view qualification reports, postage statements and various other reports
- Efficiently developed testing estimations and schedules for the project to be tested
- Reviewed functional and technical documentation in order to identify requirements for the creation of test plans, test cases and test scripts
- Created test data for usage in testing different modules on the AIMS application
- Involved in testing the software to ensure proper operation and freedom from defects
- Validated and uploaded the created test data to USPS’s PostalOne website to help end users view qualification reports, postage statements and various other reports
- Coordinated stand-up meetings with other testers and escalated issues to the Project Manager as needed
- Performed complex SQL joins across 3-4 tables and used the retrieved data for testing and test data creation
- Tested the appointment-scheduling feature (Shipment Assurety) of the AIMS application by creating and sending appointments in XML to USPS, ensuring that end users (mailers) can successfully schedule appointments with USPS to pick up their mail and ship it to the required destination (illustrated in the sketch after this list)
- Performed testing of the web-based application through API utilities developed by the software engineers
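Illustrative example: a minimal sketch of building an appointment request XML in Java with the standard DOM API before it is sent to USPS. The element names and values are hypothetical placeholders; the real message layout is defined by the USPS appointment-scheduling interface.

import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.dom.DOMSource;
import javax.xml.transform.stream.StreamResult;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import java.io.StringWriter;

// Sketch of constructing an appointment request payload. Element names and values
// are hypothetical placeholders, not the actual USPS schema.
public class AppointmentXmlBuilder {
    public static void main(String[] args) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder().newDocument();

        Element appt = doc.createElement("AppointmentRequest");
        doc.appendChild(appt);

        Element mailer = doc.createElement("MailerId");
        mailer.setTextContent("MID-12345");                  // hypothetical test mailer
        appt.appendChild(mailer);

        Element slot = doc.createElement("PreferredSlot");
        slot.setTextContent("2012-06-15T10:00:00");          // hypothetical appointment slot
        appt.appendChild(slot);

        StringWriter out = new StringWriter();
        TransformerFactory.newInstance().newTransformer()
                .transform(new DOMSource(doc), new StreamResult(out));
        System.out.println(out);                             // payload passed on to the scheduling interface
    }
}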
Confidential, Columbus, OH
QA Analyst
Platform used: Windows XP
Framework: JAVA, .NET
Tools used: MS Office, MS SharePoint, HP Quality Center, QTP, Web Testing, Windows Media Player, Internet Explorer 8, SQL
Responsibilities:
- Created and executed test scripts traced to requirements to validate the quality of the work products, always keeping the customer experience in mind
- Identified and recorded defects in Quality Center.
- Communicated with the Design team to sort out technical issues and forwarded them to the Dev team once the issues/queries were resolved
- Performed defect management and was responsible for preparing and maintaining the metrics
- Organized status meetings and sent the status reports (daily, weekly, etc.) to the client.
- Involved in regression testing using QTP; enhanced the scripts by adding functions and inserted synchronization points into the scripts whenever necessary (the idea is illustrated in the sketch after this list).
- Coordinated the administration and management of bugs in Quality Center
- Assisted in test configuration management activities for moderately complex work efforts
- Worked with vendors and business partners to address ambiguity and provide clear, concise requirements
- Created test scripts in Quality Center and mapped them to maintain the Traceability Matrix
- Analyzed test plans and test cases based on requirements and general design documents; involved in both manual and automation testing
- Supported and coordinated testing activities which includes review of test cases and test conditions
- Conducted meetings and walkthroughs and updated test plans and scripts for any functionality changes
- Executed analysis of the defects identified during the testing process
- Detected bugs/defects during execution of scenarios, logged them in the defect tracking tool and worked with the assigned developer through closure of the bug life cycle.
- Prepared testing summary reports, monitored progress during each test to ensure that the solution is validated and on time and that it meets or exceeds expectations
- Assisted in test configuration management activities
- Performed both manual and automated testing activities
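Illustrative example: the synchronization points mentioned above were added to QTP (VBScript) scripts; as an analogous sketch of the same idea in Java with Selenium, an explicit wait pauses the script until the application reaches an expected state instead of relying on fixed sleeps. The locators and URL are hypothetical placeholders, and this is a concept sketch rather than the original QTP code.

import java.time.Duration;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.openqa.selenium.support.ui.ExpectedConditions;
import org.openqa.selenium.support.ui.WebDriverWait;

// Concept sketch: a synchronization point blocks until the page reaches a known state.
// Locators and the URL are hypothetical placeholders.
public class SynchronizationPointExample {
    public static void main(String[] args) {
        WebDriver driver = new ChromeDriver();
        try {
            driver.get("https://app.example.com/orders");               // hypothetical URL
            driver.findElement(By.id("searchButton")).click();

            // Synchronization point: wait up to 30 seconds for the results table to appear.
            new WebDriverWait(driver, Duration.ofSeconds(30))
                    .until(ExpectedConditions.visibilityOfElementLocated(By.id("resultsTable")));

            System.out.println("Results loaded; the script can safely continue.");
        } finally {
            driver.quit();
        }
    }
}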
Confidential, Indianapolis, IN
QA Engineer
Platform used: Windows XP, UNIX, Windows Server 2003
Framework: .NET, PHP
Tools used: HP Quality Center, MS Project, MS Office, SQL, Toad for Oracle, Informatica PowerCenter, SSRS
Responsibilities:
- Reviewed functional and technical documentation in order to identify requirements for the creation of test plans, test cases and test scripts.
- Reviewed test cases in the repository against the use cases, high-level design, low-level design and project test plan for correctness and comprehensiveness, and ensured they matched project requirements.
- Performed back-end testing for all three phases of the project involving Oracle SQL data
- Developed high level test scripts and test cases for the different stages of the project using Quality Center Test Plan module
- Performed front end testing on SSRS reports containing data coming from the Oracle databases
- Created and executed SQL queries to validate that actual test results match the expected results
- Manipulated and created test data in TOAD for Oracle and TOAD for Data Analyst (Netezza) for the purpose of testing in the various phases of the project
- Developed solid test documentation, including the creation and maintenance of comprehensive Test Plans and Test Cases
- Performed complex SQL joins on 3-4 tables using SQL statements and utilized the data retrieved for testing purposes
- Executed various shell scripts in UNIX to trigger the workflows that truncated and loaded data into the Oracle tables viewed in TOAD.
- Successfully maintained constant conversation with Senior Management, Project Managers, Developers and Clients in regard to enhancements and fixes to applications, in order to deliver projects in a timely and cost effective manner.
- Created SIT Test Summary Report, UAT Test Summary Report, Regression Test Summary Report and Performance Testing Test Summary Report.
- Employed the Informatica PowerCenter tool to monitor workflows and verified that data was loaded into the target tables (illustrated in the sketch after this list)
- Documented defects in Quality Center and worked with the developers to resolve them
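Illustrative example: a minimal sketch, in Java via JDBC, of a post-load reconciliation check of the kind described above: after a workflow truncates and reloads a target table, its row count is compared against the source. The connection string and table names are hypothetical placeholders; the actual checks were run as SQL in TOAD.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

// Sketch of a source-vs-target reconciliation after a workflow load.
// Connection details and table names are hypothetical placeholders.
public class LoadReconciliationCheck {
    private static long countRows(Connection conn, String table) throws Exception {
        try (Statement st = conn.createStatement();
             ResultSet rs = st.executeQuery("SELECT COUNT(*) FROM " + table)) {
            rs.next();
            return rs.getLong(1);
        }
    }

    public static void main(String[] args) throws Exception {
        String url = "jdbc:oracle:thin:@dbhost:1521:DWDB";              // hypothetical connection
        try (Connection conn = DriverManager.getConnection(url, "qa_user", "qa_pass")) {
            long source = countRows(conn, "stg_orders");                 // hypothetical source table
            long target = countRows(conn, "dw_orders");                  // hypothetical target table
            if (source != target) {
                throw new AssertionError("Row count mismatch: source=" + source + ", target=" + target);
            }
            System.out.println("Workflow load reconciled: " + target + " rows.");
        }
    }
}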
Confidential
QA Analyst
Platform used: Windows 2000, Windows XP
Framework: .NET, PHP
Tools used: MS Office, HP Quality Center, Internet Explorer 7, SQL, MS SharePoint, Agile, SCRUM
Responsibilities:
- Analyzed business requirements and developed a roadmap to accomplish testing
- Worked closely with the development team on a day-to-day basis to test and deploy tasks/projects and content in a fast-paced, dynamic internet environment
- Performed the User Acceptance Testing (UAT)
- Estimated and scheduled testing efforts for the project
- Participated in daily SCRUM meetings with stakeholders and project team to discuss what was achieved yesterday and what we plan to achieve today.
- Carried out testing for Sprint releases every month, divided into two Sprints 15 days apart.
- Administered, customized, and maintained the Test Director environment
- Managed project requirements and traceability using Test Director
- Developed Requirements Traceability Matrix (RTM)
- Documented test cases for the application in Mercury Quality Center, reported defects in Mercury Quality Center, and followed up until the defects were fixed.
- Created testing scenarios (positive/negative), test cases/test scripts, the Requirement Traceability Matrix and other test artifacts.
- Successfully maintained constant conversation with Senior Management, Project Managers, Developers and Clients in regard to enhancements and fixes to applications, in order to deliver projects in a timely and cost effective manner.
- Developed high level test scripts and test cases for the different stages of the project using Quality Center Test Plan module.
- Understood the functionality and scope of the application
- Investigated application bugs through the bug tracking system, interacted with developers to resolve technical issues and reported the bugs
- Worked with the database to check whether data is stored correctly
- Documented test cases based on the corresponding user requirement documents, technical specifications and other operating conditions
- Used Test Director for test planning and execution and to create a structured workflow of test cases
- Reported defects through a bug tracking system and coordinated the deployment of fixes and modifications with the development team
- Connected to Toad for Oracle and SQL Server and verified the Backend Data Integrity