Professional Objective: To obtain a challenging position as a Senior Quality Assurance Analyst where my extensive experience managing offshore teams and validating web-based applications across platforms can be applied to deliver solid test validation.
Confidential, Phoenix, AZ
Associate QA Lead
- Responsible for the management of India Offshore team in an Agile environment.
- Disseminating work to team members after Sprint Planning.
- Attended daily stand-ups; facilitated stand-ups when the Scrum Master was unavailable.
- Verified that user stories and bugs were correct.
- Performed housekeeping of TFS board; solved team issues such as third-party application assistance.
- Performed Onshore and remote test validation (only QA) on the following application solutions: Care Management, Juvo, Juvo-Statit Integration, Comply and Statit.
- Used internal tools including: Support Log, Support Center, COW (Client Only Website), COMs (Client Operations Management Systems) and proprietary applications.
- QA-to-Development ratio: 1 QA onshore, 5 QA offshore; 2 developers onshore, 4 developers offshore.
- Support Applications used: TFS v2015, Microsoft Word, Excel, Notepad and others.
- Researched, gathered and validated quality assurance methodologies, processes and procedures.
- Created guide for standardized quality assurance procedures.
- Standardized consultant responsibilities, including back-channel negotiations.
- Mentored Arizona State University interns on client preparedness and workplace etiquette.
- Relayed client's onboarding process to prospective consultants when applicable.
- Used role-playing scenarios to instruct QA consultants in project procedures and in managing engagements when clients had no processes or requirements in place.
- Modeled appropriate ways for quality assurance consultants to work with developers.
- Instructed university interns in building contingency plans, creating test plans and writing test script content.
- Demonstrated effective use of Microsoft Test Manager, Jira and other Test Management tools
- Presented workshops: How to Effectively Perform Manual Testing, Selenium IDE for Firefox, SQL for Beginners, SoapUI, Effective ( Confidential ) Daily Stand-up Procedures, IoT (Internet of Things), Working in Scrum and Agile Environments, Using Customized Salesforce Applications, and others.
- Project Management - managed time and tasks, captured team concerns/feedback, identified risks and issues, met deadlines, and monitored non-development assets such as website themes, images, databases and QA test statuses.
- Supported team members remotely with SoapUI, flat files and data input into Applications
- Validated Salesforce integration with Informatica Cloud, including API validation.
- Entered bugs in TFS, validated as fixed, performed regression tests, worked on Product Backlog items
- Test case development in TFS
- Delineated a matrix of responsibilities, created manually.
QA Team Test Engineer
- Attended mid-morning ( Confidential ) with Manager and QA Team
- Validated new features and changes to the builds, all with Java components
- Performed cross-browser testing using Microsoft IE, Chrome, Firefox and Safari.
- Used web debugging tools including: Firebug, Web Developer Toolbar and Web Accessibility Toolbar
- Participated in shoulder checks, or demonstrations of new features
- Created new guests, walk-ins, groups and company profiles for test data
- Validated that the hotel dashboard functioned after the site loaded with the correct business date.
- Prepared test case reviews using Microsoft Excel, Rally and Jira
- Used GitLab and GitHub as backup repositories when there were issues with Rally.
- Generated Steps to Reproduce that included the AUT (Application Under Test), reproduction steps, Actual Results and Expected Results.
- Verified pre-condition steps, validation input and validation expected results in Rally for test cases during regression testing
- Verified found defects in other test environments (with various properties) and in production to determine whether the defect(s) existed there as well.
- Found defects, reproduced each defect three times, wrote up steps to reproduce, and entered them into the Rally and Jira systems.
- Performed root cause analysis on bugs and web application failures
- Verified hotel GPS and geo-coordinates for a new daylight-saving-time application feature.
- Reviewed and validated defects fixed by offshore teams
- Processed defects and other annoyances: captured system, HTTP, HTTPS, database and other logs, then used forensic log findings for traceability back to the defect, issue or event.
- Certified test cases and defects for the property management applications for domestic and international clients.
- Interacted with Team Leads, Management, Business Analysts, Project Leaders and other QA team members.
- Project Management - managed time and tasks, identified risks and issues, met deadlines
- Validated then certified property management backlog defects for Skytouch, Confidential and Choice Advantage; Hotel content validation
- Participated in UAT
- Validated and certified the (ATF) automation team's script issues and defects.
- Substantiated test cases as viable with SQL queries.
- Installed AppDynamics on test boxes and performed a top-down test to make sure it worked with the (AUT) Application Under Test.
- AppDynamics was used (on separate test environment) to monitor performance after builds for viability
- All issues with the (AUT) Application Under Test and AppDynamics were root-cause inspected, then results were sent to QA Managers with entries in Confluence for other team members to view and comment on.
- Performed mobile testing and browser checks using BrowserStack for responsive testing, ensuring web content displayed consistently across devices.
- Used Salesforce Chatter to communicate testing statuses and other information to team members
- Big/INT data validation with Beta applications
- Demonstrated leadership behaviors: required accountability, exhibited integrity, acted as owner, drove innovation, collaborated with system, customers and stakeholders, and developed productivity of self and others
- Performed manual testing in a Scrum/Kanban environment
- Test Methodology: effective Agile processes (Scrum/Agile hybrid/Kanban) with cross-functional communication, including participation in Retros, backlog grooming and Sprint Planning.
- Daily interactions with the Scrum Master and daily stand-ups; burned testing tasks down and up when events changed.
- Interactions with QA Manager, other Team Members, Product Owners, Business Owners, The Business & Developers
- Project Management - worked on legacy, new and specialized projects; held team member meetings to gather feedback, statuses and workload updates, and addressed any issues.
- Designed test scripts (Test Manager 2013), scenarios, procedures and best practices
- Executed test cases
- Documented software/hardware defects using Team Foundation Server (TFS), then reported defects/bugs and incidents to the QA Manager/Programmers.
- Performed top down approach/Exploratory testing
- Validated current sprint stories, created testing and unplanned tasks in Microsoft TFS
- Validated all Sprint items in 6 environments: 2 test sandboxes, 3 test environments and 1 live environment.
- Created, validated and maintained test scenarios for SIS modules, SPED, ELL, IEP (504 Plans) and IEP-for-ELL assessments.
- Applications were test-validated on Java, .NET and C# platforms.
- Validated 80/20 special projects
- Validated web applications, from legacy to new applications written using KnockoutJS.
- AppDynamics was installed and then used to validate performance on new, legacy and 80/20 applications.
- AppDynamics tests included installation, application, environment, infrastructure compatibility, comparison when turned on/off, performance and some end-to-end validation
- All AppDynamics issues were reported to AppDynamics, Management and Infrastructure.
- Validated software/system modifications to prepare for implementation
- Validated legacy applications and new features
- Responsible for build process after fixes and/or other incidents
- Documented test procedures to ensure accuracy and compliance with requirements
- Identified, analyzed and documented problems with user experience, program functions, output, on-screen displays and content.
- Used Firebug to capture errors (stack traces) when application was under test using Firefox
- Used web debugging tools: Firebug, Web Developer Toolbar, IE Tester, Windows Magnifier, Xenu's Link Sleuth, YSlow, Fangs, Web Accessibility Toolbar.
- Cross browser validation for IE, Firefox and Chrome
- SharePoint 2013 web application testing of Employee PTO pages, custom QA Requirements pages and others.
- Used Evernote for project planning and management (depending on the project).
- Verified code after refactoring
- (Mobile Testing) On (LMS) Learning Management System Team - QA validation of Course application on mobile devices (tablets-Apple/Android/various sizes and Phones-Apple/Android)
- Drove a process-maturity effort, validating that all tests, requirements, etc., were in order.
- Defect Management - monitored defect resolution efforts and tracked successes (fixes); if a programmer's fix failed, the issue was sent back. Also found defects, recorded them in TFS and validated UI usability of the applications.
- Root Cause Analysis on bugs and other noise - captured system, HTTP, database and other logs, then used those forensics for traceability back to the defect/issue/event(s).
- Root Cause Analysis after web application failures.
- Wireframes/Prototype Analysis Testing (when applicable)
- Participated in product design reviews to provide input on user functional requirements, product designs, scheduling and usability requirements, and outlined potential problems.
- Reviewed software documentation to ensure technical accuracy, compliance and completeness
- Used GitHub repositories for defect tracking, version control, source code, feature requests and task management, and to host 80/20 software projects.
- Validated CALPADS (California Longitudinal Pupil Data) with a controlled version number in a text file, as requested by client requirements.
- Participated in some UAT
- Some Web Services validation, storage management and limited services
- For the Arizona Department of Education (AZEDs) - verified API connections to the state test environments using Swagger (a GitHub-hosted framework for APIs) and database triggers that populated data points; validated that their scenarios document matched the SIS.
- QA validation with the Chief Infrastructure Architect on gated check-in, code and Swagger-related APIs.
- Verified DocuSign document completeness in the (SIS) Student Information System application: uploading of documents, signatures, and correct receipt of documents to parent/student emails.
- Developed and specified standards and methods to determine application quality and release readiness.
- Investigated customer/client problems when referred by Tech Support and the Help Desk.
- Installed and maintained software updates to COTS and other systems
- Performed manual build process (version control) to test environments
- Verified the gated check-in build process to validate changes.
- Used a (CI) Continuous Integration tool to validate the software configuration management process.
- Provided feedback and recommendations to programmers on software usability and functionality
- Used AppDynamics to monitor application performance and ensure efficient, problem-free operation. When there was a problem, it was reported to the QA Manager/Programmers in person and via Skype.
- Demonstrated the ability to adapt to shifting priorities, ambiguity and incidentals.
- Leadership Behaviors: Accountability, Integrity, Acted as Owner, Drove Innovation, Collaborated with System, Customers and other Stakeholders, Inspired others and Developed Self & others.
Confidential, Atlanta, GA
Quality Control Billing Analyst
- Automated Test Tool: AutomatedQA's Test Complete v8.50 & v8.60. (Enterprise and Standard versions). During this test iteration, tool was used for Functional, Unit, Data-Driven, Manual, Regression, Web, Java, HTTP Load, and .NET testing. The IE Development Diagnostic tool was also used.
- (AMR - Automatic Meter Reading device) - Test-validated diagnostics, analytics, real-time consumption and status data from energy metering devices such as electric and gas, then verified the transfer of data to the central database for billing, analysis and troubleshooting. Device setup included handheld, mobile and network Confidential based on telephony platforms such as wired and wireless, radio frequency (RF) and power-line transmission. Performed a cost comparison of this remote effort versus the expense of trips to a physical location to read a meter. The utility providers received the results from these tests.
Confidential, John's Creek, GA
Test Advocate (Lead)/Manager of Offshore Team (Consultant)
- Managed 6 members of the offshore team.
- Responsible for obtaining updates on the offshore team's overnight testing while the onshore team was dark.
- Made sure offshore results matched those of the onshore team.
- Owned and Managed the Quality Control process.
- Developed, Managed and executed test plans, cases, scripts and schedules.
- Ensured the completion of functional/component, system, cross-system integration, regression, performance, stress, and user acceptance testing.
- Used the Microsoft Project management tool; collected feedback on all projects from the team and during solo project outings; during the project cycle, ensured all team members were contributing by holding daily meetings on their statuses; received updates during the night and early morning; validated that all team members met deadlines and addressed any lingering items; performed UAT; validated the message framework for Marketing; and was responsible for reserving and allocating test environment resources and data to support the test scripts.
- Ensured that quality requirements were defined and agreed on, and determined resource requirements to support QC tasks. HP QTP (IE11) was used for some of the projects; manual testing was done on others.
- Validated (all Mission Critical Track Test Areas) Optima UI, Global settings, Standard Product Hierarchy, Price Rules, Price Schemes, Optima user (Day 1 System Owner), Standard Location Hierarchy, Standard Time Hierarchy, Optima User (Day 2 Price Manager), Plan Types (D,A,B,X,M), Product Review Attributes, Attributes, Plan Creation Attributes, Optimization, STYMVE, Compelling Price Points, Price Ladders, Price Override, Optima Budget, Approve Markdowns, Return Products, Vendor Influenced, Manual Approvals, Forecasting, Customized Columns, Plan Summary and Plan Weekly.
Confidential, Tempe, AZ
- Smoke tests were done on all new builds, and stakeholders were advised on whether the application and/or module was ready to be released; this included UAT testing. Applications sat on .NET and Java environments.
- Automated Test Tools: originally, there was no automated test tool. For additional validation, I introduced Selenium Sauce IDE, a Firefox add-on, to test various release conditions. Molybdenum v0.7.3, a Firefox add-on for cross-browser Web UI testing, was also used. Selenium (RC) Remote Control (with Ruby 1.9.2-p0) was used with the Microsoft IE v8 browser. TestLink v1.9.0 was used in the beginning; in later test cycles, Seapine Software TestTrack TCM v2011.0.0 Build 18 (Windows) was introduced. After each release condition and build, Netsparker Web Security Scanner v126.96.36.199 (Community Edition) was used to analyze the web application.
Confidential, Scottsdale, AZ
Quality Assurance Analyst
- Mirror test included validation of all interfaces and reports.
- Test scripts were created along with Defect/Error Reports.
- Screen shots were also taken of each of the defect/errors.
- Performed data tests from the database and Oracle Forms in an ERP environment.
- Gathered and updated business requirements when porting from one system to the new one.
- Used Open Office and StarOffice8 for documentation.
- Workflow Management Tools: Microsoft Project was used to gauge the principal workflow and any backlog.
- In some cases, worked with the Product Owner on special test projects, using the project management tool to validate that the team was aligned with the objectives.
- Addressed any issues with the projects.
- Tested video/audio content and URL connections to third party company websites.
- Applications in .NET/Java environments.
- Used Toad when data was needed and Reflection X for UNIX (by way of Citrix) as an alternative to accessing test environments.
- HP QTP (IE v10) was used, along with UNIX shell scripting when required.
- Validated the Rating Field Expansion.
- This application was the interface between Car Dealerships and Agents all over the United States and included my primary.
Confidential, Tempe, AZ
- Automated Test Tool(s): (QTP) Mercury Interactive QuickTest Professional IE v9.2 and IE v9.5, along with Mercury Quality Center (also used for test case design, creation, execution and management), were used in this process, along with ad hoc (manual) and regression testing.
- Dealt with license acquisition and customer service issues. WinRunner was used periodically; Bugzilla was also used.
- Performed UAT and other tests F2F and remotely. Set up QA Test bed and was responsible for Source Control (Code) Management of the QA Environment.
- Microsoft Dynamics CRM and On Demand CRM/LiveOps were also validated. Performance tests were done with an in-house application.
- Manual testing and some white box testing. Microsoft Project was used for Project Management.
- Held daily status checks, communicated daily objectives, reviewed previous day objectives, made specific project decisions or addressed questions from the team.