QA Automation Architect/QA Lead Resume
Houston, TX
SUMMARY:
- QA Analyst with 13 years of experience in manual and automation testing, covering Functional, Regression, System, User Acceptance (UAT) and Graphical User Interface (GUI) testing
- Sound understanding of QA processes and methodologies - SDLC (Software Development Life Cycle), STLC (Software Testing Life Cycle), Waterfall and Agile methodologies
- Experience in various manual, automation and Agile test management tools including but not limited to HP QC/ALM, HP QTP/UFT, Microsoft Test Manager (MTM), JIRA, TFS, Coded UI, and Selenium with Java and C#
- Experience with Protractor to create end-to-end test frameworks for AngularJS applications.
- Experience in Windows-based and web-based GUI testing, Mainframe testing, Database/Data Warehouse testing and SAP testing
- Experience with SoapUI and RESTful API testing using HTTP methods GET, PUT, DELETE and POST in several projects
- Experience in various types of testing such as Regression, Integration, System, Performance, Stress and Functional testing, as well as testing of Android (mobile) applications.
- Good understanding of database platforms with hands-on experience in writing SQL queries.
- Performed database testing to verify ACID properties of DB transactions.
- Performed testing with Real Time data and Historical data for ETL transactions in Microsoft BI and SAP BI
- Proficient in testing Windows-based and web-based applications built on client/server, desktop and web technologies; familiar with UNIX, Linux and Windows environments and web applications.
- Expertise in creating Project Plan, Test Plan, Automation Test Strategy, Effort Estimation, Gantt Chart (Project Scheduling), Statement of Work, Project Proposal, Project Plan/Milestone Tracking, Test Closure Report and Defect Management Chart.
- Expertise in implementing Hybrid, Keyword-Driven and Data-Driven frameworks using automation tools such as HP UFT and Coded UI (CUIT).
- Expertise in Behavior-Driven Development (BDD) framework implementation with SpecFlow and Cucumber
- Good functional knowledge of SAP (areas such as SD, SRM, PPM and FICO). Good domain knowledge in the Finance, Airline, Insurance, Oil & Gas and Healthcare industries.
- Basic understanding of Data Science concepts and use cases ranging from Artificial Intelligence, Machine Learning, Algorithms, Process and Data Mining to Sentiment Analysis.
- Responsible for handling defect tracking and mapping/reporting through Quality Center/ALM and TFS.
- Experience in using TFS reports for Test Case Readiness, Test Plan Progress, Release Burndown, Sprint Burndown and Velocity charts to indicate the amount of effort the team completes in each sprint.
- Experience working with development teams based out of multiple locations/time zones.
- Contributed to establishing quality milestones and aligning them with the project lifecycle, following a proper release management process involving parallel releases in a dynamic, agile environment
- Good leadership and communication skills; strong customer relationship, oral and written communication, planning and problem-resolution skills.
- Ability to handle multiple projects with competing deadlines in a fast-paced environment.
- Experience working on multi-team projects, short- to long-term engagements and the onsite-offshore model.
TECHNICAL SKILLS:
Testing Tools: CodedUI, Microsoft Visual Studio 2013/2015 and Microsoft Test Manager (MTM), HP Unified Functional Testing (UFT), HP ALM 11.0 (Quality Center / Application Lifecycle Management), VSS, Test Director, Clarify, Selenium, Bugzilla, JIRA, Cucumber, Rational ClearQuest, Trac, SoapUI, JUnit, TestNG
Operating Systems: Windows 8/7/Vista/XP/NT, Linux (Red Hat, SUSE, Ubuntu), UNIX (HP, Solaris)
Programming: C#, C++, VBScript, Java, ASP.Net, VB.Net, XML, HTML, Shell Script, JavaScript
Database: Oracle 8.x/9i/10g/11i/12c, Sybase 11.5, SQL Server 7.0/2005/2008/2012, MS Access, MySQL
Other Tools: MS Office (Word, Excel, PowerPoint), MS Visio, Outlook
Methodologies: Agile / Scrum / Kanban, Waterfall
WORK EXPERIENCE:
Confidential, Houston TX
QA Automation Architect/QA Lead
Responsibilities:
- Identified the test automation approach and assessed application and automation tool feasibility.
- Created an automation framework from scratch in Selenium C# with SpecFlow
- Implemented core framework components such as ConfigReader, Driver, KendoGrid and Database helpers
- Implemented PageBase and SubPageBase classes for the Page Object Model and a TestBase class for test classes
- Used the NUnit framework to support parallel test execution and generate an Extent report after execution (see the sketch after this list)
- Built the automation team in India and the US; helped the team get up to speed with the framework
- Implemented an API automation framework using the Rest Assured library
- Helped the automation team create and maintain API automation scripts using Java/TestNG
- Implemented a mobile automation framework using Appium
- Performed the Scrum Master role for 6 sprints and facilitated Scrum ceremonies
- Maintained Product Backlog Items for each sprint in the regression automation project
- Set up the Azure CI/CD pipeline and created build plans for Smoke and P1 test cases
- Development, documentation, and implementation of key QA processes and procedures aligned with business and test-driven development.
- Explored new tools/technologies such as Katalon Studio and Cypress.
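For illustration, a minimal sketch of the PageBase/TestBase pattern with NUnit parallel execution and Extent reporting described above. The class bodies, locator usage and reporter file name are hypothetical and simplified; the real framework's ConfigReader, Driver factory, KendoGrid and database helpers are not shown.

using AventStack.ExtentReports;
using AventStack.ExtentReports.Reporter;
using NUnit.Framework;
using NUnit.Framework.Interfaces;
using OpenQA.Selenium;
using OpenQA.Selenium.Chrome;

// Run test fixtures in parallel across the assembly
[assembly: Parallelizable(ParallelScope.Fixtures)]
[assembly: LevelOfParallelism(4)]

namespace Framework.Core
{
    // Every page object derives from PageBase (SubPageBase would derive from it in turn)
    public abstract class PageBase
    {
        protected readonly IWebDriver Driver;
        protected PageBase(IWebDriver driver) { Driver = driver; }
        protected IWebElement Find(By locator) { return Driver.FindElement(locator); }
    }

    // Every test class derives from TestBase, which owns the driver and the Extent report
    public abstract class TestBase
    {
        private static readonly ExtentReports Extent = CreateReport();
        protected IWebDriver Driver;
        protected ExtentTest Test;

        private static ExtentReports CreateReport()
        {
            var extent = new ExtentReports();
            // ExtentHtmlReporter in ExtentReports 4.x; ExtentSparkReporter in 5.x
            extent.AttachReporter(new ExtentHtmlReporter("ExtentReport.html"));
            return extent;
        }

        [SetUp]
        public void StartTest()
        {
            Driver = new ChromeDriver();   // in the real framework this would come from the Driver/ConfigReader helpers
            Test = Extent.CreateTest(TestContext.CurrentContext.Test.Name);
        }

        [TearDown]
        public void EndTest()
        {
            var passed = TestContext.CurrentContext.Result.Outcome.Status == TestStatus.Passed;
            Test.Log(passed ? Status.Pass : Status.Fail, "Execution finished");
            Driver.Quit();
            Extent.Flush();   // writes/refreshes the HTML report after execution
        }
    }
}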
Confidential, Katy, TX
QA Lead
Responsibilities:
- Worked as QA Architect covering all aspects of quality assurance, including establishing metrics, applying industry best practices, selecting tools and defining processes to ensure quality goals are met while adhering to budget and timeline.
- Developed scripts for testing the web application by implementing BDD with Cucumber
- Implemented frameworks with net.masterthought reporting, logging and Arquillian Cube for containerization
- Worked closely with the business team to transform acceptance criteria into the Gherkin language (BDD)
- Tested and documented REST/HTTP APIs for Microservices implemented in Java, including JSON data formats.
- Performed database testing while working with DBA to test CRUD operations, stored procedures, functions and data mapping.
- Coordinated with the DevOps team to set up the Cucumber, SpecFlow and Protractor test projects on the Bamboo server
- Worked closely with the DevOps team to set up the Test and QA environments and resolve build failures
- Worked with the Product Owner on requirement clarification during the planning and design phases and during UAT
- Implemented a Data-Driven and Keyword-Driven testing framework using Protractor and built a custom dashboard in Angular to trigger tests and view results and logs
- Used spec and configuration files to write Protractor tests and validate the results; used native events and browser-specific drivers while writing Protractor tests
- Set up the test automation framework from scratch to run tests on the Bamboo server using Docker containers
- Performed real-time data testing for fuel automation tools such as TCS, Siteminder, Veeder-Root and FuelMaster using APIs and connecting to the FAA database
- Created a Selenium Behavior-Driven Development (BDD) automation framework using SpecFlow for .NET
- Worked on the Page Object Model and supported cross-browser and parallel execution using SpecRun
- Built over 500 automated UI test cases that ran daily via scheduled Jenkins builds
- Created service-level automation for both positive and negative test cases with RestSharp for GET and POST (see the sketch after this list)
- Worked with the Support and Dev teams to resolve critical production issues as quickly as possible
- Trained and onboarded new resources joining the QA team
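A minimal sketch of the positive and negative service-level checks referenced above, assuming RestSharp with NUnit; the base URL, resource paths and payload fields are hypothetical.

using System.Net;
using System.Threading.Tasks;
using NUnit.Framework;
using RestSharp;

[TestFixture]
public class OrderServiceTests
{
    private RestClient _client;

    [OneTimeSetUp]
    public void CreateClient()
    {
        _client = new RestClient("https://api.example.com");   // hypothetical base URL
    }

    [Test]   // positive case: GET an existing resource
    public async Task Get_ExistingOrder_ReturnsOk()
    {
        var request = new RestRequest("orders/1001", Method.Get);   // Method.GET in older RestSharp versions
        var response = await _client.ExecuteAsync(request);

        Assert.AreEqual(HttpStatusCode.OK, response.StatusCode);
        Assert.IsNotEmpty(response.Content);
    }

    [Test]   // negative case: POST an invalid payload and expect rejection
    public async Task Post_InvalidOrder_ReturnsBadRequest()
    {
        var request = new RestRequest("orders", Method.Post)
            .AddJsonBody(new { customerId = "", quantity = -1 });
        var response = await _client.ExecuteAsync(request);

        Assert.AreEqual(HttpStatusCode.BadRequest, response.StatusCode);
    }
}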
Confidential, Houston, TX
Sr QA Analyst
Responsibilities:
- Managed a group of 10 Quality Engineers; elaborated project schedules and reports and handled test planning for sprints, releases and continuous verification.
- Conducted an assessment to identify gaps in the current QA practice in the areas of test process, test coverage, test documentation, cross-functional training, tool limitations, and skill and resourcing needs
- Designed and developed a versatile automation framework
- Created a Shared Object Repository to reuse controls; used assertion and exception handling and WPF dynamic control handling in scripts
- Enhanced the tests by creating standard auto-logging functions to capture and append logs to the test run (see the sketch after this list).
- Presented the test approach and integration test demo to Product Managers.
- Tested the performance of the application using LoadRunner VuGen and interacted with developers to review application performance
- Generated Web-protocol Vuser scripts using LoadRunner; created scenarios in LoadRunner Controller for multiple hosts and scripts, executed the scenarios and analyzed the performance results
- Validated the connection and data between ClickSoftware web services and SnapShot web services.
- Developed and executed test cases for IFS web services (XML/SOAP) and RESTful services using SoapUI
- Worked closely with Business Analysts and Developers to create test cases/scenarios, test steps, expected results and test data in MTM
- Extensively worked on Integration testing of SAP SD, ClickSoftware, IFS and Job Details
- Prepared detailed product documentation and QA KT documents and transitioned to Support team
- Reviewed test cases and defects created by offshore QA team
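A minimal sketch of the auto-logging idea mentioned above, assuming the MSTest/Coded UI TestContext API; the helper name and log format are hypothetical.

using System;
using System.IO;
using Microsoft.VisualStudio.TestTools.UnitTesting;

// Hypothetical helper: writes timestamped entries to a per-test log file and
// attaches the file to the test run so it appears alongside the test results.
public static class TestLogger
{
    public static string StartLog(TestContext context)
    {
        var path = Path.Combine(context.TestRunDirectory, context.TestName + ".log");
        File.AppendAllText(path, $"{DateTime.Now:u} Test started: {context.TestName}{Environment.NewLine}");
        return path;
    }

    public static void Append(string path, string message)
    {
        File.AppendAllText(path, $"{DateTime.Now:u} {message}{Environment.NewLine}");
    }

    public static void AttachToRun(TestContext context, string path)
    {
        context.AddResultFile(path);   // the captured log is published with the run results in MTM/VSTS
    }
}

In a [CodedUITest] class, StartLog would typically be called from [TestInitialize] and AttachToRun from [TestCleanup].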
Environment: SQL, VSTS, Coded UI, MTM, TFS, XML, Web Services, SoapUI
Confidential, Charlotte, NC
Sr QA Analyst
Responsibilities:
- As a QA Lead, was involved in task allocation and work monitoring of resources with the entertainment team
- Involved in sprint planning and calculating Level of Effort (LOE)
- Worked closely with business analysts and developers to create test cases/scenarios, test steps, expected results and test data in MTM
- Performed requirements analysis and established QA metrics for software quality
- Designed the bug tracking process for the team using the Scrum template in Visual Studio
- Created test suites using Requirement-based, Query-based and Static test suites for grouping test cases under the test plan
- Used TFS reports for Test Case Readiness, Test Plan Progress, Release Burndown, Sprint Burndown and Velocity charts to indicate the amount of effort the team completes in each sprint
- Developed an automation framework for automated testing
- Created automated test cases and performed automation testing with Coded UI in C# using Visual Studio (VSTS) (see the sketch after this list)
- Reviewed the system requirement specification documents, developed the test strategy and test plan documents, and developed the traceability matrix between test requirements and test cases
- Extensively worked on testing of SAP SRM and PPM Modules
- Developed and executed test cases for Preliminary Budget Approval, Final Budget Approval and Supplementary Budget Approval workflow of SAP PPM
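A minimal sketch of a hand-coded Coded UI test of the kind described above; the URL and control IDs are hypothetical and would normally live in a UIMap.

using System;
using Microsoft.VisualStudio.TestTools.UITesting;
using Microsoft.VisualStudio.TestTools.UITesting.HtmlControls;
using Microsoft.VisualStudio.TestTools.UnitTesting;

[CodedUITest]
public class BudgetSearchTests
{
    [TestMethod]
    public void Search_ByProjectId_ShowsResultsGrid()
    {
        var browser = BrowserWindow.Launch(new Uri("http://app.example.com/projects"));

        // Locate the search box by its HTML id and type a project number
        var searchBox = new HtmlEdit(browser);
        searchBox.SearchProperties[HtmlEdit.PropertyNames.Id] = "project-search";
        Keyboard.SendKeys(searchBox, "PPM-1042");

        // Click the search button
        var searchButton = new HtmlButton(browser);
        searchButton.SearchProperties[HtmlButton.PropertyNames.Id] = "btn-search";
        Mouse.Click(searchButton);

        // Verify the results grid is rendered
        var resultsGrid = new HtmlDiv(browser);
        resultsGrid.SearchProperties[HtmlDiv.PropertyNames.Id] = "results-grid";
        Assert.IsTrue(resultsGrid.TryFind(), "Results grid was not displayed");
    }
}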
Environment: .Net C#, VSTS, MTM, TFS, Coded UI
Confidential, Houston, TX
Sr. QA Analyst
Responsibilities:
- Involved in business requirements, design, risk analysis, project tracking, development and quality testing
- Involved in testing the new functionalities based on test cases and coordinated with the development team in fixing the issues
- Responsible for releasing new builds
- Involved in backend database testing in SQL Server 2005
- Created test scripts after release to integrate with the master testing scripts for automated QA
- Performed manual testing of 3D oil-well modeling based on depth, azimuth and inclination algorithms provided by engineers
- Tested 2D graphs rendered with GDI+ to represent various calculated metrics
- Developed and implemented Page Object Model based automated functional testing using JBehave, Java and Selenium WebDriver for automating the web applications.
Environment: .NET Framework 2.0, SQL 2000/2003, NUnit, XML, XPath, JBehave, Selenium
Confidential, Dallas, TX
Sr. QA Analyst
Responsibilities:
- Performed analysis of the requirements for QA by reviewing the Business Requirement Documents (BRD) and Use Case documents
- Developed the Test Plan and Test Scripts, reviewed Test Cases created by the team and handled the allocation of detected errors
- Created Test Scenarios and Test data and planned and executed both positive and negative test scenarios
- Maintained the Traceability Matrix to check that testing efforts were in sync with the requirements
- Worked extensively on test scripts to parameterize values for Data-Driven testing
- Created Reusable Actions for user-defined functions to enhance code efficiency
- Enhanced scripts by adding checkpoints such as standard checkpoints, text checkpoints and text-area output value checkpoints, and by passing values between actions where the scripts required it
- Created Per-Action/Shared Object Repositories so that they could be reused across different modules in the test plan
- Organized and conducted GUI, Functionality and Regression testing; supported UAT by preparing the test plan and UAT test data
- Analyzed formal test results in order to discover and report any defects, bugs, errors, and interoperability flaws
- Tracked and communicated testing progress, test results, and other relevant information to management and project team
- Reported defects and was involved in the Defect Life Cycle; prioritized defects based on severity and conducted defect review meetings
- Wrote SQL queries to validate input data submitted through the GUI against the actual data in the database (see the sketch after this list).
- Performed back end testing using SQL queries
- Used Quality Center to create and maintain the test cases and test scripts for both System Testing and Regression Testing
- Developed the System, detailed and overall test plans using the business documents
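A minimal sketch of the GUI-to-database validation described above, assuming ADO.NET with NUnit; the connection string, table, column and value names are hypothetical.

using System.Data.SqlClient;
using NUnit.Framework;

[TestFixture]
public class BackendValidationTests
{
    private const string ConnectionString =
        "Server=qa-db;Database=Policies;Integrated Security=true;";   // hypothetical QA database

    [Test]
    public void PolicySubmittedThroughGui_IsPersistedCorrectly()
    {
        // Values entered on the GUI during the front-end test step
        const string policyNumber = "POL-20391";
        const decimal expectedPremium = 1250.00m;

        using (var connection = new SqlConnection(ConnectionString))
        using (var command = new SqlCommand(
            "SELECT Premium FROM Policy WHERE PolicyNumber = @policyNumber", connection))
        {
            command.Parameters.AddWithValue("@policyNumber", policyNumber);
            connection.Open();

            // Compare the stored value against what was submitted through the GUI
            var actualPremium = (decimal)command.ExecuteScalar();
            Assert.AreEqual(expectedPremium, actualPremium,
                "Premium stored in the database does not match the value submitted through the GUI");
        }
    }
}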
Environment: Windows 7, QC/ALM, SQL, C#
Confidential, Milwaukee, WI
Software QA Engineer
Responsibilities:
- Integrated Manual test suite with Microsoft Test Manager
- Worked in Agile development environment with frequently changing requirements and features
- Involved in daily Bug scrub with Developers and Business Analysts
- Responsible for QA, manual/automation tests, and wrote technical documentation for validation
- Wrote and maintained automation test plan, test cases, scripts, policies and practices for automation including scope, objective, standards and procedure
- Built strong client and program management relationships by providing testing transparency
- Parameterized the tests using Data Driver to replace fixed values with values from an external source during the test run
- Managed integration, systems, acceptance, Regression & implementation testing
- Determined test execution priorities, deadlines, expectations and scope within a predefined test window
- Managed & Logged defects in MTM
- Performed functional testing, regression testing, UAT Testing and End to End testing
- Prepared the Test Plan for the assigned module as part of the testing team; the Test Plan detailed the testing scope, requirements, strategies and all required resources
- Executed manual testing for all scenarios on products
- Participated in System Integration Testing (SIT) and User Acceptance Testing (UAT)
- Conducted regression and system testing to ensure that new code did not cause issues and that defects had been corrected
- Created SQL queries for backend database testing
- Utilized Quick Test Professional (QTP) for basic automation testing which included recording and playing tests as well as parameterizing values
Environment: Windows, .NET, C#, MTM, TFS, VSTS 2010, QTP, SQL
Confidential, Los Angeles, CA
Software QA Engineer
Responsibilities:
- Coordinated with the various business analysts on project requirements and test case reviews, and provided the design walkthrough to all stakeholders
- Interacted with development and support teams on understanding and resolving documented bugs for given applications
- Interpreted and understood the system requirements for new and existing applications
- Wrote test plans, test scripts and test scenarios that adhered to technical and business requirements
- Experienced executing all types of testing - Functional, System, Load, Performance, Integration, Regression, Negative, Black Box, White Box and Production Validation
- Created dummy test data, identified test data and coordinated with the automation team on test data creation
- Executed Test Cases, reported test case results and reviewed testing progress with testing team
- Logged and Tracked Defects in Test Management Suite
- Documented test results and bugs utilizing standard procedures
- Managed the offshore and onshore teams and led the testing of the project
- Reviewed error log files on the UNIX box when orders failed to load into SQL tables
- Extensively worked on XML for transformation of data from flat files to the database
Environment: Mainframe, SharePoint, UNIX, PUTTY, XML
Confidential, Boston, MA
Software QA Engineer
Responsibilities:
- Identified, estimated, planned, created, maintained, updated and reviewed manual and automation Test Plans, Test Suites, Test Cases, Test Scripts, testing tasks and end-to-end Test Scenarios as requirements changed
- Responsible for the validation, verification, functional, integration, regression, smoke, ad hoc and exploratory testing of various components and functionalities of the applications
- Created and updated test data for different scenarios based on the Test Cases
- Worked with cross-functional teams for integrated testing of the entire system
- Performed Defect Tracking, Monitoring & Reporting including Defect Status reports
- Validated the test results and updated them in Quality Center
- Worked with the offshore team and coordinated regression testing tasks
Environment: Windows 2000/2003/XP, UNIX, Quality Center, QTP, Oracle, Toad
Confidential, Northbrook, IL
Manual QA Tester
Responsibilities:
- Reviewed and verified test cases for functional, Smoke, GUI, Regression and System testing
- Tracked Defects and posted defects in Quality Center
- Provided and managed Test Data for test execution
- Handled the flat files using the data-driven technique
- Executed the test scripts and created descriptive programming scripts
- Handled recovery and exception scenarios, post-bug handling concepts and modular performance
Environment: ASP.NET, Oracle, HTML, JavaScript, Python, Quality Center