Sr. QA Engineer Resume
Minnetonka, MN
PROFESSIONAL SUMMARY:
- Over 8 years of professional experience in software quality assurance and test automation of web-based and client/server applications, with a full understanding of software development and testing life cycles.
- Extensive experience in functional testing, system testing, feature testing, regression testing, performance testing and user acceptance testing (UAT) for web-based applications and mobile handsets.
- Experience in creating data-driven, keyword-driven and hybrid automation frameworks using Selenium WebDriver with Java (a minimal data-driven sketch follows this summary), and in cross-browser, cross-platform web testing with Selenium WebDriver.
- Experience in developing core QA automation framework function libraries for various web and client/server applications using the Page Object Model.
- Experience in web services (API) testing, including functional testing, data-driven testing and load testing using ReadyAPI, SoapUI, Postman, REST APIs and Swagger.
- Actively participated in monthly UAT sessions, generated action item documents and prioritized them in the monthly iteration cycles.
- Experienced in requirements analysis, test strategy, test planning, test scripting, test case design, test case execution and defect management.
- Experience in defect management, including defect creation, modification, tracking and reporting, using industry-standard tools such as HP Quality Center.
- Experience in demoing the application at the end of each sprint and updating stakeholders on project health, challenges and dependencies.
- Comprehensive knowledge of database management systems, SQL queries and UNIX.
- In-depth knowledge of data analysis, database design, Oracle SQL and SQL testing.
- Proficient in writing PL/SQL statements and executing SQL queries to perform backend testing.
- Documented automation approaches and usage details in SharePoint so that all team members can use them as part of knowledge management.
- Worked closely with the development and testing teams during the defect resolution phase.
- Performed quality assurance testing activities to ensure application and product releases follow the organization's quality assurance standards.
- Strong knowledge of software architectures such as client-server, n-tier, J2EE and Service-Oriented Architecture, and of Software Development Life Cycle methodologies including Agile, Scrum and Test-Driven Development.
- Ensured quality audits and process adherence in project delivery across all SDLC phases.
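A minimal sketch of the data-driven approach referenced in this summary, assuming a hypothetical login page, locators and test data; it pairs Selenium WebDriver with a TestNG DataProvider and is illustrative only, not a reproduction of any project framework.

```java
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.testng.Assert;
import org.testng.annotations.AfterMethod;
import org.testng.annotations.BeforeMethod;
import org.testng.annotations.DataProvider;
import org.testng.annotations.Test;

public class LoginDataDrivenTest {

    private WebDriver driver;

    @BeforeMethod
    public void setUp() {
        driver = new ChromeDriver();                 // assumes a local chromedriver is available
        driver.get("https://example.com/login");     // hypothetical application URL
    }

    // Each row is one data-driven iteration: username, password, expected outcome
    @DataProvider(name = "loginData")
    public Object[][] loginData() {
        return new Object[][] {
            {"validUser", "validPass", true},
            {"validUser", "wrongPass", false},
            {"", "", false}
        };
    }

    @Test(dataProvider = "loginData")
    public void verifyLogin(String user, String pass, boolean shouldSucceed) {
        driver.findElement(By.id("username")).sendKeys(user);   // hypothetical locators
        driver.findElement(By.id("password")).sendKeys(pass);
        driver.findElement(By.id("loginBtn")).click();
        boolean dashboardShown = !driver.findElements(By.id("dashboard")).isEmpty();
        Assert.assertEquals(dashboardShown, shouldSucceed);
    }

    @AfterMethod
    public void tearDown() {
        driver.quit();
    }
}
```

Keeping inputs in the DataProvider (or an external data sheet) lets the same test body cover positive, negative and boundary cases without duplicating script logic.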
SKILLS SUMMARY:
Programming Languages: Java, C, C++, HTML
Testing Tools: Selenium WebDriver, QTP, WinRunner, TestDirector, HP Quality Center, HPE Unified Functional Testing (UFT), LoadRunner
Databases and Database Tools: MS SQL Server, MS Access, Oracle DBMS, Toad for Oracle, MySQL, SQL Navigator
Web Debugging Tools: Firebug, FirePath and XPath
Test Frameworks: TestNG, JUnit, FLite, CRAFT and Robot Framework
Defect Tracking Tools: ALM, JIRA and Bugzilla
Other Utilities: Eclipse, Quality Center 11.0/9.2, JIRA, Jenkins, uBuild, SVN, GitHub, Git/TeamForge, Maven, MS Office
Operating Systems and Platforms: MS Windows (all versions), UNIX, Mac OS X, Diamond Claims Processing Systems
RELEVANT PROJECT EXPERIENCE:
Sr. QA Engineer
Confidential, Minnetonka, MN
Responsibilities:
- Analyzed the requirements for stories and developed automation scripts to validate the acceptance criteria.
- Reviewed product requirement documents and was involved in developing test strategy, test plan and test case documents.
- Worked in an Agile SDLC with an accelerated release every 2 weeks.
- Created automated test cases in Java using Selenium WebDriver in a Page Object Model structure (a minimal page object sketch follows this section).
- Responsible for the feature and regression test suites of the master data reports, making sure they were developed per the technical, functional and business requirements. As the project was Agile, tested the new features added each sprint and performed regression and performance testing for the existing features and report functionality.
- Responsible for testing the boundary conditions defined in the software requirements specification. Developed high-level test scenarios and automation scripts for the identified workflows and integrated the automation tools, frameworks and scripts with continuous integration tools.
- Worked with the business in driving release/testing/UAT requirements for applications and enabled process improvements in quality management.
- Involved in sprint planning and used Rally to track each sprint's burndown chart and identify flow and blockers in the sprint.
- Performed gap analysis to determine whether current business requirements were met by the existing functional, performance and regression test suites, and proposed ways to accommodate them in the master test suite.
- Worked with the business team and conducted User Acceptance Testing (UAT) as an evaluation and quality check on the deliverables for every release.
- Evaluated the happy-path and negative scenarios that should be part of the automated test suite based on the business requirements.
- Worked with the regression team on improving the efficiency of FTA (Functional Test Automation) suites. Prepared the test plan, including testing scope, timelines, risks, mitigation plans, UAT and review schedules.
- Understood the application under test, wrote the test plan, defined the test strategy, created the test matrix and guided the test engineers in scripting.
- Automated report testing using Selenium WebDriver in a Page Object Model framework.
- Performed daytime validation/smoke testing on the pilot server to obtain sign-off from Business Ops during project releases.
- Executed Business Objects reports and compared them against the database by writing SQL queries.
- Wrote SQL statements to extract data from tables and verify the output data of the reports.
- Worked with the offshore testing team to transition backlog items and performed peer reviews of the work done by offshore resources.
Environment: HP ALM, Jira, Selenium WebDriver, MS Windows, Rally, Cognos, SoapUI, RESTful services, XML, MS Office, PowerPoint, Excel, MS Word, Agile SDLC, SQL.
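A minimal Page Object Model sketch in the spirit of the report automation described above; the page class, locators and report URL are hypothetical placeholders, not the actual project code.

```java
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.testng.Assert;
import org.testng.annotations.Test;

// Page object: wraps locators and actions for a hypothetical report page
class ReportPage {
    private final WebDriver driver;
    private final By reportTitle = By.cssSelector("h1.report-title");
    private final By runButton   = By.id("runReport");
    private final By resultRows  = By.cssSelector("table.results tr");

    ReportPage(WebDriver driver) {
        this.driver = driver;
    }

    void open() {
        driver.get("https://example.com/reports/master-data");  // hypothetical URL
    }

    void runReport() {
        driver.findElement(runButton).click();
    }

    String title() {
        return driver.findElement(reportTitle).getText();
    }

    int rowCount() {
        return driver.findElements(resultRows).size();
    }
}

// The test talks only to the page object, keeping locators out of test logic
public class MasterDataReportTest {
    @Test
    public void reportRendersWithData() {
        WebDriver driver = new ChromeDriver();
        try {
            ReportPage page = new ReportPage(driver);
            page.open();
            page.runReport();
            Assert.assertEquals(page.title(), "Master Data Report");
            Assert.assertTrue(page.rowCount() > 0, "Report returned no rows");
        } finally {
            driver.quit();
        }
    }
}
```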
Sr. QA Engineer
Confidential
Responsibilities:
- Developed a detailed test strategy and test plan for web services.
- Analyzed the business requirements document, provided input to the test plan and prepared detailed test cases for new functionality.
- Automated API tests using the ReadyAPI tool (an illustrative request/response sketch follows this section).
- Validated data between the services and Fdot.
- Configured different environments in Jenkins to execute test cases.
- Performed and validated database testing and data retrieval.
- Provided support to the system assurance and UAT teams for their sign-off and managed a centralized defect management area and process.
- Created manual test cases in HP QC and automated test scripts using Selenium.
- Generated weekly test reports and test metrics and communicated updates to management.
- Logged and managed defects in Quality Center.
- Worked in an Agile development environment, tracked and updated user stories in Jira and attended daily scrum meetings.
Environment: ReadyAPI, HP ALM, Postman, Selenium WebDriver, Java, Jira, HTML, XML, SQL Server
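The ReadyAPI work above is GUI-driven; purely as an illustration, the sketch below shows the same kind of request/response check in plain Java (JDK 11+ HttpClient with TestNG) against a hypothetical endpoint and field names.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

import org.testng.Assert;
import org.testng.annotations.Test;

public class ServiceSmokeTest {

    private final HttpClient client = HttpClient.newHttpClient();

    @Test
    public void getProjectReturnsOkWithExpectedFields() throws Exception {
        // Hypothetical REST endpoint; a real suite would read the base URL from config
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://example.com/api/projects/12345"))
                .header("Accept", "application/json")
                .GET()
                .build();

        HttpResponse<String> response =
                client.send(request, HttpResponse.BodyHandlers.ofString());

        // Basic functional checks: status code, content type and a key field in the body
        Assert.assertEquals(response.statusCode(), 200);
        Assert.assertTrue(response.headers()
                .firstValue("Content-Type").orElse("").contains("application/json"));
        Assert.assertTrue(response.body().contains("\"projectId\""));
    }
}
```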
Sr. QA Analyst /Test Engineer
Confidential, Minneapolis, MN
Responsibilities:
- Reviewed product requirement documents and was involved in developing test strategy, test plan and test case documents.
- Worked with the business in driving release/testing/UAT requirements for applications and enabled process improvements in quality management.
- Worked in an Agile SDLC with an accelerated release every 2 weeks.
- Process improvements were focused on locating the property for a Confidential store in a specific city or state anywhere in the USA and Canada.
- Involved in sprint planning and used Rally to track each sprint's burndown chart and identify flow and blockers in the sprint.
- Performed gap analysis to determine whether current business requirements were met by the existing functional test suite, and proposed ways to accommodate them in the master test suite.
- Worked with the business team and conducted User Acceptance Testing (UAT) as an evaluation and quality check on the deliverables for every release.
- Evaluated the happy-path and negative scenarios that should be part of the automated test suite based on the business requirements.
- Worked with the regression team on improving the efficiency of FTA (Functional Test Automation) suites. Prepared the test plan, including testing scope, timelines, risks, mitigation plans, UAT and review schedules.
- Understood the application under test, wrote the test plan, defined the test strategy, created the test matrix and guided the test engineers in scripting.
- Worked with QTP/UFT on record and playback of scripts, creation of test scripts and parameterization of input values per requirements.
- Worked with QTP on creation of test objects, shared object repositories and object repository management.
- Created different types of checkpoints and created and inserted output values.
- Created reusable and external actions for the test scripts.
- Performed daytime validation/smoke testing on the pilot server to obtain sign-off from Business Ops during project releases.
- Evaluated the as-is and to-be online system behavior through comparative testing.
- Worked with the offshore testing team on site monitoring and OLS testing using the defect management tool.
- Wrote and executed SQL queries.
- Executed basic queries using SELECT, WHERE and UPDATE clauses.
- Wrote SQL statements to extract data from tables and verify the output data of the reports (a minimal back-end verification sketch follows this section).
Environment: UFT, MS Windows, Rally, Cognos, SoapUI, RESTful services, XML, MS Office, PowerPoint, Excel, MS Word, Agile SDLC, SQL.
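A minimal sketch of the kind of back-end verification described above, comparing a report value against the database over JDBC; the connection string, credentials, table and column names, and the report value are hypothetical placeholders.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class ReportBackendCheck {

    public static void main(String[] args) throws Exception {
        // Hypothetical Oracle connection details; real runs would load these from config
        String url = "jdbc:oracle:thin:@//dbhost:1521/ORCLPDB1";
        String sql = "SELECT COUNT(*) AS row_count "
                   + "FROM store_locations "
                   + "WHERE state = ? AND active = 'Y'";

        try (Connection conn = DriverManager.getConnection(url, "qa_user", "qa_pass");
             PreparedStatement ps = conn.prepareStatement(sql)) {

            ps.setString(1, "MN");                       // parameter under test
            try (ResultSet rs = ps.executeQuery()) {
                rs.next();
                int dbCount = rs.getInt("row_count");    // value straight from the database
                int reportCount = 42;                    // placeholder: value read from the report
                if (dbCount != reportCount) {
                    throw new AssertionError(
                        "Report shows " + reportCount + " rows but DB has " + dbCount);
                }
                System.out.println("Report row count matches database: " + dbCount);
            }
        }
    }
}
```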
QA Analyst/ UAT Analyst
Confidential
Responsibilities:
- Worked with the business in driving release/testing/UAT requirements for applications and enabled process improvements in quality management.
- Worked in Agile Scrum with an accelerated release every 3 weeks.
- Involved in sprint planning and used Rally to track each sprint's burndown chart and identify flow and blockers in the sprint.
- Participated in weekly knowledge transfer sessions and test knowledge share sessions.
- Analyzed the business requirements for stories and developed test scripts to validate the acceptance criteria for functionality.
- Worked with the business team and conducted User Acceptance Testing (UAT) as an evaluation and quality check on the deliverables for every release.
- Proposed a plan and format for effectively executing UAT with defined documentation.
- Defined end-to-end test scenarios for better networks and a better user experience. Responsible for UAT and for reporting all aspects needed to prepare user guides.
- Created several types of checkpoints and created and inserted output values.
- Evaluated the happy-path and negative scenarios that should be part of the automated test suite based on the business requirements.
- Participated in periodic technical walkthrough sessions and defect report meetings and coordinated with developers to resolve issues.
- Gathered data for quality and performance metrics, often producing preliminary reports.
- Focused on data quality issues including completeness, conformity, consistency, accuracy, duplicates and integrity.
- Developed SQL queries to query the database and test the back-end process of the application (a minimal data-quality check sketch follows this section).
- Performed gap analysis to determine whether current business requirements were met by the existing functional test suite, and proposed ways to accommodate them in the master test suite.
Environment: Quality Center, QTP, Java/J2EE, HTML5, CSS3, SQL, MS Office, PowerPoint, Excel, MS Word, Agile Scrum, Toad.
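A minimal sketch of the duplicate and completeness checks mentioned above, run over JDBC against a hypothetical customer table; the connection string, credentials, table and column names are placeholders.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class DataQualityChecks {

    public static void main(String[] args) throws Exception {
        // Hypothetical connection; a real suite would pull this from environment config
        String url = "jdbc:oracle:thin:@//dbhost:1521/ORCLPDB1";

        // Duplicate check: customer IDs that appear more than once
        String duplicateSql =
            "SELECT customer_id, COUNT(*) AS cnt " +
            "FROM customers GROUP BY customer_id HAVING COUNT(*) > 1";

        // Completeness check: rows missing a mandatory email value
        String completenessSql =
            "SELECT COUNT(*) AS missing_email FROM customers WHERE email IS NULL";

        try (Connection conn = DriverManager.getConnection(url, "qa_user", "qa_pass");
             Statement stmt = conn.createStatement()) {

            try (ResultSet rs = stmt.executeQuery(duplicateSql)) {
                while (rs.next()) {
                    System.out.println("Duplicate customer_id " + rs.getString("customer_id")
                            + " occurs " + rs.getInt("cnt") + " times");
                }
            }

            try (ResultSet rs = stmt.executeQuery(completenessSql)) {
                rs.next();
                System.out.println("Rows missing email: " + rs.getInt("missing_email"));
            }
        }
    }
}
```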
QA / UAT Analyst
Confidential
Responsibilities:
- Developed the test plan and conducted test case walkthroughs with the project team.
- Worked closely with Development (UI), Product Management and Technical Operations during the development, test and launch stages of the software development and release cycle (Waterfall and Agile).
- Executed all stages of the test life cycle: test planning, test case design, execution, defect tracking, metrics and status reporting.
- Coordinated with end users to schedule and support User Acceptance Testing (UAT) and gave weekly risk and issue log updates.
- Conducted system, functional, integration and regression testing as well as retesting.
- Executed automated test scripts and reported the results.
- Attended daily stand-ups and other project-related meetings.
- Collaborated with QA, engineering and product management colleagues to define and document the test plan.
- Wrote test cases in QC/ALM for functional and regression testing.
- Executed test cases, reported defects and communicated with developers to get them fixed in a timely manner.
- Defined entry criteria and exit criteria for all QA deliverables.
- Tested application functionality across different browsers and platforms, including mobile devices (a minimal cross-browser setup sketch follows this section).
- Performed functional testing, GUI testing, database testing, end to end testing, system testing, and regression testing.
- Worked closely with development team to verify bug fixes.
- Recorded and documented bug details in the bug tracking system (JIRA).
- Assessed bug severity and escalated and prioritized based on the severity and impact of any product bugs detected.
- Communicated test progress, test results, and other relevant information to project stakeholders and management.
Environment: HP ALM, Selenium, Java EE, Eclipse, Oracle, HTML5, MS Office, PowerPoint, Excel, MS Word, Agile Scrum.
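A minimal sketch of the cross-browser setup behind the browser coverage noted above, using a TestNG suite parameter to pick the driver; the browser list and target URL are illustrative, and the sketch assumes the matching driver binaries are available locally.

```java
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.openqa.selenium.edge.EdgeDriver;
import org.openqa.selenium.firefox.FirefoxDriver;
import org.testng.Assert;
import org.testng.annotations.AfterMethod;
import org.testng.annotations.BeforeMethod;
import org.testng.annotations.Optional;
import org.testng.annotations.Parameters;
import org.testng.annotations.Test;

public class CrossBrowserSmokeTest {

    private WebDriver driver;

    // The browser name comes from the TestNG suite XML, so the same test
    // runs unchanged against Chrome, Firefox or Edge
    @Parameters("browser")
    @BeforeMethod
    public void setUp(@Optional("chrome") String browser) {
        switch (browser.toLowerCase()) {
            case "firefox": driver = new FirefoxDriver(); break;
            case "edge":    driver = new EdgeDriver();    break;
            default:        driver = new ChromeDriver();  break;
        }
    }

    @Test
    public void homePageLoads() {
        driver.get("https://example.com");               // hypothetical application URL
        Assert.assertTrue(driver.getTitle() != null && !driver.getTitle().isEmpty());
    }

    @AfterMethod
    public void tearDown() {
        driver.quit();
    }
}
```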