
QA Engineer Resume

MN

SUMMARY

  • 8 years of professional experience in Information Technology with diversified experience in software analysis, test design, and execution. Detailed knowledge of formal test methodologies. Application testing expertise includes planning, process implementation, automated regression, and system testing.
  • Results-driven Quality Assurance professional with solid knowledge of manual and automated software testing and extensive experience with software development methodologies, including both Agile (Scrum, Kanban) and Waterfall models.
  • Experienced in writing system test plans, defining test cases, and developing and maintaining test scripts according to the given business specifications.
  • Good understanding of and working experience with the ETL process.
  • Experience with web-based applications and object-oriented technologies. Ability to interact with developers and product analysts regarding testing status.
  • Working knowledge of source code management and configuration.
  • Solid understanding of source-to-target data validation and database testing.
  • Experience in ETL testing in SaaS environments.
  • Experience in leading QA teams and building QA strategy.

TECHNICAL SKILLS

Languages: SQL, VBScript, Visual Basic, and HTML

Testing Tools: Quick Test Professional, Selenium

Operating Systems: UNIX, MS-DOS, Windows 95/98, Windows NT, Windows 2000 & Windows XP/XP Pro, Windows Vista, Windows 7

Databases: Oracle, SQL Server, Teradata, and MS Access

Tracking Tools: HP Quality Center, TFS, Test Track Pro, IBM ClearQuest, Jazz

Browsers: Internet Explorer, Netscape Navigator, Mozilla, Firefox

MS Suite: MS Word, MS Excel, MS Outlook, MS PowerPoint, and MS Access.

PROFESSIONAL EXPERIENCE

Confidential, MN

QA Engineer

Responsibilities:

  • Manage all QA efforts for the team
  • Responsible for creating and maintaining test plans, test strategies, test cases, and test scripts
  • Ensure that test cases and test scripts are written according to BRDs and data mapping documents and that they are traceable and repeatable.
  • Responsible for analytics testing for the Calabrio 1 application
  • Responsible for writing SQL to perform data and database validations at all points in the data flow (a sample validation query follows this list)
  • Follow the Agile development methodology, with design and development evolving as the project progresses through the development life cycle
  • Involved in daily standups, sprint planning, and sprint retrospectives/ceremonies
  • Backlog grooming for user stories based on stakeholders' needs
  • Responsible for testing multiple applications including client/server web applications spanning multiple projects
  • Report/summarize test results to the team and management.
  • Involved in web- and client/server-based testing
  • Serve as a member of the User Acceptance Team, executing test scripts across various test phases, including unit, regression, system, and user acceptance testing (UAT). Responsible for organizing and documenting the test process (developed test conditions and test scripts).
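
A minimal sketch of the kind of source-to-target validation query referenced above; the staging and warehouse table names (stg.call_metrics, dw.fact_call_metrics) are hypothetical, not from the actual project:

    -- Hypothetical row count reconciliation between a staging table and its target
    SELECT 'source' AS side, COUNT(*) AS row_cnt FROM stg.call_metrics
    UNION ALL
    SELECT 'target', COUNT(*) FROM dw.fact_call_metrics;

    -- Hypothetical spot check: rows present in the source but missing from the target
    SELECT s.call_id
    FROM stg.call_metrics s
    LEFT JOIN dw.fact_call_metrics t ON t.call_id = s.call_id
    WHERE t.call_id IS NULL;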

Environment: Win 7, SQL Server 2008, MS TFS, MS Excel, MS SharePoint, IBM Rational software tools (Jazz)

Confidential, MN

QA Analyst Lead

Responsibilities:

  • Led and managed all QA efforts for the team
  • Performed UI, functional, web services, cross-browser, ETL, and reports testing of web, distributed, SOA, reporting, legacy, and J2EE applications in the Scientific Services and Repository team
  • Responsible for overseeing and resolving issues related to Quality Assurance for Scientific Services
  • Responsible for testing the Sales application software used by different clinics, including donor centers, transplant centers, cord blood banks, and co-op registries, for the transplant process.
  • Led and delegated responsibilities within the QA team by providing clear guidelines on the scope and deadlines for testing the Sales application.
  • Worked with management, QA staff, and peers to improve the QA process, making it more efficient and robust.
  • Created a test plan for the Scientific Services team identifying all areas of testing, risks, and mitigation factors.
  • Created a test strategy for each major change that identified the points for testing
  • Took part in daily scrums to update testing status, and in sprint planning and sprint retrospectives where application requirements, design, and changes were put forward by product owners, stakeholders, developers, quality analysts, and business analysts.
  • Worked with product owners and stakeholders to create the product backlog, grooming and prioritizing it every sprint
  • Took part in user story planning, writing and prioritizing in every sprint
  • Responsible for writing effective test cases from user stories
  • Conducted regression testing of all test cases every sprint and added test cases as sprints progressed
  • Created Business Requirement documents, clearly communicating technical information to non-technical stakeholders.
  • Performed data mining to spot trends and performance enhancement opportunities
  • Worked on a migration project to migrate data from RRSW (a FoxPro application) to LabVantage (the new application)
  • Performed data validation at all levels of data, from source to target tables, in SSIS and SSRS
  • Wrote complex queries to interpret the data mapping and compare source and target data (a sample comparison query follows this list)
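
A minimal sketch of the kind of mapping-driven comparison query referenced above; the schema, table, and column names and the mapping rule are hypothetical, not taken from the project:

    -- Hypothetical comparison of a mapped status column between source and target
    SELECT s.donor_id,
           s.status_code  AS source_value,
           t.donor_status AS target_value
    FROM   src.donor s
    JOIN   tgt.dim_donor t ON t.donor_id = s.donor_id
    WHERE  CASE s.status_code          -- mapping rule taken from the data mapping document
               WHEN 'A' THEN 'ACTIVE'
               WHEN 'I' THEN 'INACTIVE'
               ELSE 'UNKNOWN'
           END <> t.donor_status;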

Environment: Win XP, Oracle, DB2, SQL Server, Soap UI, IBM Jazz, Excel

Confidential, MN

Sr. QA Technical Analyst

Responsibilities:

  • Participated in JAD sessions reviewing business requirements and design specifications
  • Coordinated with the IT Director and QA Manager in deciding the tools that needed to be used in the project
  • Also worked on identifying and troubleshooting system and tool issues and on streamlining and improving the QA process
  • Initiated documentation of key QA processes to support the QA team
  • Worked with the Application Manager and the QA Manager in setting up timelines for project delivery
  • Conducted Knowledge Transfer sessions for the new and existing QA staff.
  • Automated and maintained scripts for regression testing in Selenium
  • Developed a test strategy for every iteration
  • Tested the GUI for the case management application
  • In JAD sessions, identified GUI components and their functionality with stakeholders and composed test cases accordingly.
  • Wrote and executed UI test cases, analyzed test results and trends, and provided feedback for process and performance improvements.
  • Created test data for execution of test cases and test scripts
  • Created complex T-SQL scripts for backend testing on MS SQL Server, including validation of triggers and stored procedures (a sample check is sketched after this list)
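
A minimal sketch of the style of T-SQL backend check referenced above; the table, trigger, and procedure names (dbo.CaseFile, dbo.CaseFileAudit, dbo.usp_GetCaseSummary) are hypothetical, not from the project:

    -- Hypothetical check that an audit trigger fires on UPDATE; rolled back to leave data unchanged
    BEGIN TRAN;
        UPDATE dbo.CaseFile
        SET    Status = 'CLOSED'
        WHERE  CaseId = 12345;

        -- Expect exactly one new audit row written by the trigger
        SELECT COUNT(*) AS audit_rows
        FROM   dbo.CaseFileAudit
        WHERE  CaseId = 12345
          AND  ChangedOn >= DATEADD(MINUTE, -1, GETDATE());
    ROLLBACK TRAN;

    -- Hypothetical stored procedure smoke test: confirm it runs and returns the expected result set
    EXEC dbo.usp_GetCaseSummary @CaseId = 12345;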

Environment: QTP, Selenium, Win XP, SQL Server 2008, MS TFS, MS Excel

Confidential, St. Louis Park, MN

QA Analyst (ETL testing)

Responsibilities:

  • Participated in analyzing and reviewing business requirements and design documents
  • Performed thorough testing of source-to-target ETL data movement.
  • Worked on creating a source-to-target mapping support document
  • Coordinated with Technical Leads and Business Analysts to design and implement test scenarios
  • Followed the Waterfall development methodology in design and development to report on the ETL process
  • Responsible for creating test plans and scenarios/test cases, documentation, creating test data and executing test cases.
  • Analyzed and documented all test results including problems and defects.
  • Created scripts in PROC SQL, test scripts to compare the source and target using SAS, and scripts to automate the process for 1:1 testing (a sample comparison query follows this list)
  • Created test data in Excel and converted it to CSV format so the data was readily available in SAS.
  • Developed reports in SAS to produce clear output for the comparison of source and target
  • Logged defects in IBM Rational ClearQuest
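
A minimal sketch of the 1:1 source-to-target comparison referenced above; the database and table names are hypothetical, and on the project this style of query would have been run from SAS PROC SQL against Teradata:

    -- Hypothetical 1:1 comparison: rows in the source with no exact match in the target
    SELECT member_id, plan_code, eff_date FROM src_db.member_plan
    EXCEPT
    SELECT member_id, plan_code, eff_date FROM tgt_db.member_plan;

    -- Reverse direction: rows in the target with no exact match in the source
    SELECT member_id, plan_code, eff_date FROM tgt_db.member_plan
    EXCEPT
    SELECT member_id, plan_code, eff_date FROM src_db.member_plan;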

Environment: IBM Rational ClearQuest, Win XP, MS Excel, SAS, Teradata

Confidential, MN

QA Analyst

Responsibilities:

  • Participated in analyzing and reviewing business requirements and design documents.
  • Coordinated with Technical Leads and Business Analysts to design and implement test scenarios
  • Tested an application on the .NET platform
  • Followed the Agile development methodology, with design and development and the definition of testing/acceptance criteria evolving as the project progressed through the development life cycle.
  • Involved in daily standups, iteration planning and scrum ceremonies.
  • Used the Test-Driven Development methodology
  • Involved in epic and story planning in every iteration.
  • Responsible for creating test plans and scenarios/test cases as well as documenting detailed expected results.
  • Analyzed and documented all test results, including problems and defects.
  • Reported and tracked the status of issues/defects through resolution and escalated issues as necessary in TestTrack Pro
  • Performed regression testing and reviewed results to ensure the system performed functions as described in business requirements and design documents.
  • Developed User Acceptance Test scripts, and carried out UAT.
  • Performed system testing in different browsers and versions: IE, Netscape, and Firefox
  • Conducted functional, system, and regression testing during various phases of development
  • Tested every new build manually (sanity testing) and reported when the most important functionality of the build failed
  • Wrote SQL queries to check proper data population in application database tables (a sample check follows this list)
  • Assisted in the creation, preparation, and implementation of systems quality assurance reviews.
  • Applied the CMMI method to all testing documents, such as the test plan, test strategy, requirements validation matrix, and test cases
  • Developed and executed the system/master test plan and test scripts.
  • Interacted with technical teams and business users to share information and clarify testing
  • Involved in producing test results in a customized format to help developers quickly fix bugs and report back to the testing team for further testing instructions.
  • Extensively used Quality Center to log defects and bugs
  • Created basic to complex test plans using templates and guidelines.
  • Performed moderately complex test data conditioning, regression testing, and test validation.
  • Logged, tracked, and verified resolution of software and specification defects (issues).
  • Provided guidance to less experienced QA analysts.
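
A minimal sketch of the data population checks referenced above; the application schema, tables, and columns (app.orders, app.customers) are hypothetical:

    -- Hypothetical check that required fields are populated for records created through the UI
    SELECT COUNT(*) AS missing_required_fields
    FROM   app.orders
    WHERE  customer_id IS NULL
       OR  order_date  IS NULL
       OR  status      IS NULL;

    -- Hypothetical referential check: every order should reference an existing customer
    SELECT o.order_id
    FROM   app.orders o
    LEFT JOIN app.customers c ON c.customer_id = o.customer_id
    WHERE  c.customer_id IS NULL;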

Environment: Quality Center, IBM Rational ClearQuest, Win XP, MS Excel

Confidential, MN

QA Automation Engineer

Responsibilities:

  • Lead a QA team in developing strategies for manual and automated testing.
  • Involved in testing software for a sales application called ZIMS, animal management software subscribed to by zoos all over the world.
  • Coordinated with stakeholders regarding software upgrades and issues, and planned and created user stories and test cases based on their feedback.
  • Tested the application, which was built on VB.NET
  • Installed, set up, and configured Quick Test Professional and created thorough business test cases with it
  • Mentored team members in developing test scripts for automation of the application using VBScript.
  • Implemented the Software Development Methodology in the testing life cycle.
  • Executed test cases manually and reported bugs using Quality Center as well as TFS
  • Scripted test suites in QTP, including custom functions in VBScript, to regression test builds.
  • Used Quality Center to check out the latest versions of the build for testing purposes, and to check in updated test cases and test documentation periodically
  • Maintained bug lists for critical issues using Quality Center
  • Performed functional testing on the Web applications for ZIMS by creating automated scripts in QTP.
  • Executed tests in QTP and prepared metric reports on test case coverage, progress, regression, and test cases passed and failed.
  • Performed Manual Testing of Web based application using Quality Center and Automated by using QTP (Quick Test Professional).

Environment: QTP, Quality Center, Win XP, SQL Server 2005, MS TFS, MS Excel

Confidential, Minneapolis, MN

IT Support/QA Tester (full-time)

Responsibilities:

  • Plan, manage and execute the deployment of new installations and upgrades of new computer hardware and software
  • Managing user accounts for existing computing systems as well as managing printing needs.
  • Used HEAT to log customer calls and keep track of problems
  • Managing Computer labs, provide technical support and training to students, faculty and staff.
  • Imaged computers using Symantec Norton Ghost
  • Performed network testing to find defects in the network
  • Assist the system administrator in network and server management.
  • Assist in development and delivery of training material, cabling installations, research and provide recommendations for new computing technologies for enhancing the existing computing environments
  • Took part in loading applications in computer labs, prepared documentation for new procedures, and revised documentation for existing procedures
  • Took part in testing the university website
  • Performed unit, smoke, regression, system and browser testing on the website
  • Provided suggestions to improve the quality of the website
  • Business requirements gathering and analysis.
  • Provide input into developing and modifying systems to meet client needs and develop business specifications to support these modifications
  • Assist the architecture team in the logical and physical dimension modeling. Interact with Management to identify key Dimensions and Measures for business performance.
  • Formulated a testing strategy to test all touchpoints/levels of data
  • Worked directly with the developers to test new and existing functionalities.
  • Was involved in various internal and external meetings to gain a clear understanding of the business requirements and to represent the QA perspective.

Environment: Microsoft PowerPoint, Word, Excel, Outlook
