
Software Verification Tech Resume


Winona, MN

SUMMARY

  • About 7 years of experience in the IT industry, with expertise in manual and automation testing. Hands-on experience with Mercury products (Quick Test Pro, Quality Center) and excellent analytical, problem-solving, and documentation skills. Team player with strong interpersonal and communication skills; an enthusiastic, outgoing individual with the ability to interact well with team members.
  • Highly proficient in full-lifecycle QA methodologies and concepts, primarily in manual testing and partially in the automated testing tools QTP (v9.0) and HP QC (v10.0). Excellent analytical and technical skills with a demonstrated capacity for debugging client/server and web applications.
  • Experienced in all types of application testing: manual, integration, system, regression, load, functional, unit, end-to-end (E2E), UAT, and acceptance testing.
  • Expertise in performing manual testing using HP QC (version 10.0), Citrix environment & DataRAMP desktop.
  • Experienced in defining Testing Methodologies, Designing Test Plans and Test Cases, Documentation based on standards for Software Development and effective QA.
  • Extensive knowledge of relational data modeling and relational database concepts such as tables, primary and foreign keys, and views. Proficient in data manipulation using SQL to retrieve data from relational databases (inner joins, outer joins, GROUP BY, ORDER BY, etc.). Exposure to all aspects of software development, troubleshooting, testing, and maintenance.
  • Good knowledge in standard SQA methodologies and agile methodology.
  • Experienced in Requirement Analysis, Feasibility analysis, organizing meetings with various Stakeholders and the end users and finalizing the Business Requirement Specifications.
  • Good Knowledge of testing the application on Mobile devices. Have tested synchronization between the Wi-Fi server and database for the application.
  • Developing and executing QA / UAT test plans, test cases, test results analysis and reporting, defect status matrices.
  • Involved with Management (Feedback after each Release, Daily Testing Status) to implement Quality Process.
  • Automation using Visual Test, Exactor, Selenium and QTP.
  • Developed Exactor and Selenium scripts to validate and verify the Reliance system.
  • Ability to use Mercury Tools, Quality Center for Defect Tracking and Test Plan documentation.
  • Experience in developing and imparting pre and post implementation training, conducting GAP Analysis, User Acceptance Testing (UAT), SWOT Analysis, Cost Benefit Analysis.
  • Testing skills include performing regression, integration, volume and stress testing as well as development, execution, maintenance of test plans, test specifications and test scenarios.
  • Knowledge of SQL (SQL Server 2005) for transaction testing, validating front-end data against the back-end database.
  • Defect management, including defect identification, reporting, prioritizing, and tracking using QC or MS Excel (depending on project requirements). Extensive experience in maintaining and executing functional and regression scripts using QTP.
  • Experienced in new implementations, enhancements, and maintenance projects with both offshore and global teams.
  • Excellent Communication, Outstanding Leadership and Interpersonal skills with clear understanding of business logic to multi-task, prioritize, and manage increasingly complex issues.
  • Flexible, innovative and able to thrive in a fast paced, growth-oriented and time-critical environment.
  • Worked on mainframe testing at the beginning of my career; responsible for File-AID batch testing in a client/server architecture.
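The SQL-based backend validation mentioned above (inner/outer joins, GROUP BY) can be sketched with a small, self-contained example. Table and column names here are hypothetical, and Python's built-in sqlite3 stands in for SQL Server:

```python
import sqlite3

# In-memory database standing in for the application's backend
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY,
                         customer_id INTEGER REFERENCES customers(id),
                         amount REAL);
    INSERT INTO customers VALUES (1, 'Alice'), (2, 'Bob');
    INSERT INTO orders VALUES (10, 1, 25.0), (11, 1, 75.0), (12, 2, 40.0);
""")

# Inner join + GROUP BY: total order amount per customer,
# to compare against what the GUI displays
cur.execute("""
    SELECT c.name, SUM(o.amount)
    FROM customers c
    INNER JOIN orders o ON o.customer_id = c.id
    GROUP BY c.name
    ORDER BY c.name
""")
totals = dict(cur.fetchall())
print(totals)  # {'Alice': 100.0, 'Bob': 40.0}

# LEFT OUTER JOIN: customers with no orders (orphan-record check)
cur.execute("""
    SELECT c.name FROM customers c
    LEFT OUTER JOIN orders o ON o.customer_id = c.id
    WHERE o.id IS NULL
""")
orphans = cur.fetchall()
print(orphans)  # [] -- every customer has at least one order
```

The same queries, run against the real backend, let a tester confirm that totals shown in the front end match the stored rows.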

TECHNICAL SKILLS

Microsoft Tools: MS Visio, MS Office, MS Outlook.

Databases & Languages: HTML, SQL, C, MS SQL Server, MS-Access 2.0.

Testing Tools: Quick Test Professional (10.0), Quality Center (10.0), Manual testing, Silk Performer, Selenium, Silk Test, Black Box, Automation, Exploratory, Ad-Hoc, Negative, Regression, Unit, Integration, System, Verification and Validation, End to End, Agile framework, User Acceptance Test (UAT)

Methodologies & Standards: SEI CMMI, ISO 9001:2000, RUP, Six Sigma, SDLC, QA, QTC, SCRUM, Agile, Waterfall

Operating System: Windows 95/98/NT/2000 Server/XP

Skills: UML, RUP, SDLC, JAD

Web Technologies: HTML, VB Script

Defect-Tracking Software: Mercury Quality Center (v10.0)

PROFESSIONAL EXPERIENCE

Confidential, Winona MN

Software Verification Tech

Responsibilities:

  • Performed manual and automation testing using HP Quality Center 10.0.
  • Analyzed requirements documents and Use Cases to prepare the detailed Test Plans, Test Cases.
  • Wrote test cases to test the application manually in HP Quality Center and automated them using Quick Test Pro.
  • Worked with developers to resolve and fix the faults found in testing the structure and functionality of the application.
  • Analyzed software requirement specification documents. Created processing model diagrams in UML/Visio using the business process information captured in the business context analysis documents.
  • Assisted with, developed, and implemented detailed Test Plans, Test Cases, Test Scripts, and Test Scenarios based on functional and business requirements.
  • Reviewed business manuals and the business requirement document (BRD) to summarize system-specific business rules and other operating conditions.
  • Analyzed business flow of the application.
  • Created test scenarios, test scripts, and test cases (common and provocative) corresponding to the test requirements in order to maximize verification coverage of system variables.
  • Carried out Integration testing to ensure data processing, interface validity and proper communication among components of each application.
  • Wrote SQL queries to verify database tables for the data inserted from the GUI.
  • Created test data before manual test runs for positive and negative testing.
  • Created a Requirements Traceability Matrix using Quality Center to ensure comprehensive test coverage of requirements.
  • Used Quality Center as a repository for maintaining test cases, executing them, and tracking defects.
  • Performed sanity and smoke testing for each new build of .NET, Java, and C++ applications.
  • Performed Black Box Testing, Back-End Testing, Regression Testing, Ad-Hoc Testing, End-to-End Testing, Positive and Negative Testing for AUT.
  • Attended walkthrough meetings with the Business Analysts, Project Managers, and developers and provided feedback accordingly. Often discussed enhancement and modification request issues (Change Requests).
  • Created test execution status and summary test reports for managers.
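A requirements traceability matrix like the one maintained in Quality Center boils down to a requirement-to-test-case mapping. A minimal sketch (the requirement and test-case IDs are made up):

```python
# Hypothetical requirement -> covering test cases mapping,
# as exported from a traceability matrix
rtm = {
    "REQ-001": ["TC-101", "TC-102"],
    "REQ-002": ["TC-103"],
    "REQ-003": [],  # not yet covered by any test case
}

# Flag requirements with no covering test case and compute coverage
uncovered = sorted(req for req, tcs in rtm.items() if not tcs)
coverage = 1 - len(uncovered) / len(rtm)
print(uncovered)          # ['REQ-003']
print(f"{coverage:.0%}")  # 67%
```

Running a check like this before a release makes gaps in test coverage visible at a glance.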

Confidential, Minneapolis MN

QA Lead

Responsibilities:

  • Analyzed software requirement specification documents. Created processing model diagrams in UML/Visio using the business process information captured in the business context analysis documents.
  • Based on the client’s unique requirements, developed test scenarios for the individual modules and products tested during System Testing. The Test Scenario document details the conditions under which the product or module is tested; these documents are stored in the appropriate project folder in the Implementation Document Repository in Microsoft SharePoint.
  • Created and updated test cases and test scripts.
  • Reviewed Selenium automated test scripts with the Automation team.
  • Executed and reviewed test cases and test scripts for testing proprietary embedded software.
  • Documented software bugs in TestTrack defect management databases.
  • Performed Test Track verification for software bug fixes.
  • Performed ad-hoc testing on engineering releases to uncover and help minimize any major anomalies or issues.
  • Led the execution of complex test cases/scripts and interpreted/analyzed results, ensuring that all issues were resolved in a timely manner with other software engineers.
  • Automated acceptance test using Exactor and Selenium.
  • Verified Test Track issues (entered in Test Track defect management databases) for accuracy in bug fixes, ensuring that the issues reported as fixed have been accurately resolved.
  • Assisted in the development of Black Box test scripts for testing insulin pumps to ensure that all software requirement specifications (SRS) were met.
  • Reviewed test cases during the development test cycles for accuracy, which helped uncover many serious errors/omissions prior to submitting the test cases to Quality Control.
  • Assisted the Configuration Manager with Engineering Report (ER) support tasks (reviewing/re-reviewing test cases, scan test cases, etc.) for generating the Engineering Test Reports (ETRs) towards the end of each development test cycle, which extensively minimized the amount of time needed by the Configuration Manager to complete the Engineering Test Reports.
  • Performed functionality tests by running all Phases in all workflows in Reliance System (manual and auto using Selenium).
  • Participated in the Global Voices initiatives, in which the team’s ideas/suggestions/recommendations were implemented to improve leadership effectiveness.
  • The IC/ADS/SS performs Unit Testing to verify their work is free of defects.
  • Unit testing is conducted in the DRS environment prior to System Testing. When Unit Testing is complete, the DRS database is copied to DRA.
  • After the DCS sets up the DRA database for System Testing and the LIVE database for Regression, Interface End to End, and User Acceptance Testing, the TE performs Sanity Testing to verify the new database environment has been set up correctly.
  • Once the IC, ADS, or SS has verified Unit Testing is complete, the TE begins System Testing in the DRA environment. During System Testing, the TE executes a series of test cases to test the applicable modules and products for defects. Defects discovered in System Testing are entered into the Quality Center tool with a status of New. The IC/ADS/SS validates the defect, determines the severity, and identifies the necessary steps to resolve them. When resolved, the IC/ADS/SS changes the defect status in QC to Fixed.
  • The TE retests the defects. If the retest is successful, the TE changes the defect status in QC to Closed.
  • The TE produces a daily defect metric report.
  • After System Testing is complete, the TE completes the System Test Sign Off form.
  • After System Testing is complete, the DRA database is copied to LIVE.
  • After System Testing has successfully completed, the TE begins Regression Testing in the LIVE environment. Regression Testing identifies the defects that may have materialized after Defects were fixed during System Testing.
  • The TE retests all defects in LIVE.
  • The TE executes selected Test Cases in Quality Center.
  • The TE enters defects and assigns defects to the Team members in QC.
  • The TE retests the defects. If the retest is successful, the TE changes the defect status in QC to Closed.
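The QC defect workflow described above (New on discovery, Fixed on resolution, Closed after a successful retest) can be modeled as a small state machine. The Reopened path for a failed retest is an assumption, since the process above only spells out the successful case:

```python
# Allowed defect status transitions, following the workflow above.
# "Reopened" on a failed retest is an assumed convention, not stated above.
TRANSITIONS = {
    "New": {"Fixed"},
    "Fixed": {"Closed", "Reopened"},
    "Reopened": {"Fixed"},
    "Closed": set(),  # terminal state
}

def advance(status, new_status):
    """Move a defect to new_status, rejecting illegal transitions."""
    if new_status not in TRANSITIONS[status]:
        raise ValueError(f"illegal transition {status} -> {new_status}")
    return new_status

s = "New"
s = advance(s, "Fixed")   # IC/ADS/SS resolves the defect
s = advance(s, "Closed")  # TE retest passes
print(s)  # Closed
```

Encoding the lifecycle this way keeps status changes consistent: an attempt to jump straight from New to Closed raises an error instead of silently corrupting the defect metrics.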

Confidential, Northfield NJ

Sr. Implementation Test Engineer II

Responsibilities:

  • Performed manual and automation testing for the scuba diving website using HP Quality Center 10.0.
  • Analyzed requirements documents and Use Cases to prepare the detailed Test Plans, Test Cases.
  • Wrote test cases to test the application manually in HP Quality Center and automated them using Quick Test Pro.
  • Worked with developers to resolve and fix the faults found in testing the structure and functionality of the application.
  • Performed manual and Selenium testing of a web based application.
  • Analyzed software requirement specification documents. Created processing model diagrams in UML/Visio using the business process information captured in the business context analysis documents.
  • Assisted with, developed, and implemented detailed Test Plans, Test Cases, Test Scripts, and Test Scenarios based on functional and business requirements.
  • Reviewed business manuals and the business requirement document (BRD) to summarize system-specific business rules and other operating conditions.
  • Analyzed business flow of the application.
  • Created test scenarios, test scripts, and test cases (common and provocative) corresponding to the test requirements in order to maximize verification coverage of system variables.
  • Carried out Integration testing to ensure data processing, interface validity and proper communication among components of each application.
  • Wrote SQL queries to verify database tables for the data inserted from the GUI.
  • Created test data before manual test runs for positive and negative testing.
  • Created a Requirements Traceability Matrix using Quality Center to ensure comprehensive test coverage of requirements.
  • Used Quality Center as a repository for maintaining test cases, executing them, and tracking defects.
  • Performed sanity and smoke testing for each new build of .NET, Java, and C++ applications.
  • Performed Black Box Testing, Back-End Testing, Regression Testing, Ad-Hoc Testing, End-to-End Testing, Positive and Negative Testing for AUT.
  • Attended walkthrough meetings with the Business Analysts, Project Managers, and developers and provided feedback accordingly. Often discussed enhancement and modification request issues (Change Requests).
  • Created test execution status and summary test reports for managers.
  • Developed test scripts in QTP to test the functionality of the application.
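Creating test data for positive and negative testing, as described above, means pairing inputs the application must accept with inputs it must reject. A minimal sketch (the booking-ID format is invented purely for illustration):

```python
import re

# Hypothetical input rule: a booking ID is 2 uppercase letters + 4 digits
BOOKING_ID = re.compile(r"^[A-Z]{2}\d{4}$")

positive_cases = ["AB1234", "XY0001"]              # must be accepted
negative_cases = ["ab1234", "AB12", "", "A1B2C3"]  # must be rejected

for case in positive_cases:
    assert BOOKING_ID.match(case), f"positive case rejected: {case}"
for case in negative_cases:
    assert not BOOKING_ID.match(case), f"negative case accepted: {case}"
print("all positive/negative cases behaved as expected")
```

Keeping both data sets next to the rule they exercise makes it obvious which boundary each negative case is probing (lowercase letters, short input, empty input, wrong ordering).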

Confidential, Washington DC

Lead Test Engineer

Responsibilities:

  • Lead system and user acceptance tester for a third-party contract management application that allowed the organization to contract with healthcare providers.
  • Planned, wrote, developed, executed, and verified test cases for side effects in web-based (VB.NET) health insurance programs using HP Quality Center.
  • Managed bug defects in HP Quality Center.
  • Extensively worked with the Informatica tool on Enterprise Data Warehouse projects.
  • Responsible for communicating with Healthcare providers and state government insurance payers (Medicare, Medicaid, Blue Cross/Blue Shield) to set up Electronic Data Interchange (EDI) technology for HIPAA electronics insurance claims processing.
  • Ensured that the customer and vendor provided all necessary HIPAA data formats, communication protocols, and proprietary business specifications; escalated problems against the EDI production system to upper-level management. Assigned problems to the appropriate software or system development team for software changes and system upgrades. The work environment included data mapping analysis, communication protocols, HIPAA transaction sets, C, web technology, and GUIs.
  • Lead UAT tester on validating the AUDIT reporting function from front end to back end.
  • Designed SQL and stored procedures for regression testing. Trained testers on Mercury Quality Center (MQC) tool for test case development and defects tracking.
  • Led and trained testers on mainframe functions to capture insurance claims pricing data.
  • Validated insurance claims pricing functions worked correctly on legacy system and verified that the same insurance claims priced correctly by the enterprise - wide insurance claims pricing system.
  • Created insurance claims test data and scenarios using MS Excel spreadsheets. Senior QA tester on test data development for web-based, client/server, and mainframe applications.
  • Responsible for requirement analysis, scenarios, test cases, test plan development, and test execution. Generated defects and weekly reports, and fine-tuned the QA test process for efficiency and accuracy.
  • Supported End-to-End Thread testing. Validated test files for CARE, Facets, and NASCO.
  • Supported the development of an internet marketplace connection between the Buyer and Supplier of Hospital goods and services.
  • Involved in the modification and upgrade of Lawson Insight 8.0.
  • Responsible for QA of the design implementation and application testing.
  • Responsible for documentation analysis of design, requirement and functional specifications as they pertain to testing.
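Verifying that the legacy system and the enterprise-wide system price the same claims identically, as described above, comes down to a record-by-record comparison of their outputs. A minimal sketch with made-up claim IDs and amounts:

```python
# Claim ID -> priced amount from each system (values are illustrative only)
legacy_prices = {"CLM-1": 120.50, "CLM-2": 80.00, "CLM-3": 45.25}
enterprise_prices = {"CLM-1": 120.50, "CLM-2": 82.00, "CLM-3": 45.25}

# Collect claims the two systems priced differently
mismatches = {
    clm: (legacy_prices[clm], enterprise_prices.get(clm))
    for clm in legacy_prices
    if legacy_prices[clm] != enterprise_prices.get(clm)
}
print(mismatches)  # {'CLM-2': (80.0, 82.0)}
```

Each mismatch becomes a defect candidate: either the enterprise pricing logic diverges from the legacy rules, or the legacy behavior was itself wrong and the requirement needs clarification.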

Confidential, Mclean VA

Software QA Tester

Responsibilities:

  • Analyzed business requirements and developed a roadmap to accomplish testing
  • Initiated best testing practices and cross trainee process
  • Estimated and Scheduled testing efforts for the project
  • Installed and configured HP QC over the network
  • Administered, customized, and maintained the HP QC environment
  • Created, modified and maintained user's groups and profiles
  • Managed project requirements and traceability using QC
  • Developed Requirements Traceability Matrix (RTM)
  • Automated test scenarios for GUI, Functionality, Boundary, Security and Regression Testing using Quick Test Pro
  • Updated and managed multiple scripts using Object Shared Repository
  • Extensively used Object Spy to view both the run time object methods and test object methods associated with an object
  • Extensively used Quick Test Pro checkpoints (object property, standard, and database)
  • Parameterized test scripts to ensure unique set of data inputs
  • Developed Data driven test to handle the scenarios requiring multiple sets of data
  • Performed regression testing by executing the baseline scripts to identify functional issues
  • Performed cross browser testing to ensure compatibility of the application on IE.
  • Accomplished test management and the defect tracking process using HP QC.
  • Analyzed user requirements and software requirement specifications documents
  • Created test plans and detailed test cases using HP QC.
  • Executed test cases manually and identified the mismatches
  • Created user defined functions to enhance the maintainability of test scripts
  • Created data-driven tests to validate the same scenario with different test data
  • Performed Regression Testing using QTP
  • Used SQL queries to retrieve data from the SQL Server database as part of the validation methods
  • Communicated defects to the developers using the defect tracking tool HP QC.
  • Participated in bugs and enhancement review meetings.
  • Performed user acceptance testing, interacted with users for execution of test cases in UAT
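The data-driven tests mentioned above run one scenario against many rows of input data. The same pattern in plain Python (the login rule is a toy system under test, invented for illustration):

```python
# One scenario, many data rows -- the essence of a data-driven test.
def login_allowed(username, password):
    """Toy system under test: non-empty user, password of 8+ characters."""
    return bool(username) and len(password) >= 8

# Each row: (username, password, expected outcome)
test_data = [
    ("alice", "s3cretpass", True),   # valid credentials
    ("alice", "short", False),       # password too short
    ("", "s3cretpass", False),       # empty username
]

results = [login_allowed(u, p) == expected for u, p, expected in test_data]
print(results)  # [True, True, True]
```

In QTP this table would live in a Data Table or external spreadsheet; the benefit is the same in either tool: adding a new input combination means adding a row, not a new script.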

Confidential, Westborough, MA

Sr. Software QA Tester

Responsibilities:

  • Involved in the complete testing lifecycle.
  • Developed test plans and test files for testing Online Data Validation (ODV) systems to ensure that all functional specifications were met.
  • Analyzed system requirement specifications, developed test scripts, and tested custom and functional enhancements.
  • Setting up the Testing Environments.
  • Interacted with business analysts to understand the requirements.
  • Managed and coordinated offshore testing team.
  • Prepared and executed test cases.
  • Responsible for Backend Testing, Functionality testing, Regression Testing.
  • Responsible for the releases in ATS.
  • Checked the data flow from the front end to the back end and used SQL queries to extract the data from the database.
  • Hands-on experience working with the Informatica tool on Enterprise Data Warehouse projects.
  • Executed SQL scripts to test the backend database.
  • Prepared and executed SQL queries to query the database for data validation.
  • Used Quality Center to do defect tracking, coordinated the defect resolution process and generated management reports.
  • Interacted with developers to follow up on defects and issues.
  • Participated in various meetings and discussed Enhancement and Modification Requests.
  • Worked with other development team members to better understand system functionality in order to improve testing quality.
  • Trained and oversaw the work done by other members of the testing team reducing the workload for everyone.
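The management reports generated from Quality Center, as described above, typically roll defect records up by status and severity. A minimal sketch over a hypothetical exported defect log:

```python
from collections import Counter

# Hypothetical defect log exported from Quality Center
defects = [
    {"id": 1, "severity": "High", "status": "Open"},
    {"id": 2, "severity": "Low", "status": "Closed"},
    {"id": 3, "severity": "High", "status": "Closed"},
    {"id": 4, "severity": "Medium", "status": "Open"},
]

# Roll up the raw records into the two views management usually asks for
by_status = Counter(d["status"] for d in defects)
by_severity = Counter(d["severity"] for d in defects)
print(dict(by_status))    # {'Open': 2, 'Closed': 2}
print(dict(by_severity))  # {'High': 2, 'Low': 1, 'Medium': 1}
```

The same tallies, tracked per build, show whether the open-defect count is trending down as a release date approaches.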

Confidential

Software Mainframe Tester

Responsibilities:

  • Responsible for planning testing procedures and test conditions under the instructions of lead tester
  • Performed the tasks of writing test scripts and test cases by referring to the requirement specifications
  • Handle responsibilities of analyzing test results and troubleshooting environment issues
  • Responsible for verifying database interaction defects as well as conduct performance and capacity testing
  • Assigned the tasks of testing and evaluating mainframe interactions with distributed systems
  • Handle responsibilities of preparing detailed test cases by analyzing technical requirements
  • Assigned the tasks of creating bug reports and tracking the testing of systems
  • Handled the tasks of cross-system testing and prepared test cases for the project team
  • Assigned responsibilities of planning test strategies and executing test cases in mainframe environment
  • Performed other essential job responsibilities as required under the instructions of lead mainframe tester
  • Responsible for the mainframe simulation process used to transmit and process client payroll data before translating it back into the SQL Server 2005 database.
  • Provided ownership and accountability for assigned accounts.
  • Identified core business requirements and strategy fitting with the customer’s needs.
  • Worked to develop customer self-sufficiency on large scale Enterprise accounts while maintaining a trusting relationship with the client.
  • Developed and engaged in all stages of the implementation, including discovery, production configuration, testing/auditing, documentation, training, and production.
  • Worked as a SME on various subjects and trained other employees on Position Control and PayCard.
  • Performed user acceptance testing, interacted with users for execution of test cases in UAT
  • During System Testing Life Cycle, performed different types of testing, such as Integration, Functionality, Regression and GUI Testing.
  • Performed regression testing for every new build and system enhancement.
  • Modularized the test Scripts by creating Generic Functions that deal with different sets of data
  • Worked with development team to ensure that the testing issues are resolved
  • Involved in analyzing business requirements and preparing Test Plans based on User Requirement Document (URD) and prepared the Test Scenarios, Test Cases using Quality Center
  • Provided management and the sustain team with analysis reports and recommendations, which resulted in a redesign of the architecture by the sustain team
  • Performed tests on Source Analyzer user-defined queries and the Lookup, Target, and Aggregator transformations.
