Software Verification Tech Resume
Winona, MN
SUMMARY
- About 7 years of experience in the IT industry, with expertise in manual and automation testing. Hands-on experience in Mercury products (QuickTest Pro, Quality Center); excellent analytical, problem-solving, and documentation skills. Team player with excellent interpersonal and communication skills; an enthusiastic, outgoing individual with the ability to interact well with team members.
- Highly proficient with full-lifecycle QA methodologies and concepts, mainly in manual testing and partially in the automated testing tools QTP (v9.0) and HP QC (v10.0). Excellent analytical and technical skills with a demonstrated capacity for debugging client/server and web applications.
- Experienced in all types of application testing, including manual, unit, integration, system, regression, load, functional, end-to-end (E2E), UAT, and acceptance testing.
- Expertise in performing manual testing using HP QC (version 10.0), the Citrix environment, and the DataRAMP desktop.
- Experienced in defining testing methodologies and designing test plans, test cases, and documentation based on standards for software development and effective QA.
- Extensive knowledge of relational data modeling and relational database concepts such as tables, primary and foreign keys, and views. Proficient in data manipulation using SQL for retrieving data from relational databases (inner joins, outer joins, GROUP BY, ORDER BY, etc.). Exposure to all aspects of software development, troubleshooting, testing, and maintenance.
- Good knowledge of standard SQA methodologies and Agile methodology.
- Experienced in requirement analysis and feasibility analysis, organizing meetings with various stakeholders and end users, and finalizing the Business Requirement Specifications.
- Good knowledge of testing applications on mobile devices; has tested synchronization between the Wi-Fi server and the database for the application.
- Developing and executing QA / UAT test plans, test cases, test results analysis and reporting, defect status matrices.
- Worked with management (feedback after each release, daily testing status) to implement the quality process.
- Automation using Visual Test, Exactor, Selenium and QTP.
- Developed Exactor and Selenium scripts to validate and verify the Reliance system.
- Ability to use Mercury Tools, Quality Center for Defect Tracking and Test Plan documentation.
- Experience in developing and imparting pre and post implementation training, conducting GAP Analysis, User Acceptance Testing (UAT), SWOT Analysis, Cost Benefit Analysis.
- Testing skills include performing regression, integration, volume and stress testing as well as development, execution, maintenance of test plans, test specifications and test scenarios.
- Knowledge of SQL (SQL Server 2005) for transaction testing: validating data entered through the front end against the back-end database.
- Defect management including defect identification, reporting, prioritizing, and tracking using QC or MS Excel (depending on project requirements); extensive experience in maintaining and executing functional and regression scripts using QTP.
- Experience in new implementations, enhancement, and maintenance projects with both offshore and global teams.
- Excellent communication, outstanding leadership, and interpersonal skills, with a clear understanding of business logic and the ability to multi-task, prioritize, and manage increasingly complex issues.
- Flexible, innovative and able to thrive in a fast paced, growth-oriented and time-critical environment.
- Worked on mainframe testing at the beginning of my career; responsible for File-AID batch testing in the client/server architecture.
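As a hedged illustration of the SQL retrieval concepts listed above (inner joins, GROUP BY, ORDER BY), the sketch below uses Python's sqlite3 with made-up customer/order tables; it is not tied to any actual project schema:

```python
import sqlite3

# Illustrative only: hypothetical tables standing in for a real project database.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE customers (customer_id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("CREATE TABLE orders (order_id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL)")
cur.executemany("INSERT INTO customers VALUES (?, ?)", [(1, "Acme"), (2, "Globex")])
cur.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                [(10, 1, 250.0), (11, 1, 100.0), (12, 2, 75.0)])

# Inner join with GROUP BY / ORDER BY: total order amount per customer.
rows = cur.execute("""
    SELECT c.name, COUNT(o.order_id) AS n_orders, SUM(o.amount) AS total
    FROM customers c
    INNER JOIN orders o ON o.customer_id = c.customer_id
    GROUP BY c.name
    ORDER BY total DESC
""").fetchall()
print(rows)  # -> [('Acme', 2, 350.0), ('Globex', 1, 75.0)]
```

The same pattern (query the back end, compare against what the GUI showed) is what "back-end validation" means in the bullets that follow.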
TECHNICAL SKILLS
Microsoft Tools: MS Visio, MS Office, MS Outlook.
Databases & Languages: HTML, SQL, C, MS SQL Server, MS-Access 2.0.
Testing Tools: QuickTest Professional (10.0), Quality Center (10.0), Manual testing, Silk Performer, Selenium, Silk Test, Black Box, Automation, Exploratory, Ad-Hoc, Negative, Regression, Unit, Integration, System, Verification and Validation, End-to-End, Agile framework, User Acceptance Testing (UAT)
Methodologies & Standards: SEI-CMMI, ISO 9001:2000, RUP, Six Sigma, SDLC, QA, QTC, Scrum, Agile, Waterfall
Operating System: Windows 95/98/NT/2000 Server/XP
Skills: UML, RUP, SDLC, JAD
Web Technologies: HTML, VB Script
Defect-Tracking Software: Mercury Quality Centre (v10.0)
PROFESSIONAL EXPERIENCE
Confidential, Winona MN
Software Verification Tech
Responsibilities:
- Manual and automation testing using HP Quality Center 10.0.
- Analyzed requirements documents and use cases to prepare detailed test plans and test cases.
- Wrote test cases to test the application manually in HP Quality Center and automated them using QuickTest Pro.
- Worked with developers to resolve and fix the faults found while testing the structure and functionality of the application.
- Analyzed software requirement specification documents. Created process model diagrams in UML/Visio using the business process information captured in the business context analysis documents.
- Assisted with, developed, and implemented detailed test plans, test cases, test scripts, and test scenarios based on functional and business requirements.
- Reviewed business manuals and the business requirements document (BRD) to summarize system-specific business rules and other operating conditions.
- Analyzed the business flow of the application.
- Created test scenarios, test scripts, and test cases (common and provocative) corresponding to the test requirements in order to maximize verification coverage of system variables.
- Carried out Integration testing to ensure data processing, interface validity and proper communication among components of each application.
- Wrote SQL queries to verify database tables for the data inserted from the GUI.
- Created test data before manual test runs for positive and negative testing.
- Created a Requirements Traceability Matrix using Quality Center to ensure comprehensive test coverage of requirements.
- Used Quality Center as the repository for maintaining test cases, executing them, and tracking defects.
- Performed sanity and smoke testing for each new build of the .NET, Java, and C++ applications.
- Performed Black Box Testing, Back-End Testing, Regression Testing, Ad-Hoc Testing, End-to-End Testing, Positive and Negative Testing for AUT.
- Attended walkthrough meetings with the business analysts, project managers, and developers and provided feedback accordingly. Often discussed enhancement and modification request issues (Change Requests).
- Created test execution status and summary test reports for managers.
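The Requirements Traceability Matrix mentioned above can be sketched in a few lines of Python; the requirement and test-case IDs here are invented, and a real project would pull this data from Quality Center:

```python
# Hypothetical RTM: map each requirement to the test cases that cover it.
rtm = {
    "REQ-001": ["TC-101", "TC-102"],
    "REQ-002": ["TC-103"],
    "REQ-003": [],  # no test case yet -> coverage gap
}

# Requirements with no covering test case, and overall coverage ratio.
uncovered = sorted(req for req, cases in rtm.items() if not cases)
coverage = 1 - len(uncovered) / len(rtm)
print(uncovered)           # -> ['REQ-003']
print(round(coverage, 2))  # -> 0.67
```

Listing the uncovered requirements is exactly what the matrix is for: proving every requirement traces to at least one executed test case.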
Confidential, Minneapolis MN
QA Lead
Responsibilities:
- Analyzed software requirement specification documents. Created process model diagrams in UML/Visio using the business process information captured in the business context analysis documents.
- Based on the client's unique requirements, developed test scenarios for the individual modules and products to be tested during System Testing. The Test Scenario document details the conditions under which the product or module is tested during System Testing. Test Scenario documents are stored in the appropriate project folder in the Implementation Document Repository in Microsoft SharePoint.
- Created and updated test cases and test scripts.
- Reviewed Selenium automated test scripts with the automation team.
- Executed and reviewed test cases and test scripts for testing proprietary embedded software.
- Documented software bugs in TestTrack defect management databases.
- Performed Test Track verification for software bug fixes.
- Performed ad hoc testing on engineering releases to help uncover and minimize any major anomalies or issues.
- Led the execution of complex test cases/scripts and interpreted/analyzed results, working with other software engineers to ensure that all issues were resolved in a timely manner.
- Automated acceptance test using Exactor and Selenium.
- Verified Test Track issues (entered in the Test Track defect management database) for accuracy in bug fixes, ensuring that the issues reported as fixed had been accurately resolved.
- Assisted in teh development of Black Box test scripts for testing insulin pumps to ensure that all software requirement specifications (SRS) were met.
- Reviewed test cases during the development test cycles for accuracy, which helped uncover many serious errors/omissions prior to having the test cases submitted to Quality Control.
- Assisted the Configuration Manager with Engineering Report (ER) support tasks (reviewing/re-reviewing test cases, scanning test cases, etc.) for generating the Engineering Test Reports (ETRs) toward the end of each development test cycle, which greatly reduced the time the Configuration Manager needed to complete the reports.
- Performed functionality tests by running all Phases in all workflows in Reliance System (manual and auto using Selenium).
- Participated in the Global Voices initiatives, in which the team's ideas, suggestions, and recommendations were implemented to improve leadership effectiveness.
- The IC/ADS/SS performs Unit Testing to verify their work is free of defects.
- Unit Testing is conducted in the DRS environment prior to System Testing. When Unit Testing is complete, the DRS database is copied to DRA.
- After the DCS sets up the DRA database for System Testing and the LIVE database for Regression, Interface End-to-End, and User Acceptance Testing, the TE performs Sanity Testing to verify the new database environment has been set up correctly.
- Once the IC, ADS, or SS has verified Unit Testing is complete, the TE begins System Testing in the DRA environment. During System Testing, the TE executes a series of test cases to test the applicable modules and products for defects. Defects discovered in System Testing are entered into the Quality Center tool with a status of New. The IC/ADS/SS validates each defect, determines its severity, and identifies the necessary steps to resolve it. When resolved, the IC/ADS/SS changes the defect status in QC to Fixed.
- The TE retests the defects. If the retest is successful, the TE changes the defect status in QC to Closed.
- The TE produces a daily defect metric report.
- After System Testing is complete, the TE completes the System Test Sign-Off form.
- After System Testing is complete, the DRA database is copied to LIVE.
- After System Testing has successfully completed, the TE begins Regression Testing in the LIVE environment. Regression Testing identifies defects that may have materialized after defects were fixed during System Testing.
- The TE retests all defects in LIVE.
- The TE executes selected test cases in Quality Center.
- The TE enters defects and assigns them to team members in QC.
- The TE retests the defects. If the retest is successful, the TE changes the defect status in QC to Closed.
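The New/Fixed/Closed defect workflow described above can be sketched as a tiny state machine in Python; the transition table is a simplified assumption, not the actual Quality Center configuration:

```python
from collections import Counter

# Simplified transition rules (assumed, not the real QC workflow):
# IC/ADS/SS moves New -> Fixed; the TE's retest moves Fixed -> Closed
# on success, or back to New on failure.
ALLOWED = {
    "New": {"Fixed"},
    "Fixed": {"Closed", "New"},
}

def transition(status, new_status):
    if new_status not in ALLOWED.get(status, set()):
        raise ValueError(f"illegal transition {status} -> {new_status}")
    return new_status

defects = {"D-1": "New", "D-2": "Fixed", "D-3": "Closed"}
defects["D-1"] = transition(defects["D-1"], "Fixed")   # resolved by IC/ADS/SS
defects["D-2"] = transition(defects["D-2"], "Closed")  # TE retest passed

# Daily defect metric report: count of defects per status.
report = dict(Counter(defects.values()))
print(report)  # -> {'Fixed': 1, 'Closed': 2}
```

The per-status count at the end is the kind of figure that feeds the daily defect metric report.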
Confidential, Northfield NJ
Sr. Implementation Test Engineer II
Responsibilities:
- Manual and automation testing for the scuba-diving website using HP Quality Center 10.0.
- Analyzed requirements documents and use cases to prepare detailed test plans and test cases.
- Wrote test cases to test the application manually in HP Quality Center and automated them using QuickTest Pro.
- Worked with developers to resolve and fix the faults found while testing the structure and functionality of the application.
- Performed manual and Selenium testing of a web based application.
- Analyzed software requirement specification documents. Created process model diagrams in UML/Visio using the business process information captured in the business context analysis documents.
- Assisted with, developed, and implemented detailed test plans, test cases, test scripts, and test scenarios based on functional and business requirements.
- Reviewed business manuals and the business requirements document (BRD) to summarize system-specific business rules and other operating conditions.
- Analyzed the business flow of the application.
- Created test scenarios, test scripts, and test cases (common and provocative) corresponding to the test requirements in order to maximize verification coverage of system variables.
- Carried out Integration testing to ensure data processing, interface validity and proper communication among components of each application.
- Wrote SQL queries to verify database tables for the data inserted from the GUI.
- Created test data before manual test runs for positive and negative testing.
- Created a Requirements Traceability Matrix using Quality Center to ensure comprehensive test coverage of requirements.
- Used Quality Center as the repository for maintaining test cases, executing them, and tracking defects.
- Performed sanity and smoke testing for each new build of the .NET, Java, and C++ applications.
- Performed Black Box Testing, Back-End Testing, Regression Testing, Ad-Hoc Testing, End-to-End Testing, Positive and Negative Testing for AUT.
- Attended walkthrough meetings with the business analysts, project managers, and developers and provided feedback accordingly. Often discussed enhancement and modification request issues (Change Requests).
- Created test execution status and summary test reports for managers.
- Developed test scripts in QTP to test the functionality of the application.
Confidential, Washington DC
Lead Test Engineer
Responsibilities:
- Lead system and user acceptance tester for a third-party contract management application that allowed the organization to contract with healthcare providers.
- Planned, wrote, developed, executed, and verified test cases for web-based (VB.NET) health insurance programs, checking for side effects, using HP Quality Center.
- Managed bug defects in HP Quality Center.
- Extensively worked with the Informatica tool on enterprise data warehouse projects.
- Responsible for communicating with healthcare providers and state government insurance payers (Medicare, Medicaid, Blue Cross/Blue Shield) to set up Electronic Data Interchange (EDI) technology for HIPAA electronic insurance claims processing.
- Ensured that customers and vendors provided all necessary HIPAA data formats, communication protocols, and proprietary business specifications; escalated problems against the EDI production system to upper-level management. Assigned problems to the appropriate software or system development team for software changes and system upgrades. The work environment consisted of data mapping analysis, communication protocols, HIPAA transaction sets, C, web technology, and GUIs.
- Lead UAT tester for validating the AUDIT reporting function from front end to back end.
- Designed SQL and stored procedures for regression testing. Trained testers on Mercury Quality Center (MQC) tool for test case development and defects tracking.
- Led and trained testers on mainframe functions to capture insurance claims pricing data.
- Validated that insurance claims pricing functions worked correctly on the legacy system and verified that the same insurance claims were priced correctly by the enterprise-wide insurance claims pricing system.
- Created insurance claims test data and scenarios using MS Excel spreadsheets. Senior QA tester on test data development for web-based, client/server, and mainframe applications.
- Responsible for requirement analysis, scenarios, test cases, test plan development, and test execution. Generated defects and weekly reports, and fine-tuned the QA test process for efficiency and accuracy.
- Supported End-to-End Thread testing. Validated test files for CARE, Facets, and NASCO.
- Supported teh development of an internet marketplace connection between teh Buyer and Supplier of Hospital goods and services.
- Involved in teh modification and upgrade of Lawson Insight 8.0.
- Responsible for QA of teh design implementation and application testing.
- Responsible for documentation analysis of design, requirement and functional specifications as they pertain to testing.
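The legacy-versus-enterprise pricing validation described above boils down to comparing two priced result sets claim by claim; a minimal Python sketch, with fabricated claim IDs and amounts:

```python
# Fabricated test data: prices produced by the legacy system and by the
# enterprise-wide claims pricing system for the same claims.
legacy = {"CLM-1": 120.00, "CLM-2": 89.50, "CLM-3": 240.00}
enterprise = {"CLM-1": 120.00, "CLM-2": 89.50, "CLM-3": 245.00}

# Flag any claim priced differently (beyond a half-cent tolerance).
mismatches = {
    claim: (legacy[claim], enterprise[claim])
    for claim in legacy
    if abs(legacy[claim] - enterprise[claim]) > 0.005
}
print(mismatches)  # -> {'CLM-3': (240.0, 245.0)}
```

Each mismatch becomes a defect to investigate: either the legacy price, the new pricing logic, or the test data itself is wrong.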
Confidential, Mclean VA
Software QA Tester
Responsibilities:
- Analyzed business requirements and developed a roadmap to accomplish testing
- Initiated best testing practices and cross trainee process
- Estimated and scheduled testing efforts for the project
- Installed and configured HP QC over the network
- Administered, customized, and maintained the HP QC environment
- Created, modified, and maintained user groups and profiles
- Managed project requirements and traceability using QC
- Developed Requirements Traceability Matrix (RTM)
- Automated test scenarios for GUI, functionality, boundary, security, and regression testing using QuickTest Pro
- Updated and managed multiple scripts using the Shared Object Repository
- Extensively used Object Spy to view both the run-time object methods and the test object methods associated with an object
- Extensively used QuickTest Pro checkpoints (object property, standard, and database)
- Parameterized test scripts to ensure unique sets of data inputs
- Developed data-driven tests to handle scenarios requiring multiple sets of data
- Performed regression testing by executing the baseline scripts to identify functional issues
- Performed cross-browser testing to ensure compatibility of the application on IE.
- Test management and defect tracking were accomplished using HP QC.
- Analyzed user requirements and software requirement specification documents
- Created test plans and detailed test cases using HP QC.
- Executed test cases manually and identified mismatches
- Created user-defined functions to enhance the maintainability of test scripts
- Created data-driven tests to validate the same scenario with different sets of test data
- Performed regression testing using QTP
- Used SQL queries to retrieve data from the SQL Server database as part of the validation methods
- Performed cross-browser testing to ensure compatibility of the application on IE
- Communicated defects to the developers using the defect tracking tool HP QC.
- Participated in bugs and enhancement review meetings.
- Performed user acceptance testing; interacted with users for execution of test cases in UAT
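The parameterized, data-driven approach above (done in QTP on this project) can be illustrated in plain Python: run one validation routine over many rows of test data. The username rule and the test rows are invented for the sketch:

```python
# Hypothetical validation under test: 3-12 alphanumeric characters.
def validate_username(name):
    return 3 <= len(name) <= 12 and name.isalnum()

# Data table, as in a QTP data-driven test: (input, expected result).
test_data = [
    ("bob", True),          # positive case
    ("jo", False),          # too short
    ("valid_user", False),  # underscore is not alphanumeric
    ("alice123", True),     # positive case
]

# Drive the same check with every row; collect any rows that disagree.
results = [(name, validate_username(name) == expected) for name, expected in test_data]
failures = [name for name, passed in results if not passed]
print(failures)  # -> []
```

The point of the pattern is that adding a scenario means adding a data row, not writing another script, which is what parameterizing QTP scripts buys you.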
Confidential, Westborough, MA
Sr. Software QA Tester
Responsibilities:
- Involved in the complete testing lifecycle.
- Developed test plans and test files for testing Online Data Validation (ODV) systems to ensure that all functional specifications were met.
- Analyzed system requirement specifications and developed test scripts for testing custom and functional enhancements.
- Set up the testing environments.
- Interacted with business analysts to understand the requirements.
- Managed and coordinated offshore testing team.
- Prepared and executed test cases.
- Responsible for Backend Testing, Functionality testing, Regression Testing.
- Responsible for teh releases in ATS.
- Checked the data flow from the front end to the back end and used SQL queries to extract data from the database.
- Hands-on experience working with the Informatica tool on enterprise data warehouse projects.
- Executed SQL scripts to test the back-end database.
- Prepared and executed SQL queries to query the database for data validation.
- Used Quality Center for defect tracking, coordinated the defect resolution process, and generated management reports.
- Interacted with developers to follow up on defects and issues.
- Participated in various meetings and discussed Enhancement and Modification Requests.
- Worked with other development team members to better understand system functionality in order to improve testing quality.
- Trained and oversaw the work done by other members of the testing team, reducing the workload for everyone.
Confidential
Software Mainframe Tester
Responsibilities:
- Responsible for planning testing procedures and test conditions under the instructions of the lead tester
- Performed the tasks of writing test scripts and test cases by referring to the requirement specifications
- Handled responsibilities of analyzing test results and troubleshooting environment issues
- Responsible for verifying database interaction defects as well as conducting performance and capacity testing
- Assigned the tasks of testing and evaluating mainframe interactions with distributed systems
- Handled responsibilities of preparing detailed test cases by analyzing technical requirements
- Assigned the tasks of creating bug reports and tracking testing of systems
- Handled the tasks of cross-system testing and prepared test cases for the project team
- Assigned responsibilities of planning test strategies and executing test cases in the mainframe environment
- Performed other essential job responsibilities as required under the instructions of the lead mainframe tester
- Responsible for the mainframe simulation process used to transmit and process client payroll data before translating it back into the SQL Server 2005 database.
- Provided ownership and accountability for assigned accounts.
- Identified core business requirements and strategy fitting the customer's needs.
- Worked to develop customer self-sufficiency on large-scale Enterprise accounts while maintaining a trusting relationship with the client.
- Developed and engaged in all stages of the implementation, including discovery, production configuration, testing/auditing, documentation, training, and production.
- Worked as a SME on various subjects and trained other employees on Position Control and PayCard.
- Performed user acceptance testing; interacted with users for execution of test cases in UAT
- During the System Testing life cycle, performed different types of testing, such as integration, functionality, regression, and GUI testing.
- Performed regression testing for every new build and system enhancement.
- Modularized the test scripts by creating generic functions that deal with different sets of data
- Worked with the development team to ensure that testing issues were resolved
- Involved in analyzing business requirements, preparing test plans based on the User Requirement Document (URD), and preparing the test scenarios and test cases using Quality Center
- Provided the management and sustain teams with analysis reports and recommendations, which resulted in a redesign of the architecture by the sustain team
- Performed tests on the Source Analyzer user-defined query, Lookups, Target, and Aggregator.
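The mainframe payroll flow described in this role (transmit and process client payroll data, then translate it back into SQL Server 2005) can be sketched as parsing fixed-width records into a relational table. Here sqlite3 stands in for SQL Server, and the record layout (5-digit employee ID, 10-character name, 7-digit amount in cents) is an assumption, not the actual client format:

```python
import sqlite3

# Fabricated fixed-width mainframe records: id(5) + name(10) + cents(7).
records = [
    "00123JONES     0004000",
    "00456SMITH     0003250",
]

def parse(rec):
    emp_id = int(rec[0:5])
    name = rec[5:15].strip()
    cents = int(rec[15:22])
    return emp_id, name, cents / 100.0  # store dollars in the database

# sqlite3 stands in for the SQL Server 2005 target.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE payroll (emp_id INTEGER, name TEXT, amount REAL)")
conn.executemany("INSERT INTO payroll VALUES (?, ?, ?)", [parse(r) for r in records])

# Batch-level check: the loaded total should match the transmitted total.
total = conn.execute("SELECT SUM(amount) FROM payroll").fetchone()[0]
print(total)  # -> 72.5
```

Comparing batch totals before and after translation is a typical File-AID-era control check for this kind of round trip.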
