- Ten years of cumulative experience in business/systems Quality Assurance, Configuration Management practices, analysis, specifications, verification, and management processes, including QA and CM management tools and resources.
- Adept in Software Development Life Cycle with emphasis on Quality Assurance, using Mercury Interactive tools, Rational Test Suite and QA Methodologies under Windows and UNIX environments.
- Extensive hands-on software testing experience implementing QA methodologies throughout all SDLC phases.
- Proficiency in developing QA standards, Test Plans, Test scenarios, Traceability Matrix and Test Cases based on requirements and functional specifications.
- Experience in analyzing Business Requirements, understanding User Stories, and analyzing Test Conditions.
- Expertise in establishing QA Procedures, proficient in writing Test Plans, documenting the testing procedures and writing Test Cases.
- Skilled in various types of manual testing, including Black-Box Testing, Functional Testing, Module Testing, System Testing, Integration Testing, Regression Testing, UAT, End-to-End Testing, and Backend Database Testing.
- Expert in developing Test Plan, Test Cases, writing Test Scripts by decomposing Business Requirements and developing Test Scenarios to support quality deliverables.
- Well versed in configuring test environments for specific test cases, creating test data, executing automated or manual tests, documenting results, updating defect-tracking systems, and updating the test execution log.
- Proficient in Defect Management, including defect creation, modification, tracking, and reporting, using industry-standard tools such as Quality Center, TFS, ClearQuest, and Jira.
- Followed Industry Standard Agile and Waterfall methodologies to implement the projects.
- Experience in preparing and presenting Test Execution Reports, Defect Reports, and Weekly and Monthly Status Reports to the project team.
- Close interaction with architects and software developers to understand application functionality and navigational flow.
- Participation in Sprint planning meetings, Daily Scrum Calls, QA Team meetings, Project Meetings, Design reviews and Inspections.
Testing Tools/Others: Quick Test Professional 11.00, HP Unified Functional Testing 11.50, HP Service Test 11.20, Selenium IDE 2.2.0, SoapUI 2.5.1/4.0.0/4.5.1, Drools (rules testing), Rational Quality Manager (RQM), Rational Team Concert (RTC), Requisite Pro, SAP Business Objects Enterprise (BOE reporting), IBM Pegasus, Batch Tool, Business Rules Admin tool, Variance tool, Test Director, Salesforce, DOORS, Documentum, SharePoint, MS Access, MS Visio, TOAD, XML, Quality Center ALM 11.0, Rational ClearQuest, Jira v6.1.5
Methodologies: Agile (SCRUM), Waterfall, SDLC
Industry Expertise: Healthcare, Financial, Telecom
Practice Areas: Requirement Gathering, Quality Assurance, Project Management, Production Support, Defect Management, Risk Management, Issue Management, Change Management
Databases: Oracle 11G, MS-SQL Server, DB2.
Languages: C, C++, COBOL, SQL (SQL Developer 1.5.5), Java, JavaScript.
Defect Tracking Tools: IBM Rational ClearQuest, Quality Center 9.0/10.0, Microsoft Team Foundation Server, Bugzilla, Jira.
Version Control: IBM Rational Clear Case
Operating Systems: Windows 95/98/NT/2000/XP/Vista, UNIX, DOS; Secure FX 6.0 (secure file transfer client)
Other: Salesforce.com, Pivotal Tracker, Google Drive, SharePoint, XML, SQL, Sprint Reporting Tool
Senior Salesforce Tester
- Performed complex Salesforce testing tasks for more than 400 User Stories and supported triage of more than 700 defects during In-Sprint testing, Integration Testing, and UAT phases.
- Worked closely with clients and developers during sprints to deliver on-time demos.
- Regularly participated in backlog grooming sessions and tasking sessions, and performed client demonstrations of all User Stories upon completion.
- Drafted the Test Plan/Test Strategy and related documents at the end of each Release, and ensured that all QA deliverables were completed each sprint in ALM, the QA tool.
Environment: Salesforce, Jira, Google Drive, Windows 7
Senior Salesforce Consultant
- Developed Test Cases, Test Plans, and Functional Requirements for each phase of the Hub and Node Dashboard
- Developed use cases for each phase
- Identify, report, and resolve issues, defects, and anomalies as early as possible in the Software Development Lifecycle (SDLC)
- Verified that the system functionality set forth in the User Stories was achieved and confirmed that the system adhered to the requirements documented in the Solution Specifications
- Responsible for providing Test status reports
- Responsible for communicating any risks the testing team faced during each Release
- Wrote a VBA macro to capture the formulas in the Excel sheet and compare them month over month (the data is updated on a weekly and monthly basis)
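The comparison logic behind that macro can be sketched as follows. The original was written in VBA against the live workbook; this is a minimal Python stand-in, and the cell addresses and formulas shown are hypothetical.

```python
# Sketch of the formula-comparison step: each snapshot maps a cell
# address (e.g. 'B2') to its formula string, captured from the workbook.
def diff_formulas(baseline, current):
    """Return cells whose formulas changed between two snapshots."""
    changed = {}
    for cell, formula in current.items():
        if baseline.get(cell) != formula:
            changed[cell] = (baseline.get(cell), formula)
    # Cells deleted since the baseline also count as changes.
    for cell in baseline.keys() - current.keys():
        changed[cell] = (baseline[cell], None)
    return changed

june = {"B2": "=SUM(A2:A10)", "C2": "=B2*0.1"}
july = {"B2": "=SUM(A2:A12)", "C2": "=B2*0.1"}
print(diff_formulas(june, july))  # only B2's range changed
```

Running the diff monthly flags any formula that was overwritten or extended when new data rows were loaded.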
Environment: Salesforce, Sales Cloud, Service Cloud, QC1.11
Salesforce Consultant - Config/BA/QA
- Analyzing Business Requirements, understanding the User Stories, analyzing the Test Conditions
- Understand and document procedures
- Develop, publish, and implement test plans
- Write and maintain test plans and test cases
- Create and maintain test data
- Clearly document and re-verify all defects
- Document test results
- Developed Traceability Matrix and Test Coverage reports
- Developing high-level test design and planning documentation.
- Design, code, test, and execute test case scenarios
- Analysis of test results leading to defect isolation and resolution.
- Involved in the user acceptance testing (UAT)
- Responsible for increasing the unit test coverage of the Salesforce application used by employees of Royal Caninie Solutions and responsible for general bug fixes related to error handling
Environment: SalesForce (salesforce.com, force.com), Pivotal Tracker, Google Drive, Windows 7
BA/ QA/ Configuration Analyst
- Worked on MeF versions 6 through 9, with the most current version under implementation being 9.5
- The tax rules change every tax year, necessitating re-development and re-testing each season
- Responsible for analyzing applications, systems and related customer requirements in order to design, develop, and execute the automated test cases developed in RFT
- Use test documentation to develop and implement automated tests.
- Contribute to test design and test planning to expand existing automated test coverage.
- Used the Pegasus automation tool to create and test all of our test cases
- Pegasus is an interface that allows us to create XML in a correct format to be accepted in the MeF system and to model tax data for the MeF project
- The tax data is derived from XML schemas. Used Pegasus to parse schema and create XML data that conforms to the XML schema
- Performed XML schema validation in the tax returns against the schema using Pegasus
- Created test cases with multiple scenarios where each scenario is a separate macro
- Created macros for each scenario in the test case
- Used IBM Rational Clear Case as version and source control code repository to save and track changes to our test cases. Test cases can be saved into Clear Case directly from Pegasus
- To execute or update test cases, Pegasus can retrieve the test cases either from a local directory on our PC or from Clear Case
- Executed macros for each test case and captured the results for each run
- Reviewed the macros created by team members and the results for macro executions performed by team members
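The schema-validation step above relied on Pegasus, which is proprietary; a simplified stand-in for that check can be sketched with Python's standard library. Full XSD validation would need a third-party library (e.g. lxml or xmlschema), so this sketch only checks well-formedness plus a hypothetical list of required elements.

```python
import xml.etree.ElementTree as ET

# Hypothetical required child elements of a return (the real MeF schemas
# define far more structure than this).
REQUIRED = ["TaxYear", "TIN"]

def check_return(xml_text):
    """Return a list of validation errors; an empty list means pass."""
    try:
        root = ET.fromstring(xml_text)
    except ET.ParseError as err:
        return ["not well-formed: %s" % err]
    return ["missing element: " + tag for tag in REQUIRED
            if root.find(tag) is None]

sample = "<Return><TaxYear>2012</TaxYear><TIN>000000000</TIN></Return>"
print(check_return(sample))  # [] -> passes
```

In the real workflow, Pegasus parsed the XSD itself and generated conforming test data, rather than checking against a hand-written element list.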
Environment: Pegasus Test Suite 14.74, IBM Rational ClearCase, IBM Rational ClearQuest 8.0, JAXP, TIBCO, JDBC, DB2
BA/ QA/ Configuration Analyst
- The Exchange Periodic Data (EPD) process goes through a series of data consistency checks and validation. As part of this process, the incoming EPD XML data goes through TIN Validation and Business Rules Engine (BRE).
- Responsible for testing TIN Validation (TINV), the Business Rules Engine (BRE), and Business Objects Enterprise (BOE) reports
- Responsible for analyzing the requirements and mapping them to the user stories
- Worked with requirements and development teams to understand application requirements and implementation designs
- Analyzed the test conditions and test scenarios, preparing test case shells, writing test cases based upon the requirements given by the requirements team
- Performed Positive/Negative testing to check the functionality of the systems
- Managed and conducted System testing, Integration testing and Functional testing
- Worked with SOAP UI 4.5.1 for Web-services testing
- Used SQL scripts to load test data into tables and SQL developer to execute SQL Queries
- Evaluated existing manual test cases and identified opportunities for automation
- Developed a plan that included both short-term and long-term test automation solutions
- Developed test automation scripts and tools
- Collaborated with project test teams to implement the developed automation
- Analyzed current automation solutions, understood changing business needs, and proposed strategies for enhancement and/or migration to a new automation framework
- Developed Traceability Matrix and Test Coverage reports
- Tracked defects using Clear Quest and generated defect summary reports
- Knowledge of XML editing and testing
- Involved in test executions based on sprints
- Prepared status summary reports with details of executed, passed, failed and deferred test cases
- Participated in peer reviews, updated Test Cases where necessary, and made recommendations
- Responsible for generating the Weekly and Monthly Status Reports
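The data-loading and verification step above (SQL scripts to seed tables, then queries in SQL Developer) can be sketched as follows. This uses in-memory sqlite3 in place of Oracle 11g, and the table, columns, and rows are hypothetical.

```python
import sqlite3

# Seed a test table with sample data, as the SQL load scripts did.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE epd_member (tin TEXT, status TEXT)")
rows = [("111111111", "VALID"), ("222222222", "INVALID")]
conn.executemany("INSERT INTO epd_member VALUES (?, ?)", rows)

# Verification query: count loaded records per validation status.
counts = dict(conn.execute(
    "SELECT status, COUNT(*) FROM epd_member GROUP BY status"))
print(counts)  # one row per status
```

The same pattern (load known data, then query counts or sample rows back) confirmed that each test case started from a predictable database state.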
Environment: Rational Requisite Pro, Rational Team Concert, SoapUI 2.5.1/4.5.1, SAP Business Objects InfoView, webMethods, XML Spy, Windows 7, Sprint Reporting Tool, Rational ClearCase, Rational ClearQuest, SQL Developer 1.5.5, Oracle 11g, Secure FX 6.0
Setup Box SW Consultant
- Created and executed automated software test tools for documenting DIRECTV set-top box performance under a variety of automated test scenarios
- Defined and developed sophisticated test setups, in conjunction with product managers, field support, and design engineering, for testing set-top box designs against published hardware and software requirements
- Designed, developed, and maintained automated web-based and standalone software tools for testing and documenting STB behavior
- Implemented automated scripts to run STB tests
- Performed both manual and automated testing on STBs
- Reviewed test logs, DVR recordings and generated summary reports of findings
- Executed end user tests to find issues that directly impact the customer in addition to tests developed based on DIRECTV specification
- Ensured efficiency in performing product functionality, usability, and interoperability testing according to the approved test plan and test schedule
- Performed logical validation of failures prior to disseminating information to functional groups or suppliers
- Identified, analyzed, and documented software defects, and worked with the software development team to troubleshoot and resolve software/hardware issues
- Provided feedback on test plans and procedures developed by Test Development and End to End Systems groups for the purpose of improving accuracy, clarity and quality of documents and execution
Environment: Java, Linux, Perl, MySQL, ARS (Reporting tool), Rack Remote control tool, QC9.2
Business Rules Engineer
- Reviewed the requirements and developed a general understanding of the changes
- Analyzed highly complex business requirements, and designed and wrote technical specifications based on them
- Reconcile technical specifications document with existing code
- Validated requirements; if there was an issue with the requirements or technical specifications, recorded the incident on the incident tab of the tracker and alerted the rule lead
- Checked that the Rules environment was accessible to accept code, i.e., that no refreshes or migrations were in progress
- Fed new variables and new messages accurately into the Rules Admin tool
- Ensure code messages are classified as internal, external or broker
- Run the rules code through syntax checker
- Complete coding, unit testing and regression testing for 6 specific B2B rules that will need to be modified within the Blaze Rules Engine
- Update the technical specifications with Rule ID provided by the rules admin
- Updated the CODE-DATE column on the versioning tab of the tracker when coding was completed
- Tested the variables and values in each rule for inequalities and boundaries (i.e., migration dates before, on, and after) and for null values
- Responsible for ensuring the rules fire for positive tests and do not fire for negative scenarios; for rules that failed to fire, backtracked through code, tech specs, and requirements until the problem was resolved, and if the problem resulted from an issue with requirements, tech specs, or coding, entered an incident on the incident tab of the tracker and alerted the rule lead
- Managed, coordinated and actively participated in Quality Assurance and Test Automation process of the AOW/LIS applications
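The boundary and null-value checks described above can be sketched against a stand-in rule. The real rules live inside the Blaze engine; the cutoff date, rule logic, and function name here are all hypothetical, illustrating only the before/on/after/null test pattern.

```python
import datetime

# Hypothetical migration cutoff for one B2B rule.
MIGRATION_DATE = datetime.date(2011, 6, 1)

def rule_fires(effective_date):
    """Fire only for dates strictly after the migration cutoff."""
    if effective_date is None:  # null guard, per the null-value tests
        return False
    return effective_date > MIGRATION_DATE

# Boundary tests: before, on, and after the cutoff, plus null.
assert not rule_fires(datetime.date(2011, 5, 31))
assert not rule_fires(datetime.date(2011, 6, 1))
assert rule_fires(datetime.date(2011, 6, 2))
assert not rule_fires(None)
```

Exercising each inequality at its exact boundary is what catches off-by-one mistakes such as `>=` coded where the spec says `>`.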
Environment: Blaze Rules Engine, Unit test tool, Variance tool for regression, Clear Quest, IBM CICS Transaction Server, Visio, Windows XP, XML SPY
- As a QA Tester, performed functional, regression, performance, and UAT testing of functionalities such as member enrollment, benefit administration, billing, claims productivity, and claims adjudication to ensure consistency and to make the system user-friendly and error-free for its customers
- Created the Test Plan, which defined the test environment, the phases of testing, entrance and exit criteria for each phase, and the resources required
- Identified, built and executed Test Cases and Test Sets for Functional, Error Handling, Navigation and Regression in QC
- Manually tested the entire application before the tests were automated
- Involved in HIPAA/EDI Medical Claims Analysis, Design, Implementation and Documentation
- Developed various test cases for testing HIPAA 837I/P and 835(4010)
- Performed validation testing on the application navigation for various scenarios and reported the errors
- Involved in the user acceptance testing (UAT)
- Manually tested all the interfaces
Environment: Web Services, QC 10.0, Microsoft Test Manager, and Oracle 8i.
Quality Assurance Analyst
- Worked on My Account Online, Expansion and Home Page Redesign - Phase, Global Navigation, Personal Loan module etc
- Provided preliminary estimates per the client's requirements
- Provided metrics files required by the project strategy and conducted basic software testing to verify that all functional needs were met
- Execution of test scenarios, scripts, and review of product functionality
- Participated in various meetings and discussed Enhancement and Modification Request issues
- Coordinated effectively between the development and testing teams; verified audit statements, data, etc.
- Derived Test plans from technical specifications and business requirements for covering functional, integration, regression and end-to-end testing
- Monitor and communicate status of showstopper and high severity defects
- Track status of testing coverage (i.e. percentage of test scripts completed/successful)
- Coordinate with PM/Project Lead and development team on all issues/defects created
- Ensure full retest of any fixed issues
- Analyzed system and functional requirements; developed and executed detailed Test Plans, Test Cases, and Test Scripts for testing the functionality using Mercury QuickTest Pro
- Created test cases for various levels of tests such as Integration test, Regression test, System test, End to End test and User Acceptance test for different modules
- Test cases included verification of different pages and objects, automated using various verification points (Object, GUI, Text, and Bitmap); verification of attributes and exception handling for graceful page behavior was also part of the test cases
- Executed test cases manually. Compared and analyzed actual with expected results and reported all deviations to the appropriate individual(s) for resolution
- Used VBScript file to load all the function libraries using Library functions in QTP utilities
- Enhanced the scripts in QTP by applying checkpoints, parameterizations, synchronization point, data driven tests and creating modular tests
- Documented and tracked test scripts, test results, test analysis, and reported the defects using Quality Center
- Close interaction with designers and software developers to understand application functionality and navigational flow
- Prepared detailed Test Metrics on a weekly basis for the project members to know the status of the application
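The data-driven enhancement described above (parameterized QTP scripts fed from a data table) was implemented in VBScript; the pattern itself can be sketched in Python. The login function and data rows here are hypothetical stand-ins for the application under test.

```python
# Data-driven pattern: one test body, many parameter rows, as in a
# QTP data table. Rows and expected outcomes are hypothetical.
test_data = [
    {"user": "alice", "pwd": "ok",  "expect": "home"},
    {"user": "alice", "pwd": "bad", "expect": "error"},
]

def login(user, pwd):
    # Stand-in for the application under test.
    return "home" if pwd == "ok" else "error"

# One iteration per data row; the test logic is written once.
results = [login(row["user"], row["pwd"]) == row["expect"]
           for row in test_data]
print(all(results))  # True: every data row passed
```

Keeping the inputs in a table means new scenarios are added as data rows, with no change to the script body.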
Environment: QTP 8.2 Quality Center 8.2/9.0, VBScripts, J2EE, Java, XML, Oracle 10i, Software Planner (In-house tool)
- Involved in analyzing system design specifications and developed Test Plans, Test Scenarios and Test Cases to cover overall quality assurance testing.
- Exported requirements and test cases into TFS; ran manual and automated tests.
- Updated the RTM daily; sent status reports and daily tracking reports.
- Performed UAT in the UAT environment prior to notifying the customer to start their UAT testing.
- Established the testing objectives up front and ensured they were measurable.
- Documented the objectives and ensured they were prioritized.
- Categorized the issues.
- On the web client side, performed testing of all links in the web pages, the database connection, and the forms used to submit or retrieve user information.
- Tested all internal links.
- Tested anchor links within the same page.
- Tested links used to send email to the admin or other users from web pages.
- Checked for orphan pages.
- Checked for broken links.
- Checked all validations on each field.
- Checked the default values of fields.
- Tested wrong inputs to the fields in the forms.
- Tested options to create, delete, view, or modify forms, where present.
- Usability Testing: tested navigation and checked content.
- Compatibility Testing: browser compatibility, operating system compatibility, printing options, etc.
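The link checks above start with collecting every link on a page; a minimal extractor can be sketched with Python's standard library. Fetching each URL to verify its HTTP status (the actual broken-link check) is omitted here, and the sample page is hypothetical.

```python
from html.parser import HTMLParser

# Collect the href of every <a> tag on a page; a broken-link check
# would then request each collected URL and inspect the status code.
class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links.extend(v for k, v in attrs if k == "href")

page = '<a href="/loans">Loans</a> <a href="mailto:admin@example.com">Mail</a>'
collector = LinkCollector()
collector.feed(page)
print(collector.links)  # ['/loans', 'mailto:admin@example.com']
```

Separating `mailto:` links from page links also covers the email-link and internal-link cases listed above.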