
R12 Tester Resume

Philadelphia, PA

SUMMARY:

  • Over 10 years of professional experience in Information Technology with extensive experience in performing Manual and Automated Testing
  • Experience in developing reports by using Oracle Reports and BI Publisher
  • Previous development experience with Object Oriented Programming using Java/J2EE.
  • Working knowledge of BC4J components and good understanding of MVC architecture and SOA architecture.
  • Experience working with Agile/Scrum, Waterfall and V-model life cycle methodologies.
  • Very good experience in analyzing Business Requirements, Functional Requirements and design documents to formulate Test Plans and Test Cases.
  • Hands on experience with all phases of Software Development Life Cycle. Involved in entire QA Life Cycle, which includes Designing, Developing and Execution of the entire QA Process and Documentation of Test Strategy, Test Plans, Test Cases, Test Procedures and Test Scripts for ERP Applications, Java/J2EE, Web based and Client/Server applications.
  • Involved in automation framework design and performance analysis for a variety of projects, including product development, maintenance and migration projects.
  • Testing experience includes Unit Testing, Functional Testing, Performance Testing and Regression Testing.
  • Hands-on experience with automation tools such as Quick Test Pro 9.5, Load Runner 8.0.1, Test Director 8.0/Quality Center 9.2 and Win Runner 8.2.
  • Good experience in Claims Administration System (Life and Group Insurance).
  • Experience working with multiple relational databases such as Oracle 9i/8i/7.x and SQL Server 2000, as well as MS Access and flat files, using tools such as PL/SQL Developer.
  • Experience applying QA methodology and QA validations to ensure quality assurance and control.
  • Managed multiple projects and a team of around 8 people, identifying the roles and responsibilities of team members and serving as the main point of contact for the project manager.
  • Ability to handle multiple tasks and work independently as well as in a team.
  • Experienced in interacting with Business/Technology groups and analyzing business needs; experience coordinating with Business Analysts, Functional Teams and Developers.
  • Good communication, interpersonal, analytical and problem-solving skills.

EDUCATION:

Master of Science (Computers).

TECHNICAL ENVIRONMENT:

Operating Systems : Windows NT/XP/2000, 2003 Server, UNIX and Linux
Testing Tools : Manual Testing, Quick Test Pro 9.2/9.5, Win Runner 8.0, Load Runner 8.01, Quality Center 9.2, Test Partner, QA Wizard, Test Track Pro, Bugzilla
Technical Expertise : Oracle Applications, Java/J2EE, JDeveloper, JSP, Servlets, Struts, XML/XSLT, HTML, JavaScript, VB Script, JDBC, Forms 10g, Reports 10g, BI Publisher, SQL Server, TOAD, PL/SQL Developer, Hibernate

Major Assignments:

Confidential, Philadelphia, PA May’ 2009 – Nov’ 2009

  • Role: R12 Tester

SunGard is one of the world’s leading software and IT services companies, serving more than 25,000 customers in more than 70 countries, including the world’s 25 largest financial services companies.
As an R12 Tester, I was involved in manual and automation testing of their R12 Oracle Applications (Financial and HRMS modules).

Responsibilities:

  • Development and implementation of Test cases, Test plans, Test data and Test scripts according to the requirements within an Agile/Scrum methodology.
  • Responsible for identifying/testing setup items for different functional areas in the General Ledger, Payables, Receivables, Fixed Assets, OTL, HR and other modules.
  • Executed SQL Queries to verify the dataflow from the backend.
  • Performed White Box Testing for JSP pages, BC4J components, Reports, Forms, Conversion programs, Interfaces and validated PL/SQL Packages, Procedures and API’s
  • Involved in intensive testing of Interface programs, which move the legacy data to Oracle tables.
  • Gained strong experience in the period-close/month-end process across all modules.
  • Created requirements in Quality Center and generated traceability matrices to ensure that all requirements were covered by test cases.
  • Keeping Track of Defects in HP Quality Center.
  • Coordinated with different users for the Functional Specifications and final utility of the modules.
  • Extensively used Quick Test Pro functions to create automated scripts by using VB Script (Descriptive Programming)
  • Created Transactions and reusable actions by using Quick Test Pro 9.5.
  • Used manual testing methods to test Invoice rules, Payment terms and Transaction types in Accounts Receivable module.
  • Performed Black box and White box testing to test the custom forms and reports in different modules.
  • Involved in defect tracking and created various defect reports.
  • Regression testing on weekly builds.
  • Weekly Status meeting with Development and Management teams to discuss bugs and other issues.
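The backend data-flow checks mentioned above can be sketched as follows. This is an illustrative example only: the table and column names are hypothetical stand-ins (using SQLite for self-containment), not the actual R12 schema or the project's queries.

```python
import sqlite3

# Illustrative backend data-flow verification: compare a staging (interface)
# table against the target base table after an interface run. Table and
# column names are hypothetical, not the actual Oracle R12 schema.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE ap_invoices_interface (invoice_num TEXT, amount REAL);
    CREATE TABLE ap_invoices_all       (invoice_num TEXT, amount REAL);
    INSERT INTO ap_invoices_interface VALUES ('INV-1', 100.0), ('INV-2', 250.5);
    INSERT INTO ap_invoices_all       VALUES ('INV-1', 100.0), ('INV-2', 250.5);
""")

def verify_dataflow(cur, src, dst):
    """Return (row_count_match, amount_total_match) between two tables."""
    counts, totals = [], []
    for table in (src, dst):
        cur.execute(f"SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM {table}")
        n, total = cur.fetchone()
        counts.append(n)
        totals.append(total)
    return counts[0] == counts[1], totals[0] == totals[1]

print(verify_dataflow(cur, "ap_invoices_interface", "ap_invoices_all"))
# prints (True, True) when the interface data landed completely
```

In practice such checks compare row counts and control totals between the legacy/interface tables and the Oracle base tables after each interface run.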

Environment: Oracle Apps R12 (GL, AP, AR, FA, OTL, HR), TOAD, PLSQL Developer, Oracle JDeveloper 10g, Forms 10g, Reports 10g, PPM, HTML, JavaScript, VB Script, Mercury Quality Center 9.2, Quick Test Pro 9.5

Confidential, Pittsburgh, PA Jun’ 2008 – May’ 2009

  • Role: Reports Developer / Tester

Eaton Corporation is a diversified power management company and a global technology leader in electrical components and systems for power quality, distribution and control; hydraulics components, systems and services for industrial and mobile equipment; aerospace fuel, hydraulics and pneumatic systems for commercial and military use; and truck and automotive drivetrain and powertrain systems for performance, fuel economy and safety.

Responsibilities:

  • Created Templates and Data Definitions and attached the Data definitions to the Templates in the XML Publisher in different modules.
  • Designed and developed new reports and ad-hoc reports, published them using XML Publisher and analyzed them.
  • Worked on Kintana tool to migrate code in different environments
  • Development and implementation of Test cases, Test plans, Test data and Test scripts according to the requirements.
  • Involved in Preparation of Test Plan (High Level Document).
  • Executed SQL Queries to verify the dataflow from the backend.
  • Responsible for identifying/testing setup items for the AP, AR, GL, Inventory, Order Management, Service Contracts and Service Requests modules.
  • Extensively used Quick Test Pro functions to create automated scripts
  • Created Transactions and reusable actions by using Quick Test Pro.
  • Created requirements in Quality Center and generated traceability matrices to ensure that all requirements were covered by test cases.
  • Developed and maintained the project plan and project resources.
  • Identified product gaps, coordinated with Development Team by creating TARs/ Remedy tickets and obtained one-off fixes. Managed and monitored TARs for several Issues.
  • Keeping Track of Defects in Mercury Quality Center.
  • Extensively used TOAD and PLSQL Developer for database access and PL/SQL Development.
  • Coordinated with different users for the Functional Specifications and final utility of the modules.
  • Involved in intensive testing of interface programs, which move the legacy data to Oracle tables.
  • Involved in Integration, Business, Stress and Functional testing.
  • Used manual testing methods to test Invoice rules, Payment terms and Transaction types in Accounts Receivable module.
  • Performed Black box and White box testing to test the custom forms and reports in different modules.
  • Involved in defect tracking and created various defect reports.
  • Regression testing on weekly builds.
  • Weekly Status meeting with Development and Management teams to discuss bugs and other issues.
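The traceability matrices described above boil down to a requirements-to-test-case mapping with a check for uncovered requirements. A minimal sketch, with hypothetical requirement and test-case IDs (Quality Center generates this from linked entities; the logic is just the following):

```python
# Illustrative requirements-to-test-case traceability matrix.
# Requirement and test-case IDs are hypothetical sample data.
coverage = {
    "REQ-001": ["TC-101", "TC-102"],
    "REQ-002": ["TC-103"],
    "REQ-003": [],              # not yet covered by any test case
}

def uncovered(matrix):
    """Return requirement IDs with no linked test case."""
    return sorted(req for req, tests in matrix.items() if not tests)

print(uncovered(coverage))  # -> ['REQ-003']
```

The point of the matrix is exactly this query: every requirement must map to at least one test case before test execution sign-off.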

Environment: Oracle Applications R12/11i(GL, AP, AR, OM, Service Contracts, Service Requests, Cash Mgmt), Reports 6i/10g, Forms 6i/10g, BI Publisher , Kintana, Mercury Quality Center 9.2, Quick Test Pro 9.2

Confidential, Parsippany, NJ Jan’ 2007 – Apr’ 2008

  • Role: Testing Lead
Travelport is the leader in Web-based travel e-commerce and a provider of the most relevant and cost-efficient technologies and services available to participants throughout the global travel distribution chain. Each day, they process a remarkable 1.1 billion travel transactions.
As a Testing Lead, I was involved in manual testing and automation of their Oracle E-Business Suite (Procure-to-Pay and Order-to-Cash cycles).

Responsibilities:

  • Led the testing effort for the Order Management modules and Financial Modules.
  • Development and implementation of Test cases, Test plans, Test data and Test scripts according to the requirements.
  • Analyzed and helped modify the use cases using MS Visio and created test cases based on them.
  • Responsible for identifying setup items for AP, AR, GL, inventory and Order Management module
  • Developed scenarios to conduct Load Test and Stress testing
  • Created performance test scripts using Load Runner to perform load, stress and Performance tests.
  • Developed VuGen scripts that simulate load patterns with virtual users for documented test scenarios.
  • Developed Load runner scripts to accurately emulate typical customer transactions, so that tests provide a realistic preview of system performance.
  • Created Transactions and reusable actions by using Quick Test Pro.
  • Involved in functional testing using Quick Test Pro
  • Extensively used Quick Test Pro methods to create automated scripts
  • Created requirements in Test Director and generated traceability matrices to ensure that all requirements were covered by test cases.
  • Developed and maintained the project plan and project resources.
  • Testing effort included identifying, developing, and executing test plans for integration testing of Modules involving (INV, OM, PO, AR, GL, AP)
  • Conducted integration testing to verify the data flow from the AP & AR to GL module.
  • Responsible for business analysis and all phases of custom requirements, upgrades and day-to-day problem resolution and Reports/Forms development.
  • Assigned privileges to new users by using System Administrator Responsibility.
  • Coordinated with different users for the Functional Specifications and final utility of the modules.
  • Involved in intensive testing of interface programs, which move the legacy data to Oracle tables.
  • Involved in Integration, Business, Stress and Functional testing.
  • Used manual testing methods to test Invoice rules, Payment terms and Transaction types in Accounts Receivable module.
  • Performed Black box and White box testing to test the custom forms and reports in GL, AR, AP Modules.
  • Involved in defect tracking and created various defect reports.
  • Regression testing on weekly builds.
  • Conducted Performance Testing using Load Runner.
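The load- and stress-test scenarios above follow the LoadRunner pattern of many concurrent virtual users each timing its transactions. A minimal sketch of that idea using Python threads; the transaction body is a stub (in a real scenario it would call the application under test), and the user/iteration counts are made up:

```python
import random
import statistics
import threading
import time

# Illustrative load-test harness: concurrent "virtual users" each run a
# timed transaction, analogous to a LoadRunner scenario. The sleep below
# is a stand-in for a real call to the application under test.
results = []
lock = threading.Lock()

def virtual_user(iterations):
    for _ in range(iterations):
        start = time.perf_counter()
        time.sleep(random.uniform(0.001, 0.005))  # stub transaction
        elapsed = time.perf_counter() - start
        with lock:                 # results list is shared across users
            results.append(elapsed)

# 10 virtual users x 5 iterations = 50 measured transactions
users = [threading.Thread(target=virtual_user, args=(5,)) for _ in range(10)]
for u in users:
    u.start()
for u in users:
    u.join()

print(f"{len(results)} transactions, "
      f"avg {statistics.mean(results) * 1000:.1f} ms, "
      f"p95 ~{sorted(results)[int(len(results) * 0.95)] * 1000:.1f} ms")
```

The aggregate statistics (average, percentiles) are what a performance test report is built from, matching the "realistic preview of system performance" goal described above.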

Environment: Oracle Applications Release 11i (PO, INV, OM, Cash Mgmt, GL, AP, AR), Reports 6i, Forms 6i, J2EE, XML, HTML, JavaScript, Mercury Test Director 8.2, Load Runner 8.1, Quick Test Pro 8.2.

Confidential, Apr’ 2005 – Dec’ 2006

  • Role: Module Lead

The purpose of this project is to select and implement a Claims Administration System; Fineos is the chosen system. While only individual living-benefits products will be introduced as part of the Nexus program, one of the criteria for system selection is the ability to process claims for all products sold by RBC Life Insurance, on both Group and Individual lines of business, in the future.
The current Individual products include Disability, Life, Waiver of Premium, Accidental Death and Dismemberment (AD&D), Long Term Care (LTC), Critical Illness (CI).

Responsibilities:

  • Responsible for leading, coordinating and communicating the status of testing effort
  • Responsible for Functional, Regression Testing and Conversion testing from Off shore.
  • Performed the Claims Adjudication process and tested different types of payments and benefits (such as Single Payments, Recurring Payments, Partial Payments and COLA).
  • Responsible for Bug Verification and Validation against the Requirements using “Test Director” – Manual Testing
  • Responsible for Database Environments maintenance for Testing / Development team
  • Responsible for JAR / WAR creation for different modules
  • Responsible for Deployment and Delivery to OSC
  • Responsible for Post Go Live Defect verification.
  • Ensured the completeness of the test plans with the development teams for different modules
  • Designed Test cases and Test plan from the requirements.
  • Execution of test scripts in Test Director
  • Developed several data driven and individual test cases to satisfy both positive and negative testing by using Win Runner 8.2
  • Responsible for the creation, and review of automation scripts using Win Runner
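The data-driven positive/negative testing mentioned above separates test logic from test data, so one script runs many cases. A minimal sketch of the pattern; the payment validator and its rules are hypothetical stand-ins for the claims application's actual checks:

```python
# Illustrative data-driven test mixing positive and negative cases, in the
# spirit of the Win Runner data-driven scripts above. The validator is a
# hypothetical claim-payment amount check, not the actual application logic.
def is_valid_payment(amount):
    """A payment must be a positive number no larger than 1,000,000."""
    return (isinstance(amount, (int, float))
            and not isinstance(amount, bool)
            and 0 < amount <= 1_000_000)

cases = [
    (100.0, True),       # positive: typical recurring payment
    (1_000_000, True),   # boundary: maximum allowed amount
    (0, False),          # negative: zero amount
    (-50, False),        # negative: negative amount
    ("100", False),      # negative: wrong type
]

failures = [(amount, expected) for amount, expected in cases
            if is_valid_payment(amount) != expected]
print(failures)  # -> [] when all data-driven cases pass
```

Adding a new case is a one-line data change, which is what makes the approach attractive for regression suites.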

Environment: Claims Application, Java, JSP, CVS, VBScript, WebLogic, WebSphere, Apache Tomcat, XML, SQL Server 2000, Mercury Test Director 7.0, Win Runner 8.2

Confidential, USA Sept’ 2003 – Mar’ 2005

  • Role: Reports Developer / Tester

Responsibilities:

  • Designed and Developed New Reports, Ad-hoc reports using Reports 6i
  • Customized many reports, such as Pick Slip, Packing Slip and Mailing Label for Shipping, and Credit Order Detail, Orders by Item, and Order Detail & Summary reports for Order Management, and created new reports per client requests.
  • Prepared test conditions, test cases and test scripts according to the requirements.
  • Performed functional and automation testing on Purchase Order, Order Management, Shipping Execution, Advance Pricing, Inventory and Receivable modules etc.
  • Involved in intensive testing of interface programs, which move the legacy data to Oracle tables.
  • Involved in Integration, Regression and Performance testing.
  • Performed Black box and White box testing to test the custom forms and reports in PO, OM, and Shipping Modules.
  • Extensively used TOAD for database access and PL/SQL Development
  • Created compiled modules and functions in Win Runner.
  • Involved in defect tracking and created various defect reports.
  • Created GUI Maps to enable Win Runner to identify the various objects in the application. Worked with both ‘Global GUI map’ and ‘GUI map file per test’ modes.
  • Regression testing on weekly builds.
  • Conducted Performance Testing using Load Runner.
  • Created performance test scripts using Load Runner to perform load, stress and Performance tests.
  • Developed VuGen scripts that simulate load patterns with virtual users for documented test scenarios.
  • Developed Load runner scripts to accurately emulate typical customer transactions, so that tests provide a realistic preview of system performance.
  • Conducted integration testing to verify the data flow from the PO & OM to GL module.
  • Executed SQL Queries to verify the dataflow from the backend.
  • Weekly Status meeting with Development and Management teams to discuss bugs and other issues.
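The GUI maps mentioned above decouple test scripts from the UI by mapping a stable logical object name to the physical recognition properties Win Runner uses to find the object. A sketch of that idea as a plain lookup table; the object names and properties here are hypothetical:

```python
# Illustrative sketch of the idea behind a Win Runner GUI map: logical
# names mapped to physical recognition properties, so scripts reference
# stable logical names. Names and properties are hypothetical samples.
gui_map = {
    "LoginButton": {"class": "push_button", "label": "Login"},
    "UserField":   {"class": "edit", "attached_text": "User Name:"},
}

def physical_description(logical_name):
    """Resolve a logical name to its recognition-property string."""
    props = gui_map[logical_name]
    return ", ".join(f'{k}: "{v}"' for k, v in props.items())

print(physical_description("LoginButton"))
# -> class: "push_button", label: "Login"
```

When the UI changes, only the map entry is updated; every script that uses the logical name keeps working, which is the motivation for both the "Global GUI map" and "GUI map file per test" modes.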

Environment: Oracle Applications Release 11i (PO, INV, OM, Cash Mgmt), Reports 6i, Forms 6i, Win Runner 7.01, Mercury Test Director 8.2, Load Runner 8.1.

Confidential, Houston, TX Oct’ 2002 – Aug’ 2003

  • Role: Module Lead

Responsibilities:

  • Involved in the complete Testing Lifecycle activities
  • Create Test Plans and Test Cases in Test Director for the Functional, Integration and Regression testing.
  • Analyzed system requirements and developed a detailed Test Plan and Test Scenarios for functionality and system testing using Test Director. Analyzed business and technical specifications to write test cases and test plans (on-site).
  • Assist and guide project teams to implement and document standards, procedures and plans consistent with QA and Test deliverables for the project team.
  • Tracking the progress of test case planning, implementation and execution results.
  • Responsible for entering and tracking bugs in Test Director, and classifying bugs based on severity and priority.
  • Collected test metrics weekly from the Test Director database, reflecting the current status of test execution and the state of the defects.
  • Designed and developed test cases for putting the application on automated testing using Win Runner.
  • Reviewed Manual Testing Methods and developed and executed Automated Scripts
  • Create, debug and run the test scripts in Win Runner.
  • Created test cases for each piece of functionality and automated the stable, passing test cases for Regression Testing.
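The weekly test metrics described above are essentially rollups of execution status and open defects by severity. A minimal sketch of that rollup; the records below are made-up sample data, not actual project figures:

```python
from collections import Counter

# Illustrative weekly test-metrics rollup, similar in spirit to what was
# pulled from the Test Director database: execution-status counts plus
# open-defect counts by severity. All records are hypothetical samples.
executions = ["Passed", "Passed", "Failed", "Blocked", "Passed", "Failed"]
defects = [("Open", "High"), ("Open", "Low"), ("Closed", "High")]

status = Counter(executions)
open_by_severity = Counter(sev for state, sev in defects if state == "Open")

print(dict(status))            # -> {'Passed': 3, 'Failed': 2, 'Blocked': 1}
print(dict(open_by_severity))  # -> {'High': 1, 'Low': 1}
```

These two counts (how much has run and what is still broken, by severity) are the core of the status reported at the weekly meetings.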

Environment: Java, EJB, JSP, JAXP, XML, XSLT, JTS, Servlets, HTML, SQL Server, Test Director 8.0, Win Runner 7.01 and Windows XP

Confidential, USA Jan’ 2002 – Sep’ 2002

  • Role: Test Analyst

Responsibilities:

  • Designed, Developed and Executed Test Cases
  • Create, debug and run the test scripts in QA Wizard
  • Performed several testing methods, including Ad-hoc, Baseline, Unit, Integration, Functional, GUI and Regression Testing, during the various phases of development.
  • Defects were tracked, reviewed, analyzed and compared using Test Track Pro
  • Worked with development team to ensure testing issues are resolved
  • Participated in Automation Framework Design and Setting up the Test Environment
  • Installation of the Builds

Environment: Java, COBOL, JCL, TRANSACT, SQL Server, IMAGE DB, Windows XP, HP3000

Confidential, USA Oct’ 2000 – Dec’ 2001

  • Role: Test Engineer

Responsibilities:

  • Involved in the complete Testing Lifecycle activities
  • Create Test Plans and Test Cases in Test Director for the Functional, Integration and Regression testing.
  • Analyzed system requirements and developed a detailed Test Plan and Test Scenarios for functionality and system testing using Test Director. Analyzed business and technical specifications to write test cases and test plans.
  • Assist and guide project teams to implement and document standards, procedures and plans consistent with QA and Test deliverables for the project team.
  • Tracking the progress of test case planning, implementation and execution results.
  • Responsible for entering and tracking bugs in Test Director, and classifying bugs based on severity and priority.
  • Collected test metrics weekly from the Test Director database, reflecting the current status of test execution and the state of the defects.
  • Designed and developed test cases for putting the application on automated testing using Win Runner.
  • Reviewed Manual Testing Methods and developed and executed Automated Scripts
  • Create, debug and run the test scripts in Win Runner.
  • Created test cases for each piece of functionality and automated the stable, passing test cases for Regression Testing.

Environment: Java/J2EE, SQL Server, Test Director 7.0, Win Runner 7.01 and Windows XP

Confidential, Spain Aug’ 1999 – Sept’ 2000

  • Role: Java Developer
Responsibilities:

  • Developed Logon process using Java, Servlets, JSP, HTML, LDAP
  • Designed and developed GUI using HTML & JSP
  • Used JavaScript for front-end validations
  • Involved in developing and creating database objects like Tables, Views & Indexes using SQL
  • Involved in unit testing using unit test suites and Integration testing
  • Responsible for JAR / WAR creation for different modules
  • Responsible for Database Environments maintenance for Testing / Development team

Environment: Java, Servlets, JSP, HTML, Struts, Hibernate, Apache Tomcat, Oracle 9i/10g, Eclipse
