QA Tester Resume
Cary, NC
Summary
4+ years of experience in Information Technology, including approximately 4 years of Amdocs experience, with an emphasis on Software Quality Assurance. Worked on testing of stand-alone, Web-based and Client/Server applications using both manual and automated tools. Possess experience in the Telecom industry. Very strong communication and judgment skills, a team-player attitude, international work experience, quick adaptability to ever-changing technology, and confidence are my strongest points.
Specific Expertise
- Excellent working experience in a variety of industries, including Freight, Health Care Providers, Media, Publishing and Telecom
- Excellent working knowledge of Amdocs tools/applications, processes and workflows in the Telecom industry, such as the Amdocs Ensemble Billing System, Amdocs Enabler Billing System, Clarify CRM, OMS and eCare
- Experience in testing through a full system development and testing life cycle.
- Experience in End-to-End System Testing and System Support
- Experience in creating and validating Test Plans, Test Cases and Test Scripts.
- Strong analytical, problem solving, and testing skills
- Worked with all stages of testing namely Integration Testing, System Testing and User Acceptance Testing (UAT).
- Experience in Manual, Regression, Functional and Configuration Testing.
- Experience in Performance Testing, Stress Testing, Security Testing, Sanity Testing of Web-based and Client/Server based applications.
- Worked with, and have an understanding of, release and quality management processes
- Experience in writing SQL to confirm the system is writing and displaying correct data. SQL tools used: Toad, SQL*Plus, MySQL, etc.
- Performed Backend Testing of applications using SQL queries to validate the consistency of data (an illustrative query appears after this list).
- Possess strong experience in using UNIX commands. Used Exceed and PuTTY to access UNIX environments.
- Performed Functional Testing and GUI Testing of applications, checking them against standards and business requirements.
- Experience in documenting test results for corrective actions, reporting and audits.
- Experience in testing performance of web applications.
- Worked with development teams and business users to create and document business scenarios for testing
- Ability to work in a team environment and to communicate well both orally and in writing.
- Extremely diligent, strong team player with an ability to take on new roles.
- Excellent verbal and written communication skills. Exceptional ability to learn new concepts and technologies in very little time.
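As an illustration of the backend data validation described above, the query below is a minimal sketch only; the table and column names (cust_account, billing_summary, account_id, status) are hypothetical placeholders, not taken from any specific project:

    -- Hypothetical consistency check: every active account should have a billing summary row.
    -- Any rows returned indicate data the system failed to write.
    SELECT a.account_id
    FROM   cust_account a
    LEFT JOIN billing_summary b ON b.account_id = a.account_id
    WHERE  a.status = 'ACTIVE'
      AND  b.account_id IS NULL;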
Technical Skills
Operating Systems: Windows 95/98/2000/2003/XP, UNIX (HP-UX)
Web Technology: HTML, XML, VBScript, JavaScript
Software Packages: MS Office and Oracle Forms
Testing Tools: WinRunner, LoadRunner
Bug Reporting Tools: MS Excel, Test Director, AMC, Team Track, Mercury Quality Center, and Bugzilla
Database/SQL Tools: TOAD, MySQL, SQL Server and SQL*Plus
Professional Experience
Confidential, Cary, NC Project: QA Tester Mar 10 - Aug 10
Confidential, Today's FedEx is led by FedEx Corporation, which provides strategic direction and consolidated financial reporting for the operating companies that compete collectively under the FedEx name worldwide: FedEx Express, FedEx Ground, FedEx Freight, FedEx Office, FedEx Custom Critical, FedEx Trade Networks, FedEx SupplyChain and FedEx Services. Through FedEx Services I supported the FRATX project, which merged FedEx Ground and FedEx Home Delivery as part of their re-organizational structure. I was involved in this project as a QA lead, leading various aspects of FRATX while simultaneously carrying out QA responsibilities to meet the deadline.
Responsibilities:
- Gathering Software Requirement Specifications from Business related to FRATX project
- Performing Manual, Regression and Functional testing for FRATX project
- Creation and distribution of the Detailed Test Plan Specification
- Creation of Test Scenarios based on Software Requirements using Microsoft Excel
- Leading the Test Scenario Review with the Application Development Lead and Business Area Test Lead
- Creation of Test Cases based on Software Requirements using Microsoft Excel
- Leading the Test Case Review with the Application Development Lead and Business Area Test Lead
- Lead QA Test Effort with assistance of the Application Development Lead and Business Area Test Lead
- Posted all test scenarios and test cases on SourceForge for the Developer, UAT and QA teams to review and approve
- Reviewed all test scenarios and test cases in person with development and UAT teams
- Executed test cases and updated the results using Microsoft Excel
- Ensured completion of all test cases
- Validated all different reports developed in AS400 that were impacted for FRATX changes
- Log defects in Quality Center during execution of test cases
- Creating a Report using Microsoft Excel on all defects logged during execution for meeting reviews and fixes
- Retest defects as they are fixed
- Working through approximately 18 software specifications for the FRATX project
- Actively mentoring the UAT, Development and QA teams through the test cases, helping to ensure that no existing functionality is broken by the FRATX changes
- Executing test cases myself, in addition to mentoring, for some of the Software Requirements
- Attending weekly meetings with the management team and sharing the Status Report on a weekly basis
- Working closely with Development and UAT teams
Confidential, Port Washington, NY Project: QA Tester Nov 09 - Feb 10
Sandata Technologies improves the Business of Care by providing leading-edge, market-responsive information technology solutions and services to home health and human service agencies performing vital work; these agencies are responsible for the care of our nation's seniors, children, and sick and disabled people. The project at Sandata involved Performance and Manual testing of Web and Client/Server applications. As a QA, I worked with the Global QA and Development teams in analyzing system requirements and data requirements, testing end-to-end test environments, fulfilling data requests, and testing applications.
Responsibilities:
- Experience working with both Web-based and Client/Server applications
- Comparing end results, actual results and performance between the Web-based and Client/Server applications
- Working through and testing different versions of the applications over both HTTP and HTTPS
- Analyzed Business Requirement Document, Software Design Document, Software Requirement Specification and Functional Requirement Document.
- Worked closely with Project Manager, Development Lead, Developers, Network Admins to gather requirements in order to formulate the Performance and Functional Test Plan.
- Executed various LoadRunner scripts using the LoadRunner Virtual User Generator (VuGen)
- Executed various LoadRunner scripts using the LoadRunner Controller
- Analyzed the test results of all scripts using both VuGen and the Controller
- Created the Performance Strategy document for each script based on the Controller and VuGen test results
- Performed auto-correlation, correlating the Controller and VuGen graphs to closely validate virtual user behavior
- Performed validation of automation scripts and test cases, and made comments on changes made in the scripts.
- Performed Load & Performance testing on the Linux and Windows machines.
- Ran approximately 120 reports on conversion data and compared the data between the old version (Mainframe DB/Client-Server app) and the new version (Oracle DB/Web-based app); an illustrative comparison query appears after this list
- Reported bugs based on the comparison between the two databases and the data converted from the old DB to the new DB
- Re-tested/regression-tested the reported bugs after the fixes were applied in QA
- Tested Use Case Scenarios and Test Cases per business requirements with various user loads using Performance Center.
- Modified Use Case Scenarios and Test Cases to add missing steps and ensure testing quality
- Generated LoadRunner reports with Running Vusers, Hits per Second, Throughput, Average Response time and Total Transactions Per Second graphs.
- Created a Performance Report for the end to end testing in comparison with the baseline and final values.
- Uploading the Performance Test Plan, Test Scripts, Scenarios and Final Reports in the Quality Center for every application.
- Extensively used SQL to query the Database to validate the test data for each application
- Also worked closely with the developers on Performance Tuning and other related bugs by gradually scaling the application.
- Faced many challenges and issues and overcame them successfully using my knowledge and expertise.
- Held weekly meetings with the management team and shared the Status Report on a weekly basis.
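To illustrate the conversion comparison mentioned above, the sketch below assumes two hypothetical staging tables, legacy_customer (an extract from the old Mainframe/Client-Server database) and converted_customer (the new Oracle/Web-based database); none of these names come from the actual project:

    -- Hypothetical post-conversion check: legacy rows that are missing from,
    -- or whose balance does not match, the converted Oracle data
    SELECT l.customer_id,
           l.account_balance AS legacy_balance,
           c.account_balance AS converted_balance
    FROM   legacy_customer l
    LEFT JOIN converted_customer c ON c.customer_id = l.customer_id
    WHERE  c.customer_id IS NULL
       OR  l.account_balance <> c.account_balance;

Every row returned by a check like this would then be logged as a potential conversion defect and compared against the corresponding report output.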
Confidential, El Segundo, CA QA Tester Feb 08 - Oct 09
Confidential, is a direct broadcast satellite (DBS) service based in El Segundo, California that was founded in 1990. It transmits digital satellite television and audio to households in the United States, the Caribbean and Latin America except for Mexico. DirecTV is owned by DirecTV Group, which is controlled by Liberty Media.
Responsibilities:
- Leading UAT team and making sure all the new production defects have been tested in UAT before submitting the code into production.
- Understanding the Business Requirement and applications end-to-end functionalities
- Preparation of Detailed Test Plan and Test cases for functional testing.
- Prepared and issued weekly status reports and conducted Defect Review meetings.
- Performed and actively participated in User Acceptance Testing (UAT).
- Ensured Use-Cases were consistent and covered all aspects of the Requirements document.
- Performed Negative and Positive Testing.
- Designed and developed scenarios based on business requirements.
- Executed test cases and reported defects in Mercury Quality Center.
- Supporting Production applications for emergency defect requests
- Validated the SQL scripts to make sure they were executed correctly and meet the scenario description (an illustrative check appears after this list).
- Reviewing and re-testing reported defects in the concerned applications using Manual as well as Automation tools
- Creating test cases and test steps for the reported defects based on the business requirements
- Executing test scripts for functional and regression testing related to the defects fixed using QTP
- Executing the scripts in Firefox and IE and testing browser compatibility using QTP for QA purposes
- Validated various logs generated by Java or Perl programs on UNIX
- Supporting and resolving various data issues reported by different teams
- Troubleshot various issues that came up during test execution
- Writing SQL queries to access the Oracle database using the 'SQLyog' Enterprise tool
- Experience with UNIX commands to access UNIX servers for reading log files, executing shell scripts and running various jobs
- Conducted routine meetings with team members and developers in case of changing requirements for better understanding of requirements.
- Interacting with other teams through walk-through, teleconferences, meetings, etc. to resolve various issues.
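As a sketch of the kind of SQL-script validation referenced above, the query below checks that a hypothetical data fix left the data in the state the scenario describes; the table and column names (subscriber, region, package_code) are illustrative assumptions, not the actual schema:

    -- Hypothetical scenario check: after the fix script runs, no WEST-region
    -- subscribers should remain on the old package (expected count: 0)
    SELECT COUNT(*) AS remaining_on_old_package
    FROM   subscriber
    WHERE  region = 'WEST'
      AND  package_code <> 'HD_PLUS';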
Confidential, Boston, MA Position: QA Tester Apr 07 - Jan 08
Confidential, is one of the leading educational publishers in the United States, publishing textbooks, instructional technology, assessments, and other educational materials for teachers and students of every age. As a QA Engineer/Data Migration and Validation Analyst, I was involved in multiple projects, sometimes subsequently. I wrote the QA requirements and then created the data and process flow diagrams and documents for HMH content management applications in preparation for testing. As a lead, I coordinated the data input for the BTI project into Test Director for the purpose of status and bug tracking. I worked directly with the QA Director and the lead Automation Tester and communicated directly to division owners to create and input test scenarios and test cases into Test Director. I was also largely involved in test execution and bug tracking thereafter.
Responsibilities:
- Involved in gathering business requirement, studying the application and data and collecting the information from developers and business users.
- Worked with and reported directly to the QA Manager, who was offsite.
- Primarily involved in data validation efforts using Microsoft Excel.
- Designed and executed Test Plans and Test Cases, and generated Test Scripts and Test Scenarios.
- Created a daily status report to track data validation efforts.
- Responsible for Back End Testing. Wrote extensive SQL queries for Back End testing
- Reviewed the converted files in XML format to validate the conversion process.
- Analyzed XML data to make sure that the transformation process occurred correctly.
- Compared converted data to the original files in XML format to check discrepancies.
- Involved in Unit testing and Integration testing to verify that data was coming through correctly from the different source systems.
- Coordinated the testing efforts with the Developers, QA Team, the Functional Specialists and the Business Analyst to ensure that the testing requirements were being fulfilled.
- Performed sanity and smoke tests before performing extensive manual functional testing on each application.
- Assisted in Designing, Communicating, and Enhancing QA testing plan for the application.
- Wrote the project description and defined the Design steps in Test Director.
- Manually executed System Test scripts developed in Test Director.
- Performed extensive Manual Functionality Testing on all the applications assigned.
- Performed Referential Integrity checks on the Oracle database (see the sketch after this list).
- Design and execute Test Plans and Test Cases, generate Test Scripts and Test scenarios for eCommerce and Supplemental sites.
- Manually executed testing steps
- Queried the Oracle database using Oracle Enterprise Manager for Back End Testing.
- Optimized SQL queries and interacted with development team to resolve anomalies in the database development.
- Conducted User Acceptance Testing (UAT) and Usability Testing (UT) on the applications under test for projects migrated from Test Director
- Maintained test matrix and bug database and generated weekly reports.
- Actively participated in enhancement meetings focused on strategizing the testing effort of subsequent projects.
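The query below sketches the kind of referential-integrity check mentioned above; the parent/child tables (customer_order, order_line) and their key column are hypothetical stand-ins for whichever related tables were actually under test:

    -- Hypothetical orphan check: child rows whose parent order no longer exists
    SELECT ol.order_line_id, ol.order_id
    FROM   order_line ol
    WHERE  NOT EXISTS (
             SELECT 1
             FROM   customer_order co
             WHERE  co.order_id = ol.order_id
           );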
Confidential, St. Louis, MO Position: QA Tester Apr 06 - Mar 07
Confidential, is a leading global software company focused on Billing, CRM, and OSS for the communications services industry (WISP/WiMAX, ISP VoIP, Web Hosting, IP Video, Gaming, and other IP-based service providers). At IntraISP, I was involved with Production Support and testing production defects for various ClearWire applications such as Full_SignUp, Sales OE, and Boss.
Responsibilities:
- Leading UAT testing and making sure all the new production defects have been tested in UAT before submitting the code into production.
- Coordinating with the Business and the development team on product delivery, and working with the testing team to write test cases and test scenarios using Bugzilla
- Supporting Production applications for emergency defect requests
- Managing and Tracking defects through bug tracking tool Bugzilla
- Reviewing and testing reported defects in the concerned applications using Bugzilla
- Coordinating with concerned developer/development teams for design reviews per the business requirements
- Creating test plan, test cases and test steps for the reported defects based on the business requirements using Bugzilla
- Executing test scripts for functional and regression testing related to the defects fixed using Bugzilla
- Writing SQL statements to access the MySQL database using the 'SQLyog' Enterprise tool (a sample query appears after this list)
- Team coordination and customer interaction for daily reporting
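As a small illustration of the MySQL work noted above, the query below is a hypothetical duplicate check of the kind run through SQLyog while investigating sign-up defects; the signup_request table and email column are placeholders, not the application's real schema:

    -- Hypothetical MySQL check: sign-up requests sharing the same email address
    SELECT email, COUNT(*) AS signup_count
    FROM   signup_request
    GROUP BY email
    HAVING COUNT(*) > 1;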
Education & Training
- Bachelor of Science in Computer Engineering
- MBA (Finance)
- M.S - Technology Management