
Sr. QA Tester Resume


Boston, MA

SUMMARY

  • Over seven years of Software Testing, Development and Quality Assurance of Client/Server and Web-based applications using WinRunner, LoadRunner, Test Director, Quality Center, QuickTest Pro and manual testing.
  • Proficient in Manual and Automated Testing of GUI and functional aspects of Client/Server and Web-based applications across all levels of the SDLC and Software Testing Life Cycle (STLC).
  • Experience with usability requirements and device-compatibility testing for online mobile banking on mobile platforms.
  • Extensive experience in IT as QA Tester/Analyst performing Quality Assurance testing primarily on Client Server and Web Applications.
  • Possess strong knowledge and skills in Software testing and Quality Assurance processes, methodology and tools.
  • Strong understanding of Software Development Life Cycle (SDLC).
  • Proven experience with Manual and Automated Testing Tools such as Mercury Quality Center and Test Director.
  • Strong in analyzing business requirements and in developing and executing Test Plans, Test Scripts and Test Cases.
  • Experience with defect tracking systems like JIRA and Bugzilla.
  • Experience in performing Black Box Testing, White Box Testing, Functionality Testing, Integration Testing, System Testing, Regression Testing, Performance Testing, User Acceptance Testing, End-to-End Testing, Load Testing and Database Testing using SQL.
  • Expert in the MS SQL Server 2005 suite of products, including SSRS, SSAS and SSIS.
  • Knowledge of ETL (Extract, Transform and Load) of data into a data warehouse/data mart and of Business Intelligence (BI) tools like Business Objects modules (Reporter, Supervisor, Designer and Web Intelligence); a representative source-to-target check is sketched after this list.
  • Experience in writing SQL queries and optimizing the queries in Sybase, Oracle and SQL Server.
  • Extensive experience in Unit, Functional, Integration, Regression, User Acceptance, System, GUI, Load, Stress, Performance and Black Box Testing.
  • Proficient in Manual and Automated testing tools, with expertise in QTP and Test Director.
  • Good working experience with Mercury Quality Center 9.0, Test Director 8.0, Rational ClearQuest and Atlassian JIRA 3.6 as test management tools.
  • Extensive experience in Manual Testing, preparing System Test Plans and defining Test Cases.
  • Experience in testing Online Transactional and Analytical Processing systems.
  • Experience in back-end testing with SQL queries.
  • Experience in coordinating with the offshore team.
  • Extensive experience in analyzing business and system requirements.
  • Experience in Robust Defect tracking and reporting.
  • Extensively used MS Office for project documentation.
  • Effective interpersonal and communication skills; a team player who can manage multiple tasks.
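
The back-end and ETL testing summarized above typically boiled down to source-to-target comparisons. A minimal sketch of such a check is shown below; the table and column names (stg_transactions, fact_transactions, txn_amount) are illustrative only, not from a specific project:

    -- Hypothetical staging and warehouse tables; after an ETL load the row counts
    -- and amount totals should match between the two layers.
    SELECT 'STAGING' AS layer, COUNT(*) AS row_count, SUM(txn_amount) AS total_amount
    FROM   stg_transactions
    UNION ALL
    SELECT 'WAREHOUSE', COUNT(*), SUM(txn_amount)
    FROM   fact_transactions;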

TECHNICAL SKILLS

Operating Systems/Languages: Windows 98/ME/XP, UNIX, C, C++, Java, SQL, VB 6.0

Testing Tools: Mercury Interactive WinRunner 7.6/8.0, LoadRunner 7.8/8.0, QuickTest Pro 6.5/8.0, Test Director 7.6/8.2, JIRA, Quality Center 11.0

Web Technologies: HTML, ASP, XML, DHTML, VBScript, JavaScript, Java Servlets, JDBC, Applets

Documentation Tools: MS-Office, MS Project.

Database Tools: SQL Server 2008/2005/2000, Oracle 9i, MS Access

PROFESSIONAL EXPERIENCE

Confidential, Boston MA

Sr. QA Tester

Responsibilities:

  • Responsible for leading and determining the test approach and Test Plan.
  • Helped design, develop and implement test plans, scripts and tools using detailed business requirements provided by the business analysts.
  • Review of Test Plans, Test cases, scripts and results provided by team members.
  • Analyzing and assisting team members to cover End-to-End functionalities.
  • Involved in joint application discussion meetings with customers to understand the requirements thoroughly.
  • Reported software defects via JIRA bug tracking system.
  • Participated in bug review meetings; updated the requirements document as per business user feedback and changes in the functionality of the applications.
  • Coordinated with the development team on the defects raised in UAT.
  • Managed change requests, conducted status review meetings and reported to the business.
  • Participated in testing of the mobile and online site pages of the Banking Application for various clients.
  • Generated test scenarios, test cases and test data; executed tests and created problem reports.
  • Tested application security features, including session expiry, bookmarking, passwords, and compatibility across multiple browser types and encryption levels.
  • Implemented several text alert and text reminder services.
  • Performed mobile application testing on mobile phone emulators for iPhone, Android and BlackBerry.
  • Worked with development teams to investigate and correct software bugs and deficiencies based on the testing results.
  • Performed System Integration testing and verified that new and modified functions perform according to the project requirements. This test effort included Functional Testing, Interface Testing, Security Testing, Usability Testing, Negative Testing and Failover Testing.
  • Tested applications in QA, Staging and Production environments.
  • Responsible for all aspects of the testing process, i.e. planning, scheduling, running tests, defect tracking and bug reporting using JIRA.
  • Worked directly with programmers to help resolve issues.
  • Organized and facilitated the daily meeting with the team members and the development team on the project to identify daily and weekly targets, coordinate the workflows, and resolve issues stalling further testing activities in both the QA and development cycles.

Environment: Agile, AP Test Manager (ATM), PL/SQL, SQL Server 2000, DOORS, SOAP, HTML, XML, IIS, Quality Center 11.0 ALM, QTP 8.x, Mercury LoadRunner 8.1, Windows XP, MS Office Suite, MS Visio, MS Access, MS Project.

Confidential, Herndon, VA

Sr. Test Analyst/ QA Tester

Responsibilities:

  • Developed the Test Plan with a Test Strategy for System testing. Instrumental in creating the design and framework for automation.
  • Analyzed the business and functional requirements of the application and helped develop the detailed test plan and test cases.
  • Defined the test criteria and project schedules and baselined the Test Plan with the help of project meetings and walkthroughs.
  • Helped the development team define the scope for Unit testing; although Unit testing is not in the scope of QA efforts, the QA team helped the delivery team put a process in place for Unit testing activities.
  • Requirements Elicitation, Analysis, Communication, and Validation according to Six Sigma Standards.
  • Performed functional testing of the front-end application using QuickTest Professional (QTP).
  • Performed backend testing of the database by writing PL/SQL queries to test the integrity of the application (a representative integrity check is sketched after this list).
  • Involved in testing various New Policies, Amendments, Renewals, Cancellations, etc.
  • Involved in end-to-end testing from Quick Quote to policy activation in both the front-end application and the mainframe.
  • Created use cases, activity reports, logical components and deployment views to extract business process flows and workflows involved in the project. Carried out defect tracking using ClearQuest.
  • Involved in story creation and pair testing as part of agile development process
  • Excellent working knowledge of designing and implementing QA test strategy plans and automated test solutions for client/server and Web applications with the Mercury Interactive Test Suite (LoadRunner, WinRunner, QuickTest Pro and Test Director).
  • Worked with SQL queries using MS Access for data manipulation.
  • Created UNIX shell scripts for processing data files from the source system, for archiving and for running the workflows.
  • Developed and maintained sales reporting using MS Excel queries, SQL in Teradata, and MS Access.
  • Queried the database using SQL for backend testing.
  • Used SDLC (Software Development Life Cycle) methodologies such as RUP and Waterfall.
  • Responsible for business system analysis of customizing the BPS Risk Management product, with involvement through the whole SDLC.
  • Worked on documenting the Master QA Strategy (a new template per ESDM guidelines), defining the QA tasks for Java and data-intensive projects, and customizing the scope of various QA activities based on the nature of the projects.
  • Extensively involved in teh automation of Regression Test Cases by scripting.
  • Worked on the ‘Segmentation Rule Analysis’ document, which was made a QA artifact and used as a Knowledge Session (KS) tool for any new team member.
  • Worked on creating a document putting a process in place, per the Sarbanes-Oxley Act, to mock up real production data for volume testing.
  • Actively involved in Source Driven Testing, Target Driven Testing, Business Cycle Testing, Business Scenario Testing and Data Integration Testing.
  • Worked on different databases such as Oracle to check data consistency.
  • Hands-on experience with different SQL tools such as QueryMan, which was used to test complex queries.
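
The PL/SQL integrity checks mentioned above were typically orphan-record and consistency queries. A minimal sketch, with purely illustrative table and column names (policy, policy_amendment, policy_id, amendment_id), is:

    -- Any rows returned indicate amendments that reference a missing policy,
    -- i.e. a referential-integrity defect between the two hypothetical tables.
    SELECT a.amendment_id, a.policy_id
    FROM   policy_amendment a
    LEFT JOIN policy p ON p.policy_id = a.policy_id
    WHERE  p.policy_id IS NULL;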

Environment: QC 11.0 ALM, Bugzilla, IBM, AP Test Manager (ATM), DOORS, SQL, PL/SQL, SQL Server 2000, SOAP, HTML, XML, IIS, HPQC, Mainframe, Windows XP, MS Office Suite, MS Visio, MS Access, MS Project.

Confidential, Lakewood, CO

QA Analyst/Tester

Responsibilities:

  • Involved in analyzing the business process through Use Cases, Work Flows and Functional Specifications.
  • Analyzed the business and functional requirements of the application and developed detailed test plans and test cases in Test Director.
  • Defined the test criteria and project schedules, and baselined the Test Plan with the help of project meetings and walkthroughs.
  • Involved in Data Modeling of both the Logical and Physical design of the data warehouse and data marts using Star Schema and Snowflake Schema methodology.
  • Helped with Data Mapping between the data mart and the Source Systems.
  • Converted various SQL statements into stored procedures, thereby reducing the number of database accesses (a simplified example follows this list).
  • Worked in mainframe environment and used SQL to query various reporting databases.
  • Analyzed Performance Monitor graphs.
  • Monitored memory leaks, network bottlenecks and response time for each scenario.
  • Involved in the documentation of the complete testing process.
  • Performed User Acceptance Testing.
  • Interacting wif teh development and testing teams to improve overall quality of teh software.
  • Involved in creating periodic status reports.
  • Data modeling and design of the data warehouse and data marts in star schema methodology with conformed and granular dimensions and FACT tables.
  • Created Logical and Physical models for Staging, Transition and Production Warehouses using Erwin.
  • Created GUI, Bitmap, Database and Synchronization Checkpoints to compare the behavior of a new version of the application with the previous version.
  • Developed the Test Strategy, prepared Track Sheets to keep track of the tasks assigned to the Jr. Testers, and resolved issues.
  • Rational ClearCase was used to manage all the changes and change requests in the projects. Worked with the QA team, with the aid of ClearQuest for bug tracking and Mercury Test Director for Test Case Management.
  • Used the Controller component in LoadRunner to create and run both manual and goal-oriented scenarios.
  • Created in-depth reports and graphs in LoadRunner to check where performance delays occurred: network, client and server delays, CPU performance, I/O delays and database locking.
  • Actively participated in bug meetings to resolve defects in an efficient and timely manner.
  • Assured that all QA Artifacts are in compliance with corporate QA Policies and guidelines.
  • Coordinated and communicated the whole QA effort by effectively working with different teams.
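
The stored-procedure conversion called out above generally meant bundling repeated round-trip queries into a single database call. A simplified Oracle PL/SQL sketch, with hypothetical object names (get_order_summary, orders, order_amount), might look like:

    -- Two lookups that were previously issued as separate statements from the
    -- application are combined into one procedure call.
    CREATE OR REPLACE PROCEDURE get_order_summary (
        p_customer_id IN  orders.customer_id%TYPE,
        p_order_count OUT NUMBER,
        p_order_total OUT NUMBER
    ) AS
    BEGIN
        SELECT COUNT(*), NVL(SUM(order_amount), 0)
        INTO   p_order_count, p_order_total
        FROM   orders
        WHERE  customer_id = p_customer_id;
    END get_order_summary;
    /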

Environment: LoadRunner, Test Director, QTP, HPQC, Quality Center, Mainframe, VBScript, MS Excel, DAP, TOAD, UNIX, WebLogic, XML, VB.NET, ASP.NET, Windows XP, Oracle, DOORS, MS Office Suite, MS Visio, MS Access, MS Project.

Confidential, NY

QA Analyst/Tester

Responsibilities:

  • Defined the test criteria and project schedules and baselined the Test Plan with the help of project meetings and walkthroughs.
  • Involved in deciding which manual test cases to convert into automated test scripts, analyzing their lifetime and the time required to update the scripts.
  • Wrote smoke test cases in Quality Center and modified them once they were automated.
  • As part of troubleshooting, helped reconcile index data. Worked in JIRA to maintain, track and update issues.
  • Involved in environment requirement gathering meetings and gave inputs to the environment teams.
  • Manually tested the whole application before proceeding to automated testing.
  • Involved in testing the conversion of the application into a web application using XML web services.
  • Performed UAT along with call center managers to make sure that the application met their requirements.
  • Used Quality Center/Test Director for requirement management, test planning, scheduling, executing test cases, managing and tracking defects.
  • Automated test scripts using QTP for test re-usability across the different online transaction modules.
  • Reported bugs and sent email notifications to the developers using Test Director.
  • Generated the performance strategy and reports, and performed benchmarking, analysis and tuning of the system.
  • Interacted regularly with developers, business analysts and the Logic Unit Workgroup.
  • Prepared documentation for all stages according to company standards.
  • Extensively involved in the automation of Regression Test Cases by scripting.
  • Used the Controller to organize test scripts and to execute and report on scenarios.
  • Performed data-driven testing, correlation and checkpointing in the scripts.
  • Extensively worked on Performance Monitoring and analyzed the response time, memory leaks, hits/sec and throughput graphs.
  • Involved in development and reporting of quality assurance project metrics.
  • Created data-driven tests for positive and negative scenarios by creating and storing data sets in MS Access.
  • Performed backend testing of the database by writing PL/SQL queries to test the integrity of the application and Oracle databases using TOAD.
  • Wrote and modified the required UNIX scripts and SQL validation scripts to validate the outputs (a representative validation query is sketched after this list), analyzed test results and created performance evaluation reports.
  • Involved in functional and regression testing of the web application using the VO extension, and tested the software in different browsers to study browser compatibility.
  • Responsible for collecting and analyzing the test metrics and then submitting the reports, keeping track of the status and progress of the testing effort.
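
The SQL validation scripts mentioned above usually compared the application's actual output against prepared expected data. A minimal Oracle-style sketch, with illustrative table names (expected_results, report_output), is:

    -- Both MINUS directions should return zero rows when the actual output
    -- matches the expected data set exactly.
    SELECT account_id, balance FROM expected_results
    MINUS
    SELECT account_id, balance FROM report_output;

    SELECT account_id, balance FROM report_output
    MINUS
    SELECT account_id, balance FROM expected_results;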
