
QA Analyst Resume


Charlotte, NC

SUMMARY

  • Over 6 years of experience in Information Technology with emphasis on Software Testing/Quality Assurance, Software Testing Life Cycle, System Analysis and Test Methodologies. Experience in testing of Standalone, Client Server and Web based applications using Manual testing techniques and Automation tools.
  • Currently seeking a Testing position in a customer focused and process driven environment where analytical skills and prior experience will add value.
  • Motivated by a keen interest in learning the technical side of an application and in finding defects in order to assess and improve quality.
  • Acquired experience with the full Software Development Life Cycle (SDLC), which includes Planning, Analysis, Design, Development, Testing, Implementation and Support.
  • Developed Test Plans, Test Cases, Test Scenarios and Test Scripts for Manual and Automated testing of web-based/mainframe/database applications to ensure proper business compliance.
  • Experience in understanding Business Process from the requirements and converting them to test scenarios.
  • Expertise in using Mercury tools (Quality Center, WinRunner, Quick Test Professional and Test Director) and performance testing tools such as LoadRunner.
  • Developed checklists and test data along with business users to plan testing activities.
  • Experience in performing System testing, User Acceptance testing, Functional testing, Integration Testing, GUI, Smoke Testing, Database testing, Black box testing, White Box testing and Stress/Performance testing with complete QA cycle - from testing, defect logging and verification of fixed bugs.
  • Sound knowledge of creating Project Status Reports, Testing Status Reports, Metrics Planning & Tracking Sheets, traceability metrics for test coverage, Software Configuration Management Plans, Resource Management Plans, Risk Management Plans and review logs for audit purposes; involved in audit meetings.
  • Expert in developing QTP code/design for the application under test and designing the QTP framework to maximize reusability.
  • Strong experience in testing ETL/DW software using Informatica/Talend; responsible for end-to-end ETL testing.
  • Proficient in SQL and PL/SQL for testing database integrity.
  • Strong Problem Analysis & Resolution skills and ability to work in Multi Platform Environments like Windows and UNIX.
  • Used BI tools to verify the accuracy of reports.
  • Extensive experience in coordinating testing effort, responsible for test deliverables, status reporting to management, issue escalations.
  • Strong knowledge of Software Development Life Cycle, Methodologies and Techniques like Waterfall, Agile and Iterative.
  • Experience in Capital markets, Investment Banking, E-commerce, Database applications and strong understanding of financial systems background.
  • Experience in Installation and Implementing Mercury Interactive Test suite.
  • Excellent logical skills for understanding system workflows, computing and verifying Software Metrics and well suited for communicating with both technical and non-technical professionals.
  • Highly adaptive to a team environment and proven ability to work in a fast-paced team setting with excellent communication skills.

TECHNICAL SKILLS

Operating Systems: HP-UX, AIX, Solaris, RH Linux 4.0, MS-DOS, UNIX, Windows XP/2000/98/95.

Languages: C, C++, Java, SQL, PL/SQL.

RDBMS: Oracle 10g/9i/8i, MS Access, SQL Server 2000, DB2 & Sybase.

Front End Tools: Oracle Developer 2000 (Forms, Reports & Graphs) 6i/9i/10g, Visual Basic 5.0/6.0, HTML

Scripting Tools: UNIX shell script, JavaScript, VBScript.

Testing Tools: WinRunner 7.0, Quick Test Pro 9.2/9.0/8.2, Rational Robot, LoadRunner & MAINFRAME - RUMBA.

Defect Tracking Tools: Test Director 8.0/8.2, HP Quality Center 9.0, JIRA, Rational ClearQuest, Bugzilla

ETL: Informatica PowerCenter, DataStage and Talend

BI Tool: Siebel Analytics 7.5, 7.7 and Oracle Business Intelligence 10.3

Reporting Tool: MicroStrategy

Database Access Tools: TOAD, SQL*Plus, Query Analyzer.

PROFESSIONAL EXPERIENCE

Confidential, Charlotte, NC

QA Analyst

Environment: Quality Center 9.2, Oracle 10g, SQL Server, SQL*Plus, Toad, SQL, PL/SQL, Flex 3.0, MS Access, Informatica 8, Talend, Teradata, SQL Assistant 6.1, Spotfire, MicroStrategy. Testing Type: SIT

Responsibilities:

  • Analyzed requirements during the requirements analysis phase of projects to develop the Test Scripts against the requirement.
  • Kept track of new requirements from the project and forecast/estimated future project requirements.
  • Developed the Test Plan and Test Strategy for manual testing from the business requirements.
  • Involved in the preparation of test scenarios and test data.
  • Executed test cases using valid test data as inputs and ensured that the final outcomes of the tests were satisfactory.
  • Arranged the hardware and software requirements for the test setup.
  • Implemented various integrity constraints for data integrity like Referential integrity using primary key and foreign keys relationships.
  • Executed back-end data-driven test efforts with a focus on data transformations between various systems and a data warehouse.
  • Identified and tracked the slowly changing dimensions/mini dimensions, heterogeneous Sources and determined the hierarchies in dimensions.
  • Responsible for testing all new and existing ETL data warehouse components.
  • Involved in the System Testing of the dashboard/Report Functionality and data displayed in the reports.
  • Validated that fields present in the reports were as agreed in the specifications.
  • Validated drill down features of reports and worked closely with the queries and reporting team to resolve incidents.
  • Analyzed and tested various reports for Hierarchies, Aggregation, Conditions and Filters
  • Arranged meetings with business customers to better understand the reports by using business naming conventions.
  • Experienced at testing ETLs and flat file data transfers without relying on a GUI layer and provided technical suggestions and guidance to QA management for improving QA software testing methodology for data warehouse testing.
  • Executed sessions and batches in Informatica/Talend and tracked the log file for failed sessions.
  • Modified and maintained test cases with changes in application interface and navigation flow.
  • Conducted Smoke Testing on new versions.
  • Used Quality Center to upload requirements and map all test cases to ensure that all requirements were covered.
  • Defect analyzing, defect verification, defect reporting and defect consolidation.
  • Reviewed feedback from developers and performed regression testing.
  • Supported testing activities for any production issues.
  • Responsible for sending daily status reports.
  • Assisted in performing any applicable maintenance on tools used in testing and resolved issues, if any.
  • Checked for timely delivery of different milestones.
  • Documented any problems and worked with the project team to resolve problems identified during the tests.
  • Generated pre-test and post-test reports using the Document Generator option in Quality Center.
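The source-to-target ETL checks described above can be sketched in Python against an in-memory SQLite database standing in for the staging area and warehouse; the table and column names (`stg_orders`, `dw_orders`, `order_id`) are hypothetical:

```python
import sqlite3

def reconcile(conn, source_table, target_table, key):
    """Compare row counts between a source (staging) table and its
    ETL target, and list keys that went missing during the load."""
    cur = conn.cursor()
    src_count = cur.execute(f"SELECT COUNT(*) FROM {source_table}").fetchone()[0]
    tgt_count = cur.execute(f"SELECT COUNT(*) FROM {target_table}").fetchone()[0]
    # Keys present in the source but absent from the target.
    missing = cur.execute(
        f"SELECT {key} FROM {source_table} "
        f"EXCEPT SELECT {key} FROM {target_table}"
    ).fetchall()
    return {"source_rows": src_count, "target_rows": tgt_count,
            "missing_keys": [row[0] for row in missing]}

# Hypothetical demo data: a staging table and a target that dropped one row.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_orders (order_id INTEGER, amount REAL);
    CREATE TABLE dw_orders  (order_id INTEGER, amount REAL);
    INSERT INTO stg_orders VALUES (1, 10.0), (2, 20.0), (3, 30.0);
    INSERT INTO dw_orders  VALUES (1, 10.0), (2, 20.0);
""")
result = reconcile(conn, "stg_orders", "dw_orders", "order_id")
```

In practice the same `COUNT(*)` and `EXCEPT` queries would run against the real source and target schemas through the appropriate database driver.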

Confidential, Charlotte, NC

QA Analyst

Environment: Quality Center 9.2, QTP 10.0, Oracle 10g, SQL*Plus, OBI 10.1.3.3.0, Toad, SQL, PL/SQL, Flex 3.0, MS Access, Informatica Power Center 7, SQL Assistant 6.1. Testing Type: SIT & Automation

Responsibilities:

  • Gathered business rules and scenarios from end users to ensure that accurate business scenarios were tested.
  • Created Activity Diagrams and Workflow Diagrams to depict business processes and analyzed them for their efficiency and effectiveness.
  • Performed Web Testing using Quick Test Pro.
  • Wrote automated VBScripts for Regression Testing using Quick Test Pro.
  • Created Test input requirements and prepared the test data for data driven testing.
  • Used Quick Test Pro data table to parameterize the tests.
  • Used Quick Test Pro's Object Spy to view the property of an Object.
  • Created Standard and Image checkpoint in Quick Test Pro.
  • Used Database Wizard to Connect to Database in Quick Test Pro.
  • Output text to the Data Table and Database to verify the functionality of the application.
  • Created Batch Test using Quick Test Pro.
  • Used synchronization points in Quick Test Pro.
  • Performed Positive, Negative and Ad-hoc testing.
  • Performed System and Integration Testing using Quick Test Pro.
  • Used the Recovery Scenario in Quick Test Pro.
  • Data Validation and Database Integrity testing done by executing SQL, PL/SQL statements.
  • Performed backend testing to test the validity of the data in the reports using complex SQL and PL/SQL queries on the database.
  • Performed data validation between staging and production writing SQL Queries to validate the data loaded.
  • Performed database testing by executing SQL queries and retrieving information from the database.
  • Responsible for monitoring data for porting to current versions.
  • Documented and reported the bugs for the exceptions received through Quality Center.
  • Prepared a traceability matrix for test scripts to find gaps in requirements.
  • Tracked and reported on testing activities, including test results, test case coverage, required resources, defects discovered and their status, performance baselines, etc.
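The data-driven approach above (a QTP data table feeding one script many input rows) can be illustrated with a plain-Python sketch; the login fields and pass/fail rule are hypothetical stand-ins for the application under test:

```python
# Each dict plays the role of one QTP data-table row.
test_data = [
    {"username": "alice", "password": "s3cret!", "expect": "ok"},
    {"username": "",      "password": "s3cret!", "expect": "error"},
    {"username": "bob",   "password": "",        "expect": "error"},
]

def login_stub(username, password):
    """Stand-in for the application under test: both fields are required."""
    return "ok" if username and password else "error"

def run_data_driven(rows):
    """Run the same test logic once per data row, collecting pass/fail."""
    results = []
    for row in rows:
        actual = login_stub(row["username"], row["password"])
        results.append({"row": row, "passed": actual == row["expect"]})
    return results

outcomes = run_data_driven(test_data)
```

The point of the pattern is that adding a new scenario means adding a data row, not writing a new script.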

Confidential - Charlotte, NC

QA Analyst

Environment: JSP, J2EE, JavaScript, HTML, XML, Remedy mail, WebSphere, Web Studio, Oracle 10g, SQL*Plus, PL/SQL, Test Director 8.2, QTP 9.2, LoadRunner 8.1. Testing Type: UAT

Responsibilities:

  • Developed and executed Test plans and Test cases for Automation and UAT.
  • Developed Test Scripts based on the business requirements and technical specifications.
  • Prepared the Test data (Input files) for interpreting the Positive/negative/regression results based on the design requirements.
  • Developed test scripts for GUI, functionality and regression testing using QTP.
  • Performed data-driven testing with QTP to test the application with different sets of data.
  • Created Reusable actions and functions by using QTP.
  • Used a component-based approach to develop automation scripts; this approach reduced the cost of maintenance and maximized the reuse of existing test scripts.
  • Used Toad for writing SQL and Oracle connectivity to run SQL via ADO from QTP.
  • Used XML to set up the application environments for automation scripts.
  • Prepared the manual test cases that weren't covered under the automation process.
  • Developed scripts and scenarios for automated testing of new and enhanced web-based products using Load Runner 8.1 and VuGen 8.1.
  • Determined load accuracy for web based medical imaging applications and other products under development including regression testing.
  • Ran the tests manually, maintained logs in Quality Center, and uploaded requirements and mapped all test cases to ensure that all requirements were covered.
  • Involved in testing of GUI by inserting checkpoints in Quick Test Pro scripts for single or multiple objects and text.
  • Reviewed and tested all pages of the website and compared them against content requirements.
  • Provided a detailed report on performance testing response times for each page and the results from the server monitor report.
  • Tested the data migration to ensure the integrity of data by writing SQL queries.
  • Analyzed, documented and maintained Test Results and Test Logs.
  • Responsible for providing regular test reports to management.
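A hedged sketch of the kind of SQL used above to verify data-migration integrity, with hypothetical `legacy_patients` and `migrated_patients` tables in an in-memory SQLite database:

```python
import sqlite3

# Hypothetical pre- and post-migration tables, with one corrupted value.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE legacy_patients   (id INTEGER, name TEXT);
    CREATE TABLE migrated_patients (id INTEGER, name TEXT);
    INSERT INTO legacy_patients   VALUES (1, 'Ann'), (2, 'Bob');
    INSERT INTO migrated_patients VALUES (1, 'Ann'), (2, 'B0b');
""")

# Rows whose values changed during migration (same key, different data).
mismatches = conn.execute("""
    SELECT l.id, l.name AS before, m.name AS after
    FROM legacy_patients l
    JOIN migrated_patients m ON m.id = l.id
    WHERE m.name <> l.name
""").fetchall()
```

The same key-join/compare query shape applies whatever the real schemas are; only the table and column names change.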

Confidential, Rhode Island

Test Analyst

Environment: Web Application developed in .Net framework 3.5, HP Quality Center 9.2, Parasoft 6.1, XML, SOAP, and WSDL. Testing Type: UAT

Responsibilities:

  • Discussed the SOW and scope with the client to get paper sign-off from the legal department to move forward.
  • Prepared the test approach for testing 600 rules through WSDL using the Parasoft tool.
  • Prepared the Software Test Plan.
  • Checked/reviewed the test case document.
  • Analyzed requirements during the requirements analysis phase of projects.
  • Kept track of new requirements from the project.
  • Forecast/estimated future project requirements.
  • Arranged the hardware and software requirements for the test setup.
  • Developed and implemented test plans.
  • Escalated issues about project requirements (software, hardware, resources) to the Project Manager/Test Manager.
  • Deployed the build in the required setup.
  • Escalated issues in the application to the client.
  • Responsible for preparing agendas for meetings, for example the weekly team meeting.
  • Attended the regular client call and discussed the weekly status with the client.
  • Sent status reports (daily, weekly, etc.) to the client.
  • Filed the defects in QC.
  • Discussed doubts/queries with the Development Team/Client.
  • Conducted internal training on various products.

Confidential

Test Lead

Environment: QTP 8.2, Test Director, Oracle Development Suite 10g, SQL, PL/SQL, Oracle 10g, SQL Developer 1.5, jDeveloper, OC4J Server, VB scripting, Windows XP, Transform Migration Toolkit.

Responsibilities:

  • Analyzed the business requirements of the project by studying the Business Requirement Specification document.
  • Conducted manual testing of Oracle Forms/Reports application, including some development and maintenance.
  • Debugged and analyzed application issues, fixed functionality according to customer's specifications and increased quality of product.
  • Participated in the creation of a web-based test automation management tool designed to store test data, run automated tests and report results using the QTP automation object model.
  • Developed automated Test Scripts in QTP using VBScript for Regression Testing.
  • Developed manual test cases for regression testing based on the requirement documents
  • Participated in structured code reviews/walkthroughs.
  • Created reports using Oracle Report builder and integrated them to existing Oracle Forms application.
  • Populated Oracle DB 10g with test data during functional and regression testing of Oracle Forms application.
  • Performed regression test of modified forms before shipping application to customer.
  • Added features allowing customers to read and print documents using Brava Reader directly from Oracle Forms application.
  • Analyzed Forms application, its patterns, uniqueness and variances before migration.
  • Performed modifications to PL/SQL packages to remove any Forms specific references.
  • Verified product documentation by finding inconsistencies between documentation and the actual product behavior.
  • Prepared a traceability matrix to ensure test case coverage.
  • Used Test Director Tools for organizing and managing the entire process
  • Analyzed defects, prepared defect report and done follow-up with design/development team.
  • Signed off on the application delivery once no defects remained open and all functionality had been tested and was working correctly.
  • Assisted in identifying, managing and resolving QA issues; worked with the QA Lead on post-launch Quality Assurance.

Confidential

Test analyst

Environment: Oracle 9i, SQL, PL/SQL, Oracle Developer 2000 (Forms 6i & Reports 6i), UNIX, TOAD, Windows 2000, Test Director 7.0, QTP 8.2.

Responsibilities:

  • Responsible for creating Test plans, designing test harnesses and test cases, and executing the test cases.
  • Review and creation of Scenarios for system testing.
  • Responsible for QTP manual and automated test development using VBScript and Java in SIT, UAT model office, and production environments to meet business objectives.
  • Coded and implemented an automated in-house stress and load performance test tool to evaluate the effectiveness of newly upgraded and installed software
  • Tested the data extraction procedures designed to extract data into flat feed files.
  • Tested the PL /SQL Stored Procedures, Triggers, Views and Functions to verify the dataflow as per the requirement of the application.
  • Executed test cases for the application and logged defects in Mercury Test Director.
  • Responsible for taking daily builds for QA.
  • Responsible for sending the release binaries as a patch On Site.
  • Created changes to the Project Release testing process, resulting in a 70% decline in post-production implementation questions.
  • Modified the existing rate-increase test process, resulting in a 200% increase in productivity.
  • Proven track record of project delivery meeting or exceeding expected deadlines.
  • Managed automation projects from design to support.
  • Verified the specification for any issues that required clarification.
  • Attended meetings with Project Managers to report progress and discuss issues.
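Testing triggers and views of the kind mentioned above can be sketched with SQLite stand-ins for the PL/SQL objects; the audit trigger and positive-balance view here are hypothetical examples, not the actual objects under test:

```python
import sqlite3

# SQLite stand-ins for the database objects under test.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE accounts  (id INTEGER PRIMARY KEY, balance REAL);
    CREATE TABLE audit_log (account_id INTEGER, new_balance REAL);

    -- Trigger under test: every insert into accounts must be audited.
    CREATE TRIGGER trg_audit AFTER INSERT ON accounts
    BEGIN
        INSERT INTO audit_log VALUES (NEW.id, NEW.balance);
    END;

    -- View under test: only positive balances should be visible.
    CREATE VIEW v_active AS SELECT * FROM accounts WHERE balance > 0;

    INSERT INTO accounts VALUES (1, 100.0), (2, -5.0);
""")

# Verify the dataflow: two audited inserts, one row surviving the view filter.
audit_rows = conn.execute("SELECT COUNT(*) FROM audit_log").fetchone()[0]
active_ids = [r[0] for r in conn.execute("SELECT id FROM v_active").fetchall()]
```

The test asserts observable effects (audit rows written, view filtering applied) rather than inspecting the object definitions themselves.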

Confidential

Database Associate

Environment: Oracle 9i/8i, SQL, PL/SQL, Oracle Developer 2000 (Forms 6i & Reports 6i), UNIX, TOAD, Windows NT/2000, SQL Server 2000, MSSQL, MySQL, WinRunner 7.0, Test Director 7.0, Informatica, Java, IE 5.0, XML, HTML, WebSphere, Citrix

Responsibilities:

  • Analyzed the user specifications, functional specifications and design documents provided by the clients.
  • Developed Test Scripts and Test plan based on the business requirements.
  • Responsible for the complete SDLC process, working out a testing strategy for each development phase.
  • Developed TSL scripts in WinRunner for GUI and Regression testing.
  • Extensively used SQL Queries to verify and validate the Database Updates.
  • Created user-defined functions in the WinRunner function library to reuse them in different tests.
  • Documented test requirements and test cases using Test director.
  • Created and verified test scripts for numerous batch processes based on design documents and worked with developers.
  • Improved delivery time by implementing re-builds.
  • Improved build process from 1 build per 2-3 days to 1 build per hour.
  • Modified XML files to manipulate the data to fake the web services.
  • Read and understood log files to verify processes for debugging and/or test data verification purposes.
  • Created test data for testing specific Membership functionality.
  • Developed stored procedures and triggers for processing and validating data.
  • Developed UNIX shell scripts to automate batch processing tasks.
  • Verification and Validation of financial calculations using Excel.
  • Maintained Test case Matrix for summarized test results information.
  • Participated effectively in weekly team meetings to provide test status to Product and Development management and worked with Development and Product Team members to reproduce test incidents.
  • Used Test Director for Defect tracking and reporting. Participated in test case review meetings and requirements walkthroughs.
  • Responsible for Reviews and documentation for reporting the status to the manager.
  • Involved in the Team of QA engineers to resolve technical issues; reviewed and prioritized open bugs; reported bug status of the products to the QA Manager, evaluated test cases, logged bugs, and verified fixes.
  • Worked with business analysts, developers, and content department to resolve issues.
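Log-file verification of batch processes, as described above, can be sketched in Python; the log format and session names here are hypothetical, since the real format depends on the tool producing the log:

```python
import re

# Hypothetical batch-log excerpt; the real format depends on the tool.
log_text = """\
2004-06-01 02:00:01 INFO  session s_load_customers started
2004-06-01 02:03:12 INFO  session s_load_customers completed rows=1042
2004-06-01 02:03:15 ERROR session s_load_orders failed code=ORA-01400
"""

def failed_sessions(text):
    """Return the names of sessions that logged an 'ERROR ... failed' line."""
    pattern = re.compile(r"ERROR\s+session\s+(\S+)\s+failed")
    return pattern.findall(text)

failures = failed_sessions(log_text)
```

Scanning for failure markers like this is what turns manual log reading into a repeatable verification step.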

Confidential

Software Engineer

Environment: Oracle 9i/8, SQL, PL/SQL, Oracle Forms/Reports 6i, Erwin, Import & Export, UNIX, Shell Scripting, VB & Windows 98

Responsibilities:

  • Involved in the development of packages which were used by the front end system
  • Upgraded Oracle forms from version 4.5 to version 6i.
  • Developed Oracle database objects and SQL queries, primarily for the manipulation of data from different tables.
  • Designed complex Reports, Forms using Reports 6i/Forms 6i according to the client needs.
  • Designed Lookup, Choose and Detail forms by satisfying all functional requirements including unit testing.
  • Created various graphs using graph builder.
  • Upgraded, maintained and troubleshot Oracle Forms and Reports.
  • Wrote PL/SQL stored procedures, functions and triggers to implement business rules for new features incorporated into the system.
  • Fixed the Bugs in Different Screens resulted from Unit Tests.
  • Generated several reports to Print Purchase orders, Invoices, Contracts and other user- defined reports.
  • The database design responsibility included the conceptual data model design using ER diagrams, logical database design and physical implementation of the database.
  • Added provision to add or modify users and their privileges.
  • Performed performance tuning of Oracle databases and user applications.
  • Modified existing reports to generate similar reports.
  • Involved in the development of Pro*C.
  • Used Import Export tools to create dump files for exporting and importing data to the database.
  • Involved in tuning database server performance by reducing disk I/O and sizing rollback segments.
