
Quality Assurance Resume


New Jersey

OBJECTIVE
Seeking a fast-paced, dynamic environment in which to apply my professional skills in Information Technology and Software Quality Assurance testing, supported by a strong work ethic and interpersonal skills

PROFESSIONAL SUMMARY

  • Seven years of extensive, diversified experience in quality assurance across the software development life cycle (SDLC), specializing in QA processes and methodologies
  • Experience in automated as well as manual testing, with a focus on Quality Assurance (QA) for client/server, web-based, Oracle Applications (ERP), mobile, and e-commerce applications
  • Experience with ETL/BI tools in ETL process development, testing, and maintenance of ETL code
  • Strong working experience with DSS (Decision Support Systems) applications and the Extraction, Transformation and Load (ETL) of data from legacy systems using MS SSIS
  • Extensive experience in Functional, System, Integration, Regression, User Acceptance, Performance, and Stress Testing, as well as White-Box and Black-Box testing, performed both manually and with automated tools; load and stress testing of client/server and web-based applications using WinRunner, LoadRunner, and QTP
  • Experience integrating TestDirector with Mercury Interactive tools (WinRunner and LoadRunner) and with external tools such as Rational ClearQuest and Rational ClearCase
  • Experience in different SDLC methodologies like Agile/Scrum, Waterfall, V Model, and Spiral.
  • Good experience in developing Test Plans, Test Cases, and Test Scripts, and in establishing test environments.
  • Experience in all stages of the STLC and SDLC, from planning, estimation, specification, and design through preparation of GUI and functional test cases, delivering on time against aggressive deadlines within the QA process.
  • Worked across business domains including Retail, Healthcare, Mortgage, Financial Services, and Telecommunications.
  • Highly skilled in the implementation and execution of tests using HP Mercury tools: QuickTest Professional, Quality Center, LoadRunner, WinRunner, and TestDirector.
  • Extensive experience in backend testing using SQL queries on Windows.
  • Expertise in the VisionPLUS application; led a team on web-based projects.
  • Strong experience in Unit, Functional, Product, Integration, Regression, User Acceptance, System, Black Box Testing.
  • Achieved proficiency in creating Requirement Traceability Matrix (RTM) and Bug Reports.
  • Experience in project execution methodologies and short release cycles.
  • In-depth experience in Test Management, Test Process Analysis & Test Automation.
  • QA system functional end-to-end testing of mainframe systems.
  • Experience in implementing QA Methodologies, Test plans, Test cases, Test Scenarios, and Test deliverables.
  • Assisted developers in remediating issues found during testing through bug tracking and root cause analysis.
  • Utilized Visual Source Safe, MS Team Foundation Server, and Clear Case for version controlling.
  • Experienced on UNIX servers: writing, editing, executing, and testing shell scripts.
  • Extensive Experience working with offshore teams.
  • Good documentation experience in creating Technical Reviews, Reports & Strategy.
  • Strong analytical, problem solving, communication, learning and team skills.

TECHNICAL SKILLS & TOOLS

  • Testing Tools: Quick Test Pro 8.x/9.x, Quality Center 8.x/9.x, Win Runner 7.x, Test Director, Load Runner 7.x; IBM RequisitePro, Microsoft Test Manager, Clear Quest, SOAP UI, JIRA
  • Programming: C, C++, Java/J2EE, VBScript, SQL, XML, HTML; IDE: Visual Studio
  • Operating Systems: Windows 2000/2003/XP, MS-DOS, UNIX
  • Defect Managements Tools: Quality Center/Test Director, JIRA
  • Database: Oracle, MS SQL Server, MS Access
  • Management Tools: MS Project, MS Office, Visio, PowerPoint
  • Methodologies: SDLC, Agile, Waterfall, Spiral
  • CM Tools: MS Visual Source Safe (VSS), IBM Rational Clear Case, Asset 360
  • RDBMS: SQL Server 2000/2005/2008, Oracle 8i/9i, MS Access 2000
  • DB/Reporting Tools: TOAD, SSRS, Crystal Reports

PROFESSIONAL EXPERIENCE

Quality Analyst | Confidential, NJ Sep 2010 – Present
The TDI application is a claims-processing system. The NJ State Government is automating its paper-based filing process as a web application, and is also enhancing its data warehouse to manage legacy system data.

  • Performed Integration, System, User Acceptance, Regression, Performance and Back end testing of web based applications and mobile applications (Android and Apple iOS)
  • Tested mobile applications on various Apple and Android devices (iPhone, iPad, Nexus, Samsung Galaxy, etc.)
  • Involved in developing detailed Test plan, Test cases and Test scripts using QC
  • Created and Executed automated test scripts using QTP to perform Functional and Regression testing.
  • Created Test input requirements and prepared the test data for Data Driven testing.
  • Involved in Database Testing (Oracle), loaded the test data into Oracle database by writing SQL queries.
  • Designed and created the Project Test Plan which defines the scope, approach and deliverables of testing using Agile/Scrum Methodology.
  • Utilized a Burn-Down Chart to track testing progress within each Sprint cycle.
  • Created Requirement Traceability Matrix to map between the requirements and test cases.
  • Used Quality Center to track and report system defects and bug fixes. Wrote modification requests for bugs in the application and helped developers track and resolve the problems.
  • Participated in defect review meetings with the team members and developers.
  • Involved in other test planning meetings and submitted test metrics daily to the management.
  • Checked the data flow, extensively using SQL queries to extract data from the database.
  • Wrote various complex SQL queries, joins, stored procedures, and functions, and created tables for backend testing.
  • Performed ETL using Microsoft SSIS to extract, transform, and load test data into the test environment, and tested against the database along with performance testing.
  • Assisted DB administrators in enhancing the QA database by query tuning and avoiding deadlocks.
  • Created and tested UNIX Shell scripts to verify the archived data in QA Server and to execute SQL scripts.
  • Automated the Shell Scripts to load the data in timely manner.
  • Constructed, maintained, and conducted Smoke test for UAT environment.
  • Identified defects in aggregate tables and report data, entered defects in Quality Center, and coordinated with developers to resolve them based on defect severity and priority.
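
The aggregate-table validation described above can be sketched as a minimal, hypothetical example: the table names (claims, claims_agg) and columns are illustrative only, and SQLite stands in for the Oracle QA database.

```python
import sqlite3

# Hypothetical schema; SQLite stands in for the Oracle QA database.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE claims (claim_id INTEGER, region TEXT, amount REAL)")
cur.executemany("INSERT INTO claims VALUES (?, ?, ?)",
                [(1, "NJ", 100.0), (2, "NJ", 250.0), (3, "NY", 75.0)])
cur.execute("CREATE TABLE claims_agg (region TEXT, total REAL)")
cur.executemany("INSERT INTO claims_agg VALUES (?, ?)",
                [("NJ", 350.0), ("NY", 75.0)])

# Compare the detail table against the aggregate table; any row returned
# is a candidate defect to log in Quality Center with a severity/priority.
cur.execute("""
    SELECT d.region, d.total, a.total
    FROM (SELECT region, SUM(amount) AS total
          FROM claims GROUP BY region) d
    LEFT JOIN claims_agg a ON a.region = d.region
    WHERE a.total IS NULL OR a.total <> d.total
""")
mismatches = cur.fetchall()
print(mismatches)  # an empty list means the aggregate table is consistent
```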

Environment: Oracle, VB, Quality Center, Windows XP, QTP, Asset 360, SSIS, Apple iOS, Android

ETL/Quality Analyst | Confidential, Plano, TX Jan 2009 – Aug 2010
The POS application connects multiple databases, including the Sales Transaction, Returns Management, Inventory Management, Employee Purchases, Rewards, and Gift Cards databases, as well as the different tenders used for sales transactions and order management. The purpose of the Pricing Infrastructure project is to support growing business needs by migrating data and applications to an industry-standard database management system (SQL Server). The project objective is to redesign and develop the store systems pricing infrastructure to improve scalability and reliability and to reduce processing time. The Pricing Infrastructure project is being executed in three streams: FMS-to-SQL conversion, XS Domain-to-SQL conversion, and end-to-end testing.

  • Analyzed business requirements, system requirement specifications, and responsible for documenting functional requirements and supplementary requirements.
  • Involved in testing various Oracle Applications Retail modules such as: Retail Allocation (RA), Retail Invoice Matching (RIM), Retail Price Management (RPM), Retail Sales Audit, Retail Store Inventory Management (SIM), Retail Trade Management (RTM), Retail Merchandising System (RMS) and customized objects in these modules
  • Involved in understanding the Requirements of the end Users/Business Analysts and Developed Strategies for ETL processes.
  • Implemented Database Checkpoints for Back-end Testing
  • Performed the Back-end Integration Testing to ensure data consistency on front-end by writing and executing SQL statements
  • Used TOAD to perform manual tests on a regular basis; used UNIX and Oracle on this project to write shell scripts and SQL queries.
  • Validated the data of reports by writing SQL queries in PL/SQL Developer against ODS.
  • Tuned ETL jobs/procedures/scripts, SQL queries, PL/SQL procedures to improve the system performance.
  • Performed backend database testing by writing SQL and PL/SQL scripts to verify data integrity
  • Developed SQL Stored Procedures and Queries for Back end testing
  • Promoted application releases from development to QA and to UAT environments as required.
  • Tested ad hoc and canned reports for Business objects.
  • Tested Business Objects reports and Web Intelligence reports.
  • Managed user accounts and security using Business Objects Supervisor
  • Used TOAD for querying Oracle and Teradata SQL Assistant for querying Teradata.
  • Worked as ETL Tester responsible for the requirements / ETL Analysis, ETL Testing and designing of the flow and the logic for the Data warehouse project.
  • Created and executed SQL queries to perform data integrity testing on an Oracle database, validating test data using TOAD.
  • Ran a query to update a field in an Oracle database to ensure that debt payments were allocated correctly.
  • Involved in extensive data validation using SQL queries and back-end testing
  • Used SQL for Querying the DB2 database in UNIX environment
  • Extensively used SQL Navigator to check the results of Unit test.
  • Scheduled the ETL jobs and loaded the data into target tables from staging area using dynamic load process.
  • Responsible for migrating the code changes from development environment to System Test, UAT and Production environments.
  • Extensively worked in the Unix Environment using Shell Scripts.
  • Involved in functional, exploratory and Integration testing.
  • Performed data validity testing for reports and feeds based on the client's requirements.
  • Validated format of the reports and feeds.
  • Created manual for the ETL process.
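
The source-to-target ETL validation work listed above can be sketched as a minimal example. The table names (src_prices, tgt_prices) are hypothetical, and SQLite stands in for the Oracle/Teradata databases; SQLite's EXCEPT plays the role of Oracle's MINUS.

```python
import sqlite3

# Hypothetical source/target tables; SQLite stands in for Oracle/Teradata.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE src_prices (sku TEXT, price REAL)")
cur.execute("CREATE TABLE tgt_prices (sku TEXT, price REAL)")
rows = [("A1", 9.99), ("B2", 19.99), ("C3", 4.50)]
cur.executemany("INSERT INTO src_prices VALUES (?, ?)", rows)
cur.executemany("INSERT INTO tgt_prices VALUES (?, ?)", rows)

# 1. Row-count reconciliation between source and target.
src_count = cur.execute("SELECT COUNT(*) FROM src_prices").fetchone()[0]
tgt_count = cur.execute("SELECT COUNT(*) FROM tgt_prices").fetchone()[0]

# 2. MINUS-style check: source rows missing from the target
#    (EXCEPT here corresponds to MINUS in Oracle SQL).
missing = cur.execute(
    "SELECT sku, price FROM src_prices "
    "EXCEPT SELECT sku, price FROM tgt_prices"
).fetchall()
print(src_count == tgt_count and missing == [])  # True when the load is complete
```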

Environment: MS SSIS, Teradata V2R5/V2R6, PL/SQL, Oracle 9i, TOAD, Business Objects XIR2/6.5.1, Test Director 7.0, HP QTP, HP QC, ERWIN 3.5, IBM AIX 5.1, SAS, Shell Scripting, XML, XSLT, XSD, XML Spy 2008, UNIX, Windows NT/2000/XP, SQL Navigator, MS SQL Server, Oracle Applications 11i

Quality Analyst | Confidential, Charlotte, NC Jan 2008 – Dec 2008
The purpose of the project is to develop a system so that underwriting information can be retrieved in an automated fashion through a message sent to the server that is processed and sent to the appropriate underwriting service, creating a response that is then returned to the loan system.

  • Developed and established quality assurance measures and testing standards for new homepage and other applications and their enhancement projects.
  • Prepared and delivered reports to the upper management and teams.
  • Identified solutions to project-related problems and made recommendations to address existing and potential trouble areas in IT systems and projects across the organization.
  • Analyzed business requirement documents and system specifications. Created and executed test plans and scripts based upon established standards.
  • Ensured that all testing activities allowed applications to meet business requirements and system goals and fulfill end-user requirements, and identified and reported existing or potential issues.
  • Collaborated with software/systems personnel on application testing, including system, unit, regression, and acceptance testing methods.
  • Assisted developers in debugging efforts by conducting root cause analysis.
  • Applied software testing methodologies across all required phases and stages of testing (Functional, System, Integration, Regression, Data Validation, and User Acceptance).
  • Performed Integration, Functional, and End to End Testing.
  • Responsible for quality assurance and data quality management and control for the project.
  • Worked actively with developers to identify high-risk features and database configurations.
  • Assisted the Quality Assurance Manager with administrative functions and with coordinating the client's Quality Recognition Programs, councils, and workgroups.
  • Designed and implemented automated scripts (QTP) for the modules.
  • Performed Functional tests using Standard, and Text Checkpoints in QTP.
  • Performed System and Compatibility tests by running QTP scripts on different URLs
  • Ensured proper testing methodologies, approaches, and activities for all changes released into the production environments.
  • Helped Release Manager to implement IT changes, upgrades, releases, or installations.
  • Assisted Change Manager to coordinate all change functions from single door within the company.
  • Created logs for the development and support team to look at for any incident detected.
  • Utilized MS SQL Server Reporting Services to generate various reports from the database.
  • Tracked and traced defects using JIRA.
  • Validated the reports generated by Crystal Reports tool.

Environment: Quality Center, QTP, Oracle, JIRA, Crystal Report

QA Analyst | Confidential, Raleigh, NC Feb 2007 - Dec 2007
Cigna is a global health service company dedicated to helping people improve their health, well-being and sense of security. All products and services are provided exclusively through operating subsidiaries of Cigna Corporation. The primary responsibility was to participate in Requirements/Design/Code reviews, develop test plans, test cases, automate tests, execute tests, track defects, identify risks, and communicate status to ensure that the team meets specified business requirements.

  • Analyzed System specifications and helped team members to achieve test goals.
  • Developed a detailed Test Plan for the testing effort of the User Interface.
  • Developed Test Cases based on the requirement documents.
  • Performed Positive and Negative testing manually.
  • Analyzed the results of manual and automated tests.
  • Performed gap analysis between HIPAA transactions and the Facets system
  • Involved in the configuration of Facets system for the HIPAA changes
  • Involved in the reporting analysis and configuration in Facets System
  • Performed Security Testing for the application.
  • Performed Parameterization using Win Runner.
  • Checked whether the requirements are met by inserting GUI Checkpoints.
  • Performed backend testing, extensively using SQL queries to verify the integrity of the database.
  • Reported, tracked, and documented bugs in the bug-tracking system using Test Director.
  • Executed baseline scripts for Regression testing to handle the changes.
  • Checked the data flow through the front end to backend and used SQL queries to extract the data from the database.
  • Involved in working with the developers and the environment support teams to resolve issues.
  • Reviewed the testing progress and issues to be resolved by conducting walkthroughs.
  • Executed test scripts in VB script and reported any problems using Test Director.
  • Participated in various meetings and discussed enhancement and modification requests.
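
The backend integrity checks described above can be sketched as a minimal example. The members/claims tables are hypothetical, and SQLite stands in for the project database; the same orphan-check join applies in any SQL dialect.

```python
import sqlite3

# Hypothetical member/claim tables; SQLite stands in for the project database.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE members (member_id INTEGER PRIMARY KEY)")
cur.execute("CREATE TABLE claims (claim_id INTEGER, member_id INTEGER)")
cur.executemany("INSERT INTO members VALUES (?)", [(1,), (2,)])
cur.executemany("INSERT INTO claims VALUES (?, ?)",
                [(10, 1), (11, 2), (12, 3)])

# Orphan check: claims whose member_id has no matching members row.
# Any row returned is a referential-integrity defect to report.
orphans = cur.execute("""
    SELECT c.claim_id
    FROM claims c
    LEFT JOIN members m ON m.member_id = c.member_id
    WHERE m.member_id IS NULL
""").fetchall()
print(orphans)  # claim 12 references a missing member, a defect to log
```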

Environment: Win Runner 6.0, Test Director 7.0, SQL

Quality Assurance Analyst | Confidential, Vienna, VA June 2005 – Jan 2007
The LOS (Loan Originating System) Track is responsible for the overall design and development of all forms and setup items to support origination, processing, underwriting, closing, funding, and secondary processes. LOS is divided into two primary work groups for the purposes of delivering the agreed upon scope. The first is LOS Application which is focused on validation and customization of the application. The second is the LOS Setups group which is charged with ensuring all setups required to support application functionality are present and contain supporting data required to meet project requirements.

  • Analyzed the user requirements by interacting with developers and business analyst.
  • Wrote the Test Plan and Test Cases based on the Design, Functional Requirements, and User (Business) Requirements documentation.
  • Prepared Test Cases and executed Test Scripts in Win Runner
  • Wrote automated Test Scripts for Regression Testing using Win Runner.
  • Created GUI checkpoints and used synchronization points in Win Runner.
  • Performed UAT Testing, System Testing & Integration testing for the applications.
  • Performed regression testing for every modification in the application and new builds.
  • Conducted system and integration testing, debugged the software errors, and interacted with Developers to resolve technical issues.
  • Responsible for testing the front-end of these interfaces.
  • Participated in walkthroughs and status meetings.
  • Participated in Design review, code reviews, and test review meetings to identify bottlenecks in the system.
  • Executed SQL Queries to check the data table updates after test execution.
  • Used Test Director for managing the Test Cases and Test Scripts.
  • Generated test reports using Test Director.
  • Prepared & Executed Manual Test Cases, Reported Test Results.
  • Coordinated with the development team, discussing technical problems and reporting bugs
  • Involved in UAT testing of the application
  • Provided systems analysis on automation results and identified the root cause of all failures
  • Analyzed all the bugs in the Test Director reported by the users during the UAT.

Environment: Win Runner, Test Director, MS Excel, Oracle, SQL

EDUCATION

Master's in Business Administration (Finance)
Bachelor's in Electronics and Communications
