
Sr. QA Analyst Resume


Washington, D.C.

SUMMARY:

Over 8 years of IT experience in Data Analysis, QA Testing, Batch Testing, ETL/DWH/BI report testing, and testing of software systems in Data Warehouse, Business Intelligence, Client/Server (Oracle), and Web-based environments on Windows and UNIX platforms.

  • Extensively worked in the entire QA life cycle, including designing, developing, and executing the entire QA process.
  • Strong knowledge of the Software Development Life Cycle (SDLC) and QA methodologies such as Agile, Scrum, Waterfall, and Iterative processes.
  • Involved in preparing Test Plans, Test Strategy documents, Test Cases, and Test Scripts based on business requirements, rules, data mapping requirements, and system specifications.
  • Experience in Functional Testing, System Testing, Regression Testing, Integration Testing, GUI Testing, Security Testing, Smoke Testing, System Integration Testing (SIT), Software Validation Testing, UAT, Batch Testing, Interface Testing, and Performance Testing.
  • Involved in preparing Requirement Traceability Matrices (RTM), Defect Reports, Weekly Status Reports, Checklists, Job Aids, Test Readiness Review documents, and Test Summary Reports.
  • Experience in Data Analysis, Data Migration, Data Validation, Data Cleansing, Data Verification, and identifying data mismatches.
  • Extensive experience in writing complex SQL queries and PL/SQL scripts.
  • Tested several complex reports generated by Business Intelligence tools (Cognos, Business Objects BOXI, MicroStrategy), including dashboards, summary reports, master-detail reports, drill-down reports, and scorecards.
  • Expertise in test management using HP Quality Center and Microsoft TFS; maintained versioned objects using Rational ClearCase.
  • Excellent skills in overall defect management and problem solving, including reporting and tracking bugs using HP Quality Center, Rational ClearQuest, and Bugzilla.
  • Experienced in interacting with Clients, Business Analysts, UAT Users and Developers.
  • Understanding of data models, data schemas, and ETL; created extensive stored procedures and SQL queries to perform back-end data warehouse testing (a representative query sketch follows this list).
  • Have experience in testing reconcile (initial) and delta (daily) ETL loads.
  • Good experience in Data Migration testing and Web Interface testing.
  • Good experience using Informatica Workflow Manager to run and manage workflows, and Informatica Workflow Monitor to monitor workflows and check session logs.
  • Strong experience in UNIX shell scripting, writing UNIX wrapper scripts, and monitoring UNIX logs for errors.
  • Extensively worked with the batch scheduling tools Autosys and Control-M to run and monitor ETL jobs.
  • Knowledge of the phases of the Agile and Scrum methodologies.
  • Excellent interpersonal, communication, documentation and presentation skills.
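
A minimal sketch of the kind of back-end data warehouse validation query referenced above; the table and column names (SRC_ORDERS, DW_ORDER_FACT, ORDER_AMT) are hypothetical placeholders rather than objects from any specific engagement:

    -- Row-count reconciliation: source staging table vs. target fact table
    -- (src_orders and dw_order_fact are hypothetical names)
    SELECT (SELECT COUNT(*) FROM src_orders)    AS source_count,
           (SELECT COUNT(*) FROM dw_order_fact) AS target_count
      FROM dual;

    -- Measure reconciliation: the total order amount should match after the load
    SELECT (SELECT SUM(order_amt) FROM src_orders)    AS source_total,
           (SELECT SUM(order_amt) FROM dw_order_fact) AS target_total
      FROM dual;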

TECHNICAL SKILLS:

QA/Testing Tools: HP Quality Center 10.x/9.x, Test Director, Quick Test Pro (QTP), TFS, Rally
Bug Tracking Tools: Bugzilla, Rational Clear Quest
Version Control Tools: Rational Clear Case, CVS
Requirements Tools: Rational RequisitePro, DOORS
RDBMS (Databases): Oracle 11g/10g/9i/8i, MS SQL Server 2000/2005/2008, MS Access, DB2, Teradata
RDBMS Query Tools: SQL*Plus, Query Analyzer, TOAD, Oracle SQL Developer, SQL Navigator
BI/Reporting Tools: IBM Cognos, Hyperion, MicroStrategy, Business Objects BOXI, OBIEE
ETL Tools: Informatica Power Center 9.x/8.x/7.x, DataStage, Ab Initio, SSIS
Programming Languages: PL/SQL, C
Scripting Languages: SQL Scripts, UNIX Shell Scripts, VBScript
Incident Management Tools: Remedy
Operating Systems: Microsoft Windows family, UNIX (Sun Solaris, HP-UX)
MS Office: MS Word, MS Excel, MS PowerPoint, MS Outlook
Scheduling Tools: Autosys, Control-M
Other Tools: PuTTY, TextPad, CompareIT, SnagIT, WinSCP, Data Flux, GS Data Generator

PROFESSIONAL EXPERIENCE:

Client: Confidential, Washington, D.C.
Role: Sr. QA Analyst/ETL Tester/Reports Tester / Data Analyst Jan 11 - Present

Responsibilities:

  • Designed and created the Master Test Plan based on the BRD and SRS. Also referred to the Technical Design document, Source-to-Target detailed mapping document, Transformation Rules document, and ER diagrams to derive test cases for functional testing.
  • Validated the database structure against the ERD.
  • Prepared test cases and scripts in Quality Center and executed them in the Test Lab module.
  • Performed pre-process validation to confirm the existence of specified database objects (tables, views, synonyms) and the corresponding grants in the target schema; a query sketch follows this list.
  • Validated target table column definitions against source column definitions (including data type and length).
  • Involved in verifying the readiness of the system before starting the process.
  • Created Change Requests (CRs) for migrations and other changes using Remedy.
  • Participated in migrating Informatica and UNIX code from DEV to QA and from QA to UAT environments.
  • Used ClearQuest, TFS, and Jira for defect/bug management.
  • Involved in executing and scheduling ETL jobs through Autosys.
  • Validated Informatica mappings that load files into staging tables.
  • Involved in extensive end-to-end data validation and back-end testing using SQL queries.
  • Gained good working experience with the Partition Exchange concept in Oracle.
  • Created BRD-to-SRS and SRS-to-Test-Case traceability matrices for all projects.
  • Responsible for QA status updates and served as point of contact for a team of 5, including 4 offshore QA Analysts in India.
  • Validated PL/SQL and Informatica batch processes scheduled using Autosys.
  • Validated various SSIS and SSRS packages according to functional specifications.
  • Heavily involved in performance testing: analyzing tables, gathering statistics, and tuning the SQL queries used to fetch source data.
  • Used the TOAD GUI tool for querying the Oracle database.
  • Presented QA team status updates to the Project/Program Manager in status meetings and worked with the business team to get approval on test plans and test cases.
  • Validated that the design met requirements and functioned according to technical and functional specifications.
  • Worked with the Design and Development teams to implement the requirements.
  • Developed test scripts, executed them manually to verify expected results, and published the results to a wider audience along with QA sign-off upon successful completion.
  • Participated in KT meetings with Business Analysts and SMEs.
  • Frequently used Perforce to store documents and update them with the latest versions and additions.
  • Participated in walkthrough and defect report meetings periodically.
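
A minimal sketch of the pre-process object and column-definition checks mentioned above, written against the Oracle data dictionary; the schema and table names (ETL_QA, SRC_SCHEMA, CUSTOMER, CUSTOMER_STG) are hypothetical placeholders:

    -- Confirm the expected objects exist in the target schema
    -- (ETL_QA and CUSTOMER_STG are hypothetical names)
    SELECT object_name, object_type, status
      FROM all_objects
     WHERE owner = 'ETL_QA'
       AND object_name IN ('CUSTOMER_STG', 'CUSTOMER_STG_V');

    -- Flag any column whose data type or length differs between source and target
    SELECT s.column_name, s.data_type, s.data_length,
           t.data_type AS target_type, t.data_length AS target_length
      FROM all_tab_columns s
      JOIN all_tab_columns t
        ON t.column_name = s.column_name
     WHERE s.owner = 'SRC_SCHEMA' AND s.table_name = 'CUSTOMER'
       AND t.owner = 'ETL_QA'     AND t.table_name = 'CUSTOMER_STG'
       AND (s.data_type <> t.data_type OR s.data_length <> t.data_length);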

Environment: Informatica Power Center 8.6, SQL, PL/SQL, Remedy, DB2, Oracle 11g, DataStage 8.0, TOAD, Rational ClearQuest, Rational ClearCase, DOORS, Oracle SQL Developer, XML, MS TFS (Team Foundation Server), PuTTY, VBScript, Quality Center 10, Agile, Business Objects XI, Autosys, SQL Server 2008, SSIS, SSRS, SSAS, Windows XP, UNIX

Client: Confidential, Charlotte, NC
Role: Sr. QA Engineer / Data Analyst / Web Interface Tester / ETL Tester Jan 10 - Dec 10

Responsibilities:

  • Involved in meetings with Business Analysts and end users to review functional/technical requirements, and responsible for translating business requirements into quality assurance test cases.
  • Used TOAD to develop and execute SQL test scripts to validate the test cases.
  • Involved in reviewing test scenarios, test cases and test results for data warehouse/ETL testing.
  • Prepared Requirements Traceability Matrices (RTM), positive and negative test scenarios, detailed test scripts, test kickoff documents, a test scorecard for test progress status, test results, a release checklist, lessons-learned documents, and a regression test suite for future use.
  • Responsible for testing Initial/Reconcile and Incremental/daily loads of ETL jobs.
  • Interacted with design/development/DBA team to decide on the various dimensions and facts to test the application.
  • Involved in testing the Autosys batch programs and UI screens.
  • Planned ahead to test mapping parameters and variables by discussing them with the BAs.
  • Extensively used Rational ClearQuest to track and manage defects.
  • Validated various Ab Initio Graphs according to business requirement documents.
  • Extensively tested several Cognos reports for data quality, fonts, headers, footers, and cosmetics.
  • Conducted user acceptance testing (UAT) to validate that the developed application meets the business requirements.
  • Extensively involved in testing the ETL process from different data sources (SAP, PeopleSoft, Teradata, SQL Server, Oracle, flat files) into the target Oracle database as per the data models.
  • Tested reports in Cognos using Analysis Studio and Report Studio.
  • Validated DataStage jobs; used DataStage Director to execute and verify them.
  • Used Autosys for scheduling ETL mappings and PL/SQL subprograms.
  • Wrote test cases for ETL to compare source and target database systems.
  • Mocked up test data to cover all planned scenarios and test cases.
  • Used UNIX commands for file management: placing inbound files for ETL and retrieving outbound files and log files from the UNIX environment.
  • Wrote several complex SQL queries for data verification and data quality checks; a representative sketch follows this list.
  • Analyzed the testing progress by conducting walk through meetings with internal quality assurance groups and with development groups.
  • Responsible for documenting the process, issues and lessons learned for future references.
  • Reviewed the test activities through daily Agile Software development stand-up meetings.
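
A minimal sketch of the kind of source-to-target comparison query used for data verification above; the table and column names (ACCT_SRC, ACCT_DIM, ACCT_ID, ACCT_STATUS, OPEN_DT) are hypothetical placeholders:

    -- Rows present in the source extract but missing or transformed
    -- incorrectly in the target (acct_src and acct_dim are hypothetical)
    SELECT acct_id, acct_status, open_dt
      FROM acct_src
    MINUS
    SELECT acct_id, acct_status, open_dt
      FROM acct_dim;

    -- Reverse check: target rows with no matching source record
    SELECT acct_id, acct_status, open_dt
      FROM acct_dim
    MINUS
    SELECT acct_id, acct_status, open_dt
      FROM acct_src;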

Environment: Ab Initio 2.13, Teradata V2R6, DataStage, Teradata SQL Assistant, SQL, Cognos 8.0, ClearCase, ClearQuest, RequisitePro, HP Quality Center, SharePoint, WebLogic, Oracle 10g, Fiddler, SQL Server 2008, DB2, PL/SQL, UNIX, PuTTY, flat files, session logs, Rally, Remedy, Windows XP.

Client: Confidential, Pittsburgh, PA
Role: Sr. QA Consultant / ETL Tester / Reports Tester Apr 08 - Dec 09

Responsibilities:

  • Involved in writing test plans, test cases, and the RTM, and in analyzing expected versus actual results.
  • Worked on extracting data from the mainframe and loading the data into Oracle.
  • Performed Integration, End-to-End and System testing.
  • Responsible for source system analysis, data transformation, data loading and data validation from source systems to Transactional Data system and Warehouse System.
  • Extensively used ETL to load data from flat files to Oracle.
  • Executed various SQL Queries to perform the backend testing.
  • Tested Informatica mappings for various ETL functions and transformations.
  • Held regular meetings with developers to report various problems.
  • Clearly communicated and documented test results and defect resolutions during all phases of testing.
  • Simulated several production cycles. Worked with data validation, constraints, record counts, source-to-target row counts, random sampling, and error processing.
  • Performed Black-box testing, White-box testing, System Testing, Data Integrity Testing and end to end testing.
  • Implemented SDLC, QA methodologies and concepts in the Project.
  • Involved in Developing Test Plans and Developed Test Cases/Test Strategy with input from the assigned Business Analysts.
  • Tested UNIX batch jobs according to the specifications and the functionality.
  • Developed test scripts based upon business requirements and processes. Processed transactions from system entry to exit.
  • Used Informatica Workflow Manager to schedule and run Informatica mappings and Workflow Monitor to monitor ETL execution and logs.
  • Involved in developing UNIX shell wrappers to run various SQL scripts.
  • Extensively created UNIX shell scripts for scheduling and running the required jobs.
  • Tested the ETL DataStage jobs and other ETL processes (data warehouse testing).
  • Tested the ETL process both before and after data validation. Tested the messages published by the ETL tool and the data loaded into the various databases.
  • Performed the tests in the SIT, QA, and contingency/backup environments.
  • Involved in extensive data validation using SQL queries and back-end testing; a representative sketch follows this list.
  • Heavily involved in testing Business Objects reports; used SQL extensively for report testing.
  • Used TOAD for querying Oracle and WinSQL for querying DB2.
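
A minimal sketch of typical post-load data integrity checks of the sort described above; the table and column names (POLICY_DIM, POLICY_NBR, EFFECTIVE_DT) are hypothetical placeholders:

    -- Duplicate check on the natural key after the load
    -- (policy_dim and policy_nbr are hypothetical names)
    SELECT policy_nbr, COUNT(*) AS dup_count
      FROM policy_dim
     GROUP BY policy_nbr
    HAVING COUNT(*) > 1;

    -- Mandatory-column check: rows loaded with missing key attributes
    SELECT COUNT(*) AS bad_rows
      FROM policy_dim
     WHERE policy_nbr IS NULL
        OR effective_dt IS NULL;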

Environment: Oracle 9i, TOAD, DB2, SQL, Informatica, PL/SQL, Business Objects, MS Visio, Rapid SQL, Mainframes, .NET, XML, COTS, RUP, Windows, Autosys, TIBCO, Remedy, QTP, SQL Server 2000/2005, SSIS, SSRS, UNIX, PuTTY

Client: Confidential, Bethesda, MD
Role: QA Consultant / Data Migration Tester / DWH Tester Oct 06 - Apr 08

Responsibilities:
  • Analyzed the Business Requirements Document and Functional Specification document to determine the project needs.
  • Worked with stakeholders, updating project status and ensuring delivery of QA deliverables.
  • Involved in the development of Detailed Test Strategy for Functional and System Testing.
  • Performed data validation testing after the ETL process using Informatica Power Center.
  • Designed, developed, executed, and analyzed test plans, test cases, and test scripts for use in Integration, System, and Regression testing.
  • Reviewed business requirements, added them to Quality Center, and wrote test cases and performed functional testing based on the requirements.
  • Extensively used HP Quality Center as the defect tracking system to log defects, generate reports, and track defects through to closure.
  • Performed system testing, Integration testing, and Regression testing.
  • Performed Inbound and Outbound Data Validation.
  • Involved in preparation of test data to test the functionality of the sources.
  • Used SQL queries to extract data from the target tables to prove the data mapping.
  • Tested various ETL scripts that load data into the target data warehouse (Oracle/Teradata).
  • Involved in understanding the specifications for the data warehouse ETL process and interacted with designers and end users on informational requirements.
  • Validated Data in Database using SQL queries (Teradata, Oracle).
  • Validated conditions and rules using SQL; a representative sketch follows this list.
  • Responsible for weekly status meetings showing progress and future testing efforts to the QA Manager.
  • Involved in fixing production issues, logging defects in the system, and delivering the work within the SLA.
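
A minimal sketch of the kind of rule and referential-integrity checks described above; the table and column names (SALES_FACT, CUSTOMER_DIM, CUSTOMER_KEY, DISCOUNT_AMT, GROSS_AMT) and the 30 percent rule are hypothetical placeholders:

    -- Orphan-key check: fact rows whose customer key has no matching
    -- dimension row (sales_fact and customer_dim are hypothetical names)
    SELECT f.customer_key, COUNT(*) AS orphan_rows
      FROM sales_fact f
      LEFT JOIN customer_dim d
        ON d.customer_key = f.customer_key
     WHERE d.customer_key IS NULL
     GROUP BY f.customer_key;

    -- Business-rule check: flag rows where the discount exceeds
    -- 30 percent of the gross amount (a hypothetical rule)
    SELECT COUNT(*) AS rule_violations
      FROM sales_fact
     WHERE discount_amt > gross_amt * 0.30;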

Environment: Oracle 9i, Informatica, DataStage, SQL Server, SQL, PL/SQL, Business Objects Reports, Control-M, TOAD, IBM AIX, Teradata, MicroStrategy, Teradata SQL Assistant, UNIX log files, Mercury Quality Center, Rational ClearCase, ClearQuest.

Client: Confidential, Minneapolis, MN
Role: Manual Back-End Tester / Web QA Analyst Aug 04 - Sept 06

Responsibilities:

  • Analyzed the functional requirements of the application and developed Test strategy for the application.
  • Developed various Test plans and Test cases and analyzed Test results.
  • Analyzed and documented the test results for each build of testing.
  • Enhanced and modified the scripts according to the test case scenarios.
  • Prepared test data to verify different types of scenarios.
  • Extensively involved in back-end/database testing using SQL queries; a representative sketch follows this list.
  • Responsible for tracking defects using Test Director and ensuring that defects were efficiently passed from one stage to the next.
  • Involved in adding requirements, associating them with test cases, and writing and executing test cases using Test Director.
  • Used WinRunner to test web-based offline functionality, including web interface checks, object property checks, input domain testing, and client- and server-side validation.
  • Conducted Static Text Testing, Text-Link Testing, Image Link Testing, and Broken Link testing.
  • Tested the application's compatibility with various versions of IE and Netscape.
  • Responsible for performing Functional testing by creating Manual and Automated Scripts using WinRunner.
  • Worked with UNIX shell scripts. Analyzed the testing progress by conducting walkthrough meetings with internal quality assurance groups and with development groups.
  • Responsible for user acceptance testing, giving demos to the client and obtaining sign-off.
  • Responsible for documenting the process for future references.
  • Responsible for weekly status meetings showing progress and future testing efforts to the QA Manager.
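
A minimal sketch of the kind of back-end check used to confirm that data submitted through the web UI was persisted correctly; the table, column, and test-user names (USER_PROFILE, USER_ID, qa_test_user01) are hypothetical placeholders:

    -- Verify the record submitted through the UI landed with the expected values
    -- (user_profile and qa_test_user01 are hypothetical names)
    SELECT user_id, email, status, created_dt
      FROM user_profile
     WHERE user_id = 'qa_test_user01';

    -- Confirm the same submission was not inserted more than once
    SELECT user_id, COUNT(*) AS row_count
      FROM user_profile
     WHERE user_id = 'qa_test_user01'
     GROUP BY user_id
    HAVING COUNT(*) > 1;
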
Environment: Test Director, WinRunner, VB, Java, XML, J2EE, WebSphere, MS Office, PL/SQL, SQL, Oracle, Windows XP/2000, UNIX, MS SQL Server 2000
