
Sr DW Quality and Data Analyst Resume


FL

SUMMARY:

  • Over 12 years of IT experience, with strong testing and data analysis experience in Data Warehouse and Business Intelligence programs and projects
  • QA experience in all phases of the software test life cycle, including requirements gathering, analysis, project planning, scheduling, testing, defect tracking and management, and reporting
  • Strong SQL writing skills across database environments such as Oracle, Teradata, MS SQL Server, and MySQL
  • Worked on migration, conversion, ETL, data warehouse, and application testing projects
  • Well versed in SDLC and Agile methodologies across the Banking and Financial Services, Life Sciences, Insurance, and GIS business domains
  • Extensive experience in QA process design and implementation, estimation, and strategic QA planning
  • Expertise in black-box testing, functional testing, integration testing, database testing, and regression testing
  • Good knowledge of Excel macros, PL/SQL, BTEQ, and UNIX shell scripting
  • Fairly good knowledge, from a QA perspective, of ETL tools such as Informatica, DataStage, Ab Initio, and SSIS, BI tools such as Cognos and QlikView, and OLAP methodology
  • Experienced in performing data analysis from source systems to data marts and providing root cause analysis for defects
  • Experience with environments such as UNIX and Windows, and a theoretical understanding of Hadoop and Big Data
  • Experience working with cross-functional teams to execute organization-wide projects and programs
  • Experienced working in an onsite-offshore model
  • Quick learner with the ability to grasp new technologies; energetic and self-motivated team player
  • Proven ability to work in both independent and team environments

TECHNICAL SKILLS:

RDBMS: Oracle, Teradata, MySQL, SQL Server

Domains: Banking and Financial Services, Life Sciences, Insurance, GIS Domain

Operating Systems: Windows 2000, Windows NT, UNIX

Testing Concepts / ETL Testing: Test Strategy, Test Analysis, Test Design, UAT (User Acceptance Testing)

Tools: Toad, SQL*Plus, HP ALM, MQC (Mercury Quality Center), JIRA, SFTP, SCP (Secure Copy), Oracle Data Pump Utility, Control-M, SVN (Subversion), SharePoint, Bugzilla

Programming Area: SQL, PL/SQL, Basic Unix Scripting, BTEQ (Teradata)

ETL/Reporting Tools: Informatica, Ab Initio, DataStage, Cognos, QlikView

PROFESSIONAL EXPERIENCE:

Confidential

Sr DW Quality and Data Analyst

Responsibilities:

  • Coordinate with key stakeholders, SMEs, application owners, and other analysts for business requirement gathering and analysis
  • Profile and analyze customer data against information quality expectations and specifications
  • Unwind existing Oracle logic to create source-to-target (STT) documents
  • Create mappings involving complex business rules; prepare and review detailed design and technical specification documents
  • Write complex SQL to implement business requirements across different data sources
  • Validate and verify data in all environments, including QA, UAT, and production
  • Raise and track defects in ALM through closure
  • Lead defect triage meetings
  • Analyze defects to find and document root causes
  • Participate in the migration of the database from Oracle to Teradata
  • Join technical discussions to resolve complex data issues and help developers with implementation
  • Assign, monitor, and manage tasks among the other analysts

Environment: Teradata, Oracle 11G, SQL Assistant, Ab-Initio, Toad for Oracle, HP Quality Center ALM, Excel Macro
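The source-to-target validation described above typically boils down to row-count and set-difference (minus) checks. The sketch below illustrates the idea in Python, using an in-memory SQLite database as a stand-in for the Oracle and Teradata systems; the table and column names are invented for illustration.

```python
import sqlite3

# In-memory database stands in for the source and target systems;
# table and column names here are illustrative only.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE src_customer (cust_id INTEGER, name TEXT);
    CREATE TABLE tgt_customer (cust_id INTEGER, name TEXT);
    INSERT INTO src_customer VALUES (1, 'Alice'), (2, 'Bob'), (3, 'Carol');
    INSERT INTO tgt_customer VALUES (1, 'Alice'), (2, 'Bob');
""")

def validate(cur, src, tgt, key):
    """Row-count check plus a set-difference check, the two staples
    of source-to-target data validation."""
    src_count = cur.execute(f"SELECT COUNT(*) FROM {src}").fetchone()[0]
    tgt_count = cur.execute(f"SELECT COUNT(*) FROM {tgt}").fetchone()[0]
    # EXCEPT plays the role of Oracle's MINUS: keys loaded into the
    # source but absent from the target.
    missing = cur.execute(
        f"SELECT {key} FROM {src} EXCEPT SELECT {key} FROM {tgt}"
    ).fetchall()
    return {"src_count": src_count, "tgt_count": tgt_count,
            "missing_in_target": [row[0] for row in missing]}

report = validate(cur, "src_customer", "tgt_customer", "cust_id")
print(report)  # key 3 ('Carol') was not loaded into the target
```

In a real engagement the same two queries would run against the production databases, with the mismatched keys feeding the defect raised in ALM.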

Confidential, FL

DW /ETL Test Lead

Responsibilities:
  • Created the test strategy and approach for the overall conversion testing
  • Understood the requirements and analyzed source systems
  • Reviewed ETL specification documents and data models
  • Designed test plans and defined cases for functional and integration system testing
  • Coordinated with the onsite and offshore testing teams daily on data validation and verification
  • Handled defect management, chaired triage calls, and administered Quality Center; managed transitioning projects to offshore, including knowledge transition, onsite/offshore liaison, and resourcing
  • Identified and analyzed defects and established guidelines for defect severity and priority
  • Reported status to Kemper and Cognizant program management
  • Liaised with design and development teams to undertake testing against agreed test schedules and report results

Environment: Guidewire (ClaimCenter), Informatica, WinSQL, Toad, Quality Center

Confidential, HARTFORD, CT

DW Functional Test Analyst

Responsibilities:
  • Understood the requirements, found gaps in testing, and helped the QA team understand business scenarios beyond data validation and verification
  • Raised issues that could lead to potential delays; interacted frequently with the customer and attended client meetings to understand existing functionality and processes
  • Understood business criticality and added missing scenarios to help improve the overall testing process
  • Ran defect triage with BAs, data architects, and developers to identify valid defects
  • Reviewed the QA team's work on a day-to-day basis
  • Built relationships to be leveraged for references to other business lines and prospects
  • Helped the QA team work on data quality checks and present them to the client business team

Environment: Oracle 11G, Informatica, PLSQL Packages, Toad for Oracle, HP Quality Center ALM
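The data quality checks mentioned above usually cover null fields, duplicate keys, and invalid domain values. A minimal sketch, with a hypothetical record layout and state-code domain:

```python
# Illustrative data-quality checks of the kind presented to a business
# team: null checks, duplicate-key checks, and domain checks.
# The record layout and values are invented for illustration.
rows = [
    {"policy_id": "P001", "state": "CT", "premium": 1200.0},
    {"policy_id": "P002", "state": None, "premium": 850.0},
    {"policy_id": "P002", "state": "NY", "premium": 850.0},
    {"policy_id": "P003", "state": "XX", "premium": 0.0},
]
VALID_STATES = {"CT", "NY", "FL"}

def quality_checks(rows):
    """Return (key, problem) pairs for every rule violation found."""
    issues = []
    seen = set()
    for r in rows:
        if r["state"] is None:
            issues.append((r["policy_id"], "null state"))
        elif r["state"] not in VALID_STATES:
            issues.append((r["policy_id"], "invalid state code"))
        if r["policy_id"] in seen:
            issues.append((r["policy_id"], "duplicate key"))
        seen.add(r["policy_id"])
    return issues

for policy, problem in quality_checks(rows):
    print(policy, problem)
```

Each flagged pair would typically become a row in the quality report reviewed with the client business team.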

Confidential

Sr DW Test Analyst

Responsibilities:
  • Led and managed a team of QA and production support
  • Managed overall testing of the application; analyzed the design documents and understood the functionality of both source and target systems
  • Raised issues that could lead to potential delays
  • Handled configuration management and change management of all project artifacts
  • Interacted frequently with the customer to resolve issues; managed defects in QC and generated weekly and monthly test reports
  • Prioritized activity and maximized the effectiveness of resource allocation
  • Ensured adherence to project lifecycle processes, including preparation and/or review of key testing deliverables such as test approaches, test strategies, master test plans, status reports, and test summary reports; streamlined business processes

Environment: Informatica 9.1, Oracle 11G, Toad for Oracle, Quality Center, QlikView

Confidential

Test Analyst

Responsibilities:
  • Supported the legacy Credit Risk & Monitoring application in day-to-day execution and change request management
  • Understood the business requirements, analyzed the design documents, and understood the functionality of both source and target systems
  • Modified PL/SQL code and developed code to replicate the database using Oracle Data Pump and UNIX shell scripts
  • Logged defects in QC (Mercury Quality Center)
  • Wrote automation scripts to avoid manual intervention
  • Helped the Credit Risk business team test the code
  • Interacted with business users and SMEs, prepared Documents of Understanding for various applications, and obtained sign-off
  • Delivered knowledge transfer sessions to new team members; coordinated overall testing activities, including unit testing, system testing, and production implementation

Environment: Oracle, Sybase, Toad, SSH (file transfer), Control-M, jisql, Oracle Data Pump

Confidential

ETL Tester

Responsibilities:
  • Analyzed the design documents and understood the functionality of both source and target systems
  • Prepared a traceability matrix capturing the complete requirements
  • Interacted frequently with the customer to resolve issues
  • Provided on-call support for the complete application, fixing issues as and when required
  • Logged defects and managed QC (Mercury Quality Center) across the complete testing life cycle
  • Made changes to PL/SQL code to improve performance
  • Later took on the additional responsibility of the test lead position

Environment: DB2, Teradata, Datastage, Quality Center, Control-M
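A requirements traceability matrix like the one prepared above maps each requirement to the test cases that cover it, so that uncovered requirements stand out before execution begins. A small sketch with invented requirement and test-case IDs:

```python
# Minimal requirements traceability matrix: for each requirement, list
# the covering test cases and flag any requirement with no coverage.
# All IDs below are hypothetical.
requirements = ["REQ-01", "REQ-02", "REQ-03"]
test_cases = {
    "TC-101": ["REQ-01"],
    "TC-102": ["REQ-01", "REQ-02"],
}

def traceability(requirements, test_cases):
    """Build {requirement: [test cases]} and a list of uncovered reqs."""
    matrix = {req: [] for req in requirements}
    for tc, reqs in test_cases.items():
        for req in reqs:
            matrix[req].append(tc)
    uncovered = [req for req, tcs in matrix.items() if not tcs]
    return matrix, uncovered

matrix, uncovered = traceability(requirements, test_cases)
print(uncovered)  # REQ-03 has no covering test case yet
```

The uncovered list is exactly the gap a test lead would raise with the QA team before sign-off.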

Confidential

Software Developer

Responsibilities:
  • Understood change requests, clarified them with the business team, and analyzed the design documents
  • Interacted with the customer to resolve issues in the requirements
  • Developed PL/SQL procedures to achieve the required functionality per the changes
  • Made changes to Pro*C and shell script code per the requirements
  • Tested the overall functionality
  • Prepared the necessary documents, such as the unit test plan and deployment checklist

Environment: Oracle, Toad, Pro*C, UNIX Shell Scripting, Subversion

Confidential

Database-Developer

Responsibilities:
  • Maintained the database
  • Made changes to PL/SQL procedures, functions, and packages
  • Tested the overall functionality and created unit test cases
  • Interacted with the client and attended meetings to clarify requirements
  • Maintained the complete process and documentation

Environment: Oracle, Toad for Oracle, Batch Processing, SQL*Loader

Confidential

Test Analyst

Responsibilities:
  • Validated various navigation rules on the Oracle database (Bison); performed GIS database testing using the GUI tool Atlas; performed batch validation
  • Compared D96 (mainframe) reports with Bison reports by running appropriate PL/SQL scripts and SQL queries against the Bison (Oracle) database
  • Determined mismatches between the two reports and reported the legitimate mismatches
  • Analyzed results, reported defects according to the requirements, and communicated and interacted with the client
  • Documented test reports in Quality Center, noting passing or failing test cases with adequate supporting reasons

Environment: Atlas (GUI Tool), Bison(Oracle) Database, Quality Center, Query Tool (GIS)
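The report reconciliation above, comparing mainframe D96 output against Bison, amounts to matching keyed report rows from the two systems and emitting only the legitimate mismatches. A sketch with invented keys and amounts:

```python
# Compare two keyed report extracts and report only the mismatches:
# rows missing on either side, or rows whose values differ beyond a
# tolerance. Keys and amounts below are invented for illustration.
mainframe_report = {"A100": 250.00, "A101": 130.50, "A102": 75.25}
bison_report     = {"A100": 250.00, "A101": 131.00, "A103": 40.00}

def compare_reports(left, right, tolerance=0.0):
    """Return (key, left value, right value, reason) for each mismatch."""
    mismatches = []
    for key in sorted(set(left) | set(right)):
        lv, rv = left.get(key), right.get(key)
        if lv is None or rv is None:
            mismatches.append((key, lv, rv, "missing"))
        elif abs(lv - rv) > tolerance:
            mismatches.append((key, lv, rv, "value differs"))
    return mismatches

for mismatch in compare_reports(mainframe_report, bison_report):
    print(mismatch)
```

Matching rows drop out automatically, so only the legitimate mismatches reach the defect report.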

Confidential

Tester

Responsibilities:
  • Executed test cases to test the functionality of the product
  • Modified host machines by installing operating systems such as Windows and Linux as required daily for mobility testing
  • Carried out manual testing in multiple cycles to test functionality as well as the integrated product
  • Reported bugs to the client in the Bugzilla tool, with steps to reproduce, build number, hosted branch, category, and component

Environment: VMware Server, GSX, ACE, and Workstation
