
Sr. ETL/Data Quality Analyst Resume


Minneapolis, MN

SUMMARY:

  • 7+ years of experience and a strong background in QA/ETL, with analysis, coding, development and execution of Test Cases and Test Scripts.
  • Experienced with Test Plan, Test Case, Test Script, Change Request and Defect Request.
  • 6+ years of experience in Data Warehouse concepts and technologies such as OLTP, Normalization/Denormalization, ETL, OLAP, Data Models (Star/Snowflake Schemas), Data Marts (Dimensions and Facts) and Reporting.
  • Extensive experience with utility tools like TOAD, SQL Server Management Studio, SQL Developer, SQL*Plus, SQL*Loader, Rapid SQL, SQL Query Analyzer and SQL Assistant.
  • Familiar with managing the defects and change requests through Testing Tools like Quality Center/Test Director, Clear Quest and StarTeam.
  • Experienced with ETL tools like Informatica, SSIS, and Datastage.
  • Extensive experience with Reporting Tools like OBIEE and Siebel Analytics.
  • Experienced in testing Cognos and Business Objects Reports. Familiar with Excel for developing Requirements Traceability Matrix (RTM) and Capturing and Updating Change Requests.
  • Experience in Data Analysis, Data Validation, Data Verification and identifying data mismatch.
  • Extensive understanding of Business Requirements/Rules, Functional and Technical Specifications.
  • Experienced with Transformations like Filter, Joiner, Union, Router, Look up and Sorter.
  • Experienced with reports like Drill-Down, Drill-Up, Narrative View and Pivot/Cross-Tab reports.
  • Experienced in querying the databases by developing and running SQL and PL/SQL queries using utilities like Toad and SQL Query Analyzer.
  • Extensive experience in manual testing the ETL and Business Intelligence Applications.
  • Experienced in scheduling and kicking off ETL jobs using workflow manager.
  • Knowledge in Performance Tuning of SQL and PL/SQL using Explain Plan and Trace Files.
  • Documentation of Requirements, Test Case, Test Scripts and Results using Excel.
  • Experienced with System, Integration, Regression and User Acceptance Testing.
  • Familiarity in various software systems and platforms with good understanding of Object Oriented Programming (OOP) and ERD. Excellent problem solving, interpersonal, and communication skills.

TECHNICAL EXPERTISE:

Testing Tools: HP Quality Center 10.0/Test Director, Borland StarTeam, Manual Testing, Clear Quest

Databases: Oracle 8i/9i/10g, DB2 8.0/7.0/6.0, SQL Server 2000/2005, MySQL, MS Access, MS Excel, Sybase, Teradata V2R5/V2R6

Utility Tools: TOAD 8.5, SQL Server Management Studio, Rapid SQL, SQL*Plus, SQL*Loader, SQL Developer, SQL Query Analyzer, SQL Assistant, WinSCP

ETL Tools: Informatica 9.1.0/8.1/7.1/6.2/6.1/5.1, SSIS, DataStage 7.0/6.1

Reporting Tools: OBIEE, Siebel Analytics, SSRS, BI Publisher, Cognos

Operating Systems: Windows 95/98/2000/XP, Linux and UNIX.

Microsoft Office: Excel, Access, Outlook and Power Point

Programming Languages: SQL, PL/SQL, T-SQL, C, C++, Java

PROFESSIONAL EXPERIENCE:

Confidential, Minneapolis, MN

Sr. ETL/Data Quality Analyst

Responsibilities:

  • Analyzed the business requirements and rules from business requirements document.
  • Prepared detailed Master Test Plan and Test Cases by understanding the business logic and user requirements
  • Created Test Plans, Test Cases and a Test Specification Document for the application under test.
  • Attended meetings with architects, developers and business analysts to understand the data model, data flow and warehouse design. Reviewed the Mapping Document thoroughly and analyzed the mappings and design specifications.
  • Tested the Data Extraction, Data Transformation and Data Loading using Informatica Tool as per the Design Document.
  • Executed Informatica jobs at the UNIX command prompt to test the ETL process.
  • Validated the data in the warehouse using SQL queries. Tested the SAS jobs in batch mode through UNIX shell scripts.
  • Involved in code changes for SAS programs and UNIX shell scripts
  • Tested and Automated SAS jobs running on a daily, weekly and monthly basis using Unix Shell Scripting
  • Extracted raw data from an Oracle database and used SAS/ACCESS to read it and ran statistical analysis using SAS/STAT.
  • Extracted and transformed Source Data from SQL Server, Teradata and Oracle Databases.
  • Developed test scripts using SQL and PL/SQL according to the mappings and rules specified in mapping document. Tested OLAP cubes and SSRS reports for various business needs.
  • Performed backend database testing by writing SQL and PL/SQL scripts to verify data integrity
  • Wrote complex SQL, PL/SQL Testing scripts for Backend Testing of the data warehouse application.
  • Tested the created SSIS packages that transfer data from various sources such as Oracle and Excel.
  • Worked with SSIS system variables and passed variables between packages.
  • Involved extensively in back-end testing of data quality by writing complex SQL queries.
  • Worked on data quality analysis of data warehouse applications and performing technical data quality using software tools
  • Analyze data and create detailed reports utilizing proprietary tools and third party technologies to identify and resolve data quality issues to ensure quality assurance and drive data quality initiatives
  • Used T-SQL for querying the SQL Server 2000 database for data validation and data conditioning.
  • Tested stored procedures & views in MS SQL Server 2000/2005
  • Worked with Borland StarTeam to open Change Requests with business analysts.
  • Submitted CRs to data architects to make DDL changes via the StarTeam tool.
  • Attended Daily Release Planning Meeting with Developers and QA Team to discuss Software changes and configuration management.
  • Scheduled and Kicked off Workflows and Worklets in Workflow Manager.
  • Used Workflow Monitor to monitor the workflow runs in Gantt Chart and Task Views.
  • Analyzed workflow statistics like throughput, Source to Target Success and Failure rows.
  • Read the Workflow Session Logs to find warnings and error messages in case of failure.
  • Extensively used Informatica power center for extraction, transformation and loading process.
  • Communicated with DBAs to backup the data and discuss the query performance.
  • Written complex SQL queries and executed them to validate the incremental ETL Loads.
  • Used Synonyms and DB Links to validate the data from different schemas and databases.
  • Compared the data sets returned by the Oracle and SQL Server databases using Toad for Data Analysts. Experienced with the Star Schema and Snowflake Schema data models.
  • Analyzed Data Marts and tested several Dimension and Fact tables.
  • Extensively used Informatica client tools. The objective was to extract data stored in an Oracle database and in flat files and load it into a single data warehouse repository in Oracle.
  • Used StarTeam to report bugs to developers and track them from time to time.
  • Used Microsoft Outlook to schedule meetings and communicate with the project team.
  • Attended the Daily Status Call with the Manager, Development Team and QA Team to discuss project updates. Documented test cases, test scripts and test results in Microsoft Excel.
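The source-to-target validations described above (row counts and incremental-load checks) typically boil down to count comparisons and MINUS-style set differences. Below is a minimal illustrative sketch of that pattern in Python with an in-memory SQLite database; the `stg_customer`/`dw_customer` table names are hypothetical stand-ins for whatever the mapping document specifies, and SQLite's `EXCEPT` plays the role of Oracle's `MINUS`:

```python
import sqlite3

# Hypothetical staging and warehouse tables for illustration only;
# real table and column names come from the project's mapping document.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE stg_customer (customer_id INTEGER, name TEXT);
    CREATE TABLE dw_customer  (customer_id INTEGER, name TEXT);
    INSERT INTO stg_customer VALUES (1, 'Ann'), (2, 'Bob'), (3, 'Cal');
    INSERT INTO dw_customer  VALUES (1, 'Ann'), (2, 'Bob');
""")

# 1. Source-to-target row-count check.
src = cur.execute("SELECT COUNT(*) FROM stg_customer").fetchone()[0]
tgt = cur.execute("SELECT COUNT(*) FROM dw_customer").fetchone()[0]
print(f"source={src} target={tgt} match={src == tgt}")

# 2. MINUS-style check: rows present in the source but missing from
#    the target (SQLite spells the set difference EXCEPT).
missing = cur.execute("""
    SELECT customer_id, name FROM stg_customer
    EXCEPT
    SELECT customer_id, name FROM dw_customer
""").fetchall()
print("missing from target:", missing)
```

Any non-empty `missing` result (or a count mismatch) would be written up as a defect against the load.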

Environment: Teradata SQL Assistant, Oracle 10g, SQL Server 2005, SQL, PL/SQL, DB2, Quality Center 10.0, Borland StarTeam, Informatica (Workflow Manager, Workflow Monitor), Flat Files, Mapping Document, TOAD for Data Analysts 8.5, SQL Server Management Studio, Windows XP Professional, QTP, Unix, Linux, Test Case, Test Script, OBIEE, Microsoft Office (Excel, Power Point and Outlook), Cognos 8.0 Series

Confidential, Portland, ME

ETL/Data Quality Analyst

Responsibilities:

  • Conducted source-system data analysis to understand current state, quality and availability of existing data. Designed & Created Test Cases based on the Business requirements.
  • Involved in extensive DATA validation using SQL queries and back-end testing.
  • Used SQL for Querying the database in UNIX environment for testing the business rules imposed on data movement.
  • Created SQL Server Integration Service (SSIS) packages between enterprise databases and Third party data source to transfer data
  • Created SSIS package to automate maintenance, database back up, update statistics, re-build indexes as well as create error log for event during data load.
  • Validated that design meets requirements and function according to technical and functional specifications.
  • Extracted data using Proc SQL and Pass-Thru SQL to create SAS analysis data sets; PL/SQL and Oracle stored procedures were used extensively. Proc Compare, Proc Summary, Proc Contents and Proc Report were used for analyzing and verifying data.
  • Created SAS datasets by extracting data from various sources and used complicated data step logic.
  • Formulating various post production data quality and error handling SQL scripts
  • Worked on analyzing data quality and data profiling.
  • Analyzing Source Files to determine data quality Standards
  • Used HP Quality Center to record requirements, business components, test cases, test runs for every iteration and defects, and to link defects to requirements.
  • Worked with other members of the QA and Development teams and offshore team in improving the processes, methods, effectiveness and efficiency
  • Use SQL, PL/SQL to validate the Data going in to the Data Ware House
  • Analyzed the AS-IS system, which loads the monthly/weekly/daily multi-source data files for the Data Mart using PL/SQL procedures, and converted the procedures into Informatica code.
  • Testing included unit, system and regression testing as well as Cognos report testing.
  • Extensively used Informatica power center for extraction, transformation and loading process.
  • Used Clear Quest to track and report system defects and bug fixes. Written modification requests for the bugs in the application and helped developers to track and resolve the problems.
  • Data Quality Analysis to determine cleansing requirements
  • Coordinating with source system owners, day-to-day ETL progress monitoring, Data warehouse target schema Design (Star Schema) and maintenance. Promoted Unix/Informatica application releases from development to QA and to UAT environments as required.
  • Extensively tested the Cognos report by running the SQL queries on the database by reviewing the report requirement documentation. Participated in walkthrough and defect report meetings periodically.
  • Documented and reported various bugs during Manual Testing.
  • Involved in running the ETL loads in development server. Written SQL Queries to define, Identify and validate the code written for the data movement into the database tables.
  • Reported software bugs to the development team using Rational Clear Quest
  • Developed PL/SQL procedures & triggers.
  • Retesting the resolved defects/issues after the developers fix the defects
  • Involved in developing Unix Korn Shell wrappers to run various Informatica Scripts
  • Developed the strategies for analyzing the data for ETL Team
  • Updated the status of the testing to the QA team, and accomplished tasked for the assigned work to the Project Management team regularly.
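The data-profiling and data-quality analysis mentioned above usually starts with null-rate and duplicate-key queries against the source feed, so that cleansing requirements can be stated as measurable rules. A small sketch of those two checks, again in Python/SQLite with a hypothetical `src_feed` table and columns:

```python
import sqlite3

# Illustrative profiling checks; the table and column names below are
# hypothetical, not from any specific project.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE src_feed (acct_id INTEGER, balance REAL);
    INSERT INTO src_feed VALUES
        (101, 50.0), (102, NULL), (102, 75.0), (103, 20.0);
""")

# Null-rate check on a column the business rules say is mandatory.
nulls = cur.execute(
    "SELECT COUNT(*) FROM src_feed WHERE balance IS NULL").fetchone()[0]
print("null balances:", nulls)

# Duplicate-key check on the business key.
dups = cur.execute("""
    SELECT acct_id, COUNT(*) AS cnt
    FROM src_feed
    GROUP BY acct_id
    HAVING COUNT(*) > 1
""").fetchall()
print("duplicate keys:", dups)
```

Each failing rule (here: one null balance, one duplicated account id) would feed the cleansing requirements and the data-quality workbooks.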

Environment: Test Cases, Test Scripts, Test Plan, Copy Books, Informatica 8.6.1, Oracle 11g, Teradata 12.0, Teradata SQL Assistant, SQL Server 2008, QTP, XML Files, XSD, XML Spy 2010, Clear Quest, PL/SQL, TOAD, SQL, UNIX, Quality Center 10.0, Cognos 8.0 Series

Confidential, PARK RIDGE, NJ

ETL/Data Quality Analyst

Responsibilities:

  • Analyzed the business requirements and rules from business requirements document.
  • Attended meetings with architects, developers and business analysts to understand the data model, data flow and warehouse design. Reviewed the Mapping Document thoroughly and analyzed the mappings and design specifications.
  • Developed test scripts using SQL and PL/SQL according to the mappings and rules specified in mapping document. Tested OLAP cubes and SSRS reports for various business needs.
  • Tested the created SSIS packages that transfer data from various sources such as Oracle and Excel.
  • Worked with SSIS system variables and passed variables between packages.
  • Used T-SQL for querying the SQL Server 2000 database for data validation and data conditioning.
  • Tested stored procedures & views in MS SQL Server 2000/2005
  • Resolved data quality issues for multiple feeds coming from Mainframe Flat Files
  • Creating detailed Data Quality workbooks to define Data Quality Standards and metrics
  • Worked with Data Conversion projects for testing data quality after migration
  • Worked with Borland StarTeam to open Change Requests with business analysts.
  • Submitted CRs to data architects to make DDL changes via the StarTeam tool.
  • Attended Daily Release Planning Meeting with Developers and QA Team to discuss Software changes and configuration management.
  • Scheduled and Kicked off Workflows and Worklets in Workflow Manager.
  • Used Workflow Monitor to monitor the workflow runs in Gantt Chart and Task Views.
  • Analyzed workflow statistics like throughput, Source to Target Success and Failure rows.
  • Read the Workflow Session Logs to find warnings and error messages in case of failure.
  • Extensively used Informatica power center for extraction, transformation and loading process.
  • Communicated with DBAs to backup the data and discuss the query performance.
  • Written complex SQL queries and executed them to validate the incremental ETL Loads.
  • Used Synonyms and DB Links to validate the data from different schemas and databases.
  • Compared the data sets returned by the Oracle and SQL Server databases using Toad for Data Analysts. Experienced with the Star Schema and Snowflake Schema data models.
  • Analyzed Data Marts and tested several Dimension and Fact tables.
  • Extensively used Informatica client tools. The objective was to extract data stored in an Oracle database and in flat files and load it into a single data warehouse repository in Oracle.
  • Used StarTeam to report bugs to developers and track them from time to time.
  • Used Microsoft Outlook to schedule meetings and communicate with the project team.
  • Attended the Daily Status Call with the Manager, Development Team and QA Team to discuss project updates. Documented test cases, test scripts and test results in Microsoft Excel.

Environment: Oracle 10g, SQL Server 2005, SQL, PL/SQL, Borland StarTeam, Informatica (Workflow Manager, Workflow Monitor), Flat Files, Mapping Document, TOAD for Data Analysts 8.5, SQL Server Management Studio, Windows XP Professional, QTP, Unix, Test Case, Test Script, OBIEE, Microsoft Office (Excel, Power Point and Outlook)

Confidential, NEWARK, NJ

ETL/SQL TESTER

Responsibilities:

  • Involved in Data Validation using SQL queries.
  • Utilized Access database to collect data from multiple database sources using ODBC methods.
  • Clearly communicated and documented test results and defect resolutions during all phases of testing.
  • Worked with data validation, constraints, source to target row counts
  • Involved in Developing Test Plans and Developed Test Cases/Test Strategy with input from the assigned Business Analysts. Developed test scripts based upon business requirements and rules
  • Involved in testing the design and development of data warehouse environment.
  • Tested the messages published by DataStage and the data loaded into various databases.
  • Reconciled test results from different tests and different groups.
  • Worked on profiling the data for understanding the data and mapping document.
  • Reported bugs and tracked defects using Quality Center
  • Used TOAD Software for querying the databases.
  • Effectively coordinated with the development team and Business Analysts.
  • Executed Datastage ETL jobs using UNIX to load the data into target tables.
  • Attended status meetings periodically with QA members and project manager.

Environment: Oracle9i, SQL, PL/SQL, SQL* Loader, TOAD, DataStage 7.0/6.1, Windows XP, UNIX, Business Objects XIR2, IBM UDB DB2, XML, COBOL, Flat Files, MS Excel, MS Access, HP Quality Center 8.0, RTM (Requirements, Test Case), Test Script.

Confidential, NEW YORK, NY

DATA/SQL TESTER

Responsibilities:

  • Involved in Business analysis and requirements gathering. Assisted in creating fact and dimension table implementation in Star Schema model based on requirements.
  • Involved in complete project software development life cycle (SDLC) and Quality Assurance Life Cycle (QALC). Reported periodic project status and updates to the QA Lead and QA Manager
  • Performed Integration Testing of different modules.
  • Heavily involved in interacting with UNIX Shell scripts.
  • Analyzed Business Requirements and Developed the Test Plan, Test Scripts and Test Cases
  • Developed a Test Plan and Test Cases for the entire Data Mart module of the product.
  • Used Test Director to organize the WinRunner scripts and schedule test execution.
  • Developed SQL scripts to validate the data loaded into warehouse and Data Mart tables using Informatica. Performed Data Driven Testing of the Application for different users/data sets
  • Preparation of test data for various levels of testing
  • Performed segmentation to extract data and create lists supporting direct marketing mailings and marketing campaigns. Loaded new or modified data into the back-end Oracle database.
  • Optimizing/Tuning several complex SQL queries for better performance and efficiency.
  • Created various PL/SQL stored procedures for dropping and recreating indexes on target tables.
  • Worked on issues with migration from development to testing. Designed and developed UNIX shell scripts as part of the ETL process, automate the process of loading, pulling the data.
  • Validated cube and query data from the reporting system back to the source system.
  • Tested analytical reports using Analysis Studio
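Validating cube and report data back to the source system, as described above, generally means re-aggregating the source with the report's own grouping and diffing the totals. A sketch of that reconciliation, with hypothetical `fact_sales` (source) and `report_totals` (report extract) tables in Python/SQLite:

```python
import sqlite3

# Hypothetical source fact table and a "report extract" table; real
# names and grain would come from the report specification.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE fact_sales (region TEXT, amount REAL);
    CREATE TABLE report_totals (region TEXT, total REAL);
    INSERT INTO fact_sales VALUES
        ('East', 100.0), ('East', 50.0), ('West', 200.0);
    INSERT INTO report_totals VALUES ('East', 150.0), ('West', 210.0);
""")

# Aggregate the source the same way the report does, then keep only
# the regions whose totals disagree beyond a small tolerance.
diffs = cur.execute("""
    SELECT r.region, r.total, s.src_total
    FROM report_totals r
    JOIN (SELECT region, SUM(amount) AS src_total
          FROM fact_sales GROUP BY region) s
      ON s.region = r.region
    WHERE ABS(r.total - s.src_total) > 0.001
""").fetchall()
print("mismatched regions:", diffs)
```

An empty result means the report ties out to the source; any rows returned identify exactly which report cells to raise as defects.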

Environment: Oracle 8i, SQL, TOAD 7.0, SQL*Loader, Informatica Power Center 7.1 (Workflow Manager, Workflow Monitor), DB2, Cognos 7.0, Flat Files, SQL Server 2000, MS Excel, MS Access, Test Director.
