
Sr. ETL DW Tester and Data Analyst Resume

Bolingbrook, IL

SUMMARY

  • Sr. ETL tester with 7+ years of Software Testing Life Cycle (STLC) experience with data analysis, data validation, data profiling, data verification, and data loading to achieve the highest degree of Quality and Performance.
  • Proficient in Test Cases and Test Scripts creation for Business processes with expertise in SQL, PL/SQL, Stored Procedures, Data Warehouse concepts, and Business Intelligence.

TECHNICAL SKILLS

ETL: DataStage, Informatica, Ab Initio, Talend, SSIS, Azure Data Factory.

Databases: Teradata, Oracle, SQL Server, MySQL, DB2, Netezza.

Big Data: Hadoop, HIVE, Cassandra, MongoDB, MarkLogic, Snowflake.

Real-Time Data Streaming: Kafka, Flume, NiFi, Kinesis.

Reports: Business Objects, Crystal Reports, Cognos, Excel.

Business Intelligence: Tableau, Power BI.

Scheduling: Control-M, Autosys, Tivoli.

Methodologies: Agile Scrum, Kanban, Waterfall.

PROFESSIONAL EXPERIENCE

Sr. ETL DW Tester and Data Analyst

Confidential, Bolingbrook, IL

Responsibilities:

  • Worked on a virtual payment system, an electronic payment solution whereby companies pay participating suppliers via electronic credit card instead of check, eliminating paper and reducing processing cost and time.
  • Wrote test scripts and test scenarios for validating end-to-end data from source to target across 100+ ETL flow designs, including SCD and incremental loads, with automation where necessary.
  • Performed data analysis and validated BI dashboards and reports for data quality and performance as per SLA.
  • Provided exact defect reproduction steps and fix suggestions to the development team via defects logged in JIRA.
  • Automated the process of creating, loading, pulling, and validating the data using Python, Shell scripts, and Excel Macros (a minimal validation sketch follows this list).
  • Expertise in designing test scenarios and scripting test cases to test the application.
  • Experience in Dimensional Data Modeling using Star and Snowflake Schema.
  • Expertise in QA process and different levels of testing like Functional, Regression, and Integration testing with business scenarios.
  • Strong experience with ETL systems, with expertise in Informatica and DataStage.
  • Strong experience in testing Data Warehouse (ETL) and Business Intelligence (BI) applications.
  • Strong experience in Data Analysis, Data Validation, Data Profiling, Data Verification, and Data Loading.
  • Strong working experience in the Data Analysis and Testing of Data Warehousing using Data Conversions, Data Extraction, Data Transformation, and Data Loading (ETL).
  • Designed, developed, and documented the ETL (Extract, Transform & Load) strategy to populate the Data Warehouse from the various source systems.
  • Worked on Informatica PowerCenter 9.1 client tools like Source Analyzer, Warehouse Designer, Mapping Designer, Workflow Manager, and Workflow Monitor.
  • Involved in the design and development of complex ETL mappings.
  • Developed Python scripts to automate the test cases.
  • Designed and developed SSIS packages to extract data from various data sources such as Access databases, Excel spreadsheets, and flat files into SQL Server for further Data Analysis and Reporting, using multiple transformations provided by SSIS such as Data Conversion, Conditional Split, Bulk Insert, Merge, and Union All.
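
A minimal sketch of the kind of automated source-to-target validation described above, assuming pyodbc ODBC connections to the source and target databases; the DSN names, schemas, tables, and columns are illustrative assumptions, not the actual project objects.

    # etl_count_check.py - sketch of an automated source-to-target row-count check.
    # The DSNs "SRC_DSN"/"TGT_DSN" and the table names are illustrative assumptions.
    import sys
    import pyodbc

    SRC_QUERY = "SELECT COUNT(*) FROM stg.supplier_payments"     # staging (source) table
    TGT_QUERY = "SELECT COUNT(*) FROM dw.fact_supplier_payments"  # warehouse (target) table

    def row_count(dsn, query):
        """Run a COUNT(*) query against the given ODBC DSN and return the count."""
        conn = pyodbc.connect("DSN=" + dsn)
        try:
            return conn.cursor().execute(query).fetchone()[0]
        finally:
            conn.close()

    def main():
        src = row_count("SRC_DSN", SRC_QUERY)
        tgt = row_count("TGT_DSN", TGT_QUERY)
        print("source rows = %d, target rows = %d" % (src, tgt))
        if src != tgt:
            print("FAIL: row counts do not match")  # candidate for a JIRA defect
            return 1
        print("PASS: row counts match")
        return 0

    if __name__ == "__main__":
        sys.exit(main())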

ETL and Big Data QA Tester

Confidential

Responsibilities:

  • Performed performance testing, metadata validation, source to target data validation, and appropriate error handling in ETL processes.
  • Performed Informatica DVO data validation by testing complex views created against the target data.
  • Implemented partitioning and bulk loads for loading large volumes of data.
  • Identified performance bottlenecks and performed performance tuning of sources, targets, mappings, transformations, and sessions to optimize session performance.
  • Developed Informatica mappings using various transformations, sessions, and workflows; Teradata was the target database, and the sources were a combination of flat files, Oracle tables, Excel files, and Teradata tables.
  • Used Vim and PyDev (an Eclipse plugin for Python) as script editors.
  • Identified bugs in existing mappings/workflows by analyzing the data flow and evaluating transformations.
  • Performed performance tuning using session partitioning, dynamic cache memory, and index caches.
  • Created Complex ETL Packages using SSIS to extract data from staging tables to partitioned tables with the incremental load.
  • Wrote complex SQL queries to perform data validation and Referential Integrity checks for history, full, and incremental loads (see the referential-integrity sketch after this list). Used SSIS 2008/2012 to create ETL packages (.dtsx files) to validate, extract, transform, and load data into data warehouse and data mart databases, and processed SSAS cubes to store data in OLAP databases.
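
As an illustration of the referential-integrity checks mentioned in the last bullet, the sketch below counts fact rows whose dimension key has no matching dimension row; the DSN, schema, and key names are assumptions, not the actual warehouse objects.

    # ri_check.py - sketch of an orphan-key (referential integrity) check on the target warehouse.
    # The DSN "TGT_DSN" and the fact/dimension names are illustrative assumptions.
    import pyodbc

    ORPHAN_KEY_SQL = """
        SELECT COUNT(*)
        FROM dw.fact_sales f
        LEFT JOIN dw.dim_customer c
               ON f.customer_key = c.customer_key
        WHERE c.customer_key IS NULL
    """

    conn = pyodbc.connect("DSN=TGT_DSN")
    try:
        orphans = conn.cursor().execute(ORPHAN_KEY_SQL).fetchone()[0]
        if orphans == 0:
            print("PASS: every fact row has a matching dim_customer row")
        else:
            print("FAIL: %d fact rows reference a missing customer_key" % orphans)
    finally:
        conn.close()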

ETL Tester

Confidential, Atlanta, GA

Responsibilities:

  • Performed different kinds of testing like Black Box, GUI, Functionality, Integration, Regression, Usability, System, and User Acceptance Testing.
  • Created the Requirement Traceability Matrix in HP Quality Center and updated it promptly as requirements changed.
  • Developed and maintained Manual and Automation test scripts through HP Quality Center.
  • Developed automation Test Scripts using Eclipse, TestNG, and Java.
  • Used Quality Center for Bug Tracking, Bug fixing, and Bug Reporting.
  • Involved in error checking and testing of the ETL procedures and programs using Informatica session logs.
  • Extensively created and executed SQL queries against Oracle, SQL Server, and MySQL tables to validate data at the back end. Excellent understanding and knowledge of Business Intelligence tools for various applications of ETL, Reporting, Data Mining, Data Warehousing, and Analysis of data.
  • Involved in all the phases of the SDLC and STLC and gathered requirements from BRDs and FRDs.
  • Generated reports from the Cognos reporting tool and compared them with the EDW database.
  • Generated daily/weekly reports using ALM for program and release meetings.
  • Environment: SSIS, SQL Server R2, DB2, Netezza, IBM Cognos, Tivoli, Waterfall, HP ALM/Quality Center.
  • Implemented SDLC, QA methodologies, and concepts in the Project.
  • Coordinated component, system testing with the appropriate technical groups and Release Management, and documented test procedures and findings.
  • Expertise in writing SQL statements in the database to verify whether the data is populated in the Data Mart/Data Warehouse according to business rules.
  • Involved in ETL test data creation and identification for all the ETL mapping rules.
  • Simulated several production cycles. Worked with data validation, constraints, record counts, source-to-target row counts, random sampling, and error processing.
  • Extensively worked with SCD Type 1 and Type 2 (see the SCD Type 2 check after this list).
  • Tested the ETL process both before and after the data validation process. Tested the messages published by the ETL tool and the data loaded into various databases.
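
A minimal sketch of one of the SCD Type 2 checks referenced above: it verifies that no business key in a Type 2 dimension carries more than one current row. The DSN, table, and column names are illustrative assumptions.

    # scd2_check.py - sketch of an SCD Type 2 sanity check: at most one current row per business key.
    # The DSN "EDW_DSN", table edw.dim_customer, and column names are illustrative assumptions.
    import pyodbc

    DUPLICATE_CURRENT_SQL = """
        SELECT customer_id, COUNT(*) AS current_rows
        FROM edw.dim_customer
        WHERE current_flag = 'Y'
        GROUP BY customer_id
        HAVING COUNT(*) > 1
    """

    conn = pyodbc.connect("DSN=EDW_DSN")
    try:
        duplicates = conn.cursor().execute(DUPLICATE_CURRENT_SQL).fetchall()
        if duplicates:
            print("FAIL: %d business keys have more than one current row" % len(duplicates))
        else:
            print("PASS: no business key has more than one current row")
    finally:
        conn.close()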
