
ETL Tester / Big Data Tester Resume


Chicago, IL

SUMMARY

  • Over 7 years of professional experience as a Quality Assurance (QA) Analyst in Hadoop, Big Data Technologies, ETL, Automation, and BI Tools.
  • Working knowledge of the Cloudera Distribution as a client platform for the Hadoop ecosystem.
  • Experience working with various Hadoop components such as HDFS, Hive, and Sqoop.
  • Experience in transferring initial and incremental data from RDBMS to Hive using Sqoop.
  • Knowledge of HBase and Pig components.
  • Worked on developing Test Cases, Test Scripts, and Test Matrices and on executing Test Cases.
  • Knowledge of Dimensional Data Modeling using Star and Snowflake schema.
  • Experience in testing Data Marts and Data Warehouse/ETL applications developed in DataStage and Informatica using SQL Server.
  • Experience in Data Analysis, Data Validation, Data Verification, Data Cleansing, Data Completeness and identifying data mismatch.
  • Extensive experience in all aspects of the Software Test Life Cycle, including System Analysis, Design, Development, Execution, Reporting, and Closure Documentation.
  • Experience in writing and executing Test Plans and Test Cases from Requirements and Design documents.
  • Used ETL methodologies to support data extraction, transformation, and loading in a corporate-wide ETL solution using Informatica PowerCenter.
  • Experience in validating data for reports developed in Tableau/Cognos/MicroStrategy.
  • Expert in writing complex SQL queries to check data integrity and perform database testing (a representative sketch follows this summary).
  • Experience with UNIX commands and Shell Scripting.
  • Experienced in Functional, System, Integration, and Regression testing, UAT, and GUI/Web-based testing.
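
As a minimal, illustrative sketch of the kind of source-to-target integrity check described above (all schema, table, and column names are placeholders, not actual project objects):

  -- Illustrative only: schema, table, and column names are placeholders.
  -- Row-count reconciliation between a source table and its target load.
  SELECT 'SOURCE' AS layer, COUNT(*) AS row_cnt FROM src_schema.customer
  UNION ALL
  SELECT 'TARGET', COUNT(*) FROM tgt_schema.customer_dim;

  -- Duplicate check on the target table's natural key.
  SELECT customer_id, COUNT(*) AS dup_cnt
  FROM tgt_schema.customer_dim
  GROUP BY customer_id
  HAVING COUNT(*) > 1;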

TECHNICAL SKILLS

Operating Systems: Windows, LINUX, UNIX

Big Data Technology: HDFS, MapReduce, Hive, Pig, HBase, Sqoop

Languages: SQL, XML

Test Management: HP ALM 11.0, JIRA 8.2, QTEST 7.4

Webservices: Soap UI 5.5

ETL Tools: Informatica 10.x/9.6.x, IBM DataStage 11.7

RDBMS/ Databases: Oracle 11g/10g, SQL Server 2008/2005, Teradata 13

BI Tools: Tableau 10.5, Cognos 11, MicroStrategy 10

Automation Tools: PyCharm, Informatica DVO, Zena

PROFESSIONAL EXPERIENCE

ETL Tester / Big Data Tester

Confidential, Chicago, IL

Responsibilities:

  • Analyzed business requirements from the client, created test cases based on the requirements, executed test cases, validated test results, and documented test evidence for test approval.
  • Prepared test data for positive and negative test scenarios as per the test plan.
  • Wrote complex SQL scripts to validate data in different layers for downstream applications (see the illustrative query after this list).
  • Ran ASG Zena Processes to load the data from source layers to different consumption layers of the data lake as a first step for validations.
  • Worked with Teradata SQL Assistant to query data in the databases for quality analysis.
  • Worked with PyCharm-based automation to validate data completeness and data quality as per the general requirements.
  • Maintained proper test documentation and provided sign-off in a timely manner.
  • Participated in discussions with the business, product owner, DAs, and SAs to understand business requirements, define test requirements, and provide test estimates that satisfy the business objectives.
  • Reviewed and discussed test cases with the SA, BA, and architect.
  • Performed Extract Validation and DQ Validation, along with error-report and Archival Validation, against data quality thresholds before sending data to downstream applications.
  • Used Infogix Assure to verify balancing and controls on the data.
  • Linked test requirements, test cases, and defects in Jira and qTest.
  • Accomplished testing goals covering Data Completeness, Data Transformation, Data Quality, Performance, and Scalability.
  • Extracted reports and metrics from the qTest application on a weekly basis for review with upper-level management.
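
As an illustrative sketch of the layer-to-layer validation queries referenced above (Teradata-style SQL; database, table, and column names are placeholders):

  -- Illustrative only: database, table, and column names are placeholders.
  -- Rows present in the source layer but missing from the consumption layer of the data lake.
  SELECT acct_id, txn_dt, txn_amt
  FROM   stg_db.transactions
  MINUS
  SELECT acct_id, txn_dt, txn_amt
  FROM   cons_db.transactions;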

Environment: HDFS, HIVE, Sqoop, PIG, Tricentis qTest, ASG Zena, Infogix, Talend, WinSCP, MS Word, MS Excel, Teradata, HQL, UNIX, Windows, Tableau

ETL Tester / Hadoop Tester

Confidential, Dayton, Ohio

Responsibilities:

  • Analyzed requirements, created and executed test cases, validated test results, and documented test evidence.
  • Worked with different Hadoop components such as HDFS, Hive, and Sqoop.
  • Executed Sqoop jobs to load initial and incremental data from RDBMS to Hive and performed validation.
  • Used HQL queries to analyze the HDFS data loaded from the source systems.
  • Wrote queries in the Hive terminal to validate data between the source and target databases (see the Hive QL sketch after this list).
  • Maintained all test cases on HP ALM.
  • Participated in daily scrum meetings and sprint planning meetings.
  • Tracked all the defects down to closure using defects module in HP ALM.
  • Interacted with the users to ensure meaningful development of the scripts and simulated real-time business scenarios.
  • Developed SQL scripts using TOAD to query the databases and analyze the results.
  • Involved in various kinds of testing like functional testing, system testing and integration testing.
  • Tested Tableau reports by writing SQL queries against the DWH and verified data quality issues, fonts, headers, and other cosmetic items.
  • Tested ETL mappings and reusable transformations for daily data loads.
  • Discussed issues/defects with the business and development teams.
  • Tested, verified, and logged defects.
  • Communicated and coordinated closely with the offshore team.
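
As an illustrative Hive QL sketch of the source-to-target checks referenced above (database, table, and column names are placeholders):

  -- Illustrative only: database, table, and column names are placeholders.
  -- Records loaded into Hive that cannot be matched back to the source extract.
  SELECT t.order_id
  FROM   hive_db.orders t
  LEFT JOIN src_db.orders_extract s
         ON t.order_id = s.order_id
  WHERE  s.order_id IS NULL;

  -- Partition-level row counts for the incremental load.
  SELECT load_dt, COUNT(*) AS row_cnt
  FROM   hive_db.orders
  GROUP BY load_dt;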

Environment: HDFS, Hive, HBase, Pig, Sqoop, Tableau 10.4, SQL Server, HP ALM 11.0, SOAP UI 5.5, Agile, TOAD, Windows OS, JIRA 8.2

ETL Tester

Confidential, MI

Responsibilities:

  • Worked with Business Analysts to understand business/system requirements and transform them into functional test cases.
  • Responsible for reviewing functional specifications, user documentation, and use cases and developing test cases from them.
  • Responsible for planning and directing Quality Assurance and Software change control policies.
  • Studied and analyzed the mapping document indicating the source tables, columns, data types, required transformations, business rules to be applied, and the target tables, columns, and data types.
  • Coordinated with team members in setting up Test Environments and Test Plan needs.
  • Developed various Test Scripts and performed Test Execution using HP ALM based on the Functional Specifications.
  • Executed Test Cases using positive and negative data in ALM Test Lab and reported results and defects using Quality Center’s Defects tool.
  • Involved in validating the Datamart tables as per the business logic applied to the staging tables.
  • Coordinated with the team to ensure and track quality deliverables against the business requirements.
  • Performed System Testing, Functional Testing, and Regression Testing.
  • Documented identified defects and implemented resolutions as defined by management.
  • Prepared and executed SQL queries to validate data between source and target databases (see the illustrative query after this list).
  • Created a Traceability Matrix for Requirements vs. Test Case Matrix.
  • Tested Informatica ETL mappings that transfer data from DWH systems to the Data Mart.
  • Performed regression testing for the integration of MicroStrategy reports.
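
As an illustrative sketch of the source-to-target mapping checks referenced above (the transformation rule and all object names are placeholders):

  -- Illustrative only: the derivation rule and all object names are placeholders.
  -- Flag target rows whose derived column does not match the rule in the mapping document.
  SELECT s.policy_id
  FROM   stg.policy s
  JOIN   dm.policy_fact f
         ON f.policy_id = s.policy_id
  WHERE  f.annual_premium <> s.monthly_premium * 12;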

Environment: Informatica 10.x, HP ALM 11.0, Agile, SQL Server, Windows OS, MicroStrategy 10

ETL Tester

Confidential, Montvale, NJ

Responsibilities:

  • Performed negative and positive testing to make sure validations are done properly.
  • Provided updates to the Test lead on testing status and defect summary reports.
  • Carried out and oversaw test execution for Business Scenario, User Acceptance, and Regression testing.
  • Involved in Data Validation using SQL queries (see the illustrative query after this list).
  • Used DataStage as an ETL tool for developing the Data Warehouse.
  • Extensively involved in Extraction, Transformation, and Loading of data into the target Oracle warehouse database.
  • Raised defects in HP ALM and followed up with the development team to avoid slippage.
  • Tested Cognos Reports that facilitate decision-making; the reports used formulas, parameters, selection criteria, sub-reports, etc.
  • Worked with Data Warehousing developers who extensively used DataStage to design mappings that move data from source to target databases using Stages.
  • Prepared defect summary and test case summary reports.
  • Experienced in writing SQL queries for extracting data from multiple tables.
  • Reviewed the test cases written based on the Change Request document.
  • Performed testing based on Change Requests and Defect Requests.
  • Prepared System Test Results after test case execution.
  • Effectively coordinated with the development team.
  • Created critical scenarios for each change request and defect request.
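
As an illustrative sketch of the report-to-warehouse validation referenced above (Oracle-style SQL; all object names and the report filter are placeholders):

  -- Illustrative only: warehouse object names and the report filter are placeholders.
  -- Reproduce a Cognos report total directly from the warehouse for comparison.
  SELECT region, SUM(sales_amt) AS total_sales
  FROM   dwh.sales_fact
  WHERE  fiscal_year = 2020
  GROUP BY region
  ORDER BY region;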

Environment: IBM DataStage 8.7, BI Cognos 10.2, MS Access, MS Excel, MS Word, XML, Oracle 11g/12c, SQL Server, HP ALM 11.0
