
Big Data and ETL Test Lead Resume


NA

SUMMARY:

  • 9 years of Data Warehousing and Big Data experience using Hadoop, Hive, Pig, SQOOP, HDFS, Informatica PowerCenter 9.1/8.6, ETL concepts, SAP BO, and Data Warehouse and BI Testing.
  • Strong understanding of Big Data, Data warehouse and BI Analytics concepts.
  • Experience with Hadoop ecosystem tools Hive, Impala, Flume, SQOOP, Oozie, Hue and Pig.
  • Experience writing Hive queries for data analysis to meet the business requirements.
  • Experience loading data into HIVE External and Managed tables using SQOOP.
  • Experience in validating the files loaded into HDFS.
  • Validating that there is no data loss by comparing HIVE table data against RDBMS data (see the sketch after this list).
  • Well - versed with all stages of Software Development Life Cycle (SDLC) and Software Testing Life Cycle (STLC).
  • Professional experience in Integration, Functional, Regression, System Testing, Load Testing and UAT Testing.
  • Experience in SQL and PL/SQL scripting.
  • Sound knowledge and experience in Metadata and Star schema/Snowflake schema; analyzed Source Systems, the Staging area, and Fact and Dimension tables in the target Data Warehouse.
  • Experience in creating Test Readiness Review (TRR), Requirement Traceability Matrix (RTM) documents.
  • Good Autosys experience from a job execution and data testing standpoint.
  • Experience in Reports testing using SAP BO and OBIEE.
  • Expertise in Data Lake and Big Data Testing.
  • Expertise in Hadoop, Hive and Informatica. Exposure to the Insurance domain.
  • Good experience in ETL and BI Testing, and in developing and supporting Informatica applications.
  • Understanding the various levels of Software Development Life Cycle (SDLC) and Software Testing Life Cycle (STLC).
  • Experience in understanding Business Process from the requirements and converting them to test scenarios.
  • Experience in Test Estimations based on requirement analysis.
  • Experience in preparing Test Strategy, developing Test Plan, Detailed Test Cases, writing Test Scripts by decomposing Business Requirements, and developing Test Scenarios to support quality deliverables.
  • Experience in using HP Quality Center for test management, bug tracking and defect reporting.
  • Experience in working with software development teams to resolve defects, present defect status reports, and resolve requirement and design inconsistencies.
  • Expertise in querying and testing RDBMS such as Teradata, Oracle, MS SQL Server, Netezza for data integrity.
  • Extensive experience in HP ALM.
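
The following is a minimal sketch of the Hive-versus-RDBMS data-loss check described above, assuming a reachable HiveServer2 and an Oracle source; the connection URLs, credentials and table names (stage_db.customers, customers) are hypothetical placeholders rather than details from any specific project.

    #!/bin/bash
    # Minimal row-count reconciliation: Hive target table vs. RDBMS source table.
    # All connection details and table names below are hypothetical placeholders.

    # Row count from the Hive table via beeline.
    HIVE_COUNT=$(beeline -u "jdbc:hive2://hiveserver:10000/default" \
      --silent=true --showHeader=false --outputformat=tsv2 \
      -e "SELECT COUNT(*) FROM stage_db.customers;" | tail -1)

    # Row count from the Oracle source via SQL*Plus.
    ORA_COUNT=$(printf "SET HEADING OFF FEEDBACK OFF PAGESIZE 0\nSELECT COUNT(*) FROM customers;\nEXIT;\n" \
      | sqlplus -s etl_qa/password@SRCDB | tr -d '[:space:]')

    echo "Hive: $HIVE_COUNT  Source: $ORA_COUNT"
    if [ "$HIVE_COUNT" -eq "$ORA_COUNT" ]; then
      echo "PASS: row counts match, no apparent data loss"
    else
      echo "FAIL: row counts differ, investigate for data loss"
    fi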

TECHNICAL SKILLS:

Testing Methodologies: Big Data / ETL Testing Methodologies

Big Data/Cloud Technologies: Hortonworks, Cloudera, HDFS, Sqoop, Hive, Flume, Impala, Pig, Oozie, HUE, Elastic Search, Microsoft Azure

ETL Tool: Informatica PowerCenter 9.6.1, Ab Initio 1.1.3

Reporting Tool: OBIEE 10g, SAP Business Objects 14.2.3

Scheduling Tool: Control-M v8, Autosys

RDBMS: Oracle 10g/11g/12c, Teradata v13, SQL Server 2008, Mainframe DB2, Netezza

Programming Languages: SQL, PL/SQL

Business Applications: MS-Office

Scripting Languages: HTML, XML, Shell Scripting

Tools: BMC Remedy 7.6, WinSCP, HP ALM 12.2, Jira, Jenkins

Methods: Change Management, Problem Management, Incident Management

Operating Systems: Windows 9x/2000, UNIX, LINUX

PROFESSIONAL EXPERIENCE:

Confidential, NA

Big Data and ETL Test Lead

Environment: Hortonworks, HDFS, HIVE, HUE, YARN, PIG, SQOOP, Map Reduce, DB2, Oracle, SQL Server, Netezza, UNIX, HP ALM, Autosys, Informatica 9.6.1, Business Objects

Responsibilities:

  • Prepared Test Plan/Approach; designed and reviewed Test Scenarios and Test Cases.
  • Wrote Hive/SQL queries for data validation to meet the business requirements.
  • Wrote Test Cases for Data Correctness, Data Transformation, Metadata, Data Integrity, Data Quality, Data Security and Negative Scenario Tests.
  • Worked in Sanity Testing, System testing, Re-Testing and Regression Testing of HIVE Tables.
  • Wrote HIVE queries and HDFS commands to validate the data loaded into HIVE External tables against the underlying HDFS files (see the sketch after this list).
  • Have sound knowledge of environment setup and end-to-end data loads in QA and UAT environments for data refresh.
  • Ensure the MapReduce jobs are running at peak performance.
  • Support testing for a range of projects from defect fixes and enhancements to strategic initiatives.
  • Have sound knowledge on data copy (HDFS Files and Hive Tables) from PROD to QA and UAT Env for test data creation.
  • Have sound knowledge in analysing the source data sets to identify the data integration issues between dependent source systems.
  • Have good technical knowledge to understand the Project Architecture and to identify the design issues.
  • Have good insight into the domain to identify the gaps in requirements and data loss from source systems.
  • Mapped Requirements to Test Cases in the Requirement Traceability Matrix.
  • Experienced in reviewing Hive Query log files.
  • Experienced in validating SQOOP scripts.
  • Validated data in different file formats such as tab-delimited, fixed-width and JSON files.
  • Experienced in writing HDFS commands to validate the data loaded in HADOOP File system.
  • Have knowledge to verify available resource utilization on the YARN server.
  • Experienced in importing data from different RDBMS source systems into Hive tables using the Hadoop ecosystem tool SQOOP.
  • Mentor junior members of the team.
  • Test Results Reporting to Stakeholders.
  • Have knowledge of HDFS commands to bring local files into HDFS and export HDFS files back to the local file system.
  • Logged defects in the defect-tracking tool QC.
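
A minimal sketch of the HDFS-file-versus-Hive-external-table validation referenced in this list, assuming plain-text files with one record per line and no header rows; the directory, database and table names are hypothetical placeholders.

    #!/bin/bash
    # Compare record counts between the HDFS landing files and the Hive external
    # table defined on top of them (all paths and names are hypothetical).

    HDFS_DIR=/data/landing/claims
    TABLE=claims_db.claims_ext

    # Count records in the raw HDFS files (assumes text files, one record per line).
    FILE_COUNT=$(hdfs dfs -cat ${HDFS_DIR}/*.txt | wc -l)

    # Count rows exposed through the Hive external table.
    TABLE_COUNT=$(hive -S -e "SELECT COUNT(*) FROM ${TABLE};")

    echo "HDFS file records: ${FILE_COUNT}, Hive table rows: ${TABLE_COUNT}"
    [ "${FILE_COUNT}" -eq "${TABLE_COUNT}" ] && echo "PASS" || echo "FAIL: counts differ"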

Confidential

Big Data Tester

Environment: Hortonworks, HDFS, HIVE, HUE, SQOOP, Flume, Oracle, SQL Server, DB2, UNIX, HP ALM

Responsibilities:

  • Discussions with the Work Streams (Source Systems) about the Business Requirements and Pre-requisites.
  • Validated the data loaded in HDFS against the RDBMS tables using Hive and SQL Queries.
  • Prepared Test Plan/Approach; designed and reviewed Test Scenarios and Test Cases.
  • Wrote Hive/Oracle queries for data validation to meet the business requirements.
  • Wrote Test Cases for Data Correctness, Data Transformation, Metadata, Data Integrity and Data Quality Tests.
  • Worked in Sanity Testing, System testing, Re-Testing and Regression Testing of HIVE Tables.
  • Wrote HIVE queries and HDFS commands to validate the data loaded into HIVE External tables against the underlying HDFS files.
  • Wrote HIVE queries to validate the data between the Hive External tables and Parquet tables (see the sketch after this list).
  • Participated in Test Case walkthroughs, Review meetings.
  • Mapped Requirements to Test Cases in the Requirement Traceability Matrix.
  • Experienced in reviewing Hive Query log files.
  • Experienced in validating SQOOP scripts.
  • Validated data in different file formats such as tab-delimited, fixed-width and JSON files.
  • Experienced in writing HDFS commands to validate the data loaded in HADOOP File system.
  • Have knowledge to verify available resource utilization in YARN server.
  • Actively participate with project team in requirements gathering and analysis.
  • Have knowledge of the Hadoop ecosystem tool SQOOP to import data into HIVE tables.
  • Create and maintain all testing artifacts.
  • Test Results Reporting to Stakeholders.
  • Have knowledge of HDFS commands to bring local files into HDFS and export HDFS files back to the local file system.
  • Logged defects in the defect-tracking tool QC.
  • Worked closely with offshore team on work assignments.
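
A minimal sketch of the Hive external-table-versus-Parquet-table comparison referenced above: a side-by-side row count followed by an anti-join that surfaces keys missing from the Parquet table. The HiveServer2 URL, database and table names, and the policy_id key column are hypothetical placeholders.

    #!/bin/bash
    # Compare a raw Hive external (staging) table against the corresponding Parquet table.
    # Database, table and key column names are hypothetical placeholders.
    URL="jdbc:hive2://hiveserver:10000/default"

    # Side-by-side row counts for the two tables.
    beeline -u "$URL" -e "SELECT 'external' AS src, COUNT(*) AS cnt FROM stage_db.policy_ext
                          UNION ALL
                          SELECT 'parquet' AS src, COUNT(*) AS cnt FROM core_db.policy_parquet;"

    # Keys present in the external table but missing from the Parquet table (sample of 20).
    beeline -u "$URL" -e "SELECT e.policy_id FROM stage_db.policy_ext e
                          LEFT JOIN core_db.policy_parquet p ON e.policy_id = p.policy_id
                          WHERE p.policy_id IS NULL LIMIT 20;"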

Confidential

Senior Big Data Tester

Environment: Cloudera, HDFS, HIVE, HUE, SQOOP, Oracle, UNIX, JIRA, Impala

Responsibilities:

  • Prepared Test Plan; designed and reviewed Test Scenarios and Test Cases.
  • Discussions with the Work Streams (Source Systems) about the Business Requirements and Pre-requisites.
  • Validated the data loaded in HDFS against the RDBMS tables using Hive and SQL Queries.
  • Wrote Hive/Oracle queries for data validation to meet the business requirements.
  • Wrote Test Cases for Data Correctness, Data Transformation, Metadata, Data Integrity and Data Quality Tests.
  • Migrated data from different RDBMS systems such as Oracle, SQL Server, Mainframe VSAM and AS400 DB2, as well as delimited and fixed-width file formats, using SQOOP (see the sketch after this list).
  • Worked in Sanity Testing, System testing, Re-Testing and Regression Testing of HIVE Tables.
  • Wrote HIVE queries and HDFS commands to validate the data loaded into HIVE External tables against the underlying HDFS files.
  • Written HIVE queries to validate the data between the Hive External tables and Parquet tables.
  • Participated in Test Case walkthroughs, Review meetings.
  • Mapped Requirements to Test Cases in the Requirement Traceability Matrix.
  • Experienced in reviewing Hive Query log files.
  • Experienced in validating SQOOP scripts.
  • Wrote Python scripts to validate the data between Source Files and Hive table data.
  • Validated data in different file formats such as tab-delimited, fixed-width and JSON files.
  • Experienced in writing HDFS commands to validate the data loaded in HADOOP File system.
  • Have knowledge to verify available resource utilization in YARN server.
  • Have knowledge of the Hadoop ecosystem tool SQOOP to import data into HIVE tables.
  • Have knowledge of HDFS commands to bring local files into HDFS and export HDFS files back to the local file system.
  • Developed Hive queries and Pig scripts to analyze large datasets.
  • Experience in using Sqoop to connect to DB2/Oracle and move the data to Hive tables or Avro files.
  • Involved in generating ad hoc reports using Hive queries.
  • Logged defects in the defect-tracking tool JIRA.
  • Worked closely with offshore team on work assignments.
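
A minimal sketch of the Sqoop-based migration described above: an import of one Oracle table into a Hive table followed by a quick row-count check of the loaded table. The JDBC URL, credentials, mapper count and table names are hypothetical placeholders, not project specifics.

    #!/bin/bash
    # Hypothetical Sqoop import of an Oracle table into Hive, followed by a quick
    # row-count check of the loaded Hive table. All names and URLs are placeholders.

    sqoop import \
      --connect "jdbc:oracle:thin:@//dbhost:1521/ORCL" \
      --username etl_user -P \
      --table MEMBERS \
      --hive-import \
      --hive-table stage_db.members \
      --num-mappers 4

    # Confirm rows landed in the Hive table (compare against the source count separately).
    hive -S -e "SELECT COUNT(*) FROM stage_db.members;"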

Confidential

Senior Test Analyst

Environment: Informatica 8.6, Control-M, Unix, Teradata

Responsibilities:

  • Discussions with the Business Analyst and Work Streams (Source Systems) about the Business Requirements and Pre-requisites.
  • Gathered the business requirements from the Business Partners and Subject Matter Experts.
  • Involved in Analysis, Design, Development and Testing of application modules.
  • Loaded and transformed large sets of structured, semi-structured and unstructured data.
  • Tested data in the staging area and validated initial and incremental data loads.
  • Wrote complex SQL queries to incorporate ETL logic and to validate Test Cases.
  • Worked with fact tables and dimension tables to validate the data integrity (see the sketch after this list).
  • Validated BO Reports.
  • Involved in Data mapping specifications to create and execute detailed system test plans. The data mapping specifies what data will be extracted from an internal data warehouse, transformed and sent to an external entity.
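
A minimal sketch of the fact/dimension data-integrity check mentioned above, run here through BTEQ against Teradata: it counts fact rows whose foreign key has no matching dimension row. The server, credentials, schema, table and column names are hypothetical placeholders.

    #!/bin/bash
    # Hypothetical referential-integrity check on Teradata: fact rows whose customer
    # key has no matching dimension row. All names and credentials are placeholders.

    SQL="SELECT COUNT(*) AS orphan_fact_rows
         FROM dw.sales_fact f
         LEFT JOIN dw.customer_dim d ON f.customer_key = d.customer_key
         WHERE d.customer_key IS NULL;"

    printf '.LOGON tdserver/etl_user,password;\n%s\n.LOGOFF;\n.QUIT;\n' "$SQL" | bteq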

Confidential

Senior ETL Tester

Environment: Informatica 8.6, Oracle 10g, Control-M, UNIX, QC 10.0

Responsibilities:

  • Discussions with the Business Analyst and Work Streams (Source Systems) about the Business Requirements and Pre-requisites.
  • Analyze the result set of data loaded by monitoring the properties using the Workflow Monitor.
  • Monitored the scheduled Informatica jobs through Control-M Enterprise Manager as part of Test Execution.
  • Prepared Test Plan; designed and reviewed Test Scenarios and Test Cases.
  • Worked in Sanity Testing, System Testing, Re-Testing and Regression Testing as part of the Data Warehouse STLC.
  • Wrote Test Cases for Data Correctness, Data Transformation, Metadata, Data Integrity and Data Quality Tests.
  • Worked on SQL scripts to validate the data in the warehouse.
  • Executed test cases on Source database tables, Staging tables and Data Warehouse tables (see the sketch after this list).
  • Participated in Test Case walkthroughs, Review meetings.
  • Logged defects in the defect-tracking tool Quality Center, and tested and maintained a trailing history of the defects found in the software.
  • Presented Test cases and Test Results to the client.
  • Tracked the defects using Quality Center tool and generated defect summary reports.
  • Created Daily, Weekly Status Reports to state the testing progress, Issues and Risks to the Project Lead.
  • Gave training sessions to the Testing team on DWH Concepts, ETL Process and ETL Testing methodologies.
  • Assign tasks, monitor and review status and progress.
  • Mapped Requirements to Test Cases in the Requirement Traceability Matrix.
  • Uploaded Test cases into QC Test Plan and moved test cases into Test Lab component.
  • Captured the run statistics of Informatica workflows as part of Performance Testing, and performed Load Testing of Informatica objects with high volumes of data.
  • Involved in UAT Testing.
  • UAT defect causal analysis.
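
A minimal sketch of the source/staging/warehouse validation referenced in this list, assuming the QA account can query all three schemas from a single Oracle connection; the schemas, table names and credentials are hypothetical placeholders.

    #!/bin/bash
    # Hypothetical post-load sanity check: row counts across the source, staging and
    # warehouse layers for one entity. Schemas, tables and credentials are placeholders.

    printf "%s\n" \
      "SET PAGESIZE 100 FEEDBACK OFF" \
      "SELECT 'SOURCE'  layer, COUNT(*) row_cnt FROM src.orders" \
      "UNION ALL" \
      "SELECT 'STAGING' layer, COUNT(*) row_cnt FROM stg.orders_stg" \
      "UNION ALL" \
      "SELECT 'TARGET'  layer, COUNT(*) row_cnt FROM dwh.orders_fact;" \
      "EXIT;" \
      | sqlplus -s etl_qa/password@DWHDB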

Confidential

Production Support Executive L3, L2

Environment: Informatica 8.6, DAC 10g, Oracle 10g, UNIX

Responsibilities:

  • Root cause analysis of Job Failures and Recovering the Failed Jobs.
  • Involved in planned and unplanned downtime activities and coordinated with the required teams to bring the system up for use.
  • Resolving issues raised by customers using Remedy Request.
  • Analyze the result set of data loaded by monitoring the properties using the Workflow Monitor.
  • Making sure that the application is available for the business users for their critical functioning.
  • Coordinate with the interface teams on any issue which hampers availability of the application.
  • Performing Emergency bug-fixes.
  • Exposure to the BMC Remedy tool for raising change requests on requirements with the business users.
  • Requirement Analysis for enhancements/changes.
  • Schedule and monitor the Workflows using the Data Warehouse Administration Console (DAC).
  • Analyzed session log files for failed sessions to resolve errors in the mapping or session configuration (see the sketch after this list).
  • Used Debugger to troubleshoot the mappings.
  • Worked with Change Management team during Production migration.
  • Prepared Standard Operating Procedures (SOP) and Frequently Occurring Issues (FOI).
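
A minimal sketch of the session-log analysis mentioned above: a first-pass scan of an Informatica session log for error lines. The log directory and session name are hypothetical placeholders and vary by installation.

    #!/bin/bash
    # Hypothetical first-pass triage of a failed Informatica session: scan its session
    # log for error and rejection messages. Log path and session name are placeholders.

    LOG_DIR=/infa/infa_shared/SessLogs
    SESSION=s_m_load_customer_dim

    grep -iE "ERROR|FATAL|rejected" "${LOG_DIR}/${SESSION}.log" | tail -50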
