Sr. Big Data Test Analyst Resume

SUMMARY

  • 9 years of experience as an ETL/Hadoop/Big Data Tester.
  • Experienced in all stages of Software Development Life Cycle and Software Testing Life Cycle.
  • Good knowledge of and experience with metadata and Star schema/Snowflake schema designs.
  • Analyzed source systems, the staging layer, and fact and dimension tables in the target data warehouse.
  • Well-versed with different components of Hadoop such as HDFS, Hive, Sqoop, Pig and HBase.
  • Knowledge of both the Hadoop 1.0 and Hadoop 2.0 architectures.
  • Worked with the Cloudera distribution as the client platform for Hadoop.
  • Wrote several Hive scripts for data validation between the source and staging layers in HDFS.
  • Wrote several Hive queries to validate data between the hops in the data lake across the raw, staging, consolidation, mapping and outgoing layers (a validation sketch follows this list).
  • Used HDFS shell scripts to move data from files into HDFS and executed Sqoop commands to import data from an Oracle database into HDFS.
  • Professional experience in Integration, Functional, Regression, System, Load, UAT, Black Box and GUI testing.
  • Collaborated with project team members to discuss and resolve issues.
  • Experience with web-based applications such as online banking, transaction processing, and healthcare applications.
  • Tested the ETL Informatica mappings and other ETL Processes (Data Warehouse Testing).
  • Experience in creating Requirements Traceability Matrix (RTM) and Test Summary Report documents.
  • Experience in developing Test Plans, preparing Test Strategy, writing detailed Test Cases and Test Scripts, executing test cases based on the Business Requirements Document, and developing Test Scenarios to support quality deliverables.
  • Extensively worked on HP ALM to upload and execute Test Cases.
  • Logged defects, produced defect status reports, and flagged requirement and design inconsistencies.
  • Expertise in querying and testing RDBMSs such as Oracle and MS SQL Server, using SQL to verify data integrity.
  • Tested Web Services/XML/SOAP services using the SoapUI tool.
  • Extensive experience in testing BI reports generated by Tableau.
  • Strong working experience in Windows NT/2000/XP and UNIX environments.
  • Clear understanding of business procedures and ability to work individually and as part of a team.
  • Experience in interacting with business analysts, developers, and technical support to help them baseline the requirement specifications.
  • Detail-oriented professional with excellent communication and interpersonal skills.
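
The Hive validation work summarized above generally follows the pattern sketched below. This is a minimal illustrative sketch, not project code: the database, table and column names (raw_db.claims, stg_db.claims, claim_id, claim_amt) are assumed placeholders.

    -- Row-count reconciliation between the raw and staging layers
    SELECT 'raw' AS layer, COUNT(*) AS row_cnt FROM raw_db.claims
    UNION ALL
    SELECT 'stg' AS layer, COUNT(*) AS row_cnt FROM stg_db.claims;

    -- Records present in raw but missing in staging (expect zero rows)
    SELECT r.claim_id
    FROM raw_db.claims r
    LEFT JOIN stg_db.claims s ON r.claim_id = s.claim_id
    WHERE s.claim_id IS NULL;

    -- Aggregate (checksum-style) comparison of a key measure across layers
    SELECT SUM(claim_amt) FROM raw_db.claims;
    SELECT SUM(claim_amt) FROM stg_db.claims;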

TECHNICAL SKILLS

Operating Systems: Windows, LINUX, UNIX

Big Data Technology: HDFS, MapReduce, Hive, Pig, Sqoop

Test Management: Tricentis qTest, HP ALM 12.0/11.0

Project Management: JIRA

ETL Tools: Talend, Informatica 10.x/9.x, DataStage 8.7/8.1

Scheduler: ASG Zena

Web Services: SoapUI

RDBMS/ Databases: Teradata, Oracle 11g/10g, SQL Server 2008/2005

BI Tools: Tableau, Cognos

PROFESSIONAL EXPERIENCE

Sr. Big Data Test Analyst

Confidential

Responsibilities:

  • Worked collaboratively with the Data Analyst and System Analyst to understand the requirements.
  • Ran ASG Zena processes to load the data into the layers of the data lake as the first step for validations.
  • Actively involved in all phases of Software Testing Life Cycle.
  • Scripted queries in Teradata SQL and Beeline HQL to validate the data in the data lake (see the sketch after this list).
  • Maintained proper Test Plan documents as deliverable from QA.
  • Created test cases on a regular basis and reviewed them with the team for feedback.
  • Used Tricentis qTest to upload test cases and execute them.
  • Raised in-sprint bugs in Jira and linked them to qTest for traceability.
  • Referred to the Interface Data Deliverable (IDD) document as the source of truth for validations.
  • Extracted reports and metrics from the qTest application on a weekly basis for review with upper-level management.
  • Participated in business discussions with the business, product owner, DAs and SAs to define test requirements and to provide test estimates that satisfy the business objectives.
  • Accomplished testing goals by validating data completeness, data transformation, data quality, performance and scalability.
  • Performed extract validation before sending data to downstream applications.
  • Validated the data through Infogix Assure to verify balancing and controls.
  • Completed deliverables and received sign-off and approval from the Test Manager in a timely manner.
  • Performed System Integration Testing, Retesting, Functional Testing and End to End Testing in the project.
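
A minimal sketch of the kind of hop-to-hop HQL comparison run through Beeline between adjacent data lake layers; the database, table and column names (consolidation_db.policy, outgoing_db.policy, policy_id, status, load_dt) are illustrative assumptions, not actual project objects.

    -- Row counts per load date across two adjacent hops
    SELECT load_dt, COUNT(*) AS row_cnt FROM consolidation_db.policy GROUP BY load_dt;
    SELECT load_dt, COUNT(*) AS row_cnt FROM outgoing_db.policy GROUP BY load_dt;

    -- Field-level mismatches on the business key between the two hops (expect zero rows)
    SELECT c.policy_id, c.status AS consolidation_status, o.status AS outgoing_status
    FROM consolidation_db.policy c
    JOIN outgoing_db.policy o ON c.policy_id = o.policy_id
    WHERE c.status <> o.status;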

Environment: HDFS, HIVE, Sqoop, PIG, Tricentis qTest, ASG Zena, Infogix, Talend, WinSCP, MS Excel, Teradata, HQL, PL/SQL, UNIX, Windows, Tableau

ETL/Hadoop Tester

Confidential, Des Moines, IA

Responsibilities:

  • Worked closely with business users to understand, design and document the functional testing plan, and then wrote and executed tests, documented the results and logged defects.
  • Created Hive queries that helped analysts spot emerging trends by comparing fresh data with historical claim metrics (a sketch follows this list).
  • Used Sqoop to move data from individual data sources into the Hadoop system.
  • Validated the Pig and Hive scripts by pulling the data from Hadoop and comparing it with the data in the source files and reports.
  • Responsible for validating the data by accessing the Cloudera Impala database, where the daily feeds are processed through Hive and stored in HDFS.
  • Involved in complete life cycle activities, including the pre-testing and testing phases.
  • Created test procedures and test cases to perform system testing and regression testing.
  • Used Jira as the Agile management framework.
  • Participated in business discussions with the business, product owner, BA and tech lead to define test requirements and to provide test estimates that satisfy the business objectives.
  • Reached testing goals by validating data completeness, data transformation, data quality, performance and scalability.
  • Tested that BI reports developed in Tableau adhere to company standards.
  • Created various User Defined Functions for script enhancements and to verify the business logic.
  • Extensively used SQL queries to perform database validation.
  • Responsible for reviewing complete test plans, test cases and test data, and ensuring accurate coverage of requirements and business processes for the big data warehouse.
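
An illustrative Hive query of the kind used to compare a fresh daily feed against historical claim metrics; claims_db.claims, load_dt and the run_dt hiveconf variable are assumed placeholders, with the run date passed in via --hiveconf.

    -- Latest daily volume versus the trailing 30-day average volume
    SELECT cur.load_dt, cur.daily_cnt, hist.avg_cnt_30d
    FROM (SELECT load_dt, COUNT(*) AS daily_cnt
          FROM claims_db.claims
          WHERE load_dt = '${hiveconf:run_dt}'
          GROUP BY load_dt) cur
    CROSS JOIN
         (SELECT AVG(cnt) AS avg_cnt_30d
          FROM (SELECT load_dt, COUNT(*) AS cnt
                FROM claims_db.claims
                WHERE load_dt BETWEEN date_sub('${hiveconf:run_dt}', 30)
                                  AND date_sub('${hiveconf:run_dt}', 1)
                GROUP BY load_dt) d) hist;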

Environment: HDFS, HIVE, Sqoop, PIG, HP ALM 12.0, Informatica Power Centre 10, MS Excel, Oracle 11g, SQL, PL/SQL, UNIX, Windows, Tableau

Big Data/ ETL Tester

Confidential, Boston, MA

Responsibilities:

  • Tested fact and dimension tables in the Star Schema model based on requirements.
  • Created and executed test cases based on the test strategy and test plans derived from the ETL mapping document.
  • Wrote complex SQL queries against different databases for the data verification process (see the sketch after this list).
  • Prepared technical specifications and source-to-target mappings.
  • Communicated defects identified in the testing environment to the developers using the defect tracking tool Quality Center.
  • Wrote several complex SQL queries for validating Cognos reports.
  • Worked with the business team to test the reports developed in Cognos.
  • Tested a number of complex ETL mappings, mapplets and reusable transformations for daily data loads.
  • Extensively used Informatica PowerCenter for the extraction, transformation and loading process.
  • Created test cases for ETL mappings and design documents for production support.
  • Extensively worked with flat file and Excel data sources; wrote scripts to convert Excel sheets to flat files.
  • Scheduled and automated jobs to run as a batch process.
  • Effectively communicated testing activities and findings in oral and written formats.
  • Reported bugs and tracked defects using Test Director.
  • Worked with the ETL group on understanding mappings for dimensions and facts.
  • Extracted data from various sources such as Oracle, flat files and SQL Server.
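
A minimal sketch of the kind of Oracle SQL used to verify fact/dimension integrity and reconcile report totals; the schema, table and column names (dw.fact_sales, dw.dim_customer, customer_key, sales_amt) are assumed placeholders, not actual project objects.

    -- Orphan fact rows with no matching dimension key (referential integrity check)
    SELECT f.order_key
    FROM dw.fact_sales f
    LEFT JOIN dw.dim_customer d ON f.customer_key = d.customer_key
    WHERE d.customer_key IS NULL;

    -- Monthly aggregate that should tie out to the corresponding Cognos report total
    SELECT TO_CHAR(f.order_date, 'YYYY-MM') AS order_month,
           SUM(f.sales_amt) AS total_sales
    FROM dw.fact_sales f
    GROUP BY TO_CHAR(f.order_date, 'YYYY-MM')
    ORDER BY order_month;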

Environment: UNIX Shell Scripting, Oracle 11g, Informatica Power Center 9.2 (Power Center Designer, Workflow Manager, Workflow Monitor), SQL, Cognos, Windows, TOAD, PL/SQL

DWH Tester

Confidential, Marlborough, MA

Responsibilities:

  • Scripted complex SQL queries against different databases for the data verification process.
  • Involved in error checking and testing of the ETL procedures and programs using the Informatica session log.
  • Participated in defining and executing test strategies using agile methodology.
  • Tested Complex ETL Mappings and Sessions based on business user requirements and business rules to load data from source flat files and RDBMS tables to target tables.
  • Created Test cases for Data Acquisition and Data Delivery and tested the data accuracy for all the transformations.
  • Executed test cases and reported bugs in HP ALM.
  • Executed the Test cases for Cognos Reports.
  • Tested Web Services/XML/SOAP using SOAP UI tool.
  • Performed database testing with SQL queries to verify data integrity (a source-to-target comparison sketch follows this list).
  • Conducted backend testing by querying databases to keep testing databases synchronized, and checked for data integrity and proper routing based on workflow rules at each step.
  • Generated detailed bug reports and tracked, reviewed and analyzed defects.
  • Involved in meetings to discuss the findings of the executed tests and decide the next steps.
  • Validated the application against requirements and verified that it functioned as per the technical and functional specifications.
  • Interacted with the business users to identify the process metrics and various key dimensions and measures.
  • Involved in testing the XML files and checked whether the data was parsed and loaded into the staging table.
  • Tested the data and data integrity among various sources and targets.
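
A minimal Teradata-style source-to-target comparison of the kind run for these integrity checks; stg.customer and tgt.dim_customer and their columns are assumed placeholder names.

    -- Source rows missing from, or different in, the target (expect zero rows)
    SELECT customer_id, first_name, last_name
    FROM stg.customer
    MINUS
    SELECT customer_id, first_name, last_name
    FROM tgt.dim_customer;

    -- Duplicate check on the target's natural key (expect zero rows)
    SELECT customer_id, COUNT(*) AS dup_cnt
    FROM tgt.dim_customer
    GROUP BY customer_id
    HAVING COUNT(*) > 1;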

Environment: Informatica, SQL, PL/SQL, UNIX, Teradata, Jira, SOAP UI, Cognos, MS Excel, Agile, XML
