Internal Postings
Posted: 28 Sep 2017
Job seekers, please send resumes to

Role: QA ETL/DWH Tester with Big Data (Hadoop/Hive)

Location: St. Louis, MO (Prefer Locals)

Duration: 6-12 Months

Primary Skills: DWH/ETL testing, Big Data (Hadoop, Hive, Spark/Scala), SQL, UNIX shell scripting or UNIX/Linux commands, RDBMS, and scheduling tools (Tidal, Appworx)

Job Summary:

The Data QA Engineer is responsible for testing Big Data projects built on Spark/Scala. Responsibilities include in-depth data analysis and data management, writing detailed test cases and test plans, and performing regression testing. Very strong SQL and testing skills are needed to succeed in this role.


Major Duties and Responsibilities

  • Strong experience as a QA Engineer, with a deep understanding of the QA life cycle and concepts such as functional, black-box, integration, UAT, and regression testing.
  • Ability to write detailed test cases, test plans and repeatable code to perform regression tests.
  • Strong data management skills and very strong SQL experience querying relational databases and EDW systems, including writing complex custom queries and performing query optimization and tuning.
  • Experience working in a Linux environment (some shell scripting knowledge, navigating file systems, listing and editing files, etc.).
  • Working knowledge of Hadoop/Spark and other Big Data technologies is a big plus.
  • Strong verbal and written communication skills, needed to collaborate with cross-functional groups.

Related Work Experience:

  • At least 5-6 years of project testing experience
  • At least 2 years of business analyst experience
  • At least 1-2 years of experience in Hadoop testing; basic knowledge of Hadoop components such as Sqoop, Hive, Pig, HBase, and Spark is preferred
  • At least 1-2 years of experience in UNIX shell scripting, or working knowledge of basic UNIX/Linux commands
  • At least 2 years of experience writing complex SQL queries
  • At least 1-2 years of experience with any RDBMS is preferred
  • Exposure to scheduling tools such as Tidal and Appworx is required