Job ID :
Company : Internal Postings
Location : Columbus, IN
Type :
Duration : 6 Months
Salary :
Status :
Openings :
Posted : 14 May 2021
Job Seekers, please send resumes to
Must Have Skills (Top 3 technical skills only)*:
  1. Strong Spark/Scala experience on Databricks with Azure
  2. Strong ETL and ELT experience
  3. Fluent in Scala programming language
Nice to have skills (Top 2 only):
  1. Communication skills
  2. Stakeholder management

Detailed Job Description:

Need a Data Engineer to develop a reusable ETL framework on Big Data using Scala and Spark 2.0 or later. This custom framework will be used to create a template-driven data load and transformation system on the Data Lake and big data stores. The developer is expected to write ScalaTest cases and Data Quality checks for all code produced, as a precondition for CI/CD and promotion to higher environments. Candidates should have 10 years of IT experience, with at least 3 years of Scala and Spark experience.

Minimum years of experience*: 5+

Certifications Needed: No

Top 3 responsibilities you would expect the Subcon to shoulder and execute*:

  1. Develop an ETL framework for template-driven ETL
  2. Implement transformations and aggregations per requirements
  3. Work within an Agile delivery process

Interview Process (is face-to-face required?): Yes

Does this position require Visa independent candidates only? No