Job ID :
Company :
Internal Postings
Location :
Chicago, IL
Type :
Duration :
3 Months
Salary :
Status :
Openings :
Posted :
11 Apr 2019
Job Seekers, Please send resumes to

Must Have Skills (Top 3 technical skills only)*:

  1. SPARK
  2. LINUX

Nice to have skills (Top 2 only):

  1. SPARK
  2. LINUX

Detailed Job Description:

8 years of rich, hands-on experience with Hadoop distributed frameworks, handling large amounts of big data using Apache Spark and the Hadoop ecosystem: Spark, Spark SQL, PySpark, Python, HDFS, Hive, Impala, Oozie, Scala. Experience on Databricks and Azure Cloud. Building complex models using PySpark and guiding business users on modelling. Proficient knowledge of SQL with any RDBMS. Knowledge of Oracle databases and PL/SQL. Working knowledge of and good experience in a Unix environment, and capable of Unix shell scripting.

Minimum years of experience*: 5+

Certifications Needed: Yes

Top 3 responsibilities you would expect the Subcon to shoulder and execute: 

  1. Responsible for big data development activities using Hadoop tools such as Spark, Scala, Linux, Hive, Impala, Sqoop, and Kafka
  2. Responsible for requirements gathering from clients; able to drive end-to-end development activities, post-production, and hypercare
  3. Ability to work within deadlines and effectively prioritize and execute on tasks.

Interview Process (Is face-to-face required?): No