Job Seekers, Please send resumes to resumes@hireitpeople.com
Must Have Skills (Top 3 technical skills only)*:
- Spark
- Linux
- Hive SQL
Nice to have skills (Top 2 only):
- Scala
- Databricks / Azure Cloud
Detailed Job Description:
8 years of rich, hands-on experience with Hadoop distributed frameworks, handling large amounts of big data using Apache Spark and the Hadoop ecosystem:
- Spark, Spark SQL, PySpark, Python, HDFS, Hive, Impala, Oozie, Scala
- Experience on Databricks and Azure Cloud
- Building complex models using PySpark and guiding business users on modelling
- Proficient knowledge of SQL with any RDBMS
- Knowledge of Oracle databases and PL/SQL
- Working knowledge and good experience in a Unix environment; capable of Unix Shell scripting
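For illustration only (not part of the client's requirements): a minimal PySpark sketch of the kind of Hive / Spark SQL work described above. All table, column, and path names are hypothetical.

# Hypothetical PySpark sketch: read a Hive table, aggregate, write to HDFS.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("daily-order-rollup")  # hypothetical job name
    .enableHiveSupport()            # required to query Hive tables
    .getOrCreate()
)

# Read from a (hypothetical) Hive table and aggregate daily revenue per region.
orders = spark.table("sales.orders")
daily = (
    orders
    .where(F.col("order_date") >= "2024-01-01")
    .groupBy("region", "order_date")
    .agg(F.sum("amount").alias("revenue"))
)

# Persist as Parquet for downstream consumers; the path is illustrative.
daily.write.mode("overwrite").partitionBy("order_date").parquet(
    "hdfs:///warehouse/rollups/daily_revenue"
)

spark.stop()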
Minimum years of experience*: 5+
Certifications Needed: Yes
Top 3 responsibilities you would expect the Subcon to shoulder and execute:
- Responsible for big data development activities using Hadoop ecosystem tools such as Spark, Scala, Linux, Hive, Impala, Sqoop, and Kafka
- Responsible for gathering requirements from clients; able to drive end-to-end development activities, post-production support, and hypercare
- Ability to work within deadlines and to effectively prioritize and execute tasks
Interview Process (Is face-to-face required?): No