Job Seekers, please send resumes to resumes@hireitpeople.com
Detailed Job Description:
- Primary skill: 5+ years of big data experience with Spark, Scala, Hadoop, Python, and Hive (an illustrative sketch follows this list).
- Responsible for implementation and ongoing administration of Hadoop infrastructure.
- Performance tuning of Hadoop clusters and Hadoop MapReduce routines.
- Monitor Hadoop cluster job performance and plan cluster capacity.
- Monitor Hadoop cluster connectivity and security; perform cluster maintenance; manage and review Hadoop log files.
- Collaborate with application teams to install operating system and Hadoop updates, patches, and version upgrades when required.
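
As context for the Spark, Scala, and Hive skills listed above, here is a minimal, hypothetical sketch of the kind of Spark job this stack implies: a Scala program that reads a Hive table and computes a simple aggregate. The table name (sales) and columns (region, amount) are placeholders for illustration only, not details of this role.

```scala
// Illustrative sketch only: a hypothetical Spark job in Scala that reads a Hive table
// and aggregates it. Names below (SalesSummaryJob, sales, region, amount) are placeholders.
import org.apache.spark.sql.SparkSession

object SalesSummaryJob {
  def main(args: Array[String]): Unit = {
    // Requires a Spark deployment built with Hive support and a configured metastore.
    val spark = SparkSession.builder()
      .appName("SalesSummaryJob")
      .enableHiveSupport()
      .getOrCreate()

    import spark.implicits._

    // Read a Hive-managed table and compute total amount per region.
    val summary = spark.table("sales")
      .groupBy($"region")
      .sum("amount")

    summary.show()
    spark.stop()
  }
}
```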
Minimum years of experience*: 7+
Interview Process (Is face-to-face required?): No