Job Seekers, please send resumes to resumes@hireitpeople.com
Detailed Job Description:
- Should have expertise in Hadoop architecture and its various components, such as HDFS, YARN, High Availability, Job Tracker, Task Tracker, Name Node, Data Node, and the MapReduce programming paradigm.
- Should have experience writing Python-based ETL frameworks and PySpark jobs to process large volumes of data daily.
- Should have work experience as a Big Data/Hadoop Developer with good knowledge of the Hadoop framework.
- Should have experience in design, development, data migration, testing, support, and maintenance using Amazon Redshift databases.
- Should have experience in AWS cloud solution development using Lambda, SQS, SNS, DynamoDB, Athena, S3, EMR, EC2, Redshift, Glue, and CloudFormation.
- Should have experience transporting and processing real-time event streams using Kafka and Spark Streaming.
Minimum years of experience*: 7+
Certifications Needed: No
Interview Process (Is face to face required?): No
Does this position require Visa independent candidates only? No