Job Seekers, please send resumes to resumes@hireitpeople.com
Job Details:-
Must Have Skills (Top 3 technical skills only) *
1. 2+ years of development experience in Hadoop (HDFS, Spark, Hive, Pig, HBase, MapReduce, Sqoop) and 2+ years of development experience in Java (or Scala), Python, SQL, and shell scripting.
2. Experience developing user-defined functions and Hive SerDes is a must.
3. Designing, building and maintaining ETL feeds for new and existing data sources
Nice to have skills (Top 2 only)
1. Experience in scheduled and real-time data ingestion techniques using big data tools. Experience in developing REST services is a plus.
2. Documenting all metadata regarding data source, field type, definition, etc. for each field and table created.
Detailed Job Description:-
Designing, building and maintaining ETL feeds for new and existing data sources
2+ years of development experience in Hadoop (HDFS, Spark, Hive, Pig, HBase, MapReduce, Sqoop)
2+ years of development experience in Java or Scala, Python, SQL, and shell scripting.
Experience developing user-defined functions and Hive SerDes is a must.
Experience in scheduled and real-time data ingestion techniques using big data tools.
Experience in developing REST services is a plus.
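The core responsibility above, building and maintaining ETL feeds, can be illustrated with a minimal sketch. This is a hypothetical, stdlib-only Python example (the `events` table and its fields are invented for illustration); real feeds in this stack would run on Spark, Hive, or Sqoop rather than plain Python:

```python
# A minimal, illustrative ETL feed: extract rows from a CSV source,
# transform string fields into typed columns, and load into SQLite.
# (Stdlib-only sketch; the table/field names are hypothetical.)
import csv
import io
import sqlite3

def run_etl(csv_text: str, conn: sqlite3.Connection) -> int:
    """Extract-transform-load one feed; returns the number of rows loaded."""
    conn.execute("CREATE TABLE IF NOT EXISTS events (user_id INTEGER, amount REAL)")
    rows = []
    for rec in csv.DictReader(io.StringIO(csv_text)):
        # Transform: cast fields to typed columns, skipping malformed records.
        try:
            rows.append((int(rec["user_id"]), float(rec["amount"])))
        except (KeyError, ValueError):
            continue
    conn.executemany("INSERT INTO events VALUES (?, ?)", rows)
    conn.commit()
    return len(rows)

source = "user_id,amount\n1,9.99\n2,bad\n3,4.50\n"
conn = sqlite3.connect(":memory:")
loaded = run_etl(source, conn)
print(loaded)  # 2 -- the malformed record is skipped
```

The same extract/transform/load shape carries over to the big data tools named here: Sqoop for extraction, Hive or Spark for the transform, and HDFS/Hive tables as the load target.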
Minimum years of experience*: 5+
Certifications Needed: No
Top 3 responsibilities to shoulder and execute*:
1. Implementing data ingestion and transformation workflows to and from Hadoop.
2. Designing, building and maintaining ETL feeds for new and existing data sources
3. Developing user-defined functions, Hive SerDes, and REST services, and implementing scheduled and real-time data ingestion using big data tools.
Interview Process (Is face-to-face required?): No.