Job Seekers, please send resumes to resumes@hireitpeople.com
Must Have Skills (Top 3 technical skills only)*:
- 2+ years of development experience in Hadoop (HDFS), Spark, Hive, Pig, HBase, MapReduce, and Sqoop; 2+ years of development experience in Java (or Scala), Python, SQL, and shell scripting.
- Experience developing user-defined functions and Hive SerDes is a must.
- Designing, building and maintaining ETL feeds for new and existing data sources
Nice to have skills (Top 2 only):
- Experience in scheduled and real-time data ingestion techniques using big data tools. Experience in developing REST services is a plus.
- Documenting all metadata regarding data source, field type, definition, etc. for each field and table created.
Detailed Job Description:
- Designing, building and maintaining ETL feeds for new and existing data sources
- 2+ years of development experience in Hadoop (HDFS), Spark, Hive, Pig, HBase, MapReduce, and Sqoop
- 2+ years of development experience in Java (or Scala), Python, SQL, and shell scripting.
- Experience developing user-defined functions and Hive SerDes is a must.
- Experience in scheduled and real-time data ingestion techniques using big data tools.
- Experience in developing REST services is a plus.
Minimum years of experience*: 5+
Certifications Needed: No
Top 3 responsibilities you would expect the Subcon to shoulder and execute*:
- Implementing data ingestion and transformation workflows to and from Hadoop.
- Designing, building and maintaining ETL feeds for new and existing data sources
- Developing user-defined functions and Hive SerDes; building scheduled and real-time data ingestion using big data tools; developing REST services as needed.
Interview Process (Is face-to-face required?): No
Does this position require Visa independent candidates only? Yes