Job Seekers: please send resumes to resumes@hireitpeople.com
Must Have Skills (Top 3 technical skills only)*:
- Strong working knowledge of Teradata, Hive and HBase
- Experience in data architecture for handling streaming and batch data within a data warehouse landscape
- Understanding of Big Data technologies, Hadoop and modern data architectures like Spark and NoSQL structures
Nice to have skills (Top 2 only):
- Experience with distributed data processing frameworks such as Spark and MapReduce, and with data streaming technologies such as Kafka; Spark Streaming preferred
- Strong programming skills in at least one of the following: Python, Java, Scala, etc.
Detailed Job Description:
- Strong working knowledge of Teradata, Hive, and HBase
- Experience in data architecture for handling streaming and batch data within a data warehouse landscape
- Understanding of Big Data technologies, Hadoop, and modern data architectures such as Spark and NoSQL structures
- Experience with distributed data processing frameworks such as Spark and MapReduce; data streaming experience with Kafka or Spark Streaming preferred
- Experience with API management solutions
- Strong programming skills in at least one of the following: Python, Java, Scala, etc.
Minimum years of experience*: 5+
Certifications Needed: No
Top 3 responsibilities you would expect the Subcon to shoulder and execute*:
- Experience with data profiling tools such as Ataccama or Trifacta preferred
- Excellent SQL tuning knowledge; experience with Teradata QueryGrid preferred
- Experience working with vendors to evaluate, select, and implement 3rd party solutions
Interview Process (Is face-to-face required?): Yes
Does this position require Visa independent candidates only? No
