Job ID: 28328
Company: Internal Postings
Location: Denver, CO
Type: Contract
Duration: 12 Months
Salary: DOE
Status: Active
Openings: 1
Posted: 23 Sep 2020
Job seekers, please send resumes to resumes@hireitpeople.com

Must Have Skills:

  • At least 8 years of development and administration experience with AWS, Kafka, and Snowflake

Detailed Job Description:

  • Configure replication software (e.g., Attunity, StreamSets) to capture CDC from OSS and BSS systems into AWS S3 and Snowflake/Redshift
  • Build streaming data pipelines from Kafka/Kinesis to AWS S3 and Snowflake/Redshift (see the Kafka-to-S3 sketch after this list)
  • Build an API framework covering the required API use cases for AWS and Snowflake/Redshift
  • Build batch data pipelines per end-user requirements
  • Build a pipeline for tokenization of data in AWS S3 using Databricks (see the tokenization sketch after this list)
  • Build a data curation framework on AWS S3 using Databricks
  • Build a data curation framework on Snowflake/Redshift (e.g., using Snowpipe, Streams, and Tasks; see the Streams-and-Tasks sketch after this list)
  • Unit test the data from OSS, BSS, and other sources using the above frameworks
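
A minimal sketch of the Kafka-to-S3 streaming item, assuming a local broker, a hypothetical topic "cdc-events", and a hypothetical landing bucket "example-raw"; it uses confluent-kafka and boto3 and micro-batches records into newline-delimited JSON objects in S3:

```python
# Kafka -> S3 micro-batch sink (sketch; topic, bucket, and group id are hypothetical).
import time
import uuid

import boto3
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "s3-sink",
    "auto.offset.reset": "earliest",
    "enable.auto.commit": False,   # commit manually, only after S3 write succeeds
})
consumer.subscribe(["cdc-events"])

s3 = boto3.client("s3")
BUCKET = "example-raw"             # hypothetical landing bucket
BATCH_SIZE = 500                   # flush to S3 every 500 records

batch = []
try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None or msg.error():
            continue
        batch.append(msg.value().decode("utf-8"))
        if len(batch) >= BATCH_SIZE:
            # One newline-delimited record per line, unique key per object.
            key = f"cdc/{int(time.time())}-{uuid.uuid4().hex}.jsonl"
            s3.put_object(Bucket=BUCKET, Key=key, Body="\n".join(batch).encode())
            consumer.commit()      # offsets advance only after a durable write
            batch = []
finally:
    consumer.close()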
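```

A minimal PySpark sketch of the tokenization item, as it might run in a Databricks notebook: read raw records from S3, replace a sensitive column with a salted SHA-256 token, and write the result back. The bucket paths and the column name ("ssn") are hypothetical:

```python
# Tokenize a sensitive column in S3 data (sketch; paths and column are hypothetical).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("tokenize").getOrCreate()

df = spark.read.json("s3://example-raw/cdc/")   # hypothetical input path

# Deterministic tokenization: the same input always yields the same token,
# so joins across tokenized datasets still line up.
tokenized = df.withColumn(
    "ssn",
    F.sha2(F.concat(F.lit("per-env-salt"), F.col("ssn")), 256),
)

tokenized.write.mode("overwrite").parquet("s3://example-curated/events/")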
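```

A minimal sketch of the Snowflake curation item using a Stream and a Task, assuming Snowpipe already lands data in a hypothetical RAW_EVENTS table and that a CURATED_EVENTS target exists; it uses snowflake-connector-python to issue the DDL:

```python
# Snowflake Stream + Task curation step (sketch; all object names are hypothetical).
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account",     # hypothetical credentials
    user="example_user",
    password="...",
    warehouse="CURATION_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)
cur = conn.cursor()

# A stream records the change rows landed in RAW_EVENTS by Snowpipe.
cur.execute("CREATE STREAM IF NOT EXISTS RAW_EVENTS_STREAM ON TABLE RAW_EVENTS")

# A task drains the stream into the curated table, but only when new data exists.
cur.execute("""
    CREATE TASK IF NOT EXISTS CURATE_EVENTS
      WAREHOUSE = CURATION_WH
      SCHEDULE = '5 MINUTE'
    WHEN SYSTEM$STREAM_HAS_DATA('RAW_EVENTS_STREAM')
    AS
      INSERT INTO CURATED_EVENTS (id, payload, loaded_at)
      SELECT id, payload, CURRENT_TIMESTAMP()
      FROM RAW_EVENTS_STREAM
      WHERE METADATA$ACTION = 'INSERT'
""")
cur.execute("ALTER TASK CURATE_EVENTS RESUME")  # tasks are created suspended
conn.close()
```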

Minimum years of experience: 5+

Certifications required: No

Top 3 responsibilities you would expect the subcontractor to shoulder and execute:

  1. Build streaming data pipelines from Kafka/Kinesis to AWS S3 and Snowflake/Redshift
  2. Build a data curation framework on Snowflake/Redshift
  3. Build a pipeline for tokenization of data in AWS S3 using Databricks

Interview Process (Is face-to-face required?): No

Does this position require visa-independent candidates only? No