Job ID: 21433
Company: Internal Postings
Location: Hillsboro, OR
Type: Contract
Duration: 6 Months
Salary: DOE
Status: Active
Openings: 1
Posted: 19 Mar 2019
Job seekers, please send resumes to resumes@hireitpeople.com

Must Have Skills (Top 3 technical skills only)*:

  1. AWS
  2. Java
  3. CI/CD

Detailed Job Description:

  • Good knowledge of Snowflake, Airflow, Python, AWS, EC2, Cognos, and Kinesis; design and implement distributed data processing pipelines using Spark, Hive, Sqoop, Python, and other tools and languages prevalent in the Hadoop ecosystem.
  • Ability to design and implement end-to-end solutions.
  • Build utilities, user-defined functions, and frameworks to better enable data flow patterns.
  • Research, evaluate, and utilize new technologies, tools, and frameworks centered around Hadoop and related components.

Minimum years of experience*: 5

Certifications Needed: No

Top 3 responsibilities you would expect the Subcon to shoulder and execute*:

  1. Contribute to requirements elicitation, creation of the application architecture document, and creation of design artifacts
  2. Deliver high-quality code, support activities related to implementation and transition, and interface with the internal team and key stakeholders
  3. Analyze and resolve issues to ensure high-quality deliverables at each stage of the SDLC within the guidelines and policies

Interview Process (Is face-to-face required?): No

Does this position require Visa independent candidates only? No