Job ID :
Company :
Internal Postings
Location :
Minneapolis, MN
Type :
Duration :
12 Months
Salary :
Status :
Openings :
Posted :
02 Aug 2019
Job Seekers, Please send resumes to

Must Have Skills (Top 3 technical skills only)*:

  1. AWS
  2. Hadoop
  3. Python

Detailed Job Description:

  • Should have 3+ years of experience working with Cloud technologies.
  • Exposure to / experience with AWS S3 and AWS compute technologies such as EC2, EBS, IAM, CloudTrail, CloudWatch, etc.
  • Exposure to / experience with Big Data technologies such as Hadoop, Spark, and Hive is highly desirable
  • Exposure to / experience with the AWS Elastic MapReduce (EMR) service will be a plus
  • Should have experience with the complete SDLC process for one or more projects/programs in the Cloud using the above technologies
  • Experience leading teams and development efforts is a must
  • Hands-on coding experience in Python or another scripting language in a Cloud-based environment is desirable
  • Should have experience providing warranty/operational support; this could include experience in a non-cloud environment as well
  • Experience with troubleshooting, log analysis, and incident & change management
  • Experience working with infrastructure and other cross-commit teams to communicate, diagnose issues, and sanitize & share information/logs
  • Exposure to / experience with Advanced Analytics / Data Science use cases is highly desirable
  • Exposure to / experience with tools such as Dataiku and DataRobot is highly desirable
  • Exposure to / experience with Data Virtualization tools, such as Denodo, will be a huge plus

Minimum years of experience*: 5+

Certifications Needed: No

Responsibilities you would expect the subcontractor to shoulder and execute*:

  • Design discussions
  • Onsite-offshore coordination

Interview Process (Is face-to-face required?): No

Does this position require Visa independent candidates only? No