Job ID: 24546
Company: Internal Postings
Location: Beaverton, OR
Type: Contract
Duration: 6 Months
Salary: DOE
Status: Active
Openings: 1
Posted: 10 Oct 2019
Job Seekers, Please send resumes to resumes@hireitpeople.com

Must Have Skills (Top 3 technical skills only)*:

  1. Spark, Hive
  2. Python, Sqoop
  3. AWS

Detailed Job Description:

  • Advanced experience building cloud-scalable, real-time, high-performance data lake solutions leveraging AWS EMR, EC2, S3, Spark, Hive, and Lambda
  • Experience with relational SQL, preferably Snowflake
  • Experience with Python or another scripting language
  • Experience with source control tools such as GitHub and related development processes
  • Experience with Airflow or another scheduling tool
  • Understanding of microservice architecture
  • Strong understanding of developing complex data solutions
  • Experience working on end-to-end solution design
  • Experience preparing data to be analyzed and visualized (preferably with Tableau)
  • Experience with supply chain systems a plus
  • Experience with Attunity, DMS, or Sqoop a plus
  • Able to lead others in solving complex problems by taking a broad perspective to identify innovative solutions
  • Willing to learn new skills and technologies
  • Has a passion for data solutions
  • Strong understanding of data structures and algorithms
  • Strong understanding of solution and technical design
  • Has a strong problem-solving and analytical mindset
  • Able to influence and communicate effectively, both verbally and in writing, with team members and business stakeholders
  • Able to quickly pick up new programming languages, technologies, and frameworks

Responsibilities you would expect the Subcon to shoulder and execute*:

  • Help lead a team of data engineers, architects, and platform engineers to build innovative and sustainable data products that demonstrate business value
  • Lead the design and build-out of reusable components, frameworks and libraries at scale to support analytics products
  • Lead the design and implementation of product features in collaboration with business and technology stakeholders
  • Anticipate, identify and solve issues concerning data management to improve data quality
  • Help clean, prepare and optimize data at scale for ingestion and consumption
  • Drive the implementation of new data management projects and refactoring of the current data architecture
  • Help implement complex automated workflows and routines using workflow scheduling tools
  • Lead the build-out of continuous integration, test-driven development and production deployment frameworks
  • Drive collaborative reviews of design, code, test plans and dataset implementation performed by other data engineers in support of maintaining data engineering standards
  • Analyze and profile data for the purpose of designing scalable solutions
  • Troubleshoot complex data issues and perform root cause analysis to proactively resolve product and operational issues
  • Mentor and develop data engineers in career development and adopting best practices