
Manager Data Engineering Resume


Boston, MA

PROFILE SUMMARY:

  • My 12+ years of experience in technology, across multiple industry domains, has helped me understand the value of data.
  • As a Machine Learning architect, I design and integrate machine learning solutions with enterprise data and applications.
  • I work extensively on designing distributed ML and big data architectures and on developing solutions optimized for the public cloud.
  • I've led data-driven transformations for a variety of organizations, from startups to Fortune 500 companies.
  • I collaborate with business and technology stakeholders across the firm to evaluate solutions and develop consensus on technical direction.

CORE COMPETENCIES:

  • Machine Learning Architecture & Design
  • Cloud Native Solutions
  • New Technology Evaluation & PoCs
  • Stakeholder Collaboration
  • Big Data Architecture & Design
  • Cloud Migration Strategy
  • Data Modernization
  • Hiring, Mentorship & Staff Development

KEY SKILLS:

Programming Skills: Python, Shell Scripting, SQL, JSON, XML, Java, COBOL

Machine Learning Tools: TensorFlow, Keras, scikit-learn, Pandas

Big Data Tools & Platforms: Hortonworks Data Platform (HDP), Apache Beam, Spark, Oozie, Pub/Sub, Hive, Pig

Other Tools: Jenkins, Git, Tableau, Google Data Studio

Database: BigQuery, Hive, Cassandra, MySQL, SQL Server, DB2

Cloud Platforms: Google Cloud Platform, Microsoft Azure

Domain: Retail, Finance, Insurance, Education, Automotive

PROFESSIONAL EXPERIENCE:

Confidential, Boston, MA

Manager Data Engineering

Responsibilities:

  • Lead teams of data engineers & data scientists in designing & implementing machine learning solutions.
  • Initiated the set-up of an applied machine learning engineering practice within the organization.
  • Identified opportunities & implemented solutions that leverage machine learning & cloud for personalizing experiences, scaling operations, and optimizing marketing efforts for various clients.
  • Acted as a solution architect for multiple clients, solving their data & analytics challenges.
  • Design software architecture to integrate ML models with enterprise data and applications.
  • Define real-time and batch processing architectures using Big Data tools and Cloud Platforms.
  • Conduct assessments to determine the best cloud platform for implementing a cost-effective and scalable data & analytics solution.
  • Assess the current state of data management, identify gaps, provide recommendations, and develop roadmaps for modernizing data architecture.
  • Work with client’s data science and data engineering teams to lay out the vision for current and future solutions.
  • Assist in sales cycle by working on RFPs, PoCs and creating estimates.
  • Mentor data engineers and data scientists on best practices for automating data & analytics workflows.

Confidential, Boston, MA

Sr. Technical Lead

Responsibilities:

  • Architect & develop an enterprise data lake for a large insurance client to bring in data from various business units, leveraging Apache Spark, Hive, HDInsight, and Azure Cloud.
  • Lead the architecture design and implementation of a data lake for a major banking client using the Hortonworks Data Platform (HDP).
  • Design architecture to integrate data engineering and data science workflows.
  • Architect infrastructure as code to support the deployment of data solutions on cloud platforms.
  • Translate complex functional requirements into detailed design.
  • Architect big data solutions to optimize processing of very large datasets.
  • Train and develop a pool of developers to work on big data projects.
  • Developed an efficient cross-functional team to keep up with rapid developments in the big data arena.
  • Explore emerging tools and techniques in big data analytics space.
  • Develop pre-sales pitches and subsequent proofs of concept (PoCs) for multiple clients.

Confidential, Miami, FL

Technology Analyst

Responsibilities:

  • Design databases, optimize SQL queries, and develop solution architectures.
  • Design processes for automating system monitoring and log analysis using shell scripting.
  • Build ETL pipelines for processing high-volume data.
  • Migrate legacy batch-heavy operations to real-time data processing and storage.
  • Drive legacy modernization and demand management initiatives.
  • Liaise with the client to understand the business requirements.
  • Mentor programmers and the offshore team.
  • Train team members on emerging technologies, including big data, Hadoop & NoSQL.
