
Hadoop Architect Resume



SUMMARY:

  • 11.3 years of experience in professional software development
  • 4 years of work experience in Distributed Computing environment - Hadoop (Cloudera & Hortonworks)
  • Proficiency in industry standard application development in Java, Hadoop, Sqoop, Hive, Spark, HBase and Pig
  • Good exposure to industry-standard Agile methodologies
  • Inspiring teams to strive relentlessly towards success
  • Strategic collaboration, brainstorming and mentoring
  • 2 years of work experience in ideation, strategy, and the creation of high-level solution architecture, E2E architecture, blueprints, technology architecture, information architecture, data quality frameworks, key architecture risks and mitigation plans, and reference implementations
  • 1 year of work experience in Java/J2EE application development in Cards & Payments services
  • 3 years of work experience and expertise in the creation of solution architectures and the implementation of solutions such as an Online Recommendation Engine, Inventory Analytics, and Consumer Insights, among others
  • 7 years of experience in industry standard product development (software engineering products) using Java/J2EE and Eclipse Plugin Development Environment (PDE) in Agile Environment.
  • Architecture exposure includes more than 2 years of experience in information, technology, and application architecture in the retail and finance industries
  • Currently assigned as a solution architect on an enterprise Hadoop platform initiative for the client Confidential: working on ideation and strategy creation, identifying technology options and tools for those strategies, identifying architecture building blocks, and defining the conceptual architecture, blueprint, and landscape for the initiative
  • Worked for 1.5 years on the enterprise-level advisor platform for the Fidelity client, building solutions on distributed computing (Hadoop) technology stacks and creating the blueprint, landscape, and conceptual, logical, and technology architectures for different use cases. Evaluated various analytics tools from a big data perspective. Experienced in Big Data: Hadoop, HDFS, MapReduce, Hive, Sqoop, Impala, Spark, HBase, and Pig. Worked primarily on the initiatives below.
  • Data ingestion and extraction framework
  • Data quality/validation framework
  • Analytics (batch, iterative, interactive, and in-memory)
  • Central data manager
  • Significant event engine (CEP processing)
  • 360-degree view of data
  • Distributed Computing: Hadoop MapReduce, Spark (Scala), Hive, Impala, and Pig
  • Data Integration: Sqoop, Pentaho DI, and Talend Big Data Integration
  • NoSQL Databases: HBase and Cassandra

TECHNICAL SKILLS:

Operating Platforms: Windows and Linux

Languages & Tools: Java, Eclipse PDE, Grails, Maven 2.0, CVS, Microsoft TFS, Microsoft Visio, Bugzilla

Industries: Retail, Banking, and Product Development (Software Engineering)

Area of Expertise: Distributed Computing, Big Data, Agile, Application Design and Project Management

PROFESSIONAL EXPERIENCE:

Confidential, RI

Hadoop Architect

Responsibilities:

  • Ideation and strategy creation
  • High-level solution and application architecture creation
  • Identification of technology options, tools for the defined strategy
  • E2E high level system flow creation
  • System integration interaction methods creation
  • Data architecture creation
  • Technology architecture creation
  • Defining Implementation strategy
  • Defining Analysis approach and scope
  • Governance Review
  • Solution Review
  • Architecture Consulting
  • Data modeling - Hive
  • Data integration - Sqoop
  • Analytics - Spark, Hive/Impala Queries, Map Reduce (Java) and PIG Script
  • Performance tuning - Sqoop, Hive and Spark
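
The MapReduce (Java) analytics listed above follow the classic map/shuffle/reduce pattern. As a hedged illustration only (plain Java streams with hypothetical names, not the project's Hadoop code), the pattern behind a word-count-style aggregation looks like this:

```java
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

// Hypothetical sketch: the map -> shuffle -> reduce pattern behind a
// word-count job, expressed with plain Java streams instead of the
// Hadoop Mapper/Reducer API.
public class WordCountSketch {

    // "Map" phase: split each record into words;
    // "shuffle + reduce": group by word and sum the counts.
    public static Map<String, Long> countWords(List<String> records) {
        return records.stream()
                .flatMap(line -> Arrays.stream(line.toLowerCase().split("\\s+")))
                .filter(w -> !w.isEmpty())
                .collect(Collectors.groupingBy(w -> w, Collectors.counting()));
    }

    public static void main(String[] args) {
        List<String> records = List.of("hive spark sqoop", "spark hive", "spark");
        System.out.println(countWords(records)); // e.g. {sqoop=1, hive=2, spark=3}
    }
}
```

In a real Hadoop job the grouping and summing are distributed across the cluster by the framework; the per-key logic stays this simple.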

Confidential

Hadoop Architect

Technologies: Hadoop (HDP - Hortonworks), Sqoop, Hive and Spark

Responsibilities:

  • High level application architecture creation
  • Identification of technology options, tools for the defined strategy
  • E2E high level system flow creation
  • Data architecture creation
  • Technology architecture creation
  • Defining Implementation strategy
  • Architecture Consulting
  • Data modeling - Hive
  • Data integration - Sqoop
  • Analytics - Spark, Hive and Impala
  • Performance tuning - Sqoop, Hive and Spark

Confidential

Data Engineer

Technologies: Hadoop (CDH - Cloudera), Map Reduce, Hive, Pentaho DI, Pig and Oracle

Responsibilities:

  • E2E high level system flow creation
  • Data architecture creation
  • Defining Implementation strategy
  • Data modeling - Hive
  • Data integration - Pentaho DI
  • Analytics - Map Reduce and Hive
  • Data summarization

Confidential

Data Analyst

Technologies: Hadoop (CDH - Cloudera), Map Reduce, Hive, Pentaho DI, and Oracle

Responsibilities:

  • Configured and administered a 10-node Hadoop cluster (CDH 4.6, fully distributed, on top of RHEL)
  • Defined algorithms and boundary conditions for Inventory Analytics Solution by attending brainstorming sessions and workshops.
  • Data Modeling - Hive
  • Data Integration Framework - Pentaho Data Integration
  • Inventory Analytics - Map Reduce and Hive queries
  • Data Summarization (CUBE) - Pig Latin Script
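
The CUBE summarization above was done with a Pig Latin script; as an illustrative sketch only (plain Java with hypothetical names, not the project's script), a CUBE over two dimensions produces one aggregate row per subset of the grouping keys:

```java
import java.util.List;
import java.util.Map;
import java.util.TreeMap;

// Illustrative sketch of what a CUBE over (store, product) produces:
// aggregates for every subset of the two dimensions, with "ALL"
// standing in for a rolled-up dimension (Pig's CUBE operator uses
// null). Not the project's actual Pig Latin script.
public class CubeSketch {

    record Sale(String store, String product, int qty) {}

    public static Map<String, Integer> cube(List<Sale> sales) {
        Map<String, Integer> agg = new TreeMap<>();
        for (Sale s : sales) {
            // The four groupings: (store,product), (store,ALL), (ALL,product), (ALL,ALL)
            String[] keys = {
                s.store() + "/" + s.product(),
                s.store() + "/ALL",
                "ALL/" + s.product(),
                "ALL/ALL"
            };
            for (String k : keys) agg.merge(k, s.qty(), Integer::sum);
        }
        return agg;
    }

    public static void main(String[] args) {
        List<Sale> sales = List.of(
            new Sale("s1", "p1", 2),
            new Sale("s1", "p2", 3),
            new Sale("s2", "p1", 5));
        cube(sales).forEach((k, v) -> System.out.println(k + " -> " + v));
    }
}
```

With n dimensions a CUBE emits 2^n groupings per record, which is why pushing it into Pig/MapReduce pays off at inventory-scale data volumes.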

Confidential

Data Analyst

Technologies: Hadoop, Map Reduce, Sqoop, Hive and Impala

Responsibilities:

  • Data Integration - Sqoop
  • Clickstream metrics analysis - Map Reduce, Hive and Impala
  • Custom UDFs/UDAFs - Hive
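
Custom Hive UDFs of this kind are small Java classes. As a hedged example (a hypothetical function, not the project's code), here is the core logic of a clickstream-style URL-host extractor; in a real UDF this logic would live in the `evaluate()` method of a class extending Hive's UDF base class, but it is shown as a plain static method so the sketch stays self-contained:

```java
import java.net.URI;
import java.net.URISyntaxException;

// Core logic of a hypothetical clickstream UDF that extracts the host
// from a page URL. In Hive this would be the evaluate() method of a
// class extending org.apache.hadoop.hive.ql.exec.UDF; shown here as a
// plain static method so the sketch compiles without Hadoop jars.
public class ExtractHostSketch {

    public static String extractHost(String url) {
        if (url == null) return null;           // Hive UDFs must tolerate NULL input
        try {
            String host = new URI(url.trim()).getHost();
            return host == null ? null : host.toLowerCase();
        } catch (URISyntaxException e) {
            return null;                        // malformed URL -> NULL, not a failed job
        }
    }

    public static void main(String[] args) {
        System.out.println(extractHost("https://www.Example.com/cart?item=1")); // www.example.com
    }
}
```

Returning NULL instead of throwing matters in practice: one bad row in a billion-row clickstream table should not kill the whole query.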

Confidential

Data Analyst

Technologies: Hadoop, HDFS, Map Reduce, Hive, Mahout and HBase

Responsibilities:

  • Administered a 10-node Hadoop cluster (fully distributed, on RHEL) for the Customer Recommendation Engine
  • Data Modeling in Hive
  • Data ETL to HDFS Environment
  • Defined Hive queries and MapReduce algorithms for clustering and for the recommendation generation engine, based on customer purchase history
  • Prepared data analysis report (Funnel Chart)
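
A hedged sketch of the idea behind a purchase-history recommender of this kind (plain Java with hypothetical names, not the project's Hive/MapReduce jobs): count how often each unordered item pair appears together in a basket, and high counts become "customers who bought X also bought Y" candidates:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.TreeMap;
import java.util.TreeSet;

// Illustrative co-occurrence counting over purchase baskets: the common
// first step of an item-to-item recommender. Each unordered item pair
// in a basket increments a shared counter. Hypothetical sketch, not the
// project's actual Hive/MapReduce implementation.
public class CoOccurrenceSketch {

    public static Map<String, Integer> pairCounts(List<List<String>> baskets) {
        Map<String, Integer> counts = new TreeMap<>();
        for (List<String> basket : baskets) {
            List<String> items = new ArrayList<>(new TreeSet<>(basket)); // dedupe + sort
            for (int i = 0; i < items.size(); i++)
                for (int j = i + 1; j < items.size(); j++)
                    counts.merge(items.get(i) + "|" + items.get(j), 1, Integer::sum);
        }
        return counts;
    }

    public static void main(String[] args) {
        List<List<String>> baskets = List.of(
            List.of("milk", "bread", "eggs"),
            List.of("milk", "bread"),
            List.of("bread", "eggs"));
        pairCounts(baskets).forEach((pair, n) -> System.out.println(pair + " -> " + n));
    }
}
```

At scale this pair counting is exactly what a MapReduce job or Mahout's item-similarity computation distributes across the cluster.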

Confidential

Senior Developer

Technologies: Hadoop, HDFS, Map Reduce, Hive and AWS

Responsibilities:

  • Data Integration - Sqoop
  • Defined HQL (Hive Query Language) for various recommendation analytics based on customer purchase history
  • Implemented custom UDF/UDAFs for various metrics

Confidential

Systems Analyst

Technologies: Java, Eclipse, Maven 2.0, Struts, Hibernate

Responsibilities:

  • Requirements elicitation and effort estimation
  • Organized JADs (joint application discussions)
  • Prepared requirement document from client BRD walk through sessions
  • Iteration Manager
  • Motivated and inspired team members
  • Ensured a healthy and positive work environment
  • Ensured best practices are being implemented with regards to source code management and promotion
  • Data Integration

Confidential

Product Lead

Technologies: Java, Eclipse PDE, Swing, SWT, JFace, Hibernate, Groovy, and Grails Framework

Responsibilities:

  • Team Management
  • Organize team
  • Task assignment
  • Motivate and inspire team members
  • Ensuring a healthy and positive work environment
  • Perform quality checks of the team's output to ensure a high standard of quality
  • Ensure team is using best practices with regards to source code management and promotion
  • Ensure appropriate level of documentation is created
  • Requirements gathering and effort estimation
  • Design and implementation of Command Line Reporting Module
  • Design and implementation of Requirements Traceability Views

Confidential

Module Lead

Technologies: Java, Eclipse PDE, Swing, SWT, JFace, Maven2.0 and JavaCC

Responsibilities:

  • Involved in preparing the Kano analysis report, the product features plan, and feature effort estimates
  • Configured the daily build process and set up the initial project website and environment (CVS, Maven, and Bugzilla)
  • Design and implementation of Eclipse Views and Perspectives
  • Design and implementation of Metrics Reporting Module
  • Design and implementation of Features Dashboard
  • Generating Metrics and Dashboard reports
