
Hadoop Developer Resume


Richmond, Virginia

SUMMARY:

  • 2+ years of total work experience in IT, including requirement gathering, design, development, and implementation of Hadoop and data warehousing solutions as well as embedded applications.
  • More than 2 years of experience working with Apache Hadoop ecosystem components such as HDFS, MapReduce, Hive, HBase, and Sqoop, along with Java and Tableau.
  • Good knowledge of Java (J2SE).
  • Good knowledge of JDBC and its integration with Java programs.
  • Expert in writing HiveQL queries, with working knowledge of the Hue tool.
  • Very good understanding of SQL, ETL, and data warehousing technologies.
  • Active member of the Hadoop core team, participating in defining Hadoop design standards and driving their implementation.
  • Very good data analysis and data validation skills.
  • Very good exposure to the entire software development lifecycle.
  • Expert in requirements collection and analysis; excellent troubleshooting and debugging skills.
  • Expert in building, deploying, and maintaining applications.
  • Experienced across embedded, automotive, avionics, telecom, product, and maintenance domains; able to contribute individually and work well with minimal supervision.

TECHNICAL SKILLS:

Open-source technologies and tools: Cloudera Hadoop, Java

Development technologies & operating systems: Java, Hive, HiveQL, HBase, SQL scripting, shell scripting, Linux, Mac OS X

Build tools: Maven, Ant, Jenkins

Data Storage: HDFS, MySQL, XML, Microsoft SQL Server 2000, MS Access

Tools & Utilities: Rational ClearCase, VMware, Eclipse, Microsoft Visio & Office, TOAD, SQL Developer, CodeFlow, SourceControl

Application Servers: Apache Tomcat

PROFESSIONAL EXPERIENCE:

Hadoop Developer

Confidential, Richmond, Virginia

Responsibilities:

  • Gathered requirements from business partners and subject matter experts.
  • Analyzed the data requirements, prepared the requirements document, and obtained agreement from the client that all data requirements were complete and understood.
  • Installed Hadoop ecosystem components.
  • Managed and reviewed Hadoop log files.
  • Responsible for managing data coming from different sources.
  • Supported MapReduce programs running on the cluster.
  • Involved in HDFS maintenance and in loading structured and unstructured data.
  • Wrote MapReduce jobs using the Java API.
  • Wrote Hive queries for data analysis to meet business requirements (see the Hive sketch after this list).
  • Created Hive tables and worked with them.
  • Moved data from MySQL to HDFS on a regular basis (see the Sqoop sketch after this list).
  • Developed batch jobs and scripts to schedule various Hadoop programs (see the batch script sketch after this list).
  • Held weekly meetings with technical collaborators and actively participated in code review sessions with senior and junior developers.
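The Hive table creation and analysis work described above can be illustrated with a minimal sketch run from the shell; the table name, columns, and HDFS location are hypothetical placeholders, not details from the actual project:

    # Illustrative Hive DDL and analysis query run via the Hive CLI (table and columns are hypothetical)
    hive -e "
      CREATE EXTERNAL TABLE IF NOT EXISTS transactions (
        txn_id      BIGINT,
        customer_id BIGINT,
        amount      DOUBLE,
        txn_date    STRING
      )
      ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
      LOCATION '/data/raw/transactions';

      -- Simple aggregation for a business report
      SELECT txn_date, COUNT(*) AS txn_count, SUM(amount) AS total_amount
      FROM transactions
      GROUP BY txn_date
      ORDER BY txn_date;
    "

An external table like this points Hive at data already sitting in HDFS, so files loaded outside Hive (for example by Sqoop) become queryable without being moved again.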
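The regular MySQL-to-HDFS loads would typically be done with Sqoop, which is listed in the skills above; a minimal sketch, assuming a hypothetical database, table, and target directory:

    # Illustrative Sqoop import from MySQL into HDFS (connection details, table, and paths are placeholders)
    sqoop import \
      --connect jdbc:mysql://dbhost:3306/salesdb \
      --username etl_user \
      --password-file /user/etl/.mysql_password \
      --table transactions \
      --target-dir /data/raw/transactions \
      --num-mappers 4 \
      --fields-terminated-by '\t'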
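The batch jobs and scripts used to schedule the Hadoop programs can be sketched as a small shell script driven by cron; the jar name, driver class, and paths below are hypothetical placeholders:

    #!/bin/bash
    # Illustrative nightly batch script: stage raw files into HDFS, run a MapReduce job, and log the outcome.
    set -e

    RUN_DATE=$(date +%F)
    LOG_FILE=/var/log/hadoop-jobs/nightly_${RUN_DATE}.log
    mkdir -p "$(dirname "${LOG_FILE}")"

    {
      echo "Starting nightly Hadoop batch for ${RUN_DATE}"

      # Stage the day's raw files from the landing directory into HDFS
      hdfs dfs -mkdir -p /data/raw/${RUN_DATE}
      hdfs dfs -put -f /landing/incoming/*.csv /data/raw/${RUN_DATE}/

      # Run the MapReduce job written against the Java API (jar and driver class are placeholders)
      hadoop jar /opt/jobs/txn-analysis.jar com.example.TxnAnalysisDriver \
          /data/raw/${RUN_DATE} /data/processed/${RUN_DATE}

      echo "Nightly Hadoop batch for ${RUN_DATE} finished"
    } >> "${LOG_FILE}" 2>&1

    # Example crontab entry to run the script every night at 2 AM:
    #   0 2 * * * /opt/jobs/run_nightly_batch.sh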
