
Hadoop Administrator Resume


TECHNICAL SKILLS

Big Data tools: Apache Hadoop, YARN, Cloudera, Hortonworks, MapR, AWS, MapReduce, Hive, Pig, Kafka, Sqoop, Flume, Zookeeper, Nagios, Ganglia

Language and other skills: Java, Python, XML, JSON

RDBMS: MySQL, MS SQL Server

NoSQL Databases: HBase, MongoDB

Configuration Management Tool: Puppet

Environment: Linux, UNIX, macOS, and Windows

PROFESSIONAL EXPERIENCE

Hadoop Administrator

Confidential

Responsibilities:

  • Installed, managed, and monitored Hadoop clusters using Cloudera Manager.
  • Implemented a Big Data system in a cloud environment (AWS).
  • Implemented development nodes using CDH4 and CDH5 on RHEL, Ubuntu, and CentOS.
  • Implemented a Hadoop cluster on the MapR distribution.
  • Configured backups and performed recovery from a NameNode failure. Worked in cloud environments such as Amazon EC2 and an internal VMware virtual environment.
  • Configured and set up a backup and recovery cluster solution.
  • Implemented an encryption security layer in the Hadoop environment.
  • Balanced loads across servers and tuned them for optimal cluster performance.
  • Installed, configured, and managed Red Hat and Ubuntu operating systems.
  • Created Hive tables, loaded data, and wrote Hive queries.
  • Used Hive and Impala for data analysis.
  • Managed and scheduled jobs on a Hadoop cluster using the Oozie service.
  • Implemented data transfers between Hadoop and non-Hadoop systems, using Sqoop to extract data from MySQL and MS SQL Server and load it into HDFS.
  • Managed and reviewed Hadoop log files.
  • Engaged in daily meetings to discuss development progress.
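The Sqoop transfers described above typically look like the following sketch. The hostname, database, table, and paths are illustrative placeholders, not details from the resume:

```shell
# Import a MySQL table into HDFS with Sqoop. The connection string,
# credentials, table, and target directory below are hypothetical.
sqoop import \
  --connect jdbc:mysql://db-host.example.com:3306/salesdb \
  --username etl_user \
  --password-file /user/etl/.mysql.pw \
  --table orders \
  --target-dir /data/raw/orders \
  --num-mappers 4
```

The result can be verified with `hdfs dfs -ls /data/raw/orders`.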

Hadoop Administrator

Confidential

Responsibilities:

  • Installed and configured a Hadoop cluster on MapR.
  • Configured Hue on MapR.
  • Configured auditing in MapR.
  • Implemented password-file encryption for Sqoop on MapR.
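A minimal sketch of the Sqoop password-file approach mentioned above, assuming a hypothetical HDFS path and user; Sqoop reads the credential from a restricted file rather than from the command line:

```shell
# Store the database password in an HDFS file readable only by the
# Sqoop user (paths and password are illustrative placeholders).
echo -n 'S3cretPw' > .mysql.pw
hdfs dfs -put .mysql.pw /user/etl/.mysql.pw
hdfs dfs -chmod 400 /user/etl/.mysql.pw
rm .mysql.pw
# Sqoop jobs then reference it with: --password-file /user/etl/.mysql.pw
```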

Hadoop Administrator

Confidential

Responsibilities:

  • Prototyped a Big Data system with Hadoop, Java, ZooKeeper, Sqoop, and Hive using Cloudera on EC2 for an application log analyzer.
  • Implemented a 10-node CDH5 Hadoop cluster on RHEL.
  • Automated cluster setup and administration.
  • Created Hive tables, loaded data, and wrote Hive queries.
  • Implemented Kerberos security for various Hadoop services.
  • Implemented data transfers between Hadoop and non-Hadoop systems.
  • Managed and reviewed Hadoop log files.
  • Defined job flows.
  • Engaged in daily meetings to discuss development progress.
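Kerberizing a Hadoop service, as in the bullet above, usually involves creating a service principal and exporting its keytab. The realm, hostname, and keytab path below are placeholders:

```shell
# Create a service principal for an HDFS daemon and export its keytab
# (realm and hostname are hypothetical).
kadmin.local -q "addprinc -randkey hdfs/node01.example.com@EXAMPLE.COM"
kadmin.local -q "xst -k /etc/security/keytabs/hdfs.keytab hdfs/node01.example.com@EXAMPLE.COM"
# Verify the exported keytab entries:
klist -kt /etc/security/keytabs/hdfs.keytab
```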

Hadoop Administrator

Confidential

Responsibilities:

  • Prototyped a Big Data system with Hadoop, Java, ZooKeeper, Sqoop, and Hive using Cloudera on EC2 and HDP.
  • Planned and configured cluster setup and administration.
  • Implemented development nodes with CDH5 on RHEL.
  • Commissioned DataNodes as data grew and decommissioned them when hardware degraded.
  • Created Hive tables, loaded data, and wrote Hive queries.
  • Integrated Oozie jobs with Hive (email action, Hive Beeline) and with Pig scripts.
  • Implemented data transfers between Hadoop and non-Hadoop systems.
  • Managed and reviewed Hadoop log files.
  • Engaged in daily meetings to discuss development progress.
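The DataNode decommissioning mentioned above can be sketched as follows, assuming a hypothetical excludes file wired to `dfs.hosts.exclude` and a placeholder hostname:

```shell
# Decommission a DataNode: add it to the excludes file referenced by
# dfs.hosts.exclude, then tell the NameNode to re-read its host lists
# (file path and hostname are illustrative).
echo "node07.example.com" >> /etc/hadoop/conf/dfs.exclude
hdfs dfsadmin -refreshNodes
# Watch the node's status until replication has drained, then stop it:
hdfs dfsadmin -report
```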

Hadoop Administrator

Confidential

Responsibilities:

  • Prototyped Big Data systems using the Cloudera and Hortonworks distributions for various business use cases.
  • Planned and configured cluster setup and administration on a 10-node cluster.
  • Configured and set up a backup and recovery cluster solution.
  • Implemented a local Kerberos setup.
  • Implemented data transfers between Hadoop and non-Hadoop systems (MySQL, MS SQL Server).
  • Engaged in daily meetings to discuss development progress.
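Backup to a recovery cluster, as described above, is commonly done with DistCp. The cluster addresses and paths below are hypothetical:

```shell
# Copy warehouse data to the backup cluster with DistCp; -update
# copies only files that changed since the last run (NameNode
# addresses and paths are placeholders).
hadoop distcp -update \
  hdfs://prod-nn.example.com:8020/data/warehouse \
  hdfs://backup-nn.example.com:8020/data/warehouse
```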
