
Hadoop Administrator Resume


Boise, ID

OBJECTIVE

  • To secure a position in the corporate world suited to my skills and abilities, and to hone those skills by working in a dynamic environment, leading to both personal and organizational growth.

SUMMARY

  • 6+ years of IT experience, including 2+ years with Hortonworks installing and configuring Hadoop ecosystem components in existing clusters, and 4+ years of EMC Networker administration on Linux/AIX/Windows platforms.
  • Good knowledge of data warehousing and DBMS techniques.
  • Hadoop cluster planning and implementation.
  • Writing Oozie workflows and configuring Sqoop.
  • Built Hadoop clusters using both Cloudera RPMs and Hortonworks.
  • Configuring, managing, and tuning HDFS with MapReduce (v1)/YARN (v2) architecture with high availability.
  • Excellent understanding of Hadoop architecture and its various components, such as HDFS, JobTracker, TaskTracker, NameNode, DataNode, YARN, HBase, Hive, Sqoop, Pig, ZooKeeper, Flume, MapReduce, and Kafka.
  • Adept at mapping client requirements, custom-designing solutions, and troubleshooting complex software-related issues.
  • Worked on end-to-end support in EMC Networker backup and recovery.
  • Professional experience in installing and maintaining EMC Networker backup and recovery software and upgrading to the latest technologies.
  • Capacity planning based on backup growth and implementing the plan based on client requirements.
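The Sqoop configuration work listed above can be illustrated with a typical import command; the connection string, credentials, table, and paths below are hypothetical placeholders, not values from an actual engagement:

```shell
# Hypothetical Sqoop import of a relational table into HDFS and Hive.
# The JDBC URL, user, table, and target directory are placeholders.
sqoop import \
  --connect jdbc:mysql://db.example.com:3306/sales \
  --username etl_user -P \
  --table orders \
  --target-dir /data/raw/orders \
  --num-mappers 4 \
  --hive-import --hive-table default.orders
```

A workflow like this is what an Oozie coordinator would typically schedule for the recurring imports mentioned above.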

TECHNICAL SKILLS

Big data/Hadoop: HDFS, MapReduce, YARN, Hive, Pig, HBase, Sqoop, ZooKeeper, Kafka, Flume.

Backup Technologies: EMC Networker, Commvault, HPDP, Symantec Backup Exec.

Programming/Scripting Languages: Java, shell scripting.

Database: Oracle, MySQL.

Web Development Tools: HTML.

Distribution Management: Cloudera, Hortonworks, Apache Hadoop.

Operating Systems: RHEL (5,6), Ubuntu, Windows (2008,2012).

PROFESSIONAL EXPERIENCE

Confidential, Boise

Hadoop Administrator

Responsibilities:

  • Installed and configured a multi-node, fully distributed Hadoop cluster spanning a large number of nodes.
  • Addressing and troubleshooting issues in the Hortonworks cluster on a daily basis.
  • File system management and monitoring.
  • Provided Hadoop, OS, Hardware optimizations.
  • Installed and configured Hadoop ecosystem components such as MapReduce, Hive, Pig, Sqoop, HBase, ZooKeeper, and Oozie.
  • Involved in testing HDFS, Hive, Pig, and MapReduce access for new users.
  • Cluster maintenance as well as creation and removal of nodes using Apache Ambari.
  • Worked on setting up high availability for major production cluster and designed automatic failover control using zookeeper and quorum journal nodes.
  • Implemented the Capacity Scheduler to allocate a fair share of resources to small jobs.
  • Configured Oozie for workflow automation and coordination.
  • Implemented rack-awareness topology on the Hadoop cluster.
  • Importing and exporting structured data from different relational databases into HDFS and Hive using Sqoop.
  • Configured ZooKeeper to implement node coordination in clustering support.
  • Rebalancing HDFS on the Hortonworks Hadoop cluster.
  • Allocating name and space quotas to users in case of space problems.
  • Installed and configured Hadoop security tool Ranger and enabled Kerberos.
  • Managing Horton works cluster performance issues.
  • Backed up data on regular basis to a remote cluster using DistCp.
  • Regular commissioning and decommissioning of nodes depending on the amount of data.
  • Maintaining the cluster so it remains healthy and in optimal working condition.
  • Handling upgrades and patch updates.
  • Closely working with Development team on job failure issues.
  • Working with Linux on Disk failure issues.
  • Creating principals and keytab files as per requirements.
  • Creating HDFS directories as per requirements and granting permissions through Ranger.
  • Working with Hortonworks support on critical issues.
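The rack-awareness item above is typically wired up through a topology script referenced by net.topology.script.file.name in core-site.xml. A minimal Python sketch of such a script follows; the IP-to-rack mapping is a made-up example, not taken from a real cluster:

```python
#!/usr/bin/env python3
# Sketch of a Hadoop rack-topology script (net.topology.script.file.name).
# Hadoop invokes it with one or more host/IP arguments and expects one
# rack path per argument on stdout. The mapping below is hypothetical;
# a production cluster would usually load it from a file.
import sys

RACK_MAP = {
    "10.0.1.11": "/dc1/rack1",
    "10.0.1.12": "/dc1/rack1",
    "10.0.2.21": "/dc1/rack2",
}

def resolve(host):
    # Unknown hosts fall back to Hadoop's conventional default rack.
    return RACK_MAP.get(host, "/default-rack")

if __name__ == "__main__":
    for arg in sys.argv[1:]:
        print(resolve(arg))
```

With this in place, the NameNode can keep block replicas on separate racks, which is what makes the DistCp and HA items above resilient to a whole-rack failure.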

Environment: Hortonworks (HDP 2.5), Ambari 2.2, HDFS, Shell Scripting, Python, Java, Hive, Spark, Sqoop, Linux, MySQL, ZooKeeper, AWS, HBase, Oozie, Kerberos, Ranger

Confidential

Hadoop Administrator

Responsibilities:

  • Monitoring Hadoop Ambari cluster and jobs.
  • Working on an 83-node Hortonworks production Hadoop cluster and monitoring Storm topologies.
  • Worked on ELK (Elasticsearch, Logstash, and Kibana) cluster setup for log analysis: writing Logstash grok patterns and dashboards, alerting with ElastAlert, monitoring with Marvel, and securing with the Shield plugin.
  • Worked on Ranger and Knox for test clusters.
  • Worked on Apache NiFi and Zeppelin for different POCs.
  • Worked on setting up Hadoop clusters in AWS.
  • Working knowledge of Apache Spark.
  • Improved HBase performance and Yarn performance by fine tuning parameters.
  • Participating in code deployments and coordinating with developers.
  • Creating snapshots and restoring snapshots.
  • Troubleshooting issues with jobs and coordinating with the dev team to fix them.
  • Configuring scripts to back up the MySQL database and take HDFS snapshots.
  • Good experience troubleshooting production-level issues in the cluster and its functionality.
  • Participating in on-call meetings with the onshore team to report the status of jobs and the cluster.
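The backup scripting mentioned above can be sketched as a simple nightly job; the database name, paths, and credential handling below are assumptions for illustration only:

```shell
#!/bin/bash
# Hypothetical nightly backup: dump a MySQL database (e.g. the Hive
# metastore) and take an HDFS snapshot. All names are placeholders.
set -euo pipefail

STAMP=$(date +%Y%m%d)

# 1. Consistent MySQL dump without locking InnoDB tables.
mysqldump --single-transaction -u backup_user -p"$MYSQL_PW" metastore \
  > /backups/mysql/metastore-"$STAMP".sql

# 2. HDFS snapshot of a snapshottable directory (the directory must be
#    enabled once by an admin: hdfs dfsadmin -allowSnapshot /data).
hdfs dfs -createSnapshot /data "daily-$STAMP"
```

Restoring is then a matter of replaying the dump into MySQL and copying files back out of the read-only .snapshot directory.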

Confidential

Backup & Storage Admin

Responsibilities:

  • Providing end-to-end support in EMC Networker backup and recovery.
  • Professional experience in installing and maintaining EMC Networker backup and recovery software and upgrading to the latest technologies.
  • Capacity planning based on backup growth and implementing the plan based on client requirements.
  • Identifying opportunities to improve systems by optimizing backup time and minimizing errors, thereby increasing resilience.
  • Troubleshooting client backup failures promptly, thereby ensuring up-to-date backup data.
  • Creating problem-management tickets for recurring failures and providing the client with a permanent fix and root cause for each failure.
  • Working on problem management and risk management, raising risks in critical situations and updating them in the Risk Register.
  • Preparing SOPs for EMC Networker module backups.
  • Planning, implementing, and validating ongoing maintenance in backup and recovery.
  • Worked on both Windows- and Linux-based backup servers.
