
Hadoop Administrator Resume


PA

SUMMARY

  • Overall 7 years of IT experience, including 3 years as a Hadoop Administrator and 6 years as a Unix/Linux Administrator.
  • Expertise in installing and upgrading Hadoop and its related components in multi-node cluster environments.
  • Expertise in managing, monitoring, and administering a 100-node Hadoop cluster.
  • Expertise in installing and setting up the Hadoop stack.
  • Expertise in implementing and maintaining Hadoop and Hive security.
  • Expertise in installing and working with monitoring tools like Ganglia and Nagios.
  • Expertise with Hadoop ecosystem components like Hive, Pig, and HBase.
  • Expertise in Sqoop and Flume.
  • Expertise in installing, configuring, and managing Red Hat Linux 4/5 and CentOS 6.5.
  • Expertise in Unix shell scripting.
  • Working experience in large production data centers with 24x7 on-call support.
  • Excellent communication skills; hardworking team player with the ability to work under pressure in a highly visible role.
  • Expertise in ETL processes, including importing data from Teradata using Sqoop.
  • Working experience with MapReduce and YARN.

TECHNICAL SKILLS

Operating Systems: Red Hat Linux 4/5, CentOS 6.5, Windows 95/98/NT/2000, Windows Vista, Windows 7

Networking: HTTP, SMTP, NFS, CIFS, TCP/IP, SSH, FTP, DNS, DHCP

Scripting: Unix Shell Script

Hadoop Ecosystem: Hive, Pig, Flume, Sqoop & HBase

Monitoring Tools: Ganglia, Nagios

Databases: Oracle, MySQL

PROFESSIONAL EXPERIENCE

Confidential, PA

Hadoop Administrator

Responsibilities:

  • Installing, upgrading, and managing a 100-node Hadoop cluster on the MapR and Hortonworks distributions.
  • Experience in building clusters in different environments (test, production).
  • Migrating data between two clusters using DistCp.
  • Expertise in importing data from Teradata using Sqoop.
  • Expertise in analyzing logs and diagnosing issues.
  • Installing and working with monitoring tools like Nagios and Ganglia.
  • Built and configured log data loading into HDFS using Flume.
  • Maintaining Hadoop and Hive security using PAM.
  • Working with Hadoop ecosystem components like Hive, Pig, and HBase.
  • Expertise in installing RPM packages on Linux.
  • Importing and exporting data into HDFS and Hive using Sqoop.
  • Expertise in building clusters for staging, development, and production environments.
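The data-movement tasks above (DistCp between clusters, Sqoop from Teradata) can be sketched as shell commands. This is a minimal sketch: the hostnames, paths, database, and table names are placeholders rather than values from this resume, and `DRY_RUN=1` only echoes the commands instead of running them against a cluster.

```shell
#!/bin/sh
# Sketch of a cross-cluster copy (DistCp) and a Teradata import (Sqoop).
# All hostnames, paths, and table names below are placeholders.
DRY_RUN=1   # set to 0 on a real cluster

run() {
    if [ "$DRY_RUN" -eq 1 ]; then
        echo "$@"        # dry run: just print the command
    else
        "$@"             # real run: execute it
    fi
}

# Copy /data from cluster A to cluster B, transferring only changed files
run hadoop distcp -update hdfs://nn-a:8020/data hdfs://nn-b:8020/data

# Import a Teradata table into Hive (Teradata JDBC connector assumed on the classpath)
run sqoop import \
    --connect jdbc:teradata://td-host/DATABASE=sales \
    --table ORDERS \
    --hive-import --hive-table sales.orders \
    -m 4
```

The dry-run wrapper is just a convenience for reviewing the generated commands before pointing them at production.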

Confidential, High Point, NC.

Hadoop Administrator

Responsibilities:

  • Researched and recommended hardware configurations for the Hadoop cluster.
  • Installing, upgrading, and managing a Hadoop cluster on the Cloudera distribution.
  • Troubleshooting cluster issues such as DataNodes going down, network failures, and missing data blocks.
  • Installing and working with monitoring tools like Nagios and Ganglia.
  • Analyzed logs and diagnosed issues.
  • Built an automated setup for cluster monitoring and the issue-escalation process.
  • Built and configured log data loading into HDFS using Flume.
  • Imported and exported data into HDFS and Hive using Sqoop.
  • Involved in building Pig Latin scripts and Hive queries.
  • Expertise in installing RPM packages on Linux.
  • Good experience with Cloudera Impala.
  • Adding and removing nodes from the cluster.
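The Flume log-loading bullet above could correspond to an agent configuration along these lines; the agent, source, channel, and sink names, the log path, and the HDFS path are all illustrative, not taken from this resume.

```
# flume-agent.conf (illustrative): tail an application log into HDFS
agent1.sources  = tail1
agent1.channels = mem1
agent1.sinks    = hdfs1

# exec source: follow the application log (placeholder path)
agent1.sources.tail1.type = exec
agent1.sources.tail1.command = tail -F /var/log/app/app.log
agent1.sources.tail1.channels = mem1

# in-memory channel between source and sink
agent1.channels.mem1.type = memory
agent1.channels.mem1.capacity = 10000

# HDFS sink, bucketed by day; useLocalTimeStamp lets %Y-%m-%d resolve
agent1.sinks.hdfs1.type = hdfs
agent1.sinks.hdfs1.hdfs.path = /flume/app/%Y-%m-%d
agent1.sinks.hdfs1.hdfs.fileType = DataStream
agent1.sinks.hdfs1.hdfs.useLocalTimeStamp = true
agent1.sinks.hdfs1.channel = mem1
```

An agent like this would be started with something like `flume-ng agent --conf-file flume-agent.conf --name agent1`.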

Confidential

Unix Administrator

Responsibilities:

  • Involved in supporting and monitoring production Linux systems.
  • Installed SQL databases and performed DB backups.
  • Expertise in archiving logs and monitoring jobs.
  • Monitored daily Linux jobs and the log management system.
  • Expertise in troubleshooting and working with a team to fix large production issues.
  • Expertise in creating and managing DB tables, indexes, and views.
  • Created and managed user accounts and permissions at the Linux and DB levels.
  • Expertise in security at both the OS and DB levels.
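The account-management bullet above can be illustrated with a short shell sketch. Usernames and group names are placeholders; the privileged commands are shown commented out because they need root, and the runnable part only demonstrates the corresponding permission bits on a throwaway directory.

```shell
#!/bin/sh
# Account creation needs root, so those commands are shown but commented out.
# groupadd dbadmin                          # group for DB administrators (placeholder name)
# useradd -m -g dbadmin -s /bin/bash alice  # create an account in that group (placeholder user)
# usermod -aG wheel alice                   # grant sudo via the wheel group

# Runnable demo of the matching permission model: owner full access,
# group read/execute, others nothing (mode 750) on a temp directory.
dir=$(mktemp -d)
chmod 750 "$dir"
mode=$(stat -c '%a' "$dir" 2>/dev/null || stat -f '%Lp' "$dir")  # GNU stat, BSD fallback
echo "$mode"    # 750
rmdir "$dir"
```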

Confidential

Unix Administrator

Responsibilities:

  • Built Linux servers; upgraded and patched existing servers; compiled, built, and upgraded the Linux kernel.
  • Set up Solaris custom JumpStart servers and clients and performed JumpStart installations.
  • Worked with Telnet, FTP, TCP/IP, and rlogin to inter-operate hosts.
  • Carried out various systems administration tasks in CentOS and Red Hat Linux environments.
  • Performed regular day-to-day system administration tasks including user management, backups, network management, and software management, including documentation.
  • Recommended system configurations for clients based on estimated requirements.
  • Performed reorganization of disk partitions and file systems, hard disk additions, and memory upgrades.
  • Monitored system activities, log maintenance, and disk space management.
  • Encapsulated root file systems and mirrored file systems to ensure systems had redundant boot disks.
  • Administered Apache servers and published client web sites on them.
  • Fixed system problems based on system email alerts and users' complaints.
  • Upgraded software, added patches, and added new hardware on UNIX machines.
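The disk-space and log-maintenance bullets above can be sketched as a small shell filter. The 90% threshold and the canned `df -P` sample are illustrative so the example is deterministic; on a live host the function would be fed `df -P` output directly.

```shell
#!/bin/sh
# Print mount points whose usage is at or above a threshold (percent).
# Reads `df -P`-style output on stdin; column 5 is Capacity, column 6 the mount point.
check_full() {
    awk -v t="$1" 'NR > 1 { sub("%", "", $5); if ($5 + 0 >= t) print $6 }'
}

# Canned sample for determinism; on a real host use: df -P | check_full 90
sample='Filesystem 1024-blocks Used Available Capacity Mounted-on
/dev/sda1 1000000 950000 50000 95% /
/dev/sdb1 2000000 200000 1800000 10% /data'

full=$(printf '%s\n' "$sample" | check_full 90)
echo "$full"    # /

# A companion log-maintenance step (destructive, so shown commented out):
# find /var/log -name '*.log' -mtime +30 -delete
```

A check like this is easy to drop into cron and pair with a mail alert when the output is non-empty.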
