
Linux Admin/Hadoop Admin Resume


PROFESSIONAL SUMMARY:

  • Over 4 years of experience in various phases of project implementations, including system integration, with 2+ years of experience in Big Data technologies and the Hadoop ecosystem: HDFS, MapReduce, YARN, Pig, Hive, Oozie, HBase, Sentry, Sqoop, Flume, and ZooKeeper.
  • Experience in administering, installing, configuring, supporting, and maintaining Hadoop clusters using the Cloudera, Hortonworks, and MapR distributions.
  • Technical support, user training, and documentation on various technologies, primarily Linux/Unix and Big Data systems, in diverse industries.
  • Experience in setting up, configuring, and monitoring Hadoop clusters using Cloudera CDH4 and CDH5 and Hortonworks HDP 2.1, 2.2, and 2.3 on Ubuntu, Red Hat, and CentOS systems.
  • Experience in the design, development, maintenance, and support of Big Data analytics using Hadoop ecosystem components such as HDFS, Hive, Pig, HBase, Sqoop, Flume, MapReduce, Kafka, and Oozie.
  • Extensive knowledge of the MapReduce and HDFS frameworks.
  • Hadoop cluster capacity planning, performance tuning, monitoring, and troubleshooting.
  • Experience in Big Data domains such as shared services (Hadoop clusters, operational models, inter-company chargeback, and lifecycle management).
  • Worked on NoSQL databases including HBase and MongoDB.
  • Hands-on experience productionizing Hadoop clusters.
  • Exposure to installing Hadoop and its ecosystem components such as Hive and Pig.
  • Experience in Cloudera Hadoop upgrades and patches, installation of ecosystem products through Cloudera Manager, Cloudera Manager upgrades, and installing and supporting operating systems and hardware on CentOS/RHEL.
  • Experience in systems and network design; physical system consolidation through server and storage virtualization; and remote access solutions.
  • Experience in understanding and managing Hadoop log files and in managing Hadoop infrastructure with Cloudera Manager. Involved in building a Big Data cluster and successfully performed installation of CDH using Cloudera Manager.
  • Experience in Linux admin activities and in IT system design, analysis and management.
  • Experience in HDFS data storage and support for running MapReduce jobs.
  • Experience in performance management of Hadoop clusters.
  • Experience in using Flume to load log files into HDFS (see the Flume sketch after this list).
  • Expertise in using Oozie for configuring job flows.
  • Managed cluster configuration to meet the needs of data analysis, whether I/O-bound or CPU-bound.
  • Experience in using a full suite of infrastructure services such as DNS and NFS mounts.
  • Developed Hive queries and automated them to run on an hourly, daily, and weekly basis.
  • Strong troubleshooting and performance tuning skills.
  • Coordinated cluster services through ZooKeeper.
  • Imported and exported data into HDFS and Hive using Sqoop (see the Sqoop sketch after this list).
  • Experience in importing and exporting preprocessed data into commercial analytic RDBMS databases.
  • Deep understanding of data warehouse approaches, industry standards, and best practices. Created Hive tables, loaded data, and wrote Hive queries that run internally as MapReduce jobs.
  • Implemented business logic by writing Pig UDFs in Java and used various UDFs from Piggybank and other sources.
  • Used Oozie operational services for batch processing and scheduling workflows dynamically.
  • Extensively worked on creating end-to-end data pipeline orchestration using Oozie.
  • Responsible for continuous monitoring and management of Elastic MapReduce clusters through the AWS console.
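
A minimal sketch of a Flume agent of the kind used to load log files into HDFS, as referenced above. The agent name, log path, and NameNode URL are illustrative placeholders:

    # Write an illustrative Flume agent config (tail a log file into HDFS)
    cat > /etc/flume-ng/conf/flume-hdfs.conf <<'EOF'
    a1.sources = r1
    a1.channels = c1
    a1.sinks = k1

    a1.sources.r1.type = exec
    a1.sources.r1.command = tail -F /var/log/app/app.log
    a1.sources.r1.channels = c1

    a1.channels.c1.type = memory
    a1.channels.c1.capacity = 10000

    a1.sinks.k1.type = hdfs
    a1.sinks.k1.channel = c1
    a1.sinks.k1.hdfs.path = hdfs://namenode:8020/logs/%Y-%m-%d
    a1.sinks.k1.hdfs.fileType = DataStream
    a1.sinks.k1.hdfs.useLocalTimeStamp = true
    EOF

    # Start the agent
    flume-ng agent --conf /etc/flume-ng/conf \
      --conf-file /etc/flume-ng/conf/flume-hdfs.conf --name a1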
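
A hedged sketch of the Sqoop import/export workflow implied above; the JDBC URL, database, tables, and mapper count are assumptions for illustration:

    # Import an RDBMS table into a Hive table (placeholders throughout)
    sqoop import \
      --connect jdbc:mysql://dbhost:3306/sales \
      --username etl_user -P \
      --table orders \
      --hive-import --hive-table analytics.orders \
      --num-mappers 4

    # Export preprocessed results from HDFS back to the RDBMS
    sqoop export \
      --connect jdbc:mysql://dbhost:3306/sales \
      --username etl_user -P \
      --table order_summary \
      --export-dir /user/hive/warehouse/analytics.db/order_summary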

TECHNICAL SKILLS:

Programming Languages: Core Java, C, C++

Hadoop/Big Data: Hadoop, HDFS, Hive, Sqoop, Oozie, Flume and MapReduce

NoSQL Databases: HBase, MongoDB

Operating Systems: Linux, Unix, Ubuntu

Databases: NoSQL (MongoDB), RDBMS

Scripting Languages: Bash, Perl, Python, and shell scripting

Tools: Puppet, Chef, Nagios, Ganglia

WORK EXPERIENCE:

Confidential

Hadoop Admin

Responsibilities:

  • Installed and configured Hadoop and ecosystem components in Cloudera and Hortonworks environments.
  • Installed and configured Hadoop, Hive, and Pig on Amazon EC2 servers.
  • Upgraded the cluster from CDH4 to CDH5; the tasks were first performed on the staging platform before being applied to the production cluster.
  • Enabled Kerberos and AD security on the Cloudera cluster running CDH 5.4.4.
  • Implemented Sentry on the dev cluster.
  • Configured a MySQL database to store Hive metadata (see the metastore sketch after this list).
  • Involved in managing and reviewing Hadoop log files.
  • Involved in running Hadoop streaming jobs to process terabytes of text data.
  • Worked with Linux systems and MySQL database on a regular basis.
  • Supported MapReduce programs that ran on the cluster.
  • Involved in loading data from UNIX file system to HDFS.
  • Involved in creating Hive tables, loading them with data, and writing Hive queries that run internally as MapReduce jobs.
  • As an admin, followed standard backup policies to ensure high availability of the cluster.
  • Involved in analyzing system failures, identifying root causes, and recommending courses of action. Documented system processes and procedures for future reference.
  • Worked with systems engineering team to plan and deploy new Hadoop environments and expand existing Hadoop clusters.
  • Installed and configured Hive, Pig, Sqoop and Oozie on the HDP 2.2 cluster.
  • Managed backups for key data stores.
  • Supported configuring, sizing, tuning, and monitoring of analytic clusters.
  • Implemented security and regulatory compliance measures.
  • Streamlined cluster scaling and configuration.
  • Monitored cluster job performance and was involved in capacity planning.
  • Worked with application teams to install operating system and Hadoop updates, patches, and version upgrades as required.
  • Documented technical designs and procedures.
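
A minimal sketch of pointing the Hive metastore at MySQL, as referenced in the bullet above. The host, database name, and credentials are placeholders, and these <property> elements belong inside <configuration> in hive-site.xml:

    # Write the metastore connection properties to a fragment for review
    cat > /tmp/hive-metastore-props.xml <<'EOF'
    <property>
      <name>javax.jdo.option.ConnectionURL</name>
      <value>jdbc:mysql://metastore-db:3306/metastore?createDatabaseIfNotExist=true</value>
    </property>
    <property>
      <name>javax.jdo.option.ConnectionDriverName</name>
      <value>com.mysql.jdbc.Driver</value>
    </property>
    <property>
      <name>javax.jdo.option.ConnectionUserName</name>
      <value>hive</value>
    </property>
    <property>
      <name>javax.jdo.option.ConnectionPassword</name>
      <value>hive_password</value>
    </property>
    EOF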

Environment: HDFS, Hive, Pig, Sentry, Kerberos, LDAP, YARN, Cloudera Manager, and Ambari

Confidential

Hadoop Administrator

Responsibilities:

  • Responsible for cluster maintenance, adding and removing cluster nodes, cluster monitoring and troubleshooting, and managing and reviewing data backups and Hadoop log files on Hortonworks, MapR, and Cloudera clusters.
  • Responsible for architecting Hadoop clusters with the Hortonworks distribution platform HDP 1.3.2 and Cloudera CDH4.
  • Experienced in installation and configuration of Hortonworks HDP 1.3.2 and Cloudera CDH4.
  • Upgraded the Hortonworks distribution from HDP 1.3.2 to HDP 2.2.
  • Responsible for onboarding new users to the Hadoop cluster (creating a home directory for each user and providing access to datasets).
  • Played a key role, along with other teams in the company, in deciding the hardware configurations for the cluster.
  • Resolved tickets submitted by users and P1 issues, troubleshooting, documenting, and resolving the errors.
  • Experienced in writing automated scripts for monitoring file systems and key MapR services (see the monitoring sketch after this list).
  • Responsible for presenting new ecosystem components to be implemented in the cluster to teams and managers.
  • Helped the users in production deployments throughout the process.
  • Managed and reviewed Hadoop log files as part of administration for troubleshooting purposes. Communicated and escalated issues appropriately.
  • Applied patches to the cluster.
  • Added new DataNodes when needed and ran the balancer (see the decommissioning sketch after this list).
  • Responsible for building scalable distributed data solutions using Hadoop.
  • Continuously monitored and managed the Hadoop cluster through Ganglia and Nagios.
  • Installed the Oozie workflow engine to run multiple Hive and Pig jobs, which run independently based on time and data availability.
  • Performed major and minor upgrades to the Hadoop cluster.
  • Upgraded the Cloudera Hadoop ecosystem components in the cluster using Cloudera distribution packages.
  • Performed stress and performance testing and benchmarking for the cluster.
  • Commissioned and decommissioned DataNodes in the cluster when problems arose.
  • Debugged and resolved major issues with Cloudera Manager by working with the Cloudera support team.
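
A minimal sketch of the kind of automated file-system monitoring script mentioned above; the threshold and alert recipient are assumptions:

    #!/usr/bin/env bash
    # Alert when any mounted file system crosses a usage threshold
    THRESHOLD=85   # percent; illustrative
    df -hP | awk 'NR>1 {gsub("%","",$5); print $5, $6}' | while read use mount; do
      if [ "$use" -ge "$THRESHOLD" ]; then
        echo "WARNING: ${mount} is at ${use}% on $(hostname)" \
          | mail -s "Disk usage alert" ops-team@example.com
      fi
    done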
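
A hedged sketch of DataNode decommissioning and rebalancing, as referenced above; the hostname and exclude-file path are placeholders and assume dfs.hosts.exclude is set in hdfs-site.xml:

    # 1. Add the host to the exclude file the NameNode reads
    echo "datanode07.example.com" >> /etc/hadoop/conf/dfs.exclude

    # 2. Ask the NameNode to re-read its include/exclude lists
    hdfs dfsadmin -refreshNodes

    # 3. Confirm the node eventually reports "Decommissioned"
    hdfs dfsadmin -report | grep -B1 -A6 datanode07

    # After adding or removing nodes, rebalance block placement;
    # the threshold is the allowed % deviation from average utilization
    hdfs balancer -threshold 10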

Environment: Flume, Oozie, Pig, Sqoop, MongoDB, HBase, Hive, MapReduce, YARN, Hortonworks, and Cloudera Manager

Confidential

Linux Admin/Hadoop Admin

Responsibilities:

  • Responsible for administration, installation, and maintenance of RHEL 3/4 and Sun Solaris 9.
  • Installation, configuration, and upgrade of Red Hat Linux (2, 3, 4, 5), CentOS, SUSE, Solaris, AIX, HP-UX, and Windows 2000/2003/2008 operating systems.
  • User administration, management, and archiving.
  • Configured NFS, NIS, NIS+, DNS, automount, and disk space management on Sun servers.
  • Experience in configuring and managing virtual disks, disk mirrors, and RAID levels.
  • Worked with tools that support OVF (Open Virtualization Format), including VirtualBox and VMware Workstation.
  • Extensive use of Logical Volume Management (LVM) to create volume groups and logical volumes (LVs) and to extend volumes to support file system growth (see the LVM sketch after this list).
  • Configured TCP/IP, NFS, NTP, automount, sendmail, etc., as required.
  • Installation, configuration, and upgrades of Solaris, Linux, HP-UX, and AIX operating systems, using JumpStart/Flash/Live Upgrade for Solaris, Kickstart for Red Hat, and Ignite for HP-UX for automated installations.
  • Installation, maintenance, administration, and troubleshooting of Sun Solaris, AIX, HP-UX, and Linux.
  • Performed automated operating system installations using JumpStart for Solaris and Kickstart for Linux.
  • Installed and configured Solaris 10 with zone configuration.
  • Created and maintained ZFS filesystems.
  • Involved in configuration of NIS, NIS+ and LDAP.
  • Managed the ACTT ticket queue, resolving operating system issues related to process failures and software and hardware system alarms that generate trouble tickets for Sun servers.
  • Involved in Disaster Recovery testing every quarter.
  • Troubleshot day-to-day system and user problems.
  • Used VMware for testing various applications on different operating systems.
  • Interacted with various teams, such as the Oracle database team and the NetBackup team.
  • Troubleshot various problems and logged calls with vendors.
  • Installed and configured Hadoop MapReduce and HDFS; developed multiple MapReduce jobs in Java for data cleaning and preprocessing.
  • Worked on analyzing the Hadoop cluster and different Big Data analytic tools, including Pig and the HBase database.
  • Responsible for building scalable distributed data solutions using Hadoop.
  • Created HBase tables to store variable data formats of PII data coming from different portfolios.
  • Implemented a script to transmit sysprin information from Oracle to HBase using Sqoop (see the Sqoop-to-HBase sketch after this list).
  • Implemented best income logic using Pig scripts and UDFs.
  • Implemented test scripts to support test driven development and continuous integration.
  • Worked on tuning the performance of Pig queries.
  • Worked with application teams to install operating system and Hadoop updates, patches, and version upgrades as required.
  • Responsible for managing data coming from different sources.
  • Involved in loading data from UNIX file system to HDFS.
  • Experienced in managing and reviewing Hadoop log files.
  • Loaded and transformed large sets of structured, semi-structured, and unstructured data, including Avro, sequence files, and XML files.
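
A minimal LVM sketch matching the volume-group work described above; device names, sizes, and mount points are illustrative:

    # Create a volume group and a logical volume, then put a file system on it
    pvcreate /dev/sdb1
    vgcreate datavg /dev/sdb1
    lvcreate -L 50G -n datalv datavg
    mkfs.ext4 /dev/datavg/datalv
    mount /dev/datavg/datalv /data

    # Later, extend the volume and grow the file system to match
    lvextend -L +20G /dev/datavg/datalv
    resize2fs /dev/datavg/datalv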
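
A hedged sketch of the Oracle-to-HBase Sqoop transfer mentioned above; the connection string, table, row key, and column family are assumptions for illustration:

    sqoop import \
      --connect jdbc:oracle:thin:@//orahost:1521/ORCL \
      --username etl_user -P \
      --table SYSPRIN_INFO \
      --hbase-table sysprin \
      --column-family cf \
      --hbase-row-key SYSPRIN_ID \
      --num-mappers 4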

Confidential

Windows/Linux Admin

Responsibilities:

  • Installed, configured, and troubleshot the Lotus Notes client with the respective ID files and locals.
  • Maintained and configured Active Directory.
  • Installed Windows and Linux operating systems on desktops and laptops.
  • Created and maintained domains.
  • Added machines to the domain and installed the required software.
  • Created users and granted them permissions to the required DLs.
  • Installed and configured IBM PComm (Personal Communications).
  • Configured network printers for desktops and laptops.
  • Installed Microsoft products such as MS Office, Visio, and MS Project, and other software such as VSS.
  • Set up the process for attending to support requests from users.
  • Laid down the network design for the premises and recommended the required equipment.
  • Had desktop PCs assembled and operating systems and other applications installed.
  • Deployed, maintained, and troubleshot mail servers, application servers, DHCP servers, and modems (see the DHCP sketch after this list).
  • Ensured proper working of all equipment.
  • Produced reports on network productivity, plans, and issues for higher management.
  • Configured, maintained, and troubleshot LANs, WANs, and modems.
  • Deployed and troubleshot DHCP, DNS, Citrix, proxy, and application servers.
  • Installed and maintained security software on mail servers, desktops, and application servers; implemented security policies.
  • Installed and configured an Aruba 800 controller and access points, user roles, web user authentication, 802.1X authentication, and captive portal authentication.
  • Configured and maintained Cisco routers, HP Layer 3 switches, and Layer 2 switches.
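
A minimal ISC DHCP server sketch of the kind deployed above; the subnet, address ranges, and lease times are placeholders:

    # Illustrative dhcpd.conf subnet declaration
    cat > /etc/dhcp/dhcpd.conf <<'EOF'
    subnet 192.168.10.0 netmask 255.255.255.0 {
      range 192.168.10.100 192.168.10.200;
      option routers 192.168.10.1;
      option domain-name-servers 192.168.10.5, 192.168.10.6;
      default-lease-time 86400;
      max-lease-time 172800;
    }
    EOF

    # Restart the service to pick up the change
    service dhcpd restart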
