
Hadoop Admin Resume


Michigan

SUMMARY

  • 6+ years of IT experience and 4+ years of experience in the administration, modification, installation, and maintenance of Hadoop on the Linux (RHEL) operating system.
  • Developed solid understanding of operating systems like Linux, UNIX, Windows.
  • Gained experience in installing, administering, and supporting operating systems and hardware in an enterprise environment (Centos/RHEL).
  • Gained experience in the complete software design life cycle, including design, development, testing, and implementation of moderately to highly complex systems.
  • Gained experience in IT systems design, systems analysis, development and management.
  • Possessed hands-on experience in installing, configuring, supporting, and managing Hadoop clusters using Apache, Hortonworks, and Cloudera (CDH3, CDH4, CDH5) distributions with YARN.
  • Possessed hands-on experience in Hadoop cluster capacity planning, performance tuning, cluster monitoring, and troubleshooting.
  • Demonstrated ability to design Big Data solutions for traditional enterprise businesses.
  • Gained experience in data Integrity/Recovery/High Availability; Service & data migration; Disaster Recovery Planning; Contingency Planning; Capacity Planning, Research & Development; Risk Assessment & Planning; Cost Benefits Analysis.
  • Gained experience in setting, configuring and managing of security for Hadoop clusters.
  • Gained implementation experience in configuring and tuning various components such as HDFS, MapReduce, ZooKeeper, YARN, HBase, Hive, Impala, Sqoop, Oozie, and Cloudera Manager.
  • Performed installation of various Hadoop ecosystem components and Hadoop daemons.
  • Involved in benchmarking Hadoop/HBase cluster file systems with various batch jobs and workloads.
  • Obtained experience in monitoring and troubleshooting issues with Linux memory, CPU, OS, storage, and network.
  • Obtained good experience in designing, configuring, and managing backup and disaster recovery for Hadoop data.
  • Gained hands-on experience in analyzing log files for Hadoop and ecosystem services and finding root causes.
  • Obtained experience on Commissioning, Decommissioning, Balancing, and Managing Nodes and tuning server for optimal performance of the cluster.
  • Involved in Cluster maintenance, trouble shooting, Monitoring and following proper backup& Recovery strategies.
  • Obtained experience in HDFS data storage and support for running MapReduce jobs.
  • Gained experience in installing and configuring Hadoop ecosystem components such as Sqoop, Pig, Hive, and Ansible.
  • Obtained experience in importing and exporting data using Sqoop between HDFS and relational database systems/mainframes.
  • Supported optimization of the performance of HBase/Hive/Pig jobs.
  • Closely worked with Developers and Analysts to address project requirements. Ability to effectively manage time and prioritize multiple projects
  • Gathered expertise in MySQL, MS Access, PostgreSQL, and Oracle 8i/9i/10g databases.
  • Possessed a strong ability to troubleshoot issues generated while building, deploying, and supporting production.
  • Well versed in programming languages such as C, C++, Java, .NET, Python, and SAP ABAP.
  • Possessed expertise in Enterprise Systems configuration and business processes in SAP.
  • Self-motivated, quick learner, takes independent responsibility to contribute and teamwork skills
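The Sqoop import/export experience listed above can be illustrated with a minimal sketch. Everything in this example (the MySQL hostname, database, user, table, and HDFS target directory) is a hypothetical placeholder, not a detail from any actual engagement; the command is assembled and echoed rather than executed, since running it requires a live Hadoop edge node.

```shell
# Build an illustrative Sqoop import command (all connection details are
# hypothetical placeholders). On a real edge node the string would be run
# directly; here it is only assembled and printed for inspection.
SQOOP_CMD="sqoop import \
  --connect jdbc:mysql://db.example.com/sales \
  --username etl_user -P \
  --table orders \
  --target-dir /data/raw/orders \
  --num-mappers 4"
echo "$SQOOP_CMD"

# The reverse direction (HDFS -> RDBMS) uses `sqoop export` with
# --export-dir pointing at the HDFS data to push back out.
```

The `-P` flag prompts interactively for the database password instead of placing it on the command line, and `--num-mappers` controls how many parallel map tasks split the import.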

TECHNICAL SKILLS

Programming: C, C++, Java, VB.NET, Python, Shell Scripting, and SAP ABAP

Database skills: MySQL, MS Access, PostgreSQL

ERP Skills: SAP Systems Configuration, End to End Business Processes in SAP

Software: Hadoop, MS Office (including MS Project, Access, Visio, Excel)

Operating Systems: Windows 10/8.1/7/XP/2000/9x, Linux, UNIX, Mac OS

Hardware: Assembling of PC, Networking & Trouble shooting

PROFESSIONAL EXPERIENCE

Confidential - Michigan

Hadoop Admin

Responsibilities:

  • Involved in the end-to-end process of Hadoop cluster setup, including installation, configuration, and monitoring of the Hadoop cluster.
  • Administered cluster maintenance, commissioning and decommissioning of DataNodes, cluster monitoring, and troubleshooting.
  • Performed adding/removing of nodes to/from an existing Hadoop cluster.
  • Implemented backup configurations and recoveries from a NameNode failure.
  • Monitored systems and services; handled architecture design and implementation of Hadoop deployment, configuration management, backup, and disaster recovery systems and procedures.
  • Configured various property files such as core-site.xml, hdfs-site.xml, and mapred-site.xml based upon the job requirements.
  • Performed importing and exporting of data into HDFS using Sqoop.
  • Installed various Hadoop ecosystem components and Hadoop daemons.
  • Installed and configured HDFS, ZooKeeper, MapReduce, YARN, HBase, Hive, Sqoop, Ansible, and Oozie.
  • Integrated Hive and HBase to perform analysis on data.
  • Managed and reviewed Hadoop Log files as a part of administration for troubleshooting purposes. Communicated and escalated issues appropriately.
  • Applied standard backup policies to ensure high availability of the cluster.
  • Involved in analyzing system failures, identifying root causes, and recommending courses of action. Documented system processes and procedures for future reference.
  • Worked with systems engineering team to plan and deploy new Hadoop environments and expand existing Hadoop clusters.
  • Involved in Installing and configuring Kerberos for the authentication of users and Hadoop daemons.
  • Monitored Clusters with Ganglia and Nagios
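The property files named above (core-site.xml, hdfs-site.xml) follow Hadoop's standard `<property>` layout. A minimal excerpt might look like the following; the NameNode hostname, port, replication factor, and excludes-file path are illustrative assumptions, not values from this deployment:

```xml
<!-- core-site.xml: where clients find the NameNode (hostname is hypothetical) -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://namenode.example.com:8020</value>
  </property>
</configuration>

<!-- hdfs-site.xml: per-cluster HDFS settings (values are illustrative) -->
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>3</value>
  </property>
  <property>
    <name>dfs.hosts.exclude</name>
    <value>/etc/hadoop/conf/dfs.exclude</value>
  </property>
</configuration>
```

Listing a DataNode's hostname in the dfs.hosts.exclude file and then running `hdfs dfsadmin -refreshNodes` tells the NameNode to begin decommissioning that node, which is the mechanism behind the commissioning/decommissioning work described above.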

Environment: Hadoop, HDFS, ZooKeeper, MapReduce, YARN, HBase, Hive, Sqoop, Ansible, Oozie, Linux (CentOS, Ubuntu, Red Hat), Big Data Cloudera CDH, Hortonworks, Apache Hadoop, SQL*Plus, Shell Scripting

Confidential - Harrisburg, PA

Associate Software Engineer

Responsibilities:

  • Worked as an engineer on the Big Data team with Hadoop and its ecosystem.
  • Used Flume to collect, aggregate and store the web log data onto HDFS.
  • Wrote Pig scripts to run ETL jobs on the data in HDFS.
  • Used Hive to do analysis on the data and identify different correlations.
  • Gained knowledge of installation and configuration of Cloudera Hadoop in single-node and cluster environments.
  • Worked on setting up of environment and re-configuration activities.
  • Developed and maintained HiveQL and Pig scripts.
  • Facilitated testing across different dimensions.
  • Worked on a project involving file transmission and electronic data interchange (EDI): trade capture, verification, processing, and routing operations; banking report generation; and operational management.
  • Worked in a production support environment.
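The Flume pipeline described above (web log data collected, aggregated, and stored on HDFS) is normally defined in an agent properties file rather than in code. A minimal sketch follows; the agent name `a1`, the spool directory, and the HDFS path are hypothetical placeholders:

```properties
# Hypothetical Flume agent: watch a spool directory, buffer in memory, land on HDFS
a1.sources = r1
a1.channels = c1
a1.sinks = k1

a1.sources.r1.type = spooldir
a1.sources.r1.spoolDir = /var/log/webapp/spool
a1.sources.r1.channels = c1

a1.channels.c1.type = memory
a1.channels.c1.capacity = 10000

a1.sinks.k1.type = hdfs
a1.sinks.k1.hdfs.path = /data/weblogs/%Y-%m-%d
a1.sinks.k1.hdfs.fileType = DataStream
a1.sinks.k1.channel = c1
```

Started with `flume-ng agent --name a1 --conf-file a1.properties`, an agent like this reproduces the collect-aggregate-store flow with configuration alone.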

Environment: Apache Hadoop, Pig, Hive, Sqoop, Flume, MySQL, Application Server, Linux OS, Windows OS, Mac OS

Confidential

Associate Engineer

Responsibilities:

  • Worked on MySQL and successfully ran queries to provide required data to the department.
  • Integrated MySQL with .NET to develop applications with an improved user interface.
  • Collaborated with the IT department and other departments and shared experiences with the other staff in the department to solve issues in information systems
  • Provided Support to clients on SAP systems through Email, phone and in person
  • Worked on SAP ECC 6.0 EHP 7 and CRM
  • Maintained SAP, SharePoint, and Perspective Content 7 systems for required data availability.
  • Generated and analyzed reports in SAP and Crystal Reports for the department, and prepared documents to meet department requirements within specified deadlines.

Environment: MySQL, ERP, SAP, SAP ECC 6.0 EHP 7, CRM, Crystal Reports, Linux, Unix, Windows, MS Office, .NET, C#
