
Sr. Hadoop Admin, Bigdata Engineer Resume


St Louis, MO

SUMMARY

  • Result-driven IT professional with 10+ years of experience as a Hadoop Administrator and Oracle Database Administrator, implementing large-scale production environments across heterogeneous systems including UNIX, Linux, AIX, HP-UX and Windows.
  • Hadoop Administrator on Cloudera clusters, with good experience across all major Hadoop distributions (Apache Hadoop, Cloudera, Hortonworks and MapR).
  • Hands-on Hadoop administration experience in the storage and processing layers and ecosystem tools: HDFS, NameNode, DataNode, YARN, JournalNode, ZooKeeper, Sqoop, Pig, Hive, HBase, MapReduce, Core Java and Oozie.
  • Experience installing and configuring clusters (Cloudera, Hortonworks (Ambari) and MapR) on Red Hat Linux.
  • Experience with Hive and HBase for data manipulation in Hadoop.
  • Experience developing applications in Java on Hadoop's MapReduce programming model.
  • Experience with performance tuning at all Hadoop layers: storage, processing and the ecosystem.
  • Hands-on experience with Oracle Real Application Clusters (RAC 10g/11g/12c), GoldenGate, Exadata and Data Guard.
  • Experience as an Oracle DBA across various domains, delivering new products and upgrades with Oracle 11g/10g/9i/8i on Sun Solaris, Red Hat Linux, SUSE, HP-UX, IBM AIX and Windows platforms.
  • Expert in installing high-availability RAC environments (2-, 3- and 4-node clusters) and upgrading standalone databases to RAC.
  • Experience in RAC setup, configuration and cluster interconnects with Automatic Storage Management (ASM); configured user logon load balancing and failover on RAC databases for high availability.
  • Experience upgrading databases from 10gR2 to 11g, 10gR1 to 10gR2, and 9i to 10gR1.
  • Proficient in creating GoldenGate Extract processes, data pumps and Replicats of all types: classic, integrated and coordinated.
  • Experience using OEM, TOAD, data migrations using export/import, ATG, and benchmarking using load-testing tools.
  • Good knowledge of ITIL processes and concepts for managing incidents and change requests.
  • Good working knowledge of Exadata configurations (1/8, quarter, half and full rack) and the InfiniBand interconnect.
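The GoldenGate extract, data-pump and replicat setup described above can be sketched roughly as follows. This is a minimal illustrative GGSCI session, not from any actual engagement: the process names (ext1, pmp1, rep1), trail paths and remote host are placeholders.

```shell
# Hypothetical GGSCI session: a classic Extract reading redo, a data-pump
# Extract shipping the local trail to the target, and a Replicat applying it.
# (Extract/pump run on the source system; the Replicat on the target.)
ggsci <<'EOF'
ADD EXTRACT ext1, TRANLOG, BEGIN NOW
ADD EXTTRAIL ./dirdat/lt, EXTRACT ext1

ADD EXTRACT pmp1, EXTTRAILSOURCE ./dirdat/lt
ADD RMTTRAIL ./dirdat/rt, EXTRACT pmp1

ADD REPLICAT rep1, EXTTRAIL ./dirdat/rt

START EXTRACT ext1
START EXTRACT pmp1
EOF
```

Each process also needs a parameter file under dirprm (EXTRACT/REPLICAT name, connection credentials, TABLE/MAP clauses) before it will start.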

TECHNICAL SKILLS

Frameworks: Hadoop (HDFS, MapReduce, Sqoop, Pig, Hive, HBase, Oozie), AWS S3, Glacier, MySQL, Teradata, Redshift, JIRA, Shell Scripting

Databases: Oracle 11g/10g/9i/8i (ASM, Grid, Real Application Clusters), MySQL

Tools: asmcmd, OEM, RMAN, Data Pump, VNC Viewer, PuTTY, AWR, lsnrctl, tnsping, OPatch, EditPlus, GoldenGate, WinSCP, Quality Center, ADDM, PL/SQL Developer, Oracle APEX

RAC Utilities: crsctl, srvctl, ocrconfig, ocrdump, ocrcheck, oifcfg, asmcmd, TAF

Data Guard: Physical Standby, Active Data Guard, and Snapshot Standby

Programming: Oracle SQL, PL/SQL, C, C++, HTML, CSS, JavaScript, Unix Shell Scripting

Hadoop Distributions: Cloudera, Hortonworks, MapR

Operating Systems: Red Hat Linux, Oracle Enterprise Linux 4/5.5/6, Sun Solaris, MS Windows 7, VMware

Data Replication: Oracle GoldenGate 11gR2, 12c

Ticketing Tools: Remedy, Maximo, SAP CRM, IRIS, HPSM, ServiceNow

PROFESSIONAL EXPERIENCE

Confidential - St. Louis, MO

Sr. Hadoop Admin, BigData Engineer

Responsibilities:

  • Effectively involved in Hadoop HDFS, MapReduce and other ecosystem projects.
  • Installed and configured Hadoop cluster distributions such as Cloudera.
  • Worked with the Cloudera support team to fine-tune the cluster.
  • Developed MapReduce jobs to analyze data and provide heuristic reports.
  • Added, decommissioned and rebalanced nodes.
  • Created a POC to store server log data in MySQL to identify system alert metrics.
  • Configured rack awareness.
  • Configured client machines.
  • Configured monitoring and management tools.
  • Provided HDFS support and maintenance.
  • Set up cluster HA per business requirements.
  • Applied patches and performed version upgrades.
  • Incident management, problem management and change management.
  • Performance management and reporting.
  • Recovered from NameNode failures.
  • Scheduled MapReduce jobs with the FIFO, Fair Share and Capacity schedulers.
  • Installed and configured other open-source software such as Pig, Hive, HBase, Flume and Sqoop.
  • Integrated with RDBMSs using Sqoop and JDBC connectors.
  • Worked with the development team to tune jobs; knowledgeable in writing Hive jobs.
  • Worked with data delivery teams to set up new Hadoop users, including Linux user setup.
  • Performed cluster/node migrations and upgrades (new hardware, new OS version and/or new Hadoop version) with OS or Hadoop tools or manually.
  • Handled Ganglia and Nagios alerts covering CPU, memory, storage and network.
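The decommissioning, rebalancing and Sqoop-integration work above corresponds to commands of roughly this shape on Apache Hadoop 2.x. This is a hedged sketch: the host names, excludes-file path, database URL and table names are placeholders.

```shell
# Decommission a DataNode: list it in the excludes file referenced by
# dfs.hosts.exclude in hdfs-site.xml, then have the NameNode re-read it.
echo "old-dn01.example.com" >> /etc/hadoop/conf/dfs.exclude
hdfs dfsadmin -refreshNodes
hdfs dfsadmin -report        # node should report "Decommission in progress"

# Rebalance block placement to within 10% of the cluster-wide average.
hdfs balancer -threshold 10

# RDBMS integration: Sqoop import over the MySQL JDBC connector.
sqoop import --connect jdbc:mysql://db01.example.com/metrics \
  --username etl -P --table server_logs --target-dir /data/server_logs
```

The same excludes-file mechanism works in reverse for commissioning: remove the host, add it to the includes file if one is configured, and run -refreshNodes again.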

Environment: Cloudera Manager 5.9.0, RHEL 6, MySQL 5.6, JDK 1.8, Apache Hadoop 2.6.0, Apache Sqoop 1.4.6, Pig 0.13.0, Hive 1.2.0, HBase 1.1.0, Oozie 4.1.0, Spark 1.6.0, Kafka 0.9.0 and Apache Zookeeper 3.4

Confidential - Dallas, TX

Hadoop Admin

Responsibilities:

  • Responsible for implementation and ongoing administration of Hadoop infrastructure on the MapR distribution.
  • Prepared and performed Hadoop software installations.
  • Added and configured nodes.
  • Monitored cluster health and performed troubleshooting.
  • Managed the metadata databases.
  • Managed and reviewed backups.
  • Moved data efficiently between clusters using DistCp.
  • Restored data in case of physical data loss (e.g. block corruption).
  • Restored data for project-specific requirements.
  • Rebalanced HDFS.
  • Regularly scheduled and automated statistics runs.
  • Administered quotas, with notification in case of space problems.
  • Moved/shifted nodes, roles and services to other nodes.
  • Performed cluster/node migrations (new hardware, new OS version and/or new Hadoop version) with OS or Hadoop tools or manually.
  • Processed events from the Hadoop logs, took corrective measures and involved the relevant teams when necessary.
  • Tuned the cluster by changing settings (e.g. HBase, Hive, NoSQL).
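The DistCp and quota-administration bullets above map to commands along these lines (illustrative only; the NameNode addresses, paths and the 10 TB limit are placeholders).

```shell
# Copy a dataset between clusters with DistCp (runs as a MapReduce job,
# so it scales out and can be bandwidth-limited if needed).
hadoop distcp hdfs://prod-nn:8020/data/events hdfs://dr-nn:8020/data/events

# Quota administration: cap the directory's raw disk usage at 10 TB,
# then review configured quota vs. actual consumption.
hdfs dfsadmin -setSpaceQuota 10t /data/events
hdfs dfs -count -q -h /data/events
```

Note that the space quota counts raw bytes after replication, so with the default replication factor of 3 a 10 TB quota holds roughly 3.3 TB of logical data.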

Environment: MCS 5.2.0, 5.9.0, RHEL 6, MySQL 5.6, JDK 1.8, Apache Hadoop 2.7.0, Apache Sqoop 1.4.6, Pig 0.15.0, Hive 1.2.0, HBASE 1.1.0, Oozie 4.2.0, Spark 1.6.0, and Apache Zookeeper 3.4

Confidential

Hadoop Admin

Responsibilities:

  • Installed and configured Hadoop ecosystem tools Sqoop, Flume, HBase, Zookeeper and Oozie.
  • Configured various property files such as core-site.xml, hdfs-site.xml and mapred-site.xml based on job requirements.
  • Managing alerts from cluster monitoring tools like Ganglia and Nagios.
  • Managing and reviewing Hadoop log files.
  • Performance tuning of Hadoop cluster and Hadoop jobs.
  • Disk space management and monitoring.
  • Importing and exporting data into HDFS using Sqoop.
  • Performing data balancing on clusters.
  • Managing HDFS cluster users, permissions.
  • Analyzed system failures, identified root causes and recommended courses of action; documented system processes and procedures for future reference.
  • Taking backup of Hadoop metadata using snapshots.
  • Working together with infrastructure, network and application teams to guarantee high data quality and availability.
  • Install Hadoop patches and version upgrades when required.
  • Decommissioning and commissioning the Node on running Hadoop cluster.
  • Worked with the service providers to resolve tickets raised by various business teams.
  • Copied data from one cluster to another using the DistCp utility.
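The snapshot-based metadata/data backups mentioned above can be sketched as follows (a minimal example; the warehouse path and snapshot name are placeholders).

```shell
# Enable snapshots on a directory (an admin-only, one-time step),
# take a named snapshot, then list the snapshots that exist.
hdfs dfsadmin -allowSnapshot /user/hive/warehouse
hdfs dfs -createSnapshot /user/hive/warehouse nightly-backup
hdfs dfs -ls /user/hive/warehouse/.snapshot
```

Snapshots are read-only, point-in-time views stored under the hidden .snapshot directory; restoring a file is an ordinary copy back out of that path.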

Environment: Cloudera Manager 5.9.0, RHEL 6, MySQL 5.6, JDK 1.8, Apache Hadoop 2.6.0, Apache Sqoop 1.4.6, Pig 0.13.0, Hive 1.2.0, HBASE 1.1.0, Oozie 4.1.0, Spark 1.6.0, Kafka 0.9.0 and Apache Zookeeper 3.4
