Job seekers, please send resumes to firstname.lastname@example.org
- 10+ years of experience
- Deploy the Hadoop Distributed File System (HDFS) and related ecosystem technologies (Hive, Pig, HBase/Cassandra, Flume, ZooKeeper) using both automated toolsets and manual processes.
- Maintain, support, and upgrade Hadoop clusters.
- Monitor jobs, queues, and HDFS capacity using ZooKeeper and vendor-specific front-end cluster management tools.
- Balance, commission, and decommission cluster nodes.
- Apply security (Kerberos/OpenLDAP/SAML) integrated with Active Directory and/or LDAP.
- Enable users to view job progress via web interface.
- Onboard users onto Hadoop: configuration, access control, disk quotas, permissions, etc.
- Address all issues, apply upgrades and security patches.
- Commission/decommission nodes; perform backup and restore.
- Apply "rolling" cluster node upgrades in a production-level environment.
- Rack newly purchased hardware with switches, assign IP addresses, configure firewalls, enable/disable ports, set up VPNs, etc.
- Work with the virtualization team to provision and manage HDP cluster components.
- Flexible work hours
- Minimum 3 years of Linux/Unix administration.
- Minimum 3 years of experience with (Cloudera/Hortonworks) Hadoop Administration.
- Extensive experience in Hadoop ecosystem including Spark, MapReduce, HDFS, Hive, HBase, and Zeppelin.
- 1 year experience with Hadoop-specific automation (e.g. blueprints).
- 1 year of technical experience managing Hadoop cluster infrastructure (e.g., datacenter infrastructure) with at least 20 data nodes.
- Experience scripting in one or more of Python, bash, PowerShell, Perl, Java.
- 1 year of experience with Puppet and/or Chef (nice to have, not mandatory).
- 1 year of virtualization experience with VMware, Hyper-V, or KVM (nice to have, not mandatory).
- Ability to thrive in a dynamic technology environment.
- Certified Hadoop Admin (Hortonworks/Cloudera)
- Red Hat certified
- Networking (TCP/IP, Routers, IP addressing, use of network tools)
- Analyzing data with Hive, Pig and/or HBase
- Data ingestion, streaming, and importing/exporting RDBMS data using Sqoop.
- DBA experience.
- RDBMS SQL Development.
- Manage cluster-hardening activities.
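As context for the node commissioning/decommissioning duties listed above, here is a minimal sketch of how HDFS decommissioning is typically configured in stock Apache Hadoop (the property name is standard; the file path shown is an assumption and varies by distribution):

```xml
<!-- hdfs-site.xml: point the NameNode at a hosts-exclude file
     (the path below is an example, not a required location) -->
<property>
  <name>dfs.hosts.exclude</name>
  <value>/etc/hadoop/conf/dfs.exclude</value>
</property>
```

In practice, an admin adds the target DataNode's hostname to the exclude file and runs `hdfs dfsadmin -refreshNodes`; the NameNode then drains the node's block replicas before it can be safely removed from the rack.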
For faster processing, call (202) 719-0200, Ext. 207.