Hadoop Admin/Developer Resume
TX
SUMMARY
- Responsible for providing database administration, database security, and production support. Installs, upgrades, and maintains databases. Plans and executes database backup, archiving, and recovery.
- Provides database tuning for performance and stability. Manages physical database design and implementation, as well as database security implementation and monitoring.
- Uses the client's change management processes to prevent production disruptions and outages. Partners with other client IT organizations to ensure quality support and delivery to the end user. Participates in an on-call rotation with other DBAs to provide 24x7 coverage.
- Senior-level professional with 7-10 years of experience. Understands advanced aspects of the discipline and is viewed as an expert in the field. Applies a broad range of competencies to develop solutions to complex problems.
- Influences others to achieve objectives. Often provides specialized technical and functional guidance to others within the department and/or business asset.
TECHNICAL SKILLS
Hortonworks: Hadoop (HDP), on-prem and cloud; HDF
Big Data tools: HBase, Hive, Phoenix, Ranger, and other services in Ambari.
Shell Scripting: Bash, Perl, Python
Streaming experience: NiFi, Kafka, Storm, Spark, Solr, etc.
NoSQL experience: HBase, Cassandra, MongoDB, etc.
Linux: advanced knowledge, including shell scripting.
ODBC/JDBC: with various clients such as Spotfire, BOBJ, and Tableau (a sample connectivity check follows this list)
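A minimal, hypothetical sketch of the kind of JDBC connectivity check run before pointing BI clients (Spotfire, BOBJ, Tableau) at HiveServer2; the host name, realm, and principal below are placeholder assumptions, not values from a specific cluster:

```bash
# Obtain a Kerberos ticket for a test user (placeholder principal/realm)
kinit analyst@EXAMPLE.COM

# Verify the Kerberized HiveServer2 JDBC endpoint with beeline before
# handing the same JDBC URL to the BI tools
beeline -u "jdbc:hive2://hiveserver.example.com:10000/default;principal=hive/_HOST@EXAMPLE.COM" \
        -e "SHOW DATABASES;"
```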
PROFESSIONAL EXPERIENCE
Hadoop Admin/Developer
Confidential, TX
Responsibilities:
- Experience installing, upgrading, configuring, and maintaining a Hadoop cluster.
- Responsible for the implementation and ongoing administration of Hadoop infrastructure for some or all of the big data systems.
- Cluster maintenance as well as creation and removal of nodes.
- HDFS support and maintenance.
- Set up security using Kerberos and AD on Hortonworks clusters.
- Set up Linux users and Kerberos principals, and test HDFS, Hive, Pig, and MapReduce access for the new users (a minimal sketch of this workflow appears after this list).
- Automate operations, installation, and monitoring of the Hadoop framework, specifically HDFS, MapReduce, YARN, and HBase.
- Automate the setup of Hadoop clusters (see the Ambari API sketch after this list).
- Write, debug, and maintain automation scripts and jobs.
- Continuously evaluate Hadoop infrastructure requirements and design/deploy solutions (high availability, big data clusters, etc.).
- Cluster monitoring and troubleshooting.
- Manage and review Hadoop log files (a basic health and log check sketch follows this list).
- Work with application teams to install operating system and Hadoop updates, patches, and version upgrades as required.
- On-call responsibilities: create documentation, resolve support tickets, and meet business SLAs.
- Support production environment availability.
- Experience working with Hadoop architecture and big data users to implement new Hadoop ecosystem technologies to support a multi-tenant cluster.
- Experience working with vendors and user communities to research and test new technologies to improve the technical capabilities of existing Hadoop clusters.
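A minimal sketch of the user-onboarding workflow referenced above (Linux account, Kerberos principal, HDFS smoke test), assuming MIT Kerberos with local kadmin access; the user name, realm, and keytab path are placeholder assumptions:

```bash
#!/usr/bin/env bash
# Placeholder user, realm, and keytab path -- adjust per cluster.
NEW_USER="jdoe"
REALM="EXAMPLE.COM"
KEYTAB="/etc/security/keytabs/${NEW_USER}.keytab"

# Create the OS account on the edge node
sudo useradd -m "${NEW_USER}"

# Create the Kerberos principal and export a keytab (MIT Kerberos kadmin)
sudo kadmin.local -q "addprinc -randkey ${NEW_USER}@${REALM}"
sudo kadmin.local -q "xst -k ${KEYTAB} ${NEW_USER}@${REALM}"
sudo chown "${NEW_USER}" "${KEYTAB}"

# Authenticate as the new user and smoke-test HDFS access
# (assumes the user's HDFS home directory has already been provisioned)
sudo -u "${NEW_USER}" kinit -kt "${KEYTAB}" "${NEW_USER}@${REALM}"
sudo -u "${NEW_USER}" hdfs dfs -put /etc/hosts "/user/${NEW_USER}/smoke_test"
sudo -u "${NEW_USER}" hdfs dfs -ls "/user/${NEW_USER}"
```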
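A hypothetical sketch of the kind of Ambari REST calls used when automating cluster setup and service management; the Ambari host, cluster name, and credentials are placeholder assumptions:

```bash
#!/usr/bin/env bash
# Placeholder Ambari server, cluster name, and credentials.
AMBARI="http://ambari.example.com:8080"
CLUSTER="hdp_prod"
AUTH="admin:admin"

# Check the current state of the HDFS service
curl -s -u "${AUTH}" -H 'X-Requested-By: ambari' \
  "${AMBARI}/api/v1/clusters/${CLUSTER}/services/HDFS?fields=ServiceInfo/state"

# Stop HDFS (target state INSTALLED), then start it again (target state STARTED)
curl -s -u "${AUTH}" -H 'X-Requested-By: ambari' -X PUT \
  -d '{"RequestInfo":{"context":"Stop HDFS via API"},"Body":{"ServiceInfo":{"state":"INSTALLED"}}}' \
  "${AMBARI}/api/v1/clusters/${CLUSTER}/services/HDFS"

curl -s -u "${AUTH}" -H 'X-Requested-By: ambari' -X PUT \
  -d '{"RequestInfo":{"context":"Start HDFS via API"},"Body":{"ServiceInfo":{"state":"STARTED"}}}' \
  "${AMBARI}/api/v1/clusters/${CLUSTER}/services/HDFS"
```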
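A basic health and log check of the sort run during routine cluster monitoring; the log path follows common HDP defaults and is an assumption, not a specific cluster's layout:

```bash
#!/usr/bin/env bash
# Cluster-wide HDFS summary: capacity, live/dead DataNodes, under-replicated blocks
sudo -u hdfs hdfs dfsadmin -report | head -n 25

# File system consistency summary
sudo -u hdfs hdfs fsck / | tail -n 25

# Surface recent ERROR/FATAL lines from the NameNode log (typical HDP default path)
grep -hE 'ERROR|FATAL' /var/log/hadoop/hdfs/hadoop-hdfs-namenode-*.log | tail -n 50
```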