Cloudera Hadoop Administrator Resume
Beltsville, MD
SUMMARY:
- 8+ years of total experience in Information Technology.
- 2.5+ years as a Hadoop Administrator with extensive experience in HDFS, MapReduce, YARN, Hortonworks, Cloudera, and the Hadoop ecosystem.
- 3+ years as a multifaceted, performance-focused AWS Solution Architect with hands-on experience designing, deploying, configuring, and maintaining highly available, scalable, and fault-tolerant systems.
- Design, planning, and implementation
- Hadoop cluster management, troubleshooting, and process optimization
- AWS solution architecture, cost and performance optimization
PROFESSIONAL EXPERIENCE:
Confidential, Beltsville, MD
Cloudera Hadoop Administrator
Responsibilities:
- Hadoop cluster setup, installation, configuration, upgrades, backup and recovery, and administration of multi-node clusters using Cloudera Manager.
- Strong understanding of Hadoop architecture and the MapReduce framework.
- Configured Hadoop ecosystem tools including Pig, Hive, HBase, Sqoop, Flume, Kafka, Oozie, Spark, and ZooKeeper.
- HDFS support and maintenance
- Expertise in cluster maintenance, bug fixing, troubleshooting and monitoring along with proper backup and recovery strategies.
- Experience in deploying and managing multi-node development, testing, and production Hadoop clusters with different Hadoop components using Cloudera Manager.
- Experience in commissioning, decommissioning, balancing, and managing cluster nodes.
- Strong knowledge of setting up high availability (HA) for the NameNode, ResourceManager, and HBase.
- Provisioned, installed, configured, monitored, and maintained HDFS, YARN, HBase, Flume, Sqoop, Oozie, Pig, and Hive.
- Monitored workload, job performance, and capacity planning using Cloudera Manager (see the monitoring sketch after this list).
- Implemented and managed secure authentication and authorization for Hadoop clusters using Kerberos and Apache Ranger.
- Experienced in setting up Linux environments: configuring SSH, creating file systems, disabling firewalls, tuning swappiness, and configuring SELinux.
- Experience monitoring and troubleshooting issues with Linux memory, CPU, OS, storage and network.
- Experience in upgrading Hadoop clusters through both minor and major version upgrades.
- Installed, upgraded, and managed Hadoop clusters on the Cloudera distribution.
- Collaborated with multiple teams for design and implementation of big data clusters in cloud environments.
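
The following is a minimal, illustrative sketch of the kind of cluster health monitoring described above, polling the Cloudera Manager REST API; the host, port, credentials, cluster name, and API version are placeholders and vary by environment.

```python
# Illustrative sketch: poll the Cloudera Manager REST API for service health.
# Host, credentials, cluster name, and API version are placeholders.
import requests

CM_HOST = "https://cm.example.com:7183"   # hypothetical Cloudera Manager host
CLUSTER = "cluster1"                      # hypothetical cluster name
AUTH = ("admin", "change-me")             # replace with real credentials

def service_health(cluster=CLUSTER):
    """Return a {service name: health summary} map for one cluster."""
    url = f"{CM_HOST}/api/v19/clusters/{cluster}/services"
    resp = requests.get(url, auth=AUTH, timeout=30)
    resp.raise_for_status()
    return {svc["name"]: svc.get("healthSummary", "UNKNOWN")
            for svc in resp.json()["items"]}

if __name__ == "__main__":
    for name, health in sorted(service_health().items()):
        print(f"{name}: {health}")
```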
Confidential, Auburn, Alabama
AWS Solution Architect / Hadoop Admin
Responsibilities:
- Hands-on experience installing, configuring, and using Hadoop ecosystem components such as HDFS, YARN, MapReduce, HBase, Oozie, Hive, Sqoop, Pig, Ranger, and Kerberos.
- Handled importing of data from various data sources, performed transformations using Hive and MapReduce, loaded data into HDFS, and extracted data from MySQL into HDFS using Sqoop (see the Sqoop sketch after this list).
- Administered the Kafka messaging platform and worked with its core components.
- Focused on high availability, fault tolerance, and auto scaling using AWS CloudFormation.
- Provisioned, configured, and managed various AWS services including EC2, AMIs, RDS, VPCs and subnets, S3 buckets, EBS, Glacier, CloudWatch, CloudFront, and Route 53.
- Imported and exported data between relational databases and HDFS using Sqoop.
- Configured various performance metrics and alarms using AWS CloudWatch and CloudTrail (see the CloudWatch sketch after this list).
- Configured cross-account deployments using AWS CodePipeline, CodeBuild, and CodeDeploy by creating cross-account policies and roles in IAM.
- Wrote various AWS Lambda functions to automate cloud operations (see the Lambda sketch after this list).
- Used AWS Route 53 to configure high availability and disaster recovery.
- Maintained IAM user accounts and RDS, Route 53, VPC, DynamoDB, SES, SQS, and SNS services in the AWS cloud.
- Created AMIs of mission-critical production servers for backup.
- Responsible for migrating data to AWS Redshift, including developing and distributing data across nodes and clusters in different Availability Zones.
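
As a rough illustration of the Sqoop imports mentioned above, the sketch below shells out to the Sqoop CLI from Python; the connection string, table, password file, and HDFS target directory are hypothetical.

```python
# Illustrative sketch: import a MySQL table into HDFS with Sqoop.
# Connection string, table name, and target directory are hypothetical.
import subprocess

sqoop_cmd = [
    "sqoop", "import",
    "--connect", "jdbc:mysql://mysql.example.com:3306/sales_db",  # hypothetical source database
    "--username", "etl_user",
    "--password-file", "/user/etl_user/.mysql.password",          # password kept in HDFS, not on the command line
    "--table", "orders",                                          # hypothetical table
    "--target-dir", "/data/raw/orders",                           # HDFS landing directory
    "--num-mappers", "4",                                         # parallel map tasks
]
subprocess.run(sqoop_cmd, check=True)
```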
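
A minimal sketch of the kind of Lambda automation and AMI backup work noted above: a Python handler that images EC2 instances carrying a hypothetical Backup=true tag. The tag key, naming scheme, and absence of a retention policy are assumptions.

```python
# Illustrative AWS Lambda handler: create AMIs of instances tagged Backup=true.
# The tag key/value and AMI naming scheme are assumptions for this sketch.
import datetime
import boto3

ec2 = boto3.client("ec2")

def lambda_handler(event, context):
    stamp = datetime.datetime.utcnow().strftime("%Y%m%d-%H%M%S")
    created = []
    reservations = ec2.describe_instances(
        Filters=[{"Name": "tag:Backup", "Values": ["true"]}]
    )["Reservations"]
    for reservation in reservations:
        for instance in reservation["Instances"]:
            instance_id = instance["InstanceId"]
            image = ec2.create_image(
                InstanceId=instance_id,
                Name=f"{instance_id}-backup-{stamp}",
                NoReboot=True,  # avoid rebooting production servers
            )
            created.append(image["ImageId"])
    return {"created_amis": created}
```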
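
And a short sketch of defining a CloudWatch alarm with boto3, in the spirit of the metric configuration above; the alarm name, instance ID, threshold, and SNS topic ARN are placeholders.

```python
# Illustrative sketch: create a CloudWatch CPU alarm for a single EC2 instance.
# Names, the threshold, and the SNS topic ARN are placeholders.
import boto3

cloudwatch = boto3.client("cloudwatch")

cloudwatch.put_metric_alarm(
    AlarmName="high-cpu-web-01",                    # hypothetical alarm name
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],
    Statistic="Average",
    Period=300,                                     # 5-minute periods
    EvaluationPeriods=2,                            # two consecutive breaches
    Threshold=80.0,                                 # percent CPU
    ComparisonOperator="GreaterThanThreshold",
    AlarmActions=["arn:aws:sns:us-east-1:123456789012:ops-alerts"],  # hypothetical SNS topic
)
```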
Confidential
Network Engineer
Responsibilities:
- Constructed and installed telephone subscriber lines.
- Built transmission links (optical fiber, radio relay) connected to Confidential switching systems.
- Configured Cisco equipment (routers, switches) and performed basic troubleshooting.
- Deployed LAN and WAN networks at client sites.