Hadoop Administrator Resume
Lansing, MI
SUMMARY:
- A qualified technocrat and seasoned professional offering 7+ years of IT experience in Hadoop administration across Apache, Cloudera, and Hortonworks distributions, working with Cassandra (NoSQL database) and product development using the Agile model.
- Results-driven professional, recognized for taking on major initiatives, adapting to rapidly changing environments, and resolving mission-critical issues to ensure bottom-line success.
- A visionary leader with strong communication, team-building, management, interpersonal, and analytical skills.
Hadoop Administration Expertise:
- Built a real-time big-data solution on the HDP stack and HBase, handling billions of records.
- Experience in installing, configuring, and setting up HBase, and working with various HBase data-loading techniques.
- Proficient in Hadoop configuration, monitoring, and logging activities across the Hadoop ecosystem.
- Tuned Hadoop clusters and designed and implemented job-specific configurations for business applications.
- Skilled in installing the Hadoop ecosystem, creating its backups, and maintaining them.
- Experience in setting up Hadoop clusters and working with MapReduce programs using the NoSQL database HBase.
- Implemented YARN queues for per-application needs to ensure jobs run smoothly.
- Skilled at rebuilding NameNode metadata when it becomes corrupted.
- Experience in importing and exporting data from different databases like MySQL into HDFS and Hive using Sqoop.
- Experience in benchmarking, performing backup and disaster recovery of Name Node metadata and important sensitive data residing on cluster.
- Experience in running YCSB benchmarks on the Hadoop cluster and tuning it according to the application.
- Commissioning and decommissioning of nodes on Hadoop cluster.
- Good knowledge of implementing Kerberos security features to ensure systems run securely and effectively.
- Set up Apache Knox as perimeter security for the Hadoop ecosystem.
- Expertise in monitoring system performance and utilization using Linux utilities.
- Install, configure, and document new servers and applications based on application requirements.
- Proficient in maintaining and auditing user accounts & permissions.
- Set up and managed file-server utilization (shared folders, quotas, etc.) for NAS.
- Track vulnerabilities and apply appropriate patches and upgrades.
Academic Background:
- Bachelor of Technology in electronics and communication engineering, JNTU University 2008.
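The Sqoop-based data movement between MySQL and HDFS/Hive noted above can be sketched as below; the host, database, table, and username are hypothetical placeholders, and the commands assume a live cluster with Sqoop installed:

```shell
# Hypothetical example: import a MySQL table into HDFS, then into Hive.
# Host, database, table, and credentials are placeholders.
sqoop import \
  --connect jdbc:mysql://dbhost.example.com:3306/sales \
  --username etl_user -P \
  --table orders \
  --target-dir /user/etl/orders \
  --num-mappers 4

# The same source loaded directly into a Hive table instead of a raw HDFS dir
sqoop import \
  --connect jdbc:mysql://dbhost.example.com:3306/sales \
  --username etl_user -P \
  --table orders \
  --hive-import --hive-table sales.orders
```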
TECHNICAL SKILLS:
Hadoop Ecosystem: HDFS, MapReduce, YARN, Hive, Pig, Sqoop, Oozie, Flume
Security: Kerberos
Programming Languages: Java, C#, C, SQL, Java Script, HTML
Scripting Languages: Shell Scripting
NoSQL Database: Cassandra
IDEs & Tools: Eclipse, Visual Studio, MS SQL Server, MS Office
Monitoring Tools: Nagios, Ganglia
Operating Systems: Linux RedHat/CentOS, Windows (XP/7/8)
Virtualization technologies: VMware vSphere
WORK EXPERIENCE:
Confidential, Lansing MI
Hadoop Administrator
Roles & Responsibilities:
- Built & Deployed Hadoop clusters with different Hadoop components (HDFS, YARN, HBASE, ZOOKEEPER).
- Orchestrated the Hadoop cluster using Ambari and performed routine housekeeping tasks to keep the system healthy.
- Configured the scheduler on the ResourceManager to provide a way to share large cluster resources.
- Deployed NameNode high availability for the Hadoop cluster with automatic failover control.
- Successfully implemented the HA environment using the ZooKeeper service and quorum journal nodes.
- Implemented rack-aware topology on the Hadoop cluster to ensure data integrity.
- Integrated the Hadoop cluster with Kerberos for secure authentication & authorization.
- Automated workflows using Oozie to maintain job flows.
- Troubleshot issues with Hadoop components and fine-tuned the cluster to run smoothly.
- Regular Ad-Hoc execution of Hive and Pig queries depending upon the use cases.
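The NameNode HA work above can be sketched with the standard `hdfs haadmin` commands; the service IDs `nn1`/`nn2` are illustrative and assume an HA-enabled hdfs-site.xml on a live cluster:

```shell
# Check which NameNode is currently active (nn1/nn2 are illustrative IDs)
hdfs haadmin -getServiceState nn1
hdfs haadmin -getServiceState nn2

# Manually fail over from nn1 to nn2 (automatic failover is normally
# handled by the ZKFC daemons backed by the ZooKeeper quorum)
hdfs haadmin -failover nn1 nn2
```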
Confidential, St. Louis MO
Hadoop Administrator
Responsibilities:
- Benchmarked the Hadoop cluster with TeraGen & TeraSort to ensure applications run smoothly.
- Configured the Fair Scheduler to share cluster resources for resource management.
- Ran filesystem checks (fsck) for blocks & directories on the Hadoop cluster to ensure no blocks are missing.
- Manage the day-to-day operations of the cluster for backup and support.
- Performed minor version and major patch upgrades on Hadoop cluster
- Visualized data using Tableau, with data accessed directly from HDFS and reports populated in Tableau.
- Restructured the platform for applications, including installing and maintaining required components.
- Ensured proper HDFS user & block permissions to keep the system running smoothly.
- Installing and updating packages using YUM.
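A sketch of the routine filesystem and package checks mentioned above; the package name is a placeholder, and the HDFS commands assume shell access to a cluster node:

```shell
# Check HDFS health: report files, blocks, replica locations, and any
# missing or corrupt blocks
hdfs fsck / -files -blocks -locations

# Cluster-wide capacity and DataNode status
hdfs dfsadmin -report

# Install or update a package on a RedHat/CentOS node via YUM
sudo yum update -y some-package   # package name is a placeholder
```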
Confidential, Cary NC
Hadoop Administrator
Responsibilities:
- Developed MapReduce programs to cleanse and parse data in HDFS obtained from various data sources.
- Used Hive data warehouse tool to analyze the unified historic data in HDFS to identify issues and behavioral patterns.
- The Hive tables created as per requirement were internal or external tables defined with proper static and dynamic partitions, intended for efficiency.
- Used the RegEx, JSON, and Avro SerDes packaged with Hive for serialization and de-serialization to parse the contents of streamed log data.
- Implemented custom Hive UDFs to integrate weather and geographical data with business data for comprehensive data analysis.
- Used Oozie workflow engine to manage interdependent Hadoop jobs and to automate several types of Hadoop jobs such as Java map-reduce, Hive and Sqoop as well as system specific jobs.
- Worked along with the Hadoop Operations team in Hadoop cluster planning, installation, maintenance, monitoring and upgrades.
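The partitioned external Hive tables and SerDe usage described above might look roughly like this DDL; the table, columns, and paths are hypothetical, while the JsonSerDe class ships with Hive's hcatalog:

```shell
hive -e "
CREATE EXTERNAL TABLE IF NOT EXISTS web_logs (
  ts STRING,
  user_id STRING,
  action STRING
)
PARTITIONED BY (dt STRING)
ROW FORMAT SERDE 'org.apache.hive.hcatalog.data.JsonSerDe'
LOCATION '/data/raw/web_logs';

-- Register a daily partition after new data lands
ALTER TABLE web_logs ADD IF NOT EXISTS PARTITION (dt='2016-01-01')
LOCATION '/data/raw/web_logs/dt=2016-01-01';
"
```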
Confidential
Software Developer and Linux System Administrator
Responsibilities:
- Application development, Maintenance and Database research activities using JAVA and MySQL.
- Worked as part of a team on the development of a PKI product, Dhruvam®-Lite, which is used to generate, suspend, activate, and revoke digital certificates depending on the request received from the user.
- Created the life cycle of digital certificates and developed the RSA encryption standard using Java Cryptographic Extensions and Bouncy Castle cryptographic APIs.
- Incorporated design patterns such as MVC, Singleton, Abstract Factory, and Factory Method, along with OOP principles.
- Implemented model view controller architecture with the help of JSP, Servlets and Java
- Installation and configuration of Linux for new build environment.
- Created volume groups, logical volumes, and partitions on the Linux servers and mounted file systems on the created partitions.
- Experience with Linux internals, virtual machines, and open source tools/platforms.
- Improve system performance by working with the development team to analyze, identify and resolve issues quickly.
- Ensured data recoverability by implementing system and application level backups.
- Performed various configurations including networking and iptables rules, hostname resolution, and passwordless (key-based) SSH login.
- Managing Disk File Systems, Server Performance, Users Creation and Granting file access Permissions.
- Supported pre-production and production support teams in the analysis of critical services and assisted with maintenance operations.
- Automated administration tasks through shell scripting and cron.
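The cron-driven backup automation mentioned above can be sketched as a small script; the directory names and the 7-day retention window are illustrative:

```shell
#!/bin/sh
# Illustrative backup script: archive a directory with a date stamp and
# prune archives older than 7 days. Paths are placeholders.
SRC="${1:-/tmp/demo_etc}"
DEST="${2:-/tmp/demo_backups}"
mkdir -p "$SRC" "$DEST"
STAMP=$(date +%Y%m%d)
# Archive the source directory relative to its parent
tar -czf "$DEST/config-$STAMP.tar.gz" -C "$(dirname "$SRC")" "$(basename "$SRC")"
# Remove archives older than the retention window
find "$DEST" -name 'config-*.tar.gz' -mtime +7 -delete
echo "backup written: $DEST/config-$STAMP.tar.gz"
```

A crontab entry such as `0 2 * * * /usr/local/bin/backup.sh` (path hypothetical) would run it nightly at 02:00.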
