Hadoop Admin Resume
Irvine, CA
SUMMARY
- 8+ years of professional experience, including around 5 years as a Linux Administrator and over 3 years in Big Data analytics as a Sr. Hadoop/Big Data Administrator.
- Experience in all the phases of Data warehouse life cycle involving Requirement Analysis, Design, Coding, Testing, and Deployment.
- Experience working with business analysts to identify, study, and understand requirements and translate them into ETL code during the Requirement Analysis phase.
- Experience in architecting, designing, installing, configuring, and managing Apache Hadoop clusters across the MapR, Hortonworks, and Cloudera distributions.
- Good understanding of Microsoft Analytics Platform System (APS) and HDInsight.
- Experience in managing the Hadoop infrastructure with Cloudera Manager.
- Good understanding of Kerberos and how it interacts with Hadoop and LDAP.
- Practical knowledge of the functionality of each Hadoop daemon, the interactions between them, resource utilization, and dynamic tuning to keep the cluster available and efficient.
- Experience in understanding and managing Hadoop Log Files.
- Experience in Massively Parallel Processing (MPP) databases such as Microsoft PDW.
- Experience with Hadoop's multiple data processing engines, such as interactive SQL, real-time streaming, data science, and batch processing, handling data stored on a single platform under YARN.
- Experience in adding and removing nodes in a Hadoop cluster.
- Experience in Change Data Capture (CDC) data modeling approaches.
- Experience in managing Hadoop clusters with IBM BigInsights and the Hortonworks Data Platform.
- Experience in extracting data from RDBMS into HDFS using Sqoop (see the sketch after this list).
- Experience with bulk load tools such as DWLoader and moving data from PDW to Hadoop for archiving.
- Experience in collecting logs from log collectors into HDFS using Flume.
- Experience in setting up and managing the batch scheduler Oozie.
- Good understanding of NoSQL databases such as HBase, Neo4j, and MongoDB.
- Experience in analyzing data in HDFS with MapReduce, Hive, and Pig.
- Designed, implemented, and reviewed features and enhancements for Cassandra.
- Experience in tuning large/complex SQL queries and managing alerts from PDW and Hadoop.
- Deployed a Cassandra cluster in cloud environment as per the requirements.
- Experience with UNIX commands and shell scripting.
- Experience in Python Scripting.
- Experience in statistics collection and table maintenance on MPP platforms.
- Experience in creating physical data models for data warehousing.
- Experience in Microsoft SQL Server Integration Services (SSIS).
- Extensively worked on the ETL mappings, analysis and documentation of OLAP reports requirements. Solid understanding of OLAP concepts and challenges, especially with large data sets.
- Proficient in Oracle 9i/10g/11g, SQL, MYSQL and PL/SQL.
- Good understanding of OBIEE end-user tools such as interactive dashboards, advanced reporting and publishing, ad hoc analysis over the Web, proactive detection and alerts, mobile analytics, Microsoft Office integration, Web Services, and business process integration.
- Experience in web development with proficiency in PHP and PHP frameworks such as CakePHP and CodeIgniter, plus JavaScript, NodeJS, AngularJS, CSS, and MySQL.
- Experience in integrating various data sources like Oracle, DB2, Sybase, SQL Server, and MS Access, as well as non-relational sources like flat files, into the staging area.
- Experience in Data Analysis, Data Cleansing (Scrubbing), Data Validation and Verification, Data Conversion, Data Migrations and Data Mining.
- A highly motivated self-starter, hardworking and energetic professional, with a high degree of responsibility and ownership.
- Possess excellent technical skills, consistently delivered ahead of schedule, and bring strong interpersonal and communication skills.
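A minimal sketch of the RDBMS-to-HDFS Sqoop import referenced above; the JDBC URL, credentials, table, and target directory are hypothetical placeholders, not details from any actual engagement.

# Illustrative Sqoop import: pull a MySQL table into HDFS with 4 parallel mappers.
sqoop import \
  --connect jdbc:mysql://dbhost.example.com:3306/sales \
  --username etl_user -P \
  --table orders \
  --target-dir /data/raw/sales/orders \
  --num-mappers 4 \
  --as-textfile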
TECHNICAL SKILLS
Hadoop/Big Data Technologies: Hadoop 2.4.1, HDFS, MapReduce, HBase, Pig, Hive, Sqoop, YARN, Flume, ZooKeeper, Spark, Cassandra, Storm, MongoDB, Hue, Impala, Whirr, Kafka, Mahout and Oozie.
Programming Languages: Java, SQL, PL/SQL, Shell Scripting, Python, Perl
Frameworks: MVC, Spring, Hibernate.
Web Technologies: HTML, XML, JSON, JavaScript, Ajax, SOAP and WSDL
Databases: Oracle 9i/10g/11g, SQL Server, MYSQL
Database Tools: TOAD, Chordiant CRM tool, Billing tool, Oracle Warehouse Builder (OWB).
Operating Systems: Linux, Unix, Windows, Mac, CentOS
Other Concepts: OOPS, Data Structures, Algorithms, Software Engineering, ETL
PROFESSIONAL EXPERIENCE
Confidential, Irvine, CA
Hadoop Admin
Responsibilities:
- Handle the installation and configuration of a Hadoop cluster.
- Build and maintain scalable data pipelines using the Hadoop ecosystem and other open source components like Hive and HBase.
- Handle the data exchange between HDFS and different Web Applications and databases using Flume and Sqoop.
- Good understanding of Microsoft Analytics Platform System (APS) and HDInsight.
- Monitor the data streaming between web sources and HDFS.
- Worked with Kerberos and its integration with Hadoop and LDAP.
- Worked on Kafka, a distributed, partitioned, replicated commit-log service that provides the functionality of a messaging system.
- Closely monitored and analyzed MapReduce job execution on the cluster at the task level.
- Provided input to development on efficient utilization of resources such as memory and CPU based on the running statistics of Map and Reduce tasks.
- Experience working with APIs, the software intermediaries that allow application programs to interact with each other and share data.
- Worked extensively with Amazon Web Services and created Amazon Elastic MapReduce clusters on both 1.0.3 and 2.2.
- Worked with Kerberos, Active Directory/LDAP, and UNIX-based file systems.
- Managed data in Amazon S3 and used s3cmd to move data from the clusters to S3 (see the sketch after this list).
- Presented demos to customers on how to use AWS and how it differs from traditional systems.
- Experience in Continuous Integration, with expertise in the Jenkins and Hudson tools.
- Experience with Nagios and writing Nagios plugins to perform multiple server checks.
- Adjusted cluster configuration properties based on the volume of data being processed and the performance of the cluster.
- Setting up Identity, Authentication, and Authorization.
- Maintained the cluster to keep it healthy and in optimal working condition.
- Handled upgrades and patch updates.
- Set up automated processes to analyze the system and Hadoop log files for predefined errors and send alerts to the appropriate groups (a sketch follows this list).
- Experience in architecting, designing, installing, configuring, and managing Apache Hadoop on the Hortonworks distribution.
- Worked with Avro, JSON, and other serialization and compression formats.
- Worked with UNIX commands and shell scripting.
- Experience in Python Scripting.
- Worked with core technologies including Java, HTTP, XML, and JSON.
- Worked with Microsoft SQL Server Integration Services (SSIS).
- Worked on Spark, a fast and general-purpose cluster computing system.
- Worked on Storm, a distributed real-time computation system that provides a set of general primitives for stream processing.
- Experience with build tools such as Maven.
- Balanced HDFS manually to decrease network utilization and increase job performance (see the sketch after this list).
- Commissioned and decommissioned DataNodes from the cluster when problems arose.
- Set up automated processes to archive and clean unwanted data on the cluster, in particular on the NameNode and Secondary NameNode.
- Handled Massively Parallel Processing (MPP) databases such as Microsoft PDW.
- Set up and managed NameNode High Availability and NameNode federation using Apache Hadoop 2.0 to avoid single points of failure in large clusters.
- Experience with GitHub, a web-based Git repository hosting service that offers all of the distributed revision control and source code management (SCM) functionality of Git along with features of its own.
- Held regular discussions with other technical teams regarding upgrades, process changes, special processing, and feedback.
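A minimal sketch of moving cluster data to Amazon S3 with s3cmd, as mentioned above; the bucket, staging path, and HDFS path are hypothetical.

# Stage a directory out of HDFS, push it to S3 recursively, then verify (paths and bucket are examples).
mkdir -p /tmp/archive
hdfs dfs -get /data/archive/2015-06 /tmp/archive/2015-06
s3cmd put --recursive /tmp/archive/2015-06/ s3://example-cluster-archive/2015-06/
s3cmd ls s3://example-cluster-archive/2015-06/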
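A sketch of the automated log-analysis alerting described above, assuming hypothetical log locations, error patterns, and a mail alias; in practice the check would be scheduled via cron.

#!/bin/bash
# Scan Hadoop and system logs for predefined error patterns and mail any hits to the ops group.
LOGS="/var/log/hadoop/hdfs/*.log /var/log/messages"
PATTERNS="FATAL|OutOfMemoryError|Too many open files"
HITS=$(grep -Eh "$PATTERNS" $LOGS 2>/dev/null | tail -n 50)
if [ -n "$HITS" ]; then
    echo "$HITS" | mail -s "Hadoop log alert on $(hostname)" hadoop-ops@example.com
fi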
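A sketch of the manual HDFS balancing and DataNode decommissioning mentioned above; the threshold, exclude-file path, and hostname are examples, and the exclude file is assumed to be the one referenced by dfs.hosts.exclude in hdfs-site.xml.

# Rebalance HDFS so no DataNode deviates more than 5% from average utilization.
hdfs balancer -threshold 5

# Decommission a problem DataNode: list it in the excludes file, then tell the NameNode to re-read it.
echo "datanode07.example.com" >> /etc/hadoop/conf/dfs.exclude
hdfs dfsadmin -refreshNodes
hdfs dfsadmin -report | grep -B1 -A5 datanode07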
Environment: Hadoop, MapReduce, Hive, HDFS, PIG, Sqoop, Oozie, Cloudera, Flume, HBase, ZooKeeper, CDH3, MongoDB, Cassandra, Oracle, NoSQL and Unix/Linux.
Confidential, San Diego, CA
Hadoop Administrator
Responsibilities:
- Installed and configured Hadoop and responsible for maintaining cluster and managing and reviewing Hadoop log files.
- Experience with Amazon Redshift, a fully managed petabyte-scale data warehouse service designed for analytic workloads that connects to standard SQL-based clients and business intelligence tools.
- Experience in architecting, designing, installing, configuring, and managing Apache Hadoop on the Hortonworks Data Platform (HDP).
- Worked with Redshift, which delivers fast query and I/O performance for virtually any size dataset by using columnar storage technology and parallelizing and distributing queries across multiple nodes.
- Worked on Storm, a distributed real-time computation system that provides a set of general primitives for stream processing.
- Experience in Hortonworks Data Platform (HDP) cluster installation and configuration.
- Experience with Kerberos, Active Directory/LDAP, and UNIX-based file systems.
- Loaded data from various data sources into HDFS using Flume (see the sketch after this list).
- Worked on statistics collection and table maintenance on MPP platforms.
- Used the Cloudera stack to analyze data stored in HDFS.
- Worked extensively on Hive and PIG.
- Worked on Kafka, a distributed, partitioned, replicated commit-log service that provides the functionality of a messaging system.
- Experience writing code in Python and shell scripts.
- Experience with source code management tools; proficient in Git, SVN, and AccuRev.
- Experience in Test Driven Development and wrote the test cases in JUnit.
- Worked on large sets of structured, semi-structured and unstructured data.
- Used Sqoop to import and export data between HDFS and RDBMS.
- Involved in creating Hive tables, loading them with data, and writing Hive queries that run internally as MapReduce jobs (see the sketch after this list).
- Worked with bulk load tools such as DWLoader and moved data from PDW to a Hadoop archive.
- Participated in design and development of scalable and custom Hadoop solutions as per dynamic data needs.
- Good understanding of Change Data Capture (CDC) data modeling approaches.
- Coordinated with technical team for production deployment of software applications for maintenance.
- Good knowledge of reading data from and writing data to Cassandra.
- Provided operational support services relating to Hadoop infrastructure and application installation.
- Handled the imports and exports of data onto HDFS using Flume and Sqoop.
- Supported technical team members in management and review of Hadoop log files and data backups.
- Participated in development and execution of system and disaster recovery processes.
- Formulated procedures for installation of Hadoop patches, updates and version upgrades.
- Automated processes for troubleshooting, resolution and tuning of Hadoop clusters.
- Set up automated processes to send alerts in case of predefined system and application level issues.
- Set up automated processes to send notifications in case of any deviations from the predefined resource utilization.
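A minimal sketch of a Flume agent along the lines of the log loads above: an exec source tailing a web-server log, a memory channel, and an HDFS sink. The agent name, log path, and HDFS directory are hypothetical.

# Write a simple Flume agent configuration and start the agent (names and paths are examples).
cat > /etc/flume/conf/weblog-agent.conf <<'EOF'
agent.sources = weblog
agent.channels = mem
agent.sinks = hdfs-out

agent.sources.weblog.type = exec
agent.sources.weblog.command = tail -F /var/log/httpd/access_log
agent.sources.weblog.channels = mem

agent.channels.mem.type = memory
agent.channels.mem.capacity = 10000

agent.sinks.hdfs-out.type = hdfs
agent.sinks.hdfs-out.hdfs.path = /data/raw/weblogs/%Y-%m-%d
agent.sinks.hdfs-out.hdfs.fileType = DataStream
agent.sinks.hdfs-out.hdfs.useLocalTimeStamp = true
agent.sinks.hdfs-out.channel = mem
EOF

flume-ng agent --name agent --conf /etc/flume/conf --conf-file /etc/flume/conf/weblog-agent.conf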
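A sketch of the Hive table creation and querying mentioned above, run through the Hive CLI; the table, columns, and HDFS location are illustrative.

# Define an external Hive table over raw log data and run a simple aggregation (names and paths are examples).
hive -e "
CREATE EXTERNAL TABLE IF NOT EXISTS web_logs (
  ip STRING,
  ts STRING,
  url STRING,
  status INT
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
LOCATION '/data/raw/weblogs';

SELECT status, COUNT(*) AS hits FROM web_logs GROUP BY status;"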
Environment: Red Hat Linux/CentOS 4, 5, 6, Logical Volume Manager, Hadoop, VMware ESX 5.1/5.5, Apache and Tomcat web servers, Oracle 11/12, Oracle RAC 12c, HPSM, HPSA.
Confidential, Plano, TX
System Admin
Responsibilities:
- Installed/Configured/Maintained Apache Hadoop clusters for application development and Hadoop tools like Hive, Pig, HBase, Zookeeper and Sqoop.
- Wrote shell scripts to monitor the health of the Hadoop daemon services and respond accordingly to any warning or failure conditions (a sketch follows this list).
- Managing and scheduling Jobs on a Hadoop cluster.
- Deployed Hadoop Cluster in the Pseudo-distributed and Fully Distributed mode.
- Implemented NameNode metadata backup using NFS for high availability.
- Worked on importing and exporting data from Oracle and DB2 into HDFS and HIVE using Sqoop.
- Developed PIG Latin scripts to extract the data from the web server output files to load into HDFS.
- Created Hive external tables, loaded data into them, and queried the data using HQL.
- Worked on tuning large/complex SQL queries and managing alerts from PDW and Hadoop.
- Created physical data models for data warehousing.
- Wrote shell scripts to automate rolling day-to-day processes.
- Experience in architecting, designing, installing, configuring, and managing the Cloudera Hadoop distribution.
- Collected log data from web servers and integrated it into HDFS using Flume.
- Implemented the Fair Scheduler on the JobTracker to share cluster resources among the MapReduce jobs submitted by users (see the sketch after this list).
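A sketch of the daemon health-check shell script described above, assuming jps is on the monitoring user's PATH and a hypothetical ops mail alias; the list of expected daemons depends on the node's role.

#!/bin/bash
# Verify the expected Hadoop daemons are running on this node; mail an alert for any that are missing.
EXPECTED="NameNode SecondaryNameNode JobTracker"
RUNNING=$(jps)
for DAEMON in $EXPECTED; do
    if ! echo "$RUNNING" | grep -qw "$DAEMON"; then
        echo "$DAEMON is not running on $(hostname)" | mail -s "Hadoop daemon down: $DAEMON" ops-team@example.com
    fi
done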
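A minimal sketch of the Fair Scheduler setup on the JobTracker: mapred-site.xml points the task scheduler at the Fair Scheduler and at an allocation file, and the allocation file defines per-pool shares. Pool names and limits here are examples only.

# Properties to add inside <configuration> in mapred-site.xml (MR1):
#   mapred.jobtracker.taskScheduler = org.apache.hadoop.mapred.FairScheduler
#   mapred.fairscheduler.allocation.file = /etc/hadoop/conf/fair-scheduler.xml

# Example allocation file defining two pools (names and limits are illustrative).
cat > /etc/hadoop/conf/fair-scheduler.xml <<'EOF'
<?xml version="1.0"?>
<allocations>
  <pool name="etl">
    <minMaps>10</minMaps>
    <minReduces>5</minReduces>
    <weight>2.0</weight>
  </pool>
  <pool name="adhoc">
    <maxRunningJobs>5</maxRunningJobs>
  </pool>
</allocations>
EOF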
Environment: Solaris 9/10, Red Hat Linux 4/5, BMC Tools, Nagios, Veritas NetBackup, Bash scripting, Veritas Volume Manager, web servers, LDAP directory, Active Directory, BEA WebLogic servers, SAN switches, Apache, Tomcat servers, WebSphere application server.
Confidential
Linux Administrator
Responsibilities:
- Installed and upgraded OE and Red Hat Linux, and Solaris 8 on SPARC, on servers such as HP DL380 G3, G4, and G5 and Dell PowerEdge servers.
- Experience with LDOMs; created sparse-root and whole-root zones, administered the zones for web, application, and database servers, and worked with SMF on Solaris 10.
- Experience working in AWS Cloud Environment like EC2 & EBS.
- Implemented and administered VMware ESX 3.5 and 4.x to run Windows, CentOS, SUSE, and Red Hat Linux servers in development and test environments.
- Installed and configured Apache on Linux and Solaris, configured virtual hosts, and applied SSL certificates.
- Implemented JumpStart for Solaris and Kickstart for Red Hat environments.
- Experience working with HP LVM and Red Hat LVM.
- Experience in implementing P2P and P2V migrations.
- Involved in installing and configuring CentOS and SUSE 11 and 12 servers on HP x86 servers.
- Implemented HA using Red Hat Cluster and VERITAS Cluster Server 5.0 for the WebLogic agent.
- Managed and troubleshot DNS and NIS servers.
- Troubleshot application issues on Apache web servers as well as database servers running on Linux and Solaris.
- Experience in migrating Oracle and MySQL data using Double-Take products.
- Used Sun Volume Manager for Solaris and LVM on Linux & Solaris to create volumes with layouts like RAID 1, 5, 10, 51.
- Re-compiling Linux kernel to remove services and applications that are not required.
- Performed performance analysis using tools like prstat, mpstat, iostat, sar, vmstat, truss, Dtrace.
- Experience working with LDAP user accounts and configuring LDAP on client machines.
- Upgraded ClearCase from 4.2 to 6.x running on Linux (CentOS and Red Hat).
- Worked on patch management tools like Sun Update Manager.
- Experience supporting middleware servers running Apache, Tomcat, and Java applications.
- Worked on day-to-day administration tasks and resolved tickets using Remedy.
- Used HP Service Center and a change management system for ticketing.
- Worked on the administration of WebLogic 9 and JBoss 4.2.2 servers, including installation and deployments.
- Worked on F5 load balancers to load balance and reverse proxy WebLogic servers.
- Wrote shell scripts to automate regular tasks such as removing core files, backing up important files, and transferring files among servers (a sketch follows this list).
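A sketch of the kind of routine automation script mentioned above; the directories, retention windows, and backup host are hypothetical.

#!/bin/bash
# Clean up old core files, archive key configuration, copy the archive off-host, and prune old archives.
find /var /opt /home -type f -name "core.*" -mtime +2 -exec rm -f {} \;

BACKUP=/backup/etc-$(date +%Y%m%d).tar.gz
tar -czf "$BACKUP" /etc /usr/local/etc

scp "$BACKUP" backupadmin@backuphost.example.com:/srv/backups/
find /backup -type f -name "etc-*.tar.gz" -mtime +30 -exec rm -f {} \;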
Environment: Solaris 8/9/10, Veritas Volume Manager, web servers, LDAP directory, Active Directory, BEA WebLogic servers, SAN switches, Apache, Tomcat servers, WebSphere application server.
Confidential
Linux Admin
Responsibilities:
- Installed, configured, and updated Solaris 7 and 8, Red Hat 7.x, 8, and 9, and Windows NT/2000 systems using installation media, JumpStart, and Kickstart.
- Installed and configured Windows 2000 Active Directory servers and Citrix servers.
- Published and administered applications via Citrix MetaFrame.
- Creating and Authenticating Windows user accounts on Citrix server.
- Creating System Disk Partition, mirroring root disk drive, configuring device groups in UNIX and Linux environment.
- Working with VERITAS Volume Manager 3.5 and Logical Volume Manager for file system management, data backup and recovery.
- User administration, which included creating and backing up accounts for new users and deleting accounts for retired or removed users.
- Implemented a backup solution using a Dell T120 autoloader and CA ARCserve 7.0.
- Managed tape drives and recycled tapes after a specific period of time per the firm's policies.
- Worked with DBAs to write database backup scripts and scheduled the backups using cron jobs (a sketch follows this list).
- Created UNIX shell and Perl scripts for automated data backups and storage status reporting.
- Installed and configured Oracle 8i databases and Sybase servers on Solaris after creating the file systems and users and tuning the kernel.
- Installed and Configured SSH Gate for Remote and Secured Connection.
- Setting up labs from scratch, testing hardware, installing and configuring various hardware devices like printers, scanners, modems, network and communication devices.
- Configuration of DHCP, DNS, NFS and auto mounter.
- Creating, troubleshooting and mounting NFS File systems on different OS platforms.
- Installed, configured, and troubleshot various software packages such as Windd, Citrix, Clarify, Rave, VPN, SSH Gate, Visio 2000, Star Application, Lotus Notes, mail clients, Business Objects, Oracle, and Microsoft Project.
- Troubleshot and resolved problems related to users, applications, hardware, etc.
- Worked 24/7 on call for application and system support.
- Experience working with and supporting Sybase databases running on Linux servers.
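A sketch of the cron-scheduled database backup worked out with the DBAs, as noted above; the schedule, script path, and Oracle export parameters are hypothetical placeholders.

#!/bin/sh
# db_backup.sh (illustrative): full Oracle export, then compress the dump file.
DATE=`date +%Y%m%d`
exp system/CHANGE_ME full=y file=/backup/oracle/full_$DATE.dmp log=/backup/oracle/full_$DATE.log
compress /backup/oracle/full_$DATE.dmp

# Scheduled from the oracle user's crontab, nightly at 2 AM:
# 0 2 * * * /usr/local/scripts/db_backup.sh >> /var/log/db_backup.log 2>&1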
Environment: HP ProLiant servers, Sun servers (6500, 4500, 420, Ultra 2 servers), Solaris 7/8, Veritas NetBackup, Veritas Volume Manager, Samba, NFS, NIS, LVM, Linux, Shell programming.