Hadoop Admin Resume
Van Buren, MI
SUMMARY
- Around 8 years of experience in Linux administration, with additional work on Hadoop big data, Windows NT, Active Directory, Exchange, and Office 365
- Hands on experience in installing and configuring MapReduce, HDFS, HBase, Oozie, Hive, Sqoop, and Pig
- Installed, configured, and upgraded operating systems as required
- Worked with structured, semi-structured, and unstructured data
- Excelled at importing data into HDFS from sources such as Oracle, DB2, and SQL Server
- Experienced in installing and configuring Cloudera
- Managed security permissions for users
- Worked closely with management and other departments to configure the system correctly
- Skilled in implementing the Fair Scheduler to manage resources during peak times
- Gained extensive experience managing and reviewing Hadoop Log files
- Monitored the performance of the Hadoop ecosystem
- Strong technical, administration, and monitoring knowledge of Big Data/Hadoop
- Enthusiastic self-starter, eager to meet challenges and quickly assimilate the latest technologies, skills, concepts, and ideas
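The log-review experience above can be sketched as a quick shell pass over a Hadoop-style log; the file name and log lines below are made-up sample data, not output from a real cluster:

```shell
# Create a small sample log in the Hadoop log4j style (illustrative data only)
cat > sample-namenode.log <<'EOF'
2016-03-01 10:00:01,123 INFO  org.apache.hadoop.hdfs.server.namenode.NameNode: STARTUP_MSG
2016-03-01 10:05:42,456 WARN  org.apache.hadoop.hdfs.StateChange: Slow BlockReceiver write
2016-03-01 10:06:10,789 ERROR org.apache.hadoop.hdfs.server.namenode.NameNode: RECEIVED SIGNAL 15
2016-03-01 10:06:11,001 ERROR org.apache.hadoop.hdfs.server.namenode.NameNode: Disk full
EOF

# Count entries per severity level to spot trouble quickly
# (field 3 of each line is the log level)
awk '{ count[$3]++ } END { for (lvl in count) print lvl, count[lvl] }' sample-namenode.log | sort
# prints: ERROR 2 / INFO 1 / WARN 1
```

In practice the same one-liner can be pointed at the NameNode or DataNode logs under the cluster's log directory.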
TECHNICAL SKILLS
Hadoop Eco-Systems: Hive, Pig, Flume, Oozie, Sqoop, Spark, Impala and HBase
Operating systems: Red Hat Linux 5.x, 6.x, Windows 95, 98, NT, 2000, Windows Vista, 7
Configuration Management Tools: Puppet
Databases: Oracle (SQL) 10g, MySQL, SQL Server 2008
Hadoop Configuration Management: Cloudera Manager, Ambari
Monitoring Tools: Ganglia, Nagios
Scripting Languages: Shell scripting, PowerShell.
Configuration / Protocol: DNS, DHCP, WINS, VPN, TCP/IP, SNMP, IMAP, POP3, SMTP, PKI, DFS
Ticketing Systems: Remedy, Service Now, IBM Tivoli
Backup software: NetBackup, Tivoli, Commvault, NT Backup, DPM 2012
PROFESSIONAL EXPERIENCE
Confidential, Van Buren, MI
Hadoop Admin
RESPONSIBILITIES:
- Set up, configured, and managed security for the Cloudera Hadoop cluster
- Used Hive and Pig to perform data analysis
- Loaded log data into HDFS using Flume
- Created multi-cluster tests to verify system performance and failover
- Improved a high-performance cache, leading to greater stability and improved performance
- Built a scalable Hadoop cluster as a data solution
- Responsible for maintenance and creation of nodes
- Managed log files, backups and capacity
- Found and troubleshot Hadoop errors
- Worked with other teams to decide the hardware configuration
- Implemented cluster high availability
- Scheduled jobs using Fair Scheduler
- Configured alerts to find possible errors
- Handled patches and updates
- Worked with developers to set up a full Hadoop system on AWS
Environment: HDFS, CDH3, CDH4, HBase, NoSQL, RHEL 4/5/6, Hive, Pig, Perl scripting, AWS S3, EC2
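The Fair Scheduler work above can be sketched with a minimal allocation file in the CDH3/CDH4-era MapReduce `fair-scheduler.xml` format; the pool names and limits below are illustrative assumptions, not values from an actual deployment:

```xml
<?xml version="1.0"?>
<!-- fair-scheduler.xml: illustrative allocation file.
     Pool names, weights, and limits are assumptions for the sketch. -->
<allocations>
  <pool name="production">
    <minMaps>20</minMaps>
    <minReduces>10</minReduces>
    <weight>2.0</weight>
  </pool>
  <pool name="adhoc">
    <maxRunningJobs>5</maxRunningJobs>
  </pool>
  <userMaxJobsDefault>3</userMaxJobsDefault>
</allocations>
```

Giving the production pool a higher weight and guaranteed minimum slots is one common way to keep batch jobs responsive during peak times while capping ad-hoc usage.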
Confidential, Irvine, CA
Hadoop Admin
RESPONSIBILITIES:
- Analyzed the Hadoop cluster and other big data analysis tools, including Pig
- Implemented multiple nodes on a CDH3 Hadoop cluster on Red Hat Linux
- Built a scalable distributed data solution
- Imported data from Linux file system to HDFS
- Loaded data from UNIX to HDFS
- Installed clusters, started and stopped DataNodes, and recovered NameNodes
- Assisted with capacity planning and slot configuration
- Created tables and views in Teradata
- Created HBase tables to house data from different sources
- Transferred data from SQL databases to HBase using Sqoop
- Worked with a team to successfully tune Pig queries for performance
- Excelled at managing and reviewing Hadoop log files
- Configured Oozie to run multiple Hive and Pig jobs
- Worked with management to determine the optimal way to report on data sets
- Installed, configured, and monitored Hadoop Clusters using Cloudera
- Installed, upgraded, and patched ecosystem products using Cloudera Manager
- Balanced and tuned HDFS, Hive, Impala, MapReduce, and Oozie workflows
- Maintained and backed up meta-data
- Configured Kerberos for the clusters
- Used data integration tools like Flume and Sqoop
- Set up automated processes to analyze the system and find errors
- Supported IT department in cluster hardware upgrades
Environment: Hadoop, HDFS, Pig, Sqoop, HBase, Shell Scripting, Ubuntu, Red Hat Linux
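The Oozie coordination of Hive and Pig jobs described above can be sketched as a minimal workflow definition; the script names and the `${jobTracker}`/`${nameNode}` properties are placeholders supplied at submission time, not values from a real deployment:

```xml
<!-- workflow.xml: illustrative Oozie workflow chaining a Hive action
     into a Pig action. Script names are placeholders. -->
<workflow-app name="hive-pig-demo" xmlns="uri:oozie:workflow:0.4">
  <start to="hive-step"/>
  <action name="hive-step">
    <hive xmlns="uri:oozie:hive-action:0.2">
      <job-tracker>${jobTracker}</job-tracker>
      <name-node>${nameNode}</name-node>
      <script>daily_load.hql</script>
    </hive>
    <ok to="pig-step"/>
    <error to="fail"/>
  </action>
  <action name="pig-step">
    <pig>
      <job-tracker>${jobTracker}</job-tracker>
      <name-node>${nameNode}</name-node>
      <script>aggregate.pig</script>
    </pig>
    <ok to="end"/>
    <error to="fail"/>
  </action>
  <kill name="fail">
    <message>Workflow failed: ${wf:errorMessage(wf:lastErrorNode())}</message>
  </kill>
  <end name="end"/>
</workflow-app>
```

The `<ok>`/`<error>` transitions are what let one definition run multiple Hive and Pig jobs in sequence with a single failure path.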
Confidential
Sr. Server Admin/Hadoop admin
Responsibilities:
- Tested raw data and executed performance scripts
- Shared responsibility for administration of Hadoop, Hive and Pig
- Aided in developing Pig scripts to report data for the analysis
- Moved data between HDFS and RDBMS using Sqoop
- Analyzed MapReduce jobs for data coordination
- Helped find trends by creating Hive queries that compared new data with archived data
- Provided recommendations to upper management to improve processes and fix problems
- Set up, configured, and managed security for the Cloudera Hadoop cluster
- Built a scalable Hadoop cluster as a data solution
- Responsible for the maintenance and creation of nodes
- Managed log files, backups and capacity
- Found and troubleshot Hadoop errors
- Worked with other teams to decide the hardware configuration
- Implemented cluster high availability
- Scheduled jobs using Fair Scheduler
- Configured alerts to find possible errors
- Handled patches and updates
Environment: Linux, MapReduce, HDFS, Hive, Pig, Shell Scripting
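A raw-data test like the one described above can be sketched in shell; the file name, field layout, and sample records are made up for illustration:

```shell
# Create a small sample of raw pipe-delimited records (illustrative data only)
cat > raw_data.txt <<'EOF'
1001|alice|2015-06-01
1002|bob|2015-06-02
bad-record-no-delimiters
1003|carol|2015-06-03
EOF

# Count total records and flag rows that do not have exactly three fields,
# a cheap sanity check before loading the file into HDFS
awk -F'|' 'NF != 3 { bad++ } END { printf "total=%d bad=%d\n", NR, bad }' raw_data.txt
# prints: total=4 bad=1
```

The same pattern scales to large extracts, since awk streams the file once without loading it into memory.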
Confidential
Server Admin
RESPONSIBILITIES:
- Used Windows Server 2008/2012 Active Directory and the Exchange Management Console for Exchange 2003, 2007, and 2010
- Provided support for virtual server solutions including VMware ESX hosts; added disk space from SAN and expanded partitions using diskpart and extpart; performed server patching and Windows server clustering setup
- Supported client backups including Symantec local, Tivoli, and Commvault; created new backup jobs for new servers
- Provided remote support to clients using Kaseya, Secret Server, and Join.Me
- Managed Citrix XenApp environments
- Maintained multiple Symantec Endpoint Protection anti-virus servers at multiple client sites
- Performed daily checks of Symantec Backup Exec backups at office locations; set up and maintained the tape rotation schedule, created new backup jobs for new servers, and set up and maintained the Data Domain storage and replication system
- Built new Windows 2003 and 2008 servers; added systems to the WhatsUp Gold SNMP tracking system, Symantec Backup, and McAfee anti-virus software
- Built HP blade servers using iLO and virtual servers using VMware vSphere 4.1
- Used Active Directory and the Exchange Management Console daily
- Documented build procedures for physical and virtual servers.
- Used Automate IT software to replicate directories to datacenter servers
- Maintained the McAfee anti-virus system, including four repositories and compliance for 200 servers and 600 workstations
- Provided daily support for Help Desk tickets in the Aldon tracking system
- Monitored and analyzed servers, resolved problems, and maintained system reporting and tuning
- Created users, managed user permissions, and maintained user and file system quotas on Linux servers
- Configured volume groups and logical volumes; extended logical volumes for file system growth using Logical Volume Manager (LVM) commands
- Maintained security integrity through the use of group policies across domains
- Supported users via email, on-call rotation, and troubleshooting
- Maintained inventory of all components, including systems and other hardware
- Performed user account management, data backups, and user logon support
- Maintained user data backups by creating per-user folders on the file server and applying security permissions to those folders
- Monitored the trouble ticket queue to respond to user and system calls
- Attended team and change control meetings to report installation progress and upcoming environment changes
- Performed day-to-day technical support; analyzed, troubleshot, and resolved technical problems related to servers
- Performed user and security administration and implemented file permissions for users and groups
- Configured Role-Based Access Control (RBAC) and Access Control Lists (ACLs)
- Maintained the Service Management Facility (SMF) in Solaris 10
- Resolved hardware and software errors and crashes, oversized files, and file-system-full errors on UNIX and Linux servers
- Responsible for package and patch management and installation on servers
- Extensively administered NFS, DNS, DHCP, NIS, NIS+, LDAP, mail servers, and Samba servers
- Installed, configured, and maintained WebLogic Server, WebSphere Application Server, and Apache/Tomcat web servers on UNIX
- Encapsulated and mirrored the root disk; implemented RAID 0, RAID 1, RAID 0+1, and RAID 5 on multiple disks using Solaris Volume Manager
- Increased and decreased file system sizes using Logical Volume Manager
- Provided network and server development and support