
Hadoop Administrator Resume


PROFESSIONAL SUMMARY

  • 9+ years of IT experience in systems architecture, HADOOP cluster administration and the Big Data ecosystem.
  • Automating repetitive tasks using available scripting languages.
  • Strong technical background; efficient in debugging software and hardware problems.
  • Proficient in Linux operating systems such as Red Hat, CentOS & SUSE.
  • Solid understanding of Nagios, Ganglia & Cloudera Manager monitoring tools.
  • Exceptional in overseeing system administration/operations such as performance tuning, storage capacity management & system dump analysis.

TECHNICAL SKILLS

GroupWare: Big Data Ecosystem, LINUX, Scripting Languages, MySQL, Oracle, IBM i

Tools: MapReduce, HBase, Spark, Impala, Hive, Pig, Oozie, Cloudera Manager, Nagios, Ganglia, YARN.

Job Functions: Administration of HADOOP Cluster, Proof of Concepts in Big Data Analytics and Project Management.

PROFESSIONAL EXPERIENCE

Confidential

HADOOP Administrator

Responsibilities:

  • Installing and Upgrading Cloudera CDH & Hortonworks HDP Versions through Cloudera Manager and Ambari Management tools.
  • Moving services (re-distribution) from one host to another within the cluster to help secure the cluster and ensure high availability of the services.
  • Implementing security on the HADOOP cluster with Kerberos/LDAP and by encrypting data in motion and at rest (see the Kerberos sketch after this list).
  • Transforming data from RDBMS to HDFS using available methodologies.
  • Migrating service databases from PostgreSQL to MySQL.
  • Identifying the best solutions/proofs of concept leveraging Big Data & advanced analytics that meet and exceed the customer's business, functional and technical requirements.
  • Strong working experience with open-source technologies.
  • Implementing TLS/SSL security to encrypt data in motion.
  • Installed Cloudera Manager Server and configured the database for Cloudera Manager Server.
  • Experience in addressing scale and performance problems
  • Stored unstructured data in semi-structured format on HDFS using HBase.
  • Followed company-standard change management and incident management processes.
  • Implemented partitioning, dynamic partitions and buckets in Hive (see the HiveQL sketch after this list).
  • Knowledge of Java virtual machines (JVM) and multithreaded processing.
  • Strong troubleshooting and performance tuning skills (TCP/IP, DNS, file systems, load balancing, etc.).
  • Continuous monitoring and managing the HADOOP cluster through Cloudera Manager
  • Strong network background with a good understanding of TCP/IP, firewalls and DNS.
  • Demonstrated proofs of concept to clients.
  • Knowledge of Git, JIRA, Jenkins, etc.
  • Supported technical team members in management and review of HADOOP log files and data backups.
  • Continuous improvement processes for all process automation scripts and tasks.
  • Handled importing of data from various data sources, performed transformations using Hive and MapReduce, loaded data into HDFS, and extracted data from MySQL into HDFS using Sqoop (see the Sqoop sketch after this list).
  • Performed data completeness, correctness, data transformation and data quality testing using SQL.
  • Exported the analyzed data to the relational databases using Sqoop for visualization and to generate reports for the BI team.
  • Experience in providing support to data analysts in running Pig and Hive queries.
  • Managing and reviewing HADOOP log files.
  • Creating Hive tables, loading them with data and writing Hive queries that run internally as MapReduce jobs.
  • Installing and configuring Hive, and writing Hive UDFs.
  • Experience in large-scale data processing, on an Amazon EMR cluster
  • Proficient with HADOOP admin and user commands for administration.
  • Supported technical team members for automation, installation and configuration tasks.
  • Wrote shell scripts to monitor the health of HADOOP daemon services and respond to any warning or failure conditions (a sketch follows this list).
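
A minimal sketch of the Kerberos enablement referenced above, assuming an MIT KDC; the realm, host name and keytab path are hypothetical placeholders, and in practice the distribution's security wizard (Cloudera Manager/Ambari) drives most of these steps.

```bash
#!/usr/bin/env bash
# Hypothetical sketch: create an HDFS service principal/keytab on an MIT KDC,
# then verify Kerberized HDFS access from a cluster node.
REALM="EXAMPLE.COM"                 # placeholder realm (assumption)
HOST="worker01.example.com"         # placeholder host (assumption)
KEYTAB=/etc/security/keytabs/hdfs.keytab

# Create a random-key principal and export its keytab on the KDC
kadmin.local -q "addprinc -randkey hdfs/${HOST}@${REALM}"
kadmin.local -q "ktadd -k ${KEYTAB} hdfs/${HOST}@${REALM}"

# Obtain a ticket from the keytab and confirm secure access to HDFS
kinit -kt "${KEYTAB}" "hdfs/${HOST}@${REALM}"
klist
hdfs dfs -ls /
```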
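
The Hive partitioning and bucketing bullet above can be illustrated with a short HiveQL script driven from the shell; the table and column names are hypothetical.

```bash
#!/usr/bin/env bash
# Hypothetical HiveQL showing dynamic partitions and buckets, run through the Hive CLI.
hive -e "
SET hive.exec.dynamic.partition=true;
SET hive.exec.dynamic.partition.mode=nonstrict;

CREATE TABLE IF NOT EXISTS sales_part (
  order_id BIGINT,
  amount   DOUBLE
)
PARTITIONED BY (order_date STRING)
CLUSTERED BY (order_id) INTO 16 BUCKETS
STORED AS ORC;

-- Load from a staging table, letting Hive create partitions dynamically
INSERT OVERWRITE TABLE sales_part PARTITION (order_date)
SELECT order_id, amount, order_date FROM sales_staging;
"
```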
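
A hedged sketch of the Sqoop import/export flow described above; the JDBC URLs, credentials file and table names are placeholders.

```bash
#!/usr/bin/env bash
# Hypothetical Sqoop commands: pull a MySQL table into HDFS, then export results back for BI.

# Import from MySQL into HDFS with four parallel mappers
sqoop import \
  --connect jdbc:mysql://dbhost.example.com:3306/sales \
  --username etl_user --password-file /user/etl/.db_pass \
  --table orders \
  --target-dir /data/raw/orders \
  --num-mappers 4

# Export analyzed data back to MySQL for reporting
sqoop export \
  --connect jdbc:mysql://dbhost.example.com:3306/reports \
  --username etl_user --password-file /user/etl/.db_pass \
  --table order_summary \
  --export-dir /data/processed/order_summary \
  --input-fields-terminated-by ','
```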
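
A simplified version of the daemon health-check script from the last bullet, assuming the daemons are visible to jps and mail is configured on the host; the alert address and service list are placeholders.

```bash
#!/usr/bin/env bash
# Hypothetical health check: verify key HADOOP daemons are running and HDFS is out of safe mode.
ALERT_TO="hadoop-ops@example.com"   # placeholder alert address (assumption)
SERVICES="NameNode DataNode ResourceManager NodeManager"

for svc in $SERVICES; do
  if ! jps | grep -qw "$svc"; then
    echo "$(date): $svc is not running on $(hostname)" | mail -s "HADOOP daemon alert" "$ALERT_TO"
  fi
done

# Warn if the NameNode is still in safe mode
if hdfs dfsadmin -safemode get | grep -q "ON"; then
  echo "$(date): HDFS is in safe mode on $(hostname)" | mail -s "HDFS safe mode alert" "$ALERT_TO"
fi
```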

Confidential

HADOOP Administrator

Responsibilities:

  • Installation and configuration of HADOOP 1.0 & 2.0 clusters and maintaining them through cluster monitoring & troubleshooting (see the monitoring sketch after this list).
  • Transforming data from RDBMS to HDFS using available methodologies.
  • Identified the best solutions/proofs of concept leveraging Big Data & advanced analytics that meet and exceed the customer's business, functional and technical requirements.
  • Strong working experience with open-source technologies.
  • Installed Cloudera Manager Server and configured the database for Cloudera Manager Server.
  • Stored unstructured data in semi-structured format on HDFS using HBase.
  • Followed company-standard change management and incident management processes.
  • Implemented partitioning, dynamic partitions and buckets in Hive.
  • Knowledge of Java virtual machines (JVM) and multithreaded processing.
  • Continuous monitoring and managing the HADOOP cluster through Cloudera Manager
  • Strong network background with a good understanding of TCP/IP, firewalls and DNS.
  • Demonstrated live proofs of concept to clients.
  • Supported technical team members in management and review of HADOOP log files and data backups.
  • Continuous improvement processes for all process automation scripts and tasks.
  • Handled importing of data from various data sources, performed transformations using Hive and MapReduce, loaded data into HDFS, and extracted data from MySQL into HDFS using Sqoop.
  • Performed data completeness, correctness, data transformation and data quality testing using SQL.
  • Exported the analyzed data to the relational databases using Sqoop for visualization and to generate reports for the BI team.
  • Experience in providing support to data analysts in running Pig and Hive queries.
  • Managing and reviewing HADOOP log files.
  • Creating Hive tables, loading them with data and writing Hive queries that run internally as MapReduce jobs.
  • Installing and configuring Hive, and writing Hive UDFs.
  • Experience in large-scale data processing, on an Amazon EMR cluster
  • Proficient with HADOOP admin and user commands for administration.
  • Supported technical team members for automation, installation and configuration tasks.
  • Wrote shell scripts to monitor the health of HADOOP daemon services and respond to any warning or failure conditions.
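
A brief sketch of the routine checks behind the cluster monitoring mentioned in the first bullet above; the exact report fields vary by Hadoop version, so the grep patterns and column positions are illustrative.

```bash
#!/usr/bin/env bash
# Hypothetical routine checks: HDFS capacity/replication summary and YARN node status.

# Summarize HDFS usage, live/dead DataNodes and under-replicated blocks
hdfs dfsadmin -report | grep -E "Live datanodes|Dead datanodes|DFS Used%|Under replicated"

# List all YARN NodeManagers and flag any that are not in RUNNING state
yarn node -list -all | awk 'NR > 2 && $2 != "RUNNING" {print "Node not running:", $1}'
```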

Confidential

HADOOP Administrator

Responsibilities:

  • Assisted in designing, development and architecture of HADOOP and HBase systems.
  • Coordinated with technical teams for installation of HADOOP and related third-party applications on systems.
  • Formulated procedures for planning and execution of system upgrades for all existing HADOOP clusters.
  • Supported technical team members for automation, installation and configuration tasks.
  • Suggested improvement processes for all process automation scripts and tasks.
  • Provided technical assistance for configuration, administration and monitoring of HADOOP clusters.
  • Conducted detailed analysis of system and application architecture components as per functional requirements.
  • Participated in evaluation and selection of new technologies to support system efficiency.
  • Assisted in creation of ETL processes for transformation of data sources from existing RDBMS systems.
  • Designed and developed scalable and custom HADOOP solutions as per dynamic data needs.
  • Coordinated with the technical team for production deployment of software applications for maintenance.
  • Provided operational support services relating to HADOOP infrastructure and application installation.
  • Supported technical team members in management and review of HADOOP log files and data backups.
  • Participated in development and execution of system and disaster recovery processes.
  • Formulated procedures for installation of HADOOP patches, updates and version upgrades.
  • Automated processes for troubleshooting, resolution and tuning of HADOOP clusters (see the maintenance sketch below).
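
The automation in the last bullet might look like the following scheduled-maintenance sketch, assuming it runs as the HDFS superuser; the log directory and balancer threshold are placeholders.

```bash
#!/usr/bin/env bash
# Hypothetical scheduled maintenance: HDFS integrity check followed by a rebalance.
LOGDIR=/var/log/hadoop-maintenance   # placeholder log directory (assumption)
mkdir -p "$LOGDIR"

# Check HDFS for missing or corrupt blocks and keep the report for review
hdfs fsck / -files -blocks -locations > "$LOGDIR/fsck-$(date +%F).log" 2>&1

# Rebalance DataNodes whose utilization deviates more than 10% from the cluster average
hdfs balancer -threshold 10 >> "$LOGDIR/balancer-$(date +%F).log" 2>&1
```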

Confidential

LINUX Administrator - Support

Responsibilities:

  • Working with different groups to support in-house application requirements.
  • User Profile Management, Group management and administrative delegations
  • Provide support to Account Managers, UNIX and Windows technicians, and other departments
  • Build servers using Kick Start, Red Hat Satellite Server, and vSphere Client
  • Applying patches to keep servers protected against operating-system bugs using Red Hat Satellite Server, yum, etc. (see the patching sketch after this list).
  • Worked exclusively on VMware virtual environment.
  • Experience using Veeam to move VMs from one datacenter to another.
  • Involved in installation and configuration of various Third party software onto servers.
  • Involved in ILMT Agent Deployments and Oracle/SQL Upgrade project which includes various LINUX builds of different OS platforms across various data centers.
  • Coordinated with various cross-functional teams across IT operations to ensure smooth functioning of projects.
  • Worked closely with the DBA team to adjust kernel parameters as per requirements (see the sysctl sketch after this list).
  • Installed, configured and provided support for Tivoli Monitoring software across various OS platforms like RHEL, AIX and Solaris.
  • Installed packages using YUM and Red Hat Package Manager (RPM) on various servers.
  • Day-to-day resolution of Linux-based issues through the SMS ticketing system in compliance with SLAs.
  • Automating many day-to-day tasks through Bash scripting.
  • Worked with Red Hat Satellite Server to push changes across various servers simultaneously.
  • Performed the daily system administration tasks like managing system resources and end users support operations and security.
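
A minimal sketch of the yum/RPM patching pass referenced above, assuming a RHEL host registered to Satellite with the security plugin available; the package listing depth is arbitrary.

```bash
#!/usr/bin/env bash
# Hypothetical RHEL patching pass: review pending updates, apply security errata, record the result.

# List packages with pending updates from the Satellite/yum repositories
yum check-update

# Apply security-only errata (requires yum-plugin-security on older RHEL releases)
yum -y update --security

# Show the most recently installed/updated packages for the change record
rpm -qa --last | head -20
```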
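
The kernel-parameter work with the DBA team could follow a sketch like the one below; the values are placeholders, not recommendations, and the real settings came from the database vendor's guidance.

```bash
#!/usr/bin/env bash
# Hypothetical example: persist database-related kernel parameters and apply them without a reboot.
cat >> /etc/sysctl.conf <<'EOF'
# Shared memory and file-handle settings requested by the DBA team (placeholder values)
kernel.shmmax = 68719476736
kernel.shmall = 4294967296
kernel.sem = 250 32000 100 128
fs.file-max = 6815744
EOF

# Reload kernel parameters and confirm the new values
sysctl -p
sysctl kernel.shmmax kernel.shmall fs.file-max
```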

Environment: Red Hat Enterprise Linux 4.x/5.x/6.1, AIX 6.x, Solaris 8/9/10, Tivoli Storage Manager, VMware ESX 5, Tivoli NetBackup and WebSphere; Windows 2008 R2 & 2008 servers, Windows 2003, IIS 7.0 & 7.5.

Confidential

IBM AS/400 - Administration Support

Responsibilities:

  • Performed IBM i OS upgrades/installations and DR tests with V5R4, V6R1 & V7R1.
  • New server builds as per business needs in a fast-paced environment.
  • Managing LPARs through HMC and planning for IPLs as and when required.
  • Storage Management on both Internal Disks and External LUNs (SAN).
  • Managed ATL and VTL tape environments at large scale.
  • Managing User Profiles and their authority levels through Authorization Lists.
  • Applying PTFs & TRs based on system needs to meet business capabilities.
  • Controlling the Group Profiles to manage Existing and New users needs.
  • Providing secure environments to the business using functions like SSL, SSH, etc., and ensuring the same through random checks and various audit models.
  • Implementing capacity management to provide inputs for future enhancements/upgrades to the existing configuration setup to meet growing business needs.
  • Troubleshooting all escalated problems pertaining to the AS/400 production environment.
  • Instrumental in preparing periodic service-level reports & reviewing them with the client.
  • Helping monitoring team to handle client calls and requests.
  • Handling situations when jobs overrun, go into loops, or consume excessive CPU as well as ASP, etc.
  • Responsible for running Bridge Calls & driving them to resolve Production Critical issues
  • Performed H/W Migration from one model to another model.
  • Managing Job Scheduling Tasks using Native and ROBOT Scheduling Utilities.
  • Prioritizing Jobs from Nightly Batch Cycle as per customer requirement
  • Initiating RCAs/PB tickets on recurring problems and driving them to closure by implementing a permanent solution.
  • Ensuring the object level authority of data on Production Systems.
  • Initiating service improvement plans for backup, housekeeping, monitoring & restore, etc., for optimized performance and to fulfill customer needs.
  • SLA based Service Delivery & maintaining Quality of Technical Service.
  • SPOC for the iSeries product line, providing any required information to the customer.
  • Aligning processes with ITIL standards to achieve better results and customer satisfaction.
  • Reviewing Weekend Changes and supporting the team for successful implementation.
  • Maintaining a SOX-regulated environment in compliance with the standards.
  • Supporting MQ Series on multiple AS/400 environments.
  • Providing ad-hoc support for ERP applications hosted on the AS/400 platform.
