
Hadoop Administrator Resume


Roswell, GA

SUMMARY

  • Over 5 years of experience as a Systems Engineer/Administrator, including 2 years building, setting up, and administering large-scale Hadoop clusters.
  • Over 2 years of experience in sizing, installing, and configuring clusters.
  • Expertise in different Hadoop distributions such as Hortonworks Data Platform (HDP 2.1 - 2.3) and Cloudera Distribution Hadoop (CDH 4, CDH 5).
  • Efficient in building clusters in cloud environments such as the Confidential Cloud Platform.
  • Thorough knowledge of cloud platforms such as Cloudbreak, Amazon Web Services, and Azure.
  • Experience with NoSQL databases - HBase, MongoDB.
  • In-depth knowledge on JVM architecture and JVM Tuning.
  • Integration of Active Directory, LDAP with Ranger, Knox for authentication and authorization.
  • Experience in configuring Kerberos as a part of Hadoop Cluster Security.
  • Good knowledge of DistCp and Snapshots for backup and recovery processes.
  • Experience working on DR clusters and in data governance and data management using Falcon.
  • Experience with YARN Capacity Scheduling and sizing YARN containers for performance efficiency.
  • Expertise in High Availability Configuration and Monitoring on Hadoop Master Nodes.
  • Extensive knowledge on Hadoop Application Frameworks like Tez and Spark.
  • Experience using modern Big Data tools such as Spark SQL and Jaspersoft for data analysis.
  • Experience in Hive Performance Tuning and Query Optimization.
  • Good knowledge of SQL-based Hive authorization and HDFS ACLs.
  • Experience in Query Optimization using Hive partitioning, bucketing and compression.
  • Hands-on experience in data ingestion, data transformation, and analysis using Pig, Hive, Sqoop, Flume, and HDFS.
  • Efficient in setting up Oozie workflow for managing and scheduling jobs.
  • Over 5 years of experience with UNIX and Linux operating systems.
  • Expertise in Advanced Database Management Systems and Software Deployment.
  • Hands-on experience in Shell Scripting and working with SSH configurations.
  • Good experience with Databases such as SQL Server 2008R2/2012, MySQL, and Oracle 11g.
  • Excellent skills in writing SQL queries, Triggers, Stored Procedures.
  • Ability to create ER (Entity Relationship) diagrams for database design.
  • Highly proficient in understanding new technologies and accomplishing new goals.
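The DistCp/Snapshot backup-and-recovery workflow mentioned above can be sketched as a short shell script. This is a minimal sketch only: the NameNode hostnames and HDFS paths are placeholders, and the commands are echoed rather than executed so the plan can be reviewed first.

```shell
#!/bin/sh
# Sketch of a snapshot-then-DistCp DR backup. Hostnames and paths are
# hypothetical -- substitute your own NameNode addresses and directories.
SRC="hdfs://prod-nn:8020/data/warehouse"
DST="hdfs://dr-nn:8020/data/warehouse"
SNAP="nightly-$(date +%Y%m%d)"

# The source directory must already be snapshottable:
#   hdfs dfsadmin -allowSnapshot /data/warehouse
SNAP_CMD="hdfs dfs -createSnapshot ${SRC} ${SNAP}"

# Copy the consistent snapshot image, not the live directory.
# -update copies only changed files; -delete mirrors removals on the target.
COPY_CMD="hadoop distcp -update -delete ${SRC}/.snapshot/${SNAP} ${DST}"

# Echo rather than execute so the script can be dry-run and reviewed.
echo "${SNAP_CMD}"
echo "${COPY_CMD}"
```

Snapshotting first means DistCp copies a point-in-time image, so files changing mid-copy do not corrupt the DR mirror.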

TECHNICAL SKILLS

Hadoop Distribution: Hortonworks Data Platform (HDP) 2.1 - 2.3, Apache Hadoop, Cloudera Distribution Hadoop (CDH 4, CDH 5)

Hadoop Eco-System: HDFS, MapReduce, Pig, Hive, Oozie, Flume, Zookeeper, Tez, Spark, Kafka, Falcon

Cloud: Cloudbreak, Confidential Cloud Platform (GCP), Amazon Web Services (AWS), Azure

Databases: HBase, Oracle 11g, MS SQL Server 2008/2012, MySQL, Teradata

Programming Languages: Shell Scripting, Java, Scala, T-SQL, PL/SQL, HTML, XML

Operating Systems: Linux (Redhat, CentOS, Ubuntu), Windows, Unix

Application Tools: Maven 3.3.9, WordPress, Remedy 6.3 & 7.5, Altiris Symantec Management Console, Confidential Cloud Console

Reporting Tools: Zeppelin, Tableau Desktop, Solr

Security: Kerberos, Ranger, Knox, Storage-based Authorization

Software Applications: Eclipse, Microsoft Dynamics CRM 2011, Adobe, Microsoft Office Suite, Outlook, VMware Workstation, Oracle VirtualBox Manager

PROFESSIONAL EXPERIENCE

Confidential, Roswell, GA

Hadoop Administrator

Responsibilities:

  • Installing and configuring Hortonworks Data Platform 2.1 - 2.3
  • Involved in creating multiple instances and Services for Hadoop Installation using Confidential Cloud Console.
  • Monitoring Hadoop Cluster connectivity and security.
  • Performing Data transfer to DR clusters using Distcp, Falcon and Oozie.
  • Installing and managing cluster security using Ranger and creating Ranger Policies.
  • Working on different YARN scheduling policies for performance efficiency.
  • Implementing storage-based authorization on HDFS file systems and setting up queues through the YARN Capacity Scheduler.
  • Configuring Falcon on Oozie and Hive jobs as part of data governance and disaster recovery.
  • Kerberizing Hadoop clusters and integrating them with LDAP and Active Directory for security.
  • Working with Confidential Cloud Console to create instance templates and groups.
  • Knowledge of using Confidential cloud tools like bdutil, gcloud.
  • Good knowledge of rolling upgrades.
  • Analyzing large volumes of structured data using Spark SQL.
  • Worked on Maven 3.3.9 for building and managing Java based projects.
  • Creating Hive managed and external tables. Used static, dynamic partitions and bucketing to improve efficiency.
  • Performing High Availability configuration on the NameNode, ResourceManager, and HiveServer2.
  • Working on Hive and Pig scripts for data transformation and advanced analytics.
  • Responsible for managing and reviewing Hadoop log files on client environment.
  • Data analytics using Hive and Flume for streaming web log data.
  • Building search application using Solr and SiLK.
  • Providing significant input to the strategic direction, daily operations, best practices, security policies, and implementation procedures relating to our large scale data infrastructure.
  • Built tools and scripts to automate routine tasks and monitor system/cluster health.
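The YARN Capacity Scheduler queue setup described above can be sketched as follows. This is a sketch under stated assumptions: the queue names and percentages are hypothetical, and on an Ambari-managed HDP cluster these properties would normally be edited through Ambari rather than by hand.

```shell
#!/bin/sh
# Sketch: define two capacity-scheduler queues (etl and adhoc) whose
# capacities sum to 100%, then refresh the ResourceManager's queues.
# Queue names and percentages are examples only.
PREFIX="yarn.scheduler.capacity.root"

cat <<EOF > capacity-scheduler-fragment.xml
<property>
  <name>${PREFIX}.queues</name>
  <value>etl,adhoc</value>
</property>
<property>
  <name>${PREFIX}.etl.capacity</name>
  <value>70</value>
</property>
<property>
  <name>${PREFIX}.adhoc.capacity</name>
  <value>30</value>
</property>
EOF

# After merging the fragment into capacity-scheduler.xml, queues can be
# reloaded without restarting the ResourceManager:
echo "yarn rmadmin -refreshQueues"
```

Sizing queue capacities this way lets long-running ETL jobs and ad-hoc analyst queries share the cluster without starving each other.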

Confidential, Roswell, GA

Hadoop Developer /Admin

Responsibilities:

  • Developed Big Data Solutions that enabled the business and technology teams to make data-driven decisions on the best way to acquire customers and provide them business solutions.
  • Involved in installing, configuring and managing Hadoop Ecosystem components like Hive, Pig, Sqoop.
  • Automated workflows by developing AutoSys box jobs to schedule and streamline data processing.
  • Developed Hive Queries using Hive data warehouse tool to analyze data in HDFS.
  • Implemented custom Hive UDFs to achieve comprehensive data analysis.
  • Worked with Pig to develop ad-hoc queries to cleanse and parse data in HDFS obtained from various data sources and to perform join on map side using Distributed cache.
  • Increased efficiency by defining static and dynamic partitions using merged and external tables.
  • Developed and maintained project documentation including technical specification, unit test plans, presentations, user documentations and training material.
  • Worked with the Data Science team to gather requirements for various data mining projects.
  • Analysis of large data sets by running Hive queries and Pig scripts.
  • Responsible for managing data from multiple sources.
  • Worked closely with business units to define development estimates according to Agile methodology.
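The static/dynamic partitioning pattern used to improve efficiency above can be sketched in HiveQL. Table and column names here are illustrative, not the actual project schema; the script only writes the HiveQL to a file, which on a live cluster would be submitted via beeline.

```shell
#!/bin/sh
# Sketch of a dynamic-partition load into an external Hive table.
# On a cluster this file would be run with:
#   beeline -u jdbc:hive2://hiveserver:10000 -f partitioned_load.hql
cat <<'EOF' > partitioned_load.hql
-- External table partitioned by load date (illustrative schema)
CREATE EXTERNAL TABLE IF NOT EXISTS web_logs (
  ip STRING, url STRING, status INT
)
PARTITIONED BY (load_date STRING)
STORED AS ORC
LOCATION '/data/web_logs';

-- Dynamic-partition insert: Hive routes each row to the partition
-- named by its load_date value (the partition column comes last).
SET hive.exec.dynamic.partition.mode=nonstrict;
INSERT OVERWRITE TABLE web_logs PARTITION (load_date)
SELECT ip, url, status, load_date FROM staging_logs;
EOF

echo "wrote partitioned_load.hql"
```

Partition pruning then lets queries that filter on load_date scan only the relevant directories instead of the whole table.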

Confidential

System Administrator

Responsibilities:

  • Worked on configuring and tuning system and network parameters for optimum performance.
  • Wrote shell scripts to automate the tasks.
  • Developed tools to automate the deployment, administration, and monitoring of a large-scale Linux environment.
  • Performed server tuning, operating system upgrades.
  • Generated daily compliance reports for Service Releases, Emergency Releases and Quarterly Releases using Transactional SQL Queries.
  • Prepared report modules as a developer on MS SQL Server 2012 (SSRS, T-SQL scripts, stored procedures, and views).
  • Wrote complex SQL queries using table joins (inner and outer).
  • Simplified complex queries using temporary tables, table variables and stored procedures.
  • Maintained six Symantec Management Portal 7.1 Servers with 50000 clients reporting.
  • Managed 23 Notification Servers with 40000 clients reporting.
  • Handled 800 package servers for Notification Server 6 & Symantec Management Portal 7.1.
  • Configured various security roles for providing access as per requirement.
  • Configured custom inventory tasks to collect the entire inventory from clients.
  • Implemented the Global Delivery Framework (GDF).
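Routine automation like the shell scripts and compliance checks mentioned above typically follows a check-and-report pattern. A minimal, self-contained sketch (the 90% threshold is an arbitrary example, and in production the WARN line would feed a mail or ticketing step rather than stdout):

```shell
#!/bin/sh
# Sketch of a routine health-check script: flag any mounted filesystem
# whose usage exceeds a threshold.
THRESHOLD=90

check_usage() {
  # $1 = usage percent, $2 = mount point
  if [ "$1" -gt "$THRESHOLD" ]; then
    echo "WARN: $2 at $1% (threshold ${THRESHOLD}%)"
  else
    echo "OK: $2 at $1%"
  fi
}

# df -P gives a stable, portable format; skip the header line and
# strip the '%' sign from the capacity column.
df -P | awk 'NR > 1 {gsub("%", "", $5); print $5, $6}' | \
while read -r pct mount; do
  check_usage "$pct" "$mount"
done
```

Running such a script from cron and grepping its output for WARN lines is a simple way to turn ad-hoc checks into daily reports.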

Confidential

GIS Analyst

Responsibilities:

  • Resolved difficult customer cases through collaboration with Development, Quality Assurance Team, and Professional Services.
  • Recreated customer environments, as necessary, for troubleshooting and resolution of complex issues.
  • Wrote up bugs and created Knowledge Base articles to share knowledge.
  • Troubleshot and maintained computer systems for businesses and individuals.
  • Designed a workflow area on Confidential Earth (GE) used by general users for mapping purposes.
  • Deleted spam and fake information from the Confidential search windows, which in turn updated on Confidential Earth.
  • Interacted with clients on a regular basis to work on bugs, testing and enhancements.
