
Hadoop Administrator and Developer Resume


SUMMARY:

Hadoop architecture, administration, and development, as well as the development of custom interfaces and workflows for websites, content management applications, social media, communities, and blogs.

TECHNICAL SKILLS:

  • Apache Hadoop
  • Cloudera
  • Hortonworks
  • Amazon Cloud Services
  • MapReduce
  • HDFS
  • HBase
  • Zookeeper
  • Hive
  • Pig
  • Sqoop
  • Cassandra
  • Oozie
  • Flume
  • Chukwa
  • Pentaho
  • Kettle
  • Banana UI
  • Talend
  • Kafka
  • Storm
  • Spark (including Spark SQL, Spark Streaming, MLlib, and GraphX)
  • Scala
  • PostgreSQL
  • WildFly
  • Flask
  • scikit-learn
  • HTML5/LESS CSS
  • Angular
  • D3.js
  • iOS
  • Python
  • GitHub Flow
  • Grunt
  • Jenkins with Slack
  • Hubot
  • Ansible and Docker
  • RabbitMQ/AMQP
  • Celery
  • ZeroMQ
  • Drools
  • Druid
  • Accumulo
  • BigTable
  • GIS programming using PostGIS
  • MapServer
  • GeoServer
  • GeoDjango
  • Faceted and boosted search using Elasticsearch
  • Solr
  • Documentum 5.x, 6.x, 7.x
  • Webtop
  • WDK
  • DFC
  • DFS
  • DQL
  • XML
  • Composer
  • TBO
  • SBO
  • Java
  • C++
  • WebLogic
  • JBoss
  • Tomcat
  • Apache Web Server
  • Captiva InputAccel
  • Oracle

PROFESSIONAL EXPERIENCE:

Confidential

Hadoop Administrator and Developer

Responsibilities:

  • Installed and configured Hadoop 1.2, Hadoop 2.x, and Hadoop clients
  • Upgraded Hadoop 1.2.1 to Hadoop 2.x
  • Installed and configured Hadoop ecosystem components (Hadoop, MapReduce, YARN, Pig, Hive, Sqoop, Flume, ZooKeeper, and HBase)
  • Installed and configured Spark ecosystem components (Spark SQL, Spark Streaming, MLlib, and GraphX)
  • Deployed Hadoop in pseudo-distributed (single-node) and fully distributed modes
  • Configured multiple NameNodes and monitored active and standby NameNodes
  • Maintained and administered HDFS through the Hadoop Java API (see the HDFS sketch after this list)
  • Wrote applications in Scala for large-scale data processing.
  • Configured and scheduled MapReduce jobs
  • Configured the FIFO and Fair Schedulers to provide service-level agreements for multiple users of a cluster (an allocation-file sketch follows this list)
  • Involved in Hadoop cluster administration: adding and removing nodes, cluster capacity planning, performance tuning, cluster monitoring, and troubleshooting
  • Configured rack awareness, Hadoop backups, and DataNode whitelisting and blacklisting in a cluster
  • Configured high availability for a major production cluster and designed automatic failover using ZooKeeper and quorum journal nodes (a configuration sketch follows this list)
  • Deployed and configured the Puppet Enterprise configuration management system
  • Imported and exported structured data between relational databases and HDFS/Hive using Sqoop (a sample invocation follows this list)
  • Developed a custom Webtop WDK interface to manage physical records. (Java, WDK)
  • Administered and upgraded Documentum repositories to 7.2. (Captiva 5.x, 6.x, 7.x, Solaris, Linux, JBoss, WebLogic, Tomcat)
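
The HDFS maintenance above was done through the Hadoop Java API; the following is a minimal sketch of that kind of call, assuming the Hadoop client libraries on the classpath. The class name and the /data/incoming path are hypothetical.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

// Minimal HDFS maintenance sketch: create a directory and list its contents.
// The path "/data/incoming" is a hypothetical example.
public class HdfsOps {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration(); // reads core-site.xml/hdfs-site.xml from the classpath
        FileSystem fs = FileSystem.get(conf);

        Path dir = new Path("/data/incoming");
        fs.mkdirs(dir);
        for (FileStatus status : fs.listStatus(dir)) {
            System.out.println(status.getPath() + "\t" + status.getLen());
        }
        fs.close();
    }
}
```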
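A sketch of the kind of Fair Scheduler allocation file that backs service-level agreements like those above; the queue names, weights, and minimum resources are hypothetical placeholders.

```xml
<!-- fair-scheduler.xml sketch; queue names and resource figures are hypothetical -->
<allocations>
  <queue name="etl">
    <weight>2.0</weight>
    <!-- guaranteed minimum gives the queue an SLA-like floor -->
    <minResources>10240 mb, 8 vcores</minResources>
  </queue>
  <queue name="adhoc">
    <weight>1.0</weight>
  </queue>
</allocations>
```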
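A minimal hdfs-site.xml sketch of the quorum-journal high-availability setup described above; the nameservice "mycluster" and the journal-node host names are placeholders. The ZooKeeper quorum that drives automatic failover (ha.zookeeper.quorum) is configured separately in core-site.xml.

```xml
<!-- Nameservice "mycluster" and all host names are hypothetical placeholders -->
<property>
  <name>dfs.nameservices</name>
  <value>mycluster</value>
</property>
<property>
  <name>dfs.ha.namenodes.mycluster</name>
  <value>nn1,nn2</value>
</property>
<property>
  <!-- shared edit log hosted on the quorum journal nodes -->
  <name>dfs.namenode.shared.edits.dir</name>
  <value>qjournal://jn1:8485;jn2:8485;jn3:8485/mycluster</value>
</property>
<property>
  <name>dfs.ha.automatic-failover.enabled</name>
  <value>true</value>
</property>
```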
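A representative Sqoop import of the kind described above; the JDBC URL, credentials, table, and Hive target are hypothetical.

```sh
# Hypothetical source database and Hive target; -P prompts for the password.
sqoop import \
  --connect jdbc:mysql://db.example.com/sales \
  --username etl_user -P \
  --table orders \
  --hive-import \
  --hive-table sales.orders \
  --num-mappers 4
```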

Confidential

Documentum Administrator and Developer

Responsibilities:

  • Developed custom Documentum and SharePoint connectors. (Java)
  • Administered and upgraded repositories to 7.2. (Captiva 5.x, 6.x, 7.x, Solaris, Linux, JBoss, WebLogic, Tomcat)
  • Developed integrations between Documentum and Drupal, calling the Documentum DFC and DFS APIs from Drupal PHP code to administer users (a minimal DFC sketch follows this list). (PHP and Java)
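
A minimal Java DFC sketch of the user-administration calls that an integration like the one above would drive; the repository name, credentials, and user attributes are hypothetical placeholders.

```java
import com.documentum.com.DfClientX;
import com.documentum.fc.client.IDfClient;
import com.documentum.fc.client.IDfSession;
import com.documentum.fc.client.IDfSessionManager;
import com.documentum.fc.client.IDfUser;
import com.documentum.fc.common.IDfLoginInfo;

public class CreateUser {
    public static void main(String[] args) throws Exception {
        DfClientX clientx = new DfClientX();
        IDfClient client = clientx.getLocalClient();
        IDfSessionManager mgr = client.newSessionManager();

        IDfLoginInfo login = clientx.getLoginInfo();
        login.setUser("dmadmin");          // hypothetical superuser account
        login.setPassword(args[0]);        // pass the password in; never hard-code it
        mgr.setIdentity("myrepo", login);  // "myrepo" is a placeholder repository name

        IDfSession session = mgr.getSession("myrepo");
        try {
            // dm_user is the standard Documentum user type
            IDfUser user = (IDfUser) session.newObject("dm_user");
            user.setUserName("jdoe");      // hypothetical user
            user.setUserLoginName("jdoe");
            user.save();
        } finally {
            mgr.release(session);
        }
    }
}
```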

Confidential

Hadoop Administrator and Developer

Responsibilities:

  • Installed and configured Hadoop 1.2, Hadoop 2.x, and Hadoop clients
  • Upgraded Hadoop 1.2.1 to Hadoop 2.x
  • Installed and configured Hadoop ecosystem components (Hadoop, MapReduce, YARN, Pig, Hive, Sqoop, Flume, ZooKeeper, and HBase)
  • Imported and exported structured data between relational databases and HDFS/Hive using Sqoop
  • Extensively involved in cluster capacity planning, hardware planning, installation, and performance tuning of the Hadoop cluster
  • Deployed Hadoop in pseudo-distributed (single-node) and fully distributed modes
  • Configured multiple NameNodes and monitored active and standby NameNodes
  • Maintained and administered HDFS through the Hadoop Java API
  • Configured and scheduled MapReduce jobs
  • Configured rack awareness, Hadoop backups, and DataNode whitelisting and blacklisting in a cluster
  • Configured high availability for a major production cluster and designed automatic failover using ZooKeeper and quorum journal nodes.
  • Developed custom Documentum DFC and DFS code for the mass import and export of content and metadata for repositories and for search interfaces (a per-document sketch follows this list). (Java, custom Documentum jobs and methods)
  • Developed integrations between Documentum and Drupal, calling the Documentum DFC and DFS APIs from Drupal PHP code to administer users. (PHP and Java)
  • Designed the architecture for accessing existing content across platforms.
  • Worked with business and development teams on solutions for delivering content via web, mobile, and internal social media.
  • Mentored development teams on best practices for using interactive technologies with content management workflows. (Composer, Alfresco, Drupal)
  • Administered and upgraded repositories to 7.1. (Captiva 5.x, 6.x, Solaris, Linux, JBoss, WebLogic, Tomcat)
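
A sketch of the per-document step inside mass-import code like that described above, with session setup as in the earlier DFC example; the object type, format name, and folder path are hypothetical.

```java
import com.documentum.fc.client.IDfSession;
import com.documentum.fc.client.IDfSysObject;

public class BulkImportStep {
    // Import one file: create a dm_document, attach content, and file it into a folder.
    // The "pdf" format and the "/Imports" folder are hypothetical placeholders.
    static void importOne(IDfSession session, String localPath, String name) throws Exception {
        IDfSysObject doc = (IDfSysObject) session.newObject("dm_document");
        doc.setObjectName(name);
        doc.setContentType("pdf");   // Documentum format name, not a MIME type
        doc.setFile(localPath);      // attaches the local file as primary content
        doc.link("/Imports");        // files the object into the target folder
        doc.save();
    }
}
```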

Confidential

Hadoop Administrator and Developer

Responsibilities:

  • Installed and configured Hadoop 1.2, Hadoop 2.x, and Hadoop clients
  • Upgraded Hadoop 1.2.1 to Hadoop 2.x
  • Installed and configured Hadoop ecosystem components (Hadoop, MapReduce, YARN, Pig, Hive, Sqoop, Flume, ZooKeeper, and HBase)
  • Imported and exported structured data between relational databases and HDFS/Hive using Sqoop
  • Deployed Hadoop in pseudo-distributed (single-node) and fully distributed modes
  • Configured multiple NameNodes and monitored active and standby NameNodes
  • Configured and scheduled MapReduce jobs
  • Developed and administered Documentum D2 applications and servers. (Documentum 5.x, 6.x; Solaris, Linux, JBoss, WebLogic, Tomcat)
  • Developed custom Java DFC and DFS code for the mass import and export of content for repositories.
  • Developed integrations between Documentum and Drupal, calling the Documentum DFC and DFS APIs from Drupal PHP code to administer users. (PHP and Java)
  • Developed custom search services for content (a DQL sketch follows this list). (Java, WDK)
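
A sketch of the DQL query at the core of a custom search service like the one above; the query and attribute list are illustrative, and a production service would sanitize or parameterize the search term rather than concatenate it.

```java
import com.documentum.fc.client.DfQuery;
import com.documentum.fc.client.IDfCollection;
import com.documentum.fc.client.IDfQuery;
import com.documentum.fc.client.IDfSession;

public class ContentSearch {
    // Full-text DQL search over dm_document; the caller supplies an open session.
    static void search(IDfSession session, String term) throws Exception {
        IDfQuery query = new DfQuery();
        query.setDQL("SELECT r_object_id, object_name FROM dm_document "
                + "SEARCH DOCUMENT CONTAINS '" + term + "'"); // illustrative only
        IDfCollection results = query.execute(session, IDfQuery.DF_READ_QUERY);
        try {
            while (results.next()) {
                System.out.println(results.getString("r_object_id") + "  "
                        + results.getString("object_name"));
            }
        } finally {
            results.close(); // always release the collection
        }
    }
}
```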

Confidential

Documentum Administrator and Developer

Responsibilities:

  • Created a custom Documentum WDK web application interface for editors to publish content to the nfl.com website. (Java, WDK, WebPublisher; Captiva 5.x, Solaris, Linux, JBoss, WebLogic, Tomcat)
  • Created custom workflows and workflow code for publishing content. (Java)
  • Created custom Java DFC code for the mass ingestion of multimedia.

Confidential

Documentum Administrator and Developer

Responsibilities:

  • Created a Documentum XML application with a custom Java WDK web application in which each XML section of a bill had a separate workflow handled only by the appropriate user group. (XML, Java, WDK)
  • Administered US Senate internal websites and applications. (Java, WDK, WebPublisher, Solaris, Linux, JBoss, Weblogic, Tomcat)
