
Sr Hadoop Administrator Resume

TECHNICAL SKILLS:

Computer Skills: Languages - C, C++, SQL, PL/SQL, MS SQL, MySQL, HTML, Perl, Python, shell scripting, CSS, JavaScript, and Java; ETL, Hadoop administration, AWS/EMR, Google Cloud, Spark/Scala, Pig, Hive, HBase, Storm, Kafka, Sqoop, Oozie, Flume, Knox, Ranger, HUE

Operating Systems: UNIX (Sun and HP), Linux (Red Hat, CentOS, Ubuntu), Windows, macOS.

Development Software: Informatica, Cognos, Business Objects, Xcelsius, Object Team, Adobe, Oracle Enterprise Manager, Oracle RDBMS, Oracle Forms, Oracle Reports, Oracle Developer, TOAD (for Hadoop), SAP Business Objects Web Intelligence tools, RStudio, Druid, Superset, Tableau, Microsoft Visio, Microsoft Office, and SPSS.

WORK EXPERIENCE:

Confidential

Sr Hadoop Administrator

Responsibilities:

  • Collaborate with Confidential’s Engineering, Data Science, and Customer teams to scope, size, install, and configure components of the open-source Hadoop Big Data platform on AWS/EMR, Confidential, or RHEL environments. Install peripheral tools such as HUE, RStudio, and OpenLDAP.
  • Manage users on the Hadoop platform.
  • Encrypt data at rest and in motion. Serve as an escalation point for the Confidential Managed Services teams that provide ongoing operational support for Confidential’s customers. Collaborate with Confidential’s project lead to develop a project-level scoping plan. Research tools to accommodate customer requirements.
  • Develop test plans for initial Hadoop services component testing. Perform system administration tasks, including using Ambari to install and provision Hadoop clusters, onboard users to the Hadoop cluster, and set up High Availability (HA) for key Hadoop components such as the NameNode, Resource Manager, and any other component identified as critical to the customer or use cases.
  • Familiar with Apache Ranger for user authorization, access control, and auditing on the Hortonworks Data Platform, and with Apache Knox for Hadoop perimeter gateway access.
  • Use of Ansible, Puppet, Chef for automating common install tasks in Linux.
  • Use Linux shell scripting where necessary to automate tasks or to fill gaps where tools cannot perform them. Familiar with streaming components available in the Hadoop ecosystem such as Kafka, Spark, Storm, and Flink.
  • Familiar with cloud computing environments such as AWS and Google Cloud.
  • Collaborate with systems administrators and architects when necessary to perform system designs. Familiar with virtual machine tools such as VMware and VirtualBox.
  • Work with Confidential’s Operations Practice lead to develop and ensure consistent deployment of best practices across all of Confidential’s projects. Design and develop tools to support proactive administration and monitoring of the open-source Hadoop Big Data platform.
  • Support sales efforts by scoping engagements and developing statements of work.
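The onboarding and shell-automation tasks above can be sketched as a small helper. This is a hedged illustration, not a production tool: the `/user/<name>` home-directory layout and the 750 permissions are assumed conventions, and the `hdfs dfs` commands are only printed (dry-run style), not executed.

```shell
#!/bin/sh
# Hypothetical onboarding helper (illustrative only): print, dry-run style,
# the HDFS commands that would provision a home directory for one user.
# The /user/<name> layout and 750 mode are assumptions, not a statement of
# how any particular cluster is configured.
onboard_cmds() {
    user=$1
    printf 'hdfs dfs -mkdir -p /user/%s\n' "$user"
    printf 'hdfs dfs -chown %s:%s /user/%s\n' "$user" "$user" "$user"
    printf 'hdfs dfs -chmod 750 /user/%s\n' "$user"
}

# Example: generate the commands for one new user.
onboard_cmds alice
```

Keeping generation and execution separate is a common admin-script pattern: the output can be reviewed, then piped to `sh` to apply.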

Confidential, Tucson, Arizona

Lead Data Engineer

Responsibilities:

  • Lead Hadoop Administrator for the implementation and maintenance of the Hortonworks Data Platform (HDP 2.1 and HDP 2.3) Hadoop cluster. Perform cluster implementation and ongoing administration of the Hadoop ecosystem using tools such as Ambari, Ganglia, and Nagios. Align with software developers, database modelers, systems architects, data scientists, Linux system admins, and enterprise security teams to meet enterprise requirements, project deadlines, and POC use cases.
  • Perform initial testing of Hadoop services such as Kafka, Hive, Pig, HBase, and Flume. Support all development teams using the Hadoop cluster. Meet deadlines for test requirements and deliverables. Investigate, integrate, and evaluate tools to save development time and increase production efficiency. Perform research on development environments to increase development efficiency.
  • Integrate enterprise Active Directory and LDAP users into the ecosystem. Define, implement, and enforce security policies for users and services using tools such as Ranger, Knox, and Kerberos. Define the HDFS directory structure for development teams, data scientists, and research teams.
  • Perform continuous performance tuning, monitor cluster performance, and plan capacity using tools such as the Resource Manager and JobTracker. Manage and review log files and automate the removal of unnecessary logs using scripts.
  • Fine-tune, optimize, and manage capacity on Hadoop platforms. Plan and manage storage; experienced with MapReduce and Spark. Size NameNode heap memory.
  • Communicate effectively with end users on maintenance, patches, and version upgrades when necessary. Document and distribute the cluster layout for user reference. Prioritize and manage service issues. Main point of contact for vendor escalation on open issues related to cluster maintenance, performance, and upgrades.
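The scripted log-cleanup task above can be sketched with `find`. This is a minimal example under stated assumptions: logs are named `*.log` and anything older than a retention window may be removed. The dry-run `-print` is deliberate; only after verifying the listed files would one swap in `-delete`.

```shell
#!/bin/sh
# Minimal sketch of automated log cleanup (assumptions: *.log naming and an
# age-based retention policy). Lists files older than $2 days under $1;
# replace -print with -delete only after reviewing the output.
purge_old_logs() {
    dir=$1
    days=$2
    find "$dir" -type f -name '*.log' -mtime +"$days" -print
}

# Example (hypothetical path): purge_old_logs /var/log/hadoop 7
```

Run from cron, a dry-run pass like this can be logged first, so the retention policy is auditable before deletion is enabled.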

Sr. Information Systems Technologist

Confidential

Responsibilities:

  • Lead business analyst interfacing with business subject matter experts to gather requirements to implement business intelligence solutions to save time and increase productivity.
  • Interrogate databases and develop scripts to meet business requirements.
  • Interface with developers to create SAP Business Objects end solutions.
  • Familiar with relational database and data warehouse concepts and ETL tools such as Informatica.
  • Develop enterprise business intelligence dashboards, reports, and customized reports for internal business units using Cognos and Business Objects software.
  • Create underlying database designs, such as DMR, OLAP, and ROLAP, using Cognos Framework Manager to support business intelligence reports.
  • Implement and administer Single Sign-On (SSO) on Apache web servers.
  • Create a web-based change request system that gathers information on each request for change and assembles the correct change control board based on the change required and the skill set needed. Schedule, conduct, and present regular tool development update meetings to stakeholders. Assess the risk and management involved with tool development and integration.
  • Create tool and software install notes for reference and tool maintenance documentation.
  • Create and update technical documentation. Create user training manuals and provide training on the tools created. Participate in full life-cycle software development.

Avionics Technician

Confidential, Fort Bragg, NC

Responsibilities:

  • Installed, operated and maintained avionics electronic equipment.
  • Troubleshot, maintained, and repaired UHF, VHF, AM, FM, and intercommunication systems.
  • Troubleshot and isolated equipment malfunctions to the component level.
  • Read and interpreted electronic circuit diagrams and schematics.
