Sr. Hadoop Administrator Resume

St Louis, MO

SUMMARY

  • 8+ years of IT experience, including 4 years in Hadoop administration, with hands-on experience as a Linux system administrator and Informatica administrator.
  • Hands-on experience in the installation, configuration, security, and monitoring of Hadoop clusters on RHEL 6.x.
  • Expertise in configuring Hadoop Security using Kerberos with LDAP/AD.
  • Experience in installing, configuring, and managing Hadoop ecosystem components such as HDFS, YARN, HBase, Oozie, Hive, Impala, Spark, Sqoop, Pig, Flume, Hue, and Ranger.
  • Expertise with the Hortonworks Hadoop platform (HDFS, Hive, Oozie, Sqoop, YARN).
  • Installed, configured, and maintained a Hadoop cluster in a High Availability environment.
  • Expertise in setting up and configuring Backup, Recovery, Replication and Disaster Recovery procedures in the cluster.
  • Set up HDFS directories and permissions for different applications.
  • Experience with Puppet, a standard automation and deployment tool.
  • Experience with Runbook Automation (RBA) workflows.
  • Expertise as an Informatica administrator in setting up new Informatica environments and upgrading existing environments to new versions.
  • Expertise in installing, configuring, and maintaining Informatica PowerCenter (9.x & 10.x).
  • Hands-on experience creating folders, groups, roles, and users in the Admin Console and granting them permissions.
  • Good experience with change management, deploying code from DEV to TEST and PROD using deployment groups in Informatica Repository Manager.
  • Excellent knowledge and experience in the data warehouse development life cycle, requirement analysis, dimensional modeling, repository management, and administration.
  • Knowledge of the Software Development Lifecycle (SDLC), Agile, and the Application Maintenance Change Process (AMCP).
  • Experience in computer system maintenance, systems security, software upgrades, and network administration of servers running Red Hat Linux 5.x/6.x on hardware from HP, Dell, and Cisco.
  • Provided 24x7 technical support and troubleshooting, monitoring CPU, memory, disk, and network performance with tools such as Nagios XI and Cacti.
  • Experience in troubleshooting issues with applications, networks, user hardware, naming services, etc.
  • Provided support to technicians on network configuration, performance tuning, and security hardening.
  • Technical support, testing, troubleshooting, and configuration of naming services such as NIS and DNS on Linux and Solaris.
  • Experience in architecting and setting up data centers and migrating applications.
  • Strong analytical, diagnostic, and troubleshooting skills to consistently deliver productive technological solutions.
  • Implemented OpenSSH-based security for Red Hat, AIX, and Solaris machines.
  • Experience in system performance tuning and configuration.
  • A good team player with strong communication skills and an inquisitive mind.
  • Experience in troubleshooting and supporting complex technical environments while maintaining a professional, business-focused attitude and courteous manner towards clients, partners and peers.
  • Ability to manage and prioritize multiple related projects, including troubleshooting, operations, and maintenance of the enterprise in a team environment
  • Expert proficiency with VMware vCenter and vRealize Automation, including VM creation, modification, and deletion.
  • Experience working with ticketing systems such as RT and Remedy.
  • Knowledge of SAN and NAS data storage technologies.
  • Ability to learn new and different technologies to a working depth quickly
  • Participated in a 24x7 on-call rotation and off-hours production problem resolution.
  • A self-motivated, responsible, and reliable team player with strong technical skills.
  • Excellent analytical, interpersonal, and communication skills.
  • Enthusiastic about learning new technologies.

TECHNICAL SKILLS

Big Data Technologies: Hortonworks HDP 2.5/2.4/2.2, Apache Hadoop, HDFS, Pig, Hive, Sqoop, HBase, Oozie, Kerberos, Spark, Hue, Ranger, ZooKeeper.

RDBMS: Oracle 12c/11g/10g, SQL Server 2012/2008/2005, DB2 10.1/9.7 FP1, Sybase 15.x.

Operating Systems: HP-UX 10.x/11.x, Sun Solaris 2.5/2.6/8/9/10, AIX 5.1/5.2/5.3, Red Hat Linux 6.x/5.x, Windows 98/ME/NT/2000/XP. Cloud: AWS EC2.

Languages: C, C++, Java, XML, SQL, PL/SQL, UNIX Shell Scripting, Python

ETL Tools: Informatica PowerCenter 10.1/9.6.1/9.1/8.x, PowerExchange/PowerConnect, Data Analyst, Metadata Manager, IDQ, Informatica MDM Hub 9.6.1/9.7.

PROFESSIONAL EXPERIENCE

Confidential, St Louis, MO

Sr. Hadoop Administrator

Responsibilities:

  • Installed, configured, upgraded, and applied patches and bug fixes on Prod, Test, and Dev servers.
  • Installed/Configured/Maintained Hadoop clusters in dev/test/UAT/Prod environments.
  • Installed, configured, and administered HDFS, Hive, Ranger, Pig, HBase, Oozie, Sqoop, Spark, and YARN.
  • Worked on the installation and configuration of Hadoop HA Cluster.
  • Involved in capacity planning and design of Hadoop clusters.
  • Set up alerts in Ambari for monitoring the Hadoop clusters.
  • Set up security authentication using Kerberos.
  • Created and dropped users and granted/revoked permissions and Ranger policies as required.
  • Commissioned and decommissioned DataNodes in the cluster (see the decommissioning sketch after this list).
  • Wrote and modified UNIX shell scripts to manage HDP environments.
  • Installed and configured Flume, Hive, Sqoop and Oozie on the Hadoop cluster.
  • Administered, configured, and performance-tuned Spark applications.
  • Created directories and set up appropriate permissions for different applications (illustrated after this list).
  • Backed up HBase tables to HDFS directories using the Export utility (see the export example after this list).
  • Involved in planning and implementation of Hadoop cluster Upgrade.
  • Installation, configuration, and administration of HDP on Red Hat Enterprise Linux 6.6.
  • Used Sqoop to import data into HDFS from an Oracle database (import example after this list).
  • Detailed analysis of system and application architecture components per functional requirements.
  • Reviewed and monitored system and instance resources (database storage, memory, CPU, network usage, and I/O contention) to ensure continuous operations.
  • Provided 24x7 on-call support for production job failures and resolved issues in a timely manner.
  • Developed UNIX scripts for scheduling the delta loads and master loads using the Autosys scheduler.
  • Deep and thorough understanding of ETL tools and how they apply in a Big Data environment.
  • Troubleshot problems with databases, applications, and development tools.
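
A minimal sketch of the DataNode decommissioning flow referenced above, assuming the NameNode's dfs.hosts.exclude property points at /etc/hadoop/conf/dfs.exclude (the path and hostname are illustrative):

# Add the host to the HDFS exclude file
echo "datanode05.example.com" >> /etc/hadoop/conf/dfs.exclude

# Ask the NameNode to re-read its include/exclude lists; the node shows
# "Decommission in progress" until its blocks are re-replicated elsewhere
hdfs dfsadmin -refreshNodes

# Watch until the node reports "Decommissioned", then shut it down
hdfs dfsadmin -report | grep -A 2 datanode05.example.com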
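
Setting up per-application HDFS directories and permissions, as in the bullet above, typically reduces to a few commands; the service account, group, and path here are placeholders:

# Create a landing area owned by the application's service account,
# group-writable but closed to everyone else
hdfs dfs -mkdir -p /data/appname/landing
hdfs dfs -chown -R appsvc:appgrp /data/appname
hdfs dfs -chmod -R 770 /data/appname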
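
The HBase table backups mentioned above can be taken with the stock Export MapReduce job; the table name and target directory are illustrative:

# Dump the 'customers' table to HDFS as sequence files; the companion
# Import job (org.apache.hadoop.hbase.mapreduce.Import) reloads them
hbase org.apache.hadoop.hbase.mapreduce.Export customers /backups/hbase/customers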
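
A hedged example of the Oracle-to-HDFS Sqoop import described above; the JDBC URL, credentials, table, and target path are assumptions:

sqoop import \
  --connect jdbc:oracle:thin:@//oradb.example.com:1521/ORCL \
  --username etl_user -P \
  --table CUSTOMERS \
  --target-dir /data/landing/customers \
  --num-mappers 4 \
  --fields-terminated-by '\t'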

Confidential, Silver Spring, MD

Hadoop Administrator

Responsibilities:

  • Worked on evaluating, architecting, and installing the Hortonworks 2.1/1.8 Big Data ecosystem, which includes Apache Hadoop HDFS, Pig, Hive, and Sqoop.
  • Performed configuration, administration, and monitoring of Hadoop clusters.
  • Managed HDFS directory permissions for applications.
  • Maintained and monitored the Hadoop cluster.
  • Planned and implemented Hadoop upgrades.
  • Responsible for creating new users and groups, setting up home directories, and applying appropriate access restrictions to directories using Ranger.
  • Managed the filesystem, including creating HDFS backups and new file systems (a DistCp sketch follows this list).
  • Detailed analysis of system and application architecture components as per functional requirements.
  • Worked on installation and configuration of Hadoop HA cluster.
  • Responsible for creating repository users and user groups, assigning privileges, and granting users access to repository folders.
  • Responsible for maintaining repository backups and their restorations.
  • Responsible for migrations from Development repository to the QA and Prod repositories.
  • Benchmark-tested hardware and Informatica PowerCenter mappings to calculate batch-processing load times and tuned the mappings.
  • Installed and configured Informatica PowerCenter on client and server machines; configured the Informatica server and registered servers.
  • Installed and configured PowerExchange for Oracle and SQL Server.
  • Implemented Type II Slowly Changing Dimension methodology to preserve the full history of customers.
  • Created complex mappings using Unconnected Lookup, Sorter, Aggregator, Dynamic Lookup, Filter, Update Strategy, and Router transformations.
  • Worked with static and dynamic memory caches for better throughput of sessions containing Rank, Lookup, Joiner, Sorter, and Aggregator transformations.
  • Worked with the DBA to improve Informatica session performance and query performance by collecting statistics and defining relevant indexes on target tables.
  • Created Data Flow Mappings to extract data from source system and Loading to staging area.
  • Analyzed the source data, developed the code to extract it, and implemented it in Informatica.
  • Extensively worked on Filter, Router, Sequence Generator, Lookup, Update Strategy, Joiner, Source Qualifier, Expression, Sorter, and Aggregator transformations.
  • Worked on various tasks such as Session, Email, and Command tasks.
  • Used mapplets, parameters and variables in mappings as per the requirement.
  • Used versioning and shared folders for better control on Informatica jobs among the team and across different projects.
  • Worked on Type 2 slowly changing dimensions.
  • Involved in performance tuning of the process at the mapping, session, source, and target levels.
  • Supported production job failures and resolved issues in a timely manner.
  • Troubleshot production failures and provided root cause analysis.
  • Worked on emergency code fixes to Production.
  • Involved in testing using Informatica Workflow Manager and Workflow Monitor, including extensive review of log files to detect errors.
  • Tested data loads by comparing source and target table counts and verified Informatica loads from database to database.
  • Developed UNIX scripts for scheduling the delta loads and master loads using the Autosys scheduler (see the wrapper sketch after this list).
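
For the HDFS backups noted in the list above, one common approach is a DistCp copy to a second cluster or path; the NameNode URIs are placeholders:

# Incrementally mirror a critical HDFS tree, preserving ownership and
# permissions (-p) and copying only changed files (-update)
hadoop distcp -update -p hdfs://prodnn:8020/data/warehouse hdfs://drnn:8020/backups/warehouse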
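
A minimal sketch of the kind of Autosys-invoked wrapper used for the delta loads above, assuming credentials arrive via environment variables; the service, domain, folder, and workflow names are placeholders:

#!/bin/sh
# Start the delta-load workflow and block until it finishes (-wait);
# pmcmd's exit status reflects the workflow outcome
pmcmd startworkflow \
  -sv INT_SVC_PROD -d DOM_PROD \
  -u "$INFA_USER" -p "$INFA_PWD" \
  -f SALES_DW -wait wf_delta_load_customers
rc=$?
[ $rc -ne 0 ] && echo "wf_delta_load_customers failed rc=$rc" >&2
exit $rc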

Confidential, Hicksville, NY

Linux Systems Administrator

Responsibilities:

  • Following corporate procedures, creating/modifying/closing accounts for Linux/Windows systems using centralized authentication (Active Directory, NIS, LDAP) systems and shadow files.
  • Resolving tickets related to logon issues and performing frequent password resets.
  • Building and configuring servers using PXE boot with kickstart and using HP's Server Automation tool for automated builds.
  • Resolving issues with OS deployments and kickstart failures.
  • Installing, updating, and maintaining software on Linux systems with yum and rpm to maintain a secure environment.
  • Setting up automounts for home directories in different domains.
  • Monitor the availability and health of the infrastructure and applications across multiple data centers using Nagios XI.
  • Responding to alerts generated by Nagios and SolarWinds in a timely fashion.
  • Creating logical volumes on NAS, DAS, and SAN storage, and extending/reducing logical volumes (LVM sketch after this list).
  • Resolving network & DNS related issues.
  • Performance tuning on slow systems.
  • Ensuring backups and restoring data when needed using NetBackup 7.6.
  • Installing the LAMP stack and resolving connectivity issues.
  • Troubleshooting and maintaining TCP/IP, Apache HTTP/S, SMTP, and DNS services.
  • Performing NIC bonding on Linux systems for redundancy (bonding configuration sketch after this list).
  • Working on security-related projects such as vulnerability management and remediation, which substantially improved the security of our computing environment.
  • Building and automating VMware operations on ESXi 5.5 with vCenter, vMotion, and vSphere.
  • Performing P2V conversions and migrating applications to the cloud.
  • Providing level 1, 2, and 3 support on Linux systems.
  • Working closely with other technical groups.
  • Taking ownership of problems/issues/tickets until resolution.
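
The logical volume work called out above follows the standard LVM flow; the volume group, LV name, and sizes are illustrative:

# Carve a 50 GB logical volume from an existing volume group
lvcreate -L 50G -n lv_app01 vg_data
mkfs.ext4 /dev/vg_data/lv_app01

# Later, grow it by 10 GB and resize the ext4 filesystem online
lvextend -L +10G /dev/vg_data/lv_app01
resize2fs /dev/vg_data/lv_app01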
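
A sketch of RHEL 6-style NIC bonding as referenced above; the IP addressing and bonding mode are assumptions:

# /etc/sysconfig/network-scripts/ifcfg-bond0
DEVICE=bond0
IPADDR=10.0.0.15
NETMASK=255.255.255.0
ONBOOT=yes
BOOTPROTO=none
BONDING_OPTS="mode=active-backup miimon=100"

# /etc/sysconfig/network-scripts/ifcfg-eth0 (and likewise ifcfg-eth1)
DEVICE=eth0
MASTER=bond0
SLAVE=yes
ONBOOT=yes
BOOTPROTO=none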

Confidential

Oracle DBA/Developer

Responsibilities:

  • Involved in the analysis, design, build, and testing phase of the life cycle for the project.
  • Developed PL/SQL packages to capture end of day transactions.
  • Implemented the UTL_HTTP Oracle built-in package to acquire data from external websites.
  • Prepared design specification documents.
  • Involved in designing the database, identification of the objects and creating tables.
  • Designed and created database objects such as Tables, Views and Indexes.
  • Tested and corrected errors and refined changes to the database.
  • Implemented database security and managed role-based security.
  • Wrote scripts for loading data using SQL*Loader and Import/Export, and assisted developers in tuning SQL for optimized performance (see the SQL*Loader sketch after this list).
  • Handled day-to-day activities along with schema and space management.
  • Created primary database storage structures (tablespaces, datafiles, tables, indexes, etc.).
  • Planned storage space requirements and placed files for optimum I/O performance.
  • Cloned databases using scripts as well as hot backups.
  • Extensively used explicit cursors in stored procedures (cursor example after this list).
  • Established database backup/recovery strategy for different databases.
  • Applied recovery procedures for complete and incomplete recoveries depending on the type of failure.
  • Managed tablespace growth and monitored datafile additions and extent growth.
  • Constantly monitored database performance, reviewing alert log and trace files.
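
An illustrative version of the SQL*Loader scripts mentioned above; the table, columns, and file names are placeholders, and the password is prompted for rather than embedded:

# Generate a control file and load a comma-delimited extract
cat > customers.ctl <<'EOF'
LOAD DATA
INFILE 'customers.dat'
APPEND INTO TABLE customers
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(cust_id, cust_name, created_dt DATE "YYYY-MM-DD")
EOF

sqlldr userid=scott@ORCL control=customers.ctl log=customers.log bad=customers.bad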
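
A small example of the explicit-cursor pattern used in the stored procedures above, run here through a sqlplus heredoc; the connect string ($ORA_CONN) and the transactions table are assumptions:

sqlplus -s "$ORA_CONN" <<'EOF'
SET SERVEROUTPUT ON
DECLARE
  -- Explicit cursor over today's transactions
  CURSOR c_txn IS
    SELECT txn_id, amount FROM transactions
    WHERE txn_date >= TRUNC(SYSDATE);
  v_txn c_txn%ROWTYPE;
BEGIN
  OPEN c_txn;
  LOOP
    FETCH c_txn INTO v_txn;
    EXIT WHEN c_txn%NOTFOUND;
    DBMS_OUTPUT.PUT_LINE(v_txn.txn_id || ': ' || v_txn.amount);
  END LOOP;
  CLOSE c_txn;
END;
/
EOF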
