
Hadoop Administrator Resume

SUMMARY

  • Highly skilled Hortonworks Data Platform Certified Administrator (HDPCA) with extensive experience designing, deploying, configuring, managing and supporting Hadoop clusters.
  • Proactive, results-driven professional with the skill set to acquire Big Data from diverse sources and develop Big Data solutions that support data analysis across the Hadoop ecosystem.
  • Possess strong analytical, conceptual and problem-solving abilities as well as excellent communication, presentation and organizational skills.

TECHNICAL SKILLS

  • HDFS
  • YARN/MapReduce
  • Pig
  • Hive
  • HBase
  • Sqoop
  • ZooKeeper
  • Knox
  • Ranger
  • Kafka
  • Oozie
  • Flume
  • Hue
  • Spark
  • Python
  • UNIX Shell scripting
  • Cassandra
  • MySQL
  • DB2
  • PostgreSQL
  • Oracle
  • Oracle Linux
  • SQL Server Core
  • Java
  • Windows AD/Kerberos
  • TrueSight
  • BladeLogic
  • Discovery
  • Atrium Orchestrator
  • Atrium Configuration Management Database
  • VMware
  • Windows
  • Linux
  • Active Directory

PROFESSIONAL EXPERIENCE

Hadoop Administrator

Confidential

Responsibilities:

  • Install, build, configure, deploy, manage and support the Apache Hadoop ecosystem; responsible for managing and maintaining clusters in the development, QA and production environments.
  • Provision, install, configure, monitor and maintain the HDFS, YARN, HBase, Flume, Sqoop, Oozie, Pig, Hive, Ranger and Kafka services.
  • Responsible for cluster maintenance, capacity planning, performance tuning, commissioning and decommissioning of DataNodes, cluster monitoring, troubleshooting, and managing and reviewing data backups and Hadoop log files.
  • Handle the data exchange between HDFS and different Web Applications and databases using Flume and Sqoop.
  • Closely monitor and analyze MapReduce job executions on the cluster at the task level.
  • Set up automated processes to analyze system and Hadoop log files for predefined errors and send alerts to the appropriate groups.
  • Analyze and transform data with Hive and Pig, scheduled through Oozie.
  • Involved from the analysis phase through the implementation phase of projects.
  • Apply and tailor best practices in software processes and quality to achieve fast development cycle times.
  • Formulate procedures for the installation of Hadoop patches, updates and version upgrades.
  • Collaborate with multiple teams for design and implementation of big data clusters in cloud environments.
  • Evaluate business requirements and use cases, and prepare detailed specifications that follow project guidelines for scripts that handle data from various data sources.
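The automated log-scan-and-alert process described above can be sketched as a small shell function; the function name, directory path, and error patterns here are illustrative placeholders, not taken from the actual deployment:

```shell
#!/bin/sh
# scan_logs DIR: print an ALERT line if predefined error signatures
# appear in any *.log file under DIR. Patterns are illustrative.
scan_logs() {
    dir="$1"
    patterns='OutOfMemoryError|Connection refused|Corrupt block'
    hits=$(grep -hE "$patterns" "$dir"/*.log 2>/dev/null)
    if [ -n "$hits" ]; then
        # A real job would route this to mail or a paging system,
        # not stdout; printing keeps the sketch self-contained.
        printf 'ALERT: predefined errors found:\n%s\n' "$hits"
    fi
}

# Example: scan a (hypothetical) Hadoop log directory.
scan_logs "${LOG_DIR:-/var/log/hadoop}"
```

In practice a script like this would be driven from cron and pointed at the NameNode/DataNode log directories, with the alert step wired to the appropriate group's mailing list.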

Hadoop Administrator

Confidential

Responsibilities:

  • Installed, configured and maintained the Apache Hadoop ecosystem for application development, including Hadoop tools such as Hive, Pig, HBase, ZooKeeper and Sqoop.
  • Worked on importing and exporting data from Oracle and DB2 into HDFS and HIVE using Sqoop.
  • Worked extensively with Hive and Pig.
  • Implemented HCatalog to make partitions available to Pig and Java MapReduce jobs, and established a remote Hive metastore using MySQL.
  • Executed Hadoop streaming jobs to process terabytes of XML format data.
  • Implemented commissioning and decommissioning of DataNodes, killing unresponsive TaskTrackers and dealing with blacklisted TaskTrackers.
  • Supported MapReduce Programs and distributed applications running on the Hadoop cluster.
  • Scheduled and Managed Oozie Jobs to automate sequence of rotational activity.
  • Performed continuous monitoring and management of the Hadoop cluster through Ambari.
  • Participated in development and execution of system and disaster recovery processes.
  • Automated processes for troubleshooting, resolution and tuning of the Hadoop ecosystem.
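The Oozie job scheduling mentioned above typically revolves around a coordinator definition. The following is a hedged sketch of a daily coordinator; the app name, HDFS path, dates, and schema version are illustrative assumptions, not details from the original engagement:

```xml
<!-- Illustrative Oozie coordinator: runs a workflow once a day.
     Name, path, and date window are placeholder values. -->
<coordinator-app name="daily-rotation" frequency="${coord:days(1)}"
                 start="2015-01-01T00:00Z" end="2016-01-01T00:00Z"
                 timezone="UTC" xmlns="uri:oozie:coordinator:0.4">
  <action>
    <workflow>
      <app-path>hdfs://namenode:8020/apps/rotation-workflow</app-path>
    </workflow>
  </action>
</coordinator-app>
```

A coordinator like this is submitted with `oozie job -config job.properties -run`, and Oozie then materializes one workflow instance per day over the start/end window.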

Hadoop Administrator

Confidential

Responsibilities:

  • Installed and configured the Hadoop ecosystem; responsible for maintaining the cluster and for managing and reviewing Hadoop log files.
  • Used Sqoop to import and export data between HDFS and RDBMS.
  • Involved in creating Hive tables, loading them with data and writing Hive queries that run internally as MapReduce jobs.
  • Implemented HCatalog to make partitions available to Pig and Java MapReduce jobs, and established a remote Hive metastore using MySQL.
  • Analyzed and wrote Hadoop MapReduce jobs using the Java API, Pig and Hive.
  • Handled the imports and exports of data onto HDFS using Flume and Sqoop.
  • Implemented the Fair Scheduler on the JobTracker to share cluster resources among the users' MapReduce jobs.
  • Well-versed in creating new Big Data collection systems that optimize data management, capturing, delivery and quality using Hadoop stack (HDFS, Sqoop, Flume, HBase, Hive, Pig, YARN, Spark).
  • Performed commissioning, decommissioning, balancing and management of nodes, and tuned servers for optimal cluster performance.
  • Involved in the complete software development lifecycle, including design, development, testing and implementation of moderately to highly complex systems.
  • Involved in analyzing system failures, identifying root causes and recommending courses of action; documented system processes and procedures for future reference.
  • Held regular discussions with other technical teams regarding upgrades, process changes, special processing and feedback.
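For the Fair Scheduler work described above, resource sharing on a Hadoop 1.x JobTracker is driven by an allocations file. The pool names and limits below are illustrative assumptions, not values from the actual cluster:

```xml
<!-- Illustrative fair-scheduler allocations file (Hadoop 1.x MR1).
     Pool names and limits are placeholder values. -->
<allocations>
  <pool name="analytics">
    <minMaps>10</minMaps>
    <minReduces>5</minReduces>
    <weight>2.0</weight>
  </pool>
  <pool name="etl">
    <minMaps>20</minMaps>
    <minReduces>10</minReduces>
    <maxRunningJobs>5</maxRunningJobs>
  </pool>
  <userMaxJobsDefault>5</userMaxJobsDefault>
</allocations>
```

The JobTracker re-reads this file at runtime, so pool guarantees and weights can be adjusted without restarting the cluster.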

BMC Enterprise Tools Engineer

Confidential

Responsibilities:

  • Designed, built and deployed BMC Middleware Automation (BMA) version 8.8 to facilitate the automation of J2EE Web Applications and configurations.
  • Deployed WebSphere, Apache Tomcat, WebLogic, JBoss application ear files and application configurations.
  • Created snapshots and snapshots with filters, managed WebSphere Application Configuration files and server profiles, compared differences between snapshots, rolled back configurations as applicable, defined token keys/values, and imported/deployed token sets to server profiles.
  • Post-install tasks included downloading the applicable application client libraries, defining the REST API and Health & Value Dashboard connection URLs, and defining the location of the application server installation directories.
  • Tokenized configurations with unique values and tokens conforming to applicable systems.
  • Migrated previous versions of WebSphere, WebLogic and JBoss to current versions.
  • Created and configured WebSphere Portal servers in BMA.
  • Upgraded BMA version 8.7 to version 8.8.
  • Downloaded and updated client Application libraries as required.

BMC Enterprise Tools Engineer

Confidential

Responsibilities:

  • Administered and managed BMC TrueSight Middleware Transaction Monitoring (BMTM) solution to facilitate transaction monitoring of DataPower and MQ Queue Manager devices.
  • Created and edited BMTM Event Templates, Actions, Triggers and BEM generic scripts to facilitate Event Management processing.
  • Updated, created and applied History Rules and Templates to MQ Queue Managers and DataPower devices.
  • Integrated BMTM with AD/LDAP and BPPM.
  • Periodically updated the BMTM BEM generic script with required Event Triggers associated to their Event categories.
  • Upgraded BMTM as required.
  • Patched and applied hotfixes, fix packs and patches as required.
  • Utilized the BMTM Configuration Manager to create and edit MQ Queue Managers, queues, channels, topics, subscriptions, processes, listeners and services.
  • Provided MQ Queue Manager transaction metric reports periodically to customers.

BMC Enterprise Tools Engineer

Confidential

Responsibilities:

  • Designed, built and deployed the BMC Atrium Discovery and Dependency Mapping (ADDM) version 10 solution to initiate network device, system and application discovery.
  • ADDM duties included providing periodic asset discovery scan reports to Management and Security.
  • Generated custom ADDM Tideway Pattern Language (TPL) patterns as required.
  • Provisioned Collaborative Application Mapping in ADDM to map the interdependencies and connections among business applications, databases and services.
  • Uploaded and updated ADDM Technical Knowledge Updates (TKUs) as required.
  • Set up periodic synchronization to the Atrium CMDB to keep Configuration Items (CIs) in sync.
  • Managed, supported and used Atrium Integrator to build ETL jobs for the CMDB.
  • Created and configured accurate service modeling, normalization and reconciliation in the Atrium CMDB based on discovered CIs.
  • Published Business Service models to BPPM and associated the CI monitors with the Service Models in BPPM to facilitate probable root-cause analysis of impacted services.
  • Installed, configured, deployed and managed the BMC BladeLogic Server Automation and Report Servers to centrally automate system configuration, software and file deployment, provisioning, patch management, auditing, reporting and compliance.
  • Setup and configured SQL Database for BSA.
  • Also configured, deployed and managed the BladeLogic File server and Oracle DB server.
  • BSA duties included deploying BladeLogic RSCD agents to target nodes as applicable, deploying software jobs to target servers from BSA, conducting periodic patching of Windows and Linux servers, creating and applying inventory and security compliance templates, and providing compliance, patch and asset inventory reports to Management and customers.
