
Hadoop Administrator Resume


DC

SUMMARY:

  • 12+ years of IT experience.
  • 5+ years of Hadoop Administrator experience.
  • 7+ years of software engineering experience.
  • As a Hadoop Administrator, set up vendor Hadoop distributions such as Cloudera CDH and Hortonworks HDP.
  • Experienced in working with several databases, including Oracle and MySQL.
  • Experienced in identifying customer Big Data problems and mapping Hadoop solutions to them.
  • Experience with Hadoop architecture and ecosystem components such as MapReduce, YARN, Spark, Hive, Kafka, Flume, and Pig, along with cluster monitoring using Grafana.
  • Experience implementing Hadoop on cloud (AWS).
  • Experienced in Cluster Maintenance and Monitoring as part of Administration.
  • Excellent shell scripting ability.
  • Experience in managing and reviewing Hadoop log files.
  • Experience in importing and exporting data using Sqoop from HDFS to Relational Database Systems and vice-versa.
  • Extensive experience in performance tuning.
  • Implemented security for Hadoop Cluster using Apache Ranger and Kerberos for various customers.
  • Excellent communication and interpersonal skills.
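As a small illustration of the log-review and shell-scripting bullets above, the sketch below tallies log levels in a Hadoop-style daemon log. The file path and sample entries are hypothetical stand-ins, not output from an actual cluster.

```shell
#!/bin/sh
# Sketch: summarize log levels in a Hadoop-style daemon log.
# The path and the sample entries below are hypothetical stand-ins.
LOG=/tmp/sample-namenode.log
cat > "$LOG" <<'EOF'
2018-01-05 10:01:12,345 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Rolling edit logs
2018-01-05 10:01:13,001 WARN org.apache.hadoop.hdfs.server.namenode.NameNode: Low remaining capacity
2018-01-05 10:01:14,120 ERROR org.apache.hadoop.hdfs.server.namenode.NameNode: Failed to roll edit log
EOF

# Field 3 holds the log level; count occurrences of each.
awk '{ counts[$3]++ } END { for (l in counts) print l, counts[l] }' "$LOG" | sort
```

In practice the same one-liner is pointed at real daemon logs (e.g. under the distribution's log directory) to spot error spikes quickly.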

TECHNICAL SKILLS:

Big Data Ecosystems: Hadoop, MapReduce, HDFS, HBase, Zookeeper, Hive, Pig, Sqoop, Oozie, Flume, Apache Kafka, Apache Spark

Programming Languages: Java, C/C++

Scripting Languages: Shell scripting

Databases: Oracle, MySQL, HBase, MongoDB

Tools: Eclipse, Pentaho, Informatica

Platforms: Windows, Linux/Unix

Operating Systems: Ubuntu, CentOS

Application Servers: Apache Tomcat

Special Software: Hadoop, Cloudera CDH, Hortonworks HDP, Eclipse IDE, RHadoop

PROFESSIONAL EXPERIENCE:

Confidential, DC

Hadoop Administrator

Responsibilities:

  • Maintained a Hortonworks cluster on HDP stack 2.4.2 managed by Ambari 2.2.2.
  • Migrated the client from an on-premises cluster to Amazon Web Services (AWS): built Production and QA clusters on AWS with the then-latest Hortonworks distribution, HDP stack 2.6.1 managed by Ambari 2.5.1.
  • The Production and QA AWS clusters are both 8-node clusters.
  • Performed patch upgrades on the cluster for various services.
  • Provided immediate support to users for various Hadoop-related issues.
  • Handled user management: creating users, granting permissions on tables and databases, and assigning group permissions.
  • All clusters have High Availability (HA) enabled.
  • Performed HDP stack upgrades.
  • Disabled the save option in Hive Views.
  • Enabled Ranger security on all clusters.
  • Working closely with the Development team: providing support, fine-tuning the cluster for various use cases, and resolving day-to-day issues with service health.
  • Working with the Development team to optimize Hive queries using bucketing, partitioning, and join strategies.
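The bucketing and partitioning work above can be sketched in HiveQL. The table and column names below are hypothetical, and the bucket count and settings are illustrative, not taken from the actual project.

```sql
-- Hypothetical table illustrating a partition + bucket layout.
CREATE TABLE clicks (
  user_id BIGINT,
  url     STRING
)
PARTITIONED BY (dt STRING)              -- partition pruning skips whole directories
CLUSTERED BY (user_id) INTO 32 BUCKETS  -- enables bucketed map-side joins
STORED AS ORC;

-- If users is bucketed the same way on user_id, a join on that key
-- can avoid a full shuffle (with hive.optimize.bucketmapjoin=true).
SELECT c.url, u.name
FROM clicks c JOIN users u ON c.user_id = u.user_id
WHERE c.dt = '2018-01-05';
```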

Environment: Hortonworks HDP 2.6.0/2.6.2, RHEL, Ambari, Hue, JIRA

Confidential, Dallas

Hadoop Administrator

Responsibilities:

  • Designing and implementing Hadoop architecture.
  • Designing disaster recovery models for big data environments.
  • Implementing Hadoop Security on Hortonworks Cluster (Kerberos, Two-way SSL, Knox, Ranger)
  • Creating automation scripts for installing Hadoop with Ambari Blueprints.
  • Designing and implementing High Availability in the Hadoop environment.
  • Performance tuning and optimization.
  • Installing and configuring Hadoop ecosystem components (HDFS, YARN, MapReduce, Spark, Kafka, HBase, Hive, Oozie, ZooKeeper, etc.).
  • Running benchmark tests, analyzing system bottlenecks, and preparing solutions to eliminate them.
  • Active Directory integration for Hadoop.
  • Implementing Hadoop cluster on top of Virtualized environments.
  • Capacity planning and screening of Hadoop job performance.
  • Performance improvement in Hive queries.
  • Hadoop infrastructure administration.
  • Designing and implementing new technologies.
  • Document Installation steps, use cases, solutions and recommendations.
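The Ambari Blueprints automation mentioned above typically revolves around a JSON blueprint such as the minimal sketch below. The blueprint name, host-group names, and component layout here are illustrative, not the actual project's configuration.

```json
{
  "Blueprints": {
    "blueprint_name": "hdp-sketch",
    "stack_name": "HDP",
    "stack_version": "2.6"
  },
  "host_groups": [
    {
      "name": "master",
      "components": [ { "name": "NAMENODE" }, { "name": "RESOURCEMANAGER" } ],
      "cardinality": "1"
    },
    {
      "name": "worker",
      "components": [ { "name": "DATANODE" }, { "name": "NODEMANAGER" } ],
      "cardinality": "3"
    }
  ]
}
```

A blueprint like this is registered through Ambari's REST API (POST to /api/v1/blueprints/&lt;name&gt;) and then paired with a cluster-creation template that maps actual hosts to the host groups.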

Environment: Linux (Ubuntu), Shell scripting, Pig, Hive, HBase, Hadoop, MongoDB, CDH 4.5 with Cloudera Manager 4.8

Confidential, Washington DC

Hadoop Administrator

Responsibilities:

  • Performed the role of Hadoop Admin on the TCS Big Data team, installing and configuring different Hadoop distributions on clusters (Cloudera, Apache, Pivotal, IBM BigInsights, Hortonworks), writing automation scripts, and carrying out common maintenance activities.
  • Implementing Hadoop High Availability
  • Capacity planning and screening of Hadoop job performance.
  • Active Directory integration for Hadoop.
  • Installing and configuring Hadoop ecosystem components (HDFS, YARN, MapReduce, Spark, Kafka, HBase, Hive, Oozie, ZooKeeper, etc.).
  • Kerberos and Apache Knox security implementation.
  • Hadoop performance tuning based on system resource utilization.
  • Running benchmark tests with different Hadoop configuration parameter values to arrive at an optimized system configuration.
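The benchmarking described above is commonly done with the stock Hadoop benchmark jobs. The commands below are a sketch to be run on a cluster node; jar file names and paths vary by distribution, and the sizes are illustrative.

```shell
# TestDFSIO: raw HDFS write throughput (10 files of ~1 GB each).
hadoop jar hadoop-mapreduce-client-jobclient-tests.jar TestDFSIO \
    -write -nrFiles 10 -fileSize 1000

# TeraGen/TeraSort: end-to-end MapReduce sort benchmark (~10 GB here).
hadoop jar hadoop-mapreduce-examples.jar teragen 100000000 /bench/tera-in
hadoop jar hadoop-mapreduce-examples.jar terasort /bench/tera-in /bench/tera-out
```

Repeating such runs while varying one configuration parameter at a time (e.g. container memory, number of reducers) is what isolates an optimized setting.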

Environment: Linux (Ubuntu), Shell scripting, Pig, Hive, HBase, Hadoop, MongoDB, CDH 4.5 with Cloudera Manager 4.8

Confidential

Developer/Project Lead

Responsibilities:

  • Lead the PeopleSoft HCM team.
  • First-level contact for SaaS applications such as Taleo, SuccessFactors, VEMO, and SumTotal.
  • Developed and supported SQR interfaces between PeopleSoft and SaaS applications.
  • Worked directly with clients in managing their requests, understanding their requirements and translating functional requirements to technical specifications through hands-on analysis and coordinating with the onsite and offshore team for development.
  • Created architecture diagrams, functional design documents, technical design documents, impact analysis documents and production review check lists.
  • Performed unit testing and added/ran test cases in HP Quality Center.
  • End to end Implementation of custom Hierarchy and Workflow process for IMF.
  • Created / supported multiple application engine programs.
  • Created multiple PeopleSoft custom pages with complex functionalities written using PeopleCode.
  • Part of Implementation team for the proposed Pension Portal Design.
  • Closely worked with the .NET team who were building the front-end pages for pension portal.
  • Worked on Integration Broker, Application Messaging, created WSDL for PeopleSoft pages and exposed them to be consumed by the .NET front end.
  • End to end Implementation of custom Voluntary Savings Plan (VSP).
  • Created several SQR processes to transfer data between IMF and WELLS FARGO (401K vendor).
  • Worked on Medicare Part B changes in IMF.
  • End to end Implementation of custom Request for Personal Action system in IMF.
  • Migrated PeopleSoft projects on PHIRE.

Environment: Application-PeopleSoft HCM 9.0; Tools- PeopleTools 8.48; Operating System-Windows XP, Windows 2012 Server; Database- Oracle 11g; Migration Tool- PHIRE

Confidential

Senior Software Developer

Responsibilities:

  • Conducted requirement-gathering sessions for custom requests and created functional design documents.
  • Created architecture diagrams, functional design documents, technical design documents, impact analysis documents and production review check lists.
  • Worked on enhancement requests made through the Maestro tool.
  • Created new SQRs and added them to the batch server.
  • Provided 24/7 production support.
  • Proposed best practices to improve performance of pages.
  • Support and development of application engine programs.

Environment: Application-PeopleSoft HCM 9.0; Operating System-Windows XP; Database- Oracle 10g; Special Software- Maestro

Confidential

Senior Software Developer

Responsibilities:

  • Created architecture diagrams, functional design documents, technical design documents, impact analysis documents and production review check lists.
  • Created and modified PeopleSoft pages.
  • Part of the upgrade team for PeopleSoft upgrade from version 8.8 to 8.9.
  • Assist with maintenance and troubleshooting of scheduled processes.

Environment: Application-PeopleSoft HCM 8.9; Operating System-Windows XP; Database- Oracle 9i; Special Software- Maestro

Confidential

Senior Software Developer

Responsibilities:

  • Conducted requirement-gathering sessions for custom requests and created functional design documents.
  • Created architecture diagrams, functional design documents, technical design documents, impact analysis documents and production review check lists.
  • Worked on enhancement requests made through the Maestro tool.
  • Created new SQRs and added them to the batch server.
  • Provided 24/7 production support.
  • Proposed best practices to improve performance of pages.
  • Support and development of application engine programs.

Environment: Hardware-IBM PC Pentium IV; Operating System-Windows XP, Windows 2003 Server; Languages- PeopleCode, SQR, Application Engine; Special Software- UltraEdit

Confidential

PeopleSoft Technical Analyst

Responsibilities:

  • Conducted requirement-gathering sessions for custom requests and created functional design documents.
  • Created architecture diagrams, functional design documents, technical design documents, impact analysis documents and production review check lists.
  • Worked on enhancement requests.
  • Created new SQRs based on customer needs.
  • Support and development of application engine programs.

Environment: Hardware-IBM PC Pentium IV; Operating System-Windows XP, Windows 2003 Server; Languages- PeopleCode, SQR, Application Engine; Special Software- UltraEdit

Confidential

Systems Developer

Responsibilities:

  • Monitor PeopleSoft nightly jobs.
  • Administer PeopleSoft server and database.
  • Perform database refresh.
  • Prepare technical documentation for any process followed.

Confidential

Support Engineer

Responsibilities:

  • Provide first level technical support for Linksys home products.
  • Configuring routers, print servers, wireless cameras etc.
  • Training users about the various products of Linksys.
  • Documenting new issues to the knowledge base.
