
Hadoop Administrator Resume


Pennsylvania

SUMMARY:

  • 9+ years of overall IT experience as a developer, designer, and database administrator, with cross-platform integration experience using Hadoop and PeopleSoft.
  • Hadoop ecosystem experience with Hortonworks HDP and HDF, Cloudbreak, HDFS, MapReduce, Hive (including Hive LLAP), Pig, Spark, Kafka, ZooKeeper, YARN, Oozie, Storm, NiFi, Ranger, Flume, HBase, Knox, Zeppelin, and Ambari administration. Strong knowledge of Linux systems (RHEL/CentOS), and good working knowledge of VMware and VirtualBox.

PROFESSIONAL EXPERIENCE:

Confidential

Hadoop Administrator

Responsibilities:
  • Responsible for implementation and ongoing administration of Hadoop infrastructure.
  • Aligning with the systems engineering team to propose and deploy new hardware and software environments required for Hadoop, and to expand existing environments.
  • Working with data delivery teams to set up new Hadoop users. This includes setting up Linux users, setting up Kerberos principals, and testing HDFS, Hive, Pig, and MapReduce access for the new users.
  • Cluster maintenance, including the creation and removal of nodes using cluster management tools.
  • Performance tuning of Hadoop clusters and Hadoop MapReduce routines.
  • Screening Hadoop cluster job performance and capacity planning.
  • Monitoring Hadoop cluster connectivity and security.
  • Manage and review Hadoop log files.
  • File system management and monitoring.
  • HDFS support and maintenance.
  • Diligently teaming with the infrastructure, network, database, application and business intelligence teams to guarantee high data quality and availability.
  • Collaborating with application teams to install operating system and Hadoop updates, patches, and version upgrades when required.
  • Point of contact for vendor escalations.
  • Troubleshooting issues with Hive, HBase, Pig, and Spark/Scala scripts to isolate and fix infrastructure-related problems.

Skills:
  • General operational expertise - good Linux troubleshooting skills, understanding of system’s capacity, bottlenecks, basics of memory, CPU, OS, storage, and networks.
  • Hadoop skills like HBase, Hive, Pig, Sqoop, Spark, Zookeeper, Oozie, Flume, Green Plum, Tez, Kafka etc.
  • Able to deploy a Hadoop cluster, add and remove nodes, keep track of jobs, monitor critical parts of the cluster, configure NameNode high availability, schedule and configure jobs, and take backups.
  • Familiarity with open source configuration management and deployment tools such as Puppet or Chef, and with Linux scripting.
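The new-user onboarding described above can be sketched as a short runbook. This is a minimal sketch, assuming MIT Kerberos with `kadmin.local` available on the KDC host; the realm `EXAMPLE.COM`, the user `jdoe`, and the keytab path are hypothetical placeholders.

```shell
# Minimal sketch: onboarding a new Hadoop user on a Kerberized cluster.
# Realm, username, and keytab location below are hypothetical.
USER=jdoe
REALM=EXAMPLE.COM
KEYTAB=/etc/security/keytabs/${USER}.keytab

# 1. Create the Linux account on the gateway/edge node.
sudo useradd -m "$USER"

# 2. Create a Kerberos principal and export a keytab for it.
sudo kadmin.local -q "addprinc -randkey ${USER}@${REALM}"
sudo kadmin.local -q "ktadd -k ${KEYTAB} ${USER}@${REALM}"

# 3. Create the user's HDFS home directory as the hdfs superuser.
sudo -u hdfs hdfs dfs -mkdir -p "/user/${USER}"
sudo -u hdfs hdfs dfs -chown "${USER}:${USER}" "/user/${USER}"

# 4. Smoke-test access as the new user.
kinit -kt "${KEYTAB}" "${USER}@${REALM}"
hdfs dfs -ls "/user/${USER}"      # HDFS access
hive -e "SHOW DATABASES;"         # Hive access
```

On a Ranger-secured cluster, the matching authorization policies for HDFS and Hive would also need to be in place before the smoke tests pass.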

Confidential, Pennsylvania

Hadoop Admin

Responsibilities:
  • Experience using Cloudera Manager for installation and management of single-node and multi-node Hadoop clusters.
  • Monitored job performance, file system/disk space, cluster and database connectivity, log files, and backup/security management, and troubleshot various user issues.
  • Experience in HDFS data storage and support for running MapReduce jobs.
  • Commissioned and decommissioned nodes on a running Hadoop cluster.
  • Hadoop cluster performance monitoring and tuning, and disk space management.
  • Expertise in designing and developing a distributed processing system feeding a data warehousing platform for reporting.
  • Extensive experience in Big Data Analytics.
  • Experience with Cloudera CDH3, CDH4 and CDH5 distributions.
  • Extensive experience with big data query tools like Pig Latin and HiveQL.
  • Experience with SequenceFile, Avro, and HAR file formats, and with compression.
  • Experience in tuning and troubleshooting performance issues in Hadoop cluster.
  • Worked on Apache Kafka, which aims to provide a unified, high-throughput, low-latency platform for handling real-time data feeds.
  • Experience on monitoring, performance tuning, SLA, scaling and security in Big Data systems.
  • Hands-on NoSQL database experience with HBase and MongoDB.
  • Extensive experience in Data Ingestion, In-Stream data processing, Batch Analytics and Data Persistence strategy.
  • Explored Spark for improving the performance and optimizing existing algorithms in Hadoop, using SparkContext, Spark SQL, DataFrames, pair RDDs, and Spark on YARN.
  • Prepared, arranged and tested Splunk search strings and operational strings.
  • Experience with bulk-load tools such as DW Loader, and with moving data from PDW to the Hadoop archive.
  • Experience in collecting logs from a log collector into HDFS using Flume.
  • Experience in setting up and managing the batch scheduler Oozie.
  • Good understanding of NoSQL databases such as HBase and MongoDB.
  • Experience in analyzing data in HDFS through MapReduce, Hive and Pig.
  • Experience in integrating various data sources like Oracle, DB2, Sybase, SQL Server, and MS Access, and non-relational sources like flat files, into a staging area.
  • Experience in data analysis, data cleansing (scrubbing), data validation and verification, data conversion, data migrations, and data mining.
  • Strong ability to handle multiple priorities and workloads, and to understand and adapt to new technologies and environments quickly.
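The node commissioning/decommissioning work listed above follows the standard HDFS procedure. The sketch below assumes `dfs.hosts.exclude` in `hdfs-site.xml` points at `/etc/hadoop/conf/dfs.exclude`; the worker hostname and paths are hypothetical placeholders.

```shell
# Minimal sketch: decommissioning a DataNode from a running cluster.
# The exclude-file path and hostname are hypothetical placeholders.
echo "worker-node-07.example.com" | sudo tee -a /etc/hadoop/conf/dfs.exclude

# Ask the NameNode to re-read its include/exclude lists; the node moves to
# "Decommission In Progress" while its blocks are re-replicated elsewhere.
sudo -u hdfs hdfs dfsadmin -refreshNodes

# Watch until the node is reported as "Decommissioned", then stop its DataNode.
sudo -u hdfs hdfs dfsadmin -report | grep -A 2 "worker-node-07"

# Commissioning is the reverse: remove the host from the exclude file, run
# -refreshNodes again, start the DataNode, and rebalance block placement.
sudo -u hdfs hdfs balancer -threshold 10
```

In practice the same steps are usually driven through Cloudera Manager or Ambari; the CLI form is shown to make the underlying mechanism explicit.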

Confidential

PeopleSoft/Linux Administrator

Responsibilities:
  • Worked as PS Admin/DBA at the client location.
  • Worked on PeopleSoft environment upgrade.
  • Provided support and solutions for PeopleSoft and database-related queries.
  • Worked daily on service requests (SRs) such as migrations, creation of new environments, re-pointing and renaming of databases or environments, applying images, and deploying bug fixes to environments.
  • Extensively participated in administering the PSADMIN tool and in setting up and correcting the configuration settings of Tuxedo, Process Scheduler, Web Server, Jolt, and App Engine.
  • Created PeopleSoft environment on various blank servers.
  • Migrated multiple applications into single portal through PeopleSoft Interaction Hub.
  • Configured and troubleshot the Process Scheduler for batch process programs.
  • Installed and configured PUM and Change Assistant.
  • Installed SSL certificates.
  • Implemented SSO to access multiple modules through a single login.
  • Implemented a load balancer.
  • Configured SES across multiple environments such as HCM, FIN, and CRM.
  • Installed and configured PUM servers to deploy images and bug fixes on various environments.
  • Installed, configured, and maintained MS SQL Server 2012.
  • Moved PeopleSoft objects using Application Designer and ran compare reports to check for changed and unchanged objects.
  • Used STAT to maintain migration versions.
  • Updated databases by applying updates/fixes/patches after backups on a timely basis, and kept records of the applied updates/fixes/patches.
  • Cloned production databases to build development databases as requested by developers.
  • Ran SYSAudit and DDDAudit reports for auditing purposes, and removed orphaned records when found.
  • Monitored backup jobs and database sizes, and archived older database copies.
  • Configured PSEMAgent and PSEMHub while applying images through PUM.
  • Worked on installation and configuration of Oracle SES for PeopleSoft HRMS.
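The production-to-development cloning described above can be sketched with `sqlcmd` against MS SQL Server. Server, database, file, and share names below are hypothetical placeholders, and the logical file names in the `MOVE` clauses would need to match the source database's actual file layout.

```shell
# Minimal sketch: cloning a production PeopleSoft database to a dev copy.
# All server/database/path names are hypothetical placeholders.

# 1. Back up production to a shared location.
sqlcmd -S PRODSQL01 -Q "BACKUP DATABASE HCMPRD TO DISK = N'\\\\backupshare\\HCMPRD.bak' WITH INIT, COMPRESSION"

# 2. Restore it under the development name, relocating the data/log files.
sqlcmd -S DEVSQL01 -Q "RESTORE DATABASE HCMDEV FROM DISK = N'\\\\backupshare\\HCMPRD.bak' WITH REPLACE, MOVE 'HCMPRD_Data' TO N'D:\\Data\\HCMDEV.mdf', MOVE 'HCMPRD_Log' TO N'D:\\Data\\HCMDEV.ldf'"

# 3. Re-point the PeopleSoft database name so the app servers can connect.
sqlcmd -S DEVSQL01 -d HCMDEV -Q "UPDATE PSDBOWNER SET DBNAME = 'HCMDEV' WHERE DBNAME = 'HCMPRD'"
```

After the restore, access IDs/passwords and environment-specific settings (URLs, report nodes, integration gateways) would also be scrubbed as part of the refresh.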

Environment: Windows XP, Windows 7, PeopleSoft 9.2, MS SQL Server 2012, PeopleTools 8.53, PUM, Application Designer, Data Mover.

Confidential

PeopleSoft Admin/DBA

Responsibilities:
  • Worked as a PeopleSoft Admin for HRMS application.
  • Worked on Installation and configuration of PeopleSoft Application Servers, Web servers and Process Scheduler.
  • Extensively participated in administering the PSADMIN tool and in setting up and correcting the configuration settings of Tuxedo, Process Scheduler, Web Server, Jolt, and App Engine.
  • Created HRMS 9.1 Demo, 8.8 Demo.
  • Configured and troubleshot the Process Scheduler for batch process programs.
  • Identified, applied, and tested patches/bundles.
  • Installed, Configured and Maintained DB2.
  • Installed and moved PeopleSoft objects using Application Designer and ran compare reports to check for changed and unchanged objects.
  • Configured PeopleBooks and F1 help.
  • Updated databases by applying updates/fixes/patches after backups on a timely basis, and kept records of the applied updates/fixes/patches.
  • Ran SYSAudit and DDDAudit reports for auditing purposes, and removed orphaned records when found.
  • Monitored backup jobs and database sizes, and archived older database copies.
  • Configured PSEMAgent and PSEMHub while applying images through PUM.
  • Worked on Installation and configuration of Oracle SES for PeopleSoft HRMS.

Environment: Windows XP/2003, Windows 7, AIX 5.3, PeopleSoft 9.1, DB2, Application Designer, Data Mover, Change Assistant.

Confidential

Team Member/PeopleSoft Admin

Responsibilities:
  • Worked as a PeopleSoft admin and handled PeopleSoft HRMS & Finance.
  • Set up and configured PeopleSoft Application Servers, Web servers, and Process Scheduler.
  • Extensively participated in administering the PSADMIN tool and in setting up and correcting the configuration settings of Tuxedo, Process Scheduler, Web Server, Jolt, and App Engine.
  • Configured and troubleshot the Process Scheduler for batch process programs.
  • Identified, applied, and tested patches/bundles.
  • Installed, Configured and Maintained Oracle 11g.
  • Installed and moved PeopleSoft objects using Application Designer and ran compare reports to check for changed and unchanged objects.
  • Configured PeopleBooks and F1 help.
  • Updated databases by applying updates/fixes/patches after backups on a timely basis, and kept records of the applied updates/fixes/patches.
  • Cloned databases when required for developers' use.
  • Refreshed the Build/Test/Dev environments from the production database as per user requests.
  • Involved in Crystal Reports modifications, installer modifications, and build support.

Environment: UNIX, Oracle 10g, SQR, PeopleSoft Application Designer.
