Hadoop Administrator Resume

SUMMARY:

  • IT professional with 14+ years of industry experience across Big Data, Linux administration, DB2, and SQL Server
  • Hadoop Administrator/Developer with 4+ years of experience
  • Expertise in Big Data technologies such as Hadoop (HDFS, MapReduce), Pig, Hive, and Spark
  • Expertise working in the Hadoop ecosystem, comprising Apache Hadoop, Pig, Hive, HBase, ZooKeeper, Sqoop, Oozie, and Java
  • Expertise in database administration of DB2, IMS, and SQL Server
  • Expertise in Linux (RHEL) administration
  • Experience working with IDEs such as Eclipse 3.5
  • Expertise in creating databases manually and through tools in different environments
  • Expertise in backup and recovery strategies of critical databases
  • Expertise in performance tuning of DBs such as DB2
  • Expertise working on distributions such as Hortonworks and Cloudera
  • Experience working with MDM applications on DB2 UDB
  • Experience with UNIX shell scripts and PERL scripts.
  • Expertise in performance monitoring and tuning for OLAP workloads of MDM applications on DB2
  • Experience in Data modeling for Data Warehouse/Data Mart development, Data Analysis for Online Transaction Processing (OLTP) and Data Warehousing (OLAP)/Business Intelligence (BI) applications.
  • Exposure to the overall SDLC, including requirements gathering, development, testing, debugging, deployment, documentation, and production support.
  • Involved in various projects related to Data Modeling, System/Data Analysis, Design and Development for both OLTP and Data warehousing environments.
  • Facilitated data requirement meetings with business and technical stakeholders and resolved conflicts to drive decisions.
  • Comprehensive knowledge and experience in process improvement, normalization/de-normalization, data extraction, data cleansing, and data manipulation.
  • Expertise in RFI and RFP work.
  • Good leadership qualities and people management skills.
  • Good communication skills.
  • Experience working with clients across different geographies.

TECHNICAL SKILLS:

Big Data Technologies: Apache Hadoop, MapReduce, HDFS, Pig, Hive, HBase, ZooKeeper, Sqoop, Flume, Oozie, Spark

Platforms: RHEL 6, Unix, Windows 95, 98, 2000, XP

DBMS: DB2, SQL Server

IDE: Eclipse 3.5

Languages: Perl Scripting, REXX, JCL.

PROFESSIONAL EXPERIENCE:

Confidential

Hadoop Administrator

Responsibilities:

  • Adding and deleting user spaces as required
  • Managing groups, users and permissions
  • Migrating data from one Hadoop cluster to another using distcp (the distcp copy and a Sqoop import are sketched after this list)
  • Managing Hadoop cluster capacity by adding nodes
  • Working on data analysis in HDFS using MapReduce, Hive and Pig scripts/jobs
  • Working on writing MapReduce jobs
  • Working on developing Pig and Hive scripts
  • Providing day to day technical support to developers and testers
  • Working on developing Sqoop scripts for sourcing data from other DBMSs into HDFS
  • Monitoring and supporting Hadoop clusters using Cloudera Manager and Hortonworks tooling
  • Working on installing and setting up Hive, HBase, and Pig on Hadoop nodes
  • Analyzing business procedures and problems to refine data
  • Studying existing data programming systems to improve workflow
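
Illustrative only: a minimal shell sketch of the cluster-to-cluster copy and RDBMS sourcing described above. The NameNode hosts, database, credentials, and paths are hypothetical placeholders, not the actual environment.

    #!/bin/sh
    # Copy a dataset between clusters with distcp; -update copies only
    # changed files, and -skipcrccheck speeds up re-runs.
    hadoop distcp -update -skipcrccheck \
        hdfs://src-nn:8020/data/events \
        hdfs://dst-nn:8020/data/events

    # Pull a DB2 table into HDFS with Sqoop (connection details are assumed).
    sqoop import \
        --connect jdbc:db2://db2host:50000/SALESDB \
        --username loaduser --password-file /user/hadoop/.db2pass \
        --table CUSTOMER \
        --target-dir /data/staging/customer \
        --num-mappers 4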

Environment: Apache Hadoop, Pig, Hive, MapReduce, HDFS, Sqoop, Spark, RHEL, UNIX, Java

Confidential

Hadoop Administrator/Developer

Responsibilities:

  • Working with and managing Hadoop and its ecosystem
  • Developing and maintaining MapReduce jobs
  • Developing and maintaining Pig and Hive scripts
  • Using crontab to automate and schedule the scripts (a cron-plus-Pig sketch follows this list)
  • Writing Pig scripts to run ETL jobs on HDFS data
  • Extracting data from DB2 using Sqoop
  • Analyzing data using HiveQL and Pig scripts
  • Performing Hadoop cluster administration on Cloudera and Hortonworks distributions
  • Adding and deleting user spaces as required
  • Worked on porting/offloading mainframe z/OS workloads onto Hadoop (HDFS)/Hive using Veristorm's vStorm and zDoop, yielding significant MIPS cost savings for mainframe applications
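
A hedged sketch of the cron-driven Sqoop/Pig ETL flow described above; the schedule, script names, connection string, and paths are assumptions for illustration.

    # Hypothetical crontab entry: run the nightly ETL wrapper at 01:30.
    # 30 1 * * * /opt/etl/run_daily_etl.sh >> /var/log/etl/daily.log 2>&1

    #!/bin/sh
    # run_daily_etl.sh -- land yesterday's DB2 rows in HDFS, then clean them with Pig.
    DAY=$(date -d yesterday +%Y-%m-%d)

    # Incremental extract from DB2 via Sqoop, restricted to one day of data.
    sqoop import \
        --connect jdbc:db2://db2host:50000/ORDERSDB \
        --username etluser --password-file /user/etl/.db2pass \
        --table ORDERS \
        --where "ORDER_DATE = '${DAY}'" \
        --target-dir /data/raw/orders/${DAY}

    # Transform the newly landed partition with the Pig ETL script.
    pig -param day=${DAY} /opt/etl/clean_orders.pig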

Environment: Hadoop, Pig, Hive, Sqoop, Linux, DB2, Windows

Confidential

Database Admin

Responsibilities:

  • Creating and modifying DB2 databases and database objects
  • Maintaining and populating Test/Prod Databases
  • Utility support including Copy, Reorg, and Load jobs (routine utility and recovery commands are sketched after this list)
  • Monitoring, trouble-shooting and resolution for DB issues
  • Doing capacity planning for DB environments
  • Recovering DB objects as and when required
  • Monitoring and fixing DB performance issues
  • Implementing ITIL processes to manage DB environments for DB2 and SQL Server
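
A minimal sketch of the utility and recovery work listed above, assuming a DB2 LUW instance administered from the command line processor on RHEL; the database, schema, and timestamp values are placeholders.

    # Offline backup of a test database.
    db2 backup db TESTDB to /db2/backups compress

    # Reorganize a heavily updated table, then refresh its statistics.
    db2 connect to TESTDB
    db2 "reorg table APPSCHEMA.ORDERS"
    db2 "runstats on table APPSCHEMA.ORDERS with distribution and detailed indexes all"
    db2 connect reset

    # Restore from the backup image and roll forward when recovery is required.
    db2 restore db TESTDB from /db2/backups taken at 20130615103000
    db2 "rollforward db TESTDB to end of logs and stop"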

Environment: Mainframe z/OS, RHEL 6, JCL

DBMS: DB2, SQL Server

Confidential

Linux Administrator

Responsibilities:

  • Worked as a System Administrator on Linux & Unix platforms.
  • Administered Linux Red Hat servers - RHEL 3/4/5/6.
  • Monitored and tuned systems to ensure optimum performance.
  • Gained good experience in LVM, user/system resource management, and job scheduling.
  • Performed system administration tasks such as adding users, creating file systems, and configuring volumes.
  • Added Oracle ASM disks to servers.
  • Created users with limited and full root privileges.
  • Created and managed sudoers entries (the user, sudo, and LVM tasks are sketched after this list).
  • Developed Linux shell scripts to automate repetitive tasks.
  • Resolved network issues using tools such as ping, tcptraceroute, traceroute, and tcpdump.
  • Installed and configured services such as DHCP, NFS, DNS, Apache Web Server, SSH, and FTP/SFTP.
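
A brief sketch of the user, sudo, and LVM tasks above; the account name, volume group, and sizes are illustrative, not actual values.

    # Create a user with a home directory and add it to the wheel group.
    useradd -m -s /bin/bash appadmin
    usermod -aG wheel appadmin

    # Grant one limited root privilege via a sudoers drop-in file.
    echo 'appadmin ALL=(root) NOPASSWD: /sbin/service httpd restart' > /etc/sudoers.d/appadmin
    chmod 0440 /etc/sudoers.d/appadmin

    # Carve a new logical volume out of a fresh disk and mount it.
    pvcreate /dev/sdb1
    vgcreate vg_data /dev/sdb1
    lvcreate -L 20G -n lv_app vg_data
    mkfs.ext4 /dev/vg_data/lv_app
    mount /dev/vg_data/lv_app /app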

Confidential

Database Admin

Responsibilities:

  • Creating and modifying DB2 databases and database objects
  • Maintaining and populating Test/Prod Databases
  • Storage allocations utilizing SMS and Stogroups
  • Utility support including Copy, Reorg, Load and Modify jobs
  • Monitoring, trouble-shooting and resolution for DB2 issues
  • Designing SMS policies for DB2 environment
  • Creating IMS database objects such as DBDs, PSBs, and ACBs
  • Using DBRC to perform Backup/Recovery of IMS objects
  • Doing capacity planning for DB2 environment
  • Recovering DB2 objects as and when required
  • Monitoring and fixing DB2 performance issues
  • Implementing ITIL processes to manage DB2 mainframe environments

Environment: Mainframe z/OS, JCL, Linux, AIX

DBMS: DB2, IMS DB, DB2 UDB

Tools: BMC Catalog Manager, Change Manager, Endevor, SMART, Task Tracker

Confidential, New Jersey, NJ

Data Modeler/DBA

Responsibilities:

  • Conducted stakeholder interviews with SMEs, business analysts, and other stakeholders to determine the requirements.
  • Translated the business requirements into workable functional and non-functional requirements at detailed production level using Workflow Diagrams, Sequence Diagrams, Activity Diagrams and Use Case Modeling.
  • Identified and compiled common business terms for the new applications that became a central part of communication among project stakeholders, avoiding ambiguity
  • Worked at conceptual/logical/physical data model level using Erwin according to requirements.
  • Involved in exhaustive documentation for technical phase of the project and training materials for all data management functions
  • Used Reverse Engineering approach to redefine entities, relationships and attributes in the data model as per new specifications in Erwin after analyzing the database systems currently in use
  • Enforced referential integrity in the OLTP data model for consistent relationship between tables and efficient database design
  • Created and modified databases and database objects
  • Maintained and populated Test/Prod databases
  • Reviewed DB2 access paths (see the EXPLAIN sketch after this list)
  • Ran IBM and BMC DB2 utilities for Test/Prod databases
  • Worked on converting BMC reorg utilities to IBM reorg utilities
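
The access-path review mentioned above can be illustrated with DB2's EXPLAIN facility; this sketch assumes a DB2 LUW command-line session with the explain tables already created, and the database name and query are placeholders.

    # Populate the explain tables for a candidate query.
    db2 connect to PRODDB
    db2 "explain plan for SELECT C.NAME, O.TOTAL FROM APP.CUSTOMER C JOIN APP.ORDERS O ON O.CUST_ID = C.ID WHERE O.ORDER_DATE = CURRENT DATE"

    # Format the most recent explain output to review the chosen access path.
    db2exfmt -d PRODDB -1 -o access_path.txt
    db2 connect reset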

Platforms: Mainframe z/OS, JCL, Linux, AIX

RDBMS: DB2 z/OS, Oracle, SQL Server, DB2 UDB

Environment: Erwin, BMC Catalog Manager, Change Manager, C.A Insight, Endevor.
