
Hadoop Developer Resume


Irvine, CA

SUMMARY

  • 8+ years of experience in Information Technology, including analysis, design, development and testing of complex applications.
  • Extensive experience in Java Server side programming, Client/Server technology and Web based software development using J2EE and XML.
  • Experience working with Open source communities.
  • Provided technical direction to development teams.
  • Experience in writing SQL scripts, including stored procedures, functions, packages, indexes and query optimization.
  • 2+ years of strong working experience with Big Data and the Hadoop ecosystem.
  • Strong experience with Hadoop components: Hive, Pig, HBase, Zookeeper, Sqoop and Flume.
  • Excellent understanding/knowledge of Hadoop architecture and its components, including HDFS, JobTracker, TaskTracker, NameNode, DataNode, MapReduce and YARN.
  • Worked with Hive/HQL to query data from Hive tables in HDFS.
  • Working Experience in writing Map Reduce Programs in Java.
  • Managed Apache Hadoop jobs using Oozie Workflow manager.
  • Good understanding of Cassandra and MongoDB implementations.
  • Experience in importing and exporting data between HDFS and relational database systems using Sqoop.
  • Experience in using Flume to load log data from multiple sources directly into HDFS.
  • Experience in configuring Zookeeper to coordinate servers in clusters and maintain data consistency.
  • Experience in designing both time driven and data driven automated workflows using Oozie.
  • Good understanding of writing Python scripts.
  • Experience in supporting data analysts in running Pig and Hive queries.
  • Used Pig to analyze large data sets as a part of data consolidation process.
  • Experienced in using Flume to transfer log data files to Hadoop Distributed File System (HDFS)
  • Experience with multiple relational databases, such as Oracle 10g, and the NoSQL database HBase.
  • Extensive exposure to RDBMS architecture, modelling, design, development, loading and migration, including Oracle and SQL Server, with excellent experience in SQL.
  • Developed many comprehensive requirements and design documents, and conducted many design reviews and other meetings.
  • Ability to learn and master new technologies, deliver outputs in short deadlines and having good programming skills.
  • Excellent with coordinating, mentoring and exchanging information with colleagues.
  • Always committed to team success. Capable of assuming responsibility and working independently as required.
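The Sqoop import/export experience listed above typically looks like the following commands; the connection string, table names and HDFS paths here are illustrative placeholders, not details from the resume:

```sh
# Hypothetical example: import an Oracle table into HDFS as tab-separated text.
sqoop import \
  --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
  --username etl_user -P \
  --table BILLING_EVENTS \
  --target-dir /data/raw/billing_events \
  --fields-terminated-by '\t' \
  --num-mappers 4

# Export aggregated results back to the relational database.
sqoop export \
  --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
  --username etl_user -P \
  --table BILLING_SUMMARY \
  --export-dir /data/out/billing_summary \
  --input-fields-terminated-by '\t'
```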

TECHNICAL SKILLS

Big Data Technologies: Hadoop, HDFS, Map Reduce, Pig, Hive, Oozie, Sqoop, Zookeeper, Flume, Amazon EMR, Cloudera, MapR

NOSQL: HBase

Languages: Core Java, C, C++, Python

Web Skills: JavaScript, CSS, jQuery

RDBMS Databases: Oracle 10g, SQL Server, SQLite, MySQL

Tools & Utilities: Eclipse, Toad, SQL Navigator, VMware Workstation

Environment: Linux, Windows

Version Control Tool: SVN, IBM Synergy, CVS

Scripting Languages: UNIX Shell Scripting, Python

Scheduling Tool: Cron, $Universe

PROFESSIONAL EXPERIENCE

Confidential, Irvine, CA

Hadoop Developer

Responsibilities:

  • Involved in architecture design, development and implementation of Incentives Optimization Engine project.
  • Wrote data processing jobs to launch and monitor processing-intensive computations on real-time clusters.
  • Developed a new HBase purger and a custom HBase exporter.
  • Developed generic Solr query APIs using SolrJ.
  • Created Oozie workflows and coordinators to schedule MapReduce and Hive jobs.
  • Wrote various Hive queries to extract data and generate visualizations using Tableau.
  • Wrote Pig UDFs and scripts to redact information in historical data.
  • Processed HDFS data and created external tables using Hive, in order to analyze spikes, faults and customer experience.
  • Developed multiple MapReduce jobs for data cleansing & preprocessing huge volumes of data.
  • Improved control and visibility of query jobs.
  • Designed and developed a framework for provisioning data to multiple consumers.
  • Provided support for deploying code to multiple environments.
  • Interacted with the client for requirement gathering, analysis and modularization of requirements.
  • Used Hadoop Streaming to write jobs in Python.
  • Performed requirements study, software development specification, development and unit testing using MRUnit.
  • Fine-tuned and stabilized the Hadoop platform to allow real-time streaming and batch-style big data applications to run smoothly with optimal cluster utilization.
  • Performed upgrades on Cloudera Clusters.
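A Hadoop Streaming job like the Python work noted above pairs a stdin/stdout mapper and reducer; this is a minimal word-count-style mapper sketch, with all file and job names invented for illustration:

```python
#!/usr/bin/env python
"""Minimal Hadoop Streaming mapper sketch: emits one (word, 1) pair per token.

On a cluster it would be launched with something like:
  hadoop jar hadoop-streaming.jar -mapper mapper.py -reducer reducer.py ...
(the jar name and reducer are illustrative assumptions).
"""
import sys


def map_line(line):
    """Yield (token, 1) pairs for each whitespace-separated token."""
    for token in line.strip().split():
        yield token, 1


if __name__ == "__main__":
    # Streaming feeds input records on stdin and expects
    # key<TAB>value pairs on stdout.
    for line in sys.stdin:
        for key, count in map_line(line):
            sys.stdout.write("%s\t%d\n" % (key, count))
```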

Environment: Cloudera CDH 5.0.3, MapR 1.0, HBase 0.98

Confidential, East Hartford, CT

Java/Hadoop Developer

Responsibilities:

  • Analyzed business requirements and existing software for High Level Design.
  • Involved in the complete lifecycle of multiple Hadoop implementation projects, specializing in writing MapReduce programs and Hive queries (HQL) and using Flume to pull log files.
  • Extensively used Hive queries to search Hive tables in HDFS per business requirements.
  • Scheduled, monitored and debugged various MapReduce nightly jobs using Oozie Workflow.
  • Involved in analysis, specification, design, implementation and testing phases of the Software Development Life Cycle (SDLC) and developed the project using Rational Rose.
  • Installed and Configured Hadoop Cluster.
  • Developed MapReduce programs in Java to search production logs and consolidate billing events per business requirements.
  • Monitored Nightly jobs to export data out of HDFS to be stored offsite as part of HDFS backup.
  • Used ANT scripts to build the application.
  • Prepared test cases and performed unit testing using JUnit.
  • Used Log4J for logging and tracing messages.
  • Used CVS for version control across common source code.
  • Responsible for coding MapReduce programs and Hive queries, testing, debugging, peer code review, troubleshooting and maintaining status reports.
  • Involved in identifying possible ways to improve the efficiency of the system
  • Conduct technical/business evaluations of business area requirements and recommend appropriate solutions to clients
  • Provide in-depth technical and business knowledge to ensure efficient design, programming, implementation and on-going support for the application
  • Interacted with the business and design teams and prepared low-level and high-level design documents.
  • Wrote an Oozie workflow to copy the fsimage into HDFS.
  • Wrote an Oozie coordinator workflow to run MR jobs when the fsimage file becomes available in HDFS.
  • Wrote an Oozie workflow to move MR output files to an RDBMS using Sqoop.
  • Wrote an MR job using the MapReduce API to process the fsimage file extracted via the Offline Image Viewer.
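The fsimage-triggered coordinator described above generally takes a shape like this Oozie XML; the dataset name, paths and dates are illustrative assumptions, not values from the project:

```xml
<!-- Illustrative Oozie coordinator: run the workflow once a day,
     but only after the day's fsimage dataset instance lands in HDFS. -->
<coordinator-app name="fsimage-coord" frequency="${coord:days(1)}"
                 start="2014-01-01T00:00Z" end="2015-01-01T00:00Z"
                 timezone="UTC" xmlns="uri:oozie:coordinator:0.4">
  <datasets>
    <dataset name="fsimage" frequency="${coord:days(1)}"
             initial-instance="2014-01-01T00:00Z" timezone="UTC">
      <uri-template>hdfs:///data/fsimage/${YEAR}${MONTH}${DAY}</uri-template>
      <done-flag>_SUCCESS</done-flag>
    </dataset>
  </datasets>
  <input-events>
    <data-in name="input" dataset="fsimage">
      <instance>${coord:current(0)}</instance>
    </data-in>
  </input-events>
  <action>
    <workflow>
      <app-path>hdfs:///apps/fsimage-mr-workflow</app-path>
    </workflow>
  </action>
</coordinator-app>
```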

Environment: MapR Cluster, Java, Linux

Confidential, Seattle

Java Developer

Responsibilities:

  • Involved in design, analysis, development, testing and deployment activities.
  • Coding using Core Java, C++ and HTML.
  • Oracle used as the relational backend.
  • Involved in Code Review.
  • Releasing the code for deployment.
  • Developer on the TISS team, a traffic data collection and statistical process analysis module developed in C++/STL on Solaris.
  • Played a technical role in the design and prototyping of performance enhancements in TISS server.
  • Implemented a C++ interface layer between TISS and LTDM from scratch to integrate TISS with LTDM on UNIX.
  • Replaced the third-party tool Syncsort, which performed sorting and pre-load aggregation of data elements and reformatted the output, with the in-house VZSort, making effective use of tree data structures.
  • Developed new UNIX shell scripts and MAKE files.
  • Designed and developed test plans/test cases for the quality analysis team; also actively participated in writing developer unit test cases.
  • Involved in quality analysis and testing of the software product across various software cycles.
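The sort-plus-pre-load-aggregation idea behind the VZSort replacement above can be sketched briefly; the original was C++ with tree structures, so this Python version, with invented function names, is only an illustration of the technique:

```python
from collections import defaultdict


def sort_and_aggregate(records):
    """Aggregate values per key, then emit rows in sorted key order.

    `records` is an iterable of (key, value) pairs. Aggregating before
    the final sort shrinks the data that has to be ordered, which is the
    point of doing pre-load aggregation inside the sort step. The C++
    original reportedly kept keys ordered in a tree as they arrived;
    sorting the aggregated keys afterwards yields the same final order.
    """
    totals = defaultdict(int)
    for key, value in records:
        totals[key] += value
    return [(key, totals[key]) for key in sorted(totals)]
```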

Environment: C++, Java, SQL, Oracle 10g, TOAD, Eclipse, Python, UNIX Shell Script, Linux.

Confidential

Java Developer

Responsibilities:

  • Involved in the Development, enhancement, QA Interaction and PIT for SystematICS releases.
  • Gathering & Understanding the functional specification from the functional document.
  • Involved in functional requirement and design walkthroughs with the Business Analyst.
  • Participated in design and implementation plan reviews with MasterCard to understand new business requirements.
  • Developed front end validations using JavaScript and developed design and layouts of JSPs.
  • Coding using Core Java, C++ in Linux environment.
  • Oracle used as the relational backend.
  • Designed and developed several SQL scripts, stored procedures and triggers for the Oracle 10g database.
  • Involved in writing Unix Shell Script.
  • Implementation/Promotion of code using Linux production environment.
  • Attended implementation walkthrough meetings with the client to understand the impact of the new application on existing production processing and to discuss concerns and issues raised during the review phase.
  • Worked with onsite and offshore development teams for application development and delivery.
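The stored-procedure and trigger work mentioned above can be sketched with a minimal Oracle trigger; the table and column names are invented for illustration:

```sql
-- Illustrative Oracle trigger (table and columns are hypothetical):
-- keep an audit timestamp current on every row update.
CREATE OR REPLACE TRIGGER trg_orders_audit
  BEFORE UPDATE ON orders
  FOR EACH ROW
BEGIN
  :NEW.last_modified := SYSDATE;
END;
/
```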

Environment: Ubuntu 10.4, C++, Java, JavaScript, CSS, Eclipse, Oracle 10g, Unix Shell Script

Confidential

Java Developer

Responsibilities:

  • Coding using Java and JavaScript for new enhancements and development support.
  • Reported, tracked and fixed the defects using defect tracking tool.
  • Involved in basic performance testing using shell scripts.
  • Developed and enhanced test scripts to validate data in the IRIS database and the front-end dashboard.
  • Involved in creating various database configuration scripts using Oracle and SQL Server.
  • Coding using C++, Java and UNIX shell script.
  • Analyzed and understood requirements from the BA team; prepared technical design documents and UTPs.
  • Releasing the code for build.

Environment: Java, C++, JavaScript, CSS, Eclipse, HP-UX, Oracle 9i, Unix Shell Script
