
Hadoop Developer Resume


Menomonee Falls, WI

SUMMARY

  • Overall 7 years and 2 months of work experience as a software developer.
  • Hadoop Developer with 3 years of experience designing and implementing complete end-to-end Hadoop infrastructure using MapReduce, Pig, Hive, and HBase.
  • Java programmer with 4.2 years of extensive experience developing web-based applications and client-server systems using Java and J2EE.
  • Good knowledge of Hadoop architecture and its components, such as HDFS, YARN, NameNode, DataNode, and MapReduce.
  • Experience developing MapReduce programs on Hadoop for processing Big Data.
  • Experience analyzing data using HiveQL, Pig Latin, and custom MapReduce programs in Java (a minimal MapReduce sketch follows this list).
  • Experience importing and exporting data between relational database systems and HDFS using Sqoop.
  • Experience collecting and aggregating large amounts of log data using Apache Flume and storing it in HDFS for further analysis.
  • Experience with job/workflow scheduling and monitoring tools; designed both time-driven and data-driven automated workflows using Oozie.
  • Worked through the complete software development life cycle (analysis, design, development, testing, implementation, and support) using Agile methodologies.
  • Experience with Hadoop clusters using major Hadoop distributions: Cloudera (CDH3, CDH4).
  • Experienced in using integrated development environments such as Eclipse and NetBeans.
  • Good knowledge of Oracle, MySQL, PostgreSQL, HAWQ, and HBase.
  • Migrated data from various databases (e.g., Oracle, DB2, MySQL, MongoDB) to Hadoop.
  • Prior experience working as a software developer in Java/J2EE and related technologies.
  • Experience designing and coding web applications using Core Java and J2EE technologies.
  • Excellent knowledge of Java and SQL in application development and deployment.
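
For illustration, below is a minimal sketch of the kind of custom Java MapReduce program referenced above, using the Hadoop 2.x mapreduce API; the class names and the word-count logic are illustrative placeholders, not code from any of the listed projects.

    import java.io.IOException;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    // Counts occurrences of each word across the input files.
    public class WordCount {

      public static class TokenMapper
          extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
          // Emit (word, 1) for every whitespace-separated token in the line.
          for (String token : value.toString().split("\\s+")) {
            if (!token.isEmpty()) {
              word.set(token);
              context.write(word, ONE);
            }
          }
        }
      }

      public static class SumReducer
          extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values,
            Context context) throws IOException, InterruptedException {
          int sum = 0;
          for (IntWritable v : values) {
            sum += v.get();
          }
          context.write(key, new IntWritable(sum));
        }
      }

      public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "word count");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenMapper.class);
        job.setCombinerClass(SumReducer.class); // safe: counting is associative
        job.setReducerClass(SumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
      }
    }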

Areas of Strength

  • Extensive experience in MapReduce, Pig and Hive.
  • Experience with Sqoop and Flume.
  • Good experience with Oozie and ZooKeeper.
  • Good experience with the Cloudera Hadoop distribution.
  • Experience with Oracle, MySQL, HBase, and MongoDB.
  • Extensive experience in JSPs, Servlets, and JDBC.

TECHNICAL SKILLS

Big Data Technologies: Apache Hadoop, YARN, Pig, Hive, Sqoop, Flume, ZooKeeper, Oozie

BI Tools: Tableau, JMP

Languages: Java, C/C++, Unix Shell Scripting

Operating System: Windows, UNIX, Linux

Web Technologies: HTML, JSP, CSS, JavaScript

Web/App Servers: Apache Tomcat, GlassFish, JBoss

IDEs: Eclipse, NetBeans

Web Services: SOAP and RESTful

Build Tools: Maven, Ant

Databases: Oracle, MySQL, PostgreSQL, HBase (NoSQL), HAWQ, Hive (HQL), MongoDB, SQL Server

Methodologies: Agile, Waterfall

Version Control: SVN and IBM Rational ClearCase

PROFESSIONAL EXPERIENCE

Confidential, Menomonee Falls, WI

Hadoop Developer

Responsibilities:

  • Involved in the data mapping and data ingestion phases of the project.
  • Extracted data stored in flat files into SQL Server, then used Sqoop to load the data into Hive.
  • Designed an incremental data-loading strategy for Hive (a JDBC-driven sketch follows this list).
  • Automated the incremental data-loading process for Hive tables.
  • Involved in designing a customer-match model that aggregates customer information.
  • Created Pig and MapReduce jobs for initial data processing.
  • Wrote MapReduce and Pig scripts for loading custom input formats.
  • Created Hive tables, loaded data, and wrote Hive queries that invoke and run MapReduce jobs in the backend.
  • Optimized Pig scripts for efficiency.
  • Wrote Hive queries to read HBase tables for data validation.
  • Used a custom scheduler for scheduling tasks.
  • Used ZooKeeper to manage the HBase cluster.
  • Generated statistical reports for business analysts for further analysis.
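
For illustration, a minimal sketch of how such an incremental Hive load can be driven from Java over JDBC; the connection URL, table names, and high-water-mark column are hypothetical, and the project's actual queries and scheduling would differ.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    // Appends only rows newer than the last load's high-water mark to a
    // Hive target table. All names and the watermark value are illustrative.
    public class IncrementalHiveLoad {
      public static void main(String[] args) throws Exception {
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        try (Connection conn = DriverManager.getConnection(
                "jdbc:hive2://hive-host:10000/default", "hive", "");
             Statement stmt = conn.createStatement()) {
          // In practice the watermark would come from a control table or be
          // passed in by the scheduler rather than hard-coded here.
          stmt.execute(
              "INSERT INTO TABLE customer_target "
            + "SELECT * FROM customer_staging "
            + "WHERE last_modified > '2015-01-01 00:00:00'");
        }
      }
    }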

Environment: Hadoop 2.x, MapReduce, Hive, Pig, Sqoop, Java 7, XML, SQL Server, JUnit, SVN, custom scheduler, Cloudera Hadoop Distribution.

Confidential, Cary, NC

Hadoop Developer

Responsibilities:

  • Used Amazon Elastic MapReduce (Amazon EMR).
  • Involved in the Raw and Clean phases of the project.
  • Wrote Pig scripts for extensive data cleansing in the Raw phase.
  • Created the required Hive tables in the Clean phase.
  • Wrote incremental Hive scripts over the updated data.
  • Used the self-describing Avro format (the schema travels with the data) as storage when writing Pig scripts.
  • Wrote automated scripts to create Hive tables from the .avsc schema files (a Java sketch of this idea follows this list).
  • Used the Airflow scheduler to automate jobs.
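
As a sketch of the avsc-driven table creation mentioned above, the snippet below uses the Avro Schema API from Java to emit Hive DDL; the simple primitive-type mapping is an assumption, and complex Avro types (records, unions, arrays) would need extra handling.

    import java.io.File;
    import org.apache.avro.Schema;

    // Reads an Avro schema (.avsc) and prints a matching Hive CREATE TABLE
    // statement. Only a few primitive types are mapped here.
    public class AvscToHiveDdl {

      private static String hiveType(Schema.Type avroType) {
        switch (avroType) {
          case STRING:  return "STRING";
          case INT:     return "INT";
          case LONG:    return "BIGINT";
          case FLOAT:   return "FLOAT";
          case DOUBLE:  return "DOUBLE";
          case BOOLEAN: return "BOOLEAN";
          default:      return "STRING"; // simplistic fallback
        }
      }

      public static void main(String[] args) throws Exception {
        Schema schema = new Schema.Parser().parse(new File(args[0]));
        StringBuilder ddl = new StringBuilder(
            "CREATE TABLE " + schema.getName() + " (");
        boolean first = true;
        for (Schema.Field field : schema.getFields()) {
          if (!first) ddl.append(", ");
          ddl.append(field.name()).append(" ")
             .append(hiveType(field.schema().getType()));
          first = false;
        }
        ddl.append(")");
        System.out.println(ddl);
      }
    }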

Environment: Hadoop 1.x/2.x, HDFS, MapReduce, Hive (0.7.1), Impala, Pig (0.8.1), Sqoop (v1), HBase, Oozie (v2.3.2), Shell Scripting, Linux, Core Java, Oracle 11g, Cloudera Hadoop Distribution CDH4, SQL*Plus, Toad 9.6.

Confidential, Webster, NY

Hadoop Developer

Responsibilities:

  • Used the Airflow scheduler to automate jobs.
  • Developed simple to complex MapReduce jobs using Hive and Pig.
  • Optimized MapReduce jobs to use HDFS efficiently through various compression mechanisms.
  • Handled importing data from various sources, performed transformations using Hive and MapReduce, loaded data into HDFS, and extracted data from Oracle into HDFS using Sqoop.
  • Analyzed the data with Hive queries and Pig scripts to study customer behavior.
  • Used UDFs to implement business logic in Hadoop (a minimal UDF sketch follows this list).
  • Installed and configured Hive, Pig, Sqoop, and Oozie on the Hadoop cluster.
  • Installed the Oozie workflow engine to run multiple Hive and Pig jobs.
  • Exported the analyzed data to relational databases using Sqoop for visualization and to generate reports for the BI team.
  • Developed Pig Latin scripts for data cleansing and transformation.
  • Monitored system health and logs, responding to any warning or failure conditions.
  • Created and maintained technical documentation for launching Hadoop clusters and for executing Hive queries and Pig scripts.
  • Implemented partitioning, dynamic partitions, and buckets in Hive for efficient data access.
  • Extracted data from databases such as SQL Server and Oracle 10g into HDFS using Sqoop connectors, for processing with Pig and Hive.
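
A minimal sketch of a Hive UDF of the kind described above, written against the classic org.apache.hadoop.hive.ql.exec.UDF base class used in that Hive generation; the phone-number rule is a stand-in for the actual business logic.

    import org.apache.hadoop.hive.ql.exec.UDF;
    import org.apache.hadoop.io.Text;

    // Example Hive UDF: strips a free-form phone number down to digits.
    // The real business rule is project-specific; this one is illustrative.
    public final class NormalizePhone extends UDF {
      public Text evaluate(Text input) {
        if (input == null) {
          return null;
        }
        return new Text(input.toString().replaceAll("[^0-9]", ""));
      }
    }

Once packaged into a jar, a UDF like this would be registered in Hive with ADD JAR and CREATE TEMPORARY FUNCTION before being called from queries.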

Environment: Hadoop, MapReduce, HDFS, Hive, Oracle 11g/10g, HBase, Oozie, Java (JDK 1.6), UNIX, ZooKeeper.

Confidential

Java Developer

Responsibilities:

  • Involved in design, development and testing of the application.
  • Extensively worked with Spring MVC for developing J2EE Components.
  • Developed Servlets and JSPs with custom tag libraries to control business processes in the middle tier, and was involved in their integration.
  • Implemented the Twilio API to send SMS messages to customers (a sketch follows this list).
  • Involved in writing test cases for the application using JUnit.
  • Used Apache Tiles extensively to develop the presentation layer, making it more user-friendly and supporting the MVC design pattern.
  • Created various Data Access Objects for the addition, modification, and deletion of records using various specification files.
  • Involved in every phase of the SDLC.
  • Responsible for parsing XML documents using a SAX parser.
  • Used Spring with Hibernate for object-relational mapping of the database.
  • Wrote operations and corresponding handlers to communicate with an Oracle database on a Linux/Unix server.
  • Responsible for creating RESTful web services using JAX-RS (a minimal resource sketch also follows this list).
  • Performed continuous integration with Jenkins to integrate code continuously and run builds.
  • Added logging and debugging capabilities using Log4j; used SVN for version control.
  • Prepared design documents and effort estimations.
  • Interacted directly with the client while capturing requirements and at project closure.
  • Involved in the development of JSPs and Servlets for different user interfaces.
  • Added client-side validation using JavaScript and server-side validation using the Validator Framework.
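
A sketch of the Twilio integration above, based on Twilio's Java helper library (the project may have used an earlier client version); the credentials, phone numbers, and message body are placeholders.

    import com.twilio.Twilio;
    import com.twilio.rest.api.v2010.account.Message;
    import com.twilio.type.PhoneNumber;

    // Sends an SMS notification to a customer through Twilio's REST API.
    // The account SID, auth token, and phone numbers are placeholders.
    public class SmsNotifier {
      public static void main(String[] args) {
        Twilio.init("ACXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX", "your_auth_token");
        Message message = Message.creator(
            new PhoneNumber("+15551230000"),  // customer (to)
            new PhoneNumber("+15559870000"),  // application (from)
            "Your order has shipped.")
          .create();
        System.out.println("Sent SMS, SID: " + message.getSid());
      }
    }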
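
And a minimal JAX-RS resource of the kind the RESTful services bullet describes; the path and response payload are illustrative.

    import javax.ws.rs.GET;
    import javax.ws.rs.Path;
    import javax.ws.rs.PathParam;
    import javax.ws.rs.Produces;
    import javax.ws.rs.core.MediaType;

    // A minimal JAX-RS resource exposing a read-only customer lookup.
    @Path("/customers")
    public class CustomerResource {

      @GET
      @Path("/{id}")
      @Produces(MediaType.APPLICATION_JSON)
      public String getCustomer(@PathParam("id") String id) {
        // A real service would delegate to a DAO / Hibernate layer here.
        return "{\"id\": \"" + id + "\", \"status\": \"active\"}";
      }
    }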

Environment: Java, JSP, HTML, Spring, JavaScript, Tiles, CSS, Twilio API, RESTful web services, Eclipse, Hibernate, MySQL, SVN, Quality Center, Log4j, Tomcat Server, Quartz Scheduler.

Confidential

Java Developer

Environment: Java, JSP, HTML, Spring, JavaScript, CSS, Twilio API, RESTful web services, Eclipse, MySQL, SVN, Log4j, Tomcat Server.
