
Hadoop Developer Resume


Austin, Texas

OBJECTIVE

  • To succeed in an environment of growth and excellence, in a role that provides job satisfaction and self-development and helps me achieve both personal and organizational goals.

SUMMARY

  • Software professional with around 4 years of experience in the IT industry, including 3 years of experience with Big Data Hadoop ecosystems.
  • Working experience with Hadoop technologies: HDFS, MapReduce, YARN, Pig, Hive, Sqoop and HBase.
  • Working knowledge of Flume and Oozie.
  • Knowledge of different file formats such as Avro, Parquet and JSON.
  • Hands-on experience with Apache Hadoop and Cloudera (CDH).
  • Experience in data analysis using the BI tool Tableau.
  • Good experience writing MapReduce programs on Apache Hadoop to analyze large data sets efficiently.
  • Extended Hive and Pig core functionality using HCatalog and custom UDFs (a sample UDF sketch follows this list).
  • Experience importing and exporting data between HDFS and relational databases using Sqoop.
  • Strong knowledge of all phases of the Software Development Life Cycle (SDLC), and experience with Waterfall and Agile/Scrum development methodologies.
  • Good knowledge of Spark Streaming.
  • Completed Edureka training in Spark using Java.
  • Willing to learn new concepts and technologies; an excellent team player, always flexible to team requirements.
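
A minimal sketch of a custom Hive UDF of the kind mentioned in the summary above. The class name and the trim/lower-case logic are hypothetical, shown only to illustrate the old-style Hive UDF extension point.

    import org.apache.hadoop.hive.ql.exec.UDF;
    import org.apache.hadoop.io.Text;

    // Hypothetical example: a Hive UDF that trims and lower-cases a string column.
    public final class NormalizeText extends UDF {
        public Text evaluate(Text input) {
            if (input == null) {
                return null;  // pass NULLs through unchanged
            }
            return new Text(input.toString().trim().toLowerCase());
        }
    }

Such a UDF is typically packaged into a JAR, added to the session with ADD JAR, and registered with CREATE TEMPORARY FUNCTION before being called from HiveQL.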

TECHNICAL SKILLS

Big Data Ecosystems: Hadoop, MapReduce, HDFS, HBase, Hive, Pig, Sqoop, Spark, Oozie, Flume

Languages: C, Java, UNIX shell, SQL.

Databases: Oracle 10g, MySQL, HBase.

Operating Systems: Windows XP/Vista/7, UNIX.

Software Package: MS Office 2010.

Testing Tools: ALM, QTP, Jira

J2EE IDE: J2EE IDE

Version control: GitHub

PROFESSIONAL EXPERIENCE

Confidential - Austin, Texas

Hadoop Developer

Responsibilities:

  • Involved in daily scrum meetings to provide status of work.
  • Responsible for managing data coming from different sources and loading it into HDFS.
  • Wrote Java programs to load data into HDFS (see the sketch after this list).
  • Wrote MapReduce programs to convert JSON and XML data to CSV and load it into HDFS.
  • Wrote a Java program to automatically move data to an archive when it is deleted from HDFS.
  • Created and ran SQL queries to update MySQL whenever a file is uploaded to or deleted from HDFS.
  • Extended the application to work with Hive, Pig, Oozie and Sqoop.
  • Used Apache Tomcat for deploying and testing the application.
  • Performed JMeter performance testing of HBase Java APIs.
  • Delivered a couple of POCs using the above application.
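
A minimal sketch, with assumed local and HDFS paths, of loading a file into HDFS through the Hadoop FileSystem Java API, in the spirit of the Java loading programs mentioned above.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class HdfsLoader {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();   // picks up core-site.xml / hdfs-site.xml
            FileSystem fs = FileSystem.get(conf);        // handle to the default file system (HDFS)
            fs.copyFromLocalFile(
                new Path("/tmp/input.csv"),              // local source path (assumed)
                new Path("/data/landing/input.csv"));    // HDFS target path (assumed)
            fs.close();
        }
    }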

Confidential

Hadoop Developer

Responsibilities:

  • Involved in discussions with the client to understand business scenarios and raise clarifications where required.
  • Responsible for managing data coming from different sources and loading it into HDFS.
  • Maintained data metrics and reporting on time.
  • Loaded files into the Hadoop Distributed File System from UNIX.
  • Wrote MapReduce programs.
  • Wrote Pig scripts on data in HDFS and moved it to Hive using HCatalog.
  • Wrote Hive queries on data residing in HDFS (see the JDBC sketch after this list).
  • Loaded data from different databases into HDFS and Hive using Sqoop.
  • Involved in preparing the understanding document for the business scenarios.
  • Performed data cleaning in the Dev and Test environments.
  • Prepared complex queries for ad hoc reporting and designed dashboards in Tableau.
  • Created actions, calculated fields, parameters, filters and maps in Tableau.
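
A minimal sketch of running a Hive query from Java over JDBC (HiveServer2), as referenced above. The host, port, credentials and table name are assumptions for illustration only.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class HiveQueryExample {
        public static void main(String[] args) throws Exception {
            Class.forName("org.apache.hive.jdbc.HiveDriver");
            try (Connection con = DriverManager.getConnection(
                     "jdbc:hive2://localhost:10000/default", "hive", "");  // assumed endpoint
                 Statement stmt = con.createStatement();
                 ResultSet rs = stmt.executeQuery(
                     "SELECT category, COUNT(*) FROM sales GROUP BY category")) {  // assumed table
                while (rs.next()) {
                    System.out.println(rs.getString(1) + "\t" + rs.getLong(2));
                }
            }
        }
    }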

Confidential

Hadoop Developer

Responsibilities:

  • Involved in discussions to understand business scenarios and raise clarifications where required.
  • Involved in daily scrum meetings to provide status of work.
  • Responsible for managing data coming from different sources and loading it into HDFS.
  • Wrote Java programs to load data into HDFS.
  • Used Talend Open Studio to load files into Hadoop Hive tables and performed ETL aggregations in Hive.
  • Designed and created ETL jobs through Talend to load huge volumes of data into Cassandra, the Hadoop ecosystem and relational databases.
  • Wrote MapReduce programs to convert JSON and XML data to CSV and load it into HDFS.
  • Wrote a Java program to automatically move data to an archive when it is deleted from HDFS.
  • Created and ran SQL queries to update MySQL whenever a file is uploaded to or deleted from HDFS.
  • Extended the application to work with Hive, Pig, Oozie and Sqoop.
  • Used Apache Tomcat for deploying and testing the application.
  • Delivered a couple of POCs using the above application.
  • Used different file formats such as text files, SequenceFiles and Avro (see the Avro sketch after this list).
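
A minimal sketch of writing an Avro data file from Java, as mentioned in the file-format bullet above. The record schema and field names are hypothetical.

    import java.io.File;
    import org.apache.avro.Schema;
    import org.apache.avro.file.DataFileWriter;
    import org.apache.avro.generic.GenericData;
    import org.apache.avro.generic.GenericDatumWriter;
    import org.apache.avro.generic.GenericRecord;
    import org.apache.avro.io.DatumWriter;

    public class AvroWriteExample {
        public static void main(String[] args) throws Exception {
            // Assumed schema: a record with a string id and an int amount.
            Schema schema = new Schema.Parser().parse(
                "{\"type\":\"record\",\"name\":\"Event\",\"fields\":["
                + "{\"name\":\"id\",\"type\":\"string\"},"
                + "{\"name\":\"amount\",\"type\":\"int\"}]}");

            GenericRecord record = new GenericData.Record(schema);
            record.put("id", "evt-001");
            record.put("amount", 42);

            DatumWriter<GenericRecord> datumWriter = new GenericDatumWriter<>(schema);
            try (DataFileWriter<GenericRecord> fileWriter = new DataFileWriter<>(datumWriter)) {
                fileWriter.create(schema, new File("events.avro"));  // local output file (assumed)
                fileWriter.append(record);
            }
        }
    }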

Confidential

Java Developer

Responsibilities:

  • Involved in various SDLC phases like Design, Development and Testing.
  • Developed front end using Struts and JSP.
  • Developed web pages using HTML, JavaScript, jQuery and CSS.
  • Used various core Java concepts such as exception handling and the Collections API to implement various features and enhancements.
  • Developed server-side servlet components for the application.
  • Involved in coding, maintaining, and administering servlet and JSP components deployed on a WebSphere application server.
  • Implemented Hibernate ORM to map relational data directly to Java objects.
  • Worked with Complex SQL queries, Functions and Stored Procedures.
  • Involved in developing the Spring Web MVC framework for the portal application (see the controller sketch after this list).
  • Implemented the logging mechanism using log4j framework.
  • Developed REST APIs and web services.
  • Wrote test cases in JUnit for unit testing of classes.
  • Used Maven to build the J2EE application.
  • Used SVN to track and maintain the different version of the application.
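
A minimal sketch of a Spring Web MVC REST-style endpoint of the kind referenced above. The controller name, URL mapping and JSON payload are assumptions for illustration.

    import org.springframework.stereotype.Controller;
    import org.springframework.web.bind.annotation.PathVariable;
    import org.springframework.web.bind.annotation.RequestMapping;
    import org.springframework.web.bind.annotation.RequestMethod;
    import org.springframework.web.bind.annotation.ResponseBody;

    @Controller
    @RequestMapping("/api/users")   // assumed resource path
    public class UserController {

        @RequestMapping(value = "/{id}", method = RequestMethod.GET)
        @ResponseBody
        public String getUser(@PathVariable("id") long id) {
            // In a real application this would delegate to a service/DAO layer.
            return "{\"id\": " + id + ", \"name\": \"example\"}";
        }
    }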

Confidential

Trainee

Responsibilities:

  • Developed a web-based application using a three-tier architecture.
  • Used JSP for the GUI and Java Servlets to handle requests and responses
  • Implemented DAO classes to establish connections with the database (Oracle 11g) to save and retrieve data (see the sketch after this list).
  • Performed Unit testing and security testing for the application.
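
A minimal DAO sketch using plain JDBC against Oracle, in the spirit of the DAO connections mentioned above. The JDBC URL, credentials, table and column names are illustrative assumptions.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    public class StudentDao {

        private Connection getConnection() throws Exception {
            Class.forName("oracle.jdbc.OracleDriver");
            return DriverManager.getConnection(
                "jdbc:oracle:thin:@localhost:1521:ORCL", "scott", "tiger");  // assumed URL/credentials
        }

        public String findNameById(int id) throws Exception {
            try (Connection con = getConnection();
                 PreparedStatement ps = con.prepareStatement(
                     "SELECT name FROM students WHERE id = ?")) {  // assumed table
                ps.setInt(1, id);
                try (ResultSet rs = ps.executeQuery()) {
                    return rs.next() ? rs.getString("name") : null;
                }
            }
        }
    }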
