
Hadoop Developer Resume


GA

SUMMARY

  • Over 7 years of experience in the IT industry, including Big Data Hadoop ecosystem and Java work in the Healthcare, Insurance, and Banking sectors.
  • Around 2.5 years of hands-on experience working with Apache Hadoop ecosystem components like MapReduce, Sqoop, Flume, Pig, Hive, HBase, Oozie, and Zookeeper.
  • Experience with different Hadoop distributions like Cloudera (CDH4, CDH3) and Hortonworks Data Platform (HDP).
  • Strong end-to-end experience in Hadoop Development.
  • Experience in importing and exporting data using Sqoop between HDFS and relational database systems.
  • Experience in data transformations using MapReduce, Hive, and Pig scripts for different file formats like SequenceFile, Text, and JSON.
  • Expertise in analyzing data using Hive and writing custom UDFs in Java to extend Hive and Pig core functionality.
  • Hands-on experience in configuring and administering the Hadoop cluster.
  • Good understanding of HDFS Designs, Daemons and HDFS high availability (HA).
  • Good understanding of NoSQL databases and hands on experience with Apache HBase.
  • Experience with various scripting languages like Linux/Unix shell scripts, Python.
  • Experienced with data warehousing and ETL processes.
  • Good understanding of cloud infrastructure like Amazon Web Services (AWS).
  • Quick to adapt to new software applications and products; a self-starter with excellent communication skills and a good understanding of business workflow.
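
The Sqoop import work mentioned above follows a standard invocation pattern. A minimal sketch in Python that builds the argv for a `sqoop import` from an RDBMS table into HDFS; the connection string, table, and target directory are hypothetical placeholders, not values from the original projects:

```python
def build_sqoop_import(connect, table, target_dir, num_mappers=4):
    """Build the argv for a `sqoop import` from an RDBMS table into HDFS."""
    return [
        "sqoop", "import",
        "--connect", connect,        # e.g. jdbc:mysql://dbhost/sales (hypothetical)
        "--table", table,
        "--target-dir", target_dir,  # HDFS destination directory
        "--num-mappers", str(num_mappers),
    ]

if __name__ == "__main__":
    cmd = build_sqoop_import("jdbc:mysql://dbhost/sales", "orders", "/data/raw/orders")
    print(" ".join(cmd))
```

Exporting analyzed data back to the RDBMS uses the mirror-image `sqoop export` with `--export-dir` in place of `--target-dir`.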

TECHNICAL SKILLS

Big Data Technologies: Hadoop, HDFS, Hive, MapReduce, Pig, Sqoop, Flume, Zookeeper, Oozie

Scripting Languages: Python, Linux Shell Scripts

Databases: Oracle, Teradata, MySQL, MS-SQL Server

NoSQL Databases: HBase

Operating Systems: Linux (CentOS and Ubuntu), Windows 8.1

Office Tools: MS-OFFICE - Excel, Word, PowerPoint.

PROFESSIONAL EXPERIENCE

Hadoop Developer

Confidential, GA

Responsibilities:

  • Extracted data from MySQL, Oracle, and Teradata using Sqoop, loaded it into HDFS, and processed it there.
  • Load and transform large sets of structured, semi structured and unstructured data.
  • Responsible for managing data coming from different sources.
  • Assisted in exporting analyzed data to relational databases using Sqoop
  • Experience in using Apache Flume for collecting, aggregating and moving large amounts of data.
  • Analyzed large data sets by running Hive queries and Pig scripts
  • Worked with the Data Science team to gather requirements for various data mining projects
  • Involved in creating Hive tables and loading and analyzing data using Hive queries
  • Migrated ETL processes from Oracle and MySQL to Hive to enable easier data manipulation
  • Developed simple to complex MapReduce jobs using Hive and Pig
  • Involved in creating Hive tables, loading them with data, and writing Hive queries that run internally as MapReduce jobs
  • Involved in running Hadoop jobs for processing millions of records of text data
  • Worked with application teams to install operating system, Hadoop updates, patches, version upgrades as required
  • Developed multiple MapReduce jobs in Java for data cleaning and preprocessing
  • Involved in loading data from the Linux file system to HDFS
  • Created and maintained technical documentation for launching Hadoop clusters and for executing Hive queries and Pig scripts
  • Developed shell scripts to automate routine tasks.
  • Used Oozie and Zookeeper operational services for coordinating cluster and scheduling workflows
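
The data-cleaning jobs above were written as Java MapReduce; as a runnable stand-in, here is comparable mapper-side cleaning logic sketched as a Hadoop Streaming-style Python mapper. The specific cleaning rules (trim, collapse whitespace, lowercase, drop empty records) are illustrative assumptions, not the projects' actual rules:

```python
import re
import sys

def clean_record(line):
    """Normalize one raw text record: trim, collapse runs of whitespace,
    lowercase. Returns None for records that should be dropped (empty)."""
    line = re.sub(r"\s+", " ", line.strip())
    if not line:
        return None
    return line.lower()

def mapper(stream):
    """Streaming-style mapper: emit one cleaned record per input line."""
    for line in stream:
        cleaned = clean_record(line)
        if cleaned is not None:
            yield cleaned

if __name__ == "__main__" and not sys.stdin.isatty():
    # Hadoop Streaming pipes input splits through stdin.
    for record in mapper(sys.stdin):
        print(record)
```

Under Hadoop Streaming the same script would be passed via `-mapper`, with the reducer handling any aggregation.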

Environment: Hadoop, HDFS, Pig, Hive, MapReduce, Sqoop, Flume, ETL tools, Linux, Big Data

Sr. Java Developer

Confidential, Norwalk, CT

Responsibilities:

  • The application was developed in J2EE using an MVC-based architecture.
  • Implemented the MVC design using the Struts 1.3 framework, JSP custom tag libraries, and various in-house custom tag libraries for the presentation layer.
  • Created tile definitions, Struts-config files, validation files and resource bundles for all modules using Struts framework.
  • Wrote prepared statements and called stored procedures using callable statements in MySQL.
  • Executed SQL queries to verify that customer records were updated appropriately.
  • Used Apache Tomcat as the application server for deployment.
  • Used Web services for transmission of large blocks of XML data over HTTP.
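
The prepared-statement work above was done in Java against MySQL; as a self-contained sketch, the same parameterized-query pattern is shown here using Python's sqlite3. The table and column names are hypothetical, and sqlite3 has no stored procedures, so the callable-statement (CALL) usage is not shown:

```python
import sqlite3

# In-memory stand-in for the MySQL customer table; schema is hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, status TEXT)")
conn.execute("INSERT INTO customers VALUES (1, 'Acme', 'pending')")

def update_status(conn, customer_id, status):
    """Parameterized update -- the same idea as a JDBC PreparedStatement:
    the SQL text is fixed and values are bound, preventing SQL injection."""
    conn.execute("UPDATE customers SET status = ? WHERE id = ?", (status, customer_id))
    conn.commit()

def get_status(conn, customer_id):
    """Parameterized lookup of a single customer's status."""
    row = conn.execute("SELECT status FROM customers WHERE id = ?",
                       (customer_id,)).fetchone()
    return row[0] if row else None

update_status(conn, 1, "active")
print(get_status(conn, 1))
```

In the JDBC version the `?` placeholders are bound with `PreparedStatement.setXxx`, and stored procedures are invoked through `CallableStatement` with `{call proc_name(?)}` syntax.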

Environment: Java/J2EE, JSP, MySQL, Struts 1.3, Apache Tomcat, Eclipse, XML.

Java Developer

Confidential

Responsibilities:

  • Involved in the development of screens.
  • Involved in coding the given functionality using JSP, JDBC, and Struts Action classes.
  • Responsible for designing the layers and components of the project using OOAD methodologies and standard J2EE patterns and guidelines.
  • Performed front-end validations using JavaScript.
  • Documented the use cases, class diagrams, and sequence diagrams using Rational Rose.
  • Involved in creating complex Stored Procedures, Triggers, Functions, Indexes, Tables, Views and other PL/SQL code and SQL joins for applications.
  • Involved in JavaScript coding for validations, and passing attributes from one screen to another.
  • Writing efficient SQL queries for data manipulation

Environment: Java/J2EE, JSP, JavaScript, SQL, Struts 1.2, Apache Tomcat, Eclipse, XML.

Java Developer

Confidential

Responsibilities:

  • Developed JavaScript behavior code for user interaction.
  • Used HTML, JavaScript, and JSP to develop the UI
  • Used JDBC to manage connectivity for inserting, querying, and data management, including stored procedures and triggers.
  • Involved in the design and coding of the data capture templates, presentation templates, and component templates.
  • Developed an API to write XML documents from the database.
  • Used JavaScript to design the user interface and perform validation checks.
  • Participated in Unit, Integration, System, and Stress testing. Wrote JUnit test cases for testing the functionality of the code.
  • Part of a team which is responsible for metadata maintenance and synchronization of data from database.

Environment: JavaScript, JSP, JDBC, HTML, XML.
