Hadoop Developer Resume

Bloomington, IL

PROFESSIONAL SUMMARY:

  • Overall 8+ years of experience in the IT industry, including around 3 years of Big Data experience implementing complete Hadoop solutions and 5 years of experience in Java.
  • Good working experience with Apache Hadoop ecosystem components such as MapReduce, HDFS, Hive, Sqoop, Pig, Oozie, Flume, HBase, Cassandra, and ZooKeeper.
  • Strong experience in data analytics using Hive and Pig, including writing custom UDFs.
  • Performed importing and exporting data into HDFS and Hive using Sqoop.
  • Good knowledge of the Apache Cassandra database for better performance and scalability.
  • Knowledge of working with ETL tools such as Informatica and Pentaho Kettle.
  • Experience in job workflow scheduling and monitoring tools like Oozie and Zookeeper.
  • Extensive knowledge in using SQL Queries for backend database analysis.
  • Good understanding of NoSQL databases like MongoDB and REDIS.
  • Experience in importing and exporting data using Sqoop from HDFS to Relational Database Systems (RDBMS) and vice versa.
  • Experience in analyzing data using Hive QL, Pig Latin, and custom MapReduce programs in Java.
  • Experienced in installing, configuring, and administrating Hadoop cluster of major Hadoop distributions.
  • Good knowledge of implementing Apache Spark with Scala.
  • In-depth knowledge of creating MapReduce code in Java per business requirements.
  • Experience in developing multi-tier Java-based web applications.
  • Expertise in Core Java, J2EE, Multithreading, JDBC, and Shell Scripting, and proficient in using Java APIs for application development.
  • Good experience in developing applications using Java EE technologies, including Servlets, Struts, JSP, and JDBC.
  • Well-versed in Agile, other SDLC methodologies and can coordinate with owners and SMEs.
  • Experienced in creating and analyzing Software Requirement Specifications (SRS) and Functional Specification Documents (FSD).
  • Strong knowledge of Software Development Life Cycle (SDLC).
  • Experienced in preparing and executing Unit Test Plan and Unit Test Cases after software development.
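The Sqoop import/export experience listed above can be sketched with a pair of CLI invocations. This is a minimal sketch, not a command from any engagement described here: the connection string, credentials, table names, and HDFS paths are all hypothetical placeholders.

```shell
# Hypothetical sketch: pull an RDBMS table into HDFS with Sqoop...
sqoop import \
  --connect jdbc:mysql://dbhost:3306/sales \
  --username etl_user \
  --password-file /user/etl/.db_pass \
  --table orders \
  --target-dir /data/raw/orders \
  --num-mappers 4 \
  --fields-terminated-by '\t'

# ...and push aggregated results from HDFS back to the relational side.
sqoop export \
  --connect jdbc:mysql://dbhost:3306/sales \
  --username etl_user \
  --password-file /user/etl/.db_pass \
  --table order_summary \
  --export-dir /data/curated/order_summary \
  --input-fields-terminated-by '\t'
```

Both commands require a configured Hadoop/Sqoop installation and a reachable database, so they are shown for illustration rather than as runnable snippets.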

TECHNICAL EXPERTISE:

Big Data Ecosystems: Hadoop, MapReduce, HDFS, Hive, Pig, Sqoop, HBase, Zookeeper, Oozie, Flume, Cassandra, Spark

Programming Languages: Core Java, JSP, JDBC; Linux shell scripting

Scripting Languages: JSP & Servlets, PHP, JavaScript, XML, and HTML

Databases: Oracle 11g/10g/9i, MySQL, DB2, MongoDB, REDIS

Tools: Eclipse, JDeveloper, JUnit, Ant, MS Visual Studio, Nagios

Application Servers: Apache Tomcat, Jetty

ETL Tools: Pentaho Kettle and Talend

Testing Tools: NetBeans, Eclipse, JUnit, MRUnit

Methodologies: Agile, Scrum and Waterfall

PROFESSIONAL EXPERIENCE:

Hadoop Developer

Confidential, Bloomington, IL

Roles and Responsibilities:

  • Understood business needs, analyzed functional specifications, and mapped them to the design and development of MapReduce programs and algorithms.
  • Executed Hadoop ecosystem jobs and applications through Apache Hue.
  • Optimizing Hadoop MapReduce code, Hive and Pig scripts for better scalability, reliability and performance.
  • Developed the Oozie workflows for application execution.
  • Feasibility analysis (for the deliverables): evaluated the feasibility of the requirements against complexity and timelines.
  • Performed data migration from legacy RDBMS databases to HDFS using Sqoop.
  • Writing Pig scripts for data processing.
  • Implemented Hive tables and HQL queries for the reports.
  • Imported/exported data from the Oracle database to/from HDFS using Sqoop, Hue, and JDBC.
  • Experience in performing data validation using Hive dynamic partitioning and bucketing.
  • Wrote and used complex data types to store and retrieve data using HQL in Hive.
  • Implemented Spark using Scala and SparkSQL for faster testing and processing of data.
  • Implemented analytics algorithms using Spark.
  • Developed Hive queries to analyze reducer output data.
  • Highly involved in designing the next-generation data architecture for the unstructured data.
  • Implemented a Hadoop/Hive cluster using the Cloudera distribution (CDH4); planned, installed, and configured the Hadoop cluster and its ecosystem components.
  • Developed Pig Latin scripts to extract data from source systems.
  • Created and maintained technical documentation for executing Hive queries and Pig scripts.
  • Involved in extracting data from Hive and loading it into an RDBMS using Sqoop.
  • Managed a 4-node Hadoop cluster for a client conducting a Hadoop proof of concept. The cluster had 12 cores and 3 TB of installed storage.
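The Oozie workflow bullets above could be exercised with commands along these lines; the HDFS path, server URL, and job id are assumptions made for illustration, and the commands need a live cluster to run.

```shell
# Hypothetical sketch: deploy and launch an Oozie workflow.
hdfs dfs -put -f workflow.xml /user/etl/app/workflow.xml
oozie job -oozie http://oozie-host:11000/oozie -config job.properties -run

# Check the status of a submitted workflow (the job id is illustrative).
oozie job -oozie http://oozie-host:11000/oozie -info 0000001-140301-oozie-W
```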

Environment: HDFS, MapReduce, Hive, Oozie, Java, Pig, CDH4, Shell Scripting, Linux, Hue, Sqoop, Spark, Flume, DB2, and Oracle 11g
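The dynamic-partitioning and bucketing work in this role might look roughly like the following, wrapped in a `hive -e` call; the table and column names are illustrative assumptions, and the statements need a running Hive installation.

```shell
# Hypothetical sketch: dynamic-partition insert into a bucketed Hive table.
hive -e "
SET hive.exec.dynamic.partition=true;
SET hive.exec.dynamic.partition.mode=nonstrict;

-- The target table is assumed to be declared roughly as:
--   PARTITIONED BY (claim_date STRING)
--   CLUSTERED BY (customer_id) INTO 32 BUCKETS
INSERT OVERWRITE TABLE claims_part PARTITION (claim_date)
SELECT claim_id, amount, customer_id, claim_date
FROM claims_staging;
"
```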

Hadoop Developer

Confidential, Louisville, KY

Roles and Responsibilities:

  • Explored and used Hadoop ecosystem features and architectures.
  • Worked on debugging, performance tuning of Hive & Pig Jobs.
  • Worked on analyzing Hadoop cluster using different big data analytic tools including Pig, Hive, and MapReduce.
  • Worked closely with business team to gather their requirements and new support features.
  • Developed Map-Reduce jobs for Log Analysis and Analytics.
  • Involved in loading data from LINUX file system to HDFS using Kettle.
  • Wrote a MapReduce job to generate reports for the Analytics module, such as the number of activities created on a particular day or during a time interval.
  • The MR job read data from HDFS, where data had been dumped from multiple sources, and wrote its output back to HDFS.
  • Configured Sqoop and developed scripts to extract data from MySQL into HDFS.
  • Wrote Spark programs to load, parse, refine, and store sensor data into Hadoop, and to process, analyze, and aggregate data for visualizations.
  • Developed data pipeline programs with Spark Scala APIs, performed data aggregations with Hive, and formatted data (JSON) for visualization.
  • Used Hive for analysis of web site traffic.
  • Wrote programs using scripting languages like Pig to manipulate data.
  • Implemented the workflows using the Apache Oozie framework to automate tasks.
  • Created production jobs using Oozie workflows that integrated different actions, such as MapReduce, Sqoop, and Hive.
  • Wrote Hadoop Job Client utilities and integrated them into monitoring system.
  • Managed and scheduled batch jobs on a Hadoop cluster using Oozie.
  • Reviewed the HDFS usage and system design for future scalability and fault tolerance.
  • Prepared extensive shell scripts to get the required info from logs.
  • Performed white-box testing and monitored all the logs in the Dev and Prod environments.

Environment: HDFS, MapReduce, Java, Sqoop, Pig, Hive, Oozie, Flume, REDIS, Core Java, Nexus, Apache Derby, Spark, MySQL, and Linux.
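The bullet about shell scripts that pull required info from logs can be sketched as below. The log format and sample lines are invented for illustration; only the grep/awk pattern reflects the general technique.

```shell
#!/bin/sh
# Hypothetical sketch: summarize application logs with grep/awk.
# Assumed log line format: "YYYY-MM-DD HH:MM:SS LEVEL message".

# Create a small sample log so the sketch is self-contained.
cat > app.log <<'EOF'
2014-03-01 10:00:01 INFO activity created
2014-03-01 10:05:12 ERROR db timeout
2014-03-02 09:12:44 INFO activity created
EOF

# Count ERROR lines.
grep -c 'ERROR' app.log

# Report the number of log lines per day, sorted by date.
awk '{ counts[$1]++ } END { for (d in counts) print d, counts[d] }' app.log | sort
```

The same pattern extends to real logs by pointing the script at the production log directory instead of the generated sample.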

Java/J2EE Developer

Confidential, Bloomfield, CT

Roles and Responsibilities

  • Responsible for understanding the scope of the project and requirement gathering.
  • Reviewed and analyzed the design and implementation of software components/applications and outlined the development process strategies.
  • Coordinated with project managers and the Development and QA teams during the course of the project.
  • Used Spring JDBC to write DAO classes that interact with the database to access account information.
  • Used Tomcat web server for development purpose.
  • Involved in creation of Test Cases for JUnit Testing.
  • Used Oracle as the database and Toad for query execution; also involved in writing SQL scripts and PL/SQL code for procedures and functions.
  • Used CVS, Perforce as configuration management tool for code versioning and release.
  • Developed application using Eclipse and used build and deploy tool as Maven.
  • Used Log4j to log debugging, warning, and info messages on the server console.
  • Extensively used Core Java, Servlets, JSP, and XML.

Environment: Java 1.5, J2EE, XML, Spring 3.0, Design Patterns, Log4j, CVS, Maven, Eclipse, Apache Tomcat 6, and Oracle 11g.

Java/J2EE Developer

Confidential

Responsibilities:

  • Interacted with the client on a regular basis to gather requirements.
  • Understood the business, technical, and functional requirements.
  • Checking for timely delivery of various milestones.
  • Developed web services using the Spring Framework and Axis, including design of the XML request/response structures.
  • Implemented the Hibernate/Spring framework for the database and business layers.
  • Configured Oracle with Hibernate and wrote Hibernate mapping and configuration files for database operations (create, update, select).
  • Involved in creating Oracle stored procedures for data/business logic.
  • Created PL/SQL stored procedures for Contract generation module.
  • Involved in configuring and deploying code to different environments: Integration, QA, and UAT.
  • Preparing and designing system/acceptance test cases and executing them.
  • Created Ant build scripts to build artifacts.
  • Worked on fine-tuning the response time of Web Service components.

Environment: Java, JSP, EJB, Servlets, Struts, Tomcat, Web logic, Oracle 10g

Java Developer

Confidential

Responsibilities:

  • Designed User Interface using Java Server Pages (JSP) and XML.
  • Developed the Enterprise Java Beans to handle different transactions, such as online funds transfers and bill payments to the service providers.
  • Implemented Service Oriented Architecture (SOA) using JMS in MDB for sending and receiving messages while creating web services.
  • Worked on Web Services for data transfer from client to server and vice versa using SOAP, WSDL, and UDDI.
  • Involved in testing the web services using SoapUI.
  • Extensively worked on JMS using point-to-point and publisher/subscriber messaging domains to implement exchange of information through messages.

Environment: Windows, Java 1.4, HTML, JavaScript 1.6, XML, JUnit, JMS, Web Services, SOAP 1.1, UDDI 2, Maven 2.0, Eclipse IDE, CVS, Oracle 10g.
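A SOAP call like the funds-transfer services described in this role can be sketched with curl; the endpoint, namespace, operation name, and fields are hypothetical placeholders, and the request needs a live service to answer.

```shell
# Hypothetical sketch: invoke a SOAP 1.1 operation over HTTP with curl.
cat > request.xml <<'EOF'
<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"
                  xmlns:svc="http://example.com/funds">
  <soapenv:Body>
    <svc:transferFunds>
      <svc:fromAccount>A100</svc:fromAccount>
      <svc:toAccount>B200</svc:toAccount>
      <svc:amount>250.00</svc:amount>
    </svc:transferFunds>
  </soapenv:Body>
</soapenv:Envelope>
EOF

curl -s -X POST http://service-host:8080/funds \
  -H 'Content-Type: text/xml; charset=utf-8' \
  -H 'SOAPAction: "transferFunds"' \
  --data @request.xml
```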

Java Developer

Confidential

Responsibilities:

  • Performed object-oriented analysis and design using UML, including development of class diagrams, sequence diagrams, and state diagrams, and implemented these diagrams in Microsoft Visio.
  • Developed action classes and form beans and configured the struts-config.xml.
  • Involved in writing client-side validations using JavaScript.
  • Involved in the design of the Referential Data Service module to interface with various databases using JDBC.
  • Prepared deployment plans for production deployments.
  • Prepared documentation and participated in preparing the user's manual for the application.
  • Attended the SFD, project kick-off, and ST meetings.
  • Contributed in designing of test plans and test cases.

Environment: Core Java, JDBC, JavaScript, MySQL, JUnit, Eclipse, QA
