
Java/Hadoop Developer Resume



  • 8 years of strong experience in object-oriented programming and in the design and development of multi-tier, distributed enterprise applications using Java and J2EE technologies across the full Software Development Life Cycle (SDLC).
  • 3+ years of experience in all phases of Hadoop technologies, with an excellent understanding and in-depth knowledge of Hadoop architecture and its components, such as HDFS, MapReduce programming, and other ecosystem components.
  • Experienced with major Hadoop ecosystem projects such as Pig, Hive, and HBase.
  • Knowledge of Spark, Kafka, and Storm.
  • Good knowledge of NoSQL databases HBase and MongoDB.
  • Good knowledge of scripting languages such as Python and Scala.
  • Experience in SQL programming, including writing queries, stored procedures, and triggers in Oracle and SQL Server.
  • Used Maven extensively for building MapReduce jar files and deployed them to Amazon Web Services (AWS) EC2 virtual servers in the cloud; experience writing build scripts for continuous integration systems such as Jenkins.
  • Extensive experience in developing applications using Java, JavaBeans, JSP, Servlets, Spring MVC, Spring JDBC, JDBC, JNDI, Spring, Hibernate, Ajax, JUnit, Oracle, MS SQL Server, and MS Access, following Test-Driven Development.
  • Hands on experience in Atlassian tools (JIRA Service Desk, Confluence, Crucible and Bamboo) and Subversion.
  • Expertise in client-side design and validations using HTML 4/5, XHTML, CSS, Bootstrap, JavaScript, JSP, jQuery, AngularJS, Cache, and JSTL.
  • Excellent working experience in developing applications using J2EE design patterns, including creational, structural, and behavioral patterns (e.g., Singleton, Factory).
  • Extensive experience in deploying and configuring application and web servers such as BEA WebLogic, IBM WebSphere, JBoss, and Apache Tomcat.
  • Excellent experience in the design and development of design patterns and DAOs using Hibernate, J2EE architecture, object modeling, data modeling, and UML.
  • Proficient in Core Java and multi-threading. In-depth knowledge of Java application server configuration and tuning, JVM tuning, software architecture, application code assessment, and deployment procedures for large-scale J2EE applications. Specializes in detecting Java application performance and stability problems.
  • Strong working experience using XML, DTD and XML Schemas.
  • Experienced in parsing XML (DOM and SAX) using the JAXP API.
  • Experienced in JMS messaging to exchange information reliably and asynchronously in enterprise applications; used Apache ActiveMQ as the JMS provider.
  • Good experience using tools such as Maven, Ant, and Log4j.
  • Expertise in REST and SOAP web services, microservices, CXF, JAX-WS, JAX-RS, Axis, and REST APIs.
  • Proficient in SQL; wrote stored procedures and triggers as well as DDL, DML, and transaction queries using development tools such as SQL Developer and TOAD.
  • Ability to work on tight schedules and meet deadlines efficiently. Good knowledge of Touch point Xpress.
  • Excellent Analytical, Communication and Interpersonal skills.
  • Good experience in developing applications using Agile Scrum methodology.
  • Good exposure to tools such as Spring Tool Suite, NetBeans IDE, and Eclipse IDE in implementing applications.
  • Involved in all phases of the SDLC, including strategic systems planning, design, programming, testing, documentation, presentations to clients, preparation of customer specifications as part of software technical requirements, and customer support.
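The Core Java multi-threading experience above can be illustrated with a minimal, self-contained sketch; the class and task here are illustrative stand-ins, not code from any project listed in this resume:

```java
import java.util.*;
import java.util.concurrent.*;

// Minimal sketch: summing a list in parallel with an ExecutorService.
public class ParallelSum {

    public static long sum(List<Integer> numbers, int threads) {
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        try {
            // Split the list into one roughly equal slice per thread.
            int chunk = (numbers.size() + threads - 1) / threads;
            List<Future<Long>> futures = new ArrayList<>();
            for (int start = 0; start < numbers.size(); start += chunk) {
                final int from = start;
                final int to = Math.min(start + chunk, numbers.size());
                // Each task sums one slice of the list.
                futures.add(pool.submit(() -> {
                    long s = 0;
                    for (int i = from; i < to; i++) s += numbers.get(i);
                    return s;
                }));
            }
            long total = 0;
            for (Future<Long> f : futures) total += f.get(); // wait for all tasks
            return total;
        } catch (InterruptedException | ExecutionException e) {
            throw new RuntimeException(e);
        } finally {
            pool.shutdown();
        }
    }
}
```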


Languages: Java, C, C++, SQL, PL/SQL, and Python

Application Frameworks: J2EE, Struts, Spring, Spring IOC, Spring AOP, Spring JPA, EJB, Hibernate, Cache, Node.js, Backbone.js, Bootstrap, CSS3, AngularJS

Hadoop Ecosystem: Hadoop, HDFS, Hive, Pig, Sqoop, HBase, Oozie, Scala

Technologies/API: JSP, JavaBeans, JDBC, JMS, OSGi, JNDI, Servlets, AJAX, JSF, JUnit, Log4j, JPA, JAXB, JAXP


Web Technologies: XML, XSL, XSLT, HTML5, JavaScript, jQuery

Web/Application Servers: Apache Tomcat, WebLogic, IBM WebSphere, JBoss

Design Patterns: MVC, Front Controller, Singleton, DAO patterns

Database: MS SQL Server, Oracle, MS Access, NoSQL (MongoDB)



Operating System: Windows XP/2000/98, UNIX, Linux, DOS.


Confidential, IL

Java/Hadoop Developer


  • Analyzed requirements, estimated the level of effort, provided timelines to the business with weekly updates, met those timelines while delivering quality output, and fixed production issues.
  • Migrated the existing data to Hadoop from RDBMS using Sqoop for processing the data.
  • Used Hive data warehouse tool to analyze the data in HDFS and developed Hive queries.
  • Responsible for creating Hive tables, loading the structured data resulting from MapReduce jobs into the tables, and writing Hive queries to further analyze the data.
  • Wrote SQL queries and performed back-end testing for data validation to check data integrity during migration from back end to front end.
  • Worked on setting up Pig, Hive, Redshift, and HBase on multiple nodes and developed using Pig, Hive, and MapReduce.
  • Wrote MRUnit tests for unit testing the MapReduce jobs.
  • Implemented a daily workflow for extraction, processing, and analysis of data with Oozie.
  • Responsible for implementing the business requirements using Spring MVC Framework.
  • Developed front ends using HTML, CSS, Bootstrap, JavaScript, jQuery, AJAX, JSON, AngularJS, Hibernate, and Spring, and wrote the indexing process using the SolrJ API.
  • Built JavaScript animations and interactive HTML components using jQuery and AJAX.
  • Developed RESTful and SOAP services over Apache Solr Java search server data.
  • Deployed new application enhancements proposed by the business.
  • Troubleshooting the application code.
  • Created usability prototypes for the UI screens using Angular JS, JavaScript and jQuery.
  • Updated the Oracle database using SQL and PL/SQL, writing materialized views, procedures, functions, and triggers. Deployed application changes in test and production environments (UNIX boxes) using Linux commands.
  • Writing detailed functional requirements by going through the requirements with clients with respect to end user, system perspective and functional perspective.
  • Worked with production team to study, analyze and fix bugs. Participated in team agile scrums and meetings.
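The MapReduce-style log analysis described above can be sketched in plain Java; Hadoop itself is omitted so the example stays self-contained, and the record layout ("user,page,timestamp") is a hypothetical stand-in for the actual data:

```java
import java.util.*;

// Plain-Java sketch of the map/reduce logic behind a log-analysis job.
public class HitCounter {

    // "Map" step: extract the user id from one log record.
    static String mapToUser(String record) {
        return record.split(",")[0];
    }

    // "Reduce" step: count records per user, the way a MapReduce job
    // aggregates map output after shuffling it by key.
    public static Map<String, Integer> countByUser(List<String> records) {
        Map<String, Integer> counts = new TreeMap<>();
        for (String record : records) {
            counts.merge(mapToUser(record), 1, Integer::sum);
        }
        return counts;
    }
}
```

In an actual Hadoop job the two steps would live in `Mapper` and `Reducer` subclasses, with the framework handling the shuffle between them.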

Environment: Core Java, J2EE, Spring MVC, Maven, SQL, JavaScript, AngularJS, HTML, CSS, AJAX, JSON, REST, SOAP, SVN, JIRA, Tomcat 8, Hadoop, Hive, MapReduce, Sqoop, Flume, Hue

Confidential, Manhattan, NY

Hadoop Developer


  • Developed MapReduce programs to parse the raw data, populate staging tables, and store the refined data in partitioned tables.
  • Installed and maintained the Cloudera Hadoop distribution.
  • Installed and configured Flume, Hive, Pig, Sqoop and Oozie on the Hadoop cluster.
  • Involved in loading the data from Linux file system to HDFS.
  • Implemented MapReduce programs on log data to transform it into structured form and extract user information.
  • Performed performance tuning and troubleshooting of MapReduce jobs by analyzing and reviewing Hadoop log files.
  • Exported the analyzed data to relational databases using Sqoop for visualization and to generate reports for the BI team.
  • Monitored workload, job performance, and capacity planning using Cloudera Manager.
  • Installed the Oozie workflow engine to run multiple MapReduce, Hive, and Pig jobs.
  • Responsible for creating Hive tables, loading the structured data resulting from MapReduce jobs into the tables, and writing Hive queries to further analyze the logs to identify issues and behavioral patterns.
  • Imported data frequently from MySQL to HDFS using Sqoop.
  • Supported operations team in Hadoop cluster maintenance activities including commissioning and decommissioning nodes and upgrades.
  • Used the ETL tool Talend to perform transformations, event joins, filtering, and some pre-aggregations.
  • Created and maintained technical documentation for launching Hadoop clusters and for executing Hive queries and Pig scripts.
  • Used Tableau for visualizing and to generate reports.
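The "raw log to structured form" step mentioned above usually amounts to a parsing routine like the following sketch; the log format shown (a simplified Apache common log) is an assumption for illustration:

```java
import java.util.*;
import java.util.regex.*;

// Sketch of the parsing step that turns a raw access-log line into
// structured fields before loading them into a partitioned Hive table.
public class LogParser {

    // host - - [timestamp] "METHOD path PROTOCOL" status
    private static final Pattern LINE = Pattern.compile(
        "(\\S+) \\S+ \\S+ \\[([^\\]]+)\\] \"(\\S+) (\\S+) [^\"]*\" (\\d{3})");

    // Returns host, timestamp, method, path, and status,
    // or an empty Optional when the line is malformed.
    public static Optional<Map<String, String>> parse(String line) {
        Matcher m = LINE.matcher(line);
        if (!m.matches()) return Optional.empty();
        Map<String, String> fields = new LinkedHashMap<>();
        fields.put("host", m.group(1));
        fields.put("timestamp", m.group(2));
        fields.put("method", m.group(3));
        fields.put("path", m.group(4));
        fields.put("status", m.group(5));
        return Optional.of(fields);
    }
}
```

In a MapReduce job, the mapper would call a parser like this per line and emit the fields as key/value pairs; malformed lines are skipped rather than failing the job.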

Environment: Hadoop, Cloudera, MapReduce, Hive, Sqoop, Flume, Talend, Python, MS SQL Server, Tableau.


Java/Hadoop Developer


  • Participated in all stages of software development life-cycle including architecture, design, implementation, and unit testing.
  • Developed Facelets (XHTML) pages representing the view. Implemented business logic on the server side while keeping focus on developing standard, optimized, generic, and loosely coupled code.
  • Integrated JSF with Spring to implement MVC (Model-View-Controller) pattern.
  • Developed JavaScript methods to perform various client-side validations, display errors, update components, and invoke various other events.
  • Developed validators to perform additional validation on JSF components.
  • Created composite components for common functionality of different pages.
  • Developed custom components to perform special functionalities as per requirement.
  • Performed test cases throughout the integration and the regression environments.
  • Consumed SOAP-based web services in the application. Created usability prototypes for the UI screens using Node.js, JavaScript, and jQuery.
  • Implemented several design patterns like Singleton, MVC and Factory design patterns.
  • Performed smoke, functional, white-box, black-box, integration, and regression testing to find bugs. Logged messages using Log4j to capture all system events. Developed JUnit test cases to test the Java code base.
  • Configured servers and resolved server issues. Developed and ran automated Ant scripts to do project builds. Worked with the production team to study, analyze, and fix bugs. Participated in team agile scrums and meetings.
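The Singleton and Factory patterns implemented above can be sketched as follows; the `ConfigRegistry` and `Shape` names are illustrative stand-ins, not project code:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Sketch of the Singleton and Factory design patterns.
public class PatternsDemo {

    // Singleton: the enum idiom gives one thread-safe instance per JVM.
    public enum ConfigRegistry {
        INSTANCE;
        private final Map<String, String> values = new ConcurrentHashMap<>();
        public void put(String k, String v) { values.put(k, v); }
        public String get(String k) { return values.get(k); }
    }

    // Factory: callers ask for a product by name instead of calling `new`
    // on a concrete class, so implementations can change behind the interface.
    interface Shape { double area(); }

    public static Shape createShape(String kind, double size) {
        switch (kind) {
            case "square": return () -> size * size;
            case "circle": return () -> Math.PI * size * size;
            default: throw new IllegalArgumentException("unknown shape: " + kind);
        }
    }
}
```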

Environment: J2EE, Java JDK 5, JSF, JUnit, JavaBeans, JavaScript, REST API, Spring, Hibernate, JNDI, Ant, Log4j, Node.js, CSS, JBoss, Agile.

Confidential, CT

Java, J2EE Developer


  • Involved in creation of Low Level Design including sequence diagrams and class diagrams to comprehend the existing architecture.
  • Involved in the integration of Spring for implementing Dependency Injection.
  • Developed code for obtaining beans in the Spring IOC framework.
  • Focused primarily on the MVC components such as Dispatcher Servlets, Controllers, Model and View Objects, View Resolver.
  • Involved in creating the Hibernate POJO Objects and utilizing Hibernate Annotations.
  • Involved in development of Web Services using Spring MVC to extract client related data from databases. Worked in Agile development environment.
  • Developed Web Services using WSDL, SOAP to communicate with the other modules.
  • Developed graphical user interfaces using UI frameworks and web pages using HTML and JSPs for user interaction.
  • Involved in the implementation of DAO using Spring-Hibernate ORM.
  • Involved in the creation of exhaustive JUnit Unit Test Cases using Test Driven Development (TDD) technique.
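The Dependency Injection and DAO work above rests on constructor-based injection, which can be shown without Spring itself; the `ClientDao` interface and its in-memory stub are hypothetical examples of the wiring Spring IOC automates:

```java
import java.util.*;

// Constructor-based dependency injection in plain Java.
public class DiSketch {

    interface ClientDao { Optional<String> findName(int id); }

    // In-memory stand-in for the Hibernate-backed DAO.
    static class InMemoryClientDao implements ClientDao {
        private final Map<Integer, String> rows = new HashMap<>();
        InMemoryClientDao() { rows.put(1, "Acme Corp"); }
        public Optional<String> findName(int id) {
            return Optional.ofNullable(rows.get(id));
        }
    }

    // The service receives its DAO through the constructor, so unit tests
    // can inject a stub while Spring injects the real implementation.
    public static class ClientService {
        private final ClientDao dao;
        public ClientService(ClientDao dao) { this.dao = dao; }
        public String describe(int id) {
            return dao.findName(id).orElse("unknown client");
        }
    }

    public static ClientService defaultService() {
        return new ClientService(new InMemoryClientDao());
    }
}
```

This is also what makes TDD practical here: the JUnit tests exercise `ClientService` against a stub DAO without needing a database.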

Environment: JDK 5, J2EE, Spring, Hibernate, Web Services, AWS, JMS, JavaScript, JSP, JUnit, Agile/Scrum Methodology, Oracle 10g, WebLogic Server, Eclipse IDE, DAO, Design patterns, Log4j.


Java Developer


  • Worked extensively in creation of the Land Parcel database.
  • Designed and developed the incorporation of existing digital data into the main Oracle database. Heavily involved in high-volume production of hardcopy farm maps.
  • Created programs, procedures, queries, and documentation.
  • Developed data manipulation Servlets with the help of ArcSDE APIs.
  • Developed Enterprise Java Beans like Entity Beans, Session Beans.
  • Developed different modules using J2EE (EJBs and JDBC). Designed and developed JSP pages using Struts. Wrote client-side validations using JavaScript.
  • Involved in the design of the Referential Data Service module to interface with various databases using JDBC. Extensively worked on PL/SQL, SQL.
  • Developed database objects such as PL/SQL packages, stored procedures, triggers, cursors, and views to maintain referential integrity of the database.
  • Interacted with the users and documented the application.
  • Experienced in ArcSDE setup and ArcIMS.
  • Debugged and unit tested the Java Beans and other Java classes.
  • Prepared documentation and participated in preparing user’s manual for the application.

Environment: Java, Oracle 9i, MapObjects, HTML, JSP, JDBC, Servlets, Arc Macro Language, ArcInfo 8.3, ArcSDE, and ArcIMS.
