
Hadoop Developer Resume


Boston, MA

SUMMARY

  • Over 8 years of experience in analysis, design, development, implementation, integration, and testing of application software in web-based environments, distributed n-tier products, and client/server architectures.
  • Experience in SDLC and Object Oriented Application Design and Programming.
  • Experience in OO Design using IBM Rational Rose and UML.
  • Extensive knowledge and programming skills in Scala.
  • Strong programming skills in Core Java, and J2EE technologies
  • Strong programming skills in advanced frameworks such as Spring, Struts, and Hibernate.
  • Strong experience with SOA and Web Services.
  • Strong hands-on experience with Big Data technologies, including Hadoop (HDFS & MapReduce), Pig, Hive, HBase, ZooKeeper, and Sqoop.
  • Extensive knowledge of Kafka, Scala streaming, Storm, and MapReduce.
  • Experience working on NoSQL databases, including HBase and Cassandra.
  • Experience in creating complex SQL queries, SQL tuning, and writing PL/SQL blocks such as stored procedures, functions, cursors, indexes, triggers, and packages.
  • Experience with databases like DB2, Oracle 9i, Oracle 10g, MySQL, SQL Server and MS Access.
  • Experience with all phases of data warehouse development lifecycle, from gathering requirements to testing, implementation, and support.
  • Experience with advanced technologies like MongoDB and advanced frameworks like Spring Roo.
  • Extensive experience in creating Class Diagrams, Activity Diagrams, Sequence Diagrams using Unified Modeling Language(UML)
  • Experienced in SDLC, Agile (SCRUM) Methodology, Iterative Waterfall Model
  • Extensive experience in software design methodologies. Proficient in web application development using J2EE components on JBoss, WebLogic, and WebSphere (Servlets, JSP, JSF, JNDI, RMI, JDBC), Apache Struts, design patterns, and Web Services.
  • Expertise in developing web-based GUIs using Applets, Swing, Servlets, JSP, HTML, XHTML, JavaScript, and CSS.
  • Extensive experience in Java and J2EE technologies like Servlets, JSP, JSF, JDBC, RMI, JNDI
  • Expertise in creating XML, DTD, XML Schemas, XSLT, XPath, DOM/SAX Parser and web designing using HTML, CSS and JavaScript (jQuery).
  • Good knowledge in Web Services and Hibernate (O/R Mapping tool)
  • Expertise in deploying applications in JBoss 3.0/4.0, Apache Tomcat 4.0/5.0/5.5, WebSphere 6.0, and WebLogic 8.1.
  • Strong hands on experience with Production Support
  • Experience in developing ANT scripts to build and deploy Java web applications.
  • Experience in developing test cases, performing Unit Testing, Integration Testing, experience in QA with test methodologies and skills for manual/automated testing using tools like WinRunner, JUnit.
  • Experience with various version control systems: ClearCase, CVS, PVCS, VSS, and SVN.
  • Experience with Business Rules engines like JRules and Drools.
  • Excellent communication skills, team player, quick learner, organized, resilient and self-motivated.

TECHNICAL SKILLS

Big Data Ecosystem: Hadoop 0.22.0, MapReduce, HDFS, HBase, ZooKeeper, Hive, Pig, Storm, Sqoop, Cassandra, Impala, Kafka

Java/J2EE: Java 6, Ajax, Log4j, JSP 2.1, Servlets 2.3, JDBC 2.0, XML, Java Beans

Methodologies: Agile, UML, Design Patterns

Frameworks: Struts, Hibernate, Spring

Database: Oracle 10g, PL/SQL, MySQL

Application Servers: Apache Tomcat 5.x/6.0, JBoss 4.0

Web Tools: HTML, JavaScript, Scala, XML, XSL, XSLT, XPath, DOM

IDE/ Testing Tools: NetBeans, Eclipse

Scripts: Bash, ANT, SQL, HiveQL, Shell Scripting

Testing API: JUnit

PROFESSIONAL EXPERIENCE

Confidential, Boston, MA

Hadoop Developer

Responsibilities:

  • Installed/Configured/Maintained Apache Hadoop clusters for application development and Hadoop tools like Hive, Pig, HBase, Zookeeper and Sqoop.
  • Wrote the shell scripts to monitor the health check of Hadoop daemon services and respond accordingly to any warning or failure conditions.
  • Managing and scheduling Jobs on a Hadoop cluster.
  • Deployed Hadoop clusters in standalone, pseudo-distributed, and fully distributed modes.
  • Developed Pig Latin scripts to extract data from the web server output files and load it into HDFS.
  • Developed Pig UDFs to pre-process the data for analysis.
  • Developed Hive queries for the analysts.
  • Developed workflow in Oozie to automate the tasks of loading the data into HDFS and pre-processing with Pig.
  • Provided cluster coordination services through ZooKeeper.
  • Collected log data from web servers and integrated it into HDFS using Flume.
  • Implemented the Fair Scheduler on the JobTracker to share cluster resources among users' MapReduce jobs.
  • Managed and reviewed Hadoop log files
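The pre-processing performed by the Pig UDFs above isn't shown in the resume; as a hypothetical illustration, a Pig EvalFunc UDF typically wraps plain-Java per-record logic like the following dependency-free sketch, which pulls the client IP and request path out of a web-server access-log line (the combined log format is an assumption, not necessarily the project's actual layout):

```java
// Hypothetical sketch of the per-record logic a Pig EvalFunc UDF might wrap.
// Assumes Apache combined/common log format; not the actual project UDF.
public class LogPreprocessor {
    // Extracts "ip<TAB>path" from an access-log line,
    // or returns null for lines that do not match the expected format.
    public static String ipAndPath(String line) {
        // e.g. 127.0.0.1 - - [10/Oct/2023:13:55:36 -0700] "GET /index.html HTTP/1.1" 200 2326
        java.util.regex.Matcher m = java.util.regex.Pattern
            .compile("^(\\S+) \\S+ \\S+ \\[[^\\]]+\\] \"\\S+ (\\S+) [^\"]*\"")
            .matcher(line);
        return m.find() ? m.group(1) + "\t" + m.group(2) : null;
    }
}
```

In an actual Pig UDF this method body would live inside `exec()` of a class extending `org.apache.pig.EvalFunc`, which requires the Pig jars on the classpath.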

Environment: Hadoop, HBase, HDFS, Hive, Java (JDK 1.6), Pig, ZooKeeper, Oozie, Flume.

Confidential, West Street, NY

Hadoop Developer

Responsibilities:

  • Proactively monitored systems and services, architecture design and implementation of Hadoop deployment, configuration management, backup, and disaster recovery systems and procedures.
  • Involved in analyzing system failures, identifying root causes, and recommended course of actions.
  • Worked with systems engineering team to plan and deploy new Hadoop environments and expand existing Hadoop clusters.
  • Monitored multiple Hadoop clusters environments using Ganglia and Nagios. Monitored workload, job performance and capacity planning using Cloudera Manager.
  • Installed and configured Flume, Hive, Pig, Sqoop and Oozie on the Hadoop cluster.
  • Used Flume to collect, aggregate, and store the web log data from different sources like web servers, mobile and network devices and pushed to HDFS.
  • Worked with Storm cluster components: the Nimbus node, ZooKeeper, and Supervisors.
  • Analyzed the web log data using HiveQL to extract the number of unique visitors per day, page views, visit duration, and the most purchased product on the website.
  • Exported the analyzed data to the relational databases using Sqoop for visualization and to generate reports for the BI team.
  • Used Spring IoC for injecting beans.
  • Developed a rich user interface using JavaScript, JSTL, CSS, jQuery, and JSPs.
  • Integrated Oozie with the rest of the Hadoop stack supporting several types of Hadoop jobs out of the box (like Java MapReduce, Pig, Hive, Sqoop) as well as system specific jobs (such as Java programs and shell scripts).
  • Involved in installing and configuring Kerberos for the authentication of users and Hadoop daemons.
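The "unique visitors per day" HiveQL mentioned above boils down to a GROUP BY with a COUNT(DISTINCT ...) over the visitor field. A dependency-free Java sketch of that aggregation (the (day, IP) record layout is an assumption for illustration):

```java
import java.util.*;

// Dependency-free sketch of the "unique visitors per day" aggregation the
// HiveQL performed, roughly: SELECT day, COUNT(DISTINCT ip) ... GROUP BY day.
// The (day, ip) record layout is an assumption, not the project's schema.
public class UniqueVisitors {
    public static Map<String, Integer> perDay(List<String[]> records) {
        Map<String, Set<String>> ipsByDay = new TreeMap<>();
        for (String[] r : records) {   // r[0] = day, r[1] = visitor IP
            ipsByDay.computeIfAbsent(r[0], d -> new HashSet<>()).add(r[1]);
        }
        Map<String, Integer> counts = new TreeMap<>();
        ipsByDay.forEach((day, ips) -> counts.put(day, ips.size())); // distinct count
        return counts;
    }
}
```

Hive parallelizes this across the cluster; the sketch only shows the single-node logic.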

Environment: CDH4, Flume, Hive, Sqoop, Pig, Oozie, Cloudera Manager, Java, Linux, CentOS

Confidential, East Hanover, NJ

Hadoop Developer

Responsibilities:

  • Developed self-contained micro-analytics on Search platform, leveraging the content and prior capabilities associated with the MRL Search Pilot.
  • Developed search/analytics product features and fixes following an Agile methodology.
  • Applied Big Data tools (e.g., Hadoop, Accumulo, Lucene, Hive, AWS EC2) to enable search and analytics.
  • Developed test cases for codebase using JMockit API.
  • Wrote Hive queries for ingesting and indexing data from Merck Chemical Identifier Database (MCIDB) which is used by Merck scientists for performing research activities throughout the Drug Discovery phase.
  • Developed custom UDFs in Hive for enhancing search capabilities by using Chemistry Development Kit (CDK) Library.
  • Created cron jobs to ingest and index data periodically.
  • Developed Puppet scripts to install Hive, Sqoop, etc., on the nodes.
  • Actively took part in daily scrum meetings, bi-weekly sprint planning and closeout meetings.
  • Worked with highly engaged Informatics, Scientific Information Management and enterprise IT teams.
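The custom Hive UDFs mentioned above follow a standard shape: a class extending `org.apache.hadoop.hive.ql.exec.UDF` whose `evaluate()` method Hive calls once per row. Since the CDK-based chemistry logic isn't shown in the resume, a trivial identifier normalization stands in for it in this dependency-free sketch (the class name and logic are hypothetical):

```java
// Dependency-free sketch of the Hive UDF pattern. A real UDF extends
// org.apache.hadoop.hive.ql.exec.UDF (Hive jars required) and Hive invokes
// evaluate() once per row. The actual project used the Chemistry Development
// Kit; this trim/upper-case normalization is a hypothetical stand-in.
public class NormalizeIdUdf {
    // Hive calls evaluate() per row; null input yields null output, as is
    // conventional for Hive UDFs.
    public String evaluate(String rawId) {
        return rawId == null ? null : rawId.trim().toUpperCase();
    }
}
```

Once packaged and registered (`ADD JAR ...; CREATE TEMPORARY FUNCTION ...`), such a function is callable directly from HiveQL.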

Environment: Hadoop, Hive, Accumulo, Lucene, Sqoop, AWS EC2, Puppet

Confidential, San Francisco, CA

Java Developer

Responsibilities:

  • Involved in gathering system requirements for the application and worked with the business team to review the requirements, and went through the Software Requirement Specification document and Architecture document.
  • Involved in designing UML Use case diagrams, Class diagrams, and Sequence diagrams using Rational Rose.
  • Created Functional specification documents (FSD) and JSON contracts
  • Developed the application using Spring Framework that uses Model View Controller (MVC) architecture with JSP as the view.
  • Developed presentation layer using JSP, HTML and CSS and JQuery.
  • Developed JSP custom tags for front end.
  • Wrote JavaScript code for input validation.
  • Extensively used Spring IoC for dependency injection.
  • Developed J2EE components on Eclipse IDE.
  • Used RESTful web services with JSON.
  • Used the Apache CXF open source tool to generate Java stubs from WSDL.
  • Used an Oracle 10g database for table creation and wrote SQL queries using joins and stored procedures.
  • Used Hibernate for Object-Relational Mapping and for database operations in Oracle database.
  • Developed Action classes and DAO classes to access the database.
  • Developed several POJO classes to map table data into Java objects.
  • Used SQL developer database tool to build, edit, and format database queries, as well as eliminate performance issues in the code.
  • Used the TortoiseSVN tool to keep track of all work and all changes in the source code.
  • Used JUnit for testing the application and Maven for building projects.
  • Deployed the applications on WebSphere Application Server.
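The Spring IoC and DAO work described above follows a common shape: a service depends on a DAO abstraction and receives it by constructor injection, with the container doing the wiring. A dependency-free sketch (the beans are wired by hand here, since running Spring's container needs the framework on the classpath; all class names are illustrative, not the project's actual code):

```java
// Sketch of the DAO + dependency-injection pattern. In the project, Spring's
// IoC container performed the wiring (via XML or annotations); here it is
// done by hand so the sketch is self-contained. Names are illustrative.
public class DiSketch {
    interface AccountDao { String findName(int id); }

    // In-memory stand-in for the Hibernate-backed DAO.
    static class InMemoryAccountDao implements AccountDao {
        public String findName(int id) { return id == 42 ? "Alice" : null; }
    }

    // The service depends only on the DAO abstraction, injected via constructor,
    // so the persistence layer can be swapped without touching the service.
    static class AccountService {
        private final AccountDao dao;
        AccountService(AccountDao dao) { this.dao = dao; }
        String greet(int id) { return "Hello, " + dao.findName(id); }
    }

    public static AccountService wire() {
        return new AccountService(new InMemoryAccountDao()); // the container's job in Spring
    }
}
```

The same constructor would be satisfied by Spring with `<constructor-arg>` or `@Autowired`, which is what makes the service unit-testable against a fake DAO.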

Environment: Java 6 (JDK 1.6), JEE, Spring 3.1 framework, Spring Model View Controller (MVC), Java Server Pages (JSP) 2.0, Servlets 3.0, JDBC 4.0, AJAX, Web Services, RESTful, JSON, Java Beans, jQuery, JavaScript, Oracle 10g, IBM RAD, WebSphere, Agile Methodology, Design Patterns, SVN, Apache Maven, JUnit, HtmlUnit, XSLT, HTML/DHTML.

Confidential, Greenville, SC

Java Developer

Responsibilities:

  • Involved in the full software development cycle, including requirements gathering, design, and application development.
  • Actively participated in project design using the JSF framework and in the SDLC phases of the MTM project.
  • Developed the user interface using JSF PrimeFaces with Java Beans, MDBs, custom tag libraries, and AJAX to speed up the application.
  • Used Java/J2EE Design Patterns like Session Façade, DAO Pattern, and MVC Pattern.
  • Implemented the controller and service layers.
  • Used web services concepts like SOAP, WSDL, JAXB and JAXP to interact with other projects for sharing information.
  • Involved in initial design; created use case diagrams, sequence diagrams, and class diagrams using the StarUML tool.
  • Used SVN for software configuration management and version control.
  • Implemented Spring Dependency Injection.
  • Implemented Spring Security features for different modules of the project.
  • Implemented POJO based domain model integrated with Hibernate ORM for persistence.
  • Developed client modules for the SOA Integration.
  • Added and modified existing Business JRules based on continuously changing business requirements and performed Unit testing to ensure system stability and consistency.
  • Implemented Routing Rule Engine, using IBM - ILOG JRules.
  • Wrote shell scripts to update configuration and apply application data corrections.
  • Wrote UNIX shell scripts to automate the build process.
  • Wrote JUnit Test Cases to test workflow.
  • Monitored logs using Log4j.
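The routing rules authored in IBM ILOG JRules above share a common condition/action shape: an ordered set of rules is evaluated against a fact, and the first match determines the outcome. A dependency-free sketch of that shape (the rule content, routing high-value orders to a "priority" queue, is hypothetical; in the project the rules were authored declaratively in JRules, not in Java):

```java
import java.util.*;

// Dependency-free sketch of the condition/action shape a routing rule takes.
// In the project these were authored declaratively in IBM ILOG JRules; the
// rule content here (amounts and route names) is hypothetical.
public class RoutingRules {
    interface Rule { boolean matches(double amount); String route(); }

    // Ordered rule list: more specific rules first, catch-all last.
    static final List<Rule> RULES = Arrays.asList(
        new Rule() { public boolean matches(double a) { return a > 1000; }
                     public String route() { return "priority"; } },
        new Rule() { public boolean matches(double a) { return true; }
                     public String route() { return "standard"; } });

    // First matching rule wins, mirroring an ordered ruleflow in a rule engine.
    public static String routeFor(double amount) {
        for (Rule r : RULES) if (r.matches(amount)) return r.route();
        return "standard";
    }
}
```

Keeping the conditions and actions outside the application code is what lets "continuously changing business requirements" be handled by editing rules rather than redeploying the application.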

Environment: JDK 1.6, Core Java, EJB 3.0, Eclipse Helios, SQL Server, Servlets 2.5, Spring DI, Spring Security, JSF & PrimeFaces 3.2, IBM DB2, IBM Data Studio, Hibernate, Web Services (RESTful), JUnit 4.8, UNIX, Windows XP, IBM ILOG JRules 7.2, SVN, Maven, and Log4j.

Confidential

Java Developer

Responsibilities:

  • Analyzed Business Requirements and Identified mapping documents required for system and functional testing efforts for all test scenarios.
  • Performed Requirement Gathering & Analysis by actively soliciting, analyzing and negotiating customer requirements and prepared the requirements specification document for the application using Microsoft Word.
  • Developed Use Case diagrams, business flow diagrams, Activity/State diagrams.
  • Adopted J2EE design patterns like Service Locator, Session Facade and Singleton.
  • Configured the application using Spring, Hibernate, DAOs, Action classes, and Java Server Pages.
  • Configuring Hibernate and Tiles related XML files.
  • Developed presentation layer using JavaServer Faces (JSF) MVC framework.
  • Used JSP, HTML, CSS, and jQuery as view components in MVC.
  • Extensively used Spring IoC for dependency injection and worked on custom MVC frameworks loosely based on Struts.
  • Developed Servlets and Java Server Pages (JSP), to route the submittals to the EJB components and render-retrieved information using Session Facade, Service Locator (design pattern).
  • Developed J2EE components on Eclipse IDE.
  • Used JDBC to invoke Stored Procedures and also used JDBC for database connectivity to SQL.
  • Deployed the applications on WebSphere Application Server.
  • Developed web services using REST and JSON.
  • Used an Oracle 11g database for table creation and wrote SQL queries using joins and stored procedures.
  • Used the Toad database tool to develop Oracle queries.
  • Wrote complex SQL queries and reviewed SQL queries for other team members.
  • Developed JUnit Test Cases for Code unit test.
  • Worked with configuration management groups for providing various deployment environments set up including System Integration testing, Quality Control testing etc.
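Of the J2EE design patterns adopted above, Singleton is the simplest to show self-contained. A sketch in its common lazy, thread-safe holder-idiom form (the class name is illustrative, not project code):

```java
// Sketch of the Singleton pattern mentioned above, using the
// initialization-on-demand holder idiom: the JVM's class-loading
// guarantees make it lazy and thread-safe without explicit locking.
// Illustrative only, not the project's actual class.
public class ConfigRegistry {
    private ConfigRegistry() {}                       // blocks external instantiation

    private static class Holder {                     // loaded on first getInstance() call
        static final ConfigRegistry INSTANCE = new ConfigRegistry();
    }

    public static ConfigRegistry getInstance() { return Holder.INSTANCE; }
}
```

Service Locator and Session Facade build on the same idea of a single, well-known access point, but additionally wrap JNDI lookups and EJB call aggregation respectively.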

Environment: J2EE, Hibernate, RAD, RSA, SQL Developer, Oracle 11g, Rational ClearCase, Rational ClearQuest, Rational RequisitePro, UML, MS Visio, MS Office.
