
Hadoop Java/J2EE Developer Resume


Columbus, OH

SUMMARY

  • Over 7 years of professional experience in IT, including 3 years of work experience in the Hadoop ecosystem.
  • In depth understanding/knowledge of Hadoop Architecture and various components such as HDFS, JobTracker, TaskTracker, NameNode, DataNode and MapReduce concepts.
  • Good knowledge of Hortonworks Data Platform 2.2.
  • Hands-on experience with major components in the Hadoop ecosystem such as Hadoop MapReduce, HDFS, Hive, Pig, Spark, Pentaho, HBase, Sqoop, Kafka, Oozie, Scala, Storm and Flume.
  • Experience in ingesting unstructured data using Flume and Kafka, as well as managing and reviewing Hadoop log files.
  • Experience with Oozie Workflow Engine in running workflow jobs with actions that run Hadoop Map/Reduce and Pig jobs.
  • Experience in working with the NoSQL databases MongoDB and Apache Cassandra.
  • Extended Hive and Pig core functionality by writing custom UDFs.
  • Good knowledge of installation, configuration, support and management of Big Data and the underlying infrastructure of a Hadoop cluster using Apache and Hortonworks distributions.
  • Good knowledge of Amazon AWS concepts such as the EMR and EC2 web services, which provide fast and efficient processing of Big Data.
  • Hands on experience in designing and coding web applications using Core Java and J2EE technologies.
  • Experienced in integrating various data sources such as RDBMS, spreadsheets and text files using Java and shell scripting.
  • Experience in working with Oracle and DB2.
  • Experience in Web Services using XML, HTML and SOAP.
  • Excellent Java development skills using J2EE, J2SE, Servlets, JUnit, JSP, JDBC.
  • Familiar with popular frameworks such as Struts, Hibernate, Spring, MVC and AJAX.
  • Committed to timely, quality work; a quick learner able to adapt effortlessly to new technologies; able to work within a team as well as cross-team.
  • Proven competencies: problem solving and analytical skills, excellent presentation and documentation skills, application development, project management, leadership
  • Highly motivated and a self-starter with effective communication and organizational skills, combined with attention to detail and business process improvements

TECHNICAL SKILLS

Big Data: Hadoop, HDFS, MapReduce, Spark, Scala, Hive, Kafka, Sqoop, Pig, HBase, MongoDB, Flume, Zookeeper, Oozie.

Operating Systems: Windows, Ubuntu, Red Hat Linux, Linux, UNIX

Java Technologies: Java, J2EE, JDBC, JavaScript, SQL, PL/SQL

Programming or Scripting Languages: Java, SQL, Unix Shell Scripting, C.

Database: MS-SQL, MySQL, Oracle, MS-Access

Middleware: Web Sphere, TIBCO

IDEs & Utilities: Eclipse, JCreator, NetBeans

Protocols: TCP/IP, HTTP and HTTPS.

Testing: Quality Center, Win Runner, Load Runner, QTP

Frameworks: Hibernate 3.0, Spring 3.x, Servlets, JSP, XML, Struts, EJB 2.x/3.x, JDBC, MVC

PROFESSIONAL EXPERIENCE

Confidential, Columbus, OH

Hadoop Java/J2ee Developer

Responsibilities:

  • Analyzed, designed and developed Spark Streaming and real-time analytics jobs to meet client requirements.
  • Worked on performance enhancement.
  • Worked with distributed n-tier architecture and client/server architecture.
  • Designed and built a Kafka-Spark-HDFS streaming prototype.
  • Worked on implementation and maintenance of applications in a web-based environment.
  • Applied OOP concepts (polymorphism, inheritance, encapsulation).
  • Used design patterns such as MVC (Model-View-Controller), Singleton and Factory.
  • Experience using XML and XML parsers.
  • Designed and built Hadoop solutions for big data problems.
  • Worked on setting up Pig, Hive and HBase on multiple nodes and developed using Pig, Hive, HBase and MapReduce.
  • Extracted logs from a spooled directory source onto HDFS using Flume jobs.
  • Developed MapReduce applications using Hadoop MapReduce programming and HBase.
  • Implemented data access using the Hibernate persistence framework.
  • Developed the configuration files and classes specific to Spring and Hibernate.
  • Utilized Spring framework for bean wiring & Dependency injection principles
  • Expertise in server-side and J2EE technologies including Java, J2SE, JSP, Servlets, XML, Hibernate, Struts, Struts2, JDBC, and JavaScript development.
  • Excellent working experience in J2EE Architecture, MVC Architecture, Design Patterns.
  • Designed the GUI using Model-View-Controller architecture (Struts framework).
  • Integrated Spring DAO for data access using Hibernate
  • Created hibernate mapping files to map POJO to DB tables
  • Involved in the Development of Spring Framework Controllers
  • Performed unit testing for all the components using JUnit
  • Designed and developed the XSD for WSDL
  • Developed the user interface using JSP, JSP tag libraries (JSTL), HTML, CSS and JavaScript to simplify the complexities of the application.
  • Involved in developing the Pig scripts.
  • Involved in developing the Hive Reports.
  • Developed Sqoop scripts to move data between Pig and the MySQL database.
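
The MapReduce development described above follows a map-then-reduce pattern; as a minimal sketch, the classic word count can be written in plain Java without Hadoop dependencies (a real job would use `Mapper`/`Reducer` classes from `org.apache.hadoop.mapreduce`; the class and method names here are illustrative, not from the original project):

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Minimal word-count sketch of the map/reduce pattern, without Hadoop dependencies.
public class WordCountSketch {

    // "Map" phase: emit a (word, 1) pair for every token in every input line.
    static List<Map.Entry<String, Integer>> map(List<String> lines) {
        List<Map.Entry<String, Integer>> pairs = new ArrayList<>();
        for (String line : lines) {
            for (String token : line.toLowerCase().split("\\s+")) {
                if (!token.isEmpty()) {
                    pairs.add(Map.entry(token, 1));
                }
            }
        }
        return pairs;
    }

    // "Reduce" phase: sum the emitted counts for each distinct word.
    static Map<String, Integer> reduce(List<Map.Entry<String, Integer>> pairs) {
        Map<String, Integer> counts = new HashMap<>();
        for (Map.Entry<String, Integer> pair : pairs) {
            counts.merge(pair.getKey(), pair.getValue(), Integer::sum);
        }
        return counts;
    }

    public static void main(String[] args) {
        List<String> lines = List.of("big data big", "data pipeline");
        Map<String, Integer> counts = reduce(map(lines));
        System.out.println(counts.get("big") + " " + counts.get("data")); // prints "2 2"
    }
}
```

In a real Hadoop job the framework shuffles and groups the (word, 1) pairs by key between the two phases; the grouping done here by `HashMap.merge` stands in for that shuffle step.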

Environment: Hadoop 1.x, Hive, Pig, HBase, Sqoop, Flume, Spring, jQuery, Java, J2EE, HTML, JavaScript, Hibernate

Confidential, SFO, CA

Hadoop Java/J2ee Developer

Responsibilities:

  • Performed Sqoop imports of data from the data warehouse platform to HDFS and built Hive tables on top of the datasets.
  • Built ETL workflows to process data in Hive tables.
  • Used Hue to create Oozie workflows performing different kinds of actions such as Hive, Java and MapReduce. Worked extensively in Hive, using features like UDFs and UDAFs.
  • Used Sequence and Avro file formats with Snappy compression while storing data in HDFS. Used efficient columnar storage (Parquet) for data used by the business.
  • Worked extensively in MapReduce using Java; well versed with features like multiple outputs.
  • Read Hive tables from MapReduce and made them available to all data nodes via the distributed cache. Used both Hue and XML for Oozie.
  • Participated in building a CDH4 test cluster for implementing Kerberos authentication; installed Cloudera Manager and Hue.

Environment: Hadoop, CDH4, Hue, MapReduce, Hive, Pig, Sqoop, Oozie, Impala, core Java/J2EE, JSON, Netezza, Maven, SVN, Eclipse

Confidential

Java Developer

Responsibilities:

  • Gathered requirements and prepared the low-level design.
  • Responsible for the design and development of the application
  • Implemented the Struts MVC framework with Tiles and validators.
  • UI development using the Struts framework and Tiles.
  • Involved in server-side programming as handlers for dynamic content generation and user interface (UI) work using XML, XSLT, HTML, CSS, DHTML, JavaScript (AJAX) and jQuery.
  • Application UI development using AJAX, HTML, JSP, XML and CSS.
  • Implemented Gang of Four design patterns and core J2EE patterns.
  • Implemented the functionalities using Java, J2EE, JSP, AJAX and Servlets.
  • Developed RESTful web services using the JAX-RS API and integrated them with the Struts framework.
  • Dynamic chart generation using the JFreeChart API in Java.
  • Developed an automated mail notification system using the JavaMail API in Java.
  • Java FTP programming.
  • Involved in database programming in Oracle 10g.
  • Worked as a module/tech lead for various modules like GCSP, ORNIS of the application.
  • Created the Stored Procedures, functions and triggers using PL/SQL.

Environment: Java, J2EE, JSP, Struts MVC, AJAX, JDBC, WAS 5.1, Eclipse, Oracle 10g, PL/SQL, HTML, DHTML, XML, JavaScript, Log4j, MS Visio, Toad, PL/SQL Developer, IBM Synergy, MTG
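
The design-pattern work listed above can be illustrated with a minimal Factory sketch in plain Java; the `Notification` types are hypothetical examples, not classes from the original application:

```java
// Sketch of the Factory pattern, one of the Gang of Four patterns referenced above.
// The Notification hierarchy here is a hypothetical example.
interface Notification {
    String send(String message);
}

class EmailNotification implements Notification {
    public String send(String message) {
        return "email: " + message;
    }
}

class SmsNotification implements Notification {
    public String send(String message) {
        return "sms: " + message;
    }
}

// The factory hides construction details behind a single creation point,
// so callers depend only on the Notification interface.
public class NotificationFactory {
    public static Notification create(String channel) {
        switch (channel) {
            case "email": return new EmailNotification();
            case "sms":   return new SmsNotification();
            default: throw new IllegalArgumentException("unknown channel: " + channel);
        }
    }

    public static void main(String[] args) {
        System.out.println(NotificationFactory.create("email").send("hello")); // prints "email: hello"
    }
}
```

The payoff of this arrangement is that adding a new channel means adding one class and one `case`, with no changes to calling code.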

Confidential

JAVA Developer

Responsibilities:

  • Designed and developed JSP, Servlets.
  • Wrote build scripts for compiling the application.
  • Developed stored procedures, triggers, and queries using PL/SQL.
  • Deployed application in the Websphere application server
  • Maintained responsibility for database design, implementation and administration.
  • Tested the functionality and behavioral aspects of the software.
  • Responsible for customer interaction, analysis of the requirements and project scheduling.
  • Responsible for designing the system based on UML concepts, which included data flow diagrams, class diagrams, sequence diagrams, state diagrams using Rational Rose Enterprise Edition.

Environment: UNIX, Windows, Core Java, SQL, JDBC, JavaScript, HTML, JSP, Servlet, Oracle.
