Hadoop/Java Developer Resume

NJ

SUMMARY

  • Around 10 years of IT experience as a developer, designer and quality reviewer, with cross-platform integration experience in Java, J2EE, Hadoop and SOA. Worked across domains such as Airline, Telecom and Retail.
  • Hands-on experience with Hadoop, HDFS, MapReduce and the Hadoop ecosystem (Pig, Hive, Oozie and HBase).
  • Expertise in writing data-stitching programs using MapReduce.
  • Extensively worked on NoSQL databases such as HBase.
  • Experience in developing web pages using AngularJS, JSP, Servlets, Hibernate, HTML, XHTML, XML, CSS, JavaScript, Spring and Struts MVC.
  • Extensive experience in the design, development and support of Model-View-Controller applications using the Struts and Spring frameworks, with good experience in Design Patterns and UML.
  • Proficiency with application servers such as WebLogic, JBoss and Tomcat.
  • Developed core modules in large cross-platform applications using Java, J2EE, Spring, Struts, Hibernate, JAX-WS (SOAP)/JAX-RS (REST) web services, JMS and Message-Driven Beans.
  • Extensively worked with Java/J2EE design patterns such as Facade, Data Access Object, Service Locator, Business Delegate and MVC.
  • Expertise in debugging and in Oracle and Java performance tuning, with strong knowledge of Oracle 9i/10g and MySQL (SQL/PL-SQL).
  • Worked with multiple tools, including Eclipse 3.4, WebLogic Workshop, JDeveloper, DbVisualizer, TOAD and SQL Developer.
  • Experience with tools such as Assent PMD, Ant, Maven, JavaHelp, JUnit, Sonar, Cobertura, Axis2 and CXF.
  • Hands-on experience writing custom PMD rules and shell scripts.
  • Worked with Continuous Integration tools such as Hudson/Jenkins.
  • Expertise in Service-Oriented Architecture (SOA); extensively worked with Oracle ESB.
  • Strong working knowledge of test-driven and performance-driven architecture.
  • Experienced in the onsite/offshore model, with significant client-facing experience in an Agile development environment.
  • Strong experience in drafting and reviewing HLD, LLD and service definition documents.

TECHNICAL SKILLS

Languages: Java 1.4/5/6/7, J2EE 1.4/1.5

Big Data Ecosystems: Hadoop, MapReduce, HDFS, HBase, Zookeeper, Hive, Pig, Sqoop, Oozie, Flume

Technologies: JSP, Servlets, EJB, JDBC, JMS, JavaScript, XML, JUnit, JAXB, Web Services (JAX-RS/JAX-WS), Sonic ESB, Shell Script, JavaBeans, Message-Driven Beans

Frameworks: Struts 1.2, Spring 3.2, Hibernate

Backend Technologies: Oracle SQL, PL/SQL, WebLogic MQ, Sonic MQ

Build Technologies: Ant, Maven

Application/Web server: BEA WebLogic 10.3/8.1, WebSphere, JBoss 5.0, Jakarta Tomcat 6.0, Apache

Web Authoring Tools: XHTML, jQuery, HTML, JavaScript, AngularJS

Data Formats: XML, JSON, CSV

Design Methodologies: OOP, UML, Design Patterns, IoC

IDE/Tools: Eclipse 3.4.2, SOAP-UI, TOAD, SQL Developer

Version Control tools: Subversion, GIT, WinCVS

Other Tools: PMD, Axis2, CXF, Cobertura, Sonar, Enterprise Architect, Adobe Photoshop

Environment: Windows 7/XP, Linux/Unix

PROFESSIONAL EXPERIENCE

Confidential, NJ

Hadoop/Java Developer

Responsibilities:

  • Responsible for building scalable distributed data solutions using Hadoop.
  • Worked on web services, both consuming and exposing services.
  • Used Spring MVC, JavaScript and AngularJS for web page development.
  • Used Log4j extensively for log auditing.
  • Worked with Hadoop MapReduce and HDFS; developed multiple MapReduce jobs in Java for data cleaning and processing (a minimal sketch follows this list).
  • Worked extensively on creating MapReduce jobs to power data for search and aggregation.
  • Designed a data warehouse using Hive.
  • Handled structured, semi-structured and unstructured data.
  • Worked extensively with Sqoop for importing and exporting data between HDFS and relational database systems.
  • Optimized MapReduce jobs to use HDFS efficiently by using various compression mechanisms.
  • Handled importing of data from various data sources, performed transformations using Hive and MapReduce, loaded the data into HDFS and extracted data from MySQL into HDFS using Sqoop.
  • Exported the analyzed data to relational databases using Sqoop for visualization and to generate reports for the BI team.
  • Extensively used Pig for data cleansing.
  • Managed and reviewed Hadoop log files.
  • Involved in creating Hive tables, loading them with data and writing Hive queries that run internally as MapReduce jobs (see the Hive sketch after this list).
  • Developed Pig Latin scripts to extract data from web server output files and load it into HDFS.
  • Responsible for managing data coming from different sources.
  • Created partitioned tables in Hive.
  • Developed Oozie workflows to automate loading data into HDFS and pre-processing with Pig.
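
The following is a minimal, illustrative sketch of the kind of data-cleaning MapReduce job described above; the comma delimiter, expected field count and input/output paths are assumptions for the example, not details of the actual project.

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class CleanRecordsJob {

    // Map-only job: emit a record unchanged if it parses, drop it otherwise.
    public static class CleanMapper extends Mapper<Object, Text, Text, NullWritable> {
        private static final int EXPECTED_FIELDS = 5; // assumed schema width

        @Override
        protected void map(Object key, Text value, Context context)
                throws IOException, InterruptedException {
            String[] fields = value.toString().split(",", -1);
            // Keep only well-formed, non-empty records.
            if (fields.length == EXPECTED_FIELDS && !fields[0].isEmpty()) {
                context.write(value, NullWritable.get());
            }
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "clean-records");
        job.setJarByClass(CleanRecordsJob.class);
        job.setMapperClass(CleanMapper.class);
        job.setNumReduceTasks(0); // map-only: cleaned rows go straight to HDFS
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(NullWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}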
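
And a minimal sketch of creating and loading a partitioned Hive table over the HiveServer2 JDBC driver, in the spirit of the Hive work above; the table name, columns, connection URL and HDFS path are illustrative assumptions.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class HivePartitionedTable {
    public static void main(String[] args) throws Exception {
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        try (Connection con = DriverManager.getConnection(
                "jdbc:hive2://localhost:10000/default", "hive", "");
             Statement stmt = con.createStatement()) {
            // Partitioning by date keeps each day's queries scanning one slice.
            stmt.execute("CREATE TABLE IF NOT EXISTS web_logs ("
                    + " ip STRING, url STRING, status INT)"
                    + " PARTITIONED BY (dt STRING)"
                    + " ROW FORMAT DELIMITED FIELDS TERMINATED BY ','");
            // Load pre-cleaned data from HDFS into a single partition.
            stmt.execute("LOAD DATA INPATH '/data/clean/2014-01-01'"
                    + " INTO TABLE web_logs PARTITION (dt='2014-01-01')");
        }
    }
}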

Environment: Hadoop, MapReduce, HDFS, Hive, Apache HBase, Sqoop, Java (JDK 1.6), Pig, Flume, Oracle 11g/10g, DB2, Teradata, MySQL, Eclipse, PL/SQL, Linux, Shell Scripting, SQL Developer, TOAD, PuTTY, XML/HTML, JIRA

Confidential, Richmond, VA

Hadoop Developer

Responsibilities:

  • Worked on importing and exporting data between DB2 and HDFS using Sqoop.
  • Used Flume to collect, aggregate and store web log data from web servers and push it to HDFS.
  • Developed MapReduce programs in Java to convert JSON to CSV and TSV formats and perform analytics.
  • Wrote Hive queries and analyzed partitioned and bucketed data to compute various analytical metrics for reporting.
  • Developed Pig Latin scripts for data cleansing and analysis of semi-structured data.
  • Used Pig as an ETL tool to perform transformations, event joins and pre-aggregations before storing the curated data in HDFS.
  • Involved in writing Hive and Pig UDFs to perform aggregations on customer data.
  • Involved in migrating ETL processes from relational databases to Hive to enable easier data manipulation.
  • Performed MapReduce integration to import large amounts of data into HBase.
  • Performed CRUD operations using the HBase Java Client API (see the sketch after this list).
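
Below is a minimal sketch of HBase CRUD through the Java client (shown with the 1.x-era Connection/Table API); the table name, column family and row key are illustrative assumptions.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Delete;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;

public class HBaseCrudExample {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        try (Connection conn = ConnectionFactory.createConnection(conf);
             Table table = conn.getTable(TableName.valueOf("customers"))) {
            // Create/update: write one cell under the "info" column family.
            Put put = new Put(Bytes.toBytes("row-1"));
            put.addColumn(Bytes.toBytes("info"), Bytes.toBytes("name"),
                    Bytes.toBytes("Alice"));
            table.put(put);

            // Read: fetch the cell back by row key.
            Result result = table.get(new Get(Bytes.toBytes("row-1")));
            System.out.println(Bytes.toString(
                    result.getValue(Bytes.toBytes("info"), Bytes.toBytes("name"))));

            // Delete: remove the whole row.
            table.delete(new Delete(Bytes.toBytes("row-1")));
        }
    }
}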

Environment: CDH, Hadoop, HDFS, MapReduce, Hive, Pig, Sqoop, Flume, DB2, HBase, Java, JUnit, Agile.

Confidential

Java/Hadoop Developer

Responsibilities:

  • Used the Spring MVC framework for displaying reports.
  • Involved in developing web pages using JSP, Servlets, CSS and JavaScript/jQuery.
  • Involved in various POC activities using technologies such as MapReduce, Hive, Pig and Oozie.
  • Developed Pig UDFs to preprocess the data for analysis (a minimal UDF sketch follows this list).
  • Involved in designing and implementing a service layer over the HBase database.
  • Imported data from various data sources such as Oracle and Comptel servers into HDFS using Sqoop and MapReduce.
  • Involved in loading data from Linux and UNIX file systems into HDFS.
  • Developed Hive queries for the analysts.
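
A minimal sketch of a Java Pig UDF of the preprocessing kind mentioned above; the normalization logic (trim and lower-case) is an illustrative assumption.

import java.io.IOException;

import org.apache.pig.EvalFunc;
import org.apache.pig.data.Tuple;

public class NormalizeField extends EvalFunc<String> {
    @Override
    public String exec(Tuple input) throws IOException {
        if (input == null || input.size() == 0 || input.get(0) == null) {
            return null; // Pig treats a null return as missing data
        }
        // Trim whitespace and lower-case the field before analysis.
        return input.get(0).toString().trim().toLowerCase();
    }
}

In a Pig script the jar would be loaded with REGISTER and the UDF applied per record, e.g. clean = FOREACH logs GENERATE NormalizeField(url);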

Environment: Struts, Spring, AngularJS, Hadoop, MapReduce, HDFS, Hive, Oozie, Java (JDK 1.6), Cloudera, NoSQL, Oracle 11g, TOAD 9.6, Windows NT, UNIX

Confidential

Senior Java Developer

Responsibilities:

  • Worked as a lead developer with a team of 5 and developed the application using Java/J2EE, Spring, JAX-WS, SOAP and Hibernate.
  • Used the Spring MVC framework for the view layer (a minimal controller sketch follows this list).
  • Responsible for reviewing the code developed by team members and making changes for performance tuning.
  • Worked with the Continuous Integration tool Jenkins to fetch, build and deploy the code.
  • Used Rational Rose for UML diagrams such as Use Case, Object, Class and Sequence diagrams to represent the detailed design phase.
  • Used JUnit for unit testing.
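
A minimal sketch of a Spring MVC (3.x-style) controller for a view page, in the spirit of the work above; the URL, model attribute and view name are illustrative assumptions.

import org.springframework.stereotype.Controller;
import org.springframework.ui.Model;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;

@Controller
public class ReportController {

    @RequestMapping(value = "/report", method = RequestMethod.GET)
    public String showReport(Model model) {
        // The view (e.g. a JSP) renders whatever the model carries.
        model.addAttribute("title", "Monthly Report");
        return "report"; // logical view name, resolved by the ViewResolver
    }
}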

Environment: Java (JDK 1.6), J2EE, Web Services, JAX-WS, JAX-RS, Spring, Hibernate, Smooks, Oracle WebLogic, SOAP-UI.

Confidential

SOA Developer

Responsibilities:

  • Organized and presented technical design documents to stakeholders/customers.
  • Worked in an Agile/Scrum environment and was involved in sprint planning for the team.
  • Designed ESB processes and prepared Low-Level Design documents.
  • Developed web services and exposed them to interfacing systems (a minimal JAX-WS sketch follows this list).
  • Developed code and reviewed peers' code.
  • Developed reusable components such as an error-management framework.
  • Led a team of five.
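
A minimal, generic JAX-WS sketch of exposing a service to interfacing systems (the actual project used Sonic ESB tooling); the service name, operation and URL are illustrative assumptions.

import javax.jws.WebMethod;
import javax.jws.WebService;
import javax.xml.ws.Endpoint;

@WebService
public class OrderStatusService {

    @WebMethod
    public String getStatus(String orderId) {
        // A real implementation would look the order up in a backend system.
        return "PENDING";
    }

    public static void main(String[] args) {
        // Publish the endpoint; the WSDL is served at this URL + "?wsdl".
        Endpoint.publish("http://localhost:8080/orderStatus", new OrderStatusService());
    }
}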

Environment: Sonic ESB, DXSI, Sonic MQ, Actional Intermediary (AI), Web Services, Jenkins, Java, Linux.

Confidential

Senior Java Developer

Responsibilities:

  • Involved in coding and in reviewing peers' code.
  • Involved in unit testing with JUnit, security testing and bug fixing (a minimal JUnit sketch follows this list).
  • Responsible for reviewing the code developed by team members and making changes for performance tuning.
  • Analyzed performance bottlenecks and tuned the code to meet the non-functional requirements (NFRs).
  • Involved in developing components such as audit, online help content, exception handling and logging.
  • Involved in writing shell scripts.
  • Worked with the CI tools Hudson/Jenkins.
  • Tuned application server (JBoss) performance.
  • Worked with Maven, Ant and Sonar.
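
A minimal sketch of the JUnit 4 style of unit test used here; the validator under test is a hypothetical class, inlined so the example is self-contained.

import static org.junit.Assert.assertFalse;
import static org.junit.Assert.assertTrue;

import org.junit.Test;

public class OrderIdValidatorTest {

    // Hypothetical class under test, inlined so the sketch compiles on its own.
    static class OrderIdValidator {
        static boolean isValid(String id) {
            return id != null && id.matches("ORD-\\d{5}");
        }
    }

    @Test
    public void acceptsWellFormedOrderId() {
        assertTrue(OrderIdValidator.isValid("ORD-12345"));
    }

    @Test
    public void rejectsMalformedOrderId() {
        assertFalse(OrderIdValidator.isValid("12345"));
        assertFalse(OrderIdValidator.isValid(null));
    }
}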

Environment: Struts 1.2, Spring 3.2, Hibernate, Java, XML, JBoss 4.2, Apache, MySQL 5.0, Zimbra, Sonar, CyberSource, Jenkins, JAX-RS, JAX-WS, JUnit, JUnitPerf, JavaHelp, Linux, JavaScript.

Confidential

Senior Java Developer

Responsibilities:

  • Worked as a lead developer with a team of 5 and developed the application from scratch, based on the existing ordering application, using Java/J2EE, Hibernate, XML, JMS and JAX-WS.
  • Used the XML DOM/SAX APIs for parsing XML (a minimal SAX sketch follows this list).
  • Used Ant for compilation and for building EAR files.
  • Worked with the CI tool Jenkins.
  • Used JUnit/Eclipse for unit testing of various modules.
  • Responsible for reviewing the code developed by team members and making changes for performance tuning.
  • Involved in writing shell scripts.
  • Involved in unit testing using the JUnit framework.
  • Provided support to ASG.
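
A minimal sketch of SAX-based XML parsing of the kind mentioned above; the element name being counted is an illustrative assumption.

import java.io.File;

import javax.xml.parsers.SAXParser;
import javax.xml.parsers.SAXParserFactory;

import org.xml.sax.Attributes;
import org.xml.sax.helpers.DefaultHandler;

public class OrderCountHandler extends DefaultHandler {
    private int orderCount;

    @Override
    public void startElement(String uri, String localName, String qName,
            Attributes attributes) {
        // SAX streams the document, so large files never sit fully in memory.
        if ("order".equals(qName)) {
            orderCount++;
        }
    }

    public static void main(String[] args) throws Exception {
        SAXParser parser = SAXParserFactory.newInstance().newSAXParser();
        OrderCountHandler handler = new OrderCountHandler();
        parser.parse(new File(args[0]), handler);
        System.out.println("orders: " + handler.orderCount);
    }
}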

Confidential

Senior Java Developer

Responsibilities:

  • Interacted with customers and made changes as per customer requirements.
  • Responsible for bug fixes and code debugging throughout the SDLC process.
  • Involved in all the designing activities and helped the team to implement the designed frameworks.
  • Responsible for reviewing the code developed by the team members and making changes for performance tuning.

Environment: Struts, Hibernate, JavaScript, Oracle 10g database, Oracle 9iAS, TOAD, Oracle Forms and Report Builder, UNIX.
