
Java/Hadoop Developer Resume


NC

SUMMARY

  • 8+ years of diverse IT industry experience across all phases of the SDLC, from requirement gathering, analysis, design, development and testing to deployment and production support, working both individually and in onsite-offshore models.
  • 3+ years as an Application Integrator working on SDLC, system architecture/design and business process automation.
  • 3+ years in lead/solution architect roles, developing appropriate solutions for a wide range of clients.
  • Sound knowledge of Core Java, RDBMS and the Hadoop big data ecosystem (HDFS, MapReduce, Hive, HBase, ZooKeeper, Sqoop, Flume, Kafka, Spark, Scala and Oozie), with hands-on working experience.
  • Primary technical skills in Apache Hadoop, the MapReduce framework, Pig and Hive.
  • 4+ years of programming experience developing web-based applications and client-server technologies using Java, JavaScript, J2EE (JDBC, JMS, JSP, Servlets, JavaBeans), Struts, HTML, XHTML/DHTML and XML.
  • Developed SOA, EAI, J2EE, Web Services and workflow-based solutions for cost reduction, quicker response times, integration with disparate systems and efficiency improvements.
  • Work effectively with stakeholders (business and technical, including executive management) within large organizations, liaising with third-party vendors, global teams and system integrators.
  • Strong expertise in the implementation of EAI solutions on UNIX (Sun Solaris), Linux and Windows platforms for large-scale enterprises.
  • Strong foundation in Java technology (J2SE 1.5/1.4, J2EE) and a good understanding of object-oriented concepts.
  • Extensively worked on XML technologies: XML, XSL, XSD, XSLT, XQuery, XPath and DTD.
  • Experienced in front-end and middleware frameworks such as Struts, EJB, RMI and JUnit.
  • Strong experience using IDEs such as Eclipse 3.2/3.0/2.1.1, NetBeans 5.5/3.3, JCAPS eDesigner and TIBCO Designer.
  • Sound knowledge of J2EE and Gang of Four design patterns such as Front Controller, Session Façade, Singleton, Business Delegate and DAO, and of implementing MVC architecture using Struts to reuse the most efficient and effective development strategies.
  • Expertise in working with Oracle 10g/9i/8i and DB2 9.1/8.1/7.2 databases and writing SQL queries, triggers and stored procedures.
  • Work experience on platforms such as UNIX, Windows 2000/2003/XP/Vista/7 and Sun Solaris.
  • Full lifecycle experience, including customer reviews/meetings, requirements gathering, architectural review, high- and low-level design, coding, peer walkthroughs, testing, acceptance, delivery/installation, technical support and maintenance.
  • 3+ years of release management experience: documenting release activities, communicating with different teams and ensuring smooth releases.
  • Developed sequence diagrams using Microsoft Visio.
  • Good at Object Oriented Concepts and Design (OOC/OOD).
  • Very good experience in developing Test Cases.
  • Strong technical skills, high sense of ownership, good problem-solving skills, Client focused approach, fast learner and cohesive team player.
  • Currently exploring and working on cloud technologies from AWS.

TECHNICAL SKILLS

Big Data Ecosystem: Hadoop, MapReduce, HDFS, Hive, Pig, Sqoop, ZooKeeper.

NoSQL Databases: HBase, MongoDB 2.6.0.

Languages: Java (J2SE 1.7), Spark 1.6 with Scala 2.10.x, Hibernate, SQL, PL/SQL.

J2EE Technologies: JSP 2.1/2.0/1.2, Servlets 2.x, JavaBeans, JDBC, Struts 2.x/1.x, RMI, Web Services (SOAP and RESTful), SOA, JMS 1.1, SAX and DOM parsers.

Web Technologies: HTML/DHTML, JavaScript 1.x, XML 1.0, XSL, XSLT, CSS, JSON.

Development Tools (IDEs): Eclipse 3.2/3.0/2.1.1, MyEclipse 6.0/5.1.1, MS Visual Studio 2005.

Web/Application Servers: Apache Tomcat 6.x/5.x, WebLogic 10.3/9.2/8.1/7.0.

Cloud Technologies: AWS.

Design Patterns: Factory, Abstract Factory, Session Façade, Singleton, Command, State and DAO patterns.

RDBMS: Oracle 10g/9i/8i, DB2 9.1/8.1/7.2.

Platforms: Windows, UNIX.

Testing Tools: JUnit 4.x

Version Control: CVS, Perforce, PVCS, XML Canon.

Methodologies: Agile, Waterfall.

Build Tools: Ant 1.7, Maven 2.x

Other Tools: DbVisualizer, SoapUI, WinSCP, PL/SQL Developer, RazorSQL, Splunk, Tealeaf and Oracle SQL Developer.

PROFESSIONAL EXPERIENCE

Confidential, NC

Java/Hadoop Developer

Responsibilities:

  • Evaluated business requirements and prepared detailed specifications, following project guidelines, for the programs to be developed.
  • Responsible for building scalable distributed data solutions using Hadoop.
  • Analyzed large data sets to determine the optimal way to aggregate and report on them.
  • Implemented a data interface to retrieve customer information using REST APIs, pre-processed the data using MapReduce and stored it in HDFS (a mapper sketch follows this list).
  • Handled importing of data from various data sources, performed transformations using Hive and MapReduce, and loaded the data into HDFS.
  • Created Hive tables and worked with them using HiveQL.
  • Used SQL Developer to connect to the database and analyze the data.
  • Wrote stored procedures to communicate between the application and database tables.
  • Imported data from RDBMS to HDFS using Sqoop import/export options.
  • Integrated Oozie with the rest of the Hadoop stack, supporting several types of Hadoop jobs (such as Hive and Sqoop) as well as system-specific jobs (such as Java programs and shell scripts).
  • Automated the extraction of data from data warehouses and web logs into Hive tables by developing workflows and coordinator jobs in Oozie.
  • Implemented optimized joins to perform analysis on different data sets.
  • Worked on open-source J2EE technologies such as Spring Core, Spring JDBC, Spring Data and Spring Boot.
  • Created and refined architectural patterns and diagrams.
  • Ported JAX-WS and JAX-RS web services to Spring based on Swagger contracts.
  • Developed unit test cases using JUnit and Mockito (a test sketch follows this list).
  • Used Jenkins and Sonar for continuous integration and deployment.
  • Used Hive to analyze the partitioned and bucketed data and compute various metrics for reporting.
  • Developed Pig Latin scripts to extract the data and load into HDFS.
  • Implemented test scripts to support test-driven development and continuous integration.
  • Responsible for managing data coming from different sources.
  • Created HBase tables to store data in various formats coming from different portfolios.
  • Used Flume to collect, aggregate and store web log data from different sources such as web servers, mobile and network devices, and pushed it to HDFS.
  • Gained an overall understanding of AWS infrastructure.
  • Hands-on experience with AWS services such as EC2 and S3.
  • Experience in managing and reviewing Hadoop log files.
  • Used SVN version control tools for code management (check-ins, checkouts and synchronizing code with the repository).
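
As a concrete illustration of the MapReduce pre-processing described above, the following is a minimal mapper sketch. The class name, field layout (customerId, name, region) and comma delimiter are assumptions for illustration only, not the project's actual feed.

    import java.io.IOException;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;

    // Hypothetical pre-processing mapper: drops malformed records and
    // normalizes the delimiter before the data lands in HDFS.
    public class CustomerCleanseMapper extends Mapper<LongWritable, Text, Text, Text> {

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            String[] fields = value.toString().split(",");
            // Assumed layout: customerId, name, region.
            if (fields.length == 3 && !fields[0].isEmpty()) {
                context.write(new Text(fields[0]),
                              new Text(fields[1].trim() + "\t" + fields[2].trim()));
            }
            // Malformed rows are silently dropped; a counter could track them.
        }
    }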
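
The JUnit/Mockito bullet above can be illustrated with a short test sketch. The CustomerDao/CustomerService pair is hypothetical and exists only to show the mocking style used.

    import static org.junit.Assert.assertEquals;
    import static org.mockito.Mockito.*;

    import org.junit.Test;

    public class CustomerServiceTest {

        // Hypothetical collaborators, defined inline to keep the sketch self-contained.
        interface CustomerDao { String findName(String id); }

        static class CustomerService {
            private final CustomerDao dao;
            CustomerService(CustomerDao dao) { this.dao = dao; }
            String greet(String id) { return "Hello, " + dao.findName(id); }
        }

        @Test
        public void greetUsesDaoLookup() {
            CustomerDao dao = mock(CustomerDao.class);   // stub the data access layer
            when(dao.findName("42")).thenReturn("Ada");

            assertEquals("Hello, Ada", new CustomerService(dao).greet("42"));
            verify(dao).findName("42");                  // assert the interaction happened
        }
    }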

Environment: Hadoop, MapReduce, Spark, Hive, HDFS, Oracle 11g, PL/SQL, SQL*Plus, Spring Core, Spring JDBC, Spring Data, Spring Boot, Toad, UNIX shell scripting, Eclipse, DataStage.

Confidential, MO

Java /Hadoop Developer

Responsibilities:

  • Involved in various stages of the Software Development Life Cycle (SDLC) during application development.
  • Used Sqoop as a data ingestion tool to import data from RDBMS into HDFS and Hive.
  • Channeled log data collected from web servers into HDFS using Flume and Spark Streaming.
  • Processed data using Spark, aggregating and calculating statistical values through various transformations and actions (a sketch follows this list).
  • Analyzed large data sets using Hive queries.
  • Implemented bucketing in Hive tables and designed tables to enhance performance (DDL sketched after this list).
  • Developed Spark scripts using Scala shell commands as per requirements.
  • Processed schema-oriented and non-schema-oriented data using Scala and Spark.
  • Designed and developed a system to collect data from multiple portals and process it using Spark.
  • Involved in developing Pig scripts to transform raw data into forms useful for gaining business insights.
  • Extensively used HQL for analyzing, testing and prototyping data solutions in Hive.
  • Worked on Snappy compression for Avro and Parquet files.
  • Worked on open-source J2EE technologies such as Spring Core, Spring JDBC, Spring Data and Spring Boot.
  • Configured the Oozie workflow engine to run multiple Hive and Pig jobs.
  • Developed Spark code using Scala and Spark-SQL/Streaming for faster testing and processing of data.
  • Migrated ETL processes from RDBMS to Hive to enable easier data manipulation.
  • Implemented test scripts to support test-driven development and continuous integration.
  • Supported QA engineers in understanding, troubleshooting and testing.
  • Mentored analysts and the test team in writing Hive queries.
  • Provided cluster coordination services through ZooKeeper.
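
A minimal sketch of the Spark aggregation work described above, written against the Spark 1.6 Java API for consistency with the other examples; the HDFS paths, field layout and class name are assumptions.

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaPairRDD;
    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.JavaSparkContext;
    import scala.Tuple2;

    public class SalesByRegion {
        public static void main(String[] args) {
            JavaSparkContext sc = new JavaSparkContext(
                    new SparkConf().setAppName("SalesByRegion"));

            // Assumed input: CSV lines of the form region,amount already in HDFS.
            JavaRDD<String> lines = sc.textFile("hdfs:///data/sales/*.csv");

            JavaPairRDD<String, Double> totals = lines
                    .mapToPair(line -> {
                        String[] f = line.split(",");
                        return new Tuple2<>(f[0], Double.parseDouble(f[1]));
                    })
                    .reduceByKey((a, b) -> a + b);   // aggregate amounts per region

            totals.saveAsTextFile("hdfs:///out/sales_by_region");
            sc.stop();
        }
    }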
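
For the bucketing bullet above, this sketch shows the kind of bucketed-table DDL involved, issued through the Hive JDBC driver so the example stays in Java; the table layout, bucket count and server address are illustrative, not the project's schema.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    public class HiveBucketedTable {
        public static void main(String[] args) throws Exception {
            Class.forName("org.apache.hive.jdbc.HiveDriver");
            try (Connection con = DriverManager.getConnection(
                         "jdbc:hive2://hiveserver:10000/default", "", "");
                 Statement st = con.createStatement()) {
                // Bucketing by id spreads rows across 16 files, which speeds up
                // joins and sampling on the bucketed column.
                st.execute("CREATE TABLE IF NOT EXISTS txns (id BIGINT, amount DOUBLE) "
                         + "CLUSTERED BY (id) INTO 16 BUCKETS STORED AS ORC");
            }
        }
    }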

Environment: Hadoop, HDFS, MapReduce, Hive, Pig, HBase, Sqoop, Spark, Oozie, ZooKeeper, RDBMS/DB, MySQL, CSV, Oracle 11g, PL/SQL, SQL*Plus, Spring Core, Spring JDBC, Spring Data, Spring Boot.

Confidential

Java/J2EE Developer

Responsibilities:

  • Involved in the analysis and design phases of the Software Development Life Cycle (SDLC).
  • Analyzed business requirements and technical requirements.
  • Participated in requirements gathering for the application; coordinated with the business team to review requirements and walked through the Software Requirement Specification (SRS) document.
  • Participated in developing different UML diagrams such as Class diagrams, Use case diagrams and Sequence diagrams.
  • Designed database connections using JDBC.
  • Extensively used Eclipse IDE for all the J2EE applications.
  • Made modifications to the database using triggers, views, stored procedures, SQL and PL/SQL.
  • Implemented multi-threading for faster processing.
  • Used JDBC for database connectivity to SQL databases and to invoke stored procedures (a sketch follows this list).
  • Developed Action classes and DAO classes to access the database.
  • Used Hibernate for ORM mapping in the persistence layer of the application and wrote POJOs.
  • Involved in data modeling of Oracle tables for Java/J2EE applications.
  • Used GIT version control software to monitor and track all the changes that are done to the source code.
  • Used the Tomcat application server to deploy the applications.
  • Involved in unit testing.
  • Actively involved in customer interaction to strengthen customer relationship.
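
A minimal sketch of the JDBC stored-procedure pattern referenced above. The connection URL, credentials and procedure name are placeholders, and the snippet uses try-with-resources from a newer JDK than the Java 1.5 baseline listed below, purely for brevity.

    import java.sql.CallableStatement;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Types;

    public class AccountDao {
        // Placeholder URL and credentials; supply real values via configuration.
        private static final String URL = "jdbc:oracle:thin:@//dbhost:1521/ORCL";

        public String fetchStatus(long accountId) throws Exception {
            try (Connection con = DriverManager.getConnection(URL, "user", "pass");
                 CallableStatement cs = con.prepareCall("{call GET_ACCT_STATUS(?, ?)}")) {
                cs.setLong(1, accountId);                    // IN parameter
                cs.registerOutParameter(2, Types.VARCHAR);   // OUT parameter
                cs.execute();
                return cs.getString(2);
            }
        }
    }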

Environment: Oracle 11g, Java 1.5, Struts, Servlets, HTML, XML, SQL, J2EE, JUnit, Tomcat 6, MVC, JavaScript, Git.

Confidential

Java/J2EE Developer

Responsibilities:

  • Specified the system architecture and design utilizing UML analysis and design models.
  • Designed and developed UI screens using JSP, HTML and CSS.
  • Developed client-side validations using JavaScript.
  • Configured and implemented Log4j for the entire application.
  • Developed server-side programming using Servlets and JSP; interacted with the MySQL database using JDBC.
  • Used web services to extract customer-related product data from machines and servers using WSDL, XML and SOAP within a service-oriented architecture.
  • Worked with JMS queues to send messages in point-to-point mode (a sender sketch follows this list); performed unit testing using the JUnit framework.
  • Created and built the project using Ant; interacted closely with other teams, gathering information to resolve issues.
  • Designed and developed Optimization UI screens for Rate Structure, Operating Cost, Temperature and Predicted loads using JSF, JSP, JavaScript and HTML.
  • Created functions, subqueries and stored procedures using PL/SQL.
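
A minimal point-to-point JMS 1.1 sender in the style described above; the JNDI names depend on the actual JBoss configuration and are assumptions here.

    import javax.jms.Connection;
    import javax.jms.ConnectionFactory;
    import javax.jms.MessageProducer;
    import javax.jms.Queue;
    import javax.jms.Session;
    import javax.naming.InitialContext;

    public class OrderSender {
        public void send(String payload) throws Exception {
            // JNDI lookups; names are placeholders for the server's configuration.
            InitialContext ctx = new InitialContext();
            ConnectionFactory cf = (ConnectionFactory) ctx.lookup("ConnectionFactory");
            Queue queue = (Queue) ctx.lookup("queue/orders");

            Connection con = cf.createConnection();
            try {
                Session session = con.createSession(false, Session.AUTO_ACKNOWLEDGE);
                MessageProducer producer = session.createProducer(queue);
                producer.send(session.createTextMessage(payload));   // point-to-point send
            } finally {
                con.close();   // closes the session and producer as well
            }
        }
    }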

Environment: Java, J2EE, Log4J, JSP, JMS, Servlets, JDBC, MySQL, HTML, XML, JBoss, Eclipse, Unix.
