Java/Hadoop Developer Resume

Round Rock, Texas

SUMMARY

  • 7+ years of programming and software development experience spanning data analysis, design, development, testing, and deployment of software systems from development to production, with an emphasis on the object-oriented paradigm.
  • Excellent knowledge of Hadoop architecture and ecosystem components such as HDFS, JobTracker, TaskTracker, NameNode, DataNode, and the MapReduce programming paradigm.
  • More than 3 years of experience with tools in the Hadoop ecosystem, including Pig, Hive, HDFS, MapReduce, Sqoop, Oozie, and ZooKeeper.
  • Experience in migrating data between HDFS and relational database systems in both directions using Sqoop, according to client requirements.
  • Experience in analyzing data using HiveQL, Pig Latin, HBase, and custom MapReduce programs written in Java.
  • Good knowledge of Data warehousing concepts and ETL processes.
  • Good understanding of NoSQL databases and hands-on experience writing applications on NoSQL databases such as HBase and Cassandra.
  • Experienced in writing MapReduce jobs in Java (a minimal example follows this list).
  • Experience manipulating and analyzing large datasets and finding patterns and insights within structured and unstructured data.
  • Working knowledge of Spark and Scala, gained mainly while exploring frameworks for a transition from Hadoop MapReduce to Spark.
  • Knowledge of Apache Solr/Lucene for developing open-source enterprise search platforms.
  • Experience in Core Java, JSP, JDBC, Hibernate, Spring MVC Framework.
  • Knowledge of installing, configuring, debugging and troubleshooting Hadoop clusters.
  • Knowledge of machine learning techniques as implemented in Apache Mahout.
  • Experienced with Unix and Linux distributions (Ubuntu, CentOS, Debian).
  • Quick learner and effective team player with good communication skills.
  • Strong analytical and problem-solving skills.
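
A minimal sketch of the kind of MapReduce job referenced above, assuming the org.apache.hadoop.mapreduce (Hadoop 2.x) Java API; the job name, class names, and the comma-delimited input layout are illustrative, not taken from any specific project:

    import java.io.IOException;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class EventCount {

        // Mapper: emits (eventType, 1) for each input record
        public static class EventMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
            private static final IntWritable ONE = new IntWritable(1);
            private final Text eventType = new Text();

            @Override
            protected void map(LongWritable key, Text value, Context context)
                    throws IOException, InterruptedException {
                String[] fields = value.toString().split(",");
                if (fields.length > 1) {
                    eventType.set(fields[1]); // hypothetical column holding the event type
                    context.write(eventType, ONE);
                }
            }
        }

        // Reducer: sums the per-event-type counts; also usable as a combiner
        public static class SumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
            @Override
            protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                    throws IOException, InterruptedException {
                int sum = 0;
                for (IntWritable v : values) {
                    sum += v.get();
                }
                context.write(key, new IntWritable(sum));
            }
        }

        public static void main(String[] args) throws Exception {
            Job job = Job.getInstance(new Configuration(), "event count");
            job.setJarByClass(EventCount.class);
            job.setMapperClass(EventMapper.class);
            job.setCombinerClass(SumReducer.class);
            job.setReducerClass(SumReducer.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(IntWritable.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }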

TECHNICAL SKILLS

Hadoop Ecosystem: Apache Hadoop (HDFS/MapReduce), Pig, Hive, HBase, Sqoop, Flume, Oozie, Hue, HiveQL, Pig Latin

Advanced Big Data Technologies: DataStax Cassandra, Cloudera CDH4, Apache Mahout Library

Languages: Java, Scala, J2EE, SQL, UNIX, R (Statistics), Pig Latin, HiveQL

Data Analysis and Statistics/Machine Learning: Linear Regression Models, Principal Component Analysis, Statistical Distributions (Normal, Binomial, Poisson), Recommender Systems, Clustering

RDBMS: MySQL, MS SQL Server

Java Technologies and Framework: JDBC, Multi-threading, JSP, XML

Web Server: Apache Tomcat and Oracle WebLogic Server

Operating System: Windows, Unix and Linux

IDE and Software: Eclipse, NetBeans, RStudio, Revolution R Enterprise, Minitab, Tableau 8.2

Version Control: VSS (Visual SourceSafe), GitHub

Others: PuTTY, WinSCP, Cygwin

PROFESSIONAL EXPERIENCE

Confidential, Round Rock, Texas

Java/Hadoop Developer

Responsibilities:

  • Created Hive tables according to business requirements.
  • Wrote MapReduce jobs to identify user trends.
  • Extended Hive and Pig core functionality by writing custom UDFs (a Hive UDF sketch follows this list).
  • Imported and exported data into HDFS and Hive using Sqoop.
  • Evaluated the suitability of DataStax Cassandra and its components for the ongoing project, implementing and validating various proof-of-concept (POC) applications to support eventual adoption of the Cassandra database.
  • Wrote queries in DataStax Cassandra to create and alter collections and to insert and delete elements from lists, sets, and maps.
  • Used the data stored in DataStax Cassandra for searching, sorting, and grouping.
  • Involved in performance tuning of the Cassandra cluster by changing parameters for read operations, compaction, memory cache, and row cache.
  • Gained experience in storing large volumes of data in Cassandra for high availability of analytical data.
  • Used Solr/Lucene to develop an open-source enterprise search platform in a test and development environment.
  • Exported the analyzed data into relational databases using Sqoop for visualization and to generate reports for the BI team.
  • Collaborated with the customer services team to develop products implementing the insights obtained from analysis; financial management tools were created and deployed to help customers understand their spending and saving patterns.
  • Gained knowledge of cluster installation, commissioning and decommissioning of DataNodes, NameNode recovery, capacity planning, and slot configuration.
  • Assisted the team in development and deployment activities.
  • Defined Oozie job workflows according to their dependencies.
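
A minimal sketch of the kind of custom Hive UDF mentioned above, assuming the classic org.apache.hadoop.hive.ql.exec.UDF base class; the class name and normalization logic are illustrative:

    import org.apache.hadoop.hive.ql.exec.UDF;
    import org.apache.hadoop.io.Text;

    // Hive UDF: trims whitespace and lower-cases a string column,
    // so grouping and joins are not thrown off by formatting noise
    public class NormalizeString extends UDF {
        public Text evaluate(Text input) {
            if (input == null) {
                return null;
            }
            return new Text(input.toString().trim().toLowerCase());
        }
    }

Once packaged into a JAR, such a function is made available to Hive queries with ADD JAR and CREATE TEMPORARY FUNCTION.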

Environment: Hadoop (HDFS/MapReduce), Pig, Hive, UDFs, Sqoop, Oozie, DataStax Cassandra, Scala/Spark, Linux.

Confidential, Chicago

Java/Hadoop Developer

Responsibilities:

  • Wrote job workflows according to the requirements and their dependencies.
  • Migrated data from various RDBMS (MySQL) sources to HDFS on a regular basis using Sqoop.
  • Implemented Hive queries to aggregate the data and extract useful information, sorting by the required attributes.
  • Involved in functional team meetings to gather and understand business requirements.
  • Converted feasible business requirements into technical tasks in design documents.
  • Implemented partitions, dynamic partitioning, and buckets in Hive for efficient data access.
  • Wrote and implemented Pig UDFs to preprocess the data for analysis (a sketch follows this list).
  • Exported results from Pig and Hive scripts to an RDBMS (MySQL) using Unix shell scripts.
  • Developed simple and complex MapReduce jobs using Hive and Pig for analysis.
  • Used R to develop linear regression models.
  • Used Tableau for reporting and dashboarding.
  • Maintained and updated various configurations in the cluster through ZooKeeper.
  • Gained knowledge of building scalable and distributed data solutions using Hadoop.
  • Gained working knowledge of NoSQL databases (HBase and DataStax Cassandra).
  • Worked according to production environment configurations and functional change requests.
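
A minimal sketch of the kind of Pig UDF mentioned above, assuming the org.apache.pig.EvalFunc API; the class name and preprocessing rule are illustrative:

    import java.io.IOException;
    import org.apache.pig.EvalFunc;
    import org.apache.pig.data.Tuple;

    // Pig EvalFunc: strips non-numeric characters from a field so that
    // downstream script logic can safely cast it to a numeric type
    public class CleanDigits extends EvalFunc<String> {
        @Override
        public String exec(Tuple input) throws IOException {
            if (input == null || input.size() == 0 || input.get(0) == null) {
                return null;
            }
            return input.get(0).toString().replaceAll("[^0-9]", "");
        }
    }

In a Pig script, the packaged JAR is pulled in with REGISTER and the function referenced by its package-qualified name or a DEFINE alias.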

Environment: Hadoop (HDFS/MapReduce), Sqoop, Pig, Hive, ZooKeeper, UDFs, HBase, DataStax Cassandra.

Confidential, Memphis, TN

Java/Hadoop Developer

Responsibilities:

  • Designed application components using the Java Collections Framework and used multithreading to provide concurrent database access.
  • Explored and used Hadoop ecosystem features and architectures.
  • Involved in meeting with the business team to gather their requirements.
  • Migrated data from the staging database into HDFS using Sqoop.
  • Wrote custom MapReduce code, generated JAR files for user-defined functions, and integrated them with Hive to help the analysis team with statistical analysis.
  • Loaded and transformed large sets of structured, semi-structured, and unstructured data into HDFS.
  • Monitored Hadoop scripts that read input from HDFS and load data into Hive.
  • Created Hive tables, loaded them with data, and wrote Hive queries that run internally as MapReduce jobs.
  • Developed Hive queries for the analyst team (a JDBC sketch follows this list).
  • Attended weekly meetings with technical collaborators and actively participated in code review sessions with senior and junior developers.
  • Installed and configured Hadoop on the development cluster.
  • Performed clickpath/clickstream analysis to provide actionable feedback for improving the website design, which further helped improve product visibility and customer service.
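
A minimal sketch of how Hive queries like those above can be exposed to an analyst team from Java, assuming the HiveServer2 JDBC driver (org.apache.hive.jdbc.HiveDriver); the connection URL, table, and column names are hypothetical:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    // Runs an aggregate clickstream query through HiveServer2's JDBC interface
    public class TopPagesReport {
        public static void main(String[] args) throws Exception {
            Class.forName("org.apache.hive.jdbc.HiveDriver");
            try (Connection conn = DriverManager.getConnection(
                     "jdbc:hive2://localhost:10000/default", "", "");
                 Statement stmt = conn.createStatement();
                 ResultSet rs = stmt.executeQuery(
                     "SELECT page, COUNT(*) AS hits FROM clicks "
                   + "GROUP BY page ORDER BY hits DESC LIMIT 10")) {
                while (rs.next()) {
                    // each row is one page and its hit count, highest first
                    System.out.println(rs.getString("page") + "\t" + rs.getLong("hits"));
                }
            }
        }
    }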

Environment: Apache Hadoop, Core Java, Hive, Ubuntu, Eclipse, Cloudera Manager.

Confidential

Java Developer

Responsibilities:

  • Attended user group meetings to gather system requirements.
  • Analyzed and designed document specifications for the J2EE application.
  • Involved in the design and development phase of the application.
  • Implemented the business logic using EJB session beans (a sketch follows this list).
  • Developed User Interface and end user screens using Java Swing, JSP and Servlet.
  • Implemented web services using SOAP.
  • Responsible for periodic generation of reports.
  • Performed Unit testing of the application using JUnit.
  • Carried out the necessary validations of each developed screen by writing triggers, procedures, and functions along with the associated objects, events, and methods.
  • Designed and developed menus to navigate from one screen to another.
  • Used Hibernate framework with JDBC drivers to connect to the database and manipulate the data.
  • Used joins, triggers, stored procedures, and functions to interact with the backend database via SQL.
  • Reviewed changes on a weekly basis and ensured the quality of deliverables.
  • Actively documented common problems encountered during the testing, development, and production phases.
  • Coordinated with other development teams, system managers, and the webmaster, fostering a good working environment.
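
A minimal sketch of the session-bean pattern described above, using EJB 3-style annotations for brevity; the interface, bean, and method names are hypothetical:

    import javax.ejb.Remote;
    import javax.ejb.Stateless;

    // Remote business interface exposed to clients
    @Remote
    interface ReportService {
        String generateReport(String accountId);
    }

    // Stateless session bean: the container pools instances and manages
    // transactions, so the class keeps no conversational state
    @Stateless
    public class ReportServiceBean implements ReportService {
        @Override
        public String generateReport(String accountId) {
            // real business logic would query the database (JDBC/Hibernate) here
            return "Report for account " + accountId;
        }
    }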

Environment: J2EE, EJB, JSP, SOAP, JavaScript, Servlet, JDBC, SQL, UNIX, JUnit

Confidential

JAVA DEVELOPER

Responsibilities:

  • Involved in all development phases of the SDLC, including gathering requirements and documenting them as use case documents.
  • Designed, deployed, and tested a multi-tier application using Java technologies.
  • Involved in front end development using JSP, HTML & CSS.
  • Implemented the application using the Spring MVC framework (a controller sketch follows this list).
  • Deployed the application on Oracle WebLogic Server.
  • Implemented multithreading concepts in Java classes to avoid deadlocks.
  • Used MySQL database to store data and execute SQL queries on the backend.
  • Prepared and maintained the test environment; tested the application before it went live to production.
  • Documented and communicated test results to the team lead on a daily basis.
  • Involved in weekly meetings with team leads and the manager to discuss issues and project status.
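
A minimal sketch of an annotation-driven Spring MVC controller of the kind described above; the URL, view name, and model attribute are hypothetical:

    import org.springframework.stereotype.Controller;
    import org.springframework.ui.Model;
    import org.springframework.web.bind.annotation.RequestMapping;
    import org.springframework.web.bind.annotation.RequestParam;

    // Maps an HTTP request to a handler method and hands the result to a JSP view
    @Controller
    public class AccountController {

        @RequestMapping("/account")
        public String showAccount(@RequestParam("id") String id, Model model) {
            model.addAttribute("accountId", id); // exposed to the JSP via the model
            return "account"; // resolved to a JSP (e.g. /WEB-INF/views/account.jsp) by the view resolver
        }
    }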

Environment: J2EE (Java, JSP, JDBC, Multi-Threading), HTML, Oracle WebLogic Server, Eclipse, MySQL, JUnit

Confidential

JUNIOR SOFTWARE DEVELOPER

Responsibilities:

  • Involved in the complete software development life cycle (SDLC) of the application, from requirements analysis and review to testing.
  • Involved in Object Oriented Design and development using OOA/OOD methodology to capture and model business requirements.
  • Designed the front end of the application using HTML, JavaScript, and CSS.
  • Involved in preparing process flow diagrams, UML class diagrams, sequence diagrams, and the Technical Design Document (TDD).
  • Designed and developed the database for the project.
  • Tested the system on the Apache Tomcat server and later deployed it at the client location.
  • Developed unit test cases and test logs for multiple modules used in the project.
  • Used JUnit for unit testing of the application (a sketch follows this list).
  • Set up the development and review environment, which involved installing and configuring the Apache Tomcat server on a Debian Linux server.
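
A minimal sketch of a JUnit 4 unit test of the kind described above; the helper class under test is hypothetical and shown inline to keep the example self-contained:

    import static org.junit.Assert.assertEquals;
    import static org.junit.Assert.assertNull;

    import org.junit.Test;

    // Hypothetical helper under test
    class InputValidator {
        static String normalize(String s) {
            if (s == null || s.trim().isEmpty()) {
                return null;
            }
            return s.trim().toLowerCase();
        }
    }

    // JUnit 4 test case: one assertion-focused method per behavior
    public class InputValidatorTest {

        @Test
        public void trimsAndLowercasesInput() {
            assertEquals("debian", InputValidator.normalize("  Debian "));
        }

        @Test
        public void returnsNullForBlankInput() {
            assertNull(InputValidator.normalize("   "));
        }
    }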

Environment: Java, JavaScript, HTML, Apache Tomcat, Debian Linux, UML, JUnit
