Hadoop Developer Resume

Atlanta, GA

SUMMARY

  • Over 8 years of programming and software development experience in Big Data and Java technologies, with skills in data analysis, design, development, testing and deployment of software systems from development through production.
  • 3 years of experience with Big Data and Hadoop Ecosystem tools, including Pig, Hive, Sqoop, Oozie, ZooKeeper, Flume & Impala.
  • Excellent knowledge of Hadoop Ecosystem architecture and components such as the Hadoop Distributed File System (HDFS), MRv1, MRv2, JobTracker, TaskTracker, NameNode, DataNode, ResourceManager, NodeManager and MapReduce programming.
  • Experience in analyzing data using Hive (HiveQL), Pig Latin, HBase & MapReduce programs (Java).
  • Experience in extending Hive and Pig core functionality by writing user-defined functions (UDFs).
  • Experience in migrating data between HDFS and relational database systems using Sqoop.
  • Good understanding of NoSQL databases and hands-on experience writing applications on NoSQL databases such as HBase.
  • Extensive programming experience in developing web-based applications using Core Java, J2EE, JSP, Servlets, Hibernate, JDBC, HTML and JavaScript.
  • Experience in deploying applications on web/application servers such as Tomcat, WebLogic and Oracle Application Server.
  • Experience in working with relational databases such as SQL Server, MySQL and Oracle.
  • Extensive experience in Unix Shell Scripting.
  • Good experience in extracting data and generating statistical analyses using the business intelligence tool Tableau.
  • Knowledge of installing, configuring, debugging and troubleshooting Hadoop clusters.
  • Strong knowledge of Software Development Life Cycle and expertise in detailed design documentation.
  • Extensive experience with Waterfall and Agile Scrum methodologies.
  • Experience in fixing bugs and tracking defects, issues and risks using ClearQuest, Quality Center and Jira.
  • Good experience in handling client calls and working in an Offshore-Onshore model.
  • Excellent analytical, logical, debugging and programming skills; self-motivated, quick learner and team player.
  • Excellent communication, organizational and time-management skills.

TECHNICAL SKILLS

Hadoop Ecosystem: Apache Hadoop (HDFS/MapReduce), Pig, Hive, HBase, Sqoop, Flume, Oozie, Hue, HiveQL, Pig Latin, Impala, Scala, Spark.

Programming Languages: Core Java, J2EE, SQL, Unix Shell, Pig, HiveQL, C++, C, Scala, JavaScript

Reporting Tools: Tableau

Java Technologies and Frameworks: Hibernate, JDBC, JSP, Servlets, XML, Multithreading, HTML.

Databases: MySQL, Oracle

Configuration Management: SVN, ClearCase.

Operating Systems: Unix, Windows

Web Servers: Apache Tomcat and Oracle WebLogic Server

IDEs: Eclipse, NetBeans.

Others: PuTTY, WinSCP, Cygwin.

PROFESSIONAL EXPERIENCE

Confidential, Atlanta, GA

Hadoop Developer

Responsibilities:

  • Evaluated the suitability of Hadoop and its ecosystem for the project and implemented various proof-of-concept (POC) applications to support adoption under the Big Data Hadoop initiative.
  • Estimated software and hardware requirements for the NameNode and DataNodes and planned the cluster.
  • Extracted the required data from the source server into HDFS and bulk loaded the cleaned data into HBase (a loading sketch follows this list).
  • Wrote MapReduce programs and Hive UDFs in Java where the required functionality was too complex for built-in functions (a UDF sketch follows this list).
  • Loaded data from the Linux file system into HDFS.
  • Developed Hive queries for analysis, to categorize different items.
  • Designed and created Hive external tables using a shared metastore instead of Derby, with partitioning, dynamic partitioning and buckets.
  • Delivered a POC of Flume to handle real-time log processing for attribution reports.
  • Performed sentiment analysis on reviews of the products on the client’s website.
  • Exported the resulting sentiment analysis data to Tableau for creating dashboards.
  • Used JUnit for unit testing MapReduce programs.
  • Maintained system integrity of all subcomponents (primarily HDFS, MapReduce, HBase and Hive).
  • Reviewed peers’ table creation in Hive, data loading and queries.
  • Monitored system health and logs and responded to any warning or failure conditions.
  • Managed test data coming from different sources.
  • Scheduled the Oozie workflow engine to run multiple Hive and Pig jobs.
  • Held weekly meetings with technical collaborators and actively participated in code review sessions with senior and junior developers.
  • Created and maintained technical documentation for launching Hadoop clusters and for executing Hive queries and Pig scripts.
  • Performed unit testing, interface testing, system testing and user acceptance testing of the workflow tool.
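
Below is a minimal, hypothetical sketch of the HBase loading step referenced above, using the Java client Put API of that era; the table name, column family and row key are placeholders, and very large loads would typically go through HFileOutputFormat and a completebulkload step instead:

    import java.io.IOException;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.client.HTable;
    import org.apache.hadoop.hbase.client.Put;
    import org.apache.hadoop.hbase.util.Bytes;

    // Hypothetical sketch: loads one cleaned record into HBase.
    // Table "cleaned_data", family "d" and the row key are placeholders.
    public class HBaseLoader {
        public static void main(String[] args) throws IOException {
            Configuration conf = HBaseConfiguration.create();
            HTable table = new HTable(conf, "cleaned_data");
            try {
                table.setAutoFlush(false); // batch puts client-side for throughput
                Put put = new Put(Bytes.toBytes("row-0001"));
                put.add(Bytes.toBytes("d"),                // column family
                        Bytes.toBytes("value"),            // column qualifier
                        Bytes.toBytes("cleaned payload")); // cell value
                table.put(put);
                table.flushCommits(); // push buffered puts to the region servers
            } finally {
                table.close();
            }
        }
    }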
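
And a minimal sketch of the kind of Hive UDF written in Java here; the class name and the normalization rule are hypothetical illustrations, not the project's actual functions:

    import org.apache.hadoop.hive.ql.exec.UDF;
    import org.apache.hadoop.io.Text;

    // Hypothetical sketch: normalizes raw category strings so Hive queries
    // can GROUP BY a clean value.
    public class NormalizeCategoryUDF extends UDF {
        public Text evaluate(Text rawCategory) {
            if (rawCategory == null) {
                return null; // pass NULL through unchanged
            }
            String cleaned = rawCategory.toString()
                    .trim()
                    .toLowerCase()
                    .replaceAll("[^a-z0-9 ]", ""); // strip punctuation and noise
            return new Text(cleaned);
        }
    }

After packaging the class into a JAR, such a function would be registered in Hive with ADD JAR and CREATE TEMPORARY FUNCTION and then called like any built-in function.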

Environment: Apache Hadoop, HDFS, Hive, MapReduce, Core Java, Flume, Cloudera, Oozie, MySQL, UNIX.

Confidential, St. Louis, MO

Big Data Hadoop/Java Developer

Responsibilities:

  • Migrated the required data from MySQL into HDFS using Sqoop and imported flat files of various formats into HDFS.
  • Worked mainly on Hive queries to categorize data of different claims.
  • Integrated the Hive warehouse with HBase.
  • Loaded data from the Linux file system into HDFS.
  • Wrote custom Hive UDFs in Java where the required functionality was too complex.
  • Implemented partitioning, dynamic partitioning and bucketing in Hive, designing and creating external tables against a shared metastore instead of Derby (a DDL sketch follows this list).
  • Generated final reporting data for testing using Tableau, connecting to the corresponding Hive tables through the Hive ODBC connector.
  • Managed test data coming from different sources.
  • Reviewed peers’ table creation in Hive, data loading and queries.
  • Held weekly meetings with technical collaborators and actively participated in code review sessions with senior and junior developers.
  • Monitored system health and logs and responded to any warning or failure conditions.
  • Gained experience in managing and reviewing Hadoop log files.
  • Scheduled the Oozie workflow engine to run multiple Hive and Pig jobs.
  • Performed unit testing, interface testing, system testing and user acceptance testing of the workflow tool.
  • Created and maintained technical documentation for launching Hadoop clusters and for executing Hive queries and Pig scripts.
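
As an illustration of the partitioned, bucketed external tables described above, here is a hypothetical sketch that issues the DDL from Java over the HiveServer2 JDBC driver; the connection string, table name, columns and HDFS location are all placeholders:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    // Hypothetical sketch: creates a partitioned, bucketed external Hive table.
    public class CreateClaimsTable {
        public static void main(String[] args) throws Exception {
            Class.forName("org.apache.hive.jdbc.HiveDriver");
            try (Connection conn = DriverManager.getConnection(
                         "jdbc:hive2://hiveserver-host:10000/default", "user", "");
                 Statement stmt = conn.createStatement()) {
                // EXTERNAL: dropping the table leaves the HDFS data in place.
                stmt.execute(
                    "CREATE EXTERNAL TABLE IF NOT EXISTS claims ("
                    + " claim_id BIGINT, member_id BIGINT, amount DOUBLE)"
                    + " PARTITIONED BY (claim_date STRING)"       // one directory per day
                    + " CLUSTERED BY (member_id) INTO 32 BUCKETS" // bucketed for joins/sampling
                    + " ROW FORMAT DELIMITED FIELDS TERMINATED BY ','"
                    + " LOCATION '/data/warehouse/claims'");
                // Allow INSERT ... SELECT to route rows into partitions dynamically.
                stmt.execute("SET hive.exec.dynamic.partition.mode=nonstrict");
            }
        }
    }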

Environment: Apache Hadoop, HDFS, Hive, MapReduce, Core Java, Pig, Sqoop, Cloudera CDH4, Oracle, MySQL, Tableau.

Confidential, Memphis, TN

Hadoop/Java Developer

Responsibilities:

  • Designed application components using the Java Collections Framework and used multithreading to provide concurrent database access (a concurrency sketch follows this list).
  • Involved in analysis, design, construction and testing of the application.
  • Developed the web tier using JSP to show details and summary.
  • Designed and developed the UI using JSP, HTML and JavaScript.
  • Used JPA for object/relational mapping and transparent persistence onto the SQL Server database.
  • Used the Tomcat web server for development purposes.
  • Created test cases for JUnit testing.
  • Used Oracle as the database and Toad for query execution, and wrote SQL scripts and PL/SQL code for procedures and functions.
  • Used CVS for version control.
  • Developed the application in Eclipse and used Maven as the build and deployment tool.
  • Explored and used Hadoop ecosystem features and architecture.
  • Met with the business team to gather requirements.
  • Migrated data from the staging database into HDFS using Sqoop.
  • Wrote custom MapReduce code, generated JAR files for user-defined functions and integrated them with Hive to help the analysis team with statistical analysis (a mapper/reducer sketch follows this list).
  • Loaded and transformed large sets of structured, semi-structured and unstructured data into HDFS.
  • Created Hive tables, loaded them with data and wrote Hive queries that run internally as MapReduce jobs.
  • Held weekly meetings with technical collaborators and actively participated in code review sessions with senior and junior developers.
  • Installed and configured Hadoop on the development cluster.
  • Performed unit testing, interface testing, system testing and user acceptance testing of the workflow tool.
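
For the concurrent database access mentioned above, here is a minimal hypothetical sketch: a fixed thread pool runs independent JDBC lookups in parallel, with each task opening its own connection because java.sql.Connection is not thread-safe (the URL, credentials, table and query are placeholders):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;

    // Hypothetical sketch: concurrent, independent database lookups.
    public class ConcurrentLookup {
        private static final String URL = "jdbc:oracle:thin:@dbhost:1521:orcl"; // placeholder

        public static void main(String[] args) {
            ExecutorService pool = Executors.newFixedThreadPool(4);
            for (int i = 1; i <= 100; i++) {
                final int accountId = i;
                pool.submit(() -> {
                    // One connection per task; connections are never shared across threads.
                    try (Connection conn = DriverManager.getConnection(URL, "user", "pass");
                         PreparedStatement ps = conn.prepareStatement(
                                 "SELECT balance FROM accounts WHERE id = ?")) {
                        ps.setInt(1, accountId);
                        try (ResultSet rs = ps.executeQuery()) {
                            if (rs.next()) {
                                System.out.println(accountId + " -> " + rs.getDouble(1));
                            }
                        }
                    } catch (Exception e) {
                        e.printStackTrace();
                    }
                });
            }
            pool.shutdown();
        }
    }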
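
And a minimal hypothetical sketch of the custom MapReduce code referenced above, counting records per category for the analysis team; the class names and the assumed column layout are illustrative, not the project's actual code:

    import java.io.IOException;

    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;

    // Hypothetical sketch: counts input records per category.
    public class CategoryCount {

        public static class CategoryMapper
                extends Mapper<LongWritable, Text, Text, IntWritable> {
            private static final IntWritable ONE = new IntWritable(1);
            private final Text category = new Text();

            @Override
            protected void map(LongWritable key, Text value, Context ctx)
                    throws IOException, InterruptedException {
                String[] fields = value.toString().split(",");
                if (fields.length > 2) {            // assume category is the third column
                    category.set(fields[2].trim());
                    ctx.write(category, ONE);
                }
            }
        }

        public static class SumReducer
                extends Reducer<Text, IntWritable, Text, IntWritable> {
            @Override
            protected void reduce(Text key, Iterable<IntWritable> values, Context ctx)
                    throws IOException, InterruptedException {
                int sum = 0;
                for (IntWritable v : values) {
                    sum += v.get();
                }
                ctx.write(key, new IntWritable(sum)); // category -> total count
            }
        }
    }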

Environment: Apache Hadoop, Core Java, Hive, Ubuntu, Eclipse, Cloudera, Sqoop, J2EE, JUnit, HTML, JSP.

Confidential, New York

Java Developer

Responsibilities:

  • Analyzed the system and gathered the system requirements.
  • Created design documents and reviewed them with the team, in addition to assisting the business analyst/project manager with explanations to the line of business.
  • Developed the web tier using JSP to show account details and summary.
  • Designed and developed the UI using JSP, HTML and JavaScript.
  • Used JPA for object/relational mapping and transparent persistence onto the SQL Server database (an entity mapping sketch follows this list).
  • Used the Tomcat web server for development purposes.
  • Used Oracle as the database and Toad for query execution, and wrote SQL scripts and PL/SQL code for procedures and functions.
  • Developed the application in Eclipse and used Maven as the build and deployment tool.
  • Used Log4j to write debug, info and warning output to the server console.
  • Implemented the business logic using the OOP features of Core Java and performed unit testing using JUnit.
  • Interacted with the business analyst for requirements gathering.
  • Tracked QA defects and issues using Quality Center.
  • Used WebSphere as the server and CVS for version control of the code.
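
A minimal hypothetical sketch of the JPA mapping approach used for transparent persistence; the entity, table and column names are placeholders, not the project's actual schema:

    import javax.persistence.Column;
    import javax.persistence.Entity;
    import javax.persistence.GeneratedValue;
    import javax.persistence.GenerationType;
    import javax.persistence.Id;
    import javax.persistence.Table;

    // Hypothetical sketch: maps an account summary row to a Java object so the
    // web tier can persist and load it without hand-written SQL.
    @Entity
    @Table(name = "ACCOUNT_SUMMARY")
    public class AccountSummary {
        @Id
        @GeneratedValue(strategy = GenerationType.IDENTITY)
        private Long id;

        @Column(name = "ACCOUNT_NUMBER", nullable = false, unique = true)
        private String accountNumber;

        @Column(name = "BALANCE")
        private double balance;

        protected AccountSummary() { } // JPA requires a no-arg constructor

        public AccountSummary(String accountNumber, double balance) {
            this.accountNumber = accountNumber;
            this.balance = balance;
        }

        public Long getId() { return id; }
        public String getAccountNumber() { return accountNumber; }
        public double getBalance() { return balance; }
    }

An EntityManager would then persist instances directly, e.g. em.persist(new AccountSummary("1001", 250.0)).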

Environment: Java, J2EE, Servlets, JSP, JUnit, XML, JavaScript, Log4j, CVS, Maven, Eclipse, Apache Tomcat, and Oracle

Confidential

Java Developer

Responsibilities:

  • Involved in requirements analysis, design, development and testing.
  • Set up the different user roles and maintained authentication to the application.
  • Designed, deployed and tested a multi-tier application using Java technologies.
  • Involved in front-end development using JSP, HTML & CSS.
  • Implemented the application using servlets (a servlet sketch follows this list).
  • Deployed the application on Oracle WebLogic Server.
  • Implemented multithreading concepts in Java classes to avoid deadlocks.
  • Used a MySQL database to store data and executed SQL queries on the backend.
  • Prepared and maintained the test environment.
  • Tested the application before going live to production.
  • Documented and communicated test results to the team lead on a daily basis.
  • Participated in weekly meetings with team leads and the manager to discuss issues and project status.
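
A minimal hypothetical sketch of the servlet pattern used for the application tier; the URL mapping, parameter name and response body are placeholders (real handlers forwarded to JSP views and HTML-escaped any user input):

    import java.io.IOException;
    import java.io.PrintWriter;

    import javax.servlet.ServletException;
    import javax.servlet.annotation.WebServlet;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    // Hypothetical sketch: handles a GET request and renders a simple response.
    @WebServlet("/status")
    public class StatusServlet extends HttpServlet {
        @Override
        protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                throws ServletException, IOException {
            String user = req.getParameter("user"); // null if the parameter is absent
            resp.setContentType("text/html");
            try (PrintWriter out = resp.getWriter()) {
                out.println("<html><body>");
                out.println("<p>Status OK for " + (user == null ? "guest" : user) + "</p>");
                out.println("</body></html>");
            }
        }
    }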

Environment: J2EE (Java, JSP, JDBC, Multithreading), HTML, Oracle WebLogic Server, Eclipse, MySQL, JUnit

Confidential

Java Developer

Responsibilities:

  • Involved in requirements analysis, design, development and testing.
  • Developed platform-related applications on mediation servers.
  • Worked on configuration management of the server using Core Java and Oracle DB.
  • Developed an upgrade framework for upgrading the servers.
  • Integrated upgrade frameworks such as Rolling Upgrade and Quick Reboot Upgrade for the NSN LTE mediation server/CSL server using Unix shell scripting and C++.
  • Fixed bugs in the configuration management and upgrade framework.
  • Tested the applications and upgrade functionality.
  • Developed LSNAP, a log snapshot tool for collecting system logs and status information on MED/CSL servers.
  • Interacted with the client directly during the integration of the upgrade framework.
Environment: Core Java, Oracle, UNIX Shell Scripting, C++
