Hadoop Developer Resume Profile

Atlanta, GA

Professional Summary

  • Over 7 years of professional IT experience, including experience with the Big Data ecosystem and Java/J2EE technologies.
  • Excellent experience with Hadoop architecture and its components, such as HDFS, JobTracker, TaskTracker, NameNode, DataNode, YARN, Pig, Hive, Impala, and the MapReduce programming paradigm.
  • Hands-on experience in installing, configuring, and using Hadoop ecosystem components such as Hadoop MapReduce, HDFS, HBase, Hive, Sqoop, Pig, Impala, Shark, and Flume.
  • Experience working with ETL tools such as Ab Initio and Talend.
  • Experience in managing and reviewing Hadoop log files.
  • Excellent understanding and knowledge of NoSQL databases such as MongoDB, HBase, and Cassandra.
  • Experience in Hadoop administration activities such as installation and configuration of clusters using Apache, Cloudera, and AWS.
  • Involved in setting up standards and processes for Hadoop-based application design and implementation.
  • Experience in importing and exporting data between HDFS and relational database systems using Sqoop and custom shell scripts.
  • Experience in Object-Oriented Analysis and Design (OOAD) and software development using UML methodology; good knowledge of J2EE and Core Java design patterns.
  • Very good experience with the complete project life cycle (design, development, testing, and implementation) of client/server and web applications.
  • Extensive experience working with Oracle, DB2, SQL Server, and MySQL databases.
  • Hands-on experience with VPN, PuTTY, WinSCP, VNC Viewer, etc.
  • Scripting to deploy monitors and checks and to automate critical system administration functions.
  • Good experience in ETL, documentation, support, testing, data mapping, transformation, and loading between source and target databases in a complex, high-volume environment to populate all development and testing schemas.
  • Hands on experience in application development using Java, RDBMS, and Linux shell scripting.
  • Experience in Java, JSP, Servlets, EJB, WebLogic, WebSphere, Hibernate, Spring, JBoss, JDBC, RMI, JavaScript, Ajax, jQuery, XML, and HTML.
  • Extensive experience with version control and source code management tools such as GitHub and Rational ClearCase.
  • Ability to adapt to evolving technology, strong sense of responsibility and accomplishment.
  • Major strengths include familiarity with multiple software systems and the ability to learn new technologies quickly and adapt to new environments; a self-motivated, focused, and adaptive team player and quick learner with excellent interpersonal, technical, and communication skills.

Technical Skills

Big Data Ecosystem

HDFS, HBase, Hadoop MapReduce, ZooKeeper, Hive, Pig, Sqoop, Flume, Oozie, Cassandra, Talend, Impala, Shark

RDBMS/ Database

SQL Server 2000/2005/2008 R2, MS Access XP/2007, Sybase, Oracle 10g/9i, PL/SQL, HBase, Cassandra, Teradata

Data Analysis

Tableau, R

ETL

Talend, Ab Initio

Languages

Java, PHP, T-SQL, PL/SQL, C, .NET, XML

IDE

Visual Studio, Eclipse

Web Design Tools

HTML, CSS, JSP, MVC, Ajax, Struts

Scripting Languages

Shell scripting, Perl scripting, JavaScript

Operating Systems

MS-DOS, Windows, MAC, UNIX, LINUX

Data Modeling

Microsoft Visio 2000, Erwin 4.0/3.5

Professional Experience

Confidential

Hadoop Developer

Responsibilities:

  • Performed analysis to identify candidate tables from the source Teradata and Oracle systems for migration to Hadoop.
  • Prepared custom shell scripts for connecting to Teradata and pulling the data from Teradata tables to HDFS.
  • Used Sqoop to move data from the Oracle source tables to HDFS (see the sketch following this list).
  • Responsible for moving data from the source Teradata and Oracle systems to HDFS and for unit testing the moved files.
  • Involved in testing activities within the QA environment, including smoke testing, system testing, integration testing, and writing test cases.
  • Studied the Ab Initio transformations of the source tables in order to replicate them using Talend Hadoop ETL.
  • Implemented Talend jobs to extract data from different systems.
  • Created Hive tables over the moved files in HDFS.
  • Used Pig for transformations on the tables.
  • Performed a comparative analysis of Hive vs. Impala.
  • Managed and scheduled jobs on the Hadoop cluster using Oozie (a submission sketch follows this role's Environment line).
  • Responsible for code promotion using GitHub.
  • Used R for analysis of the data.
  • Responsible for preparing the File level and Column level metadata.
  • Prepared the source-to-target mapping of the files.
  • Extensively used the Hue browser for interacting with Hadoop components.
  • Used a CDH 5.1 Hadoop cluster with 24 nodes in the development environment.
  • Worked in an Agile environment with a team of 7 people.
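
The Sqoop pull and Hive table creation described above can be illustrated with a minimal sketch; the connection string, credentials, schema, table, and HDFS paths below are hypothetical placeholders rather than the actual project values.

    #!/bin/bash
    # Hypothetical sketch: import an Oracle table into HDFS with Sqoop,
    # then expose the imported files through an external Hive table.
    sqoop import \
      --connect jdbc:oracle:thin:@//oracle-host:1521/ORCL \
      --username etl_user --password-file /user/etl_user/.ora_pass \
      --table SALES.ORDERS \
      --target-dir /data/staging/orders \
      --fields-terminated-by '\t' \
      --num-mappers 4

    hive -e "
    CREATE EXTERNAL TABLE IF NOT EXISTS staging.orders (
      order_id    BIGINT,
      customer_id BIGINT,
      order_date  STRING,
      amount      DOUBLE
    )
    ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
    LOCATION '/data/staging/orders';"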

Environment: Hadoop, HDFS, Hive, Impala, Spark, Talend, Teradata, Oracle, Flume, R, HBase, Sqoop, Pig, Java (JDK 1.6), Eclipse, Ubuntu, ZooKeeper, Ab Initio, GitHub, YARN
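
The Oozie scheduling noted in the responsibilities above was driven by workflow and coordinator definitions stored in HDFS; the following is a hypothetical submission sketch, with hosts, paths, and property values as placeholders rather than the project's actual configuration.

    # Stage the (assumed) workflow/coordinator definitions in HDFS
    hdfs dfs -put -f workflow.xml coordinator.xml /user/etl_user/apps/ingest/

    # job.properties (placeholder values):
    #   nameNode=hdfs://nn-host:8020
    #   jobTracker=rm-host:8032
    #   oozie.coord.application.path=${nameNode}/user/etl_user/apps/ingest

    # Submit the coordinator job, then check its status
    oozie job -oozie http://oozie-host:11000/oozie -config job.properties -run
    oozie job -oozie http://oozie-host:11000/oozie -info <job-id>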

Confidential

Hadoop Developer

Responsibilities:

  • Worked on analyzing the Hadoop cluster and different big data analytic tools, including Pig, Hive, and Impala.
  • Responsible for building scalable distributed data solutions using Hadoop.
  • Installed and configured Flume, Hive, Pig, Sqoop, HBase on the Hadoop cluster.
  • Managing and scheduling Jobs on a Hadoop cluster.
  • Worked on cluster installation, commissioning and decommissioning of DataNodes, NameNode recovery, capacity planning, and slot configuration.
  • Resource management of the Hadoop cluster, including adding/removing cluster nodes for maintenance and capacity needs.
  • Automated jobs using Oozie.
  • Involved in loading data from the UNIX file system into HDFS (see the sketch following this list).
  • Migrated ETL processes from Oracle to Hive to test easier data manipulation.
  • Developed Hive queries to process the data for visualizing.
  • Created HBase tables to store variable data formats of PII data coming from different portfolios.
  • Implemented best income logic using Pig scripts.
  • Implemented test scripts to support test driven development and continuous integration.
  • Responsible for managing data coming from different sources.
  • Installed and configured Hive and wrote Hive UDFs.
  • Experienced in loading and transforming large sets of structured, semi-structured, and unstructured data.
  • Provided cluster coordination services through ZooKeeper.
  • Managed and reviewed Hadoop log files.
  • Exported the analyzed data to relational databases using Sqoop for visualization and for generating reports for the BI team.
  • Analyzed large data sets to determine the optimal way to aggregate and report on them.
  • Supported setting up the QA environment and updating configurations for implementing scripts with Pig and Sqoop.
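
A minimal sketch of the UNIX-to-HDFS load and the Sqoop export of analyzed results mentioned above; the directories, connection string, credentials, and table names are hypothetical placeholders.

    #!/bin/bash
    # Hypothetical sketch: stage local feed files into HDFS, then export
    # Hive-produced, tab-delimited summary data to Oracle for the BI team.
    hdfs dfs -mkdir -p /data/raw/clickstream
    hdfs dfs -put -f /var/feeds/clickstream/*.log /data/raw/clickstream/

    sqoop export \
      --connect jdbc:oracle:thin:@//oracle-host:1521/ORCL \
      --username bi_user --password-file /user/bi_user/.ora_pass \
      --table BI.DAILY_SUMMARY \
      --export-dir /data/marts/daily_summary \
      --input-fields-terminated-by '\t'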

Environment: Hadoop, HDFS, Hive, Flume, HBase, Sqoop, Pig, Java (JDK 1.6), Eclipse, Oracle, Impala, ZooKeeper, Hortonworks

Confidential

Hadoop Developer

Responsibilities:

  • Installed and configured Hadoop MapReduce and HDFS; developed multiple MapReduce jobs in Java for data cleaning and preprocessing.
  • Installed and configured Pig and wrote Pig Latin scripts.
  • Involved in managing and reviewing Hadoop log files.
  • Used Sqoop to load data from SQL Server into HDFS on a regular basis (see the sketch following this list).
  • Developed scripts and batch jobs to schedule various Hadoop programs.
  • Created Hive tables and worked on them using HiveQL.
  • Wrote Hive queries for data analysis to meet the business requirements.
  • Experienced in defining job flows.
  • Gained good experience with NoSQL databases such as HBase and MongoDB.
  • Designed and implemented a MapReduce-based, large-scale parallel relation-learning system.
  • Setup and benchmarked Hadoop/HBase clusters for internal use.
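
A hedged sketch of the recurring SQL Server import and the HiveQL analysis described above, as it might run from a scheduled batch job; the JDBC URL, tables, columns, and query are illustrative placeholders, not the project's actual definitions.

    #!/bin/bash
    # Hypothetical nightly batch job (e.g., invoked from cron):
    # pull a SQL Server table into HDFS, then run a HiveQL summary query.
    RUN_DATE=$(date +%F)

    sqoop import \
      --connect "jdbc:sqlserver://mssql-host:1433;databaseName=sales" \
      --username etl_user --password-file /user/etl_user/.mssql_pass \
      --table orders \
      --target-dir /data/raw/orders/${RUN_DATE} \
      --num-mappers 4

    hive -e "
    INSERT OVERWRITE TABLE analytics.daily_order_counts PARTITION (dt='${RUN_DATE}')
    SELECT customer_id, COUNT(*) AS order_cnt
    FROM staging.orders
    WHERE order_date = '${RUN_DATE}'
    GROUP BY customer_id;"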

Environment: Hadoop, MapReduce, HDFS, Hive, Java, Hortonworks and Cloudera Hadoop distributions, Pig, HBase, Linux, XML, SQL Server, MySQL Workbench, Java 6, Eclipse, Oracle 10g, MongoDB.

Confidential

Java Developer

Responsibilities:

  • Participated in the creation of Use Cases, Class Diagrams, and Sequence Diagrams for analysis and design of application.
  • Developed an intranet web application using J2EE architecture, using JSP to design the user interfaces.
  • Used JSP, HTML, JavaScript, and CSS for content layout and presentation.
  • Developed the application based on MVC architecture using the Struts framework; designed Action classes and Form Beans.
  • Created web application prototype using jQuery.
  • Used Spring to implement the Business layer and Data Access layer.
  • Used jQuery to make the front-end components interact with JavaScript functions, adding dynamism to the web pages on the client side.
  • Involved in configuring Hibernate to access the database and retrieve data from it.
  • Developed several web pages using JSP, HTML, and XML.
  • Used JavaScript to perform checks and validations on the client side.
  • Involved in server-side validation based on the business rules.
  • Developed Servlets and JSPs based on MVC pattern using Struts framework and Spring Framework.
  • Worked on jQuery, AJAX, JSON, and JSF for designing highly interactive web pages.
  • Developed Stored Procedures, Triggers and Packages in Oracle.
  • Developed DAO pattern for Database connectivity.
  • Used JDBC API to establish connection between Java and Database.
  • Designed and developed the user interface screens, database design changes and changes to user access modules.
  • Developed additional UI components using JSF and implemented an asynchronous, AJAX/jQuery-based rich client to improve customer experience.
  • Developed server-side common utilities for the application and the front-end dynamic web pages using JSP, JavaScript, HTML/DHTML, and CSS.
  • Implemented test cases for Unit testing of modules using JUnit.

Environment: IBM WebSphere, Spring, Java JDK, J2EE, JSP, Servlets, Hibernate, HTML, JavaScript, JDBC, Struts, XML, JUnit, RAD, Oracle 10g.

Confidential

Software Engineer

Responsibilities:

  • Interacted with business analysts to gather requirements for system dependencies.
  • Participated in peer reviews and walkthroughs of program code and test specifications.
  • Worked on Struts Framework components like struts-config.xml, validator-rules.xml, validation.xml, struts action classes, form-beans.
  • Worked on creating Hibernate configuration and mapping files for the persistence layer (JPA) and transaction management.
  • Played an active role in application enhancement using the Spring MVC framework.
  • Implemented the data access layer with the object-relational mapping (ORM) tool Hibernate, using the standard DAO pattern and HQL queries.
  • Worked on Service-Oriented Architecture (SOA) using web services (JAX-WS).
  • Extensively used AJAX for validations with Direct Web Remoting (DWR).
  • RAD 6.0 was used as the IDE for developing the application.
  • Wrote customized UNIX scripts (C shell, TC shell, and Bourne shell) and moved them to the production environment after stress testing.
  • The Java I/O API was used for reading and writing Java objects.
  • Ensured a minimum ticket count in the queue by fixing errors within time constraints.
  • Reported ongoing performance issues to AMEX IT using the on-call report database.
  • Designed and implemented exception handling strategies.
  • Followed RUP methodologies during the project execution time.
  • Used the Apache Maven 2 plug-in for Eclipse to build the application.
  • Wrote extensive SQL queries for data retrieval and manipulation using JDBC and JNDI on Oracle.
  • Set up WebLogic 8.1 during deployment and testing.
  • IBM Rational ClearCase 6.0 was used for version control.
  • IBM Rational ClearQuest 6.0 was used for bug tracking.
  • Developed JUnit test classes for testing the application code.
  • Performed random and regression testing of the application to reduce the number of defects.
  • Worked on developing web services on WSAD 5.x and WAS 5.x.

Environment: Java, JSP, DHTML, HTML, Servlets, EJB, JDBC, JNDI, AJAX 1.5, XML, PL/SQL, Struts, Hibernate 2.0, Spring, SOA web services (JAX-WS, JAX-RS), IBM Rational ClearQuest, IBM Rational ClearCase, Log4j, Maven plug-in, RAD, WebLogic, Toad, JBoss, UNIX, Oracle 10g.
