
Hadoop Developer Resume


NYC, NY

SUMMARY

  • 7+ years of professional IT experience, including experience with Big Data ecosystem technologies.
  • Excellent understanding/knowledge of Hadoop architecture and various components such as HDFS, JobTracker, TaskTracker, NameNode, DataNode and the MapReduce programming paradigm.
  • Hands on experience in installing, configuring, and using Hadoop ecosystem components like Hadoop MapReduce, HDFS, HBase, Hive, Sqoop, Pig, Zookeeper and Flume.
  • Good exposure to Apache Hadoop MapReduce programming, Pig scripting, distributed applications and HDFS.
  • Good knowledge of Hadoop cluster architecture and cluster monitoring.
  • In - depth understanding of Data Structure and Algorithms.
  • Experience in managing and reviewing Hadoop log files.
  • Excellent understanding and knowledge of NoSQL databases like MongoDB, HBase and Cassandra.
  • Involved in setting up standards and processes for Hadoop-based application design and implementation.
  • Experience in importing and exporting data using Sqoop from HDFS to Relational Database Systems and vice-versa.
  • Experience in Object-Oriented Analysis and Design (OOAD) and software development using UML methodology; good knowledge of J2EE and Core Java design patterns.
  • Experience in managing Hadoop clusters using Cloudera Manager tool.
  • Very good experience in complete project life cycle (design, development, testing and implementation) of Client Server and Web applications.
  • Experience in administration, installation, configuration, troubleshooting, security, backup, performance monitoring and fine-tuning of Red Hat Linux.
  • Extensive experience working with Oracle, DB2, SQL Server and MySQL databases.
  • Hands-on experience with VPN, PuTTY, WinSCP, VNC Viewer, etc.
  • Wrote scripts to deploy monitors and checks and to automate critical system administration functions.
  • Hands on experience in application development using Java, RDBMS, and Linux shell scripting.
  • Experience in Java, JSP, Servlets, EJB, WebLogic, WebSphere, Hibernate, Spring, JBoss, JDBC, RMI, JavaScript, AJAX, jQuery, XML, and HTML.
  • Ability to adapt to evolving technology, strong sense of responsibility and accomplishment.

TECHNICAL SKILLS

Big Data Ecosystem: HDFS, HBase, Hadoop MapReduce, ZooKeeper, Hive, Pig, Sqoop, Flume, Oozie, Cassandra, Datameer, Pentaho

Languages: C, C++, Java, PHP, SQL/PLSQL

Methodologies: Agile, V-model.

Database: Oracle 10g, DB2, MySQL, MongoDB, CouchDB, MS SQL server, Amazon EC2

Web Tools: HTML, JavaScript, XML, ODBC, JDBC, JavaBeans, EJB, MVC, AJAX, JSP, Servlets, JavaMail, Struts, JUnit

IDE / Testing Tools: Eclipse.

Operating System: Windows, UNIX, Linux

Scripts: JavaScript, Shell Scripting

PROFESSIONAL EXPERIENCE

Confidential, NYC, NY

Hadoop Developer

Responsibilities:

  • Worked on analyzing the Hadoop cluster and different big data analytic tools including Pig, HBase and Sqoop.
  • Responsible for building scalable distributed data solutions using Hadoop
  • Installed and configured Flume, Hive, Pig, Sqoop, HBase on the Hadoop cluster.
  • Managed and scheduled jobs on the Hadoop cluster.
  • Implemented a nine-node CDH3 Hadoop cluster on Red Hat Linux.
  • Worked on cluster installation, commissioning and decommissioning of DataNodes, NameNode recovery, capacity planning, and slot configuration.
  • Set up a Hadoop cluster on Amazon EC2 using Apache Whirr for a proof of concept.
  • Performed resource management of the Hadoop cluster, including adding/removing cluster nodes for maintenance and capacity needs.
  • Involved in loading data from the UNIX file system to HDFS.
  • Created HBase tables to store variable data formats of PII data coming from different portfolios.
  • Implemented best income logic using Pig scripts.
  • Implemented test scripts to support test driven development and continuous integration.
  • Responsible for managing data coming from different sources.
  • Installed and configured Hive and wrote Hive UDFs.
  • Experienced in loading and transforming large sets of structured, semi-structured and unstructured data.
  • Provided cluster coordination services through ZooKeeper.
  • Experience in managing and reviewing Hadoop log files.
  • Exported the analyzed data to relational databases using Sqoop for visualization and to generate reports for the BI team.
  • Analyzed large data sets to determine the optimal way to aggregate and report on them.
  • Supported setting up the QA environment and updating configurations for implementing scripts with Pig and Sqoop.
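The Hive UDF work described above typically involves registering a custom JAR and exposing the Java class as a HiveQL function. A minimal sketch; the jar path, class name and function name are hypothetical, not taken from the actual project:

```sql
-- Register the jar containing the UDF class on the Hive session (path is hypothetical)
ADD JAR /opt/udfs/portfolio-udfs.jar;

-- Expose the Java UDF class under a HiveQL function name
CREATE TEMPORARY FUNCTION mask_pii AS 'com.example.hive.udf.MaskPii';

-- The registered function can then be used like any built-in function
SELECT mask_pii(ssn), portfolio_id
FROM customer_records
WHERE load_date = '2012-06-01';
```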

Environment: Hadoop, HDFS, Hive, Flume, HBase, Sqoop, Pig, Java (JDK 1.6), Eclipse, MySQL, Ubuntu, ZooKeeper, Amazon EC2

Confidential, Richmond VA

Hadoop Developer

Responsibilities:

  • Involved in review of functional and non-functional requirements.
  • Installed and configured Hadoop MapReduce and HDFS; developed multiple MapReduce jobs in Java for data cleaning and preprocessing.
  • Installed and configured Pig and wrote Pig Latin scripts.
  • Wrote MapReduce jobs using Pig Latin.
  • Involved in managing and reviewing Hadoop log files.
  • Imported data using Sqoop to load data from MySQL to HDFS on a regular basis.
  • Developed scripts and batch jobs to schedule various Hadoop programs.
  • Wrote Hive queries for data analysis to meet business requirements.
  • Created Hive tables and worked on them using HiveQL.
  • Imported and exported data into HDFS and Hive using Sqoop.
  • Experienced in defining job flows.
  • Gained good experience with NoSQL databases.
  • Involved in creating Hive tables, loading them with data, and writing Hive queries that run internally as MapReduce jobs.
  • Developed a custom FileSystem plug-in for Hadoop so it can access files on the data platform; the plug-in allows Hadoop MapReduce programs, HBase, Pig and Hive to work unmodified and access files directly.
  • Designed and implemented a MapReduce-based large-scale parallel relation-learning system.
  • Extracted feeds from social media sites such as Facebook and Twitter using Python scripts.
  • Set up and benchmarked Hadoop/HBase clusters for internal use.
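Creating Hive tables over Sqoop-landed data and querying them with HiveQL, as in the bullets above, follows this general shape. A sketch only; the table, columns and HDFS path are illustrative, not from the actual project:

```sql
-- External table over delimited data landed in HDFS by Sqoop (schema and path are hypothetical)
CREATE EXTERNAL TABLE IF NOT EXISTS web_logs (
  user_id   BIGINT,
  url       STRING,
  hit_time  STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
LOCATION '/data/raw/web_logs';

-- An aggregate query like this compiles to one or more MapReduce jobs
SELECT url, COUNT(*) AS hits
FROM web_logs
GROUP BY url
ORDER BY hits DESC
LIMIT 10;
```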

Environment: Hadoop, MapReduce, HDFS, Hive, Java, Hadoop distributions from Hortonworks and Cloudera, Pig, HBase, Linux, XML, MySQL, MySQL Workbench, Java 6, Eclipse, Oracle 10g, PL/SQL, SQL*Plus, Subversion, Cassandra.

Confidential, San Francisco, CA

Hadoop Developer

Responsibilities:

  • Installed and configured Hadoop MapReduce and HDFS; developed multiple MapReduce jobs in Java for data cleaning and preprocessing.
  • Imported and exported data into HDFS and Hive using Sqoop.
  • Experienced in defining job flows.
  • Experienced in managing and reviewing Hadoop log files.
  • Loaded and transformed large sets of structured, semi-structured and unstructured data.
  • Responsible for managing data coming from different sources.
  • Gained good experience with NoSQL databases.
  • Supported MapReduce programs running on the cluster.
  • Involved in loading data from the UNIX file system to HDFS.
  • Installed and configured Hive and wrote Hive UDFs.
  • Involved in creating Hive tables, loading them with data, and writing Hive queries that run internally as MapReduce jobs.
  • Implemented a CDH3 Hadoop cluster on CentOS.
  • Worked on cluster installation, commissioning and decommissioning of DataNodes, NameNode recovery, capacity planning, and slot configuration.
  • Created HBase tables to store variable data formats of PII data coming from different portfolios.
  • Implemented best income logic using Pig scripts.
  • Provided cluster coordination services through ZooKeeper.
  • Exported the analyzed data to relational databases using Sqoop for visualization and to generate reports for the BI team.
  • Supported setting up the QA environment and updating configurations for implementing scripts with Pig and Sqoop.
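Loading data into Hive tables and writing queries that Hive executes as MapReduce jobs, as described above, can be sketched like this. The table name, schema and staging path are hypothetical:

```sql
-- Managed table partitioned by ingest date (schema is hypothetical)
CREATE TABLE IF NOT EXISTS transactions (
  txn_id   BIGINT,
  account  STRING,
  amount   DOUBLE
)
PARTITIONED BY (ds STRING);

-- Move a day's worth of staged HDFS data into one partition
LOAD DATA INPATH '/data/staging/transactions/2012-06-01'
INTO TABLE transactions PARTITION (ds = '2012-06-01');

-- Aggregation over the partition; Hive runs this internally as MapReduce
SELECT account, SUM(amount) AS total
FROM transactions
WHERE ds = '2012-06-01'
GROUP BY account;
```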

Environment: Hadoop, MapReduce, HDFS, Hive, Java, SQL, Datameer, Pig, ZooKeeper, Sqoop, CentOS

Confidential, Johnston, IA

IT Analyst

Responsibilities:

  • Involved in Analysis, Design, Development and Testing of application modules.
  • Analyzed complex system relationships and improved the performance of various screens.
  • Developed various user interface screens using the Struts framework.
  • Worked with the Spring framework for dependency injection.
  • Developed JSP pages using JavaScript, jQuery and AJAX for client-side validation and CSS for data formatting.
  • Wrote domain, mapper and DTO classes and hbm.xml files to access data from DB2 tables.
  • Developed various reports using Adobe APIs and Web services.
  • Wrote test cases using JUnit and coordinated with the testing team on integration tests.
  • Fixed bugs and improved performance through root-cause analysis in production support.

Environment: JDK 1.4.2, Swing, EJB 1.3, XML, XML Spy, SQL, WinSQL, StarTeam, DB2, WSAD 5.1.2, Apache Ant, Windows XP/7, Web services, JUnit, Hyperion 8/9.3, Citrix, Mainframes, CVS, JNDI

Confidential, Lansing, MI

Java/J2EE Interface Developer

Responsibilities:

  • Created Use case, Sequence diagrams, functional specifications and User Interface diagrams using Star UML.
  • Involved in complete requirement analysis, design, coding and testing phases of the project.
  • Participated in JAD meetings to gather the requirements and understand the End Users System.
  • Developed user interfaces using JSP, HTML, XML and JavaScript.
  • Generated XML Schemas and used XML Beans to parse XML files.
  • Created Stored Procedures & Functions. Used JDBC to process database calls for DB2/AS400 and SQL Server databases.
  • Developed code to create XML files and flat files with the data retrieved from databases and XML files.
  • Created Data sources and Helper classes which will be utilized by all the interfaces to access the data and manipulate the data.
  • Developed web application called iHUB (integration hub) to initiate all the interface processes using Struts Framework, JSP and HTML.
  • Developed the interfaces using Eclipse 3.1.1 and JBoss 4.1. Involved in integration testing, bug fixing and production support.

Environment: Java 1.3, Servlets, JSPs, JavaMail API, JavaScript, HTML, MySQL 2.1, Swing, Java Web Server 2.0, JBoss 2.0, RMI, Rational Rose, Red Hat Linux 7.1.

Confidential

Java/J2EE developer

Responsibilities:

  • Designed and developed a Struts-like MVC 2 web framework using the front-controller design pattern, which has been used successfully in a number of production systems.
  • Spearheaded the “Quick Wins” project by working very closely with the business and end users to improve the website’s ranking from 23rd to 6th in just 3 months.
  • Normalized the Oracle database, conforming to design concepts and best practices.
  • Resolved product complications at customer sites and funneled the insights to the development and deployment teams to shape a long-term product development strategy with minimal roadblocks.
  • Convinced business users and analysts of alternative solutions that are more robust and simpler to implement from a technical perspective while satisfying the functional requirements from a business perspective.
  • Applied design patterns and OO design concepts to improve the existing Java/JEE code base.
  • Identified and fixed transactional issues caused by incorrect exception handling and concurrency issues caused by unsynchronized blocks of code.

Environment: Java 1.2/1.3, Swing, Applet, Servlet, JSP, custom tags, JNDI, JDBC, XML, XSL, DTD, HTML, CSS, JavaScript, Oracle, DB2, PL/SQL, WebLogic, JUnit, Log4j and CVS.
