Hadoop Developer Resume

Somerset, NJ

SUMMARY:

  • 8+ years of experience in IT working on cutting-edge technologies such as HDFS and Java/J2EE across the Telecom, Confidential, Healthcare and Insurance domains.
  • 2+ years of experience developing and implementing distributed Hadoop solutions that solve customer-specific Big Data problems and create new business insights.
  • Technical Analyst with 5+ years of IT experience in object-oriented programming (Java/J2EE) and packaged applications.
  • Experience across the complete software development life cycle (SDLC) under models such as Agile/Scrum and Waterfall.
  • Expertise in using Hadoop ecosystem components such as MapReduce, HDFS, Hive (HQL), Pig, HBase (NoSQL), Sqoop, Oozie and Flume for data storage and analysis.
  • Expertise in writing Hadoop jobs for analyzing data using MapReduce, Hive and Pig.
  • Strong understanding of Hadoop daemons and MapReduce concepts.
  • Developed multiple MapReduce jobs in Java for data cleaning and preprocessing (a minimal sketch of such a job follows this list).
  • Experienced in developing Hive UDFs and Pig UDFs for data transformation and file processing over structured and unstructured data (a UDF sketch also follows).
  • Familiar with Hadoop Streaming, Apache Solr, and ETL tools such as Talend.
  • Familiar with the major Hadoop distributions: Cloudera CDH, Hortonworks and the Apache Hadoop distribution.
  • Deep knowledge of front-end technologies such as HTML/HTML5, CSS/CSS3, Bootstrap, XML, JSON, AJAX, PHP, JavaScript and jQuery.
  • Extensive experience in J2EE technologies such as Spring, JSF, Hibernate, Servlets, JSP, EJB, RMI and JMS.
  • Expertise in creating Business Requirements Documents (BRD), Functional Design Documents (FDD) and Detailed Technical Design Documents (TDD), and in effort estimation, coding standards, design/code reviews and test case documents.
  • Ability to write the complex SQL needed for ETL jobs and data analysis; proficient with Oracle 10g/9i databases, flat files and XML files.
  • Excellent communication, interpersonal and analytical skills, with the ability to work effectively and meet deadlines in a fast-paced environment.
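
The data-cleaning MapReduce jobs referenced above are typically map-only passes that drop or normalize bad records. A minimal sketch, assuming a pipe-delimited record layout (the field count and counter names are illustrative, not from the original projects):

    import java.io.IOException;

    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.NullWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;

    // Map-only cleansing step: drops malformed records and trims each field.
    public class CleanseMapper extends Mapper<LongWritable, Text, Text, NullWritable> {

        private static final int EXPECTED_FIELDS = 5; // assumed record width
        private final Text cleaned = new Text();

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            String[] fields = value.toString().split("\\|", -1);
            if (fields.length != EXPECTED_FIELDS) {
                context.getCounter("cleanse", "malformed").increment(1);
                return; // skip malformed record
            }
            StringBuilder sb = new StringBuilder();
            for (int i = 0; i < fields.length; i++) {
                if (i > 0) sb.append('|');
                sb.append(fields[i].trim());
            }
            cleaned.set(sb.toString());
            context.write(cleaned, NullWritable.get());
        }
    }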
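
Likewise, the Hive UDFs mentioned for data transformation follow the standard UDF contract; a minimal sketch (the normalization rule itself is a made-up example):

    import org.apache.hadoop.hive.ql.exec.UDF;
    import org.apache.hadoop.io.Text;

    // Hive UDF: trims and lower-cases a string column. After ADD JAR, register with:
    // CREATE TEMPORARY FUNCTION normalize_field AS 'NormalizeField';
    public final class NormalizeField extends UDF {
        public Text evaluate(Text input) {
            if (input == null) {
                return null; // preserve SQL NULL semantics
            }
            return new Text(input.toString().trim().toLowerCase());
        }
    }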

TECHNICAL SKILLS:

Big Data & Hadoop: Hadoop 2.x, MapReduce, Hive, Pig, Oozie, HBase, HDFS, Sqoop, Flume.

Programming Languages: Java, SQL

Enterprise Components: SIP Servlet, HTTP Servlet, JSP, JMS, RMI, JDBC

Internet Technologies: jQuery, HTML, DHTML, JavaScript, XML.

App Servers / Middleware: Tomcat 5.x/6.0, OCCAS 4.0, JBoss 7.

IDE Tools: Eclipse

Design Tools: Astah, StarUML

Web Frameworks: Spring 3.0, Hibernate 3.x

Databases: MySQL 5, Oracle 8i.

Operating Systems: Sun Solaris, Windows 2000/NT/XP.

Version Control: RTC, ClearCase.

Testing Tools: SIPp, JUnit, EasyMock

Build Tools: Apache ANT

Performance Monitoring Tool: JRMC 4.0

PROFESSIONAL EXPERIENCE:

Confidential, Somerset, NJ

Role: Hadoop Developer

Responsibilities:

  • Responsible for the data ingestion, data processing and data output/reporting phases of a Big Data project using the Hadoop ecosystem.
  • Developed Pig Latin scripts to extract data from web server output files and load it into HDFS.
  • Implemented multiple MapReduce jobs in Java for data cleansing and pre-processing.
  • Experienced in loading data from the UNIX file system into HDFS.
  • Handled importing of data from various data sources, performed transformations using Hive and MapReduce, loaded data into HDFS, and extracted data from Teradata into HDFS using Sqoop.
  • Created SolrCloud collections to load charge-code data and serve results to end users via Solr/Lucene queries under low-latency requirements (see the SolrJ sketch after this list).
  • Created a Solr schema from the indexer settings.
  • Implemented Solr index cron jobs.
  • Worked extensively with Sqoop for importing metadata from Oracle.
  • Configured Sqoop and developed scripts to extract data from SQL Server into HDFS.
  • Provided cluster coordination services through ZooKeeper.
  • Gained experience in managing and reviewing Hadoop log files.
  • Worked with business partners to gather business requirements.
  • Acted as administrator for Pig, Hive and HBase, installing updates, patches and upgrades.
  • Wrote Perl and shell scripts to automate application-process log activity and cron jobs.
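
The charge-code lookups described above can be served through SolrJ. A minimal sketch against the SolrJ 4.x-era API; the ZooKeeper ensemble, collection name and field names are placeholders:

    import org.apache.solr.client.solrj.SolrQuery;
    import org.apache.solr.client.solrj.impl.CloudSolrServer;
    import org.apache.solr.client.solrj.response.QueryResponse;
    import org.apache.solr.common.SolrDocument;

    public class ChargeCodeLookup {

        public static void main(String[] args) throws Exception {
            // Connect through ZooKeeper so SolrCloud can route to a healthy replica.
            CloudSolrServer solr = new CloudSolrServer("zk1:2181,zk2:2181,zk3:2181");
            solr.setDefaultCollection("charge_codes");

            // Low-latency point lookup by charge code.
            SolrQuery query = new SolrQuery("charge_code:C1234");
            query.setRows(10);

            QueryResponse response = solr.query(query);
            for (SolrDocument doc : response.getResults()) {
                System.out.println(doc.getFieldValue("description"));
            }
            solr.shutdown();
        }
    }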

Environment: Hadoop, HDFS, MapReduce, Apache Solr, Hive (HQL), Pig, Sqoop, HBase, Shell Scripting, Oozie, Hortonworks distribution of Hadoop, JavaScript, Ajax, Unix, cron, JSON, Oracle 10g, Linux, SQL Server 2008, Ubuntu 13.04, Spring MVC, J2EE, Java 6.0, JDBC, Apache Tomcat.

Confidential, NJ

Role: Hadoop Developer

Responsibilities:

  • Application development using Hadoop tools such as Hive, Pig, HBase, ZooKeeper and Sqoop (an HBase client sketch follows this list).
  • Worked with the team to grow the cluster from 28 to 42 nodes; the additional DataNodes were configured through the Hadoop commissioning process.
  • Used Ambari for managing the Cloudera distribution of Hadoop.
  • Responsible for cluster maintenance, adding and removing cluster nodes, cluster monitoring and troubleshooting, and managing and reviewing data backups and log files.
  • Worked with the systems engineering team to plan and deploy new Hadoop environments and expand existing Hadoop clusters.
  • Designed and implemented a semi-structured data analytics platform leveraging Hadoop with Solr.
  • Managed and scheduled jobs on a Hadoop cluster.
  • Involved in defining job flows and in managing and reviewing log files.
  • Installed the Oozie workflow engine to run multiple MapReduce, Hive (HQL) and Pig jobs.
  • Collected log data from web servers and integrated it into HDFS using Flume.
  • Responsible for managing data coming from different sources.
  • Automated the workflow using shell scripts.
  • Gained experience with NoSQL databases.
  • Created and maintained technical documentation for launching Hadoop clusters and for executing Hive queries and Pig scripts.
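
Application code against HBase in this stack typically goes through the Java client. A minimal sketch using the older HTable-based API of that era; the table, row key and column names are invented for illustration:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.client.Get;
    import org.apache.hadoop.hbase.client.HTable;
    import org.apache.hadoop.hbase.client.Put;
    import org.apache.hadoop.hbase.client.Result;
    import org.apache.hadoop.hbase.util.Bytes;

    public class LogEventStore {

        public static void main(String[] args) throws Exception {
            // Picks up hbase-site.xml (ZooKeeper quorum etc.) from the classpath.
            Configuration conf = HBaseConfiguration.create();
            HTable table = new HTable(conf, "log_events");

            // Write one event, keyed by source host + timestamp.
            Put put = new Put(Bytes.toBytes("web01-1389000000"));
            put.add(Bytes.toBytes("d"), Bytes.toBytes("level"), Bytes.toBytes("ERROR"));
            table.put(put);

            // Read it back.
            Result result = table.get(new Get(Bytes.toBytes("web01-1389000000")));
            System.out.println(Bytes.toString(
                    result.getValue(Bytes.toBytes("d"), Bytes.toBytes("level"))));

            table.close();
        }
    }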

Environment: Java 6, J2EE 1.4, Spring, JBoss, HTML, XML, JavaScript, Web Services, Hadoop distribution of Cloudera, Linux, HDFS, Pig, Hive HQL, MapReduce, HBase, Sqoop, Oozie, and Flume.

Confidential, New York, NY

Role: Java/Hadoop Developer

Responsibilities:

  • Worked on a Hadoop cluster that ranged from 4-8 nodes during pre-production and was at times extended to 24 nodes during production.
  • Used Sqoop to import data from an RDBMS into the Hadoop Distributed File System (HDFS) and later analyzed the imported data using Hadoop components.
  • Built custom MapReduce programs to analyze data and used Pig Latin to clean out unwanted data.
  • Applied various performance optimizations, such as the distributed cache for small datasets, partitioning and bucketing in Hive, and map-side joins.
  • Involved in creating Hive tables, then applied HiveQL on those tables for data validation (see the JDBC sketch after this list).
  • Moved data from Hive tables into MongoDB collections.
  • Involved in loading and transforming large sets of structured, semi-structured and unstructured data, and analyzed them by running Hive queries and Pig scripts.
  • Participated in requirement gathering from the experts and business partners, converting the requirements into technical specifications.
  • Used ZooKeeper to manage coordination among the clusters.
  • Created and maintained technical documentation for launching Hadoop clusters and for executing Hive queries and Pig scripts.
  • Worked on Cloudera to analyze data stored in HDFS.
  • Installed the Oozie workflow engine to run multiple Hive and Pig jobs that run independently based on time and data availability.
  • Assisted application teams with installing Hadoop updates, operating system patches and version upgrades as required.
  • Assisted with cluster maintenance, cluster monitoring and troubleshooting, and managed and reviewed data backups and log files.
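
The HiveQL validation queries mentioned above can be driven from Java over JDBC. A minimal sketch; it assumes a HiveServer2 endpoint (older HiveServer1 deployments use org.apache.hadoop.hive.jdbc.HiveDriver and a jdbc:hive:// URL), and the table, partition column and host are illustrative:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class HiveValidationCheck {

        public static void main(String[] args) throws Exception {
            Class.forName("org.apache.hive.jdbc.HiveDriver");
            Connection conn = DriverManager.getConnection(
                    "jdbc:hive2://hive-host:10000/default", "etl_user", "");

            // The partition filter (dt = ...) restricts the scan to a single day.
            Statement stmt = conn.createStatement();
            ResultSet rs = stmt.executeQuery(
                    "SELECT COUNT(*) FROM events WHERE dt = '2014-01-01' AND user_id IS NULL");
            if (rs.next()) {
                System.out.println("records failing validation: " + rs.getLong(1));
            }

            rs.close();
            stmt.close();
            conn.close();
        }
    }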

Environment: Hadoop, Pig, Hive, Sqoop, Cloudera Manager (CDH3), Flume, MapReduce, HDFS, JavaScript, Websphere, HTML, AngularJS, LINUX, Oozie, MongoDB.

Confidential

Role: Sr. Java Developer

Responsibilities:

  • Developed the application using Spring Framework that leverages classical Model View Controller (MVC) architecture.
  • Worked extensively on the user interface for a few modules using JSPs, JavaScript and Ajax.
  • Created business logic using Servlets and POJOs, and deployed them on JBoss.
  • Tracked new change requests, analyzed requirements and designed solutions.
  • Modified existing Java APIs in the performance and fault management modules.
  • Took ownership of implementing and unit testing the APIs using Java, EasyMock and JUnit (a test sketch follows this list).
  • Designed new tables and modified existing ones to incorporate new statistics and alarms in the MySQL database.
  • Involved in the build process to package and deploy the JARs to the production environment.
  • Involved in peer code review processes and inspections.
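
Unit tests of that kind pair JUnit with EasyMock to stub out collaborators. A minimal sketch; the DAO interface and service class are invented stand-ins for the real performance/fault-management APIs:

    import static org.easymock.EasyMock.createMock;
    import static org.easymock.EasyMock.expect;
    import static org.easymock.EasyMock.replay;
    import static org.easymock.EasyMock.verify;
    import static org.junit.Assert.assertTrue;

    import org.junit.Test;

    public class AlarmServiceTest {

        // Collaborator to be mocked; name and method are illustrative.
        public interface AlarmDao {
            int countActiveAlarms(String nodeId);
        }

        // Class under test: a trivial business rule on top of the DAO.
        public static class AlarmService {
            private final AlarmDao dao;
            public AlarmService(AlarmDao dao) { this.dao = dao; }
            public boolean isDegraded(String nodeId) {
                return dao.countActiveAlarms(nodeId) > 0;
            }
        }

        @Test
        public void nodeWithActiveAlarmsIsDegraded() {
            AlarmDao dao = createMock(AlarmDao.class);
            expect(dao.countActiveAlarms("node-1")).andReturn(3);
            replay(dao);

            assertTrue(new AlarmService(dao).isDegraded("node-1"));
            verify(dao); // fails the test if the expected call never happened
        }
    }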

Environment: EMS, Java 1.6, MySQL 5, Ant, Eclipse, JUnit, EasyMock, Spring, JBoss

Confidential

Role: Sr Java Developer

Responsibilities:

  • Involved in business requirement gathering & evaluation with the client.
  • Shared knowledge with the team to understand the user stories, and strategized the development methodology and task distribution.
  • Responsible for guiding team members to understand the business domain and module-level functional requirements.
  • Drafted and reviewed the SRS and SDS.
  • Implemented modules using Servlets and JSP (a minimal servlet sketch follows this list).
  • Unit tested using JUnit.
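
The Servlet/JSP implementation work follows the usual controller-to-view pattern; a minimal sketch in which the servlet name, request parameter and JSP path are hypothetical (mapped via web.xml in Servlet 2.x):

    import java.io.IOException;

    import javax.servlet.ServletException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    // Reads a request parameter, validates it, and forwards to a JSP view.
    public class StoryServlet extends HttpServlet {

        @Override
        protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                throws ServletException, IOException {
            String storyId = req.getParameter("storyId");
            if (storyId == null || storyId.length() == 0) {
                resp.sendError(HttpServletResponse.SC_BAD_REQUEST, "storyId is required");
                return;
            }
            // Hand the data to the JSP view for rendering.
            req.setAttribute("storyId", storyId);
            req.getRequestDispatcher("/WEB-INF/jsp/story.jsp").forward(req, resp);
        }
    }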

Environment: Java 6, J2EE 1.4, MySQL 5, JSP, Servlets, JavaScript, Eclipse IDE

Confidential

Role: Sr Java Developer

Responsibilities:

  • Documentation of Design Specifications.
  • Designed the project UML diagrams, use cases and ERD diagrams.
  • Coded and debugged using vi and Eclipse.
  • Unit tested all the applications using the STP switches.
  • Wrote JUnit test cases.
  • Tracked and implemented CRs for the developed modules during project integration and system testing.
  • Involved in Peer Review processes and Inspections.

Environment: Core Java 6, Swing, Eclipse, Sun Solaris, ClearCase, Ace tool for review and inspection.

Confidential

Role: Java Developer

Responsibilities:

  • Performed requirement analysis against the baselined FRS.
  • Produced designs for the SSR interactive, schedule and monitor reports.
  • Coded and debugged using vi and Eclipse.
  • Unit tested all the applications using the STP switches.
  • Involved in peer review processes and inspections.
  • Fixed CRs raised during system testing.

Environment: Java 1.4, Swing, Eclipse, Sun Solaris, ClearCase, Ace tool for review and inspection.
