
Hadoop Developer Resume


Chicago, IL

PROFESSIONAL SUMMARY:

  • Over 7 years of professional IT experience, including work with Big Data ecosystem technologies.
  • Over 2 years of extensive experience in Big Data, with an excellent understanding of Hadoop architecture and components such as MapReduce, HDFS, Hive, HBase, Flume, Pig, and Cassandra.
  • Experience in installing, configuring, and using various Hadoop distributions, including Cloudera, Hortonworks, and IBM BigInsights.
  • Experience in the full software development life cycle (SDLC).
  • Hands-on experience in installing, configuring, and using Apache Hadoop ecosystem components such as MapReduce, HDFS, HBase, Zookeeper, Oozie, Hive, Sqoop, Pig, and Flume.
  • Good knowledge of Hadoop cluster architecture and cluster monitoring.
  • Experience in managing and reviewing Hadoop log files.
  • Excellent understanding and knowledge of NoSQL databases such as HBase and Cassandra.
  • Involved in setting up standards and processes for Hadoop-based application design and implementation.
  • Experience in importing and exporting data between HDFS and relational database systems using Sqoop (see the import sketch after this summary).
  • Good understanding of HDFS design, daemons, federation, and high availability.
  • Experience in the AWS cloud environment, including managing Cloudera clusters and developing MapReduce programs on AWS.
  • Experience in managing Hadoop clusters using the Cloudera Manager tool.
  • Experience in installing, configuring, supporting, and managing Cloudera's Hadoop platform, along with CDH3 and CDH4 clusters.
  • Hands-on experience in application development using Java, RDBMSs, and Linux shell scripting.
  • Extensive experience working with Oracle, DB2, SQL Server, and MySQL databases.
  • Good knowledge of integrating various data sources such as RDBMSs, spreadsheets, text files, and XML files.
  • Basic knowledge of UNIX and shell scripting.
  • Ability to adapt to evolving technology, with a strong sense of responsibility and accomplishment.
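
As a brief illustration of the Sqoop transfers mentioned above, a minimal import sketch follows; the connection string, table, and target directory are hypothetical placeholders, not values from an actual engagement.

    # Pull a relational table into HDFS with four parallel mappers
    # (all names below are illustrative)
    sqoop import \
      --connect jdbc:mysql://dbhost:3306/sales \
      --username etl_user -P \
      --table orders \
      --target-dir /user/hadoop/orders \
      --num-mappers 4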

TECHNICAL SKILLS:

Big Data Technologies: Hadoop, HDFS, Hive, MapReduce, Pig, Sqoop, Cassandra, Flume, Zookeeper, Oozie, JSON

Programming Languages: C, C++, Java, Shell Scripting

Java/J2EE Technologies: Java, JavaBeans, J2EE (JSP, Servlets, EJB), JDBC, MySQL, Spring.

DB Languages: SQL, PL/SQL

NoSQL Databases: HBase, Cassandra

Application Servers: Tomcat

Operating Systems: Linux, Windows XP/7, MS-DOS

Office Tools: MS Office (Excel, Word, PowerPoint)

Development Tools: Log4j, Maven, JUnit, NetBeans, WinSCP

IDEs/Applications: Eclipse, Spring Tool Suite, CVS, SVN, Apache Ant.

WORK EXPERIENCE:

Hadoop Developer

Confidential, Chicago, IL

Responsibilities:

  • Proactively monitored systems and services; worked on the architecture design and implementation of the Hadoop deployment, configuration management, backup, and disaster recovery systems and procedures.
  • Analyzed system failures, identified root causes, and recommended courses of action.
  • Documented system processes and procedures for future reference.
  • Monitored multiple Hadoop cluster environments using Ambari.
  • Installed and configured Flume, Hive, Pig, Sqoop and Oozie on the Hadoop cluster.
  • Used Flume to collect, aggregate, and store web log data from different sources such as web servers and network devices, and pushed it to HDFS (a sample agent configuration is sketched after this list).
  • Analyzed the web log data using HiveQL to extract the number of unique visitors per day, page views, visit duration, and the most purchased products on the website (see the query sketch below).
  • Exported the analyzed data to relational databases using Sqoop for visualization and to generate reports for the BI team.
  • Integrated Oozie with the rest of the Hadoop stack, supporting several types of Hadoop jobs out of the box (such as Java MapReduce, Pig, Hive, and Sqoop) as well as system-specific jobs such as Java programs (a sample submission command appears below).
  • Managed Cloudera Hadoop clusters and developed MapReduce programs in the AWS cloud environment.
  • Developed an application component interacting with Cassandra.
  • Developed the entire data transfer model using the Sqoop framework.
  • Developed functionally equivalent batch jobs using Hive.
  • Performed testing and performance tuning on the Hadoop infrastructure.
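
A minimal sketch of the kind of Flume agent used for the web log collection above; the agent name, log path, and HDFS directory are assumptions for illustration.

    # Hypothetical flume.conf: tail a web server access log into HDFS
    cat > flume.conf <<'EOF'
    agent1.sources = weblog
    agent1.channels = mem
    agent1.sinks = hdfssink

    agent1.sources.weblog.type = exec
    agent1.sources.weblog.command = tail -F /var/log/httpd/access_log
    agent1.sources.weblog.channels = mem

    agent1.channels.mem.type = memory
    agent1.channels.mem.capacity = 10000

    agent1.sinks.hdfssink.type = hdfs
    agent1.sinks.hdfssink.hdfs.path = /flume/weblogs
    agent1.sinks.hdfssink.hdfs.fileType = DataStream
    agent1.sinks.hdfssink.channel = mem
    EOF

    # Start the agent with the configuration above
    flume-ng agent --conf ./conf --conf-file flume.conf --name agent1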
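
A sketch of the HiveQL analysis and the Sqoop export described above; the weblogs schema, reporting database, and table names are hypothetical.

    # Unique visitors per day from the parsed web log table
    hive -e "SELECT log_date, COUNT(DISTINCT visitor_ip) AS unique_visitors
             FROM weblogs GROUP BY log_date;"

    # Export the aggregated results to the relational database for the BI team
    sqoop export \
      --connect jdbc:mysql://reporting-db:3306/bi \
      --username bi_user -P \
      --table daily_visitors \
      --export-dir /user/hive/warehouse/daily_visitors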
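
The Oozie workflows above were typically driven from the command line roughly as follows; the Oozie server URL and properties file are placeholders.

    # job.properties points at a workflow.xml on HDFS that chains
    # MapReduce, Pig, Hive, and Sqoop actions
    oozie job -oozie http://oozie-host:11000/oozie -config job.properties -run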

Environment: Hadoop-1.1.2, Cassandra, Hive-0.10.0, Pig-0.11.1, Sqoop-1.4.3, AWS, Zookeeper-3.4.5, Flume-1.2.0, Oozie-3.3.2, Linux, Eclipse Juno, JDK-1.7.21.

Hadoop Developer

Confidential, Chicago, IL

Responsibilities:

  • Worked on analyzing the Hadoop cluster and different big data analytic tools, including Pig, the HBase NoSQL database, and Sqoop.
  • Imported and exported data between HDFS and Hive using Sqoop.
  • Configured Flume to load data from the servers.
  • Extracted data from Cassandra through Sqoop, placed it in HDFS, and processed it.
  • Gained experience with NoSQL databases.
  • Wrote MapReduce programs with the Java API to cleanse structured and unstructured data (see the job-launch sketch after this list).
  • Wrote Hive UDFs to extract data from staging tables (see the UDF sketch below).
  • Involved in creating Hive tables, loading them with data, and writing Hive queries that run internally as MapReduce jobs.
  • Familiarized with job scheduling using the Fair Scheduler so that CPU time is well distributed among all jobs.
  • Developed Oozie workflows to automate job scheduling based on dependencies.
  • Worked with enterprise schedulers to automate the job scheduling process.
  • Involved in regular Hadoop cluster maintenance, such as patching security holes and updating system packages.
  • Analyzed the web log data using HiveQL.
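
A sketch of how the Java MapReduce cleansing jobs were packaged and launched; the jar, driver class, and HDFS paths are assumptions.

    # Run a packaged cleansing job; the output directory must not already exist
    hadoop jar data-cleanse.jar com.example.CleanseDriver \
      /user/hadoop/raw_logs /user/hadoop/clean_logs

    # Spot-check a slice of the cleansed output
    hadoop fs -cat /user/hadoop/clean_logs/part-r-00000 | head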
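
A sketch of registering and applying a custom Hive UDF against a staging table; the jar path, class, function, and table names are hypothetical.

    hive -e "
      ADD JAR /home/hadoop/udfs/extract-udf.jar;
      CREATE TEMPORARY FUNCTION extract_field AS 'com.example.hive.ExtractFieldUDF';
      INSERT OVERWRITE TABLE clean_events
      SELECT extract_field(raw_line) FROM staging_events;
    "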

Environment: Java, Eclipse, Hadoop, Hive, HBase, Cassandra, Linux, MapReduce, HDFS, Shell Scripting, MySQL.

Java Developer

Confidential, MI

Responsibilities:

  • Created use case diagrams, sequence diagrams, functional specifications, and user interface diagrams using StarUML.
  • Involved in the complete requirement analysis, design, coding, and testing phases of the project.
  • Participated in JAD meetings to gather requirements and understand the end users' system.
  • Developed user interfaces using JSP, HTML, XML, and JavaScript.
  • Generated XML schemas and used XMLBeans to parse XML files.
  • Created stored procedures and functions; used JDBC to process database calls for DB2/AS400 and SQL Server databases.
  • Developed code to create XML files and flat files from data retrieved from databases and XML files.
  • Created data sources and helper classes used by all the interfaces to access and manipulate data.
  • Developed a web application called iHUB (integration hub) to initiate all the interface processes, using the Struts framework, JSP, and HTML.
  • Developed the interfaces using Eclipse 3.1.1 and JBoss 4.1; involved in integration testing, bug fixing, and production support.

Environment: Java 1.3, Servlets, JSPs, Java Mail API, JavaScript, HTML, MySQL 2.1, Swing, Java Web Server 2.0, Red Hat Linux 7.1.

Java Developer

Confidential

Responsibilities:

  • Captured and analyzed project requirements.
  • Involved in analysis, design, and development of the front-end UI using JSP, HTML, DHTML, and JavaScript.
  • Built the whole application, which was based entirely on an MVC architecture using internal custom frameworks.
  • Developed adjustment screens using Java and Servlets.
  • Prepared workflow diagrams using MS Visio and modeled the methods based on OOP methodology.
  • Developed the host modules using C++, DB2, and SQL.
  • Responsible for creating the front-end code and Java code to suit the business requirements.
  • Wrote Ant scripts for builds, unit testing, deployment, style checks, etc.
  • Created tables and stored procedures to fulfill the requirements and accommodate the business rules in an Oracle 8i database.

Environment: Java, HTML, JSP, Servlets, SQL, DB2, PL/SQL, JDK, JDBC, Eclipse, Ant.
