
Big Data Hadoop Developer Resume


Raleigh, NC

SUMMARY

  • 8+ years of experience in all phases of the SDLC, including application design, development, production support, and maintenance projects.
  • Good hands-on experience with Hadoop architecture and components such as HDFS, JobTracker, TaskTracker, NameNode, DataNode, and the MapReduce programming paradigm.
  • Implemented Hadoop-based data warehouses and integrated Hadoop with enterprise data warehouse systems.
  • Good experience installing and configuring CDH3/CDH4 clusters and using Hadoop ecosystem components such as MapReduce, HDFS, Pig, Hive, HBase, Sqoop, Oozie, Flume, and ZooKeeper.
  • Experience in Apache Hadoop MapReduce programming, Pig scripting, and distributed applications.
  • Experience in managing and reviewing Hadoop log files.
  • Extended Hive and Pig core functionality by writing custom UDFs (see the sketch after this list).
  • Good understanding of NoSQL databases.
  • Experience importing and exporting data with Sqoop between HDFS and relational database systems.
  • Good experience analyzing data using HiveQL, Pig Latin, HBase, and custom MapReduce programs in Java.
  • Good experience in web/intranet and client/server technologies using Java, J2EE, Servlets, JSP, JSF, EJB, JDBC, and SQL.
  • Experience in Object-Oriented Analysis and Design (OOAD) and software development using UML methodology; good knowledge of J2EE and Core Java design patterns.
  • Hands-on experience in application development using Java and RDBMSs.
  • Extensively worked on database applications using DB2, Oracle 11g/10g, and SQL.
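As an illustration of the UDF work mentioned above, here is a minimal sketch of a custom Hive UDF in Java, assuming the classic org.apache.hadoop.hive.ql.exec.UDF API from Hive 0.x; the class name and normalization logic are illustrative, not taken from the projects below.

```java
import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

// Hypothetical UDF: trims and lower-cases a string column.
// Registered in Hive with, e.g.:
//   ADD JAR normalize-udf.jar;
//   CREATE TEMPORARY FUNCTION normalize AS 'NormalizeString';
public class NormalizeString extends UDF {
    public Text evaluate(Text input) {
        if (input == null) {
            return null; // Hive passes SQL NULLs through as Java nulls
        }
        return new Text(input.toString().trim().toLowerCase());
    }
}
```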

TECHNICAL SKILLS

Big Data Ecosystem: Apache HDFS, MapReduce, HBase, ZooKeeper, Hive, Pig, Sqoop, Oozie, Flume, CDH3, CDH4.

Hadoop Distributions: Cloudera CDH 3, CDH 4, Amazon Web Services

Languages: C++, Core Java, SQL, PL/SQL

Methodologies: UML, Design Patterns

Database: Oracle 11g/10g.

Application Server: Apache Tomcat 5.x/6.0, JBoss 4.0

Web Technologies: JDBC, JSP, JSF, EJB 2.0, XML, DTD, SOAP, Schemas, XSL, XSLT, XPath, HTML

Frameworks: Spring

Tools: SQL Developer, DbVisualizer

IDE / Testing Tools: Eclipse

Operating System: Windows

Scripting Languages: JavaScript

Testing API: JUnit

PROFESSIONAL EXPERIENCE

Confidential, Raleigh, NC

Big Data Hadoop Developer

Responsibilities:

  • Involved in installing Hadoop ecosystem components.
  • Developed scripts and batch jobs to schedule various Hadoop programs.
  • Worked on a Hadoop cluster that ranged from 8-10 nodes during the pre-production stage and was extended up to 40 nodes during production.
  • Developed custom MapReduce programs to analyze data and used Pig Latin to clean unwanted data (a sketch of such a job follows this list).
  • Applied performance optimizations such as using the distributed cache for small datasets, partitioning and bucketing in Hive, and map-side joins.
  • Used Sqoop to import data from RDBMSs into the Hadoop Distributed File System (HDFS) and later analyzed the imported data using Hadoop components.
  • Created Hive tables and queried them with HiveQL, which invokes and runs MapReduce jobs automatically.
  • Loaded and transformed large sets of structured, semi-structured, and unstructured data and analyzed them by running Hive queries and Pig scripts.
  • Installed the Oozie workflow engine to run multiple Hive and Pig jobs that run independently based on time and data availability.
  • Performed Hadoop installation, updates, patches, and version upgrades when required.
  • Performed cluster maintenance, monitoring, and troubleshooting, and managed and reviewed data backups and log files.
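The data-cleansing MapReduce work above could look roughly like the following map-only job, sketched against the Hadoop 2.x (CDH4-era) org.apache.hadoop.mapreduce API; the tab-delimited, five-field record format is a hypothetical stand-in for the real data.

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

// Map-only job that drops malformed records (wrong field count) and
// passes cleaned lines through unchanged.
public class CleanRecordsJob {

    public static class CleanMapper
            extends Mapper<LongWritable, Text, NullWritable, Text> {
        private static final int EXPECTED_FIELDS = 5; // hypothetical schema width

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            String[] fields = value.toString().split("\t", -1);
            if (fields.length == EXPECTED_FIELDS) {
                context.write(NullWritable.get(), value);
            }
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "clean-records");
        job.setJarByClass(CleanRecordsJob.class);
        job.setMapperClass(CleanMapper.class);
        job.setNumReduceTasks(0); // map-only: no aggregation step
        job.setOutputKeyClass(NullWritable.class);
        job.setOutputValueClass(Text.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```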

Environment: CDH3/CDH4, Pig 0.8.1, Hive 0.7.1, Sqoop v1, Java, Eclipse, Flume, ZooKeeper, Oozie, SQL Server 2008, HBase, Oracle 11g/10g.

Confidential, Charlotte, NC

Big data Hadoop Developer

Responsibilities:

  • Ingested data received from various providers into HDFS for big data operations.
  • Loaded and transformed large sets of structured, semi-structured, and unstructured data in various formats such as text, zip, XML, and JSON.
  • Supported MapReduce programs running on the cluster.
  • Installed and configured Pig and wrote Pig Latin scripts.
  • Used Sqoop to load data from Oracle into HDFS on a regular basis, or from Oracle into HBase, depending on requirements.
  • Wrote Hive queries for data analysis to meet business requirements; created Hive tables and worked on them using HiveQL (see the JDBC sketch after this list).
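One way the Hive queries above can be driven from Java is through the HiveServer JDBC driver; this sketch assumes a HiveServer1 endpoint on the default port 10000, and the events table with its provider column is hypothetical.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

// Runs a HiveQL aggregation through the (HiveServer1-era) JDBC driver
// and prints one row count per provider.
public class HiveQueryExample {
    public static void main(String[] args) throws Exception {
        Class.forName("org.apache.hadoop.hive.jdbc.HiveDriver");
        try (Connection conn = DriverManager.getConnection(
                     "jdbc:hive://localhost:10000/default", "", "");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery(
                     "SELECT provider, COUNT(*) FROM events GROUP BY provider")) {
            while (rs.next()) {
                System.out.println(rs.getString(1) + "\t" + rs.getLong(2));
            }
        }
    }
}
```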

Environment: Pig, Hive, MapReduce, Sqoop, JavaScript.

Confidential, Richardson, TX

Big data Hadoop/Java developer

Responsibilities:

  • Launched and set up a Hadoop cluster that ranged from 40 to 60 nodes on Cloudera CDH4.
  • Involved in configuring different components of Hadoop.
  • Managed the Hive database, including ingesting and indexing data.
  • Gained hands-on experience with Oozie workflows.
  • Responsible for analysis, design, development, and integration of UI components with the backend.
  • Used J2EE technologies such as Servlets, JavaBeans, JSP, and JDBC.
  • Used Spring Framework 3.2.2 for transaction management and Hibernate 3 to persist data to the database (a minimal sketch follows this list).
  • Created controller Servlets for handling HTTP requests from JSP pages.
  • Wrote JavaScript functions for various validation purposes.
  • Implemented the presentation layer using the Struts 2 MVC framework.
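A minimal sketch of the Spring/Hibernate transaction management mentioned above, assuming annotation-driven transactions are enabled in the application context; AccountDao and the transfer logic are hypothetical placeholders, wired as Spring beans.

```java
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

// Hypothetical DAO contract, assumed to be implemented with Hibernate.
interface AccountDao {
    void debit(long accountId, long amountCents);
    void credit(long accountId, long amountCents);
}

// Spring opens a transaction around transfer(): both updates commit
// together, and a runtime exception rolls both back.
@Service
public class AccountService {
    private final AccountDao accountDao;

    public AccountService(AccountDao accountDao) { // wired via XML or @Autowired
        this.accountDao = accountDao;
    }

    @Transactional
    public void transfer(long fromId, long toId, long amountCents) {
        accountDao.debit(fromId, amountCents);
        accountDao.credit(toId, amountCents);
    }
}
```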

Environment: Hive, Sqoop, Oozie, HDFS, MapReduce, J2EE, Struts, Spring, SQL, JSP

Confidential, Minneapolis, MN

Java developer

Responsibilities:

  • Prepared low-level design documents and created reusable components for use across the application.
  • Assisted with code reviews, conducted technology discussion groups and presentations, and acted as a resource for team members.
  • Performed code reviews and prepared the IQA for other application components.
  • Participated in customer meetings to identify difficulties, discuss application status, and gather requirements.
  • Used Java and MySQL day to day to debug and fix issues with client processes.
  • Developed, tested, and implemented a financial services application to bring multiple clients into a standard database format.
  • Assisted in designing, building, and maintaining a database to analyze the life cycle of checking and debit transactions.
  • Applied Java/J2EE application development and object-oriented analysis.
  • Used J2SE, XML, and web services.
  • Worked with JSP, Servlets, JavaServer Faces, EJB, JDBC, JUnit, and SQL (see the JDBC sketch after this list).
  • Involved in database design for large database systems on Oracle.
  • Worked with Sun ONE Application Server and WebLogic Application Server.
  • Worked with WebSphere Portal Server and J2EE application deployment technology.
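The day-to-day Java/JDBC debugging described above revolves around parameterized queries along these lines; the connection URL, credentials, and transactions table are placeholders rather than details from the project.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

// Looks up posted transactions for one account with a parameterized
// query, avoiding string concatenation and SQL injection.
public class TransactionLookup {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:oracle:thin:@//dbhost:1521/ORCL"; // placeholder
        try (Connection conn = DriverManager.getConnection(url, "user", "pass");
             PreparedStatement ps = conn.prepareStatement(
                     "SELECT txn_id, amount, posted_date "
                     + "FROM transactions WHERE account_id = ?")) {
            ps.setLong(1, 12345L); // hypothetical account id
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    System.out.printf("%d %s %s%n",
                            rs.getLong("txn_id"),
                            rs.getBigDecimal("amount"),
                            rs.getDate("posted_date"));
                }
            }
        }
    }
}
```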

Environment: Java, J2EE, JUnit, J2SE, SOAP, EJB, JDBC, Servlets, XML, Eclipse, Oracle, DB2, Tomcat.

Confidential

Java developer

Responsibilities:

  • Developed and enhanced applications built on the Java platform.
  • Responsible for completing development tests within each sprint according to acceptance criteria and system component integration needs.
  • Worked closely with solution quality analysts, providing debugging and testing assistance as needed to meet sprint objectives; performed unit testing.
  • Created technical documentation (assets) in the code per coding standards and assisted as needed in creating documentation for the products' customers.
  • Created use case diagrams, sequence diagrams, functional specifications, and user interface diagrams using StarUML.
  • Involved in the complete requirement analysis, design, coding, and testing phases of the project.
  • Participated in JAD meetings to gather requirements and understand the end users' system.
  • Developed user interfaces using JSP, HTML, XML, and JavaScript.
  • Generated XML Schemas and used XMLBeans to parse XML files.
  • Created stored procedures and functions; used JDBC to process SQL Server databases.
  • Developed code to create XML files and flat files from data retrieved from databases and XML files (a sketch follows this list).
  • Created data sources and helper classes used by all the interfaces to access and manipulate data.
  • Developed a web application called iHUB (integration hub) to initiate all the interface processes using the Struts framework, JSP, and HTML.
  • Developed the interfaces using Eclipse 3.1.1.
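The XML-file generation described above can be done with the JDK's built-in DOM and transformer APIs, roughly as sketched below; the orders/order element names and values are illustrative, standing in for data read from a database.

```java
import java.io.File;

import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.transform.OutputKeys;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.dom.DOMSource;
import javax.xml.transform.stream.StreamResult;

import org.w3c.dom.Document;
import org.w3c.dom.Element;

// Builds a small XML document in memory and writes it to disk.
public class XmlFileWriter {
    public static void main(String[] args) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder().newDocument();
        Element root = doc.createElement("orders");
        doc.appendChild(root);

        Element order = doc.createElement("order");
        order.setAttribute("id", "1001");        // would come from a ResultSet
        order.setTextContent("sample payload");  // placeholder value
        root.appendChild(order);

        Transformer t = TransformerFactory.newInstance().newTransformer();
        t.setOutputProperty(OutputKeys.INDENT, "yes");
        t.transform(new DOMSource(doc), new StreamResult(new File("orders.xml")));
    }
}
```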

Environment: JSP, JavaScript, jQuery, HTML, CSS, Ajax, Oracle 11g, MySQL 2.1, Swing, Java Web Server 2.0, Red Hat Linux 7.1.
