
Apache Hadoop & Spark Developer Resume


Fort Mill, SC

SUMMARY:

Overall 9 years of total IT experience, including 4+ years of experience in Big Data and its ecosystem components.

TECHNICAL SKILLS:

Hadoop Distribution: Cloudera (CDH 5.6)

Hadoop Components: MR, Hive, Pig, Sqoop, Spark Core, Spark SQL, Oozie, Spark Streaming

Programming Languages: Java, Scala

Databases: HBase, Oracle 11g, MySQL.

Search engine: Elasticsearch

Other Technical Skills: SQL, PLSQL, UNIX, Linux, Shell Scripting.

Tools: Eclipse, SBT, SQL Plus, SQL Developer, PuTTY, WinSCP, Maven, Git, SVN.

PROFESSIONAL EXPERIENCE:

Apache Hadoop & Spark Developer

Confidential, Fort Mill, SC

Environment: Spark, Scala, HBase, Elasticsearch, Filebeat, Logstash, Web services, Kafka

Responsibilities:

  • Worked on the Auto-Regex, Message Subset Generation, and Message Classification modules using Spark and Scala.
  • Set up Filebeat and Logstash to index all error messages into Elasticsearch.
  • Wrote HBase and Elasticsearch Java clients and queries.
  • Built a lookup API for company search covering around 1.6 million companies.
  • Developed a custom Elasticsearch scoring plug-in.
  • Scraped data and indexed it into Elasticsearch.
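A message-classification module of this kind typically matches each record against an ordered set of regular expressions. A minimal sketch of that per-record logic in plain Java (rule names and patterns are hypothetical; in the actual Spark job this would run inside a map transformation):

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.regex.Pattern;

// Hypothetical error-message classifier: ordered label -> regex rules,
// first match wins. A Spark job would call classify() per record in map().
public class MessageClassifier {
    private static final Map<String, Pattern> RULES = new LinkedHashMap<>();
    static {
        RULES.put("timeout",    Pattern.compile("(?i)timed? ?out"));
        RULES.put("connection", Pattern.compile("(?i)connection (refused|reset)"));
        RULES.put("auth",       Pattern.compile("(?i)unauthorized|forbidden|401|403"));
    }

    public static String classify(String message) {
        for (Map.Entry<String, Pattern> rule : RULES.entrySet()) {
            if (rule.getValue().matcher(message).find()) {
                return rule.getKey();
            }
        }
        return "unclassified";
    }

    public static void main(String[] args) {
        System.out.println(classify("Read timed out after 30s"));        // timeout
        System.out.println(classify("Connection refused by host db01")); // connection
    }
}
```

A `LinkedHashMap` keeps the rules in insertion order, so more specific patterns can be listed before generic fallbacks.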

Hadoop Developer

Confidential

Environment: HDFS, Sqoop, Hortonworks, Hive, Logstash, Pig, MapReduce

Responsibilities:

  • Worked on the Hortonworks platform. Developed a data pipeline using Logstash and Sqoop to ingest customer behavioral data and financial histories from traditional databases into HDFS for analysis.
  • Involved in writing MapReduce jobs.
  • Used Sqoop and the HDFS put/copyFromLocal commands to ingest data.
  • Involved in developing Pig UDFs for functionality not available in Apache Pig.
  • Used Hive to analyze partitioned and bucketed data and compute various metrics for reporting.
  • Involved in developing Hive DDLs to create, alter and drop Hive tables.
  • Involved in processing ingested raw data using MapReduce, Apache Pig and Hive.
  • Responsible for developing a data pipeline using Logstash, Sqoop and Pig to extract data from weblogs and store it in HDFS.
  • Involved in developing Hive UDFs for functionality not available in Apache Hive.
  • Involved in using Sqoop for importing and exporting data into HDFS.
  • Worked on developing shell scripts to orchestrate execution of all other scripts (Pig, Hive and MapReduce) and move data files within and outside of HDFS.
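The reporting metrics described above boil down to group-by aggregations over the partitioned data. A rough sketch of the equivalent logic in plain Java (field names and sample values are illustrative, standing in for a Hive query such as `SELECT dt, COUNT(DISTINCT user_id), SUM(amount) FROM events GROUP BY dt`):

```java
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

// Per-partition metrics: for each date partition dt, compute
// { distinct user count, total amount }. Field names are hypothetical.
public class DailyMetrics {
    public record Event(String dt, String userId, double amount) {}

    public static Map<String, double[]> compute(List<Event> events) {
        return events.stream().collect(Collectors.groupingBy(
            Event::dt,
            Collectors.collectingAndThen(Collectors.toList(), evs -> new double[] {
                evs.stream().map(Event::userId).distinct().count(), // COUNT(DISTINCT user_id)
                evs.stream().mapToDouble(Event::amount).sum()       // SUM(amount)
            })));
    }

    public static void main(String[] args) {
        List<Event> events = Arrays.asList(
            new Event("2015-01-01", "u1", 10.0),
            new Event("2015-01-01", "u2", 5.0),
            new Event("2015-01-01", "u1", 2.5),
            new Event("2015-01-02", "u3", 7.0));
        double[] m = compute(events).get("2015-01-01");
        System.out.println(m[0] + " users, " + m[1] + " total"); // 2.0 users, 17.5 total
    }
}
```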

Hadoop Developer

Confidential, Basking Ridge, NJ

Environment: HDFS, MapReduce2, YARN, Hive, HBase, Sqoop, Scala, Logstash

Responsibilities:

  • Responsible for building scalable distributed data solutions using Hadoop.
  • Managed jobs using the Fair Scheduler and developed job-processing scripts using Oozie workflows.
  • Used Spark Streaming APIs to perform the necessary transformations and actions on the fly.
  • Configured, deployed and maintained multi-node Dev and Test Kafka clusters.
  • Used Spark API over Cloudera Hadoop YARN to perform analytics on data in Hive.
  • Developed Scala scripts and UDFs using both DataFrames/SQL and RDDs in Spark 1.6 for data aggregation and queries, and wrote data back into the OLTP system through Sqoop.
  • Experienced in performance tuning of Spark applications: setting the right batch interval, the correct level of parallelism, and memory tuning.
  • Loaded data into Spark RDDs and performed in-memory computation to generate the output response.
  • Optimized existing Hadoop algorithms using SparkContext, Spark SQL, DataFrames and pair RDDs.
  • Worked on migrating Map Reduce programs into Spark transformations using Spark and Scala.
  • Worked extensively with Sqoop for importing metadata from Oracle.
  • Experience working with Amazon AWS to set up Hadoop clusters.
  • Involved in creating Hive tables and loading and analyzing data using Hive queries.
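Migrating a MapReduce program to Spark mostly means re-expressing the mapper and reducer as chained transformations, and the classic word count illustrates the shape. A sketch in plain Java, with streams standing in for the RDD API:

```java
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.function.Function;
import java.util.stream.Collectors;

// Word count in the same flatMap -> map -> reduceByKey shape used when
// porting a MapReduce job to Spark, e.g.:
//   sc.textFile(path).flatMap(l -> words).mapToPair(w -> (w, 1)).reduceByKey(sum)
public class WordCount {
    public static Map<String, Long> count(List<String> lines) {
        return lines.stream()
                .flatMap(line -> Arrays.stream(line.split("\\s+"))) // map phase
                .collect(Collectors.groupingBy(Function.identity(), // shuffle
                        Collectors.counting()));                    // reduce phase
    }

    public static void main(String[] args) {
        Map<String, Long> counts = count(List.of("spark on yarn", "spark sql on hive"));
        System.out.println(counts.get("spark")); // 2
    }
}
```

The stream pipeline is only an analogy: on a Spark RDD the `groupingBy`/`counting` step becomes `reduceByKey`, which aggregates map-side before the shuffle.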

Java Developer

Confidential, Jersey City, NJ

Environment: Core Java/J2EE, Servlets, EJB, JSP, Struts 1.3 (MVC2), Hibernate (ORM), JDBC, HTML, XML, CSS, jQuery, Ajax, UI development, HLD/LLD, Linux, SVN repository.

Responsibilities:

  • Developed JSP pages, Action classes, DTOs, DAOs, form beans, procedure metadata, reports modules, and a pie-chart implementation.
  • Worked on server-side and client-side validations using jQuery.
  • Worked on transaction handling.
  • Implemented the business logic used under the various modules in the System.
  • Understanding and Analyzing requirements.
  • Responsible for generating UI for the application.
  • Set up the SVN repository for the server.
  • Involved in testing & debugging.
  • Requirement gathering/collection.

Java/ J2EE Developer

Confidential, Middletown, NJ

Environment: Java, J2EE, Flex 3.0, ActionScript 3.0, LCDS, GWT 2.4.0, PureMVC framework, Cairngorm, Spring 2.5.2, JSF 1.2, ICEFaces 1.7, EJB 3.0, JMS, Ajax, Kodo 4.2, OpenJPA 1.0.2, Web

Responsibilities:

  • Responsible for Architecture, Design and Development
  • Involved in requirement review and application design.
  • Involved in integrating different projects and coordinating with different teams
  • Involved in migrating repository code bases from Clear Case to SVN.
  • Involved in mentoring the teams on Flex, Cairngorm, and RESTful web services.
  • Involved in developing POMs using Maven.
  • Developed interactive UI screens using Adobe Flex and the Cairngorm framework.
  • Developed services using RESTful web services and Spring configurations for MQ queues.
  • Led a team of five on both UI and services for design and development.
  • Handled Multiple Projects simultaneously, while working on the development.
  • Developed Unit Test Cases using Flex Unit.
  • Used ClearCase extensively for builds and baselines for the project.
  • Made extensive use of the HP Opsware client to create packages for UI and services, data sources and MQ queues, and monitored build deployments.
  • Developed DAOs using the Home Depot internal framework for interacting with the database.

Java Developer

Confidential, Richmond, VA

Environment: Sun ONE, Java Servlets, JSP, JavaScript, AJAX, Amateras UML 1.3.1, Maven, MyEclipse 6.0, JUnit, CVS, SQL, PL/SQL, SQL Server 9.0, FileZilla, Struts, JATO, UNIX scripts.

Responsibilities:

  • Involved in design, development and implementation of the application.
  • Used OO techniques such as UML methodology and developed class diagrams and sequence diagrams that depict the code’s design and its compliance with the functional requirements using AmaterasUML 1.3.1 tool.
  • Involved in identifying the reengineering framework for the redesign.
  • Worked on making mainframe calls to get the reply set and returning the data as java objects/collections from Model to the View.
  • Involved in the complete migration from the JATO framework to Struts.
  • Worked on the Struts framework: Action controllers, Model and View (JSP), ActionForms, JavaBeans, Java objects and collections, JavaScript, and Struts tag libraries.
  • Made UI changes using Cascading Style Sheets (CSS) and functional enhancements for screens in the Loan modules that were in the JATO framework.
  • Worked on the JATO framework: ViewBean, TiledViewBean, JATO tag libraries, and using JATO tags in JSP.
  • Used Maven to build the WAR (web archive) file and deployed it in the Sun ONE Application Server.
  • Used JavaScript extensively to code client-side validations.
  • Supported System Testing, UAT Testing and Production.
