
Hadoop And Spark Developer Resume


SUMMARY

  • Overall 3+ years of professional IT experience in the Big Data ecosystem and Core Java technologies.
  • Excellent knowledge of Hadoop architecture and its components, such as HDFS, JobTracker, TaskTracker, NameNode, DataNode, and the Spark framework.
  • Hands-on experience in installing, configuring, and using Hadoop ecosystem components such as Apache Spark, HDFS, HBase, Spark SQL, Sqoop, Zookeeper, Kafka, and Flume.
  • Hands-on experience with Spark's fundamental building blocks - RDDs - and the operations performed on them, such as transformations, actions, and functions, for implementing business logic.
  • In-depth understanding of DataFrames and Datasets in Spark SQL.
  • Experience in importing and exporting data using Sqoop from HDFS to Relational Database Systems and vice-versa.
  • Good understanding of cloud configuration in Amazon Web Services (AWS) EC2 and good knowledge of AWS S3 as a storage mechanism.
  • Hands-on experience with Scala for batch processing and Spark Streaming.
  • Strong experience in Object-Oriented Design, Analysis, Development, Testing, and Maintenance
  • Worked with IDEs such as Eclipse and IntelliJ IDEA for developing, deploying, and debugging applications.
  • Expertise in working with relational databases such as Oracle 10g, SQL Server 2012.
  • Good knowledge of stored procedures, functions, etc. using SQL and PL/SQL.
  • Expertise in Core Java technologies like Servlets, JSP, Struts, and JDBC.
  • Experience with test automation tools such as SoapUI.
  • Strong knowledge of the Software Development Life Cycle and expertise in detailed design documentation.
  • Excellent problem-solving and communication skills; ability to perform at a high level and meet deadlines.
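The RDD fundamentals listed above (lazy transformations, actions triggering execution) can be sketched in Scala. This is an illustrative local-mode word count, not code from the projects below; the app name and input data are made up:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object RddBasics {
  def main(args: Array[String]): Unit = {
    // Local-mode context for illustration only; a real job would run on a cluster.
    val conf = new SparkConf().setAppName("rdd-basics").setMaster("local[*]")
    val sc   = new SparkContext(conf)

    val lines = sc.parallelize(Seq("a b", "a c", "b c"))

    // Transformations are lazy: nothing executes until an action is called.
    val counts = lines
      .flatMap(_.split(" "))    // transformation
      .map(word => (word, 1))   // transformation
      .reduceByKey(_ + _)       // transformation

    // collect() is an action: it triggers execution and returns the results.
    counts.collect().sorted.foreach(println)

    sc.stop()
  }
}
```

Running this prints the word counts (a,2), (b,2), (c,2); swapping `collect()` for `count()` or `saveAsTextFile(...)` would trigger the same lineage with a different action.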

TECHNICAL SKILLS

Big Data: HDFS, Apache Spark, Spark SQL, Spark streaming, Zookeeper, Hive, Sqoop, HBase, Kafka, Flume, Yarn, AWS EC2

Languages: Scala, Java, Python (basic knowledge), Shell Scripting

Java Technologies: JSP, Servlets, JDBC

Client Technologies: JavaScript, AJAX, CSS, HTML, XHTML

Web Technologies: JSP, Servlets, Socket Programming, JDBC, JavaScript, Web Services

Databases: MySQL, Oracle 10g/11g, Microsoft SQL Server 2012

IDE / Testing Tools: Eclipse, IntelliJ IDEA

Operating System: Windows, Unix, Linux

Tools: SQL Developer, Maven.

PROFESSIONAL EXPERIENCE

Confidential

Hadoop and Spark Developer

Responsibilities:

  • Involved in requirements gathering in coordination with Business Analysts.
  • Worked closely with Business Analysts and the client to create technical documents such as High-Level Design and Low-Level Design specifications.
  • Involved in setting up a nine-node AWS EC2 cluster for Hadoop.
  • Worked on a nine-node CDH3 Hadoop cluster on Red Hat Linux.
  • Implemented best-income logic using Spark SQL.
  • Loaded and transformed large sets of structured, semi-structured, and unstructured data.
  • Imported data from MySQL into HDFS using Sqoop on a regular basis.
  • Developed RDD-based jobs for various Hadoop programs.
  • Wrote Spark SQL queries for data analysis to meet business requirements.
  • Worked on defining job flows.
  • Provided cluster coordination services through Kafka and Zookeeper.
  • Serialized JSON data and stored it in tables using Spark SQL.
  • Wrote shell scripts to automate the process flow.
  • Stored the extracted data in HDFS using Flume.
  • Worked on multiple file formats, including XML, JSON, CSV, and other compressed formats.
  • Developed Kafka producer and consumer components for real-time data processing.
  • Wrote queries in Spark SQL using Scala.
  • Provided testing support and troubleshot and fixed issues.
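The JSON-to-table flow described above (data landing in HDFS via Flume, then serialized into tables with Spark SQL) could look like this sketch. The input path, the `event_date` field, and the table name are assumptions for illustration, not project details:

```scala
import org.apache.spark.sql.SparkSession

object JsonToTable {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("json-to-table")
      .master("local[*]")   // local mode for illustration only
      .getOrCreate()

    // Hypothetical input path; in the project, Flume delivered these files to HDFS.
    val events = spark.read.json("hdfs:///data/events/*.json")

    // Register the DataFrame as a temporary view so it can be queried with SQL.
    events.createOrReplaceTempView("events")

    val daily = spark.sql(
      """SELECT event_date, COUNT(*) AS cnt
        |FROM events
        |GROUP BY event_date""".stripMargin)

    // Persist the aggregated result as a managed table.
    daily.write.mode("overwrite").saveAsTable("daily_event_counts")

    spark.stop()
  }
}
```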

Environment: Hadoop HDFS, Apache Spark, Spark-Core, Spark-SQL, Scala, JDK 1.7, Sqoop, Eclipse, MySQL, AWS EC2, CentOS Linux and ZooKeeper
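The regular Sqoop import from MySQL into HDFS mentioned in the responsibilities above can be sketched as a shell command; the host, database, table, and target directory are placeholders:

```shell
# Illustrative Sqoop import: pull a MySQL table into HDFS as text files.
# Connection string, credentials, table, and paths are placeholders.
sqoop import \
  --connect jdbc:mysql://db-host:3306/sales \
  --username etl_user -P \
  --table orders \
  --target-dir /data/raw/orders \
  --num-mappers 4
```

`-P` prompts for the password interactively; in a scheduled job this would typically be replaced by `--password-file` pointing to a protected HDFS file.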

Confidential

Java Developer

Responsibilities:

  • Designed and implemented the training and reports modules of the application using Servlets, JSP, and Ajax.
  • Developed custom JSP tags for the application.
  • Used Quartz schedulers to run jobs sequentially at given times.
  • Implemented design patterns like Filter, Cache Manager, and Singleton to improve the performance of the application.
  • Implemented the reports module of the application using Jasper Reports to display dynamically generated reports for business intelligence.
  • Deployed the application in client's location on Tomcat Server.
  • Involved in various phases of the Software Development Life Cycle (SDLC).
  • Developed user interfaces using JSP framework with AJAX, Java Script, HTML, XHTML and CSS.
  • Created tables, stored procedures in SQL for data manipulation and retrieval.
  • Responsible for managing data coming from various sources.
  • Worked on the Communications Module and the Inventory Management System.
  • Analyzed and prepared the Requirements Analysis Document.
  • Used Core Java and EJB to handle business flow and functionality.
  • Monitored test cases to verify actual results against expected results.
  • Normalized SQL database conforming to design concepts and best practices.
  • Resolved product complications at customer sites and funneled the insights to the development and deployment teams to adopt long-term product development strategy with minimal roadblocks.
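The Singleton-backed cache manager mentioned in the responsibilities above can be sketched as follows. The project used Java, but a Scala `object` (shown here for consistency with the other examples) gives the same single-instance guarantee; the class and method names are illustrative:

```scala
// Minimal Singleton cache-manager sketch: a Scala `object` is instantiated
// lazily, exactly once, so all callers share the same cache.
object CacheManager {
  private val cache = scala.collection.mutable.Map.empty[String, String]

  // synchronized guards the mutable map against concurrent access.
  def put(key: String, value: String): Unit = synchronized { cache(key) = value }

  def get(key: String): Option[String] = synchronized { cache.get(key) }
}
```

Usage: `CacheManager.put("user:1", "Alice")` followed by `CacheManager.get("user:1")` returns `Some("Alice")`, while a missing key returns `None`.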

Environment: Java, Core Java, JSP, Spring, SQL server, Eclipse, JUnit, AJAX, Web services, XML Schema, HTML, CSS, JavaScript, WebLogic
