
Big Data Developer Resume


Arlington Heights, IL

SUMMARY

  • 6+ years of experience in the Software Development Life Cycle (SDLC) working with Java/J2EE and Big Data technologies in industries including banking, insurance and e-commerce
  • Hands-on experience in the Big Data ecosystem with Hadoop, HDFS, MapReduce, Hive, HBase, Sqoop, Flume, Kafka, Oozie, Solr, ZooKeeper and Spark
  • Experience in writing MapReduce programs to manipulate and analyze unstructured data
  • Experience using Sqoop to import data into HDFS from RDBMS and vice versa
  • Experience in data collection, processing and streaming with Kafka
  • Knowledge of serialization formats such as SequenceFile, Avro and Parquet
  • Working knowledge with Spark, Scala and Python
  • Proficient in writing HiveQL and SQL queries to achieve data manipulation
  • Experienced in writing custom UDFs to extend Hive core functionality
  • Working experience with NoSQL databases, including HBase 0.98
  • Involved in scheduling batch job workflows using Oozie
  • Expertise in Core Java, data structures, algorithms, Object-Oriented Design (OOD) and design patterns
  • Extensive experience in web frameworks including J2EE, Spring MVC, Hibernate
  • Front End with HTML5/4, CSS3, JavaScript, jQuery, AngularJS, AJAX, Bootstrap
  • Adept in Data Visualization with D3.js, Tableau
  • Working experience with RDBMS including MySQL 5.x, MS SQL Server 2008 and Oracle 10g
  • Experience in TDD (Test Driven Development) methodologies
  • Hands-on experience with unit testing frameworks such as JUnit and MRUnit
  • Worked in development environments using Git, SVN, JIRA, Confluence and Jenkins under Agile/Scrum and Waterfall methodologies
  • Passionate, results-driven developer with superb interpersonal skills and ability to perform well under pressure

TECHNICAL SKILLS

Hadoop Ecosystem: Hadoop, Spark, MapReduce, Pig, Hive, Sqoop, Flume, Kafka, Solr, HBase, Oozie, ZooKeeper, Kerberos, MRUnit

Web Frameworks: J2EE, Spring MVC, Hibernate, JavaScript, jQuery, AngularJS, HTML5/4, CSS3

Programming Languages: Java, Python, JavaScript, Scala, Ruby

Data Analysis & Visualization: Matlab, Tableau, D3.js

Scripting Languages: UNIX Shell, HTML, XML, CSS, JSP, SQL, Matlab

Databases: MySQL 5.x, Oracle 10g, MS SQL Server 2008, HBase 0.98

Operating Systems: Mac OS, Ubuntu, CentOS, Windows

Environment: Agile, Scrum, Waterfall

IDEs: Eclipse, NetBeans, Sublime Text, PyCharm, Notepad++

Collaboration: Git, SVN, JIRA, Confluence

PROFESSIONAL EXPERIENCE

Confidential, Arlington Heights, IL

Big Data Developer

Responsibilities:

  • Used RabbitMQ and Kafka as messaging systems to collect data from various sources into either HDFS or the streaming engine
  • Developed RabbitMQ and Kafka consumer programs in Java
  • Involved in processing streaming data using Spark Streaming with Scala
  • Used Apache Avro and Protocol Buffers to transform data between different formats
  • Used Sqoop to transfer data between RDBMS and HBase
  • Developed a Java API that lets BI teams query data from HBase faster
  • Created Hive tables on top of HDFS and developed Hive queries for the analysts
  • Implemented Partitioning, Dynamic Partitions, Buckets in Hive
  • Performed unit testing using JUnit
  • Worked closely with team members, managers and other teams
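The Hive modeling work above (tables created on top of HDFS for analysts, with partitioning, dynamic partitions and buckets) can be sketched as HiveQL DDL. This is a minimal illustrative sketch: the table name, columns, bucket count, storage format and paths are hypothetical placeholders, not details from the actual project.

```sql
-- Hypothetical example: an external Hive table over HDFS data,
-- partitioned by date and bucketed by user_id.
CREATE EXTERNAL TABLE IF NOT EXISTS events (
  user_id    BIGINT,
  event_type STRING,
  payload    STRING
)
PARTITIONED BY (event_date STRING)
CLUSTERED BY (user_id) INTO 32 BUCKETS
STORED AS ORC
LOCATION '/data/events';

-- Dynamic-partition insert from a staging table: Hive routes each row
-- to the partition named by its event_date value.
SET hive.exec.dynamic.partition.mode=nonstrict;
INSERT OVERWRITE TABLE events PARTITION (event_date)
SELECT user_id, event_type, payload, event_date
FROM events_staging;
```

Bucketing by a high-cardinality key such as a user ID spreads rows evenly across files, which helps sampling and map-side joins.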

Environment: Hadoop 2.x, MapReduce, Sqoop, Hive, HBase, Kafka, RabbitMQ, Java 7, Spark, Scala, Avro, ZooKeeper, Eclipse

Confidential, Mayfield Village, OH

Big Data Developer

Responsibilities:

  • Used Kafka to collect data sent by sensors in cars
  • Implemented Spark using Scala and Spark SQL for faster testing and processing of data
  • Used Sqoop to import data from Oracle to HDFS
  • Implemented data serialization using Apache Avro
  • Used the HBase Java API to migrate data between HDFS and HBase
  • Used Oozie scheduler system to automate the pipeline workflow
  • Implemented external tables and dynamic partitions using Hive
  • Wrote custom Hive UDFs to analyze data against a given schema
  • Used Git for version control, JIRA for project tracking, Confluence for documentation collaboration
  • Attended weekly meetings with technical collaborators and actively participated in code review sessions with other developers
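An import like the one described above (Oracle into HDFS, with Avro serialization) is typically a single Sqoop invocation. The sketch below is hypothetical: the connection string, credentials, table name, target directory, mapper count and split column are placeholders, not values from the original project.

```shell
# Hypothetical Sqoop import: pull an Oracle table into HDFS as Avro files.
sqoop import \
  --connect jdbc:oracle:thin:@dbhost:1521:ORCL \
  --username etl_user -P \
  --table CLAIMS \
  --target-dir /data/raw/claims \
  --as-avrodatafile \
  --num-mappers 4 \
  --split-by CLAIM_ID
```

`--split-by` names a column Sqoop uses to divide the table range among the parallel mappers; `-P` prompts for the password instead of putting it on the command line.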

Environment: Apache Hadoop 2.x, MapReduce, HDFS, Hive, Apache Avro, Sqoop, Oozie, Kafka, Oracle 11g, Linux, Java 7, Eclipse

Confidential, Salt Lake City, UT

Big Data Developer/Analyst

Responsibilities:

  • Used Sqoop to transfer data between Oracle and HDFS
  • Responsible for the design and creation of Hive tables, partitioning, bucketing, loading data and writing Hive queries
  • Created HBase tables to store inquiry, email and other semi-structured data
  • Responsible for managing data coming from different sources
  • Involved in managing and reviewing Hadoop log files
  • Developed multiple MapReduce jobs in Java for data cleaning and preprocessing
  • Automated workflows using Oozie to pull data from various databases into Hadoop
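A workflow of the kind described above (Oozie pulling database tables into Hadoop) can be expressed as an Oozie workflow.xml wrapping a Sqoop action. This is an illustrative config sketch: the app name, table, paths and parameter names are assumptions, not details from the original project.

```xml
<!-- Hypothetical workflow.xml: one Sqoop action that imports a table
     into HDFS, then ends; failure routes to the kill node. -->
<workflow-app name="db-ingest-wf" xmlns="uri:oozie:workflow:0.4">
    <start to="sqoop-import"/>
    <action name="sqoop-import">
        <sqoop xmlns="uri:oozie:sqoop-action:0.2">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <command>import --connect ${jdbcUrl} --table ORDERS --target-dir /data/raw/orders -m 1</command>
        </sqoop>
        <ok to="end"/>
        <error to="fail"/>
    </action>
    <kill name="fail">
        <message>Sqoop import failed: ${wf:errorMessage(wf:lastErrorNode())}</message>
    </kill>
    <end name="end"/>
</workflow-app>
```

In practice such a workflow is triggered on a schedule by an Oozie coordinator, with `${jobTracker}`, `${nameNode}` and `${jdbcUrl}` supplied from a job.properties file.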

Environment: Apache Hadoop 2.x, MapReduce, HDFS, Hive, HBase, Oozie, Sqoop, Linux, Oracle 11g, Java 7, Eclipse

Confidential, Boise, ID

Java Fullstack Developer

Responsibilities:

  • Actively participated in the Agile Development Process
  • Participated in all phases of the Software Development Life Cycle (SDLC) of the application, including requirements gathering, analysis, design, development and deployment
  • Developed front end with HTML, CSS, JavaScript, JSP, AJAX, jQuery, Bootstrap and HIGHCHARTS
  • Implemented Jersey JAX-RS API to create RESTful Web Services
  • Regularly contributed new features to all layers of the project from the User Interface to the DataBase
  • Optimized a tool's data-loading performance, achieving a 60% speedup
  • Designed and implemented the database
  • Regularly fixed bugs across all layers of the project
  • Developed unit test cases with JUnit
  • Used Maven, Sonar, Jenkins and the Atlassian suite (JIRA, Fisheye, Confluence)

Environment: NetBeans, MS SQL Server, JDBC, JSP, HTML, CSS, JavaScript, jQuery, AJAX, JUnit, JSON, Jersey, RESTful, Agile, Maven, Sonar, Jenkins, JIRA, Fisheye, Confluence

Confidential, Logan, UT

Java Developer

Responsibilities:

  • Involved in requirement review, design document preparation, UML model diagrams and UI Mockup creation
  • Designed and implemented a three-party authentication protocol based on Needham-Schroeder-Lowe protocol
  • Developed a key service for encryption purposes
  • Implemented APIs for the phMPI
  • Performed debugging and functionality enhancements
  • Developed unit test cases with JUnit

Environment: Eclipse, MySQL, JDBC, JUnit, SVN, Visual Paradigm UML

Confidential

Java Fullstack Developer

Responsibilities:

  • Worked in Spring MVC Framework with Agile methodology
  • Developed front end with HTML, CSS, JavaScript, JSP, AJAX and jQuery
  • Translated designs and style guides provided by the UI Designer into functional user interfaces
  • Implemented Jersey JAX-RS API to create RESTful Web Services
  • Implemented DAO using Hibernate for database connectivity to MySQL database
  • Wrote Stored Procedures for querying, inserting and managing the database
  • Developed unit test cases with JUnit

Environment: Spring MVC, MySQL, JSP, HTML, CSS, JavaScript, jQuery, AJAX, Hibernate 3, JUnit, JSON, Jersey, RESTful, Agile
