
Big Data/Hadoop Developer Resume


FL, USA

PROFILE SUMMARY:

  • 8 years of experience in Information Technology, with solid working experience as a Full Stack Web Developer, C# Developer, and Lead Java Developer, and for the past 3 years as a Hadoop Developer.
  • Around 8 years of experience in the IT industry with a strong emphasis on object-oriented analysis, design, development, implementation, testing, and deployment of Big Data and web-enabled applications.
  • 3 years of experience with Hadoop, HDFS, MapReduce (MRV1 & MRV2 YARN), and the Hadoop ecosystem (Pig, Hive, HBase, Sqoop, Flume, Zookeeper, Oozie).
  • Cloudera Certified Hadoop Developer; EMC Certified Data Science Associate.
  • Experience working with NoSQL systems such as Redis, Cassandra and HBase.
  • Experience working with the ELK (Elasticsearch, Logstash, Kibana) stack.
  • Experience writing MapReduce jobs, HiveQL, and Pig scripts.
  • Good experience optimizing MapReduce jobs with combiners and custom partitioners to cut shuffle traffic and deliver the best results (see the sketch after this list).
  • Good experience loading data from RDBMS systems into HDFS using Sqoop and Flume.
  • Performed data analytics using Pig, Hive, and HAWQ.
  • Developed user-defined functions (UDFs) to give Hive and Pig additional custom capabilities (a Hive UDF sketch follows this list).
  • Developed interactive visualizations using Tableau and D3.js.
  • Experience in designing Big Data solutions for traditional enterprise businesses.
  • Experience setting up Hadoop in its various modes (standalone, pseudo-distributed, fully distributed) and integrating ecosystem components with Hadoop.
  • Good experience writing MapReduce jobs in native Java, Pig, and Hive for various business use cases.
  • Built large, complex distributed systems on Hadoop that implement state-of-the-art analytics and scale well.
  • Experience debugging and fixing platform issues that arise in the open-source software in use.
  • Dive deep into specific Hadoop and non-Hadoop technologies and present that research to the team.
  • Ability to work in a fast-changing environment and learn new technologies effortlessly.
  • Participate in code reviews, software design sessions, and architectural reviews.
  • Design and implement MapReduce jobs to support distributed processing using Java, Hive, and Pig.
  • Build libraries, user defined functions, and frameworks around Hadoop.
  • Research, evaluate and utilize new technologies/tools/frameworks around Hadoop eco system.
  • Defined and built data acquisition and consumption strategies.
  • Experience in developing J2EE applications with Struts, JSP and Hibernate.
  • Experience in developing Java applications using Design Patterns and best practices.
  • Experience working with various RDBMSs including Oracle 11g, MS SQL Server, and MySQL.
  • Strong understanding of data warehouse concepts: ETL, star and snowflake schemas, data modeling using normalization, business process analysis and reengineering, dimensional data modeling, fact and dimension tables, and physical and logical data modeling.
  • Proficient with Java IDEs such as Eclipse, IntelliJ IDEA, and NetBeans.
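
A minimal, self-contained sketch of the combiner-and-partitioner pattern referenced above; this illustrates the technique and is not code from any project listed here. The job, class names, and partitioning scheme (routing by a key's first character) are hypothetical.

    // Illustrative sketch: a word-count-style job with a combiner and a custom partitioner.
    import java.io.IOException;
    import java.util.StringTokenizer;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Partitioner;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class TokenCountJob {

        public static class TokenMapper extends Mapper<Object, Text, Text, IntWritable> {
            private static final IntWritable ONE = new IntWritable(1);
            private final Text token = new Text();

            @Override
            protected void map(Object key, Text value, Context context)
                    throws IOException, InterruptedException {
                StringTokenizer itr = new StringTokenizer(value.toString());
                while (itr.hasMoreTokens()) {
                    token.set(itr.nextToken());
                    context.write(token, ONE); // one record per token; the combiner pre-aggregates
                }
            }
        }

        // Sum is associative and commutative, so the reducer can double as the combiner.
        public static class SumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
            @Override
            protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                    throws IOException, InterruptedException {
                int sum = 0;
                for (IntWritable v : values) {
                    sum += v.get();
                }
                context.write(key, new IntWritable(sum));
            }
        }

        // Hypothetical partitioner: routes keys by first character so related keys co-locate.
        public static class FirstCharPartitioner extends Partitioner<Text, IntWritable> {
            @Override
            public int getPartition(Text key, IntWritable value, int numPartitions) {
                if (key.getLength() == 0) {
                    return 0;
                }
                return (key.charAt(0) & Integer.MAX_VALUE) % numPartitions;
            }
        }

        public static void main(String[] args) throws Exception {
            Job job = Job.getInstance(new Configuration(), "token count");
            job.setJarByClass(TokenCountJob.class);
            job.setMapperClass(TokenMapper.class);
            job.setCombinerClass(SumReducer.class); // map-side pre-aggregation cuts shuffle traffic
            job.setPartitionerClass(FirstCharPartitioner.class);
            job.setReducerClass(SumReducer.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(IntWritable.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }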
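
Similarly, a minimal sketch of a Java UDF for Hive of the kind mentioned above; the class name and masking behavior are illustrative assumptions. In Hive it would be registered along the lines of: ADD JAR mask-udf.jar; CREATE TEMPORARY FUNCTION mask_email AS 'MaskEmailUDF';

    // Hypothetical Hive UDF: masks the local part of an email address.
    import org.apache.hadoop.hive.ql.exec.UDF;
    import org.apache.hadoop.io.Text;

    public class MaskEmailUDF extends UDF {
        private final Text result = new Text();

        // e.g. "bob@example.com" -> "user@example.com"
        public Text evaluate(Text email) {
            if (email == null) {
                return null;
            }
            String s = email.toString();
            int at = s.indexOf('@');
            if (at < 0) {
                return email; // not an email address; pass through unchanged
            }
            result.set("user" + s.substring(at));
            return result;
        }
    }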

TECHNICAL SKILLS:

Operating Systems: Windows, Linux (Ubuntu, CentOS, and Red Hat)

Big Data Platform: Hadoop, Java MapReduce (MRV1, MRV2 YARN), Pig, Hive, HBase, Sqoop, Zookeeper, Oozie, Flume, Storm, Redis, Cassandra, Tableau, D3.js

Programming Skills: Core Java, Python, C, PHP

Web Technologies: JSP, Servlets, JSF, HTML, CSS, jQuery

Databases (RDBMS): MySQL, PostgreSQL

NoSQL: Redis, Cassandra, HBase

Frameworks: Struts, Hibernate

Other Tools: Maven, JUnit, Log4j, Eclipse IDE

PROFESSIONAL EXPERIENCE:

Confidential, FL, USA

Big Data/Hadoop Developer

Tools/Tech Used: Java, Hadoop, MapReduce, HDFS, Hive, Pig, Storm, ELK Stack (Elasticsearch, Logstash, Kibana), JSON, Redis, Cassandra, D3.js, dc.js, Crossfilter.js, Eclipse

Responsibilities:

  • Gathered the business requirements from the Business Partners and Subject Matter Experts.
  • Used Logstash to ship logs from source nodes to a centralized log service in a secure and easy way.
  • Used Redis, an in-memory key-value NoSQL data store, as a messaging queue that decouples the pipeline from the sources and buffers logs arriving from multiple sources.
  • Used Apache Storm as a real-time event processing engine for log data coming from some 50 servers.
  • Used Drools as a rule-based engine for real-time alerting, fed with historical analytics feedback from Hadoop.
  • Responsible for managing data coming from different sources.
  • Used Elasticsearch and Kibana to enable real-time search on email logs for faster root-cause analysis, plus real-time interactive charts with widget support.
  • Used Cassandra, a column-oriented data store optimized for writes, to persist various statistics and pre-computed aggregations.
  • Used the Slim PHP framework and phpcassa to develop web services on top of Cassandra, supporting real-time querying through a REST API.
  • Used the visualization stack (D3.js, Crossfilter.js, and dc.js) to develop interactive charts on data pulled from the web services.
  • GUI was developed using Twitter Bootstrap.
  • Developed and supported MapReduce programs running on the cluster.
  • Developed a Storm topology that feeds on data from Redis and ingests it into Elasticsearch, Cassandra, and Hadoop (see the topology sketch after this list).
  • Implemented source-level filtering and tagged data using Grok patterns in Logstash.
  • Created Hive tables and worked on them using HiveQL.
  • Involved in installing Hadoop ecosystem components.
  • Managed and reviewed the Hadoop log files.
  • Wrote MapReduce jobs using the Java API.
  • Involved in HDFS maintenance and loading of structured and unstructured data.
  • Wrote Hive queries for data analysis to meet the business requirements.
  • Installed and configured Pig and wrote Pig Latin scripts.
  • Developed UDFs for Pig Data Analysis.
  • Used Log4j for logging throughout the project.
  • Developed scripts and batch jobs to schedule various Hadoop programs.
  • Utilized Agile Scrum methodology to help manage and organize a team of 4 developers, with regular code review sessions.
  • Used JUnit for unit testing and Continuum for integration testing.
  • Worked hands-on with the ELT process.
  • Analyzed the data by running Hive queries and Pig scripts to understand user behavior.
  • Continuously monitored and managed the Hadoop cluster through Cloudera Manager.
  • Used Oozie workflow engine to run multiple Hive and Pig jobs.
  • Developed Hive queries to process the data and generate the data cubes for visualizing.
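
A minimal, self-contained sketch of a Storm topology in the shape described above (a spout draining Redis, bolts fanning out to Elasticsearch, Cassandra, and Hadoop). It uses the backtype.storm packages of pre-1.0 Storm; DummyLogSpout stands in for the real Redis-backed spout (which would BLPOP from a list via Jedis), and the single bolt only counts events per host. All names are hypothetical.

    import java.util.HashMap;
    import java.util.Map;

    import backtype.storm.Config;
    import backtype.storm.LocalCluster;
    import backtype.storm.spout.SpoutOutputCollector;
    import backtype.storm.task.TopologyContext;
    import backtype.storm.topology.BasicOutputCollector;
    import backtype.storm.topology.OutputFieldsDeclarer;
    import backtype.storm.topology.TopologyBuilder;
    import backtype.storm.topology.base.BaseBasicBolt;
    import backtype.storm.topology.base.BaseRichSpout;
    import backtype.storm.tuple.Fields;
    import backtype.storm.tuple.Tuple;
    import backtype.storm.tuple.Values;

    public class LogPipelineTopology {

        // Stand-in for the Redis-backed spout described above.
        public static class DummyLogSpout extends BaseRichSpout {
            private SpoutOutputCollector collector;
            private int i = 0;

            @Override
            public void open(Map conf, TopologyContext ctx, SpoutOutputCollector collector) {
                this.collector = collector;
            }

            @Override
            public void nextTuple() {
                collector.emit(new Values("host-" + (i++ % 3), "sample log line"));
            }

            @Override
            public void declareOutputFields(OutputFieldsDeclarer declarer) {
                declarer.declare(new Fields("host", "line"));
            }
        }

        // Counts events per host; real bolts would write to Elasticsearch/Cassandra/HDFS.
        public static class HostCountBolt extends BaseBasicBolt {
            private final Map<String, Long> counts = new HashMap<String, Long>();

            @Override
            public void execute(Tuple input, BasicOutputCollector collector) {
                String host = input.getStringByField("host");
                Long c = counts.get(host);
                counts.put(host, c == null ? 1L : c + 1);
            }

            @Override
            public void declareOutputFields(OutputFieldsDeclarer declarer) {
                // terminal bolt: declares no downstream stream
            }
        }

        public static void main(String[] args) throws Exception {
            TopologyBuilder builder = new TopologyBuilder();
            builder.setSpout("logs", new DummyLogSpout(), 1);
            builder.setBolt("count", new HostCountBolt(), 2)
                   .fieldsGrouping("logs", new Fields("host"));

            // Local mode for the sketch; production would use StormSubmitter.
            LocalCluster cluster = new LocalCluster();
            cluster.submitTopology("log-pipeline", new Config(), builder.createTopology());
            Thread.sleep(10000); // let the sketch run briefly, then shut down
            cluster.shutdown();
        }
    }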

Confidential, Hopkinton, MA

Sr. Java Developer / Hadoop Developer

Tools/Tech Used: Java, Hadoop, MapReduce, HAWQ, Hive, Pig, Linux, JSON, MySQL, Eclipse, Tableau

Responsibilities:

  • Loaded and transformed large sets of structured, semi-structured, and unstructured data.
  • Involved in defining job flows and managing and reviewing log files.
  • Developed a MapReduce program to anonymize the actual user names in email addresses with dummy user names (a hedged sketch follows this list).
  • Handled importing of data from various data sources and performed transformations using Hive, Pig, and MapReduce.
  • Supported MapReduce programs running on the cluster.
  • Involved in loading data from the UNIX file system into HDFS.
  • Installed and configured Hive and wrote Hive UDFs.
  • Involved in creating Hive tables, loading them with data, and writing Hive queries that run internally as MapReduce jobs.
  • Gained very good business knowledge of sendmail, mail log formats, error codes, mail server behavior, the syslog logging process, etc.
  • Active in the analysis, design, implementation, and deployment phases across the project's full Software Development Lifecycle (SDLC).
  • Validated the fields of the user registration and login screens by writing JavaScript validations.
  • Developed multiple real-time interactive dashboards that connect to HAWQ and query live data.
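
A hedged sketch of the anonymization program mentioned above: a map-only job (the driver would set the number of reduce tasks to 0) that replaces the local part of each email address with a consistent dummy alias. The regex, alias scheme, and class name are assumptions, not the original code.

    import java.io.IOException;
    import java.util.HashMap;
    import java.util.Map;
    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.NullWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;

    public class AnonymizeEmailMapper
            extends Mapper<LongWritable, Text, NullWritable, Text> {

        private static final Pattern EMAIL =
                Pattern.compile("([A-Za-z0-9._%+-]+)(@[A-Za-z0-9.-]+)");

        // The same real user always maps to the same dummy name within a map task.
        private final Map<String, String> aliases = new HashMap<String, String>();
        private final Text out = new Text();

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            Matcher m = EMAIL.matcher(value.toString());
            StringBuffer sb = new StringBuffer();
            while (m.find()) {
                String alias = aliases.get(m.group(1));
                if (alias == null) {
                    alias = "user" + aliases.size();
                    aliases.put(m.group(1), alias);
                }
                m.appendReplacement(sb, alias + m.group(2)); // keep the domain, swap the user
            }
            m.appendTail(sb);
            out.set(sb.toString());
            context.write(NullWritable.get(), out);
        }
    }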

Confidential

Sr. Java Developer / Hadoop Developer

Tools/Tech Used: Core Java, Hadoop, MapReduce, HAWQ, HBase

Responsibilities:

  • Developed MapReduce programs to validate files against given rules in parallel, replacing an earlier sequential process.
  • Implemented cron jobs to run the MapReduce jobs at scheduled intervals.
  • Involved in loading and transforming large sets of structured data into HBase and HAWQ using MapReduce programs.
  • Involved in designing and reviewing the architecture.
  • Implemented support for SCD Type 2 using HBase versioning support (see the sketch after this list).
  • Wrote MapReduce programs to perform joins on HBase tables.
  • Loaded data into HAWQ internal tables for faster point queries on fact data.
  • Loaded dimensional data on the fly to support queries on slowly changing dimension data.
  • Customized the PXF extension to support versioning by passing a timestamp as an argument to queries.
  • Developed reports on top of HAWQ by writing HAWQ SQL queries.
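
A minimal sketch of the SCD Type 2 approach described above, using the HBase 0.9x-era client API: each attribute change is written as a new cell version carrying its effective timestamp, and a point-in-time ("as-of") read uses a timestamp range. The table, column family, and column names are hypothetical, and the column family must be created to retain more than one version.

    import java.io.IOException;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.client.Get;
    import org.apache.hadoop.hbase.client.HTable;
    import org.apache.hadoop.hbase.client.Put;
    import org.apache.hadoop.hbase.client.Result;
    import org.apache.hadoop.hbase.util.Bytes;

    public class ScdType2Demo {
        private static final byte[] CF = Bytes.toBytes("d");     // family with VERSIONS > 1
        private static final byte[] COL = Bytes.toBytes("city");

        public static void main(String[] args) throws IOException {
            Configuration conf = HBaseConfiguration.create();
            HTable dim = new HTable(conf, "customer_dim");
            byte[] row = Bytes.toBytes("cust-42");

            // Each update carries its effective timestamp; HBase stores it as a new version.
            Put v1 = new Put(row);
            v1.add(CF, COL, 1000L, Bytes.toBytes("Boston"));
            dim.put(v1);

            Put v2 = new Put(row);
            v2.add(CF, COL, 2000L, Bytes.toBytes("Miami"));
            dim.put(v2);

            // "As-of" read: the latest version with timestamp below 1500 -> "Boston".
            Get asOf = new Get(row);
            asOf.setTimeRange(0L, 1500L);
            Result r = dim.get(asOf);
            System.out.println(Bytes.toString(r.getValue(CF, COL)));

            dim.close();
        }
    }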

Confidential

Java Developer

Tools/Tech: Core Java, Struts, Oracle 10g, Servlets, JSP, JavaScript, jQuery, HTML, CSS

Responsibilities:

  • Involved in the development of GUI (Graphical User Interface) components using JSP, jQuery, and HTML.
  • Used open-source tools such as jQuery UI widgets, jQuery Tabs, and jQuery DataTables to develop a rich front end.
  • Managed the web-tier layer using Struts, JSPs, the Struts tag library, and JavaScript.
  • Developed Struts action classes and action forms, performed action mapping using the Struts framework, and performed data validation in form beans and action classes (an illustrative action class follows this list).
  • Extensively used the Struts framework as the controller to handle client requests and invoke the model based on user requests.
  • Followed Java/J2EE best practices: continuous refactoring, minimizing database calls, and optimizing queries for better application performance.
  • Extensively used Java Design Patterns and best practices while coding various modules of the project.
  • Developed and tuned the database SQL queries.
  • Used Hibernate ORM to interact with the Oracle database to retrieve, insert, and update data.
  • Involved in mapping database tables to business objects using Hibernate (.hbm.xml files).
  • Prepared JUnit test classes for the application.
  • Involved in Unit Testing and Integration of the Modules.
  • Participated in system testing and environment testing.
  • Performed regression testing to resolve problems found in earlier versions.
  • Developed JSP custom tags (a tag library).
  • Used JavaBeans to handle business logic as the Model and Servlets to control application flow as the Controller.
  • Wrote front-end validations using JavaScript.
  • Performed Unit Testing using JUnit.
  • Provided a user manual and user training to ease users into the new system.
  • Used Eclipse IDE and Tomcat web application server in development.
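
An illustrative Struts 1 action class and form bean in the style described above; the class names, forward names ("success", "failure"), and validation logic are hypothetical. In a real project the two classes would live in separate files and be wired together in struts-config.xml.

    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    import org.apache.struts.action.Action;
    import org.apache.struts.action.ActionForm;
    import org.apache.struts.action.ActionForward;
    import org.apache.struts.action.ActionMapping;

    // Form bean: Struts populates it from the submitted HTML form fields.
    class LoginForm extends ActionForm {
        private String username;

        public String getUsername() { return username; }

        public void setUsername(String username) { this.username = username; }
    }

    public class LoginAction extends Action {

        @Override
        public ActionForward execute(ActionMapping mapping, ActionForm form,
                                     HttpServletRequest request,
                                     HttpServletResponse response) throws Exception {
            LoginForm loginForm = (LoginForm) form;

            // Data validation in the action class, as described above.
            if (loginForm.getUsername() == null || loginForm.getUsername().trim().isEmpty()) {
                return mapping.findForward("failure");
            }

            // Delegate to the model (e.g. a Hibernate-backed DAO) and route the result.
            request.getSession().setAttribute("user", loginForm.getUsername());
            return mapping.findForward("success");
        }
    }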

Confidential

Software Developer

Tools/Tech: Core Java, C#, Android, Pocket PC, Visual Studio, Eclipse, MS SQL Server, SQLite

Responsibilities:

  • Involved in Designing, Developing, Documenting and Unit Testing the web application.
  • Developed device drivers to communicate with GPS and DRM devices.
  • Developed software for navigation in GPS and GPS-less Confidential by integrating with MapWindow GIS to trace Confidential with history, messaging, zoom, pan, and other features.
  • Involved in gathering business requirements, analyzing the project, and creating UML diagrams such as use cases, class diagrams, sequence diagrams, and flowcharts.
  • Used the Visual Studio IDE for Windows application development, the Eclipse IDE for Android app development, and the Pocket PC emulator in Visual Studio to develop for Pocket PC.

Confidential

Full Stack Web Developer

Tools/Tech: WordPress API, PHP, jQuery, HTML, CSS

Responsibilities:

  • Involved in Designing, Developing, Documenting and Unit Testing the web application.
  • Involved in the development of the front end using PHP, HTML, CSS, and JavaScript.
  • Gained knowledge of Web App Monetization.
  • Developed a price comparison website using WordPress API.
  • Developed an RSS scraping plugin using the open-source SimplePie RSS parser.
  • Developed a plugin for posting Facebook comments on blog posts.
  • Developed a custom theme using HTML, CSS, and JavaScript.
  • Added custom metadata fields to posts, such as actual price, deal price, deal expiry, etc.
