Hadoop Developer Resume


Atlanta, GA

SUMMARY

  • Seven years of software development experience, including two years as a Hadoop developer, with strong knowledge of data structures, multithreading, and distributed computing
  • Configured and maintained Hadoop clusters on Amazon EC2, Microsoft Azure, and local Linux machines
  • Developed Apache Pig scripts for extract-transform-load (ETL) data pipelines, integrated the data, and processed terabytes of online advertising data using Hive Query Language
  • Developed User Defined Functions (UDFs) for Apache Pig and Hive using Python and Java languages
  • Exported relational databases into the Hadoop Distributed File System using Apache Sqoop and experimented with Apache Flume to capture log data
  • Deployed Apache Oozie to schedule Map-Reduce jobs
  • Developed scalable applications using NoSQL databases including HBase, MongoDB and DynamoDB
  • Experimented with installation and configuration of Hadoop clusters in single-node, pseudo-distributed, and multi-node modes using Apache, Cloudera, and Hortonworks distributions
  • Applied machine learning techniques to cluster data, analyze sentiment, and generate recommendations, and visualized the results using Tableau
  • Contributed to an open-source project, Parallel Cartographic Modeling Language, which processes huge map datasets by parallelizing operations across the cluster
  • An outstanding problem solver, able to quickly grasp complex problems and identify opportunities for improvements and resolution of critical issues
  • Willing to learn new technologies and apply them to solve complex computing problems
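The MapReduce and Python UDF work above follows the standard mapper/reducer contract. Below is a minimal sketch in Python, in the style of a Hadoop Streaming word count; the task and every name here are illustrative, not taken from any project on this resume.

```python
from itertools import groupby


def mapper(lines):
    """Map phase: emit one tab-separated (word, 1) record per token."""
    for line in lines:
        for word in line.strip().split():
            yield f"{word}\t1"


def reducer(lines):
    """Reduce phase: sum the counts for each word. Input must arrive
    sorted by key, which Hadoop's shuffle-and-sort phase guarantees."""
    pairs = (line.split("\t") for line in lines)
    for word, group in groupby(pairs, key=lambda kv: kv[0]):
        yield f"{word}\t{sum(int(count) for _, count in group)}"
```

In an actual cluster run, the two functions would live in separate scripts reading records from stdin, wired together with `hadoop jar hadoop-streaming.jar -mapper ... -reducer ...`.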

TECHNICAL SKILLS

Hadoop: MapReduce, Hortonworks Tez, Pig, Hive, HBase and Oozie

Java Technologies: Core Java, Servlets, JSP, JDBC, JNDI and Java Beans

IDEs: Eclipse and NetBeans

NoSQL Databases: HBase, MongoDB, DynamoDB and Microsoft DocumentDB

Frameworks: MVC, Struts, Hibernate and Spring

Programming languages: C, C++, Java, Python and Linux shell scripts

Databases: Oracle 11g/10g/9i, MySQL, DB2 and MS-SQL Server

Web Servers: WebLogic, WebSphere and Apache Tomcat

Web Technologies: HTML, CSS3, JavaScript and jQuery

PROFESSIONAL EXPERIENCE

Confidential, Atlanta, GA

Hadoop Developer

Environment: Apache Hadoop, HDFS, Hive, MapReduce, Core Java, Flume, Cloudera, Oozie, MySQL, UNIX

Responsibilities:

  • Evaluated the suitability of Hadoop and its ecosystem for the project and implemented various proof-of-concept (POC) applications before adopting them as part of the Big Data Hadoop initiative
  • Estimated software and hardware requirements for the NameNode and DataNodes and planned the cluster
  • Extracted the needed data from the server into HDFS and Bulk Loaded the cleaned data into HBase
  • Wrote MapReduce programs and Hive UDFs in Java where the required functionality was too complex for plain Hive queries
  • Involved in loading data from the Linux file system to HDFS
  • Developed Hive queries for the analysis to categorize different items
  • Designed and created Hive external tables using a shared metastore instead of the embedded Derby database, with partitioning, dynamic partitioning, and buckets
  • Delivered a POC of Flume to handle real-time log processing for attribution reports
  • Performed sentiment analysis on reviews of the products on the client’s website
  • Exported the resulting sentiment analysis data to Tableau for creating dashboards
  • Used JUnit to unit-test MapReduce programs
  • Maintained System integrity of all sub-components (primarily HDFS, MR, HBase, and Hive)
  • Reviewed peers’ Hive table creation, data loading, and queries
  • Monitored system health and logs and responded to any warning or failure conditions
  • Managed the test data coming from different sources
  • Scheduled the Oozie workflow engine to run multiple Hive and Pig jobs
  • Held weekly meetings with technical collaborators and actively participated in code review sessions with senior and junior developers
  • Created and maintained Technical documentation for launching Hadoop Clusters and for executing Hive queries and Pig Scripts
  • Involved in unit testing, interface testing, system testing, and user acceptance testing of the workflow tool
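The sentiment-analysis step above can be fed directly from Hive; one common pattern is a Python script invoked via Hive's `SELECT TRANSFORM ... USING`. A minimal lexicon-based sketch follows; the word lists and the tagging rule are hypothetical placeholders, not the model actually used on the project.

```python
# Tiny illustrative sentiment lexicons -- placeholders, not a real model.
POSITIVE = {"good", "great", "excellent", "love"}
NEGATIVE = {"bad", "poor", "terrible", "hate"}


def label(review: str) -> str:
    """Tag a review 'positive', 'negative', or 'neutral' by a naive
    count of lexicon hits in the review text."""
    words = review.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"


# As a Hive TRANSFORM script, each stdin line would carry a tab-separated
# (review_id, review_text) record, and the script would print
# review_id + "\t" + label(review_text) for export to Tableau.
```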

Confidential, Memphis, TN

Hadoop/Java Developer

Environment: Apache Hadoop, Core Java, Hive, Ubuntu, Eclipse, Cloudera, Sqoop, J2EE, Junit, HTML, JSP

Responsibilities:

  • Worked on designing application components using Java Collection framework and used multithreading to provide concurrent database access
  • Involved in analysis, design, construction and testing of the application
  • Developed the web tier using JSP to show details and summary
  • Designed and developed the UI using JSP, HTML and JavaScript
  • Utilized JPA for Object/Relational Mapping purposes for transparent persistence onto the SQL Server database
  • Used Tomcat web server for development purpose
  • Involved in creation of Test Cases for JUnit Testing
  • Used Oracle as the database and Toad for query execution; involved in writing SQL scripts and PL/SQL code for procedures and functions
  • Used CVS for version controlling
  • Developed the application using Eclipse and used Maven for build and deployment
  • Explored and used Hadoop ecosystem features and its architecture
  • Involved in meeting with the business team to gather their requirements
  • Migrated the data from staging database into HDFS using Sqoop
  • Wrote custom MapReduce code, generated JAR files for user-defined functions, and integrated them with Hive to help the analysis team with statistical analysis
  • Loaded and transformed large sets of structured, semi-structured, and unstructured data into HDFS
  • Involved in creating Hive tables, loading them with data, and writing Hive queries that run internally as MapReduce jobs
  • Held weekly meetings with technical collaborators and actively participated in code review sessions with senior and junior developers
  • Involved in installing and configuring Hadoop on the development cluster
  • Involved in unit testing, interface testing, system testing, and user acceptance testing of the workflow tool
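The custom MapReduce output handed to the analysis team above boils down to per-key aggregation in the reduce phase. Here is a reduce-side sketch in Python; the (category, value) record layout is an assumption for illustration only.

```python
from itertools import groupby


def category_stats(records):
    """Reduce-style aggregation: given (category, value) pairs sorted
    by category (as the shuffle phase delivers them), yield
    (category, count, mean) tuples for downstream statistical analysis."""
    for category, group in groupby(records, key=lambda kv: kv[0]):
        values = [float(value) for _, value in group]
        yield category, len(values), sum(values) / len(values)
```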

Confidential, Warren, NJ

Senior JAVA Developer

Environment: JDK 1.5, J2EE 1.4, Agile Development Process, Struts 1.3, Spring 2.0, Web Services (JAX-WS, Axis 2), Hibernate 3.0, RSA, JMS, JSP, Servlets 2.5, WebSphere 6.1, SQL Server 2005, Windows XP, HTML, XML, IBM Rational Application Developer (RAD), Ant 1.6, Log4j, XSLT, XSD, jQuery, JavaScript, Ext JS, JUnit 3.8, SVN

Responsibilities:

  • Developed the application using the Struts framework, which leverages the classical Model-View-Controller (MVC) architecture; used UML diagrams such as use case, class, interaction (sequence and collaboration), and activity diagrams
  • Worked in an Agile work environment with Content Management system for workflow management and content versioning
  • Involved in designing user screens and validations using HTML, jQuery, Ext JS and JSP as per user requirements
  • Responsible for validation of Client interface JSP pages using Struts form validations
  • Integrated Struts with Spring IoC
  • Used Spring dependency injection to provide loose coupling between layers
  • Implemented the Web Service client for the login authentication, credit reports and applicant information using Apache Axis 2 Web Service
  • Used the Hibernate ORM framework with the Spring framework for data persistence and transaction management
  • Used the Hibernate 3.0 object-relational mapping framework to persist and retrieve data from the database
  • Wrote SQL queries, stored procedures, and triggers to perform back-end database operations
  • Developed Ant scripts for compilation, packaging, and deployment on the WebSphere server
  • Implemented the logging mechanism using Log4j framework
  • Wrote test cases in JUnit for unit testing of classes

Confidential, Lynchburg, VA

Java Developer

Environment: J2EE, JSP, SOAP, Java Script, Servlet, JDBC, SQL, UNIX and JUnit

Responsibilities:

  • Developed web components using JSP, Servlets and JDBC
  • Designed tables and indexes
  • Designed, implemented, tested, and deployed Enterprise JavaBeans (both session and entity beans) using WebLogic as the application server
  • Developed stored procedures, packages, and database triggers to enforce data integrity; performed data analysis and created Crystal Reports to meet user requirements
  • Provided quick turnaround, resolving issues within the SLA
  • Implemented the presentation layer with HTML, XHTML and JavaScript
  • Used EJBs to develop business logic and coded reusable components in Java Beans
  • Developed database interaction code against the JDBC API, making extensive use of SQL query statements and advanced PreparedStatements
  • Used connection pooling through the JDBC interface for optimal performance
  • Used EJB entity and session beans to implement business logic, session handling, and transactions; developed the user interface using JSP, Servlets, and JavaScript
  • Wrote complex SQL queries and stored procedures
  • Actively involved in the system testing
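The PreparedStatement usage above is the standard parameterized-query pattern. For a self-contained illustration it is sketched here with Python's built-in sqlite3 module rather than JDBC; the table and data are hypothetical.

```python
import sqlite3

# In-memory database standing in for the project's Oracle/SQL Server backend.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (id INTEGER, owner TEXT)")
conn.executemany("INSERT INTO accounts VALUES (?, ?)", [(1, "alice"), (2, "bob")])

# The '?' placeholder keeps user input out of the SQL text -- the same
# role the parameters of a JDBC PreparedStatement play.
owner = "alice"
rows = conn.execute("SELECT id FROM accounts WHERE owner = ?", (owner,)).fetchall()
```

Beyond safety, reusing a prepared statement lets the database cache the query plan, which is the optimization the bullets above rely on.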

Confidential

Java Developer

Environment: J2EE, JSP, PL/SQL, HTML, CSS, Struts, JUnit

Responsibilities:

  • Involved in the complete software development life cycle (SDLC) of the application, from requirement analysis to testing
  • Developed the modules based on the Struts MVC architecture
  • Developed the UI using JavaScript, JSP, HTML, and CSS for interactive cross-browser functionality and complex user interfaces
  • Created Business Logic using Servlets, Session beans and deployed them on WebLogic server
  • Used the Struts MVC framework for application design
  • Created complex SQL Queries, PL/SQL Stored procedures, Functions for back end
  • Prepared the Functional, Design and Test case specifications
  • Involved in writing Stored Procedures in Oracle to do some database side validations
  • Performed unit testing, system testing and integration testing
  • Developed unit test cases and used JUnit for unit testing of the application
  • Provided technical support for production environments: resolved issues, analyzed defects, and implemented fixes; resolved higher-priority defects per the schedule
