
Hadoop Developer Resume


Atlanta, GA

SUMMARY

  • 7 years of software development experience with strong knowledge of data structures, multi-threading, and distributed computing, including two years of Hadoop Developer experience
  • Configured and maintained Hadoop clusters on Amazon EC2, Microsoft Azure, and local Linux machines
  • Developed Apache Pig scripts for extract-transform-load (ETL) data pipelines, integrated the data, and processed terabytes of online advertising data using Hive Query Language
  • Developed User Defined Functions (UDFs) for Apache Pig and Hive using Python and Java
  • Exported relational databases into the Hadoop Distributed File System (HDFS) using Apache Sqoop and experimented with Apache Flume to capture log data
  • Deployed Apache Oozie to schedule MapReduce jobs
  • Developed scalable applications using NoSQL databases including HBase, MongoDB, and DynamoDB
  • Installed and configured Hadoop clusters in single-node, pseudo-distributed, and multi-node modes using Apache, Cloudera, and Hortonworks distributions
  • Applied machine learning techniques to cluster data, analyze sentiment, and generate recommendations, and visualized data using Tableau
  • Contributed to an open-source project, Parallel Cartographic Modeling Language, which processes huge map datasets by parallelizing operations across cluster nodes
  • An outstanding problem solver, able to quickly grasp complex problems and identify opportunities for improvement and resolution of critical issues
  • Willing to learn new technologies and apply them to solve complex computing problems
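For illustration, a Hive or Pig UDF of the kind described above can be exposed to Hadoop Streaming as a small Python script. This is a minimal sketch: the `extract_domain` helper and the `clicks` table named in the comment are hypothetical, not taken from any specific project.

```python
import sys

def extract_domain(url):
    """Strip the scheme and path from a URL, returning just the host."""
    host = url.split("://", 1)[-1]   # drop "http://" / "https://" if present
    return host.split("/", 1)[0]     # drop any path component

if __name__ == "__main__":
    # Hive streams rows through this script's stdin/stdout, e.g.:
    #   SELECT TRANSFORM(url) USING 'python extract_domain.py' AS domain FROM clicks;
    for line in sys.stdin:
        print(extract_domain(line.strip()))
```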

TECHNICAL SKILLS

Hadoop: MapReduce, Hortonworks Tez, Pig, Hive, HBase and Oozie

Java Technologies: Core Java, Servlets, JSP, JDBC, JNDI and Java Beans

IDEs: Eclipse and NetBeans

NoSQL Databases: HBase, MongoDB, DynamoDB and Microsoft DocumentDB

Frameworks: MVC, Struts, Hibernate and Spring

Programming languages: C, C++, Java, Python and Linux shell scripts

Databases: Oracle 11g/10g/9i, MySQL, DB2 and MS-SQL Server

Web Servers: WebLogic, WebSphere and Apache Tomcat

Web Technologies: HTML, CSS3, JavaScript and jQuery

PROFESSIONAL EXPERIENCE

Confidential, Atlanta, GA

Hadoop Developer

Environment: Apache Hadoop, HDFS, Hive, Map Reduce, Java, Flume, Cloudera, Oozie, MySQL, UNIX, Core Java

Responsibilities:

  • Evaluated the suitability of Hadoop and its ecosystem for the project and implemented various proof-of-concept (POC) applications before adopting them as part of the Big Data Hadoop initiative
  • Estimated software and hardware requirements for the NameNode and DataNodes and planned the cluster
  • Extracted the needed data from the server into HDFS and bulk-loaded the cleaned data into HBase
  • Wrote MapReduce programs and Hive UDFs in Java where the required functionality was too complex for plain Hive queries
  • Loaded data from the Linux file system into HDFS
  • Developed Hive queries to analyze and categorize different items
  • Designed and created Hive external tables using a shared metastore instead of the default Derby store, with partitioning, dynamic partitioning, and buckets
  • Delivered a POC of Flume to handle real-time log processing for attribution reports
  • Performed sentiment analysis on reviews of products on the client's website
  • Exported the resulting sentiment analysis data to Tableau to create dashboards
  • Used JUnit for unit testing of MapReduce code
  • Maintained system integrity of all sub-components (primarily HDFS, MapReduce, HBase, and Hive)
  • Reviewed peers' Hive table creation, data loading, and queries
  • Monitored system health and logs and responded to any warning or failure conditions
  • Managed test data coming from different sources
  • Scheduled the Oozie workflow engine to run multiple Hive and Pig jobs
  • Held weekly meetings with technical collaborators and actively participated in code review sessions with senior and junior developers
  • Created and maintained technical documentation for launching Hadoop clusters and for executing Hive queries and Pig scripts
  • Involved in unit testing, interface testing, system testing, and user acceptance testing of the workflow tool
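The sentiment-analysis step above can be sketched as a Hadoop Streaming mapper in Python. The word lists and the tab-separated `product_id`/`review` input format are illustrative assumptions, not the actual project code:

```python
import sys

# Illustrative word lists; a real job would load a fuller sentiment lexicon.
POSITIVE = {"great", "excellent", "love", "good"}
NEGATIVE = {"poor", "terrible", "hate", "bad"}

def score_review(text):
    """Naive sentiment score: +1 per positive word, -1 per negative word."""
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

if __name__ == "__main__":
    # Streaming mapper: read "<product_id>\t<review>" lines, emit "<product_id>\t<score>"
    for line in sys.stdin:
        product_id, review = line.rstrip("\n").split("\t", 1)
        print(f"{product_id}\t{score_review(review)}")
```

A reducer would then average or sum the per-product scores before export to Tableau.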

Confidential, Memphis, TN

Hadoop/Java Developer

Environment: Apache Hadoop, Core Java, Hive, Ubuntu, Eclipse, Cloudera, Sqoop, J2EE, JUnit, HTML, JSP

Responsibilities:

  • Designed application components using the Java Collections framework and used multithreading to provide concurrent database access
  • Involved in analysis, design, construction, and testing of the application
  • Developed the web tier using JSP to show details and summaries
  • Designed and developed the UI using JSP, HTML, and JavaScript
  • Utilized JPA for object/relational mapping for transparent persistence to the SQL Server database
  • Used the Tomcat web server for development purposes
  • Created test cases for JUnit testing
  • Used Oracle as the database and Toad for query execution; wrote SQL scripts and PL/SQL code for procedures and functions
  • Used CVS for version control
  • Developed the application using Eclipse and used Maven as the build and deployment tool
  • Explored and used Hadoop ecosystem features and architecture
  • Met with the business team to gather requirements
  • Migrated data from the staging database into HDFS using Sqoop
  • Wrote custom MapReduce code, generated JAR files for user-defined functions, and integrated them with Hive to help the analysis team with statistical analysis
  • Loaded and transformed large sets of structured, semi-structured, and unstructured data into HDFS
  • Created Hive tables, loaded them with data, and wrote Hive queries that run internally as MapReduce jobs
  • Held weekly meetings with technical collaborators and actively participated in code review sessions with senior and junior developers
  • Installed and configured the Hadoop development cluster
  • Involved in unit testing, interface testing, system testing, and user acceptance testing of the workflow tool
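The custom MapReduce code mentioned above follows the standard map/shuffle/reduce shape, which can be sketched in Python with a tiny in-process driver standing in for Hadoop. The `run_local` driver and the word-count example are illustrative, not the project's actual job:

```python
from collections import defaultdict

def map_phase(line):
    """Map step: emit (word, 1) for every word in an input line."""
    for word in line.lower().split():
        yield word, 1

def reduce_phase(word, counts):
    """Reduce step: sum the counts gathered for one word."""
    return word, sum(counts)

def run_local(lines):
    """In-process driver simulating Hadoop's shuffle-and-sort between phases."""
    grouped = defaultdict(list)
    for line in lines:
        for word, count in map_phase(line):
            grouped[word].append(count)
    return dict(reduce_phase(w, c) for w, c in sorted(grouped.items()))
```

On a real cluster the same map and reduce logic would run in parallel across DataNodes, with the framework handling grouping by key.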

Confidential, Warren, NJ

Senior JAVA Developer

Environment: JDK 1.5, J2EE 1.4, Agile Development Process, Struts 1.3, Spring 2.0, Web Services (JAX-WS, Axis2), Hibernate 3.0, RSA, JMS, JSP, Servlets 2.5, WebSphere 6.1, SQL Server 2005, Windows XP, HTML, XML, IBM Rational Application Developer (RAD), Ant 1.6, Log4j, XSLT, XSD, jQuery, JavaScript, Ext JS, JUnit 3.8, SVN

Responsibilities:

  • Developed the application using the Struts framework, which leverages the classical Model-View-Controller (MVC) architecture; used UML diagrams such as use cases, class diagrams, interaction diagrams (sequence and collaboration), and activity diagrams
  • Worked in an Agile environment with a Content Management System for workflow management and content versioning
  • Designed user screens and validations using HTML, jQuery, Ext JS, and JSP as per user requirements
  • Validated client-interface JSP pages using Struts form validations
  • Integrated Struts with Spring IoC
  • Used Spring dependency injection to provide loose coupling between layers
  • Implemented the web service client for login authentication, credit reports, and applicant information using Apache Axis2
  • Used the Hibernate ORM framework with the Spring framework for data persistence and transaction management
  • Used the Hibernate 3.0 object-relational mapping framework to persist and retrieve data from the database
  • Wrote SQL queries, stored procedures, and triggers to perform back-end database operations
  • Developed Ant scripts for compilation, packaging, and deployment to the WebSphere server
  • Implemented the logging mechanism using the Log4j framework
  • Wrote JUnit test cases for unit testing of classes

Confidential, Lynchburg,VA

Java Developer

Environment: J2EE, JSP, SOAP, Java Script, Servlet, JDBC, SQL, UNIX and JUnit

Responsibilities:

  • Developed web components using JSP, Servlets, and JDBC
  • Designed tables and indexes
  • Designed, implemented, tested, and deployed Enterprise JavaBeans, both session and entity, using WebLogic as the application server
  • Developed stored procedures, packages, and database triggers to enforce data integrity; performed data analysis and created Crystal Reports per user requirements
  • Provided quick turnaround, resolving issues within the SLA
  • Implemented the presentation layer with HTML, XHTML, and JavaScript
  • Used EJBs to develop business logic and coded reusable components in JavaBeans
  • Developed database interaction code using the JDBC API, making extensive use of SQL query statements and advanced PreparedStatements
  • Used connection pooling through the JDBC interface for optimization
  • Used EJB entity and session beans to implement business logic, session handling, and transactions; developed user interfaces using JSP, Servlets, and JavaScript
  • Wrote complex SQL queries and stored procedures
  • Actively involved in system testing

Confidential

Java Developer

Environment: J2EE, JSP, PL/SQL, HTML, CSS, Struts, JUnit

Responsibilities:

  • Involved in the complete software development life cycle (SDLC) of the application, from requirement analysis to testing
  • Developed modules based on the Struts MVC architecture
  • Developed the UI using JavaScript, JSP, HTML, and CSS for interactive cross-browser functionality and a complex user interface
  • Created business logic using Servlets and session beans and deployed them on the WebLogic server
  • Used the Struts MVC framework for application design
  • Created complex SQL queries and PL/SQL stored procedures and functions for the back end
  • Prepared the functional, design, and test case specifications
  • Wrote stored procedures in Oracle to perform database-side validations
  • Performed unit testing, system testing, and integration testing
  • Developed unit test cases using JUnit
  • Provided technical support for production environments: resolved issues, analyzed defects, and implemented solutions; resolved high-priority defects per the schedule
