
Hadoop Developer Resume


Melville, NY

SUMMARY

  • 7+ years of overall IT experience across a variety of industries, including 2 years of hands-on experience in Big Data technologies and 4+ years of extensive experience in Java.
  • In-depth understanding of Hadoop architecture and its components, including HDFS, JobTracker, TaskTracker, NameNode, DataNode, and MapReduce concepts; experienced in writing MapReduce programs with Apache Hadoop to analyze large data sets efficiently.
  • Hands-on experience with ecosystem tools such as Hive, Pig, Sqoop, MapReduce, Flume, and Oozie. Strong knowledge of Hive and Pig analytical functions; extended Hive and Pig core functionality by writing custom UDFs.
  • Experience in importing and exporting terabytes of data between HDFS and relational database systems using Sqoop.
  • Knowledge of job workflow scheduling and monitoring tools such as Oozie and ZooKeeper; of NoSQL databases such as HBase and Cassandra; and of administrative tasks such as installing Hadoop, commissioning and decommissioning nodes, and managing ecosystem components such as Flume, Oozie, Hive, and Pig.
  • Experience in the design, development, and testing of distributed, Internet/intranet/e-commerce, client/server, and database applications, mainly using Java, EJB, Servlets, JDBC, JSP, Struts, Hibernate, Spring, and JavaScript on WebLogic and Apache Tomcat web/application servers, with Oracle and SQL Server databases on Unix and Windows NT platforms.
  • Extensive work experience in Object-Oriented Analysis and Design and Java/J2EE technologies, including HTML, XHTML, DHTML, JavaScript, JSTL, CSS, AJAX, and Oracle, for developing server-side applications and user interfaces.
  • Experience in developing middle-tier components of distributed transaction management systems using Java. Good understanding of XML methodologies (XML, XSL, XSD), including Web Services and SOAP.
  • Extensive experience working with relational databases such as Oracle, IBM DB2, SQL Server, and MySQL, and writing stored procedures, functions, joins, and triggers for different data models.
  • Handled several techno-functional responsibilities including estimates, identifying functional and technical gaps, requirements gathering, designing solutions, development, developing documentation, and production support.
  • An individual with excellent interpersonal and communication skills, strong business acumen, creative problem solving skills, technical competency, team-player spirit, and leadership skills.

TECHNICAL SKILLS

Database: DB2, MySQL, Oracle, MS SQL Server, IMS/DB

Languages: Core Java, Pig Latin, SQL, HiveQL, Shell scripting, and XML

APIs/Tools: Mahout, Eclipse, Log4j, SVN, Maven

Web Technologies: HTML, XML, JavaScript

Big Data Ecosystem: HDFS, Pig, MapReduce, Hive, Sqoop, Flume, HBase

Operating Systems: Unix, Linux, Windows XP, IBM z/OS

BI/ETL Tools: Tableau, Talend

PROFESSIONAL EXPERIENCE

Confidential, Melville, NY

Hadoop Developer

Responsibilities:

  • Analyzed large data sets by running Hive queries and Pig scripts
  • Worked with the Data Science team to gather requirements for various data mining projects
  • Involved in creating Hive tables and in loading and analyzing data using Hive queries
  • Developed simple to complex MapReduce jobs using Hive and Pig
  • Involved in running Hadoop jobs to process millions of records of text data
  • Worked with application teams to install operating system and Hadoop updates, patches, and version upgrades as required
  • Developed multiple MapReduce jobs in Java for data cleaning and preprocessing
  • Involved in loading data from the Linux file system to HDFS
  • Responsible for managing data from multiple sources
  • Extracted data from CouchDB using Sqoop, loaded it into HDFS, and processed it
  • Experienced in running Hadoop streaming jobs to process terabytes of XML-format data
  • Loaded and transformed large sets of structured, semi-structured, and unstructured data
  • Assisted in exporting analyzed data to relational databases using Sqoop
  • Created and maintained technical documentation for launching Hadoop clusters and for executing Hive queries and Pig scripts

Environment: Hadoop, HDFS, Pig, Hive, MapReduce, Sqoop, LINUX, and Big Data
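The "data cleaning and preprocessing" MapReduce work above typically boils down to record normalization applied inside a mapper before key/value pairs are emitted. A minimal sketch of that logic in plain Java (class and method names are illustrative, not from the original project, and the Hadoop framework plumbing is omitted so the example stays self-contained):

```java
// Illustrative record-cleaning helper of the kind a MapReduce mapper
// would call on each raw input line before emitting output.
public class RecordCleaner {

    // Normalize a raw text record: strip control characters, trim,
    // collapse runs of whitespace, and lowercase.
    public static String clean(String raw) {
        if (raw == null) {
            return "";
        }
        return raw.replaceAll("\\p{Cntrl}", " ")  // strip tabs, control chars
                  .trim()
                  .replaceAll("\\s+", " ")        // collapse whitespace runs
                  .toLowerCase();
    }

    // A mapper would typically skip records that are empty after cleaning.
    public static boolean isUsable(String raw) {
        return !clean(raw).isEmpty();
    }
}
```

In a real job, `clean` would be called from the `map()` method on each input value, with the Hadoop `Mapper` classes supplying the surrounding iteration and I/O.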

Confidential, East Hartford, CT

Hadoop Developer

Responsibilities:

  • Worked on analyzing the Hadoop cluster using different big data analytic tools, including Pig, Hive, and MapReduce
  • Collected and aggregated large amounts of log data using Apache Flume and staged the data in HDFS for further analysis
  • Worked on debugging and performance tuning of Hive and Pig jobs
  • Created HBase tables to store various formats of PII data coming from different portfolios
  • Implemented test scripts to support test-driven development and continuous integration
  • Worked on tuning the performance of Pig queries
  • Involved in loading data from the Linux file system to HDFS
  • Imported and exported data into HDFS and Hive using Sqoop
  • Experienced in processing unstructured data using Pig and Hive
  • Supported MapReduce programs running on the cluster
  • Gained experience in managing and reviewing Hadoop log files
  • Involved in scheduling the Oozie workflow engine to run multiple Hive and Pig jobs
  • Created and maintained technical documentation for launching Hadoop clusters and for executing Hive queries and Pig scripts

Environment: Hadoop, HDFS, Pig, Hive, MapReduce, Sqoop, Oozie, LINUX, and Big Data

Confidential, St. Louis, MO

Sr. Java Developer

Responsibilities:

  • Created design documents and reviewed them with the team, in addition to assisting the business analyst/project manager in explaining them to the line of business.
  • Responsible for understanding the scope of the project and requirement gathering.
  • Involved in analysis, design, construction and testing of the application
  • Developed the web tier using JSP to show account details and summary.
  • Designed and developed the UI using JSP, HTML, CSS and JavaScript.
  • Utilized JPA for Object/Relational Mapping purposes for transparent persistence onto the SQL Server database.
  • Used Tomcat web server for development purpose.
  • Involved in creation of Test Cases for JUnit Testing.
  • Used Oracle as the database and Toad for query execution; wrote SQL scripts and PL/SQL code for procedures and functions.
  • Used CVS for version control.
  • Developed the application using Eclipse and used Maven for build and deployment.
  • Used Log4j to write debug, info, warning, and error messages to the server console.

Environment: Java, J2EE Servlet, JSP, JUnit, AJAX, XML, JavaScript, Log4j, CVS, Maven, Eclipse, Apache Tomcat, and Oracle.
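The leveled logging mentioned above (debug, info, warning, error) can be sketched as follows. The project used Log4j; the JDK's built-in java.util.logging stands in here so the example needs no external dependency, and the messages are captured in memory rather than printed, which makes the level setup easy to verify. All names are illustrative:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.logging.Handler;
import java.util.logging.Level;
import java.util.logging.LogRecord;
import java.util.logging.Logger;

// Sketch of leveled logging using java.util.logging as a stand-in for
// Log4j; the debug/info/warn/error scheme is the same idea.
public class AppLogging {

    // Runs a short logging session and returns the captured messages.
    public static List<String> demo() {
        final List<String> captured = new ArrayList<>();
        Logger log = Logger.getLogger("app.demo");
        Handler memory = new Handler() {
            @Override public void publish(LogRecord r) {
                captured.add(r.getLevel() + ": " + r.getMessage());
            }
            @Override public void flush() {}
            @Override public void close() {}
        };
        log.setUseParentHandlers(false);    // keep output out of the console
        log.setLevel(Level.FINE);           // FINE roughly matches Log4j DEBUG
        memory.setLevel(Level.FINE);
        log.addHandler(memory);

        log.fine("debugging detail");       // ~ log.debug(...)
        log.info("normal progress");        // ~ log.info(...)
        log.warning("something looks off"); // ~ log.warn(...)
        log.severe("operation failed");     // ~ log.error(...)

        log.removeHandler(memory);
        return captured;
    }
}
```

With Log4j the same structure would use a `ConsoleAppender` configured in `log4j.properties` instead of the in-memory handler.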

Confidential

Sr. Java Developer

Responsibilities:

  • Developed web components using JSP, Servlets and JDBC
  • Designed tables and indexes
  • Designed, Implemented, Tested and Deployed Enterprise Java Beans both Session and Entity using WebLogic as Application Server
  • Developed stored procedures, packages and database triggers to enforce data integrity. Performed data analysis and created crystal reports for user requirements
  • Provided quick turnaround, resolving issues within the SLA.
  • Implemented the presentation layer with HTML, XHTML, and JavaScript
  • Used EJBs to develop business logic and coded reusable components in JavaBeans
  • Developed database-interaction code against the JDBC API, making extensive use of SQL query statements and advanced prepared statements.
  • Used connection pooling for optimization through the JDBC interface
  • Used EJB entity and session beans to implement business logic, session handling, and transactions. Developed the user interface using JSP, Servlets, and JavaScript
  • Wrote complex SQL queries and stored procedures
  • Actively involved in system testing
  • Prepared the installation, customer, and configuration guides delivered to the customer along with the product

Environment: Windows NT/2000/2003/XP, Windows 7/8, C, Java, UNIX, SQL using TOAD, Finacle Core Banking, CRM 10209, Microsoft Office Suite, Microsoft Project
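The connection pooling used with JDBC in this role follows a simple pattern: a fixed set of reusable resources handed out and returned through a blocking queue, so connections are recycled instead of reopened per request. A minimal, self-contained sketch of that idea (a real JDBC pool would hold `java.sql.Connection` objects, usually behind a `DataSource`; plain Strings stand in here, and all names are illustrative):

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.TimeUnit;

// Illustrative fixed-size resource pool demonstrating the
// connection-pooling pattern.
public class SimplePool {
    private final BlockingQueue<String> idle;

    public SimplePool(int size) {
        idle = new ArrayBlockingQueue<>(size);
        for (int i = 0; i < size; i++) {
            idle.add("conn-" + i);   // stand-in for an opened connection
        }
    }

    // Borrow a connection, waiting up to timeoutMs; returns null if
    // none becomes free in time.
    public String borrow(long timeoutMs) {
        try {
            return idle.poll(timeoutMs, TimeUnit.MILLISECONDS);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            return null;
        }
    }

    // Return a connection to the pool for reuse instead of closing it.
    public void release(String conn) {
        idle.offer(conn);
    }

    public int available() {
        return idle.size();
    }
}
```

The blocking queue gives thread-safe checkout/return for free, which is why production pools are built on similar primitives.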

Confidential

Java Developer

Responsibilities:

  • Involved in requirement gathering and analysis for the project
  • Designed the functional specifications and architecture of the web-based module using Java technologies.
  • Created design specifications using UML class diagrams and sequence and activity diagrams
  • Developed the web application using MVC architecture, Java, JSP, Servlets, and an Oracle database.
  • Developed various Java classes, SQL queries, and procedures to retrieve and manipulate data from the backend Oracle database using JDBC.
  • Worked extensively with JavaScript for front-end validation.
  • Analyzed business requirements and developed the system architecture document for the enhancement project.
  • Designed and developed applications on a Service-Oriented Architecture (SOA)
  • Created UML diagrams (use cases, class diagrams, activity diagrams, component diagrams, etc.) using Visio
  • Provided impact analysis and test cases.
  • Delivered code within the timeline and logged bugs/fixes in TechOnline, the tracking system
  • Developed unit and functional test cases for testing the web application.

Environment: Windows NT/2000/2003/XP, Windows 7/8, C, Java, UNIX, and SQL

Confidential

Systems Engineer

Responsibilities:

  • Involved in the design and development phases of Rational Unified Process (RUP).
  • Designed class diagrams, sequence diagrams, and object diagrams using IBM Rational Rose for modeling.
  • Built the application on MVC architecture, with JSP 1.2 as the presentation layer and Servlets as controllers; developed action classes and form beans using the Jakarta Struts 1.1 framework and used the Struts Validation Framework to validate front-end forms.
  • Extensively used XML Web Services for transferring/retrieving data between different providers.
  • Developed complete Business tier with Session beans and CMP Entity beans with EJB 2.0 standards using JMS Queue communication in authorization module.
  • Designed and implemented Business Delegate, Session Facade and DTO Design Patterns
  • Involved in implementing the DAO pattern
  • Used JAXB API to bind XML Schema to java classes
  • Generated reports from the database using PL/SQL
  • Used Maven for building the enterprise application modules
  • Used Log4J to monitor the error logs
  • Used JUnit for unit testing
  • Used SVN for Version control
  • Deployed the applications on WebLogic Application Server.

Environment: Struts 1.1, EJB 2.0, Servlets 2.3, JSP 1.2, SQL, XML, XSLT, Web Services, JAXB, SOAP, WSDL, JMS1.1, JavaScript, TDD, JDBC, Oracle 9i, PL/SQL, Log4J, JUnit, WebLogic, Eclipse, Rational XDE, SVN, Linux
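The DTO and DAO patterns used in this role can be illustrated with a minimal pair: the DTO is a flat carrier object passed between tiers, and the DAO hides the persistence mechanism behind a narrow interface. In the real system the DAO would talk to Oracle through JDBC or CMP entity beans; an in-memory map stands in here so the sketch is runnable, and all names are illustrative:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;

// Illustrative DTO + DAO pair demonstrating the two patterns.
public class DaoPatternDemo {

    // Data Transfer Object: just state, no behavior.
    public static final class AccountDto {
        public final String id;
        public final double balance;

        public AccountDto(String id, double balance) {
            this.id = id;
            this.balance = balance;
        }
    }

    // Data Access Object interface: callers never see storage details.
    public interface AccountDao {
        Optional<AccountDto> findById(String id);
        void save(AccountDto account);
    }

    // In-memory implementation standing in for the JDBC/entity-bean one.
    public static final class InMemoryAccountDao implements AccountDao {
        private final Map<String, AccountDto> rows = new HashMap<>();

        @Override
        public Optional<AccountDto> findById(String id) {
            return Optional.ofNullable(rows.get(id));
        }

        @Override
        public void save(AccountDto account) {
            rows.put(account.id, account);
        }
    }
}
```

Because callers depend only on the `AccountDao` interface, the storage backend can be swapped (in-memory, JDBC, entity beans) without touching business code, which is the point of the pattern.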
