Hadoop Developer Resume
Alpharetta, GA
SUMMARY
- Overall 6 years of professional IT experience, with 4+ years in analysis, architectural design, prototyping, development, integration, and testing of applications using Java/J2EE and Endur technologies, and 2 years in Big Data analytics as a Hadoop Developer.
- 2 years of experience as a Hadoop Developer with good knowledge of Hadoop ecosystem technologies.
- Experience in developing MapReduce programs using Apache Hadoop to analyze big data per requirements.
- Experienced with major Hadoop ecosystem projects such as Pig, Hive, and HBase.
- Extensive experience in developing Pig Latin scripts and using Hive Query Language for data analytics.
- Good working experience using Sqoop to import data into HDFS from RDBMS and vice versa
- Good knowledge in using job scheduling and monitoring tools like Oozie and ZooKeeper
- Experience in Hadoop administration activities such as installation and configuration of clusters using Apache and Cloudera
- Developed UML diagrams for object-oriented design (use cases, sequence diagrams, and class diagrams) using Rational Rose, Visual Paradigm, and Visio
- Good working experience using different Spring modules, such as the Spring Core Container, Application Context, MVC Framework, and ORM modules, in web applications
- Used Hibernate and JPA persistence services for object-relational mapping with the database; configured XML mapping files and integrated them with other frameworks such as Spring and Struts
- Working knowledge of databases such as Oracle 8i/9i/10g
- Strong experience in database design, writing complex SQL Queries and Stored Procedures
- Have extensive experience in building and deploying applications on Web/Application Servers like Weblogic, Websphere, and Tomcat
- Strong work ethic with desire to succeed and make significant contributions to the organization
- Strong problem solving skills, good communication, interpersonal skills and a good team player
- Have the motivation to take independent responsibility as well as ability to contribute and be a productive team member
TECHNICAL SKILLS
Hadoop/Big Data Technologies: HDFS, MapReduce, Hive, Pig, Sqoop, Flume, Hbase, Oozie, Zookeeper, Kerberos
Programming Languages: Java (JDK 1.4/1.5/1.6), C/C++, HTML, SQL
Frameworks: Hibernate 2.x/3.x, Spring 2.x/3.x, Struts 1.x/2.x
Web Services: Apache CXF/XFire, Apache Axis
Client Technologies: jQuery, JavaScript, AJAX, CSS, HTML 5, XHTML
Operating Systems: UNIX, Windows, LINUX
Application Servers: IBM WebSphere, Tomcat, WebLogic
Web technologies: JSP, Servlets, JNDI, JDBC, JavaBeans, JavaScript, Web Services (JAX-WS)
Databases: Oracle 8i/9i/10g & MySQL 4.x/5.x
Java IDE: Eclipse 3.x, IBM WebSphere Application Developer, IBM RAD 7.0
Tools: SQL Developer, Visio, Rational Rose
PROFESSIONAL EXPERIENCE
Confidential, Alpharetta, GA
Hadoop Developer
Responsibilities:
- Responsible for building scalable distributed data solutions using Hadoop
- Installed and configured Hive, Pig, Sqoop, Flume and Oozie on the Hadoop cluster
- Setup and benchmarked Hadoop/HBase clusters for internal use
- Developed simple to complex MapReduce jobs using Hive and Pig
- Optimized MapReduce jobs to use HDFS efficiently by using various compression mechanisms
- Handled importing data from various data sources, performed transformations using Hive and MapReduce, loaded data into HDFS, and extracted data from MySQL into HDFS using Sqoop
- Analyzed the data by performing Hive queries and running Pig scripts to study customer behavior
- Implemented business logic by writing UDFs in Java and used various UDFs from Piggybank and other sources.
- Continuously monitored and managed the Hadoop cluster using Cloudera Manager
- Worked with application teams to install operating system and Hadoop updates, patches, and version upgrades as required
- Installed Oozie workflow engine to run multiple Hive and Pig jobs
- Exported the analyzed data to the relational databases using Sqoop for visualization and to generate reports for the BI team
Environment: Hadoop, MapReduce, HDFS, Hive, Pig, Java (JDK 1.6), SQL, Cloudera Manager, Sqoop, Flume, Oozie, Eclipse
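The MapReduce development described above can be illustrated with a minimal word-count-style sketch in plain Java. The Hadoop Mapper/Reducer API is deliberately omitted so the core map-and-reduce logic stays self-contained; the class and method names here are illustrative, not taken from the actual project.

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Plain-Java sketch of word-count MapReduce logic: the "map" phase
// tokenizes each line into (word, 1) pairs, and the "reduce" phase
// sums the counts per word. In a real Hadoop job these steps would
// live in Mapper and Reducer subclasses; collapsing them into one
// pass keeps the logic testable without a cluster.
public class WordCountSketch {

    public static Map<String, Integer> countWords(List<String> lines) {
        Map<String, Integer> counts = new HashMap<>();
        for (String line : lines) {
            for (String token : line.toLowerCase().split("\\s+")) {
                if (token.isEmpty()) continue;
                counts.merge(token, 1, Integer::sum); // reduce step: sum per key
            }
        }
        return counts;
    }
}
```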
Confidential, Plano, TX
Hadoop developer
Responsibilities:
- Launched Amazon EC2 cloud instances using Amazon Machine Images (Linux/Ubuntu) and configured the launched instances for specific applications.
- Launched and set up Hadoop/HBase clusters, including configuring the different components of the Hadoop and HBase clusters.
- Hands-on experience loading data from the UNIX file system into HDFS.
- Loaded and transformed large sets of structured, semi-structured, and unstructured data from HBase through Sqoop into HDFS for further processing.
- Installed and configured Flume, Hive, Pig, Sqoop and Oozie on the Hadoop cluster.
- Managing and scheduling Jobs on a Hadoop cluster using Oozie.
- Involved in creating Hive tables, loading data, and running Hive queries on that data.
- Extensive working knowledge of partitioned tables, UDFs, performance tuning, compression-related properties, and the Thrift server in Hive.
- Wrote optimized Pig scripts and was involved in developing and testing Pig Latin scripts.
- Working knowledge of writing Pig Load and Store functions
Environment: Amazon EC2, Apache Hadoop 1.0.1, MapReduce, HDFS, CentOS 6.4, HBase, Hive, Pig, Oozie, Flume, Java (JDK 1.6), Eclipse
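The core job of a custom Pig Load function, as mentioned above, is parsing raw records into tuples. A minimal plain-Java sketch of that parsing step follows; the real Pig `LoadFunc` API is omitted so the example stays self-contained, and the class name is hypothetical.

```java
import java.util.Arrays;
import java.util.List;

// Sketch of the parsing logic behind a custom Pig Load function:
// split a tab-delimited line into the fields of a tuple. A real
// implementation would extend org.apache.pig.LoadFunc and emit Pig
// Tuple objects; here a List<String> stands in for the tuple.
public class TabDelimitedParser {

    public static List<String> parseLine(String line) {
        // The -1 limit keeps trailing empty fields, so empty
        // columns at the end of a record are not silently dropped.
        return Arrays.asList(line.split("\t", -1));
    }
}
```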
Confidential, NY
Java & J2EE developer
Responsibilities:
- Involved in development of business domain concepts into Use Cases, Sequence Diagrams, Class Diagrams, Component Diagrams and Implementation Diagrams.
- Implemented various J2EE Design Patterns such as Model-View-Controller, Data Access Object, Business Delegate and Transfer Object.
- Responsible for analysis and design of the application based on MVC Architecture, using open source Struts Framework.
- Involved in configuring Struts, Tiles and developing the configuration files.
- Developed Struts Action classes and Validation classes using Struts controller component and Struts validation framework.
- Developed and deployed UI layer logic using JSP, XML, JavaScript, and HTML/DHTML.
- Used Spring Framework and integrated it with Struts.
- Involved in configuring web.xml and struts-config.xml according to the Struts framework.
- Used the transaction interceptor provided by Spring for declarative transaction management.
- Managed dependencies between classes with Spring's Dependency Injection to promote loose coupling.
- Provided connections using JDBC to the database and developed SQL queries to manipulate the data.
- Developed DAOs using Spring's JdbcTemplate to run performance-intensive queries.
- Developed ANT script for auto generation and deployment of the web service.
- Wrote stored procedure and used JAVA APIs to call these procedures.
- Developed various test cases, such as unit tests, mock tests, and integration tests, using JUnit.
- Experience writing Stored Procedures, Functions, and Packages.
Environment: Java, J2EE, Struts MVC, Tiles, JDBC, JSP, JavaScript, HTML, Spring IoC, Spring AOP, JAX-WS, WebSphere Application Server, Oracle, JUnit, Eclipse
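The dependency-injection pattern described in this role can be shown without the Spring container itself: a minimal constructor-injection sketch in plain Java, where the service depends on a DAO abstraction rather than a concrete class. All class and method names here are hypothetical, chosen only to illustrate the wiring Spring automates.

```java
import java.util.Map;

// Minimal constructor-injection sketch: OrderService depends on the
// OrderDao abstraction, not a concrete class, so implementations can
// be swapped (e.g., a JdbcTemplate-backed DAO in production, an
// in-memory stub in tests). Spring's IoC container performs exactly
// this wiring; here it is done by hand to show the principle.
interface OrderDao {
    int countOrders(String customerId);
}

class InMemoryOrderDao implements OrderDao {
    private final Map<String, Integer> data;

    InMemoryOrderDao(Map<String, Integer> data) {
        this.data = data;
    }

    public int countOrders(String customerId) {
        return data.getOrDefault(customerId, 0);
    }
}

class OrderService {
    private final OrderDao dao; // injected dependency

    OrderService(OrderDao dao) {
        this.dao = dao;
    }

    boolean isRepeatCustomer(String customerId) {
        return dao.countOrders(customerId) > 1;
    }
}
```

Because `OrderService` never names a concrete DAO, the in-memory stub can stand in during unit tests with no container or database involved.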
Confidential, Boston, MA
Java /J2EE Developer
Responsibilities:
- Responsible for gathering and analyzing requirements and converting them into technical specifications
- Used Rational Rose for creating sequence and class diagrams
- Developed presentation layer using JSP, Java, HTML and JavaScript
- Used Spring Core Annotations for Dependency Injection
- Designed and developed a ‘Convention Based Coding’ utilizing Hibernate’s persistence framework and O-R mapping capability to enable dynamic fetching and displaying of various table data with JSF tag libraries
- Designed and developed the Hibernate configuration and the session-per-request design pattern for database connectivity and per-request session access. Used HQL and SQL for fetching and storing data in databases
- Participated in the design and development of database schema and Entity-Relationship diagrams of the backend Oracle database tables for the application
- Implemented web services with Apache Axis
- Designed and developed Stored Procedures and Triggers in Oracle to cater to the needs of the entire application. Developed complex SQL queries for extracting data from the database
Environment: Java, JDK 1.5, Servlets, Hibernate, Ajax, Oracle 10g, Eclipse, Apache Axis, Apache Ant, WebLogic Server, JavaScript, HTML, CSS, XML
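The session-per-request discipline mentioned in this role (open one session per unit of work, guarantee it closes even on failure) can be sketched abstractly. The `Session` and `SessionFactory` interfaces below are stand-ins for Hibernate's, kept deliberately minimal so the pattern, not the API, is the focus; all names are illustrative.

```java
import java.util.function.Function;

// Stand-in for org.hibernate.Session: only the lifecycle matters here.
interface Session extends AutoCloseable {
    void close();
}

// Stand-in for org.hibernate.SessionFactory.
interface SessionFactory {
    Session openSession();
}

class SessionPerRequest {
    // Runs one request's work inside a fresh session and always
    // closes it, mirroring what a session-per-request filter or
    // interceptor does around each web request.
    static <R> R inSession(SessionFactory factory, Function<Session, R> work) {
        Session session = factory.openSession();
        try {
            return work.apply(session);
        } finally {
            session.close(); // closed even if work throws
        }
    }
}
```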