Hadoop Developer Resume
Herndon, VA
OBJECTIVE:
Looking for a challenging position as a Big Data / Hadoop Developer where I can use my knowledge and technical and analytical skills to contribute to projects that add value to the organization.
EXPERIENCE SUMMARY:
- Cloudera Certified Hadoop Developer with around 7 years of experience in Information Technology involving Analysis, Design, Coding, Testing and Implementation. Excellent skills in state-of-the-art Client/Server computing, desktop applications and Website development.
- Around 2 years of work experience in Big Data analytics, with hands-on experience writing MapReduce jobs on the Hadoop ecosystem, including Hive and Pig.
- Excellent experience in Installing, Configuring and Testing Hadoop ecosystem components.
- About 5 years of work experience as a Java/J2EE programmer developing applications using Servlets, JSP, EJB, Struts, Spring, JSF, Java Beans, JDBC, JMS, Hibernate and MVC architecture.
- Progressive experience in all phases of the iterative Software Development Life Cycle (SDLC).
- Actively involved in Requirements Gathering, Analysis, Development, Unit Testing and Integration Testing.
- Worked on Agile Methodology projects extensively.
- Good working experience with Hadoop architecture and components such as HDFS, Job Tracker, Task Tracker, Name Node, Data Node and the MapReduce programming paradigm in the Cloudera Hadoop ecosystem.
- Hands-on experience in installing, configuring and using Hadoop ecosystem components such as Hadoop, MapReduce, HDFS, HBase, Hive, Sqoop, Pig, ZooKeeper and Flume.
- Good working experience with Apache Hadoop MapReduce programming, Pig scripting, distributed applications and HDFS.
- Hands-on experience with Sequence files, RC files, Combiners, Counters, Dynamic Partitioning and Bucketing for best practices and performance improvement.
- Good working experience with Hadoop cluster architecture and monitoring the cluster; in-depth understanding of data structures and algorithms.
- Experience in managing and reviewing Hadoop log files.
- Excellent understanding and knowledge of NoSQL databases like HBase and MongoDB.
- Experience in setting up standards and processes for Hadoop-based application design and implementation.
- Experience in importing and exporting data between HDFS and relational database systems using Sqoop.
- Extensive experience with JavaScript for client-side validations; implemented AJAX with JavaScript to reduce data-transfer overhead between user and server.
- Extensive experience working with Oracle, SQL Server and MySQL databases.
- Hands on experience in Application Development using Java and RDBMS.
- Experience in the Design and Implementation of Dynamic Web-based applications using Core Java, Struts, Java Server Faces, Hibernate and XML technologies.
- Implemented J2EE modules based on MVC Design Pattern.
- Strong knowledge in Developing Internet/Intranet Applications with JSP, XML, CSS, HTML, JavaScript.
- Worked on Enhancing and Developing JAVA applications.
- Ability to adapt to evolving technology and a strong sense of responsibility.
TECHNICAL SKILLS:
Big Data technologies: MapReduce, Hive, Pig, Sqoop, Flume and Oozie.
NoSQL databases: MongoDB, HBase.
Frameworks and ORM tools: Hibernate, JSF
Technologies: Java, J2EE, JSF, JSP, Spring, Servlets, HTML, JavaScript, jQuery.
Database: Oracle, MySQL and SQL Server.
Operating Systems: Windows and Linux
Tools & IDEs: Eclipse, RAD, CISCO EOS Platform, CVS, JIRA, EditPlus, FileZilla, Tivoli
PROFESSIONAL EXPERIENCE:
Confidential, Herndon, VA
Hadoop Developer
Responsibilities:
- Developed solutions to ingest and process data in HDFS.
- Analyzed data using MapReduce, Pig and Hive, and delivered summary results from Hadoop to downstream systems.
- Worked extensively with Hive DDL and the Hive Query Language (HQL).
- Developed UDF, UDAF and UDTF functions and used them in Hive queries.
- Implemented Sqoop for large dataset transfers between Hadoop and RDBMSs.
- Created MapReduce jobs to convert periodic XML messages into partitioned Avro data.
- Used Sqoop widely to import data from various systems/sources (such as MySQL) into HDFS.
- Created components like Hive UDFs to supply functionality missing from Hive for analytics.
- Developed scripts and batch jobs to schedule Oozie bundles (groups of coordinators).
- Exported the analyzed data to relational databases using Sqoop for visualization and to generate reports for the BI team.
- Involved in ETL, Data Integration and Migration.
- Used different file formats such as Text files, Sequence files and Avro.
- Provided cluster coordination services through ZooKeeper.
- Assisted in creating and maintaining technical documentation for launching Hadoop clusters and for executing Hive queries and Pig scripts.
- Assisted in cluster maintenance and monitoring, adding and removing cluster nodes, and troubleshooting.
- Installed and configured Hadoop, MapReduce and HDFS; developed multiple MapReduce jobs in Java for data cleaning and pre-processing.
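The MapReduce jobs above ran on a Hadoop cluster, but the core map/shuffle/reduce pattern they follow can be sketched in plain Java with no Hadoop dependencies. The class and method names below are illustrative only, not from any project listed here; a real job would extend Hadoop's Mapper and Reducer classes.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Illustrative, stand-alone sketch of the MapReduce word-count pattern.
// A real Hadoop job would extend org.apache.hadoop.mapreduce.Mapper/Reducer;
// this version only demonstrates the map -> group -> reduce flow.
public class WordCountSketch {

    // "Map" phase: emit a (word, 1) pair for each token in a line.
    static List<Map.Entry<String, Integer>> map(String line) {
        List<Map.Entry<String, Integer>> pairs = new ArrayList<>();
        for (String token : line.toLowerCase().split("\\s+")) {
            if (!token.isEmpty()) {
                pairs.add(Map.entry(token, 1));
            }
        }
        return pairs;
    }

    // "Shuffle + reduce" phase: group the pairs by key and sum the counts.
    static Map<String, Integer> reduce(List<Map.Entry<String, Integer>> pairs) {
        Map<String, Integer> counts = new HashMap<>();
        for (Map.Entry<String, Integer> pair : pairs) {
            counts.merge(pair.getKey(), pair.getValue(), Integer::sum);
        }
        return counts;
    }

    public static void main(String[] args) {
        List<Map.Entry<String, Integer>> pairs = new ArrayList<>();
        for (String line : new String[] {"big data big", "hadoop data"}) {
            pairs.addAll(map(line));
        }
        System.out.println(reduce(pairs)); // e.g. {big=2, data=2, hadoop=1}
    }
}
```

On a cluster, the grouping step is performed by Hadoop's shuffle between the map and reduce tasks rather than by an in-memory map as here.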
Solution Environment: Hadoop, MapReduce, HDFS, Sqoop, Flume, Linux, Oozie, Pig, Hive, HBase, Hadoop Cluster and Java.
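As a minimal sketch of the kind of Hive UDF work mentioned above, assuming a simple string-normalization use case: the class name, method and registration commands below are hypothetical, and a deployable UDF would extend Hive's `org.apache.hadoop.hive.ql.exec.UDF` class rather than stand alone.

```java
import java.util.Locale;

// Hypothetical evaluate() logic for a simple Hive UDF that normalizes
// free-text fields before analysis. In a real deployment this method
// would live in a class extending org.apache.hadoop.hive.ql.exec.UDF,
// packaged as a jar and registered in Hive with something like:
//   ADD JAR normalize-udf.jar;
//   CREATE TEMPORARY FUNCTION normalize_text AS 'com.example.NormalizeText';
public class NormalizeTextSketch {

    // Trim, collapse internal whitespace and lower-case the input;
    // return null for null input, as Hive UDFs conventionally do.
    public static String evaluate(String input) {
        if (input == null) {
            return null;
        }
        return input.trim().replaceAll("\\s+", " ").toLowerCase(Locale.ROOT);
    }

    public static void main(String[] args) {
        System.out.println(evaluate("  Big   DATA  ")); // prints "big data"
    }
}
```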
Confidential
Senior Systems Engineer
Responsibilities:
- Involved in all phases of the project, requirement analysis, design, coding and Unit testing.
- Used the Validator framework in developing the applications.
- Coordinated with Developers and the QA Testing Team on testing issues such as setting up builds, issuing tickets, and setting up Testing and Development environments.
- Designed the user interface of the application using HTML5, CSS3, JavaServer Faces 2.0 (JSF 2.0), JSP, JSTL, JavaScript and AJAX.
- Also developed four artist sites on Myspace.
- Was trained in Drupal, a content management system.
- Worked on the DAO layer of the application; wrote Java code to access the Hibernate session factory using Spring's Hibernate template, and wrote Hibernate search queries along with search, persistence and deletion logic for the persistent objects.
Solution Environment: Windows XP, Core JAVA, J2EE, JSP, JS, XML, Hibernate, SQL Server 2005, Eclipse, JIRA, Citrix, CISCO EOS, and Drupal.
Confidential
Systems Engineer
Responsibilities:
- Worked with the Business Analysts to analyze the required functionality.
- Involved in WebSphere Portal configuration.
- Involved in resolution document preparation for Change Requests.
- Coordinated with Developers and the QA Testing Team on testing issues such as setting up builds, issuing tickets, and setting up Testing and Development environments.
- Implemented enhancements using JSP/Servlets and Java technology.
- Responsible for development of business logic in Core Java.
- Used Eclipse to develop the applications.
- Involved in unit testing, integration testing.
- Used the JUnit testing framework for unit testing.
Solution Environment: Windows XP, Core Java, J2EE, Struts, JSP, JS, XML, Hibernate, DB2, SQL Server 2005, WAS 5, WebSphere Portal Server 5, JDK, Tivoli LDAP, IBM RAD 7.0, EditPlus, PuTTY, Citrix