
Hadoop Developer Resume


Dublin, OH

PROFESSIONAL SUMMARY:

  • Over 9 years of experience in financial, marketing and enterprise application development across diverse industries, including hands-on experience with Big Data ecosystem technologies.
  • Three years of comprehensive experience as Hadoop Developer.
  • Experience in writing Hadoop jobs for analyzing data using Hive and Pig.
  • Experience in installation, configuration, supporting and managing Hadoop clusters using Apache and Cloudera (CDH3, CDH4) distributions and on Amazon Web Services (AWS).
  • Hands-on experience with major components of the Hadoop ecosystem, including Hive, HBase, HBase-Hive integration, Pig, Sqoop, Flume, MapReduce, Spark, Kafka, Storm and Oozie.
  • Set up standards and processes for Hadoop based application design and implementation.
  • Extensive experience in working with NoSQL databases including HBase, Cassandra and MongoDB.
  • Experience in working with Map Reduce programs using Apache Hadoop for working with Big Data.
  • Experience in using Pig, Hive, Sqoop, HBase and Cloudera Manager.
  • Experience in importing and exporting data using Sqoop between HDFS and Relational Database Systems (RDBMS).
  • In-depth understanding of Hadoop architecture and its components, such as HDFS, JobTracker, TaskTracker, NameNode, DataNode and MapReduce concepts.
  • Hands-on experience in application development using Java, RDBMS and Linux shell scripting.
  • Experience extending Hive and Pig core functionality by writing custom UDFs.
  • Experience in analyzing data using HiveQL, Pig Latin and custom MapReduce programs in Java.
  • Familiarity and experience with Data warehousing and ETL tools.
  • Experienced in job workflow scheduling and monitoring with Oozie and cluster coordination with ZooKeeper.
  • Experienced in analyzing/processing data using HiveQL, Storm, Kafka, Redis, Flume, Sqoop, Pig Latin, and custom MapReduce programs in Java.
  • Knowledge of importing and exporting data using Flume and Kafka.
  • Familiarity working with popular frameworks like Struts 1.1, Hibernate 3.0, Spring IoC, Spring AOP and Spring JDBC.
  • Experience with middleware architectures using Sun Java technologies such as J2EE, JSP 2.0, Servlets 2.4, JDBC and JUnit, and application servers such as WebSphere 7.1 and WebLogic 10.3.
  • Good understanding of XML methodologies (XML, XSL, XSD), including Web Services (JAX-WS specification) and SOAP.
  • Experience in Web Services using XML, HTML and SOAP.
  • Experience in component design using UML: use case, class, sequence, deployment and component diagrams for the requirements.
  • Experience in Message based systems using JMS, TIBCO & MQ Series.
  • Experience in writing database objects such as stored procedures, triggers, SQL/PL-SQL packages and cursors for Oracle, SQL Server, DB2 and Sybase.
  • Experienced in using CVS, SVN and SharePoint for version control.
  • Proficient in unit testing applications using JUnit and MRUnit, and in application logging using Log4j.
  • Ability to blend technical expertise with strong conceptual, business and analytical skills to deliver quality solutions, with result-oriented problem solving and leadership.
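
The MapReduce concepts listed above (map, shuffle/sort, reduce) can be illustrated with a minimal word-count sketch in pure Python; this is a conceptual model of the three phases, not Hadoop API code:

```python
from collections import defaultdict

def map_phase(lines):
    # Mapper: emit a (word, 1) pair for every word, as a Hadoop Mapper would
    for line in lines:
        for word in line.lower().split():
            yield (word, 1)

def shuffle(pairs):
    # Shuffle/sort: group all values by key, as the Hadoop framework does
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    # Reducer: sum the counts for each word
    return {word: sum(counts) for word, counts in grouped.items()}

def word_count(lines):
    return reduce_phase(shuffle(map_phase(lines)))
```

In a real Hadoop job only the map and reduce logic is user code; the framework performs the shuffle and distributes mappers and reducers across the cluster.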

TECHNICAL SKILLS:

Hadoop Ecosystem: HDFS, MapReduce, Pig, Hive, HBase, Sqoop, Oozie, Flume, ZooKeeper, Kafka, Spark, Storm and Avro.

Technologies: Core Java, J2EE, Servlets, JSP, JDBC, Hibernate, Spring, XML, AJAX, SOAP, WSDL

Methodologies: Agile, UML, Design Patterns (Core Java and J2EE)

Programming Languages: Java/J2EE, Python, XML, Unix shell scripting, HTML, CSS

Databases: Cassandra, MongoDB, CouchDB, HBase, Oracle 11g/10g, DB2, MS SQL Server, MySQL, MS Access

Operating systems: Windows, Linux, Unix

Version control: SVN, CVS, Git

PROFESSIONAL EXPERIENCE:

Confidential, Dublin, OH

Hadoop Developer

Responsibilities:

  • Responsible for building scalable distributed data solutions using Hadoop.
  • Responsible for cluster maintenance, adding and removing cluster nodes, cluster monitoring and troubleshooting, manage and review data backups, manage and review Hadoop log files.
  • Installed, configured, supported and monitored Hadoop clusters using Apache and Cloudera distributions and on AWS.
  • Installed and configured Flume, Hive, Pig, Sqoop and Oozie on the Hadoop cluster.
  • Setup and benchmarked Hadoop/HBase clusters for internal use.
  • Developed simple to complex MapReduce jobs using Hive.
  • Optimized MapReduce jobs to use HDFS efficiently by applying various compression mechanisms.
  • Handled importing of data from various data sources, performed transformations using Hive and MapReduce, loaded data into HDFS and extracted data from MySQL into HDFS using Sqoop.
  • Applied data warehousing and ETL processes, with strong database, SQL and data analysis skills.
  • Analyzed the data by performing Hive queries and running Pig scripts to understand user behavior.
  • Continuously monitored and managed the Hadoop cluster through Cloudera Manager.
  • Worked with application teams to install operating system and Hadoop updates, patches and version upgrades as required.
  • Installed Oozie workflow engine to run multiple Hive and Pig jobs.
  • Exported the analyzed data to the relational databases using Sqoop for visualization and to generate reports for the BI team.
  • Configured, deployed and maintained multi-node Kafka clusters in Dev and Test environments.
  • Developed multiple Kafka producers and consumers from scratch using the low-level and high-level APIs.
  • Experienced with batch processing of data sources using Apache Spark and Elasticsearch.
  • Implemented Spark RDD transformations and actions for business analysis, and migrated HiveQL queries on structured data to Spark SQL to improve performance.
  • Implemented Storm topologies as preprocessing components before moving data from Kafka consumers to HDFS and Cassandra.
  • Configured, deployed and maintained a single-node Storm cluster in the Dev environment.
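
The Spark RDD work above centers on map and reduceByKey-style transformations; a minimal pure-Python sketch of the reduceByKey pattern (illustrative only, not the PySpark API; the clickstream pairs are a made-up example):

```python
def reduce_by_key(pairs, fn):
    # Mimics Spark's reduceByKey: fold each key's values together with fn,
    # the way a GROUP BY ... SUM(...) in HiveQL aggregates rows
    out = {}
    for key, value in pairs:
        out[key] = fn(out[key], value) if key in out else value
    return sorted(out.items())

# Hypothetical clickstream: (user_id, event_count) pairs
events = [("u1", 3), ("u2", 1), ("u1", 2)]
totals = reduce_by_key(events, lambda a, b: a + b)  # [("u1", 5), ("u2", 1)]
```

In Spark the same fold runs in parallel per partition before a shuffle merges partial results, which is why migrating HiveQL aggregations to RDD or Spark SQL code can improve performance.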

Environment: Hadoop, MapReduce, Hive, Pig, Sqoop, Avro, Spark, Kafka, Storm, Datameer, Teradata, SQL Server, IBM Mainframes, Java 7.0, Log4J, JUnit, MRUnit, SVN, JIRA.

Confidential, NYC

Hadoop Developer

Responsibilities:

  • Installed and configured Hadoop MapReduce, HDFS and developed multiple MapReduce jobs in Java for data cleansing and preprocessing.
  • Created and maintained Technical documentation for executing MapReduce, Hive queries and Pig Scripts.
  • Wrote shell scripts to automate rolling day-to-day processes.
  • Used Flume to collect, aggregate, and store the web log data from different sources like web servers, mobile and network devices and pushed to HDFS.
  • Analyzed the web log data using HiveQL to extract the number of unique visitors per day, page views, visit duration and the most purchased products on the website.
  • Developed Pig Latin scripts to extract data from the web server output files to load into HDFS.
  • Extensively used Pig for data cleansing.
  • Used Hive to analyze the partitioned and bucketed data and compute various metrics for reporting.
  • Involved in developing Hive DDLs to create, alter and drop tables.
  • Created and maintained technical documentation for launching Hadoop clusters and for executing Hive queries and Pig scripts.
  • Optimized the Cassandra cluster by making changes in Cassandra properties and Linux OS configurations.
  • Proficient working experience with NoSQL databases such as MongoDB.
  • Extracted files from CouchDB through Sqoop and placed in HDFS and processed.
  • Developed Pig program for loading and filtering the streaming data into HDFS using Flume.
  • Worked on NOSQL databases including HBase and MongoDB.
  • Worked with the Data Science team to gather requirements for various data mining projects.
  • Worked within the Agile software development lifecycle model.
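
The unique-visitors-per-day analysis above is essentially a COUNT(DISTINCT visitor) grouped by date; a pure-Python sketch of the same aggregation, assuming a hypothetical "date visitor_id url" log-line layout:

```python
from collections import defaultdict

def unique_visitors_per_day(log_lines):
    # Hypothetical log layout: "date visitor_id url" per line
    visitors = defaultdict(set)
    for line in log_lines:
        date, visitor, _url = line.split()
        visitors[date].add(visitor)  # a set deduplicates repeat visits
    return {date: len(ids) for date, ids in visitors.items()}
```

In HiveQL the equivalent query would group the partitioned log table by date and count distinct visitor ids per group.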

Environment: Hadoop, MapReduce, Hive, Pig, Sqoop, Flume, Kafka, Impala, Python, Java 7.0, XML, WSDL, SOAP, Web Services, Oracle/Informix, Log4J, JUnit, SVN.

Confidential, Horsham, PA

Java Developer

Responsibilities:

  • Developed Business logic using J2EE, JSP, Servlet and Spring Bean Classes.
  • Developed the application using Struts Framework which is based on the MVC Design Pattern.
  • Interacted with end applications and performed Business Analysis and Detailed Design of the system from Business Requirement documents.
  • Design and Development of Objects using Object Oriented Design in Java.
  • Extensively used Hibernate Query Language (HQL) and Criteria based queries to work with MySQL databases.
  • Created data-models for customer data using the Cassandra Query Language.
  • Created security certificates and established secured connections between Application Servers and Web Servers.
  • Used Spring framework, Spring-AOP, Spring-ORM, Spring-JDBC modules.
  • Worked on Sun Jersey REST framework to create web services.
  • Configured the database with hibernate.cfg.xml and database mappings with .hbm.xml files.
  • Used collection containers such as arrays and Maps.
  • Worked on database handling, multithreading, synchronization and communication.
  • Involved in troubleshooting and customer support.
  • Developed the Spring AOP Programming to configure logging for the application
  • Implemented application level persistence using Hibernate and Spring.
  • Developed user interface using JSP, AJAX, HTML, XHTML, XSLT and Java Script to simplify the complexities of the application.
  • Worked with JavaScript frameworks such as jQuery.
  • Involved in writing PL/SQL stored procedures, triggers and complex queries.
  • Developed GUI using JSP, Struts frame work.
  • Involved in developing the presentation layer using Spring MVC, AngularJS, jQuery and Bootstrap.
  • Worked with Maven and Ant builds for application building, scheduling, mailing and automation.
  • Configured Log4j and was involved in troubleshooting environmental issues.
  • Created test plans and JUnit test cases and test suite for testing the application.
  • Worked in Agile environment with active scrum participation.

Environment: Java, Servlets, JSP, Struts, Spring, Hibernate, XML, CSS, AJAX, HTML5, Rational ClearCase, Microsoft Visio, WebSphere Application Server, JavaScript, REST Web Services, Jersey, Apache CXF, jQuery, Oracle, CRUD, SQL, UML, JUnit, Maven, NetBeans, Ant, Agile.

Confidential, Alpharetta, GA

Java Developer

Responsibilities:

  • Involved in analysis, design and development of Expense Processing system.
  • Developed the application using the Struts 1.2 Framework, which leverages the classical Model-View-Controller (MVC) architecture.
  • Created user interfaces using JSP.
  • Developed the web interface using Servlets, JavaServer Pages, HTML and CSS.
  • Developed the DAO objects using JDBC.
  • Developed business services using Servlets and Java.
  • Designed and developed user interfaces and menus using HTML5, JSP and JavaScript, with client-side and server-side validations.
  • Developed GUI using JSP, Struts frame work.
  • Involved in developing the presentation layer using Spring MVC, AngularJS, jQuery and Bootstrap.
  • Involved in designing the user interfaces using Struts Tiles Framework.
  • Used Spring Framework for Dependency injection and integrated with the Struts Framework and Hibernate.
  • Used Hibernate 3.0 in data access layer to access and update information in the database.
  • Experience in SOA (Service Oriented Architecture) by creating the web services with SOAP and WSDL.
  • Developed JUnit test cases for all the developed modules.
  • Used Log4J to capture the log that includes runtime exceptions, monitored error logs and fixed the problems.
  • Used CVS for version control across common source code used by developers.
  • Used Ant scripts to build the application and deployed it on WebLogic Application Server 10.0.

Environment: Struts 1.2, Hibernate 3.0, Spring 2.5, JSP, Servlets, XML, SOAP, WSDL, JDBC, JavaScript, HTML, CVS, Log4J, JUnit, WebLogic Application Server, Eclipse, Oracle, Bootstrap, Linux.

Confidential

Java Developer

Responsibilities:

  • Participated along with a team of six members in the design and development of Mobile Technician, Service Administrator and Roving Eye modules in architecture built using J2EE Design Patterns
  • Worked with JSP, Servlets, JSF, JSTL/EL
  • Worked with JDBC and Hibernate
  • Configured and Maintained Subversion version control
  • Implemented Data Access Object, MVC design patterns
  • Experience of working in Agile Methodology, Sprint Methodology
  • Worked with RESTful web services and WSDL
  • Worked with Complex SQL queries, Functions and Stored Procedures
  • Developed Test Scripts using JUnit and JMockit
  • Implementation of presentation logic using JSPs, HTML and XML
  • Providing the Object-Relational Mapping between business objects and database using Hibernate Query Language
  • Used RESTful services to interact with the client by providing RESTful URL mappings.
  • Providing business intelligence through well-formatted, summarized reports using Jasper Reports

Environment: Windows XP, Linux, SQL Server 7.0, WebLogic 6.1, Core Java, JSP, EJB, JMS, Hibernate 1.2, Jasper Reports, MS VSS, Rational RequisitePro, Borland, JUnit, RESTful.
