
Hadoop Developer Resume


Chicago, IL

SUMMARY

  • 7+ years of professional IT experience, including Big Data and Hadoop ecosystem technologies in the Banking, Insurance, and Communication sectors.
  • Well versed in installing, configuring, supporting, and managing Big Data workloads and the underlying infrastructure of a Hadoop cluster.
  • Hands-on experience with major components of the Hadoop ecosystem, including MapReduce, HDFS, Hive, Pig, HBase, ZooKeeper, Sqoop, Oozie, Flume, and Avro.
  • Excellent understanding and knowledge of NoSQL databases such as MongoDB, HBase, and Cassandra, as well as Solr/Lucene.
  • Responsible for setting up processes for Hadoop-based application design and implementation.
  • Experience in importing and exporting data with Sqoop between HDFS and relational database systems.
  • Good understanding of NoSQL databases, with hands-on experience writing applications on Cassandra and MongoDB.
  • Good knowledge of querying data from Cassandra for searching, grouping, and sorting.
  • Good understanding of data structures and algorithms.
  • Experience in managing and reviewing Hadoop log files.
  • Very good experience with the complete project life cycle (design, development, testing, and implementation) of client-server and web applications.
  • Experience in Object-Oriented Analysis and Design (OOAD) and software development using UML; good knowledge of J2EE and Core Java design patterns.
  • Experience in managing Hadoop clusters using Cloudera Manager.
  • Hands-on experience in application development using Java, RDBMSs, and Linux shell scripting.
  • Extensive expertise in development using SQL and PL/SQL.
  • Extensive expertise working with Oracle, DB2, SQL Server, and MySQL databases.
  • Experience in Java, JSP, Servlets, EJB, WebLogic, WebSphere, Hibernate, Spring, JBoss, JDBC, RMI, JavaScript, Ajax, jQuery, XML, and HTML.
  • Knowledge of Unix and database administration.
  • Determined, committed, and hardworking, with strong communication, interpersonal, and organizational skills.
  • Ability to work in a team and to coordinate and resolve issues with developers and other stakeholders.

TECHNICAL SKILLS

Big Data/Hadoop: HDFS, Hadoop MapReduce, Zookeeper, Hive, Pig, Sqoop, Flume, Oozie.

Languages: C, C++, Java, J2EE, HTML, JavaScript, jQuery, Ajax, PHP, SQL/PL-SQL, Python, Perl, Shell Scripting

Methodologies: Agile, V-model, Waterfall model

Databases: HBase, MongoDB, Cassandra, Oracle 10g/11g, MySQL, CouchDB, MS SQL Server

Web Tools/Frameworks: HTML, JavaScript, XML, ODBC, JDBC, JavaBeans, EJB, MVC, Ajax, JSP, Servlets, Struts, JUnit, REST API, Spring, Hibernate

Web/Application Servers: Apache, Tomcat, OC4J, WebLogic, and Sun ONE Web Server.

PROFESSIONAL EXPERIENCE

Confidential, Chicago, IL

Hadoop Developer

Responsibilities:

  • Imported data from Teradata into HDFS using Sqoop incremental imports
  • Ingested data from servers through Flume agents
  • Imported data from Oracle into HDFS using Sqoop
  • Used Hive to load the datasets
  • Designed a custom SerDe for Hive
  • Performed map-side joins in Hive
  • Handled JSON datasets
  • Involved in implementing and integrating NoSQL databases such as HBase and Cassandra
  • Loaded data into HBase via HCatalog
  • Used Pig for bulk loads
  • Wrote Java code against the HBase Thrift API for accessing data in HBase (see the sketch after this list)
  • Designed workflows in Oozie
  • Queried and analyzed data in Cassandra through CQL for quick searching, sorting, and grouping
  • Used the Parquet file format for optimization
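
As an illustration of the HBase access described above, a minimal Java sketch of a write using the native HTable client contemporary with the JDK 1.6 environment (a Thrift-based client would differ only in the transport layer). The table, column family, and row values are hypothetical:

    import java.io.IOException;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.client.HTable;
    import org.apache.hadoop.hbase.client.Put;
    import org.apache.hadoop.hbase.util.Bytes;

    public class HBaseWriteExample {
        public static void main(String[] args) throws IOException {
            // Connection settings are read from hbase-site.xml on the classpath.
            Configuration conf = HBaseConfiguration.create();
            HTable table = new HTable(conf, "customer_events");  // hypothetical table
            try {
                Put put = new Put(Bytes.toBytes("row-001"));
                // Column family "d", qualifier "json": names are illustrative only.
                put.add(Bytes.toBytes("d"), Bytes.toBytes("json"),
                        Bytes.toBytes("{\"event\":\"login\"}"));
                table.put(put);
            } finally {
                table.close();
            }
        }
    }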

Environment: Hadoop, HDFS, Hive, Flume, HBase, Cassandra, Sqoop, Pig, Java (JDK 1.6), Eclipse, MySQL, Ubuntu, ZooKeeper.

Confidential, San Francisco, CA

Java/Hadoop Developer

Responsibilities:

  • Involved in importing data from the data warehouse into HDFS using Sqoop
  • Used Pig to handle the datasets
  • Used Hive to produce Avro files
  • Used the LZO compression codec
  • Wrote Hive UDFs implementing XOR-gate logic to extract the required fields (see the sketch after this list)
  • Loaded the data into HBase under an Avro schema via MapReduce
  • Designed workflows in Oozie
  • Exported the data to SQL Server per client requirements
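
A minimal sketch of a Hive UDF implementing the XOR-gate logic mentioned above, using the classic org.apache.hadoop.hive.ql.exec.UDF base class; the function name and exact semantics are illustrative assumptions:

    import org.apache.hadoop.hive.ql.exec.UDF;
    import org.apache.hadoop.io.BooleanWritable;

    // Registered in Hive with, e.g.:
    //   ADD JAR xor-udf.jar;
    //   CREATE TEMPORARY FUNCTION xor_gate AS 'XorGateUDF';
    public final class XorGateUDF extends UDF {
        public BooleanWritable evaluate(BooleanWritable a, BooleanWritable b) {
            if (a == null || b == null) {
                return null;  // propagate SQL NULLs
            }
            return new BooleanWritable(a.get() ^ b.get());
        }
    }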

Environment: Hadoop, MapReduce, HDFS, Hive, Java 6, Cloudera distribution of Hadoop, Pig, HBase, Linux, XML, Eclipse, Oracle 10g/11g, PL/SQL, MongoDB, Toad.

Confidential, Madison, WI

Java/Hadoop Developer

Responsibilities:

  • Developed solutions to process data into HDFS, analyzed the data using MapReduce, Pig, and Hive, and produced summary results from Hadoop for downstream systems
  • Used Sqoop extensively to import data from various systems and sources (such as MySQL) into HDFS
  • Applied Hive queries to perform data analysis on HBase through the storage handler to meet the business requirements
  • Created components such as Hive UDFs to cover functionality missing from Hive for analytics
  • Worked hands-on with NoSQL databases such as HBase and Cassandra in a proof of concept (POC) for storing URLs and images
  • Developed scripts and batch jobs to schedule an Oozie bundle (a group of coordinators) consisting of various Hadoop programs
  • Exported the analyzed data to relational databases using Sqoop for visualization and to generate reports for the BI team
  • Worked with cloud services such as Amazon Web Services (AWS)
  • Involved in ETL, data integration, and migration
  • Used different file formats, including text files, SequenceFiles, and Avro
  • Provided cluster coordination services through ZooKeeper
  • Assisted in creating and maintaining technical documentation for launching Hadoop clusters and for executing Hive queries and Pig scripts
  • Assisted in cluster maintenance, cluster monitoring, adding and removing cluster nodes, and troubleshooting
  • Installed and configured Hadoop, MapReduce, and HDFS, and developed multiple MapReduce jobs in Java for data cleaning and preprocessing (see the sketch after this list)
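
A minimal sketch of the kind of data-cleaning MapReduce job described above, written against the org.apache.hadoop.mapreduce API; the delimiter and expected field count are assumptions:

    import java.io.IOException;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.NullWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class RecordCleaner {

        // Map-only job: keep well-formed pipe-delimited records, drop the rest.
        public static class CleanMapper
                extends Mapper<LongWritable, Text, NullWritable, Text> {
            private static final int EXPECTED_FIELDS = 5;  // assumed record width

            @Override
            protected void map(LongWritable key, Text value, Context context)
                    throws IOException, InterruptedException {
                String[] fields = value.toString().split("\\|", -1);
                if (fields.length == EXPECTED_FIELDS) {
                    context.write(NullWritable.get(), value);
                } else {
                    context.getCounter("clean", "malformed").increment(1);
                }
            }
        }

        public static void main(String[] args) throws Exception {
            Job job = new Job(new Configuration(), "record-cleaner");
            job.setJarByClass(RecordCleaner.class);
            job.setMapperClass(CleanMapper.class);
            job.setNumReduceTasks(0);  // map-only: cleaned records pass straight through
            job.setOutputKeyClass(NullWritable.class);
            job.setOutputValueClass(Text.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }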

Environment: MapReduce, HDFS, Sqoop, Flume, Linux, Oozie, Hadoop, Pig, Hive, HBase, Cassandra, Hadoop cluster, Amazon Web Services (AWS)

Confidential

Senior Java/J2EE Developer

Responsibilities:

  • Involved in requirements gathering, analysis, design, and development.
  • Involved in developing web services using SOAP, WSDL, and UDDI components.
  • Communicated with the end client during requirements gathering and project demos.
  • Involved in use-case realization and in producing use case diagrams, sequence diagrams, and class diagrams for various modules.
  • Worked with BAs on requirements analysis and prepared a detailed software requirements document.
  • Involved in coding, debugging, and code review.
  • Developed web service client programs using JAX-RPC.
  • Used Spring Inversion of Control (IoC) to wire DAO and delegate objects into the registry.
  • Designed and implemented the domain model layer (used by the application for DB interaction) using Spring and Hibernate.
  • Used the Struts framework for the web tier and Spring MVC for the back-end code.
  • Implemented Hibernate for persisting Java objects; used JUnit and Spring AOP to test performance.
  • Wrote Ant scripts for building the web application; used SVN for version control of the code and configuration files; used Log4j to log application events.
  • Involved in UI development using JavaScript.
  • Customized Ajax calls, types, and strategies using the Dojo toolkit.
  • Implemented a DB connector using the Singleton pattern (see the sketch after this list).
  • Followed the RUP process, used VSS for version control, and used JUnit for unit testing.
  • Involved in writing JUnit test cases using the JUnit framework.
  • Involved in unit integration, bug fixing, and user acceptance testing with test cases.
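
A minimal sketch of a Singleton-pattern DB connector of the kind mentioned above; the JDBC URL and credentials are placeholders, and real values would come from configuration:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.SQLException;

    // Singleton holding the JDBC settings; each call hands out a fresh Connection.
    public final class DbConnector {
        private static DbConnector instance;

        // Hypothetical connection settings.
        private final String url = "jdbc:oracle:thin:@//dbhost:1521/APP";
        private final String user = "app_user";
        private final String password = "secret";

        private DbConnector() { }

        public static synchronized DbConnector getInstance() {
            if (instance == null) {
                instance = new DbConnector();
            }
            return instance;
        }

        public Connection getConnection() throws SQLException {
            return DriverManager.getConnection(url, user, password);
        }
    }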

Environment: Java (JDK 1.6), JDBC, Hibernate, Struts 2, messaging queue, Unified Modeling Language (UML), XML, Eclipse, JavaScript, WebSphere Application Server, Oracle 9i, VPN, stored procedures, triggers, SQL Server, SOAP, Ajax, Dojo toolkit, WSDL, UDDI, Unix, JUnit, Ant, Maven, Rational Rose, SVN.

Confidential

Java/J2EE Developer

Responsibilities:

  • Involved in all project phases: gathering requirements, documenting the functional specifications, design, data modeling, and development of the applications.
  • Built J2EE front-end and back-end components supporting business logic, integration, and persistence.
  • Used JSP with the Spring Framework for developing user interfaces.
  • Developed the front-end user interface using J2EE, tag libraries, Servlets, JDBC, HTML, DHTML, CSS, XML, XSL, XSLT, and JavaScript per the use case specification.
  • Integrated security web services for user authentication.
  • Used Hibernate as the object/relational mapping and persistence framework and as a data-access abstraction layer (see the sketch after this list).
  • Bundled a Data Access Object (DAO) framework as part of the Hibernate database layer.
  • Designed data-mapping XML documents used by Hibernate to call stored procedures.
  • Responsible for testing and moving the application into the staging and production environments.
  • Responsible for project documentation, status reporting, and presentations.
  • Used CVS version control to maintain the source code.
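
A minimal sketch of a Hibernate DAO save operation illustrating the data-access abstraction described above; the SessionFactory is assumed to be built at application startup and injected into the DAO:

    import org.hibernate.Session;
    import org.hibernate.SessionFactory;
    import org.hibernate.Transaction;

    // Hypothetical DAO; persists any mapped entity within a transaction.
    public class GenericDao {
        private final SessionFactory sessionFactory;

        public GenericDao(SessionFactory sessionFactory) {
            this.sessionFactory = sessionFactory;
        }

        public void save(Object entity) {
            Session session = sessionFactory.openSession();
            Transaction tx = session.beginTransaction();
            try {
                session.save(entity);
                tx.commit();
            } catch (RuntimeException e) {
                tx.rollback();  // undo the partial write before rethrowing
                throw e;
            } finally {
                session.close();
            }
        }
    }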

Environment: J2EE/Java, JSP, Spring 1.2, Spring MVC, Hibernate, Eclipse 3.0, MySQL, JDBC, JBoss, Ant, JavaScript, XML, SOAP, WSDL, CVS.

Confidential

Java/J2EE developer

Responsibilities:

  • Involved in evaluating the existing system.
  • Involved in designing and developing the SSO, stabilizing the system (meeting performance requirements and implementing functionality), and providing support and enhancements according to business needs.
  • Fixed bugs and resolved the issues raised.
  • Made minor and major enhancements as required.
  • Prepared the gap analysis documents.
  • Helped the team understand the requirements.
  • Involved in unit, integration, and pre-production testing of the application.

Environment: Java, JSP, Servlets, JDBC, Dojo, HTML, CSS.
