
Hadoop Developer Resume

Chicago, IL

PROFESSIONAL SUMMARY:

  • 7+ years of professional IT experience, including Java and Big Data technologies in the Finance, Healthcare and Retail industries.
  • Hands-on experience with major components of the Hadoop ecosystem, including Hadoop MapReduce, HDFS, YARN, Cassandra, Impala, Hive, Pig, HBase, Sqoop, Oozie and Flume.
  • Good knowledge of MapReduce design patterns.
  • Experience with distributed systems, large-scale non-relational data stores, NoSQL MapReduce systems, data modeling, database performance tuning, and multi-terabyte data warehouses.
  • Responsible for setting up processes for Hadoop based application design and implementation.
  • Extensively worked on Hive and Pig for performing data analysis.
  • Experience in managing HBase databases and using them to update and modify data quickly.
  • Experience in importing and exporting data between HDFS and relational database systems using Sqoop.
  • Experience in ingesting log data into HDFS using Flume.
  • Experience using Spark to process large streams of data.
  • Experienced in running MapReduce and Spark jobs over YARN.
  • Experience in managing and reviewing Hadoop log files.
  • Experience with the Cloudera, Hortonworks and MapR distributions.
  • Handled data in various file formats such as SequenceFile, Avro, RCFile, Parquet and ORC.
  • Very good experience in complete project life cycle (design, development, testing and implementation) of Client Server and Web applications.
  • Experience in Object-Oriented Analysis and Design (OOAD) and software development using UML methodology; good knowledge of J2EE and Core Java design patterns.
  • Involved in developing complex ETL transformations and performance tuning.
  • Good knowledge in MongoDB concepts and its architecture.
  • Hands on experience in application development using Java, RDBMS, and Linux shell scripting.
  • Experience with middleware architectures using Sun Java technologies such as J2EE, JSP and Servlets, and application servers such as WebSphere and WebLogic.
  • Well versed in installing, configuring, supporting and managing Big Data workloads and the underlying infrastructure of a Hadoop cluster.
  • Ability to blend technical expertise with strong conceptual, business and analytical skills to deliver quality solutions; result-oriented problem-solving and leadership skills.

TECHNICAL SKILLS:

Operating Systems: Linux (Ubuntu, CentOS, Red Hat), Windows

Big Data/Hadoop: HDFS, Hadoop MapReduce, Hive, Pig, Sqoop, Flume, Oozie, Spark.

Languages: C, Java, Scala, SQL, PL/SQL, Shell Scripting, HiveQL, Pig Latin, Spark SQL

Methodologies: Agile, Waterfall model

Databases: HBase, Cassandra, MySQL, DB2, Oracle 10g, Teradata

Web Tools/Frameworks: HTML, JavaScript, XML, ODBC, JDBC, JSP, Servlets

IDE: Eclipse

Application Servers: Apache Tomcat server, Apache HTTP webserver

PROFESSIONAL EXPERIENCE:

Confidential, Chicago, IL

Hadoop Developer

Responsibilities:
  • Developed workflows using custom MapReduce, Pig, Hive, and Sqoop.
  • Developed a workflow to load data into HDFS from DB2 and preprocess it with Pig.
  • Developed UDFs to preprocess the data for analysis.
  • Tuned the cluster for Spark to process these large data sets.
  • Built reusable Hive UDF libraries for business requirements, enabling users to apply these UDFs in Hive queries.
  • Used Avro to represent complex data structures within a MapReduce job.
  • Preprocessed logs and semi-structured content stored on HDFS using Pig, and imported the processed data into the Hive warehouse, enabling business analysts to write Hive queries.
  • Designed and implemented a Spark test-bench application to evaluate the quality of recommendations made by the engine.
  • Configured big data workflows to run on top of Hadoop; these workflows comprise heterogeneous jobs such as Pig, Hive, Sqoop and MapReduce.
  • Developed a suite of unit test cases for Mapper, Reducer and Driver classes using a testing library.
  • Worked on NoSQL databases including HBase, Cassandra and MongoDB.
  • Designed a data warehouse using Hive.
  • Collected log data from web servers and ingested it into HDFS using Flume.
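For illustration, the log-preprocessing step described above can be sketched in Python; the Apache common-log format and the field layout are assumptions, not the actual project data:

```python
import re

# Hypothetical sketch: parse Apache common-log lines into tab-separated
# records, the kind of preprocessing the Pig step performs before the
# data lands in a Hive warehouse table. Field names are assumptions.
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<ts>[^\]]+)\] "(?P<method>\S+) (?P<url>\S+) \S+" '
    r'(?P<status>\d{3}) (?P<bytes>\d+|-)'
)

def parse_line(line):
    """Return a tab-separated record, or None for malformed lines."""
    m = LOG_PATTERN.match(line)
    if not m:
        return None
    size = m.group("bytes")
    return "\t".join([
        m.group("ip"), m.group("ts"), m.group("method"),
        m.group("url"), m.group("status"),
        "0" if size == "-" else size,   # normalize missing byte counts
    ])
```

In the actual workflow this transformation ran in Pig; the sketch shows the equivalent record-level logic, emitting tab-separated rows that a Hive external table could read directly.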

Environment: Hadoop 2.x, HDFS, Tableau, DB2, MapReduce, Sqoop, HBase, Shell Scripting, Pig, Hive, Oozie, Core Java, Cloudera Hadoop Distribution, Oracle 11g, PL/SQL, SQL*Plus, Toad, Windows NT, Linux

Confidential, Farmington Hills, MI

Hadoop Developer

Responsibilities:

  • Imported and exported data into HDFS and Hive using Sqoop.
  • Wrote Python scripts, including Python MapReduce programs.
  • Defined job flows.
  • Managed and reviewed Hadoop log files.
  • Loaded and transformed large sets of structured, semi-structured and unstructured data.
  • Managed data coming from different sources.
  • Supported MapReduce programs running on the cluster.
  • Loaded data from the UNIX file system into HDFS.
  • Installed and configured Hive, and wrote Hive UDFs.
  • Created Hive tables, loaded them with data and wrote Hive queries that run internally as MapReduce jobs.
  • Used stream-computation systems such as Storm and Spark to process large streams of data.
  • Implemented a CDH3 Hadoop cluster on CentOS.
  • Created HBase tables to store variable data formats of PII data coming from different portfolios.
  • Worked extensively with Core Java for MapReduce jobs.
  • Implemented best-income logic using Pig scripts.
  • Exported the analyzed data to the relational databases using Sqoop for visualization and to generate reports for the BI team.
  • Supported in setting up QA environment and updating configurations for implementing scripts with Pig and Sqoop.
  • Developed templates and screens in HTML and JavaScript.
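The Python MapReduce work mentioned above can be sketched in Hadoop Streaming style (read lines, emit tab-separated key/value pairs); the word-count job here is illustrative, not the production workload:

```python
from itertools import groupby

# Illustrative Hadoop Streaming word count in Python. In production the
# two functions run as separate processes, e.g.:
#   hadoop jar hadoop-streaming.jar -mapper mapper.py -reducer reducer.py ...
# and the framework sorts mapper output by key between the two phases.

def mapper(lines):
    """Emit "word\t1" for every word in the input lines."""
    for line in lines:
        for word in line.split():
            yield f"{word}\t1"

def reducer(pairs):
    """Sum counts per word; input must already be sorted by key."""
    keyed = (p.split("\t") for p in pairs)
    for word, group in groupby(keyed, key=lambda kv: kv[0]):
        yield f"{word}\t{sum(int(count) for _, count in group)}"
```

Locally the pipeline can be simulated as `reducer(sorted(mapper(lines)))`, mirroring the framework's shuffle-and-sort step.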

Environment: Hadoop, MapReduce, HDFS, Hive, Java, SQL, Pig, HBase, Sqoop, Spark, HTML, XML

Confidential, Kansas City, MO

Java/ Hadoop Developer

Responsibilities:
  • Involved in review of functional and non-functional requirements.
  • Installed and configured Hadoop MapReduce and HDFS; developed multiple MapReduce jobs in Java for data cleaning and preprocessing.
  • Installed and configured Pig, and wrote Pig Latin scripts.
  • Wrote MapReduce jobs using Pig Latin.
  • Involved in ETL, data integration and migration.
  • Imported data using Sqoop to load data from Oracle to HDFS on a regular basis.
  • Developed scripts and batch jobs to schedule various Hadoop programs.
  • Wrote Hive queries for data analysis to meet the business requirements.
  • Created Hive tables and worked on them using HiveQL.
  • Imported and exported data between HDFS and the Oracle database using Sqoop.
  • Defined job flows.
  • Gained good experience with the NoSQL databases HBase and MongoDB.
  • Created Hive tables, loaded data and wrote Hive queries that run internally as MapReduce jobs.
  • Developed a custom FileSystem plugin for Hadoop so it can access files on the Data Platform.
  • The custom FileSystem plugin allows Hadoop MapReduce programs, HBase, Pig and Hive to work unmodified and access files directly.
  • Designed and implemented a MapReduce-based large-scale parallel relation-learning system.
  • Extracted feeds from social media sites such as Facebook and Twitter using Python scripts.
  • Set up and benchmarked Hadoop/HBase clusters for internal use.
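The scheduled Sqoop import from Oracle described above could be driven by a small wrapper like the following sketch; the JDBC URL, table, column names and paths are placeholders, not the actual production values:

```python
# Hypothetical sketch of a scheduled incremental Sqoop import from
# Oracle to HDFS. Builds the command-line argument list only; a
# scheduler (cron, Oozie) would execute it via subprocess.run(cmd).

def sqoop_import_cmd(table, check_column, last_value,
                     jdbc_url="jdbc:oracle:thin:@//dbhost:1521/ORCL",
                     target_dir="/data/incoming"):
    """Return the argv list for an incremental sqoop import."""
    return [
        "sqoop", "import",
        "--connect", jdbc_url,
        "--table", table,
        "--target-dir", f"{target_dir}/{table.lower()}",
        "--incremental", "append",      # only rows newer than last_value
        "--check-column", check_column,
        "--last-value", str(last_value),
        "--num-mappers", "4",           # parallel import tasks
    ]
```

Each run would record the new high-water mark of the check column so the next scheduled invocation picks up where the last one stopped.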

Environment: Hadoop, MapReduce, HDFS, Hive, Java, Hadoop distribution of Cloudera, Pig, HBase, Linux, XML, Java 6, Eclipse, Oracle 10g, PL/SQL, MongoDB, Toad

Confidential, Chicago, IL

Java Developer

Responsibilities:
  • Developed the application using the Struts MVC architecture.
  • Developed action and form classes based on Struts framework to handle the pages.
  • Developed a web-based reporting for credit monitoring system with HTML5, XHTML, JSTL, custom tags and Tiles using Struts framework.
  • Understanding and analyzing business requirements, High Level Design and Detailed Design.
  • Provided high-level systems design, including class diagrams, sequence diagrams and activity diagrams.
  • Utilized Java/J2EE design patterns (MVC) at various levels of the application, along with ATG frameworks.
  • Developed JSPs and Servlets, and worked with web services (REST, SOAP).
  • Served as DB administrator, creating and maintaining all schemas.
  • Collaborated in design, development and maintenance of the Front-end for applications using JSP, JSTL, Custom Tags.
  • Developed Servlets and JSPs based on MVC pattern using Struts framework.
  • Developed web services using Apache Axis2 to retrieve data from legacy systems.
  • Developed Servlets, Action classes, Action Form classes and configured the struts-config.xml file.
  • Used XML parser APIs such as JAXP and JAXB to marshal and unmarshal web service request and response data.
  • Planned and implemented various SQL statements, stored procedures and triggers.

Environment: J2EE, Java 1.5, Servlets, JSP, JDBC, jQuery, Backbone.js, HTML5, JSTL, XML, Struts, Hibernate, Web Services, WebLogic Server, JSF, JAXB, Jasper Reports, JUnit, SOAP, JavaScript, UML, Apache Axis2, ANT, MySQL.

Confidential, San Francisco, CA

Java Developer 

Responsibilities:
  • Extensively used Core Java, Servlets, JSP and XML.
  • Resolved product complications at customer sites and funneled the insights to the development and deployment teams to adopt long term product development strategy with minimal roadblocks.
  • Persuaded business users and analysts to adopt alternative solutions that were more robust and simpler to implement from a technical perspective while still satisfying the functional requirements from a business perspective.
  • Played a crucial role in developing persistence layer.
  • Applied design patterns and OO design concepts to improve the existing Java/J2EE based code base.
  • Identified and fixed transactional issues caused by incorrect exception handling, and concurrency issues caused by unsynchronized blocks of code.

Environment: Java 1.2/1.3, Swing, Applet, Servlet, JSP, custom tags, JNDI, JDBC, XML, XSL, DTD, HTML, CSS, JavaScript, Oracle, DB2, PL/SQL, WebLogic, JUnit, Log4J and CVS.

Confidential

Java Developer

Responsibilities:
  • Designed and developed user interfaces using JSP, JavaScript and HTML.
  • Developed web components using JSP, Servlets and JDBC.
  • Used JDBC to manage connectivity for inserting and querying data, including stored procedures and triggers.
  • Developed database applications using SQL and PL/SQL.
  • Involved in the design and coding of the data capture templates, presentation and component templates.
  • Developed an API to write XML documents from the database.
  • Involved in fixing bugs and unit testing with test cases using JUnit.
  • Part of a team responsible for metadata maintenance and synchronization of data from the database.

Environment: JavaScript, JSP, JDBC, Servlets, HTML, XML, SQL, JUnit.
