
Hadoop Developer Resume

Stamford, CT

SUMMARY

  • Over 8.5 years of IT professional experience, including 4 years of Hadoop Big Data implementation and development and 4.5 years developing web-based and distributed J2EE enterprise applications.
  • Experienced in building highly scalable Big Data solutions using Hadoop across multiple distributions such as Cloudera and Hortonworks, as well as NoSQL platforms.
  • Excellent knowledge of Hadoop architecture, the HDFS framework and its ecosystem components such as MapReduce (MR), Hive, Zookeeper, Pig, HBase, Sqoop, Oozie, Flume and Kafka for data extraction, storage and analysis.
  • Well experienced with the Mapper, Shuffle, Partitioner, Combiner and Reducer phases, including custom Partitioners for efficient bucketing (see the sketch after this list).
  • Experience in analyzing data using HiveQL, Pig Latin, HBase and custom MapReduce programs in Java.
  • Solid understanding of Hadoop MRv1 and MRv2 (YARN) architectures.
  • Developed, deployed and supported several MapReduce applications in Java to handle semi-structured and unstructured data.
  • Experience with Oozie workflow management for sequential and parallel execution of Java, MapReduce, Hive, Pig and Sqoop jobs.
  • Worked with relational database systems (RDBMS) such as MySQL, MS SQL and Oracle, and NoSQL database systems like MongoDB, HBase and Cassandra.
  • Experienced in Application Development using Hadoop, Java, J2SE, J2EE, JSP, Servlets, Struts, RDBMS, Tag Libraries, JDBC, Hibernate, XML and Linux shell scripting.
  • Experience working in Agile and Waterfall methodologies of Software Development Life Cycle (SDLC) including design, development, implementation, testing, deployment and support maintenance.
  • Expertise in implementing Object Oriented Programming (OOP) with Java and J2EE.
  • Expertise in developing Java web-based applications using the Struts and Spring web frameworks.
  • Proficient in web development using HTML4/5, CSS 2/3, jQuery, JavaScript, XML, AJAX and JSON.
  • Experience in working on Web services using REST and SOAP.
  • Extensive experience working with IDEs and editors such as Eclipse, NetBeans and EditPlus.
  • Experience working on various web/application servers: Apache Tomcat, JBoss, IBM WebSphere and WebLogic.
  • Implemented Java/J2EE design patterns such as Business Delegate, Data Transfer Object (DTO) and Data Access Object (DAO).
  • Expertise in Object Oriented Analysis and Design using UML and various design patterns.
  • Highly motivated team player with strong communication, analytical and problem-solving skills, as well as the ability to work independently.
  • Self-starter and quick learner.
  • Excellent written and verbal communication skills.
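
A minimal sketch of the kind of custom Partitioner used for bucketing by key; the class name and region-prefixed key format are hypothetical, shown only to illustrate the technique.

    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Partitioner;

    // Routes records to reducers by a region prefix in the key (e.g. "EU-1234"),
    // so all records for one region land in the same bucket/output file.
    public class RegionPartitioner extends Partitioner<Text, IntWritable> {
        @Override
        public int getPartition(Text key, IntWritable value, int numPartitions) {
            String region = key.toString().split("-")[0];
            return (region.hashCode() & Integer.MAX_VALUE) % numPartitions;
        }
    }

The job would register it with job.setPartitionerClass(RegionPartitioner.class); the number of buckets then follows the configured number of reducers.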

TECHNICAL SKILLS

Big Data: Hadoop, MapReduce, HDFS, Hive, HBase, Pig, Sqoop, Oozie, Zookeeper, Flume, Mahout, AWS, YARN, Storm, Spark, Kafka, MongoDB, Cassandra.

Java/J2EE: Java, J2EE, JSP, JavaScript, Servlets, JDBC, Struts, Java Beans, JMS, EJB.

Frameworks: Apache Struts, Hibernate, Spring, MVC.

Languages: Core Java, J2EE, C, SQL, PL/SQL, UML.

Web Services / Technologies: REST, SOAP, JSP, MVC, Spring, Hibernate, JavaScript, XML, HTML, CSS, AJAX, JSON and Maven.

Databases: NoSQL, SQL Server, MySQL, Oracle, DB2, PL/SQL.

Web/Application Servers: Apache Tomcat, WebLogic, WebSphere, JBoss.

Tools: Eclipse, NetBeans, HTML, JavaScript, XML, Talend.

Design/Build Tools: UML, Design Patterns, Maven, Ant.

Hadoop Distributions: Cloudera, Hortonworks, MapR.

PROFESSIONAL EXPERIENCE

Confidential, Stamford, CT

Hadoop Developer

Responsibilities:

  • Wrote MapReduce jobs and Pig scripts using various input and output formats, and designed custom formats to meet business requirements.
  • Wrote extensive MapReduce jobs in Java to train the cluster and developed Java MapReduce programs to analyze sample log files stored in the cluster.
  • Worked on Customer Segmentation Use Case life cycle.
  • Analyzed large and critical datasets using HDFS, Map Reduce, Hive, Pig.
  • Created and modified UDFs and UDAFs for Hive as needed, and developed Hive queries to analyze data and generate results (see the sketch after this list).
  • Developed Pig UDF's for preprocessing the data for analysis as per business logic.
  • Designed and implemented static and dynamic partitioning and bucketing in Hive.
  • Used Sqoop to load data from relational databases into HDFS and HBase, and vice versa.
  • Created HBase column families to store various data types coming from various sources.
  • Used Pig and Hive queries to analyze the historical data in HBase.
  • Involved in loading and transforming large sets of Structured, Semi-Structured and Unstructured data and analyzed them by running Hive queries and Pig scripts.
  • Involved in writing Hive queries for data analysis with respect to business requirements.
  • Automated workflows using Oozie for all the jobs that pull NetFlow data from relational databases into Hive tables, and enabled email alerts for failure cases.
  • Wrote Hive and Pig scripts as ETL tools to perform transformations, event joins, traffic filtering and pre-aggregations before storing the data in HDFS.
  • Worked with Flume to import log data from the reaper logs and syslogs into the Hadoop cluster.
  • Used complex data types like bags, tuples and maps in Pig for handling data.
  • Created workflows and coordinators using Oozie for regular jobs and to automate the tasks of loading data into HDFS.
  • Used Talend as an ETL tool to transform and load data from different databases.
  • Performed incremental data movement using Sqoop and Oozie jobs.
  • Exported analyzed data into a relational database using Sqoop, making it available for visualization and report generation by the BI team using Tableau.
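
A minimal sketch of a Hive UDF of the kind referenced above; the class name and the normalization it performs are hypothetical, illustrating the general pattern rather than the project's actual functions.

    import org.apache.hadoop.hive.ql.exec.UDF;
    import org.apache.hadoop.io.Text;

    // Normalizes free-form text so GROUP BY / JOIN keys match consistently.
    public final class NormalizeText extends UDF {
        public Text evaluate(Text input) {
            if (input == null) {
                return null;
            }
            return new Text(input.toString().trim().toUpperCase());
        }
    }

In Hive the jar would be added and the function registered with CREATE TEMPORARY FUNCTION before being used in queries.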

Environment: Apache Hadoop, HDFS, Pig, Hive, Sqoop, Spark, NoSQL, MapReduce, Avro, Zookeeper, HBase, Talend, Shell Scripting, Ubuntu, Flume, Tableau.

Confidential, Maryland

Hadoop Developer

Responsibilities:

  • Responsible for building scalable distributed data solutions using Hadoop.
  • Wrote multiple MapReduce programs in Java for data analysis.
  • Wrote MapReduce jobs using Pig Latin and the Java API.
  • Performed performance tuning and troubleshooting of MapReduce jobs by analyzing and reviewing Hadoop log files.
  • Developed pig scripts for analyzing large data sets in the HDFS.
  • Knowledge of handling Hive queries using Spark SQL integrated with the Spark environment.
  • Responsible for creating Hive tables, loading the structured data resulting from MapReduce jobs into the tables, and writing Hive queries to further analyze the logs to identify issues and behavioral patterns.
  • Performed extensive Data Mining applications using Hive.
  • Streamed data through Flume and Kafka and transferred it to HDFS for analysis.
  • Performed extensive data validation using Sqoop jobs; created Pig and Hive scripts for data ingestion from relational databases and comparison with historical data.
  • Used Kafka to load data into HDFS and move data into NoSQL databases.
  • Created HBase tables to load large sets of structured, semi-structured and unstructured data coming from UNIX, NoSQL and a variety of portfolios.
  • Involved in submitting and tracking MapReduce jobs using the JobTracker.
  • Involved in creating Oozie workflow and Coordinator jobs to kick off the jobs on time for data availability.
  • Used visualization tools such as Power View for Excel and Tableau for visualizing data and generating reports.
  • Exported data to Tableau and to Excel with Power View for presentation and refinement.
  • Implemented business logic by writing Pig UDFs in Java and used various UDFs from Piggybank and other sources (see the sketch after this list).
  • Implemented test scripts to support test driven development and continuous integration.
  • Actively participated in daily scrum meetings.
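
A minimal sketch of a Pig EvalFunc UDF of the kind mentioned above; the class name and the log-parsing logic are hypothetical illustrations of how such business logic is typically packaged.

    import java.io.IOException;
    import org.apache.pig.EvalFunc;
    import org.apache.pig.data.Tuple;

    // Extracts the HTTP status code (9th space-separated token in the common
    // Apache log format) from a raw log line passed in as the first field.
    public class ExtractStatusCode extends EvalFunc<String> {
        @Override
        public String exec(Tuple input) throws IOException {
            if (input == null || input.size() == 0 || input.get(0) == null) {
                return null;
            }
            String[] tokens = input.get(0).toString().split(" ");
            return tokens.length > 8 ? tokens[8] : null;
        }
    }

After REGISTERing the jar, the function is called directly from a Pig FOREACH ... GENERATE statement.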

Environment: Hadoop, MapReduce, HDFS, Pig, Hive, Sqoop, Flume, Oozie, Java, Linux, Maven, Zookeeper, Tableau, HBase, Cassandra.

Confidential, Houston, TX

Hadoop/Java Developer

Responsibilities:

  • Worked on analyzing data and writing Hadoop MapReduce jobs using the Java API, Pig Latin and Hive.
  • Responsible for building scalable distributed data solutions using Hadoop.
  • Involved in loading data from the edge node to HDFS using shell scripting.
  • Used Sqoop to import data from RDBMS into the Hadoop Distributed File System (HDFS) and later analyzed the imported data using Hadoop components.
  • Created HBase tables to store varying formats of data coming from different portfolios (see the sketch after this list).
  • Implemented a script to transmit information from Oracle to HBase using Sqoop.
  • Implemented best income logic using Pig scripts and UDFs.
  • Implemented test scripts to support test driven development and continuous integration.
  • Worked on tuning the performance using Apache Pig queries.
  • Worked with QA team in preparation and review of test cases.
  • Used JUnit for unit testing and integration testing.
  • Involved in loading and transforming large sets of Structured, Semi-Structured and Unstructured data and analyzed them by running Hive queries and Pig scripts.
  • Experience in managing and reviewing Hadoop log files.
  • Assisted application teams in installing Hadoop updates, operating system patches and version upgrades when required.
  • Responsible for cluster maintenance, adding and removing cluster nodes, cluster monitoring and troubleshooting, manage and review data backups, manage and review Hadoop log files.
  • Used the Oozie workflow engine to run multiple Hive and Pig jobs automatically.
  • Analyzed large amounts of data sets to determine optimal way to aggregate and report on it.
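
A minimal sketch of writing a row to an HBase table through the Java client API; the table name, column family and row-key scheme are hypothetical, chosen only to illustrate storing mixed-portfolio data.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.TableName;
    import org.apache.hadoop.hbase.client.Connection;
    import org.apache.hadoop.hbase.client.ConnectionFactory;
    import org.apache.hadoop.hbase.client.Put;
    import org.apache.hadoop.hbase.client.Table;
    import org.apache.hadoop.hbase.util.Bytes;

    public class HBaseWriter {
        public static void main(String[] args) throws Exception {
            Configuration conf = HBaseConfiguration.create();
            try (Connection connection = ConnectionFactory.createConnection(conf);
                 Table table = connection.getTable(TableName.valueOf("portfolio_data"))) {
                // Row key combines source system and record id so rows from
                // different portfolios stay grouped but remain unique.
                Put put = new Put(Bytes.toBytes("oracle|12345"));
                put.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("amount"), Bytes.toBytes("987.65"));
                table.put(put);
            }
        }
    }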

Environment: Hadoop, HDFS, Hive, Apache Pig, Sqoop, HBase, Shell Scripting, Ubuntu, Linux Red Hat, ZooKeeper.

Confidential, Houston, TX

Sr. Java Developer

Responsibilities:

  • Worked with business analyst in understanding business requirements, design and development of the project.
  • Implemented the Struts framework with MVC architecture.
  • Created new JSPs for the front end using HTML, JavaScript, jQuery and Ajax.
  • Developed JSP pages and configured the module in the application.
  • Developed the presentation layer using JSP, HTML, CSS and client side validations using JavaScript.
  • Involved in creating RESTful web services using JAX-RS and the Jersey toolkit (see the sketch after this list).
  • Involved in designing, creating, reviewing Technical Design Documents.
  • Developed DAOs (Data Access Objects) using Hibernate as the ORM to interact with the Oracle DBMS.
  • Applied J2EE design patterns like Business Delegate, DAO and Singleton.
  • Deployed and tested the application using Tomcat web server.
  • Performed client-side validation using JavaScript.
  • Involved in developing DAOs using JDBC.
  • Worked with QA team in preparation and review of test cases.
  • Used JUnit for unit testing and integration testing.
  • Writing SQL queries to fetch the business data using Oracle as database.
  • Developed the UI for Customer Service modules and reports using JSF, JSPs and MyFaces components.
  • Used Log4j to log the running system's application output, trace errors and record certain automated routine functions.
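
A minimal sketch of a RESTful resource using JAX-RS annotations as served by Jersey; the resource path, fields and JSON payload are hypothetical placeholders for the actual services.

    import javax.ws.rs.GET;
    import javax.ws.rs.Path;
    import javax.ws.rs.PathParam;
    import javax.ws.rs.Produces;
    import javax.ws.rs.core.MediaType;
    import javax.ws.rs.core.Response;

    @Path("/customers")
    public class CustomerResource {
        @GET
        @Path("/{id}")
        @Produces(MediaType.APPLICATION_JSON)
        public Response getCustomer(@PathParam("id") String id) {
            // In the real service this would delegate to the DAO / service layer.
            String json = "{\"id\": \"" + id + "\", \"status\": \"ACTIVE\"}";
            return Response.ok(json).build();
        }
    }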

Environment and Tools: Java, JSP, JavaScript, Servlets, Struts, Hibernate, REST, EJB, JSF, Ant, Tomcat, Eclipse, SQL, Oracle.

Confidential

Java Developer

Responsibilities:

  • Involved in various phases of Software Development Life Cycle (SDLC) of the application like Requirement gathering, Design, Analysis and Code development.
  • Prepared Use Cases, sequence diagrams, class diagrams and deployment diagrams based on UML to enforce Rational Unified Process using Rational Rose.
  • Extensively worked on the user interface for a few modules using HTML, JSPs and JavaScript.
  • Generated business logic using Servlets and Session Beans and deployed them on the WebLogic server.
  • Created complex SQL queries and stored procedures.
  • Used the Hibernate ORM framework with the Spring framework for data persistence and transaction management.
  • Wrote test cases in JUnit for unit testing of classes.
  • Provided technical support for production environments resolving the issues, analyzing the defects, providing and implementing the solution defects.
  • Built and deployed Java application into multiple UNIX based environments and produced both unit and functional test results along with release notes.
  • Analyzed the banking and existing system requirements and validated them against the J2EE architecture.
  • Designed the process flow between front-end and server-side components.
  • Developed and implemented the MVC architectural pattern using the Struts framework, including JSP, Servlets, EJB, Form Bean and Action classes.
  • Developed the web-based presentation layer using JSP, AJAX and Servlet technologies, implemented with the Struts framework.
  • Designed and developed backend Java components residing on different machines to exchange information and data using JMS.
  • Involved in creating Hibernate objects and mapping them using Hibernate annotations (see the sketch after this list).
  • Used JavaScript for client-side validation and Struts Validator Framework for form validations.
  • Implemented Java/J2EE design patterns such as Business Delegate, Data Transfer Object (DTO) and Data Access Object (DAO).
  • Wrote JUnit test cases for unit testing.
  • Integrated Spring DAO for data access using Hibernate; used HQL and SQL for querying databases.
  • Worked with QA team for testing and resolving defects.
  • Used ANT automated build scripts to compile and package the application.
  • Used JIRA for bug tracking and project management.
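
A minimal sketch of a Hibernate-annotated entity and an HQL lookup through a Spring-managed SessionFactory; the entity, table and column names are hypothetical and stand in for the actual domain objects.

    import java.util.List;
    import javax.persistence.Column;
    import javax.persistence.Entity;
    import javax.persistence.Id;
    import javax.persistence.Table;
    import org.hibernate.SessionFactory;

    @Entity
    @Table(name = "ACCOUNTS")
    public class Account {
        @Id
        @Column(name = "ACCOUNT_ID")
        private Long id;

        @Column(name = "STATUS")
        private String status;

        // getters and setters omitted for brevity
    }

    class AccountDao {
        private final SessionFactory sessionFactory;

        AccountDao(SessionFactory sessionFactory) {
            this.sessionFactory = sessionFactory;
        }

        // HQL queries the mapped entity rather than the underlying table.
        @SuppressWarnings("unchecked")
        List<Account> findByStatus(String status) {
            return sessionFactory.getCurrentSession()
                    .createQuery("from Account a where a.status = :status")
                    .setParameter("status", status)
                    .list();
        }
    }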

Environment: J2EE, JSP, JDBC, Spring Core, Struts, Hibernate, Design Patterns, XML, WebLogic, Apache Axis, ANT, Clear case, JUnit, JavaScript, Web Services, SOAP, XSLT, JIRA, Oracle, PL/SQL Developer and Windows

Confidential

Java Programmer

Responsibilities:

  • Involved in Development of Servlets and Java Server Pages (JSP).
  • Involved in writing Pseudo-code for Stored Procedures.
  • Developed PL/SQL queries to generate reports based on client requirements.
  • Enhanced the system according to customer requirements.
  • Created test case scenarios for Functional Testing.
  • Used JavaScript validation in JSP pages.
  • Helped design the database tables for optimal storage of data.
  • Coded JDBC calls in the servlets to access the Oracle database tables (see the sketch after this list).
  • Responsible for Integration, unit testing, system testing and stress testing for all the phases of project.
  • Prepared final guideline document that would serve as a tutorial for the users of this application.
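
A minimal sketch of a servlet making a JDBC call against an Oracle table; the servlet name, DataSource JNDI name and table/column names are hypothetical, shown only to illustrate the pattern.

    import java.io.IOException;
    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import javax.naming.InitialContext;
    import javax.servlet.ServletException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;
    import javax.sql.DataSource;

    public class OrderLookupServlet extends HttpServlet {
        @Override
        protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                throws ServletException, IOException {
            String orderId = req.getParameter("orderId");
            try {
                // Look up a container-managed connection pool instead of DriverManager.
                DataSource ds = (DataSource) new InitialContext()
                        .lookup("java:comp/env/jdbc/OrdersDS");
                try (Connection con = ds.getConnection();
                     PreparedStatement ps = con.prepareStatement(
                             "SELECT STATUS FROM ORDERS WHERE ORDER_ID = ?")) {
                    ps.setString(1, orderId);
                    try (ResultSet rs = ps.executeQuery()) {
                        resp.getWriter().println(rs.next() ? rs.getString("STATUS") : "NOT FOUND");
                    }
                }
            } catch (Exception e) {
                throw new ServletException(e);
            }
        }
    }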

Environment: Java, Servlets, J2EE, JDBC, Oracle, PL/SQL, HTML, JSP, Eclipse, UNIX.
