
Hadoop Developer Resume


IL

SUMMARY

  • 7+ years of experience across the Software Development Life Cycle, including analysis, design, development, and maintenance of Hadoop and Java based applications.
  • 2+ years of experience in Hadoop development.
  • Extensive development experience across the Hadoop ecosystem, covering MapReduce, HDFS, YARN, Hive, Pig, HBase, Apache Kafka, Sqoop, Oozie, and Cloudera.
  • Proficient in development methodologies such as Agile, Scrum and Waterfall.
  • Experience working with Cloudera distributions (CDH4/CDH5) and working knowledge of Hortonworks.
  • Hands-on experience writing MapReduce jobs in Java (a minimal sketch follows this summary).
  • Expert-level understanding of Hadoop HDFS and MapReduce internals.
  • Knowledge of RDBMS databases such as Oracle, MySQL, and DB2.
  • Experience storing and analyzing data using HiveQL, Pig Latin, HBase, and custom MapReduce programs in Java.
  • Imported and exported data into HDFS and Hive using Sqoop.
  • Experience writing shell scripts to dump shared data from MySQL servers to HDFS.
  • Good knowledge of YARN configuration.
  • Knowledge of NoSQL databases such as HBase, MongoDB and Cassandra.
  • Managed and reviewed Hadoop log files; working knowledge of Spark and Storm.
  • Experience with shell and Python scripting.
  • Working knowledge of multi-tiered distributed environments and OOAD concepts, with a good understanding of the Software Development Life Cycle (SDLC).
  • Thorough working knowledge of application development using Java SE, with strong experience in Spring, Hibernate, Struts, Spring Boot, REST/SOAP web services, JDBC, Servlets, EJB, JMS, Java Server Pages, jQuery, JavaScript, JSF, AngularJS, AJAX, XML, and HTML5.
  • Experience implementing software best practices, including design patterns, use cases, object-oriented analysis and design, Agile methodologies, and software/system modeling (UML).
  • Worked extensively with core Java concepts such as polymorphism, collections, inheritance, serialization, synchronization, multithreading, exception handling, and socket programming.
  • Experience in software testing, including JUnit and regression testing, with defect tracking and management in JIRA.
  • Used version control tools such as GitHub, Subversion, and Confidential.
  • Extensive debugging experience with the Eclipse and NetBeans debuggers.
  • Experience in working with web development technologies such as HTML, CSS, and JavaScript.
  • Configured WebSphere MQ and the Splunk Forwarder.
  • Hands-on experience working with web services, including RESTful services.
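
As a concrete illustration of the MapReduce experience summarized above, here is a minimal word-count job against Hadoop's org.apache.hadoop.mapreduce API. It is a sketch only; the class names and command-line paths are illustrative, not taken from any project described here.

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

// Minimal word-count job: one mapper, one reducer, driver in main().
public class WordCount {

    public static class TokenMapper
            extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            // Emit (token, 1) for every whitespace-separated token in the line.
            for (String token : value.toString().split("\\s+")) {
                if (!token.isEmpty()) {
                    word.set(token);
                    context.write(word, ONE);
                }
            }
        }
    }

    public static class SumReducer
            extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            // Sum the per-token counts emitted by the mappers.
            int sum = 0;
            for (IntWritable v : values) {
                sum += v.get();
            }
            context.write(key, new IntWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "word count");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenMapper.class);
        job.setCombinerClass(SumReducer.class);   // safe: the sum is associative
        job.setReducerClass(SumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}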

TECHNICAL SKILLS

Big Data Ecosystem: HDFS, Sqoop, Flume, Hive, Pig, MapReduce, YARN, Oozie, Kafka, Spark and HBase.

Big Data Platforms: Hortonworks, Cloudera and AWS.

Databases: NoSQL (HBase, MongoDB, Cassandra), Oracle and MySQL.

Languages: SQL, Pig Latin, HiveQL, Unix shell, Java, JavaScript, Python and Scala.

Operating Systems: Linux, Windows.

Development Methodologies: Agile, Scrum, Waterfall.

Web technologies: JSP, JavaScript, jQuery, AJAX, XML, XSLT, HTML5, DHTML and CSS3.

PROFESSIONAL EXPERIENCE

Confidential, IL

Hadoop Developer

Responsibilities:

  • Worked with all of the legacy claims systems to understand the data, and contributed to the architecture design for legacy archival and retrieval of claims data.
  • Migrated data from Hadoop to an AWS S3 bucket using DistCp, and migrated data between old and new clusters the same way.
  • Automated the jobs that pull data from the FTP server and load it into Hive tables using Oozie workflows; also wrote Pig scripts to run ETL jobs on the data in HDFS.
  • Wrote Hive queries for data analysis to meet business requirements.
  • Created partitioned tables in Hive and used Hive to analyze the partitioned and bucketed data and compute various metrics for reporting (see the Hive sketch after this list).
  • Migrated MapReduce programs to Spark transformations using Spark and Scala (see the Spark sketch below).
  • Implemented a data migration/archival methodology that preserves business records from the legacy applications in an audit-ready, read-only, point-in-time format.
  • Processed data with MapReduce, stored the results in HBase, and displayed them as pie charts, bar charts, or both, per user requirements.
  • Used Impala to read, write, and query data in HDFS.
  • Planned and reviewed deliverables and assisted the team with development and deployment activities.
  • Involved in hardware architecture guidance, planning and estimating cluster capacity, and creating roadmaps for Hadoop cluster deployment.
  • Executed various activities and ensured on-time delivery of quality deliverables to the client.
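
A minimal sketch of the partitioned-table work described above, issued from Java through the Hive JDBC driver. The HiveServer2 URL, table, and column names are hypothetical stand-ins, not details from the engagement.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

// Creates a date-partitioned claims table and computes a per-day metric.
public class HivePartitionDemo {
    public static void main(String[] args) throws Exception {
        // Older Hive JDBC drivers need explicit registration.
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        try (Connection conn = DriverManager.getConnection(
                 "jdbc:hive2://hiveserver.example.com:10000/default");
             Statement stmt = conn.createStatement()) {

            // Partition by load date so reporting queries prune to one day.
            stmt.execute("CREATE TABLE IF NOT EXISTS claims ("
                    + "claim_id STRING, amount DOUBLE) "
                    + "PARTITIONED BY (load_date STRING) "
                    + "STORED AS ORC");

            // Load one day's staged data into its own partition.
            stmt.execute("INSERT OVERWRITE TABLE claims "
                    + "PARTITION (load_date = '2016-01-01') "
                    + "SELECT claim_id, amount FROM claims_staging");

            // Partition pruning: only the 2016-01-01 directory is scanned.
            try (ResultSet rs = stmt.executeQuery(
                    "SELECT COUNT(*), SUM(amount) FROM claims "
                    + "WHERE load_date = '2016-01-01'")) {
                while (rs.next()) {
                    System.out.println(rs.getLong(1) + " claims, total " + rs.getDouble(2));
                }
            }
        }
    }
}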

Environment: CDH 5, AWS, DistCp, Pig, Hive, MapReduce, YARN, Oozie, Flume, Sqoop, Impala, Spark, Scala, SQL Server, AS400, Oracle, Shell Scripting.
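
The Spark migration in this role was done in Scala; purely to keep these sketches in a single language, here is the same kind of MapReduce-to-Spark rewrite expressed through Spark's Java API (assuming Spark 2.x, whose flatMap takes an iterator-returning function). The HDFS paths are hypothetical.

import java.util.Arrays;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

import scala.Tuple2;

// The word-count MapReduce job re-expressed as Spark transformations.
public class SparkWordCount {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("spark word count");
        try (JavaSparkContext sc = new JavaSparkContext(conf)) {
            JavaRDD<String> lines = sc.textFile("hdfs:///data/input");   // hypothetical path

            // flatMap + mapToPair replace the Mapper; reduceByKey replaces the Reducer.
            JavaPairRDD<String, Integer> counts = lines
                    .flatMap(line -> Arrays.asList(line.split("\\s+")).iterator())
                    .filter(token -> !token.isEmpty())
                    .mapToPair(token -> new Tuple2<>(token, 1))
                    .reduceByKey(Integer::sum);

            counts.saveAsTextFile("hdfs:///data/output");                // hypothetical path
        }
    }
}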

Confidential, Beaverton, OR

Hadoop Developer

Responsibilities:

  • Involved in business requirement gathering, analysis, feasibility research of the project, estimation and enhancements.
  • Imported data into HDFS from different RDBMS servers using Sqoop, and exported aggregated data back to the RDBMS servers for further ETL operations.
  • Collected and aggregated large amounts of web log data from sources such as web servers and mobile devices using Apache Flume, and stored the data in HDFS/HBase for analysis.
  • Developed an automated shell-script process that drives data pulls from RDBMS sources into Hadoop using Sqoop.
  • Worked on custom Pig loaders and storage classes to handle a variety of data formats such as JSON and XML.
  • Developed automated processes for flattening upstream data from Cassandra, which arrives in JSON format, using Hive UDFs (a UDF sketch follows this list).
  • Created Hive tables and views for the ingested data.
  • Developed Hive queries to perform data quality (DQ) checks on the data loaded into HDFS.
  • Used partitioning, bucketing, map-side joins, and parallel execution to optimize Hive queries.
  • Visualized HDFS data for customers in a BI tool via the Hive ODBC driver.
  • Designed test cases to verify connectivity to the various RDBMS sources.
  • Conducted Knowledge Transfer (KT) sessions for new recruits on the business value and technical functionality of the developed modules.
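
A minimal sketch of the kind of Hive UDF used for the JSON flattening above, written against Hive's classic UDF base class and assuming Jackson on the classpath. It only extracts one top-level field; the real flattening rules were project-specific.

import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

// Pulls one top-level field out of a JSON document; returns NULL on bad input.
public final class JsonField extends UDF {
    private static final ObjectMapper MAPPER = new ObjectMapper();

    public Text evaluate(Text json, Text field) {
        if (json == null || field == null) {
            return null;
        }
        try {
            JsonNode node = MAPPER.readTree(json.toString()).get(field.toString());
            return node == null ? null : new Text(node.asText());
        } catch (Exception e) {
            return null;  // malformed JSON rows become NULL instead of failing the query
        }
    }
}

In Hive the jar would be registered with ADD JAR and CREATE TEMPORARY FUNCTION, then called like any built-in inside a SELECT; for simple extractions Hive's built-in get_json_object would suffice, so a custom UDF is only worthwhile for bespoke flattening logic like the above.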

Environment: Hadoop 2.2.0, MapReduce, Hive, Pig, HBase, Oozie, Sqoop, Flume, Core Java, Cloudera Distributed Hadoop (CDH), HDFS, RDBMS, JSON, XML.

Confidential, Irvine, California

Java Developer

Responsibilities:

  • Developed web pages using the Struts 2.0 framework, AJAX, JSP, XML, JavaScript, HTML/DHTML, and CSS; configured the Struts application and used its tag library.
  • Developed applications using Spring, Hibernate, and Spring Batch, along with SOAP web services (using Apache Axis containers) and RESTful web services.
  • Used the Spring Framework in the business tier and Spring's BeanFactory for initializing services.
  • Used Spring IoC to inject services and their dependencies.
  • Used AJAX, JavaScript and GWT to create interactive user interface.
  • Developed server-side scripts in Python to customize Git and integrate it with tools like JIRA and Jenkins.
  • Designed and developed several EJBs using the Session Facade pattern.
  • Handled Java multithreading, collections, file handling, and serialization in back-end components.
  • Carried out the design, development, and testing phases of software development using Agile methodology and Test-Driven Development (TDD).
  • Performed Test Driven Development (TDD) using JUnit and Mockito.
  • Implemented Hibernate to persist data to the database and wrote HQL-based queries to implement CRUD operations on the data.
  • Developed web services to communicate with other modules using XML-based SOAP, Apache Axis containers, and WSDL.
  • Developed REST API components using the Jersey REST framework, following a TDD approach, for processing product upgrade requests (a Jersey sketch follows this list).
  • Wrote JUnit test cases for controller classes using Mockito and the JUnit framework (see the test sketch below).
  • Developed test code in Java using the Eclipse IDE and a testing framework.
  • Used the testing framework to run unit tests and Maven to build the project.
  • Designed the complex BPDs (Business Process Definitions) of the application.
  • Extensively used design patterns such as Singleton, Value Object, Service Delegator, and Data Access Object.
  • Used Spring Core annotations for dependency injection and used Apache Camel to integrate with the Spring framework.
  • Followed a top-down approach to implement SOAP-based web services and used Apache Axis commands to generate artifacts from the WSDL file.
  • Used SoapUI to test the web services against their WSDL.
  • Used the Jersey API to develop RESTful web services.
  • Developed and integrated the application using the Eclipse IDE.
  • Used Maven to build the project and JUnit to develop unit test cases.
  • Used the Log4j framework to log system execution details to log files.
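
A minimal JAX-RS sketch of the kind of Jersey REST component described above; the resource path and response payload are invented for illustration.

import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.PathParam;
import javax.ws.rs.Produces;
import javax.ws.rs.core.MediaType;
import javax.ws.rs.core.Response;

// JAX-RS resource exposing a product-upgrade lookup; URI and payload are hypothetical.
@Path("/upgrades")
public class UpgradeResource {

    @GET
    @Path("/{productId}")
    @Produces(MediaType.APPLICATION_JSON)
    public Response getUpgrade(@PathParam("productId") String productId) {
        if (productId == null || productId.isEmpty()) {
            return Response.status(Response.Status.BAD_REQUEST).build();
        }
        // A real implementation would consult a service; return a stub payload here.
        String json = "{\"productId\":\"" + productId + "\",\"upgradeAvailable\":true}";
        return Response.ok(json).build();
    }
}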

Environment: Java 1.7, Spring, Hibernate, HTML, HTML5, TDD, CSS, CSS3, JavaScript, AJAX, Eclipse, XML, Confidential, Maven, WSDL, SOAP, Apache Axis, JSE, JAX-WS, AngularJS, Python, JAX-RS, Jersey, SoapUI, Log4j, DB2, Oracle 11g, IBM WebSphere server, UNIX, DB2 SQL & PL/SQL.
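
And a minimal sketch of the JUnit/Mockito testing style used in this role; the controller and service below are hypothetical stand-ins for the real classes under test.

import static org.junit.Assert.assertEquals;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.verify;
import static org.mockito.Mockito.when;

import org.junit.Test;

// The controller depends on a service interface; the test mocks the service.
public class UpgradeControllerTest {

    interface UpgradeService {               // hypothetical collaborator
        boolean isUpgradeAvailable(String productId);
    }

    static class UpgradeController {         // hypothetical class under test
        private final UpgradeService service;
        UpgradeController(UpgradeService service) { this.service = service; }
        String check(String productId) {
            return service.isUpgradeAvailable(productId) ? "ELIGIBLE" : "NOT_ELIGIBLE";
        }
    }

    @Test
    public void eligibleProductReportsEligible() {
        UpgradeService service = mock(UpgradeService.class);
        when(service.isUpgradeAvailable("p-42")).thenReturn(true);

        UpgradeController controller = new UpgradeController(service);

        assertEquals("ELIGIBLE", controller.check("p-42"));
        verify(service).isUpgradeAvailable("p-42");  // the collaboration was exercised
    }
}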

Confidential

Java Developer

Responsibilities:

  • Developed user interfaces and implemented business processes using JSP and servlets.
  • Developed an application that creates dynamic JSPs, driven by a database table containing information about the contents of the JSP being developed.
  • Involved in writing the presentation layer in Java Server Pages (JSP).
  • Developed the front-end user interface using HTML5, JavaScript, CSS3, JSON, and jQuery.
  • Wrote servlet and JSP code for communication between the web browser and the server.
  • Responsible for coding SQL statements and stored procedures for back-end communication using JDBC (a servlet/JDBC sketch follows this list).
  • Developed an API to write XML documents from a database; utilized XML and XSL transformations for dynamic web content and database connectivity.
  • Coded deployment descriptors in XML; the generated JAR files were deployed on the Apache Tomcat server.
  • Involved in developing the presentation layer and GUI framework in JSP; client-side validations were done in JavaScript.
  • Involved in code reviews and mentored the team in resolving issues.
  • Participated in weekly design reviews and walkthroughs with the project manager and development teams.
  • Provided technical guidance to business analysts, gathered requirements, and converted them into technical specifications/artifacts for developers to start from.
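
A minimal sketch of the servlet-plus-JDBC pattern described above; the JDBC URL, credentials, and schema are illustrative only, and the try-with-resources form assumes a modern JDK for brevity.

import java.io.IOException;
import java.io.PrintWriter;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Servlet answering browser requests with data fetched over JDBC.
public class BalanceServlet extends HttpServlet {

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        String accountId = req.getParameter("id");
        resp.setContentType("text/html");
        try (Connection conn = DriverManager.getConnection(
                     "jdbc:oracle:thin:@db.example.com:1521:ORCL", "appuser", "secret");
             PreparedStatement ps = conn.prepareStatement(
                     "SELECT balance FROM accounts WHERE account_id = ?")) {
            ps.setString(1, accountId);
            try (ResultSet rs = ps.executeQuery();
                 PrintWriter out = resp.getWriter()) {
                if (rs.next()) {
                    out.println("<p>Balance: " + rs.getBigDecimal("balance") + "</p>");
                } else {
                    out.println("<p>No such account.</p>");
                }
            }
        } catch (Exception e) {
            resp.sendError(HttpServletResponse.SC_INTERNAL_SERVER_ERROR, e.getMessage());
        }
    }
}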

Environment: HTML, JSP, Servlets, JDBC, JavaScript, Eclipse IDE, XML, XSL, Tomcat 5.

Confidential

PL/SQL Developer

Responsibilities:

  • Analyzed Business requirements based on the Business Requirement Specification document.
  • Performed extensive query analysis and tuning with indexes and hints, and wrote numerous complex queries involving sub-queries, correlated queries, UNION/UNION ALL, MINUS, inline views, and analytic functions.
  • Developed program specifications for PL/SQL procedures and functions to perform the data migration and conversion (a JDBC invocation sketch follows this list).
  • Created a wide range of data types, tables, index types, and scoped variables.
  • Designed the front-end interface for users using Oracle Forms.
  • Involved in database development by creating Oracle PL/SQL Functions, Procedures, Triggers, Packages, Records and Collections.
  • Involved in developing the ETL process using SQL*Loader and PL/SQL packages.
  • Developed and customized forms/reports using Oracle D2K.
  • Designed data layouts and developed reports using Oracle D2K.
  • Implemented batch jobs (shell scripts) for loading database tables from flat files using SQL*Loader.
  • Participated in performance tuning using EXPLAIN PLAN.
  • Created numerous database triggers using PL/SQL.
  • Involved in technical documentation, unit testing, integration testing, writing the test plan, and version control with Confidential.
  • Created UNIX shell and Perl scripts for data file handling and manipulations.
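
For consistency with the Java sketches above, here is a minimal JDBC invocation of the kind of PL/SQL migration procedure described in this role. The package, procedure, and parameters are hypothetical stand-ins; the actual migration logic lived in the database and was written in PL/SQL.

import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Types;

// Runs one migration batch by calling a (hypothetical) PL/SQL packaged procedure.
public class RunMigration {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection(
                 "jdbc:oracle:thin:@db.example.com:1521:ORCL", "appuser", "secret");
             CallableStatement call = conn.prepareCall(
                 "{call migrate_pkg.migrate_batch(?, ?)}")) {
            call.setString(1, args[0]);                  // batch identifier (IN)
            call.registerOutParameter(2, Types.NUMERIC); // rows migrated (OUT)
            call.execute();
            System.out.println("Rows migrated: " + call.getLong(2));
        }
    }
}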

Environment: Oracle 9i/10g, SQL, PL/SQL, SQL*Plus, Oracle D2K, SQL*Loader.
