Hadoop Sr Developer Resume

Livonia, MI

SUMMARY

  • 8+ years of extensive IT experience in analysis, design, development, and implementation of software applications, including 3+ years of Big Data experience with Hadoop, HDFS, Hive, Pig, Sqoop, Oozie, and MapReduce programming.
  • Capable of processing large sets of structured, semi-structured, and unstructured data and of supporting systems and application architecture.
  • Experienced in importing and exporting data into HDFS and Hive (see the brief HDFS sketch at the end of this summary).
  • Involved in creating Hive tables, loading them with data, and writing Hive queries.
  • Experienced in writing Pig scripts.
  • Experienced in managing and reviewing Hadoop log files and in deploying and maintaining Hadoop clusters.
  • Excellent understanding of Hadoop architecture and its components, such as HDFS, JobTracker, TaskTracker, NameNode, DataNode, and the MapReduce programming paradigm.
  • Experience in providing support to data analysts in running Pig and Hive queries.
  • Successfully loaded files to Hive and HDFS from Oracle.
  • Experience in working with different data sources such as Avro data files, XML files, JSON files, SQL Server, and Oracle to load data into Hive tables.
  • Worked on performance tuning of Hadoop jobs by applying techniques such as map-side joins, partitioning, and bucketing.
  • Extensive knowledge of NoSQL databases such as MongoDB and Cassandra.
  • Configured and installed Hadoop and Hadoop ecosystem components (Hive/Pig).
  • Installed and configured Hive, Pig, Sqoop, Flume, and Oozie on the Hadoop cluster.
  • Experience utilizing Java technologies in business, web, and client-server environments, including JDBC, Servlets, JSP, the Struts framework, Jasper Reports, and SQL.
  • Worked primarily in the banking and telecommunication domains; main area of experience has been the development of web application projects.
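
Illustrative only: a minimal sketch of an HDFS import/export step of the kind summarized above, written against the Hadoop FileSystem Java API. The NameNode URI and the local/HDFS paths are placeholders, not values from an actual engagement.

    import java.net.URI;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class HdfsCopySketch {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Placeholder NameNode URI; in a real cluster this comes from core-site.xml.
            FileSystem fs = FileSystem.get(URI.create("hdfs://namenode:8020"), conf);

            // Import: copy a local extract into HDFS for downstream Hive/Pig processing.
            fs.copyFromLocalFile(new Path("/tmp/extract/customers.csv"),
                                 new Path("/data/staging/customers/"));

            // Export: pull a processed result back to the local file system.
            fs.copyToLocalFile(new Path("/data/refined/part-r-00000"),
                               new Path("/tmp/export/"));

            fs.close();
        }
    }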

TECHNICAL SKILLS

Platforms: Windows (2000/XP), Linux

Big Data Ecosystems: Hadoop, MapReduce, HDFS, Hive, Pig, Sqoop, Oozie, Flume

Programming Languages: Java

Scripting Languages: JSP 2.0, Servlets 2.3, JavaScript, HTML

Databases: Oracle 8.x/9i, NoSQL

Frameworks: Struts 1.2

Tools: MyEclipse 5.1, Ant build tool, iReport 3.0, Hudson

Servers: JBoss, Tomcat 5.0, J2EE 1.3.1, BEA WebLogic 8.1, WebSphere 6

Methodologies: UML, Design Patterns

Concepts: JMS, Jasper Reports

PROFESSIONAL EXPERIENCE

Confidential, Livonia, MI

Hadoop Sr. Developer

Responsibilities:

  • Involved in importing and exporting data into HDFS and Hive.
  • Involved in writing queries for external and internal tables in Hive.
  • Experienced in managing and reviewing Hadoop log files.
  • Experience in working with various kinds of data sources such as Oracle.
  • Successfully loaded files to Hive and HDFS from Oracle.
  • Responsible for managing data coming from different sources.
  • Supported MapReduce programs running on the cluster.
  • Involved in creating Hive tables, loading them with data, and writing Hive queries.
  • Loaded and transformed large sets of structured, semi-structured, and unstructured data.
  • Worked on performance tuning of Hadoop jobs by applying techniques such as map-side joins, partitioning, and bucketing.
  • Developed MapReduce programs to parse the raw data, populate staging tables, and store the refined data in partitioned tables (see the illustrative sketch after this list).
  • Managed and reviewed Hadoop log files; deployed and maintained the Hadoop cluster.
  • Configured and installed Hadoop and Hadoop ecosystem components (Hive/Pig).
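
Illustrative only: a compact sketch of the staging-to-partitioned-table flow referenced in the list above, issued through the HiveServer2 JDBC driver. The connection URL, credentials, table and column names, and bucket count are assumptions, and a staging_events table is assumed to already exist.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    public class HivePartitionedLoadSketch {
        public static void main(String[] args) throws Exception {
            // HiveServer2 JDBC driver; the URL and credentials below are placeholders.
            Class.forName("org.apache.hive.jdbc.HiveDriver");
            Connection con = DriverManager.getConnection(
                    "jdbc:hive2://hiveserver:10000/default", "hive", "");
            Statement stmt = con.createStatement();

            // Refined table: partitioned by load date and bucketed on customer_id
            // so that joins on customer_id can be converted to map-side joins.
            stmt.execute("CREATE TABLE IF NOT EXISTS refined_events ("
                    + " event_id BIGINT, customer_id BIGINT, event_type STRING)"
                    + " PARTITIONED BY (load_date STRING)"
                    + " CLUSTERED BY (customer_id) INTO 32 BUCKETS");

            // Enable map-side join conversion and dynamic partition inserts.
            stmt.execute("SET hive.auto.convert.join=true");
            stmt.execute("SET hive.exec.dynamic.partition=true");
            stmt.execute("SET hive.exec.dynamic.partition.mode=nonstrict");

            // Move parsed rows from the (assumed) staging table into date partitions.
            stmt.execute("INSERT OVERWRITE TABLE refined_events PARTITION (load_date)"
                    + " SELECT event_id, customer_id, event_type, load_date"
                    + " FROM staging_events");

            stmt.close();
            con.close();
        }
    }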

Environment: Hadoop, HDFS, Hive, Pig, MapReduce, Sqoop, Oozie.

Confidential, San Francisco, CA

Hadoop Developer

Responsibilities:

  • Responsible for building scalable distributed data solutions using Hadoop.
  • Handled importing of data from various data sources, performed transformations using Hive and MapReduce, loaded data into HDFS, and extracted data from MySQL into HDFS using Sqoop.
  • Installed and configured Hive, Pig, Sqoop, Flume, and Oozie on the Hadoop cluster.
  • Setup and benchmarked Hadoop clusters for internal use.
  • Developed simple to complex MapReduce jobs using the Java programming language that were implemented alongside Hive and Pig.
  • Optimized MapReduce jobs to use HDFS efficiently by applying various compression mechanisms.
  • Analyzed the data by running Hive queries (HiveQL) and Pig scripts (Pig Latin) to study customer behavior.
  • Used UDFs to implement business logic in Hadoop.
  • Implemented business logic by writing UDFs in Java and used various UDFs from other sources (see the illustrative UDF sketch after this list).
  • Experienced in loading and transforming large sets of structured and semi-structured data.
  • Managed and reviewed Hadoop log files; deployed and maintained the Hadoop cluster.
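
Illustrative only: a minimal example of the kind of Hive UDF written in Java mentioned in the list above. The class name and the normalization rule are hypothetical.

    import org.apache.hadoop.hive.ql.exec.UDF;
    import org.apache.hadoop.io.Text;

    // Trims and upper-cases a string column. Registered in Hive with, for example:
    //   ADD JAR normalize-udf.jar;
    //   CREATE TEMPORARY FUNCTION normalize_code AS 'NormalizeCode';
    public class NormalizeCode extends UDF {
        public Text evaluate(Text input) {
            if (input == null) {
                return null;   // preserve NULLs
            }
            return new Text(input.toString().trim().toUpperCase());
        }
    }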

Environment: Hadoop, HDFS, Hive, Pig, MapReduce, Sqoop, Oozie, Eclipse, Java.

Confidential, Houston, Texas

Hadoop Developer

Responsibilities:

  • Created Hive queries that helped market analysts spot emerging trends by comparing fresh data with EDW reference tables and historical metrics.
  • Enabled speedy reviews and first-mover advantages by using Oozie to automate data loading into the Hadoop Distributed File System and Pig to pre-process the data.
  • Provided design recommendations and thought leadership to sponsors/stakeholders that improved review processes and resolved technical problems.
  • Managed and reviewed Hadoop log files.
  • Tested raw data and executed performance scripts.
  • Shared responsibility for administration of Hadoop, Hive and Pig.
  • Installed and configured MapReduce, Hive, and HDFS; implemented a CDH3 Hadoop cluster on CentOS. Assisted with performance tuning and monitoring.
  • Created Hive tables to load large sets of structured, semi-structured and unstructured data coming from UNIX, NoSQL and a variety of portfolios.
  • Supported code/design analysis, strategy development and project planning.
  • Created reports for the BI team using Sqoop to export data into HDFS and Hive.
  • Developed multiple MapReduce jobs in Java for data cleaning and preprocessing (see the illustrative job sketch after this list).
  • Assisted with data capacity planning and node forecasting.
  • Collaborated with the infrastructure, network, database, application, and BI teams to ensure data quality and availability.
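
Illustrative only: a cut-down sketch of a data-cleaning MapReduce job in Java like those described above, using the mapreduce API of that era. The pipe-delimited three-field layout and the input/output paths are assumptions.

    import java.io.IOException;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.NullWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class CleanRecordsJob {

        // Map-only cleaning pass: drop malformed rows and re-emit valid ones trimmed.
        public static class CleanMapper
                extends Mapper<LongWritable, Text, NullWritable, Text> {
            @Override
            protected void map(LongWritable key, Text value, Context context)
                    throws IOException, InterruptedException {
                // Assumed layout: id|name|amount (pipe-delimited).
                String[] fields = value.toString().split("\\|");
                if (fields.length != 3 || fields[0].trim().isEmpty()) {
                    context.getCounter("clean", "bad_records").increment(1);
                    return;   // skip malformed record
                }
                String cleaned = fields[0].trim() + "|"
                        + fields[1].trim() + "|" + fields[2].trim();
                context.write(NullWritable.get(), new Text(cleaned));
            }
        }

        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            Job job = new Job(conf, "clean-records");   // older-API constructor (CDH3 era)
            job.setJarByClass(CleanRecordsJob.class);
            job.setMapperClass(CleanMapper.class);
            job.setNumReduceTasks(0);                   // no reduce phase needed
            job.setOutputKeyClass(NullWritable.class);
            job.setOutputValueClass(Text.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }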

Environment: Hadoop, HDFS, Hive, Pig, MapReduce, Sqoop, Oozie, Eclipse, Java.

Confidential

Sr. Java Developer

Responsibilities:

  • Developed the presentation layer using JSP, HTML, and CSS, and client-side validations using JavaScript.
  • Involved in the design and development of the e-commerce site using JSP, Servlets, and JDBC.
  • Used Eclipse 6.0 as the IDE for application development.
  • Validated all forms using the Struts validation framework and implemented the Tiles framework in the presentation layer.
  • Configured the Struts framework to implement the MVC design pattern.
  • Designed and developed GUI using JSP, HTML, DHTML and CSS.
  • Worked with JMS for the messaging interface (see the brief JMS sketch after this list).
  • Prepared the Detailed Technical Design (DTD) document.
  • Handled development and deployment.
  • Prepared UTPs.
  • Fixed internal and external review comments.
  • Fixed bugs.
  • Prepared release documents such as the Code Review Completion Report, Implementation Plan, Support Handover, Release Notes, TVR, and Induction Kit.
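
Illustrative only: a bare-bones sketch of the JMS messaging interface mentioned in the list above, written against the standard JMS 1.1 API. The JNDI names and the message payload are placeholders, and the lookup assumes the application server's JNDI environment is available.

    import javax.jms.Connection;
    import javax.jms.ConnectionFactory;
    import javax.jms.MessageProducer;
    import javax.jms.Queue;
    import javax.jms.Session;
    import javax.jms.TextMessage;
    import javax.naming.InitialContext;

    public class OrderMessageSender {
        public static void main(String[] args) throws Exception {
            // JNDI lookups; the names below are placeholders for the server's resources.
            InitialContext ctx = new InitialContext();
            ConnectionFactory factory = (ConnectionFactory) ctx.lookup("jms/ConnectionFactory");
            Queue queue = (Queue) ctx.lookup("jms/OrderQueue");

            Connection connection = factory.createConnection();
            Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
            MessageProducer producer = session.createProducer(queue);

            // Send a simple text message; a real payload would typically be XML.
            TextMessage message = session.createTextMessage("ORDER-12345:CREATED");
            producer.send(message);

            producer.close();
            session.close();
            connection.close();
        }
    }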

Environment: Core Java, JSP, HTML, Struts framework, JavaScript, JDBC, XML, JMS, XSLT, UML, JUnit, Log4j.

Confidential

Java/ J2EE Developer

Responsibilities:

  • Involved in discussions and workshops with TTSL business teams.
  • Involved in impact analysis and solution design.
  • Played a major role in deploying the application.
  • Implemented client-side validations.
  • Developed the required DB objects to handle different validations.
  • Designed and developed the screens according to the business requirements using Servlets and JSP.
  • Developed Ant scripts for compilation and deployment.
  • Performed unit testing using JUnit (see the short test sketch after this list).
  • Extensively used Log4j for application logging.
  • Used Subversion as the version control system.
  • Used CVS as a repository for managing/deploying application code.
  • Supported and resolved critical production issues, and was involved in deploying applications to the production environment.
  • Followed the complete software development life cycle: requirements gathering from the business, detailed analysis, conceptual and detailed design, development, and testing.
  • Used MyEclipse for development, debugging, and deployment of the application.
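
Illustrative only: a short sketch of the JUnit-plus-Log4j usage noted in the list above. The class under test and its surcharge rule are hypothetical stand-ins.

    import org.apache.log4j.Logger;
    import org.junit.Assert;
    import org.junit.Test;

    public class TariffCalculatorTest {

        private static final Logger LOG = Logger.getLogger(TariffCalculatorTest.class);

        // Hypothetical class under test: applies a 10% surcharge to a base amount.
        static class TariffCalculator {
            double applySurcharge(double baseAmount) {
                return baseAmount * 1.10;
            }
        }

        @Test
        public void surchargeIsTenPercent() {
            TariffCalculator calculator = new TariffCalculator();
            double result = calculator.applySurcharge(100.0);
            LOG.debug("Surcharge result: " + result);
            Assert.assertEquals(110.0, result, 0.001);
        }
    }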

Environment: Core Java, JSP, HTML, Struts framework, JavaScript, JDBC, Servlets, Log4j, SQL.

Confidential 

Java Developer

Responsibilities:

  • Designed and developed GUI using JSP, HTML, DHTML and CSS.
  • Developed the presentation layer using JSP, HTML, and CSS, and client-side validations using JavaScript.
  • Involved in the design and development of the site using JSP, Servlets, JavaScript, and JDBC (see the servlet sketch after this list).
  • Used Eclipse 6.0 as the IDE for application development.
  • Validated all forms using the Struts validation framework.
  • Configured the Struts framework to implement the MVC design pattern.
  • Worked with JMS for messaging interface.
  • Deployed the entire project on the WebLogic application server.
  • Used AJAX for interactive user operations and client side validations.
  • Used XML for ORM mapping between the Java classes and the database.
  • Used XSL transforms on certain XML data.
  • Developed Ant scripts for compilation and deployment.
  • Performed unit testing using JUnit.
  • Extensively used Log4j for application logging.
  • Used Subversion as the version control system.
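
Illustrative only: a trimmed-down sketch of the servlet-plus-JDBC page flow described in the list above. The DataSource JNDI name, the SQL, and the JSP view name are placeholders; raw collection types are used to match the Java 1.4 environment listed below.

    import java.io.IOException;
    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.util.ArrayList;
    import java.util.List;

    import javax.naming.InitialContext;
    import javax.servlet.ServletException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;
    import javax.sql.DataSource;

    public class ProductListServlet extends HttpServlet {

        protected void doGet(HttpServletRequest request, HttpServletResponse response)
                throws ServletException, IOException {
            List products = new ArrayList();
            try {
                // Container-managed DataSource; the JNDI name is a placeholder.
                InitialContext ctx = new InitialContext();
                DataSource ds = (DataSource) ctx.lookup("java:comp/env/jdbc/AppDS");
                Connection con = ds.getConnection();
                PreparedStatement ps = con.prepareStatement(
                        "SELECT product_name FROM products WHERE active = 1");
                ResultSet rs = ps.executeQuery();
                while (rs.next()) {
                    products.add(rs.getString("product_name"));
                }
                rs.close();
                ps.close();
                con.close();
            } catch (Exception e) {
                throw new ServletException("Product lookup failed", e);
            }
            // Hand the result to a JSP view for rendering.
            request.setAttribute("products", products);
            request.getRequestDispatcher("/productList.jsp").forward(request, response);
        }
    }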

Environment: Java 1.4, HTML, CSS, DHTML, JDBC, XML, Oracle, JavaScript, JSP, Ant, SVN, JBoss, Eclipse.
