
Hadoop Developer Resume


Phoenix

SUMMARY

  • Hadoop Developer with 7 years of professional IT experience, including experience in the Big Data ecosystem and related technologies.
  • Good understanding of Hadoop architecture and its components, such as HDFS, Job Tracker, Task Tracker, Name Node, Data Node, and the MapReduce programming paradigm.
  • Experience in managing and reviewing Hadoop log files.
  • Experience in analyzing data using HiveQL, Pig Latin, HBase and custom MapReduce programs using Java.
  • Extending Hive and Pig core functionality by writing custom UDFs.
  • Experience in data management and implementation of Big Data applications using Hadoop frameworks.
  • Experience in understanding the security requirements for Hadoop and integrating with Kerberos authentication and authorization infrastructure.
  • Experience in designing, developing and implementing connectivity products that allow efficient exchange of data between our core database engine and the Hadoop ecosystem.
  • Experience in importing and exporting data using Sqoop from HDFS to Relational Database Systems and vice versa.
  • Hands on experience in application development using Java, RDBMS, and Linux shell scripting.
  • Strong experience as Java Developer in Web/intranet, client/server technologies using Java, J2EE, Servlets, JSP, JSF, EJB, JDBC and SQL.
  • Experience in Object Oriented Analysis, Design (OOAD) and development of software using UML Methodology.
  • Proficient in UNIX Shell scripting.
  • Excellent working knowledge of popular frameworks like Struts, Hibernate, and Spring MVC.
  • Experience in Agile Engineering practices.
  • Excellent interpersonal and communication skills, creative, research-minded, technically competent and result-oriented with problem solving and leadership skills.

TECHNICAL SKILLS

Big Data Technologies: Hadoop, HDFS, Hive, MapReduce, Pig, Sqoop, Flume, Zookeeper, Oozie, Accumulo, Avro, HBase

Languages: Java, J2EE, Hibernate, Spring, Guice, JPA, C/C++, Linux Script, SQL, Ruby, Python, Perl, AngularJS

Web Technologies: JavaScript, JSF, JSP, Servlets, Java Swing, Java Beans, JSON, EJB, JMS, HTML, XML, CSS

IDE: Eclipse, RSA, VMware, Apache

GUI: Visual Basic 5.0, Oracle, MS Office (Word, Excel, Outlook, PowerPoint, Access).

Browsers: Google Chrome, Mozilla Firefox, IE8

Testing Tools: JUnit, JMockit

Monitoring and Reporting: Ganglia, Nagios, custom scripts

Application Servers: IBM WebSphere 5.x/6.x, WebLogic 8.x/9.x, Tomcat 5.x, Jetty

DB Languages: SQL, PL/SQL

NoSQL Databases: HBase, MongoDB

Operating Systems: Linux/Unix, Windows XP, 7, Mac OS X

PROFESSIONAL EXPERIENCE

Confidential, Phoenix

Hadoop Developer

Responsibilities:

  • Building framework for storing and processing input data from various resources.
  • Ingesting high volume of transaction and notification data into HBase.
  • Providing additional flexibility to add/modify the existing attributes with ease.
  • Performing mandatory data and field validations on the incoming feed.
  • Performing data standardization on the successful data validated files and storing them in HBase.
  • Performing data analytics to derive profile-attributes using business rules.
  • Calculating positive and negative attributes and storing the final result into HBase.
  • Maintaining job status and configuration in a relational table (MySQL) for tracking and further processing.
  • Scheduling cron triggers for the application's MapReduce jobs using the Quartz framework.
  • Purging records older than business defined days and archiving those records into a file.
  • Involved in various life cycle phases, from requirement analysis to implementation.
  • Used JUnit for unit testing of the application.
  • Worked on code review comments after lead review and fixed the reported issues.
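
The validation and standardization steps described above can be sketched in plain Python. This is a hypothetical illustration only: the field names (`txn_id`, `amount`, etc.) and rules are illustrative assumptions, not the production ones.

```python
# Hypothetical sketch of per-record field validation and standardization
# for an incoming transaction feed. Field names and rules are illustrative.

REQUIRED_FIELDS = ("txn_id", "account_id", "amount", "txn_date")

def validate(record):
    """Return a list of validation errors; an empty list means the record passed."""
    errors = [f"missing field: {f}" for f in REQUIRED_FIELDS if not record.get(f)]
    if record.get("amount"):
        try:
            float(record["amount"])
        except ValueError:
            errors.append("amount is not numeric")
    return errors

def standardize(record):
    """Trim whitespace and normalize casing so downstream row keys are consistent."""
    return {k: v.strip().upper() if isinstance(v, str) else v
            for k, v in record.items()}
```

Records that pass `validate` would then be standardized and written to HBase; failures would be routed to a rejects file for review.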

Environment: Hadoop MapR, HBase, HDFS, Quartz, Java MapReduce, Maven, SVN, Jenkins, Unix, MySQL, Eclipse, PuTTY, FileZilla.

Confidential, Boston

Hadoop Developer

Responsibilities:

  • Launched and set up a Hadoop cluster on AWS, including configuration of the different Hadoop components.
  • Used Sqoop to connect to DB2 and move the pivoted data into Hive tables or Avro files.
  • Managed the Hive database, including data ingestion and indexing.
  • Exported data from Avro files and indexed the documents in SequenceFile or SerDe file formats.
  • Hands-on experience in writing custom UDFs as well as custom input and output formats.
  • Involved in the design and architecture of a custom Lucene storage handler.
  • Configured and maintained different topologies in the Storm cluster and deployed them on a regular basis.
  • Understanding of the Ruby scripts used to generate YAML files.
  • Monitored clusters using Nagios to send timely email for the alerts.
  • Involved in GUI development using Javascript and AngularJS and Guice.
  • Developed unit test cases using the JMockit framework and automated the scripts.
  • Hands on experience on Oozie workflow.
  • Worked in an Agile environment, using Jira to maintain story points under a Kanban model.
  • Maintained different cluster security settings and involved in creation and termination of multiple cluster environments.
  • Involved in brainstorming JAD sessions to design the GUI.
  • Hands-on experience maintaining builds in Bamboo and resolving Bamboo build failures.
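
The Nagios alerting mentioned above is typically backed by small custom check scripts (also listed under "Monitoring and Reporting"). Below is a hypothetical sketch of such a check in Python; the metric, thresholds, and naming are assumptions for illustration. Nagios plugins communicate state through their exit code: 0 = OK, 1 = WARNING, 2 = CRITICAL.

```python
# Hypothetical sketch of a custom Nagios-style check for HDFS capacity.
# Thresholds and the metric source are illustrative assumptions.
import sys

def check_capacity(used_pct, warn=75.0, crit=90.0):
    """Map a usage percentage onto Nagios status text and exit code."""
    if used_pct >= crit:
        return f"CRITICAL - HDFS {used_pct:.0f}% used", 2
    if used_pct >= warn:
        return f"WARNING - HDFS {used_pct:.0f}% used", 1
    return f"OK - HDFS {used_pct:.0f}% used", 0

if __name__ == "__main__" and len(sys.argv) > 1:
    # In a real check, the percentage would be parsed from cluster metrics
    # (e.g. the output of `hdfs dfsadmin -report`) rather than passed in.
    message, code = check_capacity(float(sys.argv[1]))
    print(message)
    sys.exit(code)
```

Nagios would run the script on a schedule and send the timely alert emails when the exit code goes non-zero.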

Environment: Hadoop Hortonworks, Big Data, Hive, HBase, Sqoop, Accumulo, Oozie, HDFS, MapReduce, Jira, Bitbucket, Maven, Bamboo, J2EE, Guice, AngularJS, JMockit, Lucene, Storm, Ruby, Unix, SQL, AWS (Amazon Web Services).

Confidential, Seattle, WA

Hadoop Developer

Responsibilities:

  • Responsible for building a system that ingests terabytes of data per day into Hadoop from a variety of data sources, providing high storage efficiency and an optimized layout for analytics.
  • Responsible for converting the Disney-wide online video and ad impression tracking system, the source of truth for billing, from a legacy stream-based architecture to a MapReduce architecture, reducing support effort.
  • Used Cloudera Crunch to develop data pipelines that ingest data from multiple sources and process it.
  • Used Sqoop to move the data from relational databases to HDFS.
  • Used Flume to move the data from web logs onto HDFS.
  • Used Pig to apply transformations, cleaning and deduplication of data from raw data sources.
  • Used MRUnit for unit testing.
  • Experienced in managing and reviewing Hadoop log files.
  • Created an ad hoc analytical job pipeline using Hive and Hadoop Streaming to compute various metrics and loaded them into HBase for downstream applications.
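
A Hadoop Streaming metric job like the one above pairs a mapper and a reducer that read and write tab-separated lines. The sketch below is hypothetical: the impression-log layout and the views-per-video metric are illustrative assumptions, and the functions take iterables so the logic can be exercised outside the cluster.

```python
# Hypothetical sketch of a Hadoop Streaming metric job in Python: the mapper
# emits one (key, 1) pair per log line and the reducer sums counts per key.
# In production these scripts would read sys.stdin and be launched with the
# hadoop-streaming JAR (-mapper mapper.py -reducer reducer.py ...).
from itertools import groupby

def mapper(lines):
    """Emit tab-separated (video_id, 1) for each impression log line."""
    for line in lines:
        video_id = line.split(",")[0].strip()
        if video_id:
            yield f"{video_id}\t1"

def reducer(sorted_pairs):
    """Sum counts per key; input must be sorted by key, as the Hadoop
    shuffle guarantees between the map and reduce phases."""
    keyed = (pair.split("\t") for pair in sorted_pairs)
    for key, group in groupby(keyed, key=lambda kv: kv[0]):
        yield f"{key}\t{sum(int(v) for _, v in group)}"
```

The reducer's output rows would then be loaded into HBase for the downstream applications.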

Environment: JDK 1.6, Red Hat Linux, HDFS, MapReduce, Hive, Pig, Sqoop, Flume, Zookeeper, Oozie, Python, Crunch, HBase, MRUnit.

Confidential

Java/J2EE Developer

Responsibilities:

  • Analyzed the requirements and communicated them to both the development and testing teams.
  • Involved in the designing of the project using UML.
  • Followed J2EE Specifications in the project.
  • Designed the user interface pages in JSP.
  • Used XML and XSL for mapping the fields in the database.
  • Used JavaScript for client side validations.
  • Created the stored procedures and triggers required for the project.
  • Created functions and views in Oracle.
  • Responsible for updating database tables and designing SQL queries using PL/SQL.
  • Created bean classes for communicating with database.
  • Involved in documentation of the module and project.
  • Prepared test cases and test scenarios as per business requirements.
  • Prepared coded applications for unit testing using JUnit.

Environment: Struts, Hibernate, Spring, EJB, JSP, Servlets, JMS, XML, JavaScript, UML, HTML, JNDI, CVS, Log4J, JUnit, Windows 2000, WebSphere Application Server, RAD, Rational Rose, Oracle 9i.

Confidential, San Antonio, TX

Java Developer

Responsibilities:

  • Performed in different phases of the Software Development Lifecycle (SDLC) of the application, including: requirements gathering, analysis, design, development and deployment of the application.
  • Model View Control (MVC) design pattern was implemented with Struts MVC, Servlets, JSP, HTML, AJAX, JavaScript, CSS to control the flow of the application in the Presentation/Web tier, Application/Business layer (JDBC) and Data layer (Oracle 10g).
  • Performed the analysis, design, and implementation of software applications using Java, J2EE, XML and XSLT.
  • Developed Action Forms and Controllers in Struts 2.0/1.2 framework.
  • Utilized various Struts features like Tiles, tag libraries, and declarative exception handling via XML for the design.
  • Created XML Schema, XML template and used XML SAX/DOM API to parse them.
  • Implemented design patterns like Data Access Objects (DAO), Value Objects/Data Transfer Objects (DTO), Singleton etc.
  • Developed JavaScript validations on order submission forms.
  • Designed, developed and maintained the data layer using Hibernate.
  • Used JUnit for unit testing of the application.
  • Used Apache Ant to compile Java classes and package them into a JAR archive.
  • Used Clear Quest to keep track of the application bugs as well as to coordinate with the Testing team.
  • Involved in tracking and resolving defects, which arise in QA & production environments.

Environment: Java, J2EE, JSP, Servlets, Struts 2.0/1.2, Hibernate, HTML, CSS, JavaScript, XML, JUnit, Apache Tomcat, PL/SQL, Oracle 11g, Apache Ant, Eclipse, Rational Rose.

Confidential

Java Developer

Responsibilities:

  • Performed in various phases of the Software Development Life Cycle (SDLC).
  • Designed and developed several multi-tiered J2EE applications and products per object-oriented architecture and SOA standards.
  • Developed user interfaces using the JSP framework with AJAX, JavaScript, HTML, XHTML, and CSS.
  • Performed the design and development of various modules using the CBD Navigator Framework.
  • Maintained an understanding of all project requirements as specified in use cases and requirements documents.
  • Actively participated in the design and development of the Home Page, Investment Products, and user maintenance screens for internal admin in the IAM and FAS applications, as per the UI prototype.
  • Used Hibernate framework to communicate with the DB2 database for various modules.
  • Used JavaScript for client-side validations and the JSF framework for server-side validation.
  • Used XSLT to transform XML documents into HTML templates.
  • Performed key role in designing and developing enterprise J2EE applications using RAD.
  • Used JProbe Framework for performance testing, code coverage.
  • Deployed J2EE applications on WebSphere Application Server by building and deploying the EAR file using an Ant script.
  • Used J2EE design patterns together with the Spring MVC framework.
  • Actively participated in the Agile Development Process.

Environment & Tools: Java 1.5, J2EE, Spring MVC, JSF, Java Beans, JDBC, HTML, XHTML, JavaScript, AJAX, XML, XSLT, CSS, Hibernate, RAD 7.0, UML, JUnit, JTest, JProbe, Serena PVCS, TeamTrack and DB2.
