
Hadoop Developer Resume


Plano, TX

SUMMARY

  • Hadoop Developer with 7+ years of professional IT experience, including experience in the Big Data ecosystem and related technologies.
  • Working experience building and supporting large-scale Hadoop environments, including design, configuration, installation, performance tuning, analytics, and monitoring.
  • Over three years of experience in Hadoop, Spark, Hive, Pig, Shark, Hue, Sqoop, Impala, HCatalog, Spring XD, Ganglia, and Hadoop Streaming, and in designing and implementing MapReduce jobs that process large data sets on a Hadoop cluster (see the sketch after this list).
  • Experience in analyzing data using HiveQL, Pig Latin, HBase, and custom MapReduce programs in Java.
  • Experience in managing and reviewing Hadoop Log files.
  • Experience with the Oozie Workflow Engine, running workflow jobs with actions that execute Hadoop MapReduce and Pig jobs.
  • Experience in architecting and creating Cassandra database systems.
  • Experience in Cassandra systems backup and recovery, Cassandra security.
  • Experience in Cassandra maintenance and tuning, covering both the database and the server.
  • Experience with web servers such as Apache Tomcat 5.0.
  • Experienced in working with Oracle 9i and MySQL databases.
  • Strong experience as a senior Java Developer in Web/intranet, client/server technologies using Java, J2EE, Servlets, JSP, JSF, EJB, JDBC and SQL.
  • Experienced in web-based applications and performance tuning in object-oriented systems.
  • Exposed to RESTful and SOAP web services and related concepts, including WSDL and UDDI.
  • Implemented SOA principles as part of developing SOAP-based web services.
  • In-depth knowledge and experience in developing presentation layers using HTML, CSS, AJAX, JavaScript, jQuery, and XML.
  • Experience in deploying business applications on application servers such as Java 2 Enterprise Server, WebLogic, JBoss, and Apache Tomcat, in environments such as Windows and Linux, using IDEs such as Eclipse and MyEclipse.
  • Experience in Object-Oriented Analysis and Design (OOAD) and software development using UML methodology, with good knowledge of J2EE and Core Java design patterns.
  • Working experience with testing tools such as JUnit.
  • Worked with software development models including the RUP methodology, the Waterfall model, and Agile.
  • Experience in administering, installing, configuring, troubleshooting, securing, backing up, performance monitoring, and fine-tuning Red Hat Linux.
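
As a sketch of the MapReduce work summarized above, the following minimal Java job counts occurrences of each token in its input. The class names and the counting task itself are illustrative assumptions, not taken from any specific project.

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class KeyCountJob {

    public static class TokenMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        protected void map(LongWritable offset, Text line, Context context)
                throws IOException, InterruptedException {
            for (String token : line.toString().split("\\s+")) {
                if (!token.isEmpty()) {
                    word.set(token);
                    context.write(word, ONE);   // emit (token, 1) for each token
                }
            }
        }
    }

    public static class SumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> counts, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable c : counts) {
                sum += c.get();                 // sum the partial counts for this key
            }
            context.write(key, new IntWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "key count");
        job.setJarByClass(KeyCountJob.class);
        job.setMapperClass(TokenMapper.class);
        job.setCombinerClass(SumReducer.class); // the combiner reuses the reducer logic
        job.setReducerClass(SumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}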

TECHNICAL SKILLS

Big Data Technologies: Hadoop, HDFS, Hive, MapReduce, Pig, Sqoop, Flume, ZooKeeper, Oozie, Accumulo, Avro, HBase, Spring XD, Kafka, Spark, Shark, Impala and Ganglia.

Languages: Java, J2EE, Hibernate, Spring, Guice, JPA, C/C++, Linux shell scripting, SQL, Ruby, Python, Perl, AngularJS

Web Technologies: JavaScript, JSF, AJAX, jQuery, JSP, Servlets, Java Swing, JavaBeans, JSON, EJB, JMS, HTML, XML, CSS

IDE: Eclipse, RSA, VMware, Apache

GUI: Visual Basic 5.0, Oracle, MS Office (Word, Excel, Outlook, PowerPoint, Access).

Browsers: Google Chrome, Mozilla Firefox, IE8, Safari

Testing Tools: JUnit, JMockit, EasyMock, PowerMock, JProfiler

Monitoring and Reporting: Ganglia, Nagios, custom scripts

Application Servers: IBM WebSphere 5.x/6.x, WebLogic 8.x/9.x, Tomcat 5.x, Jetty

DB Languages: SQL, PL/SQL

NoSQL Databases: HBase, MongoDB

Operating Systems: Linux/UNIX, Windows (all versions), Mac OS X

PROFESSIONAL EXPERIENCE

Confidential, Plano, TX

Hadoop Developer

Responsibilities:

  • Launched and set up a Hadoop cluster on AWS, including configuring the different Hadoop components.
  • Used Sqoop to connect to DB2 and move the pivoted data into Hive tables or Avro files.
  • Managed the Hive database, which involved ingesting and indexing data.
  • Exported the data from Avro files and indexed the documents in SequenceFile or SerDe file formats.
  • Hands-on experience writing custom UDFs as well as custom input and output formats (see the Hive UDF sketch after this list).
  • Involved in the design and architecture of a custom Lucene storage handler.
  • Configured and maintained different topologies in the Storm cluster and deployed them on a regular basis.
  • Understanding of the Ruby scripts used to generate YAML files.
  • Monitored clusters using Nagios, which sends timely email alerts.
  • Involved in GUI development using JavaScript, AngularJS, and Guice.
  • Developed unit test cases using the JMockit framework and automated the scripts (a test sketch follows this project's environment list).
  • Hands-on experience with Oozie workflows.
  • Worked in an Agile environment that used Jira to track story points and followed the Kanban model.
  • Maintained different cluster security settings and was involved in the creation and termination of multiple cluster environments.
  • Involved in brainstorming and JAD sessions to design the GUI.
  • Maintained builds in Bamboo and resolved Bamboo build failures.
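
A minimal sketch of a custom Hive UDF of the kind mentioned above, assuming the classic org.apache.hadoop.hive.ql.exec.UDF API; the masking logic and class name are illustrative only. Once packaged into a JAR, it would be registered in HiveQL with ADD JAR and CREATE TEMPORARY FUNCTION before use.

import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

// Illustrative UDF: masks all but the last four characters of a string column.
public class MaskUDF extends UDF {
    public Text evaluate(Text input) {
        if (input == null) {
            return null;                        // pass NULLs through, as Hive expects
        }
        String s = input.toString();
        int visible = Math.min(4, s.length());
        StringBuilder masked = new StringBuilder();
        for (int i = 0; i < s.length() - visible; i++) {
            masked.append('*');                 // mask everything but the tail
        }
        masked.append(s.substring(s.length() - visible));
        return new Text(masked.toString());
    }
}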

Environment: Hadoop, Big Data, Hive, HBase, Sqoop, Accumulo, Oozie, HDFS, MapReduce, Jira, Bitbucket, Maven, Bamboo, J2EE, Guice, AngularJS, JMockit, Lucene, Storm, Ruby, Unix, SQL, AWS (Amazon Web Services), Hortonworks.
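
A minimal sketch of a JMockit-based unit test like those mentioned in this project, assuming JMockit's JUnit 4 integration; RateService and RateClient are hypothetical classes defined inline purely for illustration.

import mockit.Expectations;
import mockit.Injectable;
import mockit.Tested;
import mockit.integration.junit4.JMockit;
import org.junit.Test;
import org.junit.runner.RunWith;
import static org.junit.Assert.assertEquals;

@RunWith(JMockit.class)
public class RateServiceTest {

    // Hypothetical collaborator and class under test, defined inline
    // so the sketch is self-contained.
    static class RateClient {
        double currentRate(String currency) { return 1.0; }
    }

    static class RateService {
        private final RateClient client;
        RateService(RateClient client) { this.client = client; }
        double convert(double amount, String currency) {
            return amount * client.currentRate(currency);
        }
    }

    @Tested RateService service;      // JMockit instantiates the class under test
    @Injectable RateClient client;    // and injects this mock via the constructor

    @Test
    public void convertsUsingTheMockedRate() {
        new Expectations() {{
            client.currentRate("EUR"); result = 1.25;   // stub the collaborator
        }};
        assertEquals(125.0, service.convert(100.0, "EUR"), 1e-9);
    }
}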

Confidential, Plano, TX

Hadoop Developer

Responsibilities:

  • Involved in the complete software development life cycle (SDLC) of the application.
  • Worked on analyzing the Hadoop cluster and different big data analytic tools, including Pig, HBase, and Sqoop.
  • Involved in loading data from the Linux file system to HDFS.
  • Exported the analyzed data to relational databases using Sqoop for visualization and to generate reports for the BI team.
  • Imported and exported data into HDFS and Hive using Sqoop.
  • Implemented test scripts to support test-driven development and continuous integration.
  • Installed and configured Hadoop MapReduce and HDFS (Hadoop Distributed File System), and developed multiple MapReduce jobs in Java for data cleaning and preprocessing.
  • Created Pig Latin scripts to sort, group, join, and filter the enterprise-wide data.
  • Involved in creating Hive tables, loading them with data, and writing Hive queries that run internally as MapReduce jobs (see the JDBC sketch after this list).
  • Supported MapReduce programs running on the cluster.
  • Analyzed large data sets by running Hive queries and Pig scripts.
  • Worked on tuning the performance of Pig queries.
  • Mentored the analyst and test teams in writing Hive queries.
  • Installed the Oozie workflow engine to run multiple MapReduce jobs.
  • Worked with application teams to install operating system and Hadoop updates, patches, and version upgrades as required.
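
A hedged sketch of running one of the Hive queries described above from Java over the HiveServer2 JDBC driver; the host name, table, and query are illustrative assumptions, not from the actual project.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveQueryExample {
    public static void main(String[] args) throws Exception {
        // Register the HiveServer2 JDBC driver.
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        try (Connection conn = DriverManager.getConnection(
                     "jdbc:hive2://hive-host:10000/default", "user", "");
             Statement stmt = conn.createStatement()) {
            // An aggregation like this runs internally as a MapReduce job.
            ResultSet rs = stmt.executeQuery(
                "SELECT region, COUNT(*) FROM sales GROUP BY region");
            while (rs.next()) {
                System.out.println(rs.getString(1) + "\t" + rs.getLong(2));
            }
        }
    }
}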

Environment: Hadoop, HDFS, MapReduce, Hive, Pig, Sqoop, Linux, Java, Oozie, HBase

Confidential, Westbrook, ME

Java/J2EE Developer

Responsibilities:

  • Analyzed the requirements and communicated them to both the development and testing teams.
  • Involved in the designing of the project using UML.
  • Followed J2EE Specifications in the project.
  • Designed the user interface pages in JSP.
  • Used XML and XSL for mapping the fields in the database.
  • Used JavaScript for client-side validations.
  • Created the stored procedures and triggers required for the project (see the sketch after this list).
  • Created functions and views in Oracle. Responsible for updating database tables and designing SQL queries using PL/SQL.
  • Created bean classes for communicating with the database.
  • Involved in documentation of the module and project.
  • Prepared test cases and test scenarios as per business requirements.
  • Prepared application code for unit testing using JUnit.
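
A minimal sketch of a bean class invoking an Oracle stored procedure over JDBC, in the spirit of the stored-procedure and bean work above; the connection details and the GET_BALANCE procedure are hypothetical.

import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Types;

public class AccountBean {
    private final String url = "jdbc:oracle:thin:@db-host:1521:ORCL";

    public double fetchBalance(long accountId) throws Exception {
        try (Connection conn = DriverManager.getConnection(url, "user", "password");
             CallableStatement call = conn.prepareCall("{call GET_BALANCE(?, ?)}")) {
            call.setLong(1, accountId);                  // IN parameter
            call.registerOutParameter(2, Types.NUMERIC); // OUT parameter
            call.execute();
            return call.getDouble(2);                    // read the OUT value
        }
    }
}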

Environment: Struts, Hibernate, Spring, EJB, JSP, Servlets, JMS, XML, JavaScript, UML, HTML, JNDI, CVS, Log4j, JUnit, Windows 2000, WebSphere Application Server, RAD, Rational Rose, Oracle 9i

Confidential, San Antonio, TX

Java Developer

Responsibilities:

  • Participated in the different phases of the Software Development Life Cycle (SDLC) of the application, including requirements gathering, analysis, design, development, and deployment.
  • Implemented the Model-View-Controller (MVC) design pattern with Struts MVC, Servlets, JSP, HTML, AJAX, JavaScript, and CSS to control the flow of the application across the presentation/web tier, the application/business layer (JDBC), and the data layer (Oracle 10g).
  • Performed the analysis, design, and implementation of software applications using Java, J2EE, XML and XSLT.
  • Developed Action Forms and Controllers in Struts 2.0/1.2 framework.
  • Utilized various Struts features such as Tiles, tag libraries, and declarative exception handling via XML in the design.
  • Created XML Schemas and XML templates and used the XML SAX/DOM APIs to parse them.
  • Implemented design patterns such as Data Access Object (DAO), Value Object/Data Transfer Object (DTO), and Singleton (see the DAO sketch after this list).
  • Developed JavaScript validations on order submission forms.
  • Designed, developed and maintained the data layer using Hibernate.
  • JUnit was used for unit testing the application.
  • Used Apache Ant to compile Java classes and package them into a JAR archive.
  • Used ClearQuest to track application bugs and to coordinate with the testing team.
  • Involved in tracking and resolving defects arising in the QA and production environments.
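
A minimal sketch of the DAO/DTO pattern mentioned above; OrderDTO, the orders table, and the query are illustrative assumptions. The DAO isolates JDBC access behind one class, while the DTO carries data across application layers.

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import javax.sql.DataSource;

// DTO: a plain data carrier with no persistence logic.
class OrderDTO {
    long id;
    String status;
}

// DAO: encapsulates all database access for orders.
public class OrderDAO {
    private final DataSource dataSource;

    public OrderDAO(DataSource dataSource) {
        this.dataSource = dataSource;
    }

    public OrderDTO findById(long id) throws SQLException {
        String sql = "SELECT id, status FROM orders WHERE id = ?";
        try (Connection conn = dataSource.getConnection();
             PreparedStatement ps = conn.prepareStatement(sql)) {
            ps.setLong(1, id);
            try (ResultSet rs = ps.executeQuery()) {
                if (!rs.next()) {
                    return null;                 // no matching row
                }
                OrderDTO dto = new OrderDTO();   // map the row into the DTO
                dto.id = rs.getLong("id");
                dto.status = rs.getString("status");
                return dto;
            }
        }
    }
}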

Environment: Java, J2EE, JSP, Servlets, Struts 2.0/1.2, Hibernate, HTML, CSS, JavaScript, XML, JUnit, Apache Tomcat, PL/SQL, Oracle 11g, Apache Ant, Eclipse, Rational Rose
