
Hadoop Developer Resume Profile


Salem, NH

PROFESSIONAL SUMMARY

  • Over 7 years of IT experience as a Developer, Designer, and quality reviewer, with cross-platform integration experience using Hadoop, Java, and J2EE.
  • Hands-on experience with Hadoop, HDFS, MapReduce, and Hadoop ecosystem components such as Pig, Hive, Oozie, Flume, and HBase.
  • Hands-on experience using Cloudera and Hortonworks Hadoop distributions.
  • Hands-on experience installing, configuring, and using Apache Hadoop ecosystem components such as MapReduce, Hive, Pig, Sqoop, Flume, and Oozie.
  • Strong understanding of various Hadoop services, MapReduce and YARN architecture.
  • Responsible for writing MapReduce programs.
  • Experienced in importing and exporting data to and from HDFS using Sqoop.
  • Loaded log data into HDFS using Flume.
  • Experience loading data into Hive partitions and creating buckets in Hive.
  • Logical implementation of and interaction with HBase.
  • Developed MapReduce jobs to automate data transfer from HBase.
  • Expertise in data analysis using Pig, Hive, and Sqoop, and in automation using Puppet.
  • Worked in multiple environments on installation and configuration.
  • Experienced in developing UDFs for Hive using Java (a minimal sketch appears after this summary).
  • Strong understanding of NoSQL databases such as HBase, MongoDB, and Cassandra.
  • Familiar with handling complex data processing workflows using Oozie.
  • Scheduled Hadoop, Hive, Sqoop, and HBase jobs using Oozie.
  • Experience setting up clusters on Amazon EC2 and S3, including automating cluster setup and extension in the AWS cloud.
  • Developed core modules in large cross-platform applications using Java, J2EE, Spring, Struts, Hibernate, JAX-WS web services, and JMS.
  • Worked with debugging tools such as DTrace, truss, and top. Expert in setting up SSH, SCP, and SFTP connectivity between UNIX hosts.
  • Good understanding of Scrum methodologies, Test Driven Development and continuous integration.
  • Major strengths include familiarity with multiple software systems and the ability to learn new technologies and adapt to new environments quickly; a self-motivated, focused, and adaptive team player and quick learner with excellent interpersonal, technical, and communication skills.
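
To illustrate the Hive UDF experience noted above, the following is a minimal sketch of a Hive UDF in Java. It assumes the classic org.apache.hadoop.hive.ql.exec.UDF base class; the class name and the normalization rule are illustrative placeholders rather than code from any project listed here.

    // Minimal Hive UDF sketch (illustrative; assumes the classic UDF base class).
    import org.apache.hadoop.hive.ql.exec.UDF;
    import org.apache.hadoop.io.Text;

    public final class NormalizeZip extends UDF {
        // Hive calls evaluate() once per row; returning null propagates a SQL NULL.
        public Text evaluate(Text input) {
            if (input == null) {
                return null;
            }
            String digits = input.toString().replaceAll("[^0-9]", "");
            return digits.length() >= 5 ? new Text(digits.substring(0, 5)) : null;
        }
    }

A function like this would typically be packaged into a JAR, added to the Hive session with ADD JAR, and registered with CREATE TEMPORARY FUNCTION before being called from HiveQL.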

TECHNICAL SKILLS

Hadoop/Big Data : HDFS, MapReduce, Hive, Pig, Sqoop, Flume, Oozie, and ZooKeeper.

NoSQL Databases : HBase, Cassandra, MongoDB

Languages : C, C++, Java, J2EE, PL/SQL, Pig Latin, HiveQL, UNIX shell scripts

Java/J2EE Technologies : Applets, Swing, JDBC, JNDI, JSON, JSTL, RMI, JMS, JavaScript, JSP, Servlets, EJB, JSF, jQuery

Frameworks : MVC, Struts, Spring, Hibernate

Operating Systems : Sun Solaris, HP-UX, Red Hat Linux, Ubuntu Linux and Windows XP/Vista/7/8

Web Technologies : HTML, DHTML, XML, AJAX, WSDL, SOAP

Web/Application servers : Apache Tomcat, WebLogic, JBoss

Databases : Oracle 9i/10g/11g, DB2, SQL Server, MySQL, Teradata

Tools and IDE : Eclipse, NetBeans, Toad, Maven, ANT, Hudson, Sonar, JDeveloper, Assent PMD, DB Visualizer

Network Protocols : TCP/IP, UDP, HTTP, DNS, DHCP

PROFESSIONAL EXPERIENCE

Confidential

Hadoop Developer

Responsibilities

  • Worked on analyzing the Hadoop cluster and different big data analytic tools, including Pig, HBase, and Sqoop.
  • Responsible for building scalable distributed data solutions using Hadoop.
  • Implemented a nine-node CDH3 Hadoop cluster on Red Hat Linux.
  • Involved in loading data from the Linux file system into HDFS.
  • Worked on cluster installation, commissioning and decommissioning of DataNodes, NameNode recovery, capacity planning, and slot configuration.
  • Created HBase tables to store variable data formats of PII data coming from different portfolios (a minimal Java client sketch appears after this list).
  • Implemented a script to transmit sysprin information from Oracle to HBase using Sqoop.
  • Implemented best income logic using Pig scripts and UDFs.
  • Implemented test scripts to support test driven development and continuous integration.
  • Worked on tuning the performance of Pig queries.
  • Worked with application teams to install operating system and Hadoop updates, patches, and version upgrades as required.
  • Responsible for managing data coming from different sources.
  • Involved in loading data from UNIX file system to HDFS.
  • Loaded and transformed large sets of structured, semi-structured, and unstructured data.
  • Provided cluster coordination services through ZooKeeper.
  • Experience in managing and reviewing Hadoop log files.
  • Managed jobs using the Fair Scheduler.
  • Exported the analyzed data to the relational databases using Sqoop for visualization and to generate reports for the BI team.
  • Responsible for cluster maintenance, adding and removing cluster nodes, cluster monitoring and troubleshooting, manage and review data backups, manage and review Hadoop log files.
  • Installed the Oozie workflow engine to run multiple Hive and Pig jobs.
  • Analyzed large amounts of data sets to determine optimal way to aggregate and report on it.
  • Supported setting up the QA environment and updating configurations for implementing scripts with Pig and Sqoop.
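
As a companion to the HBase work above, here is a hedged sketch of writing a row to an HBase table from Java. It assumes the older HTable/Put client API that shipped with CDH3-era HBase; the table, column family, qualifier, and values are illustrative placeholders, not details from the project.

    // Illustrative HBase put (assumes the classic HTable/Put client API; names are placeholders).
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.client.HTable;
    import org.apache.hadoop.hbase.client.Put;
    import org.apache.hadoop.hbase.util.Bytes;

    public class PortfolioWriter {
        public static void main(String[] args) throws Exception {
            Configuration conf = HBaseConfiguration.create();  // picks up hbase-site.xml from the classpath
            HTable table = new HTable(conf, "portfolio_pii");   // illustrative table name
            try {
                Put put = new Put(Bytes.toBytes("account#0001"));  // row key
                put.add(Bytes.toBytes("d"), Bytes.toBytes("sysprin"), Bytes.toBytes("ABC123"));
                table.put(put);
            } finally {
                table.close();
            }
        }
    }

Newer HBase releases replace HTable with the Connection/Table interfaces, so the exact calls depend on the cluster version.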

Environment

Hadoop, HDFS, Pig, Sqoop, HBase, Shell Scripting, Ubuntu, Red Hat Linux.

Confidential

Hadoop Developer

Responsibilities

  • Installed and configured Hadoop MapReduce and HDFS, and developed multiple MapReduce jobs in Java for data cleansing and preprocessing (a minimal sketch appears after this list).
  • Involved in setting up a multi-node cluster in the Amazon cloud by creating instances on Amazon EC2.
  • Created MapReduce jobs on Amazon Elastic MapReduce (EMR).
  • Involved in loading data from UNIX file system to HDFS.
  • Installed and configured Hive and wrote Hive UDFs.
  • Evaluated business requirements and prepared detailed specifications that follow project guidelines required to develop written programs.
  • Devised procedures that solve complex business problems with due considerations for hardware/software capacity and limitations, operating times and desired results.
  • Analyzed large amounts of data sets to determine optimal way to aggregate and report on it.
  • Provided quick response to ad hoc internal and external client requests for data and experienced in creating ad hoc reports.
  • Responsible for building scalable distributed data solutions using Hadoop.
  • Responsible for cluster maintenance, adding and removing cluster nodes, cluster monitoring and troubleshooting, manage and review data backups, manage and review Hadoop log files.
  • Worked hands-on with the ETL process.
  • Handled importing of data from various data sources, performed transformations using Hive and MapReduce, and loaded data into HDFS.
  • Extracted the data from Teradata into HDFS using Sqoop.
  • Analyzed the data by performing Hive queries and running Pig scripts to understand user behavior, such as shopping enthusiasts, travelers, and music lovers.
  • Exported the patterns analyzed back into Teradata using Sqoop.
  • Continuously monitored and managed the Hadoop cluster through Cloudera Manager.
  • Installed the Oozie workflow engine to run multiple Hive jobs.
  • Developed Hive queries to process the data and generate data cubes for visualization.
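
The data cleansing and preprocessing jobs mentioned at the top of this list would look roughly like the map-only sketch below. It assumes the Hadoop 2.x MapReduce API; the field count, delimiter, and paths are illustrative assumptions rather than details from the project.

    // Illustrative map-only cleansing job (Hadoop 2.x API; the filtering rule and paths are placeholders).
    import java.io.IOException;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.NullWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class CleanseJob {

        public static class CleanseMapper extends Mapper<LongWritable, Text, NullWritable, Text> {
            @Override
            protected void map(LongWritable key, Text value, Context context)
                    throws IOException, InterruptedException {
                String[] fields = value.toString().split("\t");
                // Keep only records with the expected column count and a non-empty key field (illustrative rule).
                if (fields.length == 12 && !fields[0].isEmpty()) {
                    context.write(NullWritable.get(), value);
                }
            }
        }

        public static void main(String[] args) throws Exception {
            Job job = Job.getInstance(new Configuration(), "cleanse");
            job.setJarByClass(CleanseJob.class);
            job.setMapperClass(CleanseMapper.class);
            job.setNumReduceTasks(0);  // map-only: cleansed records go straight to HDFS
            job.setOutputKeyClass(NullWritable.class);
            job.setOutputValueClass(Text.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }

Running the job with hadoop jar and the input/output paths as arguments writes only the records that pass the filter, which is the usual shape of a cleansing step before Hive or Pig analysis.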

Environment

Hadoop, MapReduce, HDFS, Hive, Oozie, Java (JDK 1.6), Cloudera, NoSQL, Oracle 11g/10g, PL/SQL, SQL*Plus, Toad 9.6, Windows NT, UNIX Shell Scripting.

Confidential

Java/J2EE Developer

Responsibilities

  • Created design documents and reviewed them with the team, in addition to assisting the business analyst/project manager in explanations to the line of business.
  • Responsible for understanding the scope of the project and requirement gathering.
  • Involved in the analysis, design, construction, and testing of the online banking application.
  • Developed the web tier using JSP and Struts MVC to show account details and summaries.
  • Used Struts Tiles Framework in the presentation tier.
  • Designed and developed the UI using Struts view component, JSP, HTML, CSS and JavaScript.
  • Used AJAX for asynchronous communication with the server.
  • Utilized Hibernate for Object/Relational Mapping purposes for transparent persistence onto the SQL Server database.
  • Used Spring Core for dependency injection/inversion of control (IoC) and integrated frameworks such as Struts and Hibernate (a hedged DAO sketch appears after this list).
  • Involved in writing Spring configuration XML files containing bean declarations and the declarations of other dependent objects.
  • Used the Tomcat web server for development purposes.
  • Involved in creating and running test cases for JUnit testing.
  • Used Oracle as the database and Toad for query execution, and was also involved in writing SQL scripts and PL/SQL code for procedures and functions.
  • Used CVS for version controlling.
  • Developed application using Eclipse and used build and deploy tool as Maven.
  • Used Log4j to print logging, debugging, warning, and info messages on the server console.
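
To make the Spring/Hibernate integration above concrete, the following is a hedged sketch of a DAO whose Hibernate SessionFactory is injected through Spring. The class name, the HQL query, and the AccountSummary entity it refers to are illustrative assumptions; the real domain model is not shown here.

    // Illustrative DAO: the SessionFactory is wired in via a Spring XML bean definition (setter injection).
    import java.util.List;
    import org.hibernate.SessionFactory;

    public class AccountDao {

        private SessionFactory sessionFactory;  // injected by the Spring container

        public void setSessionFactory(SessionFactory sessionFactory) {
            this.sessionFactory = sessionFactory;
        }

        public List<?> findByCustomer(long customerId) {
            return sessionFactory.getCurrentSession()
                    // AccountSummary is a hypothetical Hibernate-mapped entity
                    .createQuery("from AccountSummary a where a.customerId = :id")
                    .setLong("id", customerId)
                    .list();
        }
    }

In the Spring configuration, the sessionFactory bean and this DAO would both be declared as beans, with the sessionFactory property set by reference, which keeps the data-access code free of lookup logic.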

Environment

Java, J2EE (Servlets, JSP), JUnit, AJAX, XML, JavaScript, Spring, Struts, Hibernate, Log4j, CVS, Maven, Eclipse, Apache Tomcat, and Oracle.

Confidential

Java Developer

Responsibilities

  • Extensively involved in the design of JSP screens for the Public Provident Fund and Bond modules.
  • Developed the user interface screens for the above modules.
  • Worked with the front-end applications using HTML and XML.
  • Developed the business components in core Java used in the JSP screens.
  • Implemented the Delegate, Façade, and DAO patterns for building the application.
  • Wrote Ant scripts for builds, unit testing, deployment, checkstyle, etc.
  • Used JUnit for unit testing.
  • Participated in all testing phases and provided UAT support.
  • Created WAR files and deployed them on WebLogic and WebSphere application servers.
  • Created tables and stored procedures that fulfill the requirements and accommodate the business rules in an Oracle 8i database (an illustrative JDBC call appears after this list).
  • Delivered zero defects in UAT.
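
The stored-procedure work above would typically be exercised from Java over JDBC, roughly as in the hedged sketch below. The JDBC URL, credentials, and procedure name are placeholders, not details taken from the project.

    // Illustrative call to an Oracle stored procedure over JDBC (all names and credentials are placeholders).
    import java.sql.CallableStatement;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Types;

    public class BondProcedureCall {
        public static void main(String[] args) throws Exception {
            Class.forName("oracle.jdbc.driver.OracleDriver");
            Connection conn = DriverManager.getConnection(
                    "jdbc:oracle:thin:@localhost:1521:ORCL", "user", "password");
            try {
                CallableStatement cs = conn.prepareCall("{ call update_bond_status(?, ?) }");
                cs.setLong(1, 1001L);                       // bond id (illustrative)
                cs.registerOutParameter(2, Types.VARCHAR);  // status returned by the procedure
                cs.execute();
                System.out.println("Status: " + cs.getString(2));
                cs.close();
            } finally {
                conn.close();
            }
        }
    }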

Environment

Java, JSP, XML, HTML, Servlets, SQL, PL/SQL, JDK, JDBC, WebLogic 6.1, WebSphere, EJB, JNDI, Eclipse, Ant.
