
Hadoop Developer Resume


OBJECTIVE

  • To seek a challenging position in Big Data management that enhances my technical expertise and contributes to the success of the organization in Hadoop, Java, and Linux environments.

SUMMARY

  • Overall 8+ years of professional experience in IT, including 3+ years in Hadoop technologies. Experienced in the analysis, design, development, implementation, integration, and testing of client-server applications using Object-Oriented Analysis and Design (OOAD) methodologies.
  • Hands-on experience with Hadoop ecosystem components such as Hadoop MapReduce, HDFS, ZooKeeper, Oozie, Hive, Sqoop, Pig, Flume, and Avro.
  • Experience using scripting languages like Pig to manipulate data.
  • Experience in writing UDFs for Apache Hive and Apache Pig.
  • Hands-on experience writing Hadoop MapReduce jobs in Java.
  • Experience in designing, implementing, and managing secure authentication mechanisms for Hadoop clusters with Kerberos.
  • Experience in upgrading existing Hadoop clusters to the latest releases.
  • Experience in working with Flume to load log data from multiple sources directly into HDFS.
  • Experience in importing and exporting data between HDFS/Hive and databases such as MySQL and Oracle using Sqoop.
  • Experience in designing both time driven and data driven automated workflows using Oozie.
  • Experience in configuring ZooKeeper to coordinate servers in clusters and maintain data consistency.
  • Storage experience in configuring and managing NAS (file-level access, NFS) and SAN (block-level access, iSCSI).
  • Experience in building infrastructure from bare metal using DHCP, PXE, Kickstart, DNS, and NFS.
  • Experience in setting up monitoring infrastructure for Hadoop cluster using Nagios and Ganglia.
  • Knowledge on Apache Spark, Spark SQL, Spark Streaming.
  • Experience in using Cloudera Manager for Installation and management of Hadoop Cluster.
  • Experience in supporting data analysis projects using Elastic MapReduce (EMR) on the Amazon Web Services (AWS) cloud, including exporting and importing data to and from S3.
  • Experience in developing shell scripts for system management and for automating routine tasks.
  • In-depth understanding of data structures and algorithms.
  • Good Understanding of Distributed Systems and Parallel Processing architecture.
  • Effective problem-solving skills and outstanding interpersonal skills. Able to work independently as well as within a team environment, driven to meet deadlines, and quick to learn and apply new technologies.

TECHNICAL SKILLS

Operating Systems: Windows (95, 98, NT, 2000, XP, 7), Unix, Linux

Languages: Java (J2SE, J2EE), XML, XSL, HTML, JavaScript, SQL and PL/SQL

Enterprise/Web: J2EE, SOA, Web Services (SOAP, WSDL, UDDI), Servlets, JSP, EJB, JMS, MDB, JSTL, Struts, AJAX, JavaScript, XML, XSLT, HTML, CSS, JFC/Swing, JSF and JavaMail

Open Source: Struts, Spring, Hibernate 3.0

Application Servers: WebLogic 10.x/8.x/7.x/6.x, WebSphere 6.x, JBoss 3.x/4.x/5.x

Web Servers: Java Web Server, Tomcat 6.x, Apache HTTP Server, IIS

Project Management: MS Project, MS Excel, MS Word, Visio

RDBMS: Oracle 7.3/8i/10g, DB2, MS Access, PostgreSQL, MS SQL Server 2000

IDE Tools: Eclipse 3.x, MyEclipse 5.x, WSAD 6.0, IntelliJ and RAD

Version Control Systems: PVCS, Subversion (SVN), CVS (WinCVS), VSS and Rational ClearCase

Build Tools: ANT

Methodologies: UML, RUP, Agile; Rational Rose
