
Hadoop Developer Resume


Pittsburgh, PA

SUMMARY:

  • Over 8 years of experience in IT, including 3+ years of hands-on experience in Big Data technologies.
  • Knowledge of Hadoop architecture and its components, such as HDFS, JobTracker, TaskTracker, NameNode, DataNode, ResourceManager, NodeManager, and the MapReduce programming paradigm.
  • Hands-on experience with major components of the Hadoop ecosystem, including MapReduce, HDFS, Hive, Pig, Pentaho, HBase, Zookeeper, Sqoop, Oozie, Cassandra, Flume, and Avro.
  • Experience in developing Pig Latin scripts and Hive Query Language (HiveQL).
  • Involved in project planning, setting up standards for implementation and design of Hadoop based applications.
  • Experience in setting up cluster and monitoring cluster performance based on the usage.
  • Experience in writing custom UDFs in Pig and Hive based on user requirements.
  • Experience in storing and processing unstructured data using NoSQL databases like HBase, Cassandra, and MongoDB.
  • Experience in writing work flows and scheduling jobs using Oozie.
  • Wrote Hive queries for data analysis and to prepare data for visualization.
  • Experience in managing and reviewing Hadoop Log files.
  • Experience in importing and exporting data in different formats between HDFS/HBase and various RDBMS databases.
  • Experience in complete project life cycle (design, development, testing and implementation) of Client Server and Web applications.
  • Experience in Object - Oriented Design, Analysis, Development, Testing and Maintenance.
  • Hands-on experience with IDE and build tools like Eclipse, NetBeans, Visual Studio, and Maven.
  • Exposure to Cloudera development environment and management using Cloudera Manager.

TECHNICAL SKILLS:

Hadoop/Big Data: HDFS, MapReduce, YARN, Pig, Hive, Impala, HBase, MongoDB, Cassandra, Mahout, Oozie, Sqoop, Zookeeper, Flume, Spark, Falcon

Java & JEE Technologies: Core Java, Hibernate, Spring, JSP, Servlets, Java Beans, JDBC, EJB 3.0

IDE Tools: Eclipse, NetBeans

Programming/Scripting languages: Java, Python, Shell Scripting, Groovy, Swift, PHP, Perl, VB Script, JavaScript, jQuery, AngularJS, GWT, HTML5, XHTML, CSS, AJAX, XML, XSLT, XPath, DOM, WSDL.

Databases: Oracle, MySQL, DB2

Operating Systems: Windows, UNIX, Mac OS, CentOS, Ubuntu.

Other Tools: WinScp, Stream weaver, Putty

PROFESSIONAL EXPERIENCE:

HADOOP DEVELOPER

Confidential, Pittsburgh, PA

Responsibilities:

  • Installed and configured Hive, Pig, Sqoop and Oozie on the Hadoop cluster.
  • Developed multiple MapReduce jobs in Java for data cleaning and preprocessing.
  • Loaded data into the cluster from dynamically generated files using Flume, and from RDBMS and MongoDB using Sqoop.
  • Created Pig Latin scripts to sort, group, join, and filter the enterprise-wide data.
  • Used Hive and created Hive tables and involved in data loading and writing Hive UDFs.
  • Implemented Partitioning, Dynamic Partitions, Buckets in HIVE.
  • Used Oozie to automate/schedule business workflows which invoke Sqoop, MapReduce and Pig jobs as per the requirements.
  • Managed jobs using the Fair Scheduler.
  • Experience in using Zookeeper for coordinating the distributed applications.
  • Used Maven, Eclipse and Ant to build the application.
  • Experienced in managing and reviewing Hadoop log files.
  • Used Compression techniques to process data before storing it to HDFS.
  • Used Maven extensively for building MapReduce jar files and deployed them to Amazon Web Services (AWS) EC2 virtual servers in the cloud.
  • Experienced in working with Elastic MapReduce (EMR) on Amazon Web Services (AWS).
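As a rough illustration, the cleaning-and-aggregation pattern those MapReduce jobs follow can be sketched in plain Java. This is a minimal sketch only: a deployable job would implement the Hadoop `Mapper`/`Reducer` API, and the `"key,amount"` record layout here is a hypothetical example, not the project's actual schema.

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

// Minimal sketch of the map -> shuffle -> reduce flow used for
// data cleaning and aggregation (hypothetical record layout).
public class CleanAndAggregate {

    // "Map" phase: trim/normalize a raw line and emit a (key, value) pair.
    static Map.Entry<String, Long> map(String rawLine) {
        String[] parts = rawLine.trim().split(",");
        String key = parts[0].trim().toLowerCase();   // cleaned key
        long value = Long.parseLong(parts[1].trim()); // parsed metric
        return Map.entry(key, value);
    }

    // "Reduce" phase: sum every value that shuffled to the same key.
    static Map<String, Long> reduce(List<Map.Entry<String, Long>> pairs) {
        return pairs.stream().collect(Collectors.groupingBy(
                Map.Entry::getKey,
                Collectors.summingLong(Map.Entry::getValue)));
    }

    public static void main(String[] args) {
        List<String> raw = List.of(" PA , 10", "pa,5 ", "CT, 7");
        Map<String, Long> totals =
                reduce(raw.stream().map(CleanAndAggregate::map).collect(Collectors.toList()));
        System.out.println(totals);
    }
}
```

The same map and reduce bodies would slot into `Mapper.map()` and `Reducer.reduce()` in a real Hadoop job, with Hadoop handling the shuffle between them.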

Environment: Hadoop, MapReduce, YARN, Sqoop, HDFS, Hive, Pig, Oozie, HBase, Java, Oracle, CentOS, MongoDB, AWS, EMR, Eclipse, Maven.

HADOOP DEVELOPER

Confidential, Hartford, CT

Responsibilities:

  • Wrote MapReduce jobs to standardize and clean the data and calculate aggregates.
  • Worked on ETL workflows, analyzed big data, and loaded it into the Hadoop cluster.
  • Implemented Pig Latin scripts to sort, group, join and filter the data.
  • Implemented Pig UDFs for evaluating, filtering, loading, and storing data where built-in Pig functions were not sufficient.
  • Created internal and external Hive tables, defined static and dynamic partitions as per requirement for optimized performance.
  • Used Sqoop to transfer data between RDBMS and HDFS in both directions.
  • Created Hive-based reports and wrote customized Hive UDFs in Java.
  • Created HBase tables and inserted data into them.
  • Worked on Cassandra NoSQL database architecture for data read/write.
  • Integrated the Hive warehouse with HBase.
  • Implemented test scripts to support test driven development and continuous integration.
  • Monitored system health and logs, and responded to any warning or failure conditions.
  • Effectively used Oozie to develop automatic workflows of Sqoop, MapReduce and Hive jobs.
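For illustration, the core of a customized Hive UDF written in Java looks like the sketch below. Only the `evaluate()` logic is shown; a deployable version would extend `org.apache.hadoop.hive.ql.exec.UDF` (or implement `GenericUDF`), and the masking rule here is a hypothetical example, not the project's actual business logic.

```java
// Sketch of the evaluate() logic behind a custom Hive UDF.
// A real UDF would extend org.apache.hadoop.hive.ql.exec.UDF and
// be registered in Hive with CREATE TEMPORARY FUNCTION.
public class MaskUdf {

    // Hypothetical rule: keep the last four characters, mask the rest.
    public static String evaluate(String input) {
        if (input == null || input.length() <= 4) {
            return input; // too short to mask
        }
        StringBuilder masked = new StringBuilder();
        for (int i = 0; i < input.length() - 4; i++) {
            masked.append('*');
        }
        return masked.append(input.substring(input.length() - 4)).toString();
    }

    public static void main(String[] args) {
        System.out.println(evaluate("4111111111111111")); // ************1111
    }
}
```

Once packaged into a jar and registered, such a function is called from HiveQL like any built-in, e.g. `SELECT mask(card_no) FROM payments`.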

Environment: Apache Hadoop, MapReduce, HDFS, Hive, Pig, YARN, Java, Sqoop, Oracle, SQL, HBase, Cassandra, Zookeeper, CentOS, Eclipse, Maven.

JAVA DEVELOPER

Confidential, Madison, WA

Responsibilities:

  • Prepared Use case, Class and Activity diagrams using Rational Rose tool.
  • Involved in requirement gathering and analysis.
  • Involved in Low-level Architecture design.
  • Designed and developed Validation framework for field validations in Struts framework.
  • Designed and developed new J2EE components like Value Objects, Servlets, and bean components like Session Beans to incorporate business-level validations.
  • Developed application code using Core Java and J2EE (Servlets, XML) in Eclipse tool.
  • Developed Unit test plans and Developed JUnit Test classes for all Unit Test cases.
  • Used XML Spy for XML development.
  • Used DAO for accessing database.
  • Developed Ant Scripts to bundle and deploy application.
  • Deployed application in WebLogic Application Server.
  • Used Rational Clear Case as a source control for code changes.
  • Prepared User guide, Deployment guide, System admin guide.
  • Performed end-to-end system testing of the product, wrote test cases, and fixed the issues found.

Environment: Java, Servlets, JSP, Ant, JDBC, Struts, XML, XSLT, JAXP, Eclipse, Oracle, JavaScript, HTML, JUnit, WebLogic, Rational Rose, Rational ClearQuest, Rational ClearCase.

JAVA DEVELOPER

Confidential

Responsibilities:

  • Designed the application using the J2EE design patterns such as Session Facade, Business Delegate, Service Locator, Value Object and Singleton.
  • Developed the presentation tier with HTML and JSPs using the Struts 1.1 framework; used AJAX for faster page rendering.
  • Developed the middle tier using EJB stateless session beans and Java Servlets.
  • Used Entity Beans to access data from the Oracle 8i database.
  • Worked on Hibernate for data persistence.
  • Prepared high- and low-level design documents for the business modules for future updates.
  • Deployed the application in JBoss Application Server in development and production environment.
  • Implemented CVS as Version control system.
  • Performed code walkthroughs and prepared test cases and test plans.
  • Used Ant as the build tool and JUnit for writing unit tests.

Environment: Eclipse, HTML, JavaScript, CoreJava, JUnit, JSP, Servlets, JDBC, Oracle 8i, AJAX, CVS and JBoss Application Server.
