Hadoop Developer Resume

Jersey City, NJ

SUMMARY

  • Over 7 years of IT experience, including four years in the Hadoop ecosystem.
  • Good knowledge of Hadoop architecture and its components, such as HDFS, JobTracker, TaskTracker, DataNode, NameNode, and MapReduce concepts.
  • Involved in setting up standards and processes for Hadoop-based application design and implementation.
  • Experience in installation, configuration and deployment of Big Data solutions.
  • Experience with the Hadoop ecosystem, including HDFS, Hive, Pig, HBase, Oozie, and Sqoop, plus knowledge of the MapReduce framework.
  • Experience working with NoSQL databases, including MongoDB and HBase.
  • Experience developing against NoSQL databases using CRUD operations, sharding, indexing, and replication.
  • Experience working with the Cassandra NoSQL database.
  • Experience with ETL using Hive and MapReduce.
  • Worked on Amazon Redshift, the data warehouse product that is part of AWS (Amazon Web Services).
  • Worked on the graph database Neo4j, creating nodes and the relationships between them.
  • Experience in developing Pig scripts and Hive Query Language.
  • Wrote Hive queries for data analysis and to prepare data for visualization.
  • Responsible for developing Pig Latin scripts.
  • Managed and scheduled batch jobs on a Hadoop cluster using Oozie.
  • Experience in managing and reviewing Hadoop Log files.
  • Used Zookeeper to provide coordination services to the cluster.
  • Experienced in using Sqoop to import data from RDBMS into HDFS and vice versa.
  • Sound knowledge of Business Intelligence and reporting; prepared dashboards using Tableau.
  • Experience in requirement analysis, system design, development and testing of various software applications.
  • Hands on experience in application development using Java, RDBMS and Linux Shell Scripting.
  • Detailed understanding of Software Development Life Cycle (SDLC) and sound knowledge of project implementation methodologies including Waterfall and Agile.
  • Experience in all phases of the software development lifecycle: concept, design, development, QA, rollout, and enhancements.
  • Ability to work independently to help drive solutions in fast-paced, dynamic work environments.
  • Strong team building, conflict management, time management and meeting management skills.
  • Excellent communication skills and leadership skills.

TECHNICAL SKILLS

Big Data Technologies: Hadoop, HDFS, Hive, Pig, Oozie, Sqoop, MapReduce, HBase, MongoDB, Spark, Zookeeper

Database Technologies: PL/SQL, NoSQL, MongoDB, Neo4j

Programming Languages: C, C++, Java

Web Technologies: HTML, JavaScript, AngularJS

Operating Systems: Windows, Linux

Office Tools: MS Word, MS Excel, MS PowerPoint, MS Project

PROFESSIONAL EXPERIENCE

Confidential, Jersey City, NJ

Hadoop Developer

Responsibilities:

  • Responsible for building scalable distributed data solutions using Hadoop.
  • Installed and configured Hive, Pig, Oozie, and Sqoop on Hadoop cluster.
  • Developed simple to complex MapReduce jobs in Java, complemented by Hive and Pig.
  • Supported MapReduce programs running on the cluster.
  • Performed cluster monitoring, maintenance, and troubleshooting.
  • Handled the import of data from various sources, performed transformations using Hive and MapReduce, loaded data into HDFS, and extracted data from MySQL into HDFS using Sqoop.
  • Analyzed the data by performing Hive queries (HiveQL) and running Pig Scripts (Pig Latin).
  • Installed Oozie workflow engine to run multiple Hive and Pig jobs.
  • Experience with ETL using the BusinessObjects tool.
  • Experience in analyzing large datasets with Amazon Redshift.
  • Exported the analyzed data to the relational databases using Sqoop for visualization and to generate reports for the BI team.
  • Generated the reports and dashboards using the tool Tableau.
  • Generated Tableau reports with trend lines and used filters, sets and calculated fields on the reports.
  • Experienced in loading and transforming large sets of structured, semi-structured, and unstructured data.
  • Worked on NoSQL databases including MongoDB, Cassandra, and HBase.
  • Developed against MongoDB using CRUD operations, indexing, replication, and sharding; sorted data using indexes.
  • Analyzed large data sets to determine the optimal way to aggregate and report on them.
  • Wrote multiple MapReduce programs in Java for data extraction, transformation, and aggregation from multiple file formats, including XML, JSON, CSV, and other compressed formats.
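The kind of per-record extraction and aggregation such MapReduce jobs perform can be sketched in plain Java. This is an illustrative sketch, not code from the project: field positions and class names are hypothetical, and the logic is shown standalone rather than inside Hadoop's Mapper/Reducer classes so it compiles on its own.

```java
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Standalone sketch of the map/reduce core of a CSV aggregation job.
// In a real Hadoop job, extractKey() would run inside Mapper.map()
// and aggregate() inside Reducer.reduce(); names are illustrative.
public class CsvAggregation {

    // Map step: pull the grouping key (here, column 1) out of one CSV line.
    static String extractKey(String csvLine) {
        String[] fields = csvLine.split(",");
        return fields.length > 1 ? fields[1].trim() : "UNKNOWN";
    }

    // Reduce step: count how many records share each key.
    static Map<String, Integer> aggregate(Iterable<String> csvLines) {
        Map<String, Integer> counts = new HashMap<>();
        for (String line : csvLines) {
            counts.merge(extractKey(line), 1, Integer::sum);
        }
        return counts;
    }

    public static void main(String[] args) {
        List<String> lines = Arrays.asList("1,NJ,100", "2,NY,250", "3,NJ,75");
        System.out.println(aggregate(lines)); // per-key record counts
    }
}
```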

Environment: Hive, Pig, Oozie, Sqoop, HBase, Tableau, Amazon Redshift, ETL, MapReduce, MongoDB, Cassandra

Confidential, Memphis, TN

Hadoop Developer

Responsibilities:

  • Involved in installing and configuring the Hadoop ecosystem and Cloudera Manager using the CDH3 distribution.
  • Experienced in managing and reviewing Hadoop log files.
  • Experienced in running Hadoop streaming jobs to process terabytes of XML-format data.
  • Loaded and transformed large sets of structured, semi-structured, and unstructured data.
  • Supported MapReduce programs running on the cluster.
  • Imported and exported data between RDBMS and HDFS using Sqoop.
  • Installed and configured Hive and wrote Hive UDFs.
  • Involved in creating Hive tables, loading data, and writing Hive queries that run internally as MapReduce jobs.
  • Wrote Hive queries to meet business requirements.
  • Analyzed data using Pig and wrote Pig scripts to group, join, and sort the data.
  • Hands-on experience with NoSQL databases.
  • Worked on MongoDB using its CRUD (Create, Read, Update, Delete), indexing, replication, and sharding features.
  • Participated in the requirement gathering and analysis phase of the project, documenting business requirements through workshops and meetings with business users.
  • Designed and Developed Dashboards using Tableau.
  • Actively participated in weekly meetings with the technical teams to review the code.
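A typical Hive UDF of the kind mentioned above wraps a small per-value transformation. The sketch below shows such logic as plain Java so it compiles standalone; in Hive, the evaluate() method would live in a class extending org.apache.hadoop.hive.ql.exec.UDF and would typically use Hadoop's Text type. The normalization rule itself is a hypothetical example, not taken from the project.

```java
// Standalone sketch of Hive UDF logic: normalizing a free-text column.
public class NormalizeUdf {

    // Trim, collapse internal whitespace, and upper-case the value.
    // Returning null for null input follows the usual Hive UDF convention.
    public String evaluate(String value) {
        if (value == null) {
            return null;
        }
        return value.trim().replaceAll("\\s+", " ").toUpperCase();
    }

    public static void main(String[] args) {
        NormalizeUdf udf = new NormalizeUdf();
        System.out.println(udf.evaluate("  jersey   city ")); // JERSEY CITY
    }
}
```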

Environment: CDH3, Hive, Pig, NoSQL, MongoDB, RDBMS, Tableau

Confidential, Naperville, IL

Software Engineer

Responsibilities:

  • As a Programmer/Analyst, was involved in analysis and requirement gathering.
  • Developed SQL queries and stored procedures.
  • Used Java, JSP, and JavaScript to develop the presentation layer using MVC architecture.
  • Developed and deployed the application on JBoss Application Server, Tomcat Web Server.
  • Developed customized tag libraries in Struts (MVC Architecture)
  • Performed Unit Testing with JUnit.
  • Designed and developed applications using Eclipse.
  • Responsible for delivering enhancements as per schedule after estimation.
  • Participated in Code review and testing of the enhancements done.
  • Coordinated with the back end team for integration issues.
  • Performed Functional Testing with Twill Framework.
  • Deployed the application on UNIX environment.
  • Interacted directly with business users on requirements.
  • Implemented object-relational/persistence mapping using Hibernate to provide database independence and support a wide range of databases and query services.
  • Maintained version control for changed/released sources using Rational ClearCase.

Environment: Java, JSP, Swing, Struts, Tomcat, JBoss Application Server, Eclipse, Rational ClearCase, Oracle 9i, Twill, UNIX, JUnit.

Confidential, Charlotte, NC

Software Engineer

Responsibilities:

  • Developed various Java classes and SQL queries to retrieve and manipulate the data.
  • Created use case diagrams, sequence diagrams, functional specifications, and user interface diagrams using StarUML.
  • Involved in the complete requirement analysis, design, coding, and testing phases of the project.
  • Analyzed and gathered business requirements.
  • Developed code to create XML files and flat files from data retrieved from databases and XML files.
  • Implemented queries using SQL.
  • Developed complex SQL queries and stored procedures to process and store the data.
  • Involved in unit testing and bug fixing.

Environment: Java, JSP, EJB, XML, Oracle 9i.

Confidential

Software Engineer

Responsibilities:

  • Prepared user requirements document and functional requirements document for different modules.
  • Analyzed the business requirements.
  • Implemented MVC architecture with JSP as the View, Action classes as the Controller, and a combination of EJBs and Java classes as the Model.
  • Involved in coding session beans and entity beans to implement the business logic.
  • Prepared SQL scripts for database creation and for migrating existing data to the newer version of the application.
  • Developed various Java Beans and helper classes to support server-side programs.
  • Involved in developing backend code for email notifications to admin users with multi-sheet Excel attachments generated from XML.
  • Modified existing backend code for different levels of enhancement.
  • Designed error handling and error logging flows.
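The error handling and logging flow described above can be sketched as a single wrapper that funnels failures through one logger. This is an illustrative sketch using java.util.logging, with hypothetical names, not the project's actual design.

```java
import java.util.function.Supplier;
import java.util.logging.Level;
import java.util.logging.Logger;

// Sketch of a centralized error-handling/logging flow: operations run
// through one wrapper that logs failures uniformly and returns a
// fallback value instead of propagating the exception.
public class ErrorFlow {

    private static final Logger LOG = Logger.getLogger(ErrorFlow.class.getName());

    // Run an operation; on failure, log once in a uniform format and
    // return the supplied fallback.
    static <T> T runLogged(String opName, Supplier<T> op, T fallback) {
        try {
            return op.get();
        } catch (RuntimeException e) {
            LOG.log(Level.SEVERE, "Operation failed: " + opName, e);
            return fallback;
        }
    }

    public static void main(String[] args) {
        int ok = runLogged("parse", () -> Integer.parseInt("42"), -1);
        int bad = runLogged("parse", () -> Integer.parseInt("oops"), -1);
        System.out.println(ok + " " + bad); // 42 -1
    }
}
```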

Environment: Java, JSP, EJB, SQL Server, Session Handling, Entity Handling.
