
Hadoop Developer Resume

Plano, TX

SUMMARY:

  • 7+ years of professional IT experience, including the Big Data ecosystem and Java/J2EE technologies.
  • Hands-on experience developing, installing, configuring, and using Hadoop ecosystem components such as MapReduce, HDFS, HBase, Hive, Sqoop, Pig, Flume, Kafka, Storm, Spark, and Elasticsearch.
  • Involved in setting up standards and processes for Hadoop-based application design and implementation.
  • Good knowledge of Hadoop cluster architecture and cluster monitoring.
  • Experience in managing and reviewing Hadoop log files.
  • Good knowledge of AWS.
  • Experience in importing and exporting data with Sqoop between HDFS and relational database systems.
  • Extensive experience in developing applications using JSP, Servlets, Spring, Hibernate, JavaScript, Angular, AJAX, CSS, jQuery, HTML, JDBC, JNDI, JMS, XML, and SQL on Windows, Linux, and UNIX platforms.
  • Extensive experience in developing and deploying RESTful and SOAP web services.
  • Good experience with web/application servers such as WebSphere, Apache Tomcat, and JBoss.
  • Experienced in a variety of scripting languages, including UNIX shell scripts and JavaScript.
  • Installed, configured, and managed Hadoop clusters and data science tools.
  • Analyzed machine data for improved business results using DataStage.
  • Managed the Hadoop distribution with Cloudera Manager, Cloudera Navigator, and Hue.
  • Set up high availability for Hadoop cluster components and edge nodes.
  • Good experience with Python.
  • Experience in developing shell scripts and Python scripts for system management.
  • Good experience with ETL tools such as Informatica.
  • Strong domain knowledge in Insurance, Finance, Healthcare, and Social Networking.

TECHNICAL SKILLS:

Big Data Ecosystems: Hadoop, MapReduce, HDFS, HBase, Zookeeper, Hive, Pig, Sqoop, Oozie, Flume, Spark.

Programming Languages: Java, C/C++, VB

Scripting Languages: JSP & Servlets, PHP, JavaScript, XML, HTML, Python.

Databases: Oracle, MySQL, MS SQL

Tools: Eclipse, CVS, Ant, MS Visual Studio, NetBeans, ETL tools

Platforms: Windows, Linux/Unix

Application Servers: Apache Tomcat 5.x/6.0, JBoss 4.0

Methodologies: Agile, UML, Design Patterns

PROFESSIONAL EXPERIENCE:

Confidential, Plano, TX

Hadoop Developer

Roles and Responsibilities:

  • Configured a Spark Streaming application to stream syslogs and various application logs from 100+ nodes for monitoring and alerting, as well as to feed data to dynamic dashboards.
  • Migrated traditional MapReduce jobs to Spark jobs.
  • Worked on Spark SQL and Spark Streaming.
  • Imported and exported files to HDFS, Hive, and Impala.
  • Processed results were consumed by Hive, scheduling applications, and various BI reports through multi-dimensional data warehouse models.
  • Ran ad-hoc queries using Pig Latin, Hive, or Java MapReduce.
  • Wrote Pig scripts and executed them using the Grunt shell.
  • Performed big data analysis using Pig, Hive, and user-defined functions (UDFs).
  • Performed joins, group-by, and other operations in MapReduce using Java or Pig Latin.
  • Scheduled all Hadoop and Hive jobs.
  • Used Python to develop working, efficient network tooling within the company.
  • Used Informatica to move data into Hadoop in batches.
  • Collected log data from the web servers and integrated it into HDFS using Flume.
  • Used Java getter and setter methods in the reducer to pass values to and from the job JAR.
  • Processed the output from Pig and Hive and formatted it before writing to the Hadoop output file.
  • Used Hive table definitions to map the output file to tables.
  • Set up virtual machines and managed storage devices.
  • Involved in managing and reviewing Hadoop log files.
  • Wrote programs in Python.
  • Developed scripts and batch jobs to schedule various Hadoop programs.
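The kind of per-batch log parsing and aggregation a streaming job like the one above performs can be sketched in plain Python. This is an illustrative sketch only: the syslog line format and field names here are assumptions, not the production schema.

```python
import re

# Hypothetical syslog-style line format assumed for illustration:
#   "<host> <facility>.<severity> <message>"
SYSLOG_RE = re.compile(
    r"^(?P<host>\S+)\s+(?P<facility>\w+)\.(?P<severity>\w+)\s+(?P<message>.*)$"
)

def parse_syslog_line(line):
    """Parse one syslog-style line into a dict, or None if it doesn't match."""
    m = SYSLOG_RE.match(line)
    return m.groupdict() if m else None

def count_by_severity(lines):
    """Aggregate parsed lines by severity, as one streaming micro-batch would
    before the counts are pushed to a dashboard or alerting rule."""
    counts = {}
    for line in lines:
        rec = parse_syslog_line(line)
        if rec:
            counts[rec["severity"]] = counts.get(rec["severity"], 0) + 1
    return counts
```

In the real pipeline this logic would run inside a Spark Streaming transformation over each micro-batch rather than over an in-memory list.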

Environment: Hadoop 2.6.0-cdh5.4.2, Cloudera Manager, Cloudera Navigator, Red Hat Linux, Java, Perl, Python.

Confidential, Somerset,NJ

Bigdata/Hadoop Developer

Roles and Responsibilities:

  • Imported logs from web servers with Flume to ingest the data into HDFS.
  • Implemented custom Flume interceptors to filter data and defined channel selectors to multiplex the data into different sinks.
  • Retrieved data from HDFS into relational databases with Sqoop.
  • Parsed, cleansed, and mined useful, meaningful data in HDFS using MapReduce for further analysis.
  • Fine-tuned Hive jobs for optimized performance.
  • Implemented UDFs, UDAFs, and UDTFs in Java for Hive to process data that could not be handled with Hive's built-in functions.
  • Coordinated workflows wherever the data resides, across all platforms, using DataStage.
  • Designed and implemented Pig UDFs for evaluating, filtering, loading, and storing data.
  • Developed Oozie workflows to automate loading data into HDFS and pre-processing with Pig.
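The parse/cleanse/aggregate pattern described above can be illustrated with a minimal in-memory MapReduce sketch in Python. The comma-separated record layout (`user_id,action,...`) is an assumption for illustration; the production jobs were Java MapReduce over HDFS.

```python
def mapper(line):
    """Emit (action, 1) pairs from a comma-separated record.
    Malformed records (missing user_id or action) are dropped — the
    'cleanse' step. The record layout is hypothetical."""
    parts = [p.strip() for p in line.split(",")]
    if len(parts) < 2 or not all(parts[:2]):
        return []  # cleanse: skip records missing required fields
    user_id, action = parts[0], parts[1]
    return [(action, 1)]

def reducer(pairs):
    """Sum counts per key, as the MapReduce reduce phase would."""
    totals = {}
    for key, value in pairs:
        totals[key] = totals.get(key, 0) + value
    return totals

def run_job(lines):
    """Chain map and reduce over an in-memory dataset."""
    pairs = [p for line in lines for p in mapper(line)]
    return reducer(pairs)
```

A Hive UDF or Pig UDF covering the same cleansing rule would carry identical logic, just packaged as a Java function registered with the engine.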

Environment: Hadoop, Hive, Pig, Sqoop, Oracle 10g, HDFS, Oozie, Flume, ETL.

Confidential, M View - CA

Java Developer

Roles and Responsibilities:

  • Was part of the architecture team for design and implementation of site components using the J2EE framework.
  • Involved in implementing the Spring Model-View-Controller architecture for the site, which tightly coordinates JSP pages and Java Beans.
  • Used jQuery and JavaScript together to perform front-end validations and event handling.
  • Improved the performance of the survey-response page, including loading, saving, and validating responses.
  • Implemented jQuery pagination to drastically improve the response page, cutting load time from 8 seconds to 2 seconds.
  • Used jQuery DataTables to render data in table format.
  • Used AJAX to increase the web pages' interactivity, speed, functionality, and usability.
  • Used the iText library to create and manipulate all the PDF reports.
  • Used the Apache POI API for exporting data to Excel sheets.
  • Used a scheduler to schedule and automate back-end jobs.
  • Used the Dynatree jQuery tree-view plugin to support system components with multiple-selection and drag-and-drop features.
  • Used ClearCase as the code repository, ClearQuest for defect tracking, and Log4j for application logging.
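The pagination speedup above comes from rendering only one page of survey responses per request instead of the full result set. The slicing logic can be sketched as follows (in Python rather than the original jQuery, purely for illustration; function and field names are hypothetical):

```python
def paginate(items, page, page_size=10):
    """Return one page of items plus paging metadata.

    Rendering only page_size records per request is what cut the
    response-page load time in the jQuery implementation.
    """
    if page < 1 or page_size < 1:
        raise ValueError("page and page_size must be >= 1")
    total_pages = max(1, -(-len(items) // page_size))  # ceiling division
    start = (page - 1) * page_size
    return {
        "items": items[start:start + page_size],
        "page": page,
        "total_pages": total_pages,
    }
```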

Environment: Java, J2EE, RAD 8.0.4, WebSphere 8.0, JavaScript, jQuery, AJAX, Hibernate, Windows/UNIX.

Confidential, Atlanta-GA

Java Developer

Roles and Responsibilities:

  • Worked on designing the content and delivering solutions based on an understanding of the requirements.
  • Wrote a web service client for order-tracking operations that accesses the web services API and is used in our web application.
  • Developed the user interface using JavaScript, jQuery, and HTML.
  • Used the AJAX API for intensive user operations and client-side validations.
  • Worked with Java, J2EE, SQL, JDBC, XML, JavaScript, and web servers.
  • Utilized Servlets for the controller layer and JSP with JSP tags for the interface.
  • Worked with the Model-View-Controller pattern and various design patterns.
  • Worked with designers, architects, and developers to translate data requirements into physical schema definitions for SQL sub-programs, and modified the existing SQL program units.
  • Designed and developed SQL functions and stored procedures.
  • Involved in debugging and bug fixing of application modules.
  • Efficiently dealt with exceptions and flow control.
  • Worked with Object-Oriented Programming concepts.
  • Added Log4j to log errors.
  • Installed and used the MS SQL Server 2008 database.
  • Spearheaded coding for site management, which included change requests for enhancements and bug fixes across all parts of the website.

Environment: Java, JDK 1.8, Apache Tomcat 7, JavaScript, JSP, JDBC, Servlets, MS SQL, XML, Windows XP, Ant, SQL Server database, Eclipse.

Confidential

Java Developer

Roles and Responsibilities:

  • Worked on designing the content and delivering solutions based on an understanding of the requirements.
  • Wrote a web service client for order-tracking operations that accesses the web services API and is used in our web application.
  • Developed the user interface using JavaScript, jQuery, and HTML.
  • Used the AJAX API for intensive user operations and client-side validations.
  • Worked with Java, J2EE, SQL, JDBC, XML, JavaScript, and web servers.
  • Utilized Servlets for the controller layer and JSP with JSP tags for the interface.
  • Worked with the Model-View-Controller pattern and various design patterns.
  • Worked with designers, architects, and developers to translate data requirements into physical schema definitions for SQL sub-programs, and modified the existing SQL program units.
  • Designed and developed SQL functions and stored procedures.
  • Involved in debugging and bug fixing of application modules.
  • Efficiently dealt with exceptions and flow control.
  • Worked with Object-Oriented Programming concepts.
  • Added Log4j to log errors.
  • Used Eclipse for writing code and SVN for version control.
  • Installed and used the MS SQL Server 2008 database.
  • Spearheaded coding for site management, which included change requests for enhancements and bug fixes across all parts of the website.

Environment: Java, JDK 1.8, Apache Tomcat 7, JavaScript, JSP, JDBC, Servlets, MS SQL, XML, Windows XP, Ant, SQL Server database, Red Hat Linux, Eclipse Luna, SVN.
