
Hadoop Developer Resume


Little Rock, AR

PROFESSIONAL SUMMARY:

  • Around 7 years of experience across Hadoop, Java, and ETL, including extensive experience in Big Data technologies and in the development of standalone and web applications in multi-tiered environments using Java, Hadoop, Hive, HBase, Pig, Sqoop, J2EE technologies (Spring, Hibernate), Oracle, HTML, and JavaScript.
  • Extensive experience in Big Data analytics, with hands-on experience writing MapReduce jobs on the Hadoop ecosystem, including Hive and Pig.
  • Excellent knowledge of Hadoop architecture, including HDFS, Job Tracker, Task Tracker, Name Node, Data Node, and the MapReduce programming paradigm.
  • Experience with distributed systems, large scale non-relational data stores, MapReduce systems, data modeling, and big data systems.
  • Experience with the Apache, Cloudera, and Hortonworks Hadoop distributions.
  • Involved in developing solutions to analyze large data sets efficiently.
  • Excellent hands-on experience importing and exporting data between relational database systems such as MySQL and Oracle and HDFS/Hive using Sqoop.
  • Hands-on experience in writing Pig Latin scripts, working with the Grunt shell, and scheduling jobs with Oozie.
  • Experience in analyzing data using Hive QL, Pig Latin, and custom Map Reduce programs in Java.
  • Experience with web-based UI development using jQuery, Ext JS, CSS, HTML, HTML5, XHTML, and JavaScript.
  • Knowledge of job workflow scheduling and monitoring tools such as Oozie and ZooKeeper.
  • Experience with databases like DB2, Oracle 9i, Oracle 10g, MySQL, SQL Server and MS Access.
  • Experience in creating complex SQL queries and SQL tuning, and in writing PL/SQL blocks such as stored procedures, functions, cursors, indexes, triggers, and packages.
  • Very good understanding of NoSQL databases such as MongoDB, Cassandra, and HBase.
  • Good knowledge of ETL and hands-on experience with Informatica ETL.
  • Extensive experience in creating class diagrams, activity diagrams, and sequence diagrams using the Unified Modeling Language (UML).
  • Experienced in SDLC, Agile (Scrum) methodology, and iterative Waterfall.
  • Experience in developing test cases and performing unit and integration testing; QA experience with test methodologies and manual/automated testing using tools such as WinRunner and JUnit.
  • Experience with various version control systems: ClearCase, CVS, and SVN.
  • Expertise in extending Hive and Pig core functionality by writing custom UDFs (a Pig UDF sketch follows this list).
  • Development experience with all aspects of software engineering and the development life cycle.
  • Strong desire to work in a fast-paced, flexible environment.
  • Proactive problem-solving mentality that thrives in an agile work environment.
  • Exceptional ability to learn new technologies and to deliver results under short deadlines.
  • Worked with developers, DBAs, and systems support personnel to promote and automate successful code to production.
  • Strong written, oral, interpersonal, and presentation communication skills.
  • Ability to perform at a high level, meet deadlines, and adapt to ever-changing priorities.
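
To illustrate the custom UDF work noted above, here is a minimal sketch of a Pig EvalFunc in Java; the class name and the upper-casing behavior are illustrative assumptions, not details from any project described below.

    import java.io.IOException;
    import org.apache.pig.EvalFunc;
    import org.apache.pig.data.Tuple;

    // Illustrative custom Pig UDF: upper-cases its first input field.
    public class UpperCase extends EvalFunc<String> {
        @Override
        public String exec(Tuple input) throws IOException {
            if (input == null || input.size() == 0 || input.get(0) == null) {
                return null;
            }
            return ((String) input.get(0)).toUpperCase();
        }
    }

Such a UDF would be packaged into a jar, loaded in a Pig script with REGISTER, and invoked like any built-in function.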

PROFESSIONAL EXPERIENCE:

Confidential, Little Rock, AR

Hadoop Developer

Responsibilities:

  • Involved in the design and development phases of the Software Development Life Cycle (SDLC) using the Scrum methodology.
  • Developed a data pipeline using Flume, Sqoop, Pig, and MapReduce to ingest customer behavioral data and purchase histories into HDFS for analysis.
  • Developed job flows in Oozie to automate the workflow for extraction of data from warehouses and weblogs.
  • Used Pig as an ETL tool to perform transformations, event joins, bot-traffic filtering, and some pre-aggregations before storing the data in HDFS.
  • Wrote Hive queries to parse the logs and structure them in tabular format to facilitate effective querying of the log data.
  • Involved in creating Hive tables, loading them with data, and writing Hive queries that run internally as MapReduce jobs.
  • Used Hive optimization techniques for joins and best practices in writing Hive scripts.
  • Experienced in managing and reviewing the Hadoop log files.
  • Worked with the Apache Crunch library to write, test, and run Hadoop MapReduce pipeline jobs (see the pipeline sketch after this list).
  • Involved in joins and data aggregation using Apache Crunch.
  • Worked on Oozie workflow engine for job scheduling.
  • Developed custom implementations of Partitioner, input/output formats, RecordReader, and RecordWriter (a Partitioner sketch also follows this list).
  • Used Hive to analyze the partitioned and bucketed data and compute various metrics for reporting on the dashboard.
  • Loaded the aggregated data onto DB2 for reporting on the dashboard.
  • Monitored and debugged Hadoop jobs and applications running in production.
  • Provided user support and application support on the Hadoop infrastructure.
  • Reviewed ETL application use cases before onboarding them to Hadoop.
  • Evaluated and compared different tools for test data management with Hadoop.
  • Helped the testing team get up to speed on Hadoop application testing.
  • Installed a 20-node UAT Hadoop cluster.
  • Created ETL jobs to generate and distribute reports from a MySQL database using Pentaho Data Integration.
  • Created ETL jobs using Pentaho Data Integration to handle the maintenance and processing of data.
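
The Crunch pipeline work above can be sketched as follows; the input layout (tab-separated lines with an event type in the first field) and the command-line paths are illustrative assumptions, not the actual job.

    import org.apache.crunch.DoFn;
    import org.apache.crunch.Emitter;
    import org.apache.crunch.PCollection;
    import org.apache.crunch.PTable;
    import org.apache.crunch.Pipeline;
    import org.apache.crunch.impl.mr.MRPipeline;
    import org.apache.crunch.types.writable.Writables;

    // Illustrative Crunch MapReduce pipeline: read text lines, extract a
    // key per line, count occurrences per key, and write the result out.
    public class EventCounts {
        public static void main(String[] args) throws Exception {
            Pipeline pipeline = new MRPipeline(EventCounts.class);
            PCollection<String> lines = pipeline.readTextFile(args[0]);

            PTable<String, Long> counts = lines.parallelDo(
                new DoFn<String, String>() {
                    @Override
                    public void process(String line, Emitter<String> emitter) {
                        // Assumes the event type is the first tab-separated field.
                        emitter.emit(line.split("\t")[0]);
                    }
                }, Writables.strings())
                .count();

            pipeline.writeTextFile(counts, args[1]);
            pipeline.done();
        }
    }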
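
And a minimal sketch of a custom Partitioner of the kind mentioned above; the key layout (a region prefix before a "|" separator) is an illustrative assumption.

    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Partitioner;

    // Illustrative custom Partitioner: routes records by the key's prefix
    // so all records for a region reach the same reducer.
    public class RegionPartitioner extends Partitioner<Text, IntWritable> {
        @Override
        public int getPartition(Text key, IntWritable value, int numPartitions) {
            String region = key.toString().split("\\|")[0];
            return (region.hashCode() & Integer.MAX_VALUE) % numPartitions;
        }
    }

It would be wired into a job with job.setPartitionerClass(RegionPartitioner.class).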

Environment: JDK 1.6, Red Hat Linux, HDFS, Mahout, MapReduce, Apache Crunch, Hive, Pig, Cloudera, Sqoop, Flume, ZooKeeper, Oozie, DB2, HBase and Pentaho.

Confidential, Philadelphia, PA

Hadoop/ Java Developer

Responsibilities:

  • Developed multiple MapReduce programs to analyze customers' insurance data and produce summary results from Hadoop for downstream systems.
  • Worked on importing and exporting data from Oracle and DB2 into HDFS using Sqoop.
  • Developed data pipeline using Flume to ingest customer behavioral data and financial histories into HDFS for analysis.
  • Prepared best practices for writing MapReduce programs and Hive scripts.
  • Scheduled a workflow with Oozie to import the revenue department's weekly transactions from an RDBMS.
  • Built wrapper shell scripts to invoke these Oozie workflows.
  • Developed Pig Latin scripts to transform the log data files and load them into HDFS.
  • Used Pig as an ETL tool to perform transformations, event joins, and some pre-aggregations before storing the data in HDFS.
  • Developed helper classes that abstract the Cassandra cluster connection and act as a core toolkit.
  • Hands-on experience with NoSQL databases such as Cassandra, used in a proof of concept (POC) to store URLs and images.
  • Developed Hive UDFs for functions not built into Hive, such as rank (see the UDF sketch after this list).
  • Created External Hive tables and involved in data loading and writing Hive UDFs.
  • Implemented POCs to migrate iterative MapReduce programs to Spark transformations using Scala.
  • Enabled concurrent access to Hive tables with shared and exclusive locking, backed by the ZooKeeper implementation in the cluster.
  • Wrote shell scripts to monitor the health of the Hadoop daemon services and respond to any warning or failure conditions.
  • Developed unit test cases for the MapReduce code using MRUnit (a test sketch also follows this list).
  • Involved in creating Hadoop streaming jobs.
  • Involved in installing and configuring the Hadoop ecosystem and Cloudera Manager using the CDH4 distribution.
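
The rank UDF above follows the classic pre-windowing Hive pattern; here is a minimal sketch in Java, assuming rows arrive pre-sorted by the group key (e.g. via DISTRIBUTE BY ... SORT BY):

    import org.apache.hadoop.hive.ql.exec.UDF;

    // Illustrative "rank" UDF: emits an incrementing counter that resets
    // whenever the group key changes. Relies on the rows for a key being
    // sorted into the same task, which the query must arrange.
    public final class Rank extends UDF {
        private int counter;
        private String lastKey;

        public int evaluate(String key) {
            if (lastKey == null || !lastKey.equals(key)) {
                counter = 0;
                lastKey = key;
            }
            return ++counter;
        }
    }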
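
And a minimal sketch of an MRUnit test of the kind mentioned above; TokenizerMapper is a hypothetical word-count style mapper standing in for the real code under test.

    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mrunit.mapreduce.MapDriver;
    import org.junit.Before;
    import org.junit.Test;

    public class TokenizerMapperTest {
        private MapDriver<LongWritable, Text, Text, IntWritable> mapDriver;

        @Before
        public void setUp() {
            // TokenizerMapper is a placeholder for the mapper under test.
            mapDriver = MapDriver.newMapDriver(new TokenizerMapper());
        }

        @Test
        public void emitsOneCountPerToken() throws Exception {
            mapDriver.withInput(new LongWritable(0), new Text("hadoop hive"))
                     .withOutput(new Text("hadoop"), new IntWritable(1))
                     .withOutput(new Text("hive"), new IntWritable(1))
                     .runTest();
        }
    }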

Environment: Hadoop, MapReduce, Hive, HDFS, Pig, Sqoop, Oozie, Cloudera, Flume, HBase, ZooKeeper, CDH3, Cassandra, Oracle, NoSQL, Unix/Linux, Kafka.

Confidential, Tampa, FL

Sr. Java Developer

Responsibilities:

  • Involved in Object-Oriented Analysis and Design, producing UML sequence diagrams and class diagrams with Rational Rose.
  • Responsibilities included analysis of the various applications, design of the enterprise applications, coordination with the client and offshore team, meetings with business users, functional and technical guidance to the offshore team, and project management.
  • Designed the UI using JSP and HTML, with JavaScript validation, to provide the user interface and communication between the client and server.
  • Implemented MVC architecture using the Struts framework.
  • Implemented Spring dependency injection of the database helper instance into the action objects.
  • Used Dojo, JavaScript, and Spring to create an interactive user interface.
  • Experience in dimensionally modeling relational data sources (DMR) using Cognos 8 framework.
  • Wrote Action classes, business objects, and service classes.
  • Configured struts-config.xml with all the mappings required by the architecture.
  • Created detailed design documents containing the UML design diagrams, table information, and object model.
  • Worked with various version control tools like CVS, ClearCase and Subversion (SVN).
  • Used the Hibernate object/relational mapping framework as the persistence layer for interacting with DB2.
  • Developed EJB components (session beans, entity beans) using EJB design patterns for business and data processing.
  • Worked on Hibernate object/relational mappings according to the database schema.
  • Prepared the Java/J2EE development structure for Maven.
  • Responsible for modifying existing DB2 stored procedures and writing SQL queries as per requirement.
  • Developed the Servlets for processing the data on the server.
  • Experience with SQL and advanced CRUD operations.
  • Used Hibernate annotations to avoid writing a mapping file (see the entity sketch after this list).
  • Developed Web Based Rich Internet Application (RIA) using Adobe Flex.
  • Used Subversion for version control and created automated build scripts.
  • Used Apache Maven to build projects, which assisted in testing and in producing reports on projects.
  • Wrote JMS classes to communicate with MQ Series deployed at the credit card issuing agencies (VISA, MasterCard); a JMS sender sketch also follows this list.
  • Used SQL statements and procedures to fetch the data from the database.
  • Installed, configured, and maintained the WebSphere server and DB2.
  • Deployed the application to the WebSphere application server.
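
An annotation-mapped entity of the kind referenced above might look like the following minimal sketch; the table and column names are illustrative assumptions.

    import javax.persistence.Column;
    import javax.persistence.Entity;
    import javax.persistence.GeneratedValue;
    import javax.persistence.Id;
    import javax.persistence.Table;

    // Illustrative Hibernate entity mapped with annotations instead of
    // an hbm.xml mapping file.
    @Entity
    @Table(name = "ACCOUNT")
    public class Account {
        @Id
        @GeneratedValue
        @Column(name = "ACCOUNT_ID")
        private Long id;

        @Column(name = "HOLDER_NAME", nullable = false)
        private String holderName;

        public Long getId() { return id; }
        public String getHolderName() { return holderName; }
        public void setHolderName(String holderName) { this.holderName = holderName; }
    }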
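
And a minimal sketch of a JMS sender of the kind used for the MQ Series integration; the JNDI names and class name are illustrative assumptions, not the actual configuration.

    import javax.jms.Connection;
    import javax.jms.ConnectionFactory;
    import javax.jms.MessageProducer;
    import javax.jms.Queue;
    import javax.jms.Session;
    import javax.naming.InitialContext;

    // Illustrative JMS sender: looks up the factory and queue in JNDI,
    // then sends a text message and closes the connection.
    public class CardRequestSender {
        public void send(String payload) throws Exception {
            InitialContext ctx = new InitialContext();
            ConnectionFactory factory =
                (ConnectionFactory) ctx.lookup("jms/QueueConnectionFactory");
            Queue queue = (Queue) ctx.lookup("jms/CardRequestQueue");

            Connection connection = factory.createConnection();
            try {
                Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
                MessageProducer producer = session.createProducer(queue);
                producer.send(session.createTextMessage(payload));
            } finally {
                connection.close();
            }
        }
    }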

Environment: Java, Servlets, JSP, JSTL, Struts, JMS, EJB, Dojo, Cognos, Hibernate, HTML, XML, Apache, DB2, Spring, Apache CXF, CRUD, SVN, Web Services, Maven, UML, IBM WebSphere, WebSphere Portal, JUnit.

Confidential

Java Developer

Responsibilities:

  • Used Microsoft Visio for designing the Use Case Diagrams, Class model, Sequence diagrams, and Activity diagrams for SDLC process of the application.
  • Implemented GUI pages using JSP, JSTL, HTML, DHTML, XHTML, CSS, JavaScript, and AJAX.
  • Configured the project on WebSphere 6.1 application servers.
  • Implemented the online application using Core Java, JDBC, JSP, Servlets, EJB 1.1, Web Services, SOAP, and WSDL.
  • Communicated with other health care systems using Web Services (SOAP, WSDL, JAX-RPC).
  • Implemented Singleton, Factory, and DAO design patterns based on the application requirements.
  • Used SAX and DOM parsers to parse the raw XML documents (a SAX handler sketch follows this list).
  • Used RAD as Development IDE for web applications.
  • Used ClearCase as the version control tool and ClearQuest as the bug tracking tool.
  • Configured job scheduling in Linux using shell scripts and Crontab.
  • Developed test plan documents for all back end database modules.
  • Deployed the project in Linux environment.
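
A SAX handler of the kind mentioned above can be sketched as follows; the element name being counted is an illustrative assumption about the document layout.

    import java.io.File;
    import javax.xml.parsers.SAXParser;
    import javax.xml.parsers.SAXParserFactory;
    import org.xml.sax.Attributes;
    import org.xml.sax.helpers.DefaultHandler;

    // Illustrative SAX-based parsing: stream the document and count
    // occurrences of a given element without building a DOM tree.
    public class RecordCounter extends DefaultHandler {
        private int records;

        @Override
        public void startElement(String uri, String localName, String qName,
                                 Attributes attributes) {
            if ("record".equals(qName)) {
                records++;
            }
        }

        public static int count(File xmlFile) throws Exception {
            SAXParser parser = SAXParserFactory.newInstance().newSAXParser();
            RecordCounter handler = new RecordCounter();
            parser.parse(xmlFile, handler);
            return handler.records;
        }
    }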

Environment: JDK 1.5, JSP, WebSphere, JDBC, EJB 2.0, XML, DOM, SAX, XSLT, Apache Commons Digester, Apache Commons, CSS, HTML, JNDI, Web Services, WSDL, SOAP, RAD, SQL, PL/SQL, JavaScript, DHTML, XHTML, Oracle 10g, JavaMail, PL/SQL Developer, Toad, POI Reports, Log4j, ANT, ClearCase, Windows XP, Red Hat Linux.

Confidential

Jr. Java Developer

Responsibilities:

  • Developed the application based on MVC (Model-View-Controller) Architecture using Spring Web MVC.
  • Used Microsoft Visio for designing the Use Case Diagrams, Class model, Sequence diagrams, and Activity diagrams for SDLC process of the application.
  • Implemented GUI pages using JSP, JSTL, HTML, DHTML, XHTML, CSS, JavaScript, and AJAX.
  • Extensively used Java multithreading with JDK 1.5 features to implement batch jobs (see the batch-runner sketch after this list).
  • Configured the project on WebLogic 10.3 application servers.
  • Implemented the online application using Core Java, JDBC, JSP, Servlets, Spring, Hibernate, Web Services, SOAP, and WSDL.
  • Communicated with other health care systems using Web Services (SOAP, WSDL).
  • Tested the web services with the SoapUI tool.
  • Implemented Singleton, Factory, and DAO design patterns based on the application requirements.
  • Used SAX and DOM parsers to parse the raw XML documents.
  • Used RAD as Development IDE for web applications.
  • Tuned complex database queries and table joins to improve application performance.
  • Designed and developed base classes, framework classes and common re-usable components.
  • Used SVN as the version control tool and Silk Central as the bug tracking tool.
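
The multi-threaded batch work above can be sketched with the JDK 1.5 java.util.concurrent API; the pool size and the Callable unit of work are illustrative placeholders.

    import java.util.List;
    import java.util.concurrent.Callable;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.TimeUnit;

    // Illustrative multi-threaded batch runner: submits the batch tasks
    // to a fixed-size pool and waits for all of them to finish.
    public class BatchRunner {
        public void run(List<Callable<Void>> tasks) throws InterruptedException {
            ExecutorService pool = Executors.newFixedThreadPool(4);
            try {
                // invokeAll blocks until every task has completed.
                pool.invokeAll(tasks);
            } finally {
                pool.shutdown();
                pool.awaitTermination(1, TimeUnit.HOURS);
            }
        }
    }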

Environment: JDK 1.5, JSP, WebLogic 10.3, JDBC, XML, DOM, SAX, CSS, HTML, JNDI, Web Services, WSDL, SOAP, Eclipse 3.0, SQL, PL/SQL, JavaScript, HTML, Oracle 11g, PL/SQL Developer, Log4j, ANT, SVN, Silk Central.
