
Hadoop Developer Resume Profile


Rochester, MN

SUMMARY

  • 7 years of professional IT experience, including 2 years of experience in Big Data, Hadoop development, and ecosystem analytics in the Banking, Healthcare, Insurance, and Telecommunication sectors.
  • Good understanding of Hadoop Architecture and its underlying framework, including storage management.
  • Well versed in installing, configuring, supporting, and managing Big Data and the underlying infrastructure of a Hadoop Cluster.
  • Strong knowledge of using Pig/Hive for processing and analyzing large volumes of data.
  • Expert in creating Pig and Hive UDFs in Java to analyze data efficiently.
  • Expert in using Sqoop to fetch data from different systems into HDFS for analysis and to export it back to the source systems for further processing.
  • Knowledge of writing Map Reduce code in Java per business requirements.
  • Technical exposure to the Cassandra CLI: creating keyspaces and column families and analyzing fetched data.
  • Worked with the Big Data distributions Cloudera CDH3 and CDH4; also familiar with other distributions such as IBM BigInsights.
  • Worked with the ETL tool Talend to simplify Map Reduce jobs from the front end.
  • Knowledge of Pentaho Data Integration for executing ETL jobs into and out of Big Data environments.
  • Experienced in creating and analyzing Software Requirement Specifications (SRS) and Functional Specification Documents (FSD). Strong knowledge of the Software Development Life Cycle (SDLC).
  • Extensive knowledge of using SQL queries for backend database analysis.
  • Extensive knowledge of creating PL/SQL stored procedures, packages, functions, and cursors against Oracle 9i/10g/11g and MySQL Server.
  • Experienced in preparing and executing Unit Test Plans and Unit Test Cases after software development.
  • Expertise in defect management, defect tracking, and performance tuning to deliver the highest-quality product.
  • Experience with middleware architectures using Java technologies such as J2EE, JSP, and Servlets, and the WebSphere and WebLogic application servers.
  • Excellent Java development skills using J2EE, J2SE, JDBC, JavaScript, Servlets, JSP, jQuery, and JUnit.
  • Excellent Java development skills using the Struts, Hibernate, and Spring MVC frameworks along with AJAX.
  • Worked on Windows and Unix/Linux platforms with technologies such as Big Data, Java, XML, HTML, SQL, PL/SQL, and shell scripting.
  • Experience with Scrum, Agile, and Waterfall models.
  • Good communication skills; committed, result-oriented, and hard-working, with a quest to learn new technologies.
  • Experienced in working in multi-cultural environments, both within a team and individually, as per project requirements.
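The Map Reduce work described above follows the classic map-then-reduce pattern; a minimal word-count sketch of that pattern in plain Java (no Hadoop dependency; all names are illustrative, not from an actual project) looks like this:

```java
import java.util.*;
import java.util.stream.*;

// Illustrative sketch of the MapReduce model in plain Java: the "map"
// phase emits (word, 1) pairs and the "reduce" phase sums the counts
// per word, as a Hadoop word-count job would.
public class WordCountSketch {

    // Map phase: split each input line into (word, 1) pairs.
    static Stream<Map.Entry<String, Integer>> map(String line) {
        return Arrays.stream(line.toLowerCase().split("\\W+"))
                     .filter(w -> !w.isEmpty())
                     .map(w -> Map.entry(w, 1));
    }

    // Reduce phase: group pairs by key and sum the values.
    static Map<String, Integer> reduce(Stream<Map.Entry<String, Integer>> pairs) {
        return pairs.collect(Collectors.groupingBy(
                Map.Entry::getKey,
                Collectors.summingInt(Map.Entry::getValue)));
    }

    public static void main(String[] args) {
        List<String> lines = List.of("big data big cluster", "data pipeline");
        Map<String, Integer> counts =
                reduce(lines.stream().flatMap(WordCountSketch::map));
        System.out.println(counts.get("big"));   // 2
        System.out.println(counts.get("data"));  // 2
    }
}
```

In Hadoop the framework shuffles the intermediate pairs between the two phases across the cluster; the sketch keeps both phases in one process.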

TECHNICAL SKILLS

Hadoop Ecosystem

HDFS, Map Reduce, Hive, Pig, HBase, Zookeeper, Oozie, Cassandra, Sqoop, Flume, and Avro.

Web Technologies

AJAX, REST, SOAP, WSDL, J2EE, JDBC, JSP, JavaScript, XML, XHTML, HTML.

Methodologies

Scrum, Agile, Waterfall, UML, Design Patterns (Core Java and J2EE).

NoSQL Databases

HBase, Cassandra.

Frameworks

MVC, Struts, Hibernate, Spring.

Programming Languages

Java, C, C++, Python, Linux shell scripts, PL/SQL.

Operating Systems

Windows, Unix/Linux.

Data Bases

Oracle 9i/10g/11g, DB2, MySQL, MS Access.

Application Servers

WebLogic, WebSphere, Apache Tomcat.

Monitoring/Reporting Tools

Ganglia, Nagios, custom shell scripts.

PROFESSIONAL EXPERIENCE

Confidential

Title: Hadoop Developer

Responsibilities:

  • Responsible for preparing and analyzing technical and functional specs, development and maintenance of code.
  • Developed multiple Map Reduce jobs in Java for data cleaning and pre-processing.
  • Reviewed large data sets by running Hive Queries and Pig Scripts.
  • Developed numerous Pig batch programs for both implementation and optimization needs.
  • Used Pig Latin and Hive QL to work on raw data and produced reports for the client's large-scale data analysis.
  • Involved in design and implementation of Hadoop Framework to store and analyze large collections of documents.
  • Used HBase in accordance with Hive/Pig as per the requirement.
  • Associated with creating Hive Tables, and loading and analyzing data using Hive Queries.
  • Involved in running Hadoop jobs for processing millions of records of text data.
  • Worked with application teams to install operating system and Hadoop updates, patches, and version upgrades as needed.
  • Involved in loading data from Linux file system to HDFS.
  • Involved in creating tables, partitioning, bucketing of tables.
  • Extracted data from Cassandra through Sqoop, placed it in HDFS, and processed it.
  • Also used Sqoop to transfer data between RDBMS and HDFS.
  • Created and maintained Technical documentation for launching Hadoop Clusters and for executing Hive Queries and Pig Scripts.
  • Ran Hadoop streaming jobs to process terabytes of XML-format data.
  • Transformed large sets of structured, semi-structured, and unstructured data.
  • Ensured adherence to guidelines and standards in the project process.
  • Facilitated testing in different environments.
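Data-cleaning Map Reduce jobs like those above typically delegate to record-level cleaning logic; here is a minimal plain-Java sketch of such logic, assuming tab-delimited records with a fixed field count (an illustrative assumption, not the actual schema):

```java
import java.util.*;
import java.util.stream.*;

// Hedged sketch of the kind of record cleaning a pre-processing job
// might apply: drop blank or malformed records, trim each field.
public class RecordCleaner {

    static final int EXPECTED_FIELDS = 3; // illustrative schema assumption

    // Returns the cleaned tab-delimited record, or empty if malformed.
    static Optional<String> clean(String raw) {
        if (raw == null || raw.isBlank()) return Optional.empty();
        String[] fields = raw.split("\t", -1);
        if (fields.length != EXPECTED_FIELDS) return Optional.empty();
        String cleaned = Arrays.stream(fields)
                               .map(String::trim)
                               .collect(Collectors.joining("\t"));
        return Optional.of(cleaned);
    }

    public static void main(String[] args) {
        System.out.println(clean(" a \tb\t c ").isPresent()); // true
        System.out.println(clean("bad record").isPresent());  // false
    }
}
```

In an actual job this method would sit inside a Mapper, with bad records counted via Hadoop counters rather than silently dropped.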

Environment: Hadoop, HDFS, HBase, Pig, Hive, Map Reduce, Cassandra, Sqoop, Flume, Java, XML, SQL, Linux.

Confidential

Title: Hadoop Developer

Responsibilities:

  • Worked as a developer on the Big Data team with Hadoop and its ecosystem.
  • Installed and configured Hadoop, Map Reduce, HDFS.
  • Used Hive QL to do analysis on the data and identify different correlations.
  • Developed multiple Map Reduce jobs in Java for data cleaning and preprocessing.
  • Installed and configured Pig and wrote Pig Latin scripts.
  • Wrote Map Reduce jobs using Pig Latin.
  • Strong understanding of the REST architectural style and its application to well-performing web sites for global usage.
  • Developed and maintained Hive QL, Pig Latin Scripts, and Map Reduce.
  • Worked on the RDBMS system using PL/SQL to create packages, procedures, functions, triggers as per the business requirements.
  • Involved in ETL, Data Integration and Migration.
  • Worked on Talend to run ETL jobs on the data in HDFS.
  • Imported data using Sqoop to load data from Oracle to HDFS on a regular basis.
  • Developed scripts and batch jobs to schedule various Hadoop programs.
  • Wrote Hive Queries for data analysis to meet business requirements.
  • Created Hive Tables and worked on them using Hive QL.
  • Imported and exported data between the Oracle Database and HDFS using Sqoop.
  • Experienced in defining job flows.
  • Experience with NoSQL database HBase.
  • Worked on a hybrid implementation using Oracle alongside Hadoop.
  • Wrote and modified stored procedures to load and modify data according to business rule changes.
  • Involved in creating Hive Tables, loading the data, and writing Hive Queries that run internally as Map Reduce jobs.
  • Developed a custom file system plugin for Hadoop to access files on data platform.
  • The custom file system plugin allows Hadoop Map Reduce programs, HBase, Pig, and Hive to access files directly.
  • Extracted feeds from social media sites such as Facebook, Twitter using Python scripts.
  • Organized and benchmarked Hadoop/HBase Clusters for internal use.
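The Hive QL analysis above boils down to SQL-style grouping and aggregation; a plain-Java analogue of a `GROUP BY` with `AVG` looks like this (the record fields are illustrative assumptions, not an actual table schema):

```java
import java.util.*;
import java.util.stream.*;

// Illustrative plain-Java analogue of a Hive QL aggregation:
// SELECT region, AVG(amount) FROM txns GROUP BY region;
public class GroupBySketch {

    static class Txn {
        final String region;
        final double amount;
        Txn(String region, double amount) { this.region = region; this.amount = amount; }
    }

    // Group transactions by region and average the amounts.
    static Map<String, Double> avgByRegion(List<Txn> txns) {
        return txns.stream().collect(Collectors.groupingBy(
                t -> t.region,
                Collectors.averagingDouble(t -> t.amount)));
    }

    public static void main(String[] args) {
        List<Txn> txns = List.of(
                new Txn("east", 10.0), new Txn("east", 30.0), new Txn("west", 5.0));
        System.out.println(avgByRegion(txns).get("east")); // 20.0
    }
}
```

Hive compiles the equivalent query into Map Reduce jobs over HDFS; the sketch shows only the aggregation semantics, in-process.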

Environment: Hadoop, HDFS, HBase, Pig, Hive, MapReduce, Sqoop, Flume, ETL, REST, Java, Python, PL/SQL, Oracle 11g, Unix/Linux.

Confidential

Title: Java Developer

Responsibilities:

  • Developed an end to end vertical slice for a JEE based application using popular frameworks Spring, Hibernate, JSF, Facelets, XHTML, Maven2, and AJAX by applying OO Design Concepts, JEE, and GoF Design Patterns.
  • Designed the logical and physical data model, generated DDL scripts, and wrote DML scripts for Oracle 9i database.
  • Tuned SQL statements, Hibernate mappings, and the WebSphere application server to improve performance and consequently meet the SLAs.
  • Collected business requirements and wrote functional specifications and detailed design documents.
  • Detected and fixed transactional issues due to wrong exception handling and concurrency issues because of unsynchronized block of code.
  • Employed MVC Struts framework for application design.
  • Assisted in designing, building, and maintaining database to analyze life cycle of checking and debit transactions.
  • Used WebSphere to develop JAX-RPC web services.
  • Developed Unit Test Cases, and used JUNIT for Unit Testing of the application.
  • Involved in the design team for designing the Java Process Flow architecture.
  • Worked with QA, business, and architecture teams to resolve various defects and meet deadlines.

Environment: Spring, Hibernate, Struts MVC, AJAX, WebSphere, Maven2, Java, JavaScript, JUnit, XHTML, HTML, DB2, SQL, UML, Oracle, Eclipse, Windows.

Confidential

Title: Java/J2EE Developer

Responsibilities:

  • Implemented services using Core Java.
  • Handled design reviews and technical reviews with other project stakeholders.
  • Developed analysis level documentation such as Use Case, Business Domain Model, Activity, Sequence and Class Diagrams.
  • Played an effective role in the team by interacting with welfare business analysts/program specialists and transforming business requirements into system requirements.
  • Developed and deployed UI layer logics of sites using JSP.
  • Used Spring MVC to implement the business model logic.
  • Worked with Struts MVC objects such as the action servlet, controllers, validators, web application context, handler mappings, and message resource bundles, and used JNDI to look up J2EE components.
  • Developed dynamic JSP pages with Struts.
  • Employed built-in/custom interceptors, and validators of Struts.
  • Developed the XML data object to generate the PDF documents, and reports.
  • Employed Hibernate, DAO, and JDBC for data retrieval and modification in the database.
  • Handled messaging and interaction of web services using SOAP.
  • Developed JUnit test cases for unit testing as well as system and user test scenarios.
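The Hibernate/DAO/JDBC data access above follows the DAO pattern; a minimal sketch with an in-memory store standing in for the real persistence layer (entity and method names are illustrative, not from the actual project):

```java
import java.util.*;

// Hedged sketch of the DAO pattern: callers program against the
// interface, so the persistence mechanism can be swapped out.
public class DaoSketch {

    static class Customer {
        final int id;
        final String name;
        Customer(int id, String name) { this.id = id; this.name = name; }
    }

    interface CustomerDao {
        void save(Customer c);
        Optional<Customer> findById(int id);
    }

    // In a real system this class would wrap a Hibernate Session
    // or a JDBC Connection; here a HashMap stands in for the database.
    static class InMemoryCustomerDao implements CustomerDao {
        private final Map<Integer, Customer> store = new HashMap<>();
        public void save(Customer c) { store.put(c.id, c); }
        public Optional<Customer> findById(int id) {
            return Optional.ofNullable(store.get(id));
        }
    }

    public static void main(String[] args) {
        CustomerDao dao = new InMemoryCustomerDao();
        dao.save(new Customer(1, "Ada"));
        System.out.println(dao.findById(1).map(c -> c.name).orElse("missing")); // Ada
    }
}
```

The same interface also makes the JUnit testing mentioned above straightforward, since tests can run against the in-memory implementation.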

Environment: Struts, Hibernate, Spring MVC, SOAP, WSDL, WebLogic, Java, JDBC, JavaScript, Servlets, JSP, JUnit, XML, UML, Eclipse, Windows.

Confidential

Title: Java /J2EE Developer

Responsibilities:

  • Involved in the Analysis, Design, Implementation, and Testing of the project.
  • Implemented the presentation layer with HTML, XHTML, and JavaScript.
  • Developed web components using JSP, Servlets and JDBC.
  • Designed Tables and Indexes.
  • Wrote SQL queries and stored procedures.
  • Involved in fixing bugs and unit testing with JUnit test cases.
  • Actively involved in the System Testing.
  • Involved in implementing service layer using Spring IOC module.
  • Used GitHub as a code repository.
  • Used Gradle as a build tool.
  • Developed the Installation Guide, Customer Guide, and Configuration Document, which were delivered to the customer along with the product.
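The Spring IOC service layer mentioned above relies on dependency injection: the service receives its collaborators rather than constructing them. A minimal hand-wired sketch of that idea (class names are illustrative assumptions):

```java
// Hedged sketch of inversion of control: in Spring the container
// wires the dependency; here the wiring is done by hand in main.
public class IocSketch {

    interface GreetingRepository {
        String greetingFor(String user);
    }

    static class ServiceLayer {
        private final GreetingRepository repo;
        ServiceLayer(GreetingRepository repo) { this.repo = repo; } // injected
        String greet(String user) { return repo.greetingFor(user); }
    }

    public static void main(String[] args) {
        // The service never news up its dependency, so tests or the
        // container can substitute any implementation.
        ServiceLayer svc = new ServiceLayer(u -> "Hello, " + u);
        System.out.println(svc.greet("world")); // Hello, world
    }
}
```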

Environment: Spring, Java, JDBC, JavaScript, Servlets, JSP, XHTML, HTML, SQL, JUnit, UML, MySQL, Eclipse IDE, Windows XP.

Confidential

Title: Junior Java Developer

Responsibilities:

  • Involved in designing the Project Structure, System Design and every phase in the project.
  • Responsible for developing platform related logic and resource classes, controller classes to access the domain and service classes.
  • Involved in Technical Discussions, Design, and Workflow.
  • Participated in requirement gathering and analysis.
  • Employed JAXB to unmarshal XML into Java objects.
  • Developed Unit Testing cases using JUnit Framework.
  • Implemented the data access using Hibernate and wrote the domain classes to generate the Database Tables.
  • Involved in implementation of view pages based on XML attributes using normal Java classes.
  • Involved in integration of App Builder and UI modules with the platform.
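The JAXB unmarshalling step above maps XML into Java objects; a minimal sketch of the same idea using only the JDK's built-in DOM parser (JAXB itself requires an external module on modern JDKs; the XML shape and class names are illustrative assumptions):

```java
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;

// Hedged sketch of XML-to-object mapping: parse the document and
// copy the fields of interest into a plain Java object.
public class XmlSketch {

    static class User {
        String name;
    }

    static User unmarshal(String xml) {
        try {
            Document doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder()
                    .parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
            User u = new User();
            u.name = doc.getElementsByTagName("name").item(0).getTextContent();
            return u;
        } catch (Exception e) {
            throw new IllegalStateException("malformed XML", e);
        }
    }

    public static void main(String[] args) {
        User u = unmarshal("<user><name>Ada</name></user>");
        System.out.println(u.name); // Ada
    }
}
```

JAXB does the same mapping declaratively from annotations on the target class, instead of the manual field copy shown here.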

Environment: Hibernate, Java, JAXB, JUnit, XML, UML, Oracle 11g, Eclipse, Windows XP.
