
Hadoop/Cassandra Developer Resume


CA

SUMMARY

  • Over 6 years of professional experience in IT with expertise in Enterprise Application Development, including 2+ years in Big Data analytics.
  • Over 2 years of experience as a Hadoop Developer with good knowledge of Hadoop ecosystem technologies.
  • Experience in developing MapReduce programs using Apache Hadoop to analyze big data per requirements.
  • Working knowledge of the major Hadoop ecosystem components Pig, Hive, HBase and Cloudera Manager.
  • Experience in developing Pig Latin scripts and using Hive Query Language.
  • Experience working on NoSQL databases including Cassandra and HBase.
  • Experience using Sqoop to import data into HDFS from RDBMS and vice versa.
  • 5 years of experience in core Java programming with hands-on experience in the Spring and Struts frameworks.
  • Good knowledge of advanced Java topics such as generics, collections and multi-threading.
  • Hands-on experience working on Amazon SQS.
  • Experience in data warehousing with the ETL tool Oracle Warehouse Builder (OWB).
  • Experience in database development using SQL and PL/SQL, and experience working on databases such as Oracle 9i/10g, SQL Server and MySQL.
  • Excellent interpersonal and problem-solving skills; a good team player with experience interacting with clients.

TECHNICAL SKILLS

Hadoop/Big Data Technologies: HDFS, MapReduce, HBase, Pig, Hive, Sqoop, Flume, Cassandra, Oozie, Zookeeper, YARN

Programming Languages: Java, C, Python, SQL, PL/SQL, Shell Scripting

Frameworks: MVC, Spring, Struts, Hibernate, .NET

Web Technologies: HTML, XML, Ajax, SOAP

Databases: Cassandra, Oracle 9i/10g/11g, SQL Server, MySQL

Database Tools: TOAD, Chordiant CRM tool, Kenan-Fx 2.0 Billing tool, Oracle Warehouse Builder (OWB).

Operating Systems: Linux, Unix, Windows, Mac, CentOS

Other Concepts: OOP, Data Structures, Algorithms, Software Engineering, ETL

PROFESSIONAL EXPERIENCE

Confidential, CA

Hadoop/Cassandra Developer

Responsibilities:

  • Conducted a POC for Hadoop and Cassandra as part of the Nextgen platform implementation, which included connecting to the Hadoop cluster and the Cassandra ring and executing sample programs on the servers.
  • Developed several advanced MapReduce programs to process incoming data files.
  • Developed Pig scripts and UDFs as well as Hive scripts and UDFs to load data files into Hadoop.
  • Used Sqoop to import data into HDFS from the MySQL database and vice versa.
  • Bulk loaded data into Cassandra using sstableloader.
  • Developed Java programs to process huge JSON files received from the marketing team and convert them into a format standardized for the application.
  • Developed Java programs to apply verbatim cleaning rules to responses.
  • Consumed messages from Amazon SQS.
  • Experience in storing and retrieving documents in Apache Solr.
  • Responsible for handling the batch process.
  • Used Sqoop to import data into HDFS and Hive from other data systems.
  • Conducted knowledge-transfer sessions on the developed applications for colleagues.
  • Bulk loaded data into Oracle using JdbcTemplate.
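A Cassandra 1.2-era bulk load with sstableloader typically follows the shape below; the seed-node address and keyspace/table path are hypothetical placeholders, not the project's actual values.

```sh
# Illustrative only: the host address and data directory are placeholders.
# sstableloader streams pre-built SSTables into the live ring, given the
# address of at least one node (-d) from which to discover the topology.
sstableloader -d 10.0.0.1 /var/lib/cassandra/data/my_keyspace/my_table
```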

Environment: Apache Hadoop 1.2.1, Apache Cassandra 1.2, Spring, Ext JS, Apache Solr, Sencha, Alfresco, Apache 2.2, Apache Tomcat v7.0, Eclipse Kepler, SVN repository, Linux, Putty, WinSCP

Confidential, Boston, MA

Hadoop Developer

Responsibilities:

  • Installed and configured Apache Hadoop 1.0.1 to test the maintenance of log files in the Hadoop cluster.
  • Developed Java MapReduce programs for the analysis of sample log files stored in the cluster.
  • Involved in the installation of CDH3 and the upgrade from CDH3 to CDH4.
  • Developed MapReduce programs for data analysis and data cleaning.
  • Developed Pig Latin scripts for the analysis of semi-structured data.
  • Used Hive, created Hive tables, and was involved in data loading and writing Hive UDFs.
  • Used Sqoop to import data into HDFS and Hive from other data systems.
  • Migrated ETL processes from MySQL to Hive to evaluate easier data manipulation.
  • Developed Hive queries to process the data for visualization.
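A Pig Latin script for this kind of semi-structured log analysis might look like the sketch below; the log path and field layout are assumptions for illustration, not the project's actual data.

```pig
-- Hypothetical tab-separated log layout: timestamp, level, message.
logs   = LOAD '/data/logs/sample.log' USING PigStorage('\t')
             AS (ts:chararray, level:chararray, msg:chararray);
errors = FILTER logs BY level == 'ERROR';   -- keep only error records
by_lvl = GROUP logs BY level;               -- bucket records per log level
counts = FOREACH by_lvl GENERATE group AS level, COUNT(logs) AS n;
STORE counts INTO '/data/out/level_counts';
```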

Environment: Apache Hadoop 1.0.1, HDFS, CentOS 6.4, Java, MapReduce, Eclipse Indigo, Hive, Pig, Sqoop, Oozie and MySQL

Confidential, Washington, DC

Hadoop Developer

Responsibilities:

  • Installed and configured Hadoop; responsible for maintaining the cluster and managing and reviewing Hadoop log files.
  • Developed MapReduce programs in Java for data analysis.
  • Loaded data from various data sources into HDFS using Flume.
  • Worked on Cloudera to analyze data stored in HDFS.
  • Worked extensively on Hive and Pig.
  • Worked on large sets of structured, semi-structured and unstructured data.
  • Used Sqoop to import and export data between HDFS and the Oracle RDBMS.
  • Developed Pig Latin scripts to explore and analyze the data.
  • Involved in creating Hive tables, loading them with data and writing Hive queries, which run internally as MapReduce jobs.
  • Good knowledge of reading data from and writing data to Cassandra.
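Creating and querying a Hive table of the kind described above might look like this sketch; the table and column names are assumptions, and on CDH3-era Hive the aggregation compiles down to MapReduce jobs.

```sql
-- Illustrative Hive DDL and query; the schema is hypothetical.
CREATE TABLE IF NOT EXISTS page_views (
  view_time STRING,
  user_id   STRING,
  url       STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
STORED AS TEXTFILE;

-- Move raw files already in HDFS into the table's warehouse directory.
LOAD DATA INPATH '/data/raw/page_views' INTO TABLE page_views;

-- The GROUP BY runs internally as a MapReduce job.
SELECT url, COUNT(*) AS views
FROM page_views
GROUP BY url;
```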

Environment: Apache Hadoop 0.20.203, Cloudera Manager (CDH3), HDFS, Java MapReduce, Eclipse, Hive, Pig, Sqoop, Oozie, SQL, Oracle 11g

Confidential, Herndon, VA

J2EE/Hadoop Developer

Responsibilities:

  • Involved in the design of core implementation logic.
  • Extensively worked on application development using Spring MVC and Hibernate frameworks.
  • Extensively used Spring JDBC Template to implement DAO methods.
  • Used WebSphere as an application server and used Apache Maven to deploy and build the application in WebSphere.
  • Performed unit testing using JUnit.
  • Developed JAX-WS clients and JAX-WS web services to coordinate with external systems.
  • Involved in design of data migration strategy to migrate the data from legacy system to Kenan FX 2.0 billing system.
  • Involved in the design of staging database as part of migration strategy.
  • Developed efficient PL/SQL packages for data migration and involved in bulk loads, testing and reports generation.
  • Installed and configured Apache Hadoop 0.20.1 and developed MapReduce programs to analyze sample test files.
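A common PL/SQL bulk-load pattern for this kind of staging-to-target migration is sketched below; the staging and target table names are hypothetical, not taken from the Kenan FX migration itself.

```sql
-- Hypothetical tables: stg_accounts (staging) -> accounts (target).
DECLARE
  TYPE t_rows IS TABLE OF stg_accounts%ROWTYPE;
  v_rows t_rows;
  CURSOR c_stg IS SELECT * FROM stg_accounts;
BEGIN
  OPEN c_stg;
  LOOP
    FETCH c_stg BULK COLLECT INTO v_rows LIMIT 1000;  -- fetch in batches
    EXIT WHEN v_rows.COUNT = 0;
    FORALL i IN 1 .. v_rows.COUNT                     -- one bulk DML per batch
      INSERT INTO accounts VALUES v_rows(i);
    COMMIT;                                           -- commit each batch
  END LOOP;
  CLOSE c_stg;
END;
/
```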

Environment: Java, J2EE, Spring IOC, Spring AOP, Spring JDBC, Hibernate 3.0, WebSphere 7, TOAD, Oracle-9i, Kenan Fx-2.0, Chordiant CRM, PL/SQL, Apache Hadoop 0.20.1

Confidential, NY

Java/J2EE Developer

Responsibilities:

  • Involved in almost all the phases of SDLC.
  • Complete involvement in Requirement Analysis and documentation on Requirement Specification.
  • Developed prototype based on the requirements using Struts2 framework as part of POC (Proof of Concept).
  • Prepared use-case diagrams, class diagrams and sequence diagrams as part of requirement specification documentation.
  • Involved in design of the core implementation logic using MVC architecture.
  • Used Apache Maven to build and configure the application.
  • Configured the struts.xml file with the required action mappings for all required services.
  • Developed implementation logic using the Struts2 framework.
  • Developed JAX-WS web services to provide services to other systems.
  • Developed JAX-WS clients to consume some of the services provided by other systems.
  • Involved in developing EJB 3.0 stateless session beans in the business tier to expose business services to the services component as well as the web tier.
  • Implemented Hibernate at the DAO layer by configuring the Hibernate configuration file for different databases.
  • Developed business services that utilize Hibernate service classes to connect to the database and perform the required actions.
  • Developed JSP pages using Struts JSP tags and in-house tags to meet business requirements.
  • Developed JavaScript validations to validate form fields.
  • Performed unit testing for the developed code using JUnit.
  • Developed design documents for the code developed.
  • Used SVN repository for version control of the developed code.
  • Mentored junior developers in their development activities.
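The client-side validations mentioned above might follow the shape of this sketch; the field names and rules (email format, 10-digit phone) are assumptions for illustration, not the project's actual rules.

```javascript
// Hypothetical form-field validator: returns a list of error messages,
// where an empty list means the form may be submitted.
function validateFields(fields) {
  const errors = [];
  // Assumed rule: email must look like name@domain.tld.
  if (!fields.email || !/^[^@\s]+@[^@\s]+\.[^@\s]+$/.test(fields.email)) {
    errors.push("email: invalid or missing");
  }
  // Assumed rule: phone must be exactly 10 digits.
  if (!fields.phone || !/^\d{10}$/.test(fields.phone)) {
    errors.push("phone: must be 10 digits");
  }
  return errors;
}
```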

Environment: Java, J2EE, MVC, Struts2, Hibernate, Apache Tomcat Server, XML.

Confidential, Hyderabad

Java/J2EE Developer

Responsibilities:

  • Involved in requirements analysis and prepared Requirements Specifications document.
  • Designed implementation logic for core functionalities.
  • Developed service-layer logic for core modules using JSPs and Servlets and was involved in integration with the presentation layer.
  • Involved in implementing presentation-layer logic using HTML, CSS, JavaScript and XHTML.
  • Designed the MySQL database to store customers' general and billing details.
  • Used JDBC connections to store and retrieve data from the database.
  • Developed complex SQL queries and stored procedures to process and store the data.
  • Used ANT as a build tool to configure the application.
  • Developed test cases using JUnit.
  • Involved in unit testing and bug fixing.
  • Prepared design documents for code developed and defect tracker maintenance.
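A MySQL stored procedure of the kind described above might look like the sketch below; the billing schema (a `bills` table with `customer_id` and `amount`) is hypothetical.

```sql
-- Hypothetical schema: bills(customer_id, amount).
DELIMITER //
CREATE PROCEDURE customer_total(IN p_customer_id INT,
                                OUT p_total DECIMAL(10,2))
BEGIN
  SELECT COALESCE(SUM(b.amount), 0)
    INTO p_total
    FROM bills b
   WHERE b.customer_id = p_customer_id;
END //
DELIMITER ;

-- Usage: CALL customer_total(42, @total); SELECT @total;
```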

Environment: Java, J2EE (JSPs & Servlets), JUnit, HTML, CSS, JavaScript, Apache Tomcat, MySQL
