Sr. Hadoop Developer Resume

Fair Haven, MA

SUMMARY

  • Over 7 years of experience in the IT industry, involved in developing, implementing, testing, and maintaining various web-based applications using J2EE technologies and Big Data ecosystems in Linux environments.
  • Over 2 years of comprehensive experience as a Hadoop, Big Data & Analytics Developer.
  • Expertise in Hadoop architecture and ecosystem components such as HDFS, JobTracker, TaskTracker, NameNode, DataNode, and the MapReduce programming paradigm.
  • Experience in installing, configuring, supporting, and monitoring Hadoop clusters using Apache and Cloudera distributions and AWS.
  • Knowledge in installing, configuring, and using Hadoop ecosystem components such as MapReduce, HDFS, HBase, Oozie, Hive, Sqoop, Pig, ZooKeeper, Falcon, Spark, and Flume.
  • Imported and exported data between relational databases and HDFS using Sqoop.
  • Background in R and statistics.
  • Strong knowledge of Kafka, a unified, high-throughput, low-latency platform for handling real-time data feeds.
  • Knowledge of Gradle and Elasticsearch.
  • Knowledge of Apache Falcon for enhancing operations, supporting transactional applications, and improving tooling.
  • Experience in importing and exporting data between HDFS and relational/non-relational database systems using Apache Sqoop.
  • Extended Hive and Pig core functionality by writing custom UDFs (a minimal UDF sketch follows this list).
  • Awareness of metadata management tools and techniques.
  • Experienced in analyzing data using HiveQL, Pig Latin, and custom MapReduce programs in Java.
  • Experience in building and maintaining multiple Hadoop clusters (production, development, etc.) of varying sizes and configurations, and in setting up rack topology for large clusters.
  • Worked on NoSQL databases including MarkLogic, HBase, Cassandra, and MongoDB.
  • Experienced in job workflow scheduling and monitoring tools such as Oozie and ZooKeeper.
  • Experienced in designing, developing, and implementing connectivity products that allow efficient exchange of data between the core database engine and the Hadoop ecosystem.
  • Experienced in data warehousing and ETL tools such as Informatica and Pentaho.
  • Expert-level skills in developing intranet/internet applications using Java/J2EE technologies, including the Struts framework, MVC design patterns, Chordiant, Servlets, JSP, JSTL, XML/XSLT, JavaScript, AJAX, EJB, JDBC, JMS, JNDI, RDBMS, SOAP, Hibernate, and custom tag libraries.
  • Experience using XML, XSD, and XSLT.
  • Experience with web-based UI development using jQuery UI, jQuery, ExtJS, CSS, HTML, HTML5, XHTML, JavaScript, and Node.js.
  • Extensive experience in middle-tier development using J2EE technologies such as JDBC, JNDI, JSP, Servlets, JSF, Struts, Spring, Hibernate, and EJB.
  • Possess excellent technical skills, have consistently delivered ahead of schedule, and have strong interpersonal and communication skills.
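
As an illustration of the custom Hive UDF work listed above, here is a minimal sketch in Java using Hive's classic UDF API; the class name and normalization logic are hypothetical examples, not taken from any project described here.

    import org.apache.hadoop.hive.ql.exec.UDF;
    import org.apache.hadoop.io.Text;

    // Hypothetical UDF: trims and upper-cases a string column, passing NULLs through.
    public final class NormalizeCode extends UDF {
        public Text evaluate(Text input) {
            if (input == null) {
                return null; // preserve SQL NULL semantics
            }
            return new Text(input.toString().trim().toUpperCase());
        }
    }

Such a UDF is packaged in a jar, registered in the Hive session with ADD JAR, and exposed with CREATE TEMPORARY FUNCTION normalize_code AS 'NormalizeCode' before being called from HiveQL.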

TECHNICAL SKILLS

Hadoop/Big Data: HDFS, MapReduce, HBase, Pig, Flume, Hive, Sqoop, Cassandra (NoSQL DB), Storm, Kafka, CDH3, CDH4, Apache Hadoop, Apache Mahout.

Java& J2EE Technologies: Core Java, Servlets, JSP, JDBC, JNDI, Java Beans

IDEs: Eclipse, NetBeans

Frameworks: MVC, Struts, Hibernate, Spring

Programming languages: C, C++, Java, JavaScript, Node.js, JSON, R, Python, Ant scripts, Perl, Linux shell scripts, Scala

Build Management Tools: Maven, Apache Ant

Version control: SVN, ClearCase

Databases: Oracle 11g/10g/9i, MySQL, DB2, MS-SQL Server

Web/Application Servers: WebLogic, WebSphere, Apache Tomcat

Web Technologies: HTML, XML, JavaScript, AJAX, SOAP, WSDL

Network Protocols: TCP/IP, UDP, HTTP, DNS, DHCP

ETL Tools: Informatica, Pentaho, Ab Initio

Testing: WinRunner, LoadRunner, QTP, UAT

Reporting Tools: Tableau, D3

Architecture: Unified Modeling Language (UML), design patterns, and object-oriented analysis and design

PROFESSIONAL EXPERIENCE

Confidential, Fair Haven, MA

Sr. Hadoop Developer

Responsibilities:

  • Developed a framework that incrementally imports data with Sqoop as required, based on a timeline, and automated this process using the Zeke enterprise scheduler.
  • Involved in developing an automated framework that imports data with Sqoop and creates Hive tables on top of that data using shell scripts.
  • Worked extensively with Sqoop for importing data from DB2 to HDFS.
  • Created tables and partitions in Hive.
  • Created the developer Unit test plans and executed testing in development cluster.
  • Involved in creating REST services and writing XQuery and XSLT transformations to ingest data into MarkLogic.
  • Involved in implementing the MarkLogic framework on Roxy.
  • Involved in developing a framework that creates external and managed Hive tables in batch processing, based on metadata files.
  • Along with the infrastructure team, helped build much of the Kafka-Storm based data pipeline; this work was done in Java (a minimal topology sketch follows this list).
  • Implemented data mining algorithms for clustering, classification, and batch-based collaborative filtering on top of Apache Hadoop using the MapReduce paradigm.
  • Implemented a large-scale Cassandra deployment on RHEL 6.x.
  • Implemented the Cassandra platform, leveraging infrastructure automation via Puppet.
  • Assisted in tuning the Cassandra platform.
  • Experienced with SDLC including coding standards, code reviews, source control management and build processes.
  • Experienced in gathering requirements, converting requirements into technical specification, exploring efficient implementation methods.
  • Designed and documented operational problems following standards and procedures, using IBM Rational ClearCase.
  • Worked on Hortonworks Hadoop distribution.
  • Developed visualizations on Tableau to visually update and report the outputs.
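
To give a sense of the Kafka-Storm pipeline work above, here is a minimal Java topology sketch assuming the storm-kafka APIs of that era (Storm 0.9.x, backtype.storm packages); the ZooKeeper host, topic name, and bolt logic are hypothetical placeholders, not details from the actual pipeline.

    import backtype.storm.Config;
    import backtype.storm.LocalCluster;
    import backtype.storm.spout.SchemeAsMultiScheme;
    import backtype.storm.topology.BasicOutputCollector;
    import backtype.storm.topology.OutputFieldsDeclarer;
    import backtype.storm.topology.TopologyBuilder;
    import backtype.storm.topology.base.BaseBasicBolt;
    import backtype.storm.tuple.Tuple;
    import storm.kafka.KafkaSpout;
    import storm.kafka.SpoutConfig;
    import storm.kafka.StringScheme;
    import storm.kafka.ZkHosts;

    public class PipelineTopology {

        // Hypothetical terminal bolt: just logs each message; a real pipeline
        // bolt would parse, validate, and forward records downstream.
        public static class LogBolt extends BaseBasicBolt {
            public void execute(Tuple tuple, BasicOutputCollector collector) {
                System.out.println(tuple.getString(0));
            }
            public void declareOutputFields(OutputFieldsDeclarer declarer) {
                // terminal bolt: emits nothing
            }
        }

        public static void main(String[] args) {
            // Kafka spout configuration: host, topic, ZK root, and id are placeholders.
            SpoutConfig spoutConf = new SpoutConfig(
                    new ZkHosts("zkhost:2181"), "events", "/kafka-storm", "pipeline");
            spoutConf.scheme = new SchemeAsMultiScheme(new StringScheme());

            TopologyBuilder builder = new TopologyBuilder();
            builder.setSpout("kafka-spout", new KafkaSpout(spoutConf), 2);
            builder.setBolt("log", new LogBolt(), 4).shuffleGrouping("kafka-spout");

            new LocalCluster().submitTopology("pipeline", new Config(),
                    builder.createTopology());
        }
    }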

Environment: Hadoop 2.2, Kafka, Storm, DB2, HBase, SQL, NoSQL (Cassandra), IBM Data Studio, Tableau, HDFS, MapReduce, Apache Mahout, Sqoop, Pig, Hive, Zeke, Linux, MarkLogic.

Confidential, Providence, RI

Hadoop Developer

Responsibilities:

  • Responsible for building scalable distributed data solutions using Hadoop
  • Installed and configured Hive, Pig, Sqoop and Oozie on the Hadoop cluster
  • Developed simple to complex MapReduce jobs using Hive and Pig
  • Optimized MapReduce jobs to use HDFS efficiently by applying various compression mechanisms
  • Handled importing of data from various data sources, performed transformations using Hive and MapReduce, loaded data into HDFS, and extracted data from Oracle into HDFS using Sqoop
  • Analyzed the data by performing Hive queries and running Pig scripts to study customer behavior
  • Implemented Cascading Hive 2.0 to read and write ACID ORC tables
  • Configured Cassandra, which was used as a backend database for streaming purposes
  • Implemented Cassandra 0.6+ integration based on implementations of InputSplit, InputFormat, and RecordReader for Hadoop MapReduce jobs to retrieve data from Cassandra
  • Used Pig UDFs to implement business logic in Hadoop
  • Implemented business logic by writing UDFs in Java and used various UDFs from Piggybank and other sources (a Pig UDF sketch follows this list).
  • Worked with application teams to install operating system, Hadoop updates, patches, version upgrades as required
  • Installed Oozie workflow engine to run multiple Hive and Pig jobs
  • Exported the analyzed data to the relational databases using Sqoop for visualization and to generate reports for the BI team
  • Involved in beta and user acceptance testing to support the client
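
As a concrete illustration of the Java Pig UDFs mentioned above, here is a minimal EvalFunc sketch; the function name and logic are hypothetical, not taken from the project.

    import java.io.IOException;
    import org.apache.pig.EvalFunc;
    import org.apache.pig.data.Tuple;

    // Hypothetical Pig UDF: trims and upper-cases its first string argument.
    public class TrimToUpper extends EvalFunc<String> {
        @Override
        public String exec(Tuple input) throws IOException {
            if (input == null || input.size() == 0 || input.get(0) == null) {
                return null; // a null return becomes a null field in Pig
            }
            return ((String) input.get(0)).trim().toUpperCase();
        }
    }

In Pig Latin the jar would be loaded with REGISTER and the function invoked inside a FOREACH ... GENERATE, much like the Piggybank UDFs referenced above.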

Environment: Hadoop, MapReduce, HDFS, Hive, Java, SQL, Cloudera Manager, Pig, Apache Sqoop, Oozie, HBase, PL/SQL, MySQL

Confidential, San Jose, CA

Java Developer

Responsibilities:

  • Played a vital role from requirements analysis through designing the presentation templates and CTDs based on business expectations.
  • Created the UI pages using Java, JSP, and Servlets (a minimal servlet sketch follows this list).
  • Developed web pages of the application using Java.
  • Worked with the Node.js framework for developing the UI.
  • Used the Vignette tool to enter content and generate pages for the application.
  • Generated pages as XML and rendered them into live pages.
  • Used Eclipse 3.3.0 as the IDE.
  • Was the sole person handling the entire project from the analysis phase through implementation.
  • Communicated with clients and the business on a daily basis.
  • Supported the client in user acceptance testing (UAT) of the application.
  • Successfully delivered the project on time without any failures.
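
For context on the Java/JSP/Servlet UI work above, here is a minimal servlet sketch; the servlet name, request parameter, and JSP path are hypothetical.

    import java.io.IOException;
    import javax.servlet.ServletException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    // Hypothetical page-serving servlet: looks up a page id and forwards to a
    // JSP that renders the content generated from the XML pages.
    public class PageServlet extends HttpServlet {
        @Override
        protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                throws ServletException, IOException {
            req.setAttribute("pageId", req.getParameter("id"));
            req.getRequestDispatcher("/WEB-INF/jsp/page.jsp").forward(req, resp);
        }
    }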

Environment: Java, Servlets, JSP, Vignette Tool, XML, Node.js, JSON, UAT testing, Eclipse 3.3.0

Confidential

Java Developer

Responsibilities:

  • Developed the Admission & Census module, which monitors a wide range of detailed information for each resident upon pre-admission or admission to the facility.
  • Involved in development of Care Plans module, which provides a comprehensive library of problems, goals and approaches.
  • These libraries, and the disciplines used for care plans, can be tailored by adding, deleting, or editing problems, goals, and approaches.
  • Involved in development of General Ledger module, which streamlines analysis, reporting and recording of accounting information.
  • General Ledger automatically integrates with a powerful spreadsheet solution for budgeting, comparative analysis and tracking facility information for flexible reporting.
  • Developed UI using HTML, JavaScript, and JSP, and developed Business Logic and Interfacing components using Business Objects, XML, and JDBC.
  • Designed the user interface and implemented validations using JavaScript.
  • Managed database connectivity using JDBC for querying, inserting, and data management, including triggers and stored procedures (a CallableStatement sketch follows this list).
  • Developed various EJBs for handling business logic and data manipulation from the database.
  • Involved in the design of JSPs and Servlets for navigation among the modules.
  • Designed cascading style sheets and the XML portions of the Order Entry and Product Search modules, and performed client-side validations with JavaScript.
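
To illustrate the JDBC stored-procedure work noted above, here is a minimal CallableStatement sketch; the procedure name, parameters, and connection details are hypothetical, not taken from the General Ledger module itself.

    import java.sql.CallableStatement;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.SQLException;
    import java.sql.Types;

    public class LedgerDao {
        // Hypothetical stored procedure: posts an entry and returns the new balance.
        public double postEntry(long accountId, double amount) throws SQLException {
            try (Connection con = DriverManager.getConnection(
                    "jdbc:oracle:thin:@dbhost:1521:ORCL", "app", "secret");
                 CallableStatement cs = con.prepareCall("{call post_ledger_entry(?, ?, ?)}")) {
                cs.setLong(1, accountId);
                cs.setDouble(2, amount);
                cs.registerOutParameter(3, Types.NUMERIC); // OUT parameter: new balance
                cs.execute();
                return cs.getDouble(3);
            }
        }
    }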

Environment: J2EE, Java/JDK, JDBC, JSP, Servlets, JavaScript, EJB, JNDI, JavaBeans, XML, XSLT, Oracle 9i, Eclipse, HTML/ DHTML.

Confidential

Java Developer

Responsibilities:

  • Developed controllers for request handling using the Spring framework (a minimal controller sketch follows this list)
  • Worked with command controllers, handler mappings, and view resolvers.
  • Designed and developed application components and architectural proof of concepts using Java, EJB, JSP, JSF, Struts, and AJAX.
  • Participated in enterprise integration using web services
  • Configured JMS, MQ, EJB, and Hibernate on WebSphere and JBoss
  • Focused on declarative transaction management
  • Developed XML files for mapping requests to controllers
  • Coded Spring portlets to build portal pages for the application using the JSR 286 API
  • Used Hibernate templates to access the database
  • Used the DAO pattern in developing application code
  • Developed stored procedures.
  • Extensively used Java Collection framework and Exception handling.
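
As a sketch of the classic Spring MVC controller style described above (command controllers, handler mappings, view resolvers), here is a minimal AbstractController example; the controller name, view name, and model data are hypothetical.

    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;
    import org.springframework.web.servlet.ModelAndView;
    import org.springframework.web.servlet.mvc.AbstractController;

    // Hypothetical controller: builds a model and hands the logical view name
    // "accountList" to whatever ViewResolver is configured in the XML context.
    public class AccountListController extends AbstractController {
        @Override
        protected ModelAndView handleRequestInternal(HttpServletRequest request,
                HttpServletResponse response) {
            ModelAndView mav = new ModelAndView("accountList");
            mav.addObject("accounts", java.util.Collections.emptyList()); // DAO lookup would go here
            return mav;
        }
    }

The request-to-controller mapping itself would live in the Spring XML files mentioned above, for example via a SimpleUrlHandlerMapping bean.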

Environment: Java 1.5, J2EE 5, Spring, JSP, XML, Spring TLD, Servlets, Hibernate Criteria API, XSLT, CSS, JSF, JSF RichFaces, WSAD 5.1, Java Swing, web services, Apache Axis2, WSDL, GlassFish, JSR 286 API, UML, EJB, JavaScript, jQuery, Hibernate, SQL, CVS, Agile, JUnit.
