Sr. Hadoop Developer Resume Profile
MA
Professional Summary:
- Overall 7 years of professional IT experience, with 5 years of experience in the analysis, architectural design, prototyping, development, integration, and testing of applications using Java/J2EE technologies and 2 years of experience in Big Data analytics as a Hadoop Developer.
- 2 years of experience as a Hadoop Developer with good knowledge of Hadoop ecosystem technologies.
- Experience in developing MapReduce programs using Apache Hadoop to analyze big data as per requirements.
- Experienced with major Hadoop ecosystem projects such as Pig, Hive, and HBase, and in monitoring them with Cloudera Manager.
- Extensive experience in developing Pig Latin scripts and using Hive Query Language for data analytics.
- Hands-on experience working with NoSQL databases, including HBase and Cassandra, and their integration with a Hadoop cluster.
- Good working experience using Sqoop to import data from RDBMS into HDFS and vice versa.
- Good knowledge of job scheduling and monitoring with Oozie and of cluster coordination with ZooKeeper.
- Experience in Hadoop administration activities such as installation and configuration of clusters using Apache, Cloudera, and AWS.
- Developed UML diagrams for object-oriented design: use cases, sequence diagrams, and class diagrams using Rational Rose, Visual Paradigm, and Visio.
- Hands-on experience in solving software design issues by applying design patterns, including the Singleton, Business Delegate, Controller, MVC, Factory, Abstract Factory, DAO, and Template patterns (see the sketch following this summary).
- Experienced in creative and effective front-end development using JSP, JavaScript, HTML5, DHTML, XHTML, Ajax, and CSS.
- Expert-level skills in programming with the Struts framework, custom tag libraries, Spring tag libraries, and JSTL.
- Good working experience using different Spring modules, such as the Spring Core Container, Spring Application Context, Spring MVC Framework, and Spring ORM modules, in web applications.
- Used the Hibernate and JPA persistence services for object mapping with the database; configured XML mapping files and hooked them into other frameworks such as Spring and Struts.
- Used jQuery to select and manipulate HTML elements and to implement AJAX in web applications; used available plug-ins to extend jQuery functionality.
- Good exposure to web services using CXF/XFire and Apache Axis for publishing and consuming SOAP messages.
- Working knowledge of databases such as Oracle 8i/9i/10g, Microsoft SQL Server, and DB2.
- Experience in writing numerous test cases using JUnit framework with Selenium.
- Strong experience in database design, writing complex SQL Queries and Stored Procedures
- Experienced in using version control tools like Subversion and Git.
- Extensive experience in building and deploying applications on web/application servers like WebLogic, WebSphere, and Tomcat.
- Experience in Building, Deploying and Integrating with Ant, Maven
- Experience in development of logging standards and mechanism based on Log4J
- Strong work ethic with desire to succeed and make significant contributions to the organization
- Strong problem-solving, communication, and interpersonal skills; a good team player.
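To illustrate the design-pattern experience listed above, here is a minimal, generic sketch of the Singleton pattern in Java using the initialization-on-demand holder idiom; the ConfigManager name is purely illustrative and not taken from any specific project.

```java
// Generic Singleton sketch: lazy initialization that is thread-safe
// without explicit locking, via the initialization-on-demand holder idiom.
public final class ConfigManager {

    private ConfigManager() {
        // Private constructor prevents outside instantiation.
    }

    // The holder class is not loaded until getInstance() is first called;
    // class loading guarantees safe, one-time initialization.
    private static final class Holder {
        private static final ConfigManager INSTANCE = new ConfigManager();
    }

    public static ConfigManager getInstance() {
        return Holder.INSTANCE;
    }
}
```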
Technical Skills:
Hadoop/Big Data Technologies: HDFS, MapReduce, Hive, Pig, Sqoop, Flume, HBase, Cassandra, Oozie, ZooKeeper, YARN
Programming Languages: Java (JDK 1.4/1.5/1.6), C/C++, MATLAB, R, HTML, SQL, PL/SQL
Frameworks: Hibernate 2.x/3.x, Spring 2.x/3.x, Struts 1.x/2.x, JPA
Web Services: WSDL, SOAP, Apache CXF/XFire, Apache Axis, REST, Jersey
Client Technologies: jQuery, JavaScript, AJAX, CSS, HTML5, XHTML
Operating Systems: UNIX, Linux, Windows
Application Servers: IBM WebSphere, WebLogic, Tomcat
Web Technologies: JSP, Servlets, Socket Programming, JNDI, JDBC, JavaBeans, JavaScript, Web Services (JAX-WS)
Databases: Oracle 8i/9i/10g, Microsoft SQL Server, DB2, MySQL 4.x/5.x
Java IDEs: Eclipse 3.x, IBM WebSphere Application Developer, IBM RAD 7.0
Tools: TOAD, SQL Developer, SOAP UI, Ant, Maven, Visio, Rational Rose
Professional Experience:
Confidential
Role: Sr. Hadoop Developer
Responsibilities:
- Responsible for building scalable distributed data solutions using Hadoop.
- Implemented a nine-node CDH3 Hadoop cluster on Red Hat Linux.
- Involved in loading data from an Oracle database into HDFS using Sqoop.
- Implemented MapReduce programs to compute Top-K results, following MapReduce design patterns.
- Involved in loading pre-built HFiles into HBase for faster access to a large customer base without taking a performance hit.
- Implemented processing of data from different sources using multiple input formats with GenericWritable and ObjectWritable.
- Implemented best-income logic using Pig scripts and joins to transform data into AutoZone's custom formats.
- Implemented custom comparators and partitioners to implement secondary sorting (see the secondary-sort sketch after this project).
- Worked on tuning the performance of Hive queries.
- Implemented Hive generic UDFs to implement business logic (see the UDF sketch after this project).
- Responsible for managing data coming from different sources.
- Configured time-based schedulers that pull data from multiple sources in parallel using Oozie workflows.
- Installed the Oozie workflow engine to run multiple Hive and Pig jobs.
- Used ZooKeeper to provide coordination services to the cluster.
- Coordinated with end users on the design and implementation of analytics solutions for user-based recommendations using R, as per project proposals.
- Assisted in monitoring the Hadoop cluster using Ganglia.
- Implemented test scripts to support test driven development and continuous integration.
- Configured build scripts for multi module projects with Maven and Jenkins CI.
- Involved in story-driven agile development methodology and actively participated in daily scrum meetings.
Environment: Hadoop, MapReduce, HDFS, Pig, Hive, Oozie, Java, Linux, Maven, Oracle 11g/10g, ZooKeeper, SVN, Ganglia
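A minimal sketch of the secondary-sort plumbing mentioned above, assuming a hypothetical composite key of the form "customerId\ttimestamp" carried in a Text; real code would typically use a custom WritableComparable, but Text keeps the example short.

```java
import org.apache.hadoop.io.Text;
import org.apache.hadoop.io.WritableComparable;
import org.apache.hadoop.io.WritableComparator;
import org.apache.hadoop.mapreduce.Partitioner;

public class SecondarySort {

    // Partition on the natural key (customerId) only, so every record for
    // one customer lands on the same reducer regardless of timestamp.
    public static class NaturalKeyPartitioner extends Partitioner<Text, Text> {
        @Override
        public int getPartition(Text key, Text value, int numPartitions) {
            String naturalKey = key.toString().split("\t")[0];
            return (naturalKey.hashCode() & Integer.MAX_VALUE) % numPartitions;
        }
    }

    // Group reducer input on the natural key, so a single reduce() call sees
    // all of a customer's records, already sorted by the full composite key.
    public static class NaturalKeyGroupingComparator extends WritableComparator {
        protected NaturalKeyGroupingComparator() {
            super(Text.class, true);
        }

        @Override
        @SuppressWarnings("rawtypes")
        public int compare(WritableComparable a, WritableComparable b) {
            String left = a.toString().split("\t")[0];
            String right = b.toString().split("\t")[0];
            return left.compareTo(right);
        }
    }
}
```

The two classes would be wired into the job with job.setPartitionerClass(NaturalKeyPartitioner.class) and job.setGroupingComparatorClass(NaturalKeyGroupingComparator.class).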
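And a minimal Hive GenericUDF sketch of the kind referenced above; the normalize_code function and its trim-and-uppercase logic are hypothetical stand-ins for the actual business logic, shown only to illustrate the initialize/evaluate/getDisplayString lifecycle.

```java
import org.apache.hadoop.hive.ql.metadata.HiveException;
import org.apache.hadoop.hive.ql.udf.generic.GenericUDF;
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector;
import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorFactory;
import org.apache.hadoop.io.Text;

// Hypothetical GenericUDF that normalizes a code column.
public class NormalizeCodeUDF extends GenericUDF {

    @Override
    public ObjectInspector initialize(ObjectInspector[] arguments) {
        // Declare the return type: a writable Hive string.
        return PrimitiveObjectInspectorFactory.writableStringObjectInspector;
    }

    @Override
    public Object evaluate(DeferredObject[] arguments) throws HiveException {
        Object value = arguments[0].get();
        if (value == null) {
            return null;
        }
        // Placeholder business rule: trim and uppercase the input.
        return new Text(value.toString().trim().toUpperCase());
    }

    @Override
    public String getDisplayString(String[] children) {
        return "normalize_code(" + children[0] + ")";
    }
}
```

After packaging, such a UDF would be registered in Hive with ADD JAR and CREATE TEMPORARY FUNCTION before use in queries.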
Confidential
Role: Hadoop developer
Responsibilities:
- As a Big Data Developer, implemented solutions for ingesting data from various sources and processing the data-at-rest utilizing Big Data technologies such as Hadoop, the MapReduce framework, HBase, Hive, Oozie, Flume, Sqoop, etc.
- Designed and Implemented real-time Big Data processing to enable real-time analytics, event detection and notification for Data-in-Motion.
- Hands-on experience with IBM Big Data product offerings such as IBM InfoSphere BigInsights, IBM InfoSphere Streams, IBM BigSQL.
- Developed software to process, cleanse, and report on vehicle data utilizing various analytics and REST APIs in languages and frameworks such as Java, Scala, and the Akka asynchronous programming framework.
- Involved in developing an Asset Tracking project that collected real-time vehicle location data from a JMS queue using IBM Streams and processed it for vehicle tracking using ESRI GIS mapping software, Scala, and the Akka actor model.
- Involved in developing web services using REST, the HBase native API, and the BigSQL client to query data from HBase (see the sketch after this project).
- Experienced in developing Hive queries in the BigSQL client for various use cases.
- Involved in developing shell scripts and automating them using the cron job scheduler.
- Implemented test scripts to support test driven development and continuous integration.
- Responsible for managing data coming from different sources.
- Experienced in loading and transforming large sets of structured, semi-structured, and unstructured data.
- Analyzed large data sets to determine the optimal way to aggregate and report on them.
Environment: Hadoop 1x, Hive 0.10, Pig 0.11, Sqoop, HBase, UNIX Shell Scripting, Scala, Akka, IBM InfoSphere BigInsights, IBM InfoSphere Streams, IBM BigSQL, Java
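A minimal sketch of a point lookup through the HBase native API of the kind referenced above; the vehicle_location table, loc column family, and lat/lon qualifiers are illustrative assumptions, not the project's actual schema.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.util.Bytes;

// Looks up the last known position of one vehicle by row key.
public class VehicleLookup {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        HTable table = new HTable(conf, "vehicle_location"); // table name is hypothetical
        try {
            Get get = new Get(Bytes.toBytes(args[0]));       // row key = vehicle id
            get.addColumn(Bytes.toBytes("loc"), Bytes.toBytes("lat"));
            get.addColumn(Bytes.toBytes("loc"), Bytes.toBytes("lon"));
            Result result = table.get(get);
            String lat = Bytes.toString(result.getValue(Bytes.toBytes("loc"), Bytes.toBytes("lat")));
            String lon = Bytes.toString(result.getValue(Bytes.toBytes("loc"), Bytes.toBytes("lon")));
            System.out.println(lat + "," + lon);
        } finally {
            table.close();
        }
    }
}
```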
Confidential
Role: Hadoop Developer
Responsibilities:
- Imported data from different relational data sources, such as RDBMS and Teradata, into HDFS using Sqoop.
- Imported bulk data into HBase using MapReduce programs.
- Performed analytics on time-series data stored in HBase using the HBase API.
- Designed and implemented incremental imports into Hive tables.
- Used a REST API to access HBase data and perform analytics.
- Worked in Loading and transforming large sets of structured, semi structured and unstructured data
- Involved in collecting, aggregating and moving data from servers to HDFS using Apache Flume
- Wrote Hive jobs to parse the logs and structure them in a tabular format to facilitate effective querying of the log data.
- Involved in creating Hive tables, loading them with data, and writing Hive queries that run internally as MapReduce jobs.
- Experienced in managing and reviewing Hadoop log files.
- Migrated ETL jobs to Pig scripts to perform transformations, joins, and some pre-aggregations before storing the data in HDFS.
- Worked with the Avro data serialization system to handle JSON data formats.
- Worked on different file formats such as SequenceFiles, XML files, and MapFiles using MapReduce programs.
- Involved in unit testing and delivered unit test plans and results documents using JUnit and MRUnit (see the sketch after this project).
- Exported data from HDFS into an RDBMS using Sqoop for report generation and visualization purposes.
- Worked on the Oozie workflow engine for job scheduling.
- Created and maintained technical documentation for launching Hadoop clusters and for executing Pig scripts.
Environment: Hadoop, HDFS, MapReduce, Hive, Oozie, Sqoop, Pig, Java, REST API, Maven, MRUnit, JUnit.
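A minimal MRUnit sketch of the mapper-level unit testing referenced above, assuming a hypothetical LogLineMapper that emits (HTTP status code, 1) per log line; both the mapper and the log format are illustrative.

```java
import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mrunit.mapreduce.MapDriver;
import org.junit.Before;
import org.junit.Test;

public class LogLineMapperTest {

    // Hypothetical mapper: emits (status code, 1) for "METHOD path status" lines.
    public static class LogLineMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            String[] fields = value.toString().split(" ");
            context.write(new Text(fields[2]), ONE);
        }
    }

    private MapDriver<LongWritable, Text, Text, IntWritable> mapDriver;

    @Before
    public void setUp() {
        mapDriver = MapDriver.newMapDriver(new LogLineMapper());
    }

    @Test
    public void emitsStatusCodeCount() throws IOException {
        // Feed one log line in; assert exactly one (status, 1) pair comes out.
        mapDriver.withInput(new LongWritable(0), new Text("GET /index.html 200"))
                 .withOutput(new Text("200"), new IntWritable(1))
                 .runTest();
    }
}
```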
Confidential
Role: Java /J2EE Developer
Responsibilities:
- Responsible for gathering and analyzing requirements and converting them into technical specifications
- Used Rational Rose for creating sequence and class diagrams
- Developed presentation layer using JSP, Java, HTML and JavaScript
- Used Spring Core annotations for dependency injection (see the sketch after this project)
- Designed and developed 'Convention Based Coding', utilizing Hibernate's persistence framework and O-R mapping capability to enable dynamic fetching and display of various table data with JSF tag libraries
- Designed and developed the Hibernate configuration and a session-per-request design pattern for database connectivity and for accessing the session for database transactions. Used HQL and SQL for fetching and storing data in databases
- Participated in the design and development of database schema and Entity-Relationship diagrams of the backend Oracle database tables for the application
- Implemented web services with Apache Axis
- Designed and developed stored procedures and triggers in Oracle to cater to the needs of the entire application. Developed complex SQL queries for extracting data from the database
- Designed and built SOAP web service interfaces implemented in Java
- Used Apache Ant for the build process
- Used ClearCase for version control and ClearQuest for bug tracking
Environment: Java, JDK 1.5, Servlets, Hibernate, Ajax, Oracle 10g, Eclipse, Apache Ant, Web Services (SOAP), Apache Axis, WebLogic Server, JavaScript, HTML, CSS, XML
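A minimal sketch of annotation-driven dependency injection with Spring Core annotations, as referenced above; AccountDao and AccountService are illustrative names, and component scanning is assumed to be enabled in the application context.

```java
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Repository;
import org.springframework.stereotype.Service;

// Data-access bean discovered by component scanning.
@Repository
class AccountDao {
    // JDBC/Hibernate access would live here.
}

// Service bean; Spring injects the AccountDao through the constructor.
@Service
class AccountService {

    private final AccountDao accountDao;

    @Autowired
    public AccountService(AccountDao accountDao) {
        this.accountDao = accountDao;
    }
}
```

With a <context:component-scan> element in the XML configuration, both beans are detected and wired without explicit <bean> definitions.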
Confidential
Role: Java /J2EE Developer
Responsibilities:
- Involved in the design and development phases of the Rational Unified Process (RUP)
- Involved in creation of UML diagrams like Class, Activity, and Sequence Diagrams using modeling tools of IBM Rational Rose
- Used IBM Rational Software Architect for development
- Involved in the development of JSPs and Servlets for different User Interfaces
- Used Struts action forms and developed Action classes, which act as the navigation controller in the Struts framework (see the sketch after this project)
- Implemented template-based categorization of presentation content using Struts Tiles; MVC implementation using the Struts framework
- Employed Hibernate to create the persistence layer and to make the transactions to the backend
- Used AJAX for highly intensive user operations
- Developed Web Services using SOAP
- Worked on parsing the XML files using DOM/SAX parsers
- Involved in Unit Testing of Various Modules based on the Test Cases
- Involved in Bug fixing of various modules that were raised by the Testing teams in the application during the Integration testing phase
- Involved and participated in Code reviews
- Used Log4J logging framework for logging messages
- Used Rational ClearCase for version control
- Used Rational ClearQuest for bug tracking
- Involved in deployment of the application on IBM WebSphere Application Server
Environment: Java, J2EE, Hibernate, XML, XML Schemas, JSP, HTML, CSS, IBM Rational Rose, JMS, DB2, PL/SQL, JUnit, Log4j, IBM WebSphere Application Server, Rational ClearCase, Rational ClearQuest.
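A skeleton of a Struts 1.x Action of the kind referenced above; LoginAction and the "success" forward are illustrative, and the forward name must match a <forward> entry in struts-config.xml.

```java
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import org.apache.struts.action.Action;
import org.apache.struts.action.ActionForm;
import org.apache.struts.action.ActionForward;
import org.apache.struts.action.ActionMapping;

// Navigation controller: Struts calls execute() for requests mapped to this
// action and routes the user based on the returned ActionForward.
public class LoginAction extends Action {

    @Override
    public ActionForward execute(ActionMapping mapping, ActionForm form,
                                 HttpServletRequest request,
                                 HttpServletResponse response) throws Exception {
        // Values submitted by the JSP arrive on the ActionForm; validation
        // and business-delegate calls would go here.
        return mapping.findForward("success");
    }
}
```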