Hadoop Developer Resume
Grand Rapids, MI
SUMMARY
- 8+ years of experience in the IT industry, including 5 years in Big Data technologies and 4 years in Java and mainframe technologies
- Worked in the finance and information technology domains.
- Expertise in various components of the Hadoop ecosystem - MapReduce, Hive, Pig, Sqoop, Impala, Flume, Oozie, HBase, Apache Solr, Apache Storm, YARN
- Hands-on experience working with the Cloudera Hadoop distribution
- Wrote, executed, and deployed complex MapReduce Java code using various Hadoop APIs (a minimal sketch follows this list)
- Experienced in MapReduce code tuning and performance optimization
- Knowledge in installing, configuring and using Hadoop ecosystem components
- Proficient in Hive Query Language and experienced in Hive performance optimization using partitioning, dynamic partitioning, and bucketing
- Expertise in developing Pig scripts; wrote and implemented custom UDFs in Pig for data filtering
- Used Impala for data analysis.
- Hands-on experience using the data ingestion tools Sqoop and Flume
- Collected log data from various sources (web servers, application servers, and consumer devices) using Flume and stored it in HDFS for analysis
- Transferred data between HDFS and relational database systems (MySQL, SQL Server, Oracle, and DB2) using Sqoop
- Used the Oozie job scheduler to schedule MapReduce, Hive, and Pig jobs; experienced in automating job execution
- Experience with NoSQL databases like HBase and working knowledge of MongoDB and Cassandra.
- Knowledge of installing, configuring, supporting, and managing Hadoop clusters using Apache and Cloudera (CDH3, CDH4) distributions
- Experience working with different relational databases such as MySQL, SQL Server, Oracle, and DB2
- Strong experience in database design and writing complex SQL queries
- Expertise in developing multi-tiered, web-based enterprise applications using J2EE technologies such as Servlets, JSP, JDBC, and Hibernate
- Extensive coding experience in Java and mainframe technologies - COBOL, CICS, and JCL
- Experience working in all phases of software development across various methodologies
- Strong background in writing test plans and performing unit, user acceptance, integration, and system testing
- Proficient in software documentation and technical report writing.
- Worked coherently with multiple teams; conducted peer reviews and organized and participated in knowledge transfer (technical and domain) sessions.
- Experience working in an onsite-offshore model.
- Developed various UDFs in MapReduce and Python for Pig and Hive.
- Solid experience and knowledge of other SQL and NoSQL databases such as MySQL, MS SQL, MongoDB, HBase, Accumulo, Neo4j, and Cassandra.
- Good data warehouse experience with MS SQL.
- Good knowledge and firm understanding of J2EE front-end/back-end development, SQL, and database concepts.
- Good experience in Linux and Mac OS environments.
- Used various development tools like Eclipse, GIT, Android Studio and Subversion.
- Knowledge of Cloudera Hadoop and MapR distribution components and their custom packages.
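Illustrative only: a minimal sketch of the kind of MapReduce cleaning code referenced above, assuming a simple comma-delimited record format; the class name, field count, and counter names are hypothetical.

```java
import java.io.IOException;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

// Map-only cleaning job: drops malformed records and emits the rest unchanged.
public class RecordCleaningMapper
        extends Mapper<LongWritable, Text, NullWritable, Text> {

    private static final int EXPECTED_FIELDS = 5; // illustrative schema width

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        String[] fields = value.toString().split(",", -1);
        // Skip rows that do not match the expected column count.
        if (fields.length != EXPECTED_FIELDS) {
            context.getCounter("cleaning", "malformed").increment(1);
            return;
        }
        context.write(NullWritable.get(), value);
    }
}
```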
TECHNICAL SKILLS
Hadoop/Big Data: MapReduce, Hive, Pig, Impala, Sqoop, Flume, Spark, HDFS, Oozie, Hue, HBase, ZooKeeper
Operating Systems: Windows, Ubuntu, RedHat Linux, Unix
Java & J2EE Technologies: Core Java, Servlets, JSP, JDBC
Frameworks: Hibernate
Databases/Database Languages: Oracle 11g/10g/9i, MySQL, DB2, SQL Server, SQL, HQL, NoSQL (HBase)
Web Technologies: JavaScript, HTML, XML, REST, CSS
Programming Languages: Java, Unix shell scripting, COBOL, CICS, JCL
IDEs: Eclipse, NetBeans
Web Servers: Apache Tomcat 6
Methodologies: Waterfall, Agile and Scrum
PROFESSIONAL EXPERIENCE
Confidential, Grand Rapids, MI
Hadoop Developer
Responsibilities:
- Used Sqoop to import and export data between Oracle/DB2 and HDFS for analysis
- Worked on Pig and MapReduce along with Sqoop to develop a data pipeline for moving customer behavioral data and transaction histories into HDFS for further analysis
- Performed Data Ingestion from multiple internal clients using Apache Kafka
- Developed multiple MapReduce jobs for data cleaning
- Scheduled these jobs using Oozie as the workflow engine; Oozie actions can run both sequentially and in parallel
- Developed wrapper shell scripts around the Oozie workflows
- Developed custom aggregate functions using Spark SQL, created tables per the data model, and performed interactive querying
- Developed iterative algorithms using Spark Streaming in Scala and Python to build near real-time dashboards
- Implemented Avro and Parquet data formats for Apache Hive computations to handle custom business requirements
- Tested Apache Tez, an extensible framework for building high performance batch and interactive data processing applications, on Pig and Hive jobs
- Worked on custom Pig loaders and storage classes to handle a variety of data formats such as JSON and XML
- Developed the ad-hoc Hive and Pig queries required by business users to generate data metrics
- Developed Pig Latin scripts to extract data from web server output files and load it into HDFS
- Used Hive to analyze the partitioned and bucketed data and compute various metrics for reporting
- Created many Java UDFs and UDAFs in Hive for functions that did not already exist in Hive
- Implemented performance optimization techniques such as using the distributed cache for small datasets, partitioning and bucketing in Hive, and map-side joins
- Provided concurrent access to Hive tables with shared and exclusive locking via the cluster's ZooKeeper implementation
- Wrote shell scripts to monitor the health of Hadoop daemon services and respond to warning or failure conditions
- Used visualization tools such as Power View for Excel and Tableau for visualizing data and generating reports.
- Worked on the NoSQL databases HBase and Cassandra.
- Used the HBase Java API to populate an operational HBase table with key-value data (a minimal sketch follows this list).
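Illustrative only: a minimal sketch of populating an HBase table through the Java client API, as in the last bullet; the table, column family, qualifier, and row-key values are hypothetical, and the example assumes a recent HBase client (older clients use HTable directly).

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;

public class HBaseLoader {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        try (Connection connection = ConnectionFactory.createConnection(conf);
             Table table = connection.getTable(TableName.valueOf("customer_activity"))) {
            // One row key plus a column-family/qualifier/value triple per record.
            Put put = new Put(Bytes.toBytes("cust#1001"));
            put.addColumn(Bytes.toBytes("d"), Bytes.toBytes("last_txn"), Bytes.toBytes("2015-06-30"));
            table.put(put);
        }
    }
}
```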
Environment: CDH 5.0.6, Hadoop, MapReduce, Hive, HDFS, Pig, Java, Sqoop, Oozie, Avro, Spark, Tez, Kafka, HBase, ZooKeeper, Cassandra, Oracle, NoSQL, Unix/Linux, Phoenix, Isilon.
Confidential, Birmingham, Alabama
Hadoop Developer
Responsibilities:
- Worked on the development of a web application and Spring Batch applications. The web application allows customers to sign up for cellular and music services.
- Tools: MySQL, Tomcat Server, MyBatis, Spring MVC, REST, AWS (Amazon Web Services)
- Worked on the development of the user interface
- Tools: AngularJS, Backbone.js, JavaScript, Velocity
- Worked on the mobile payment functionality using PayPal, AngularJS, and Spring MVC
- Involved in Spring Integration
- Built and deployed the applications using Ant.
- Fixed production bugs and was involved in the deployment process.
- Worked on Spring Batch applications to ensure that customer cellular and music services get renewed.
- Involved in deploying the applications in AWS.
- Proficiency in Unix/Linux shell commands.
- Maintained EC2 (Elastic Compute Cloud) and RDS (Relational Database Service) instances in Amazon Web Services.
- Created RESTful web service interfaces for supporting XML message transformation (a minimal sketch follows this list).
- Developed JUnit test cases using TestNG.
- Involved in designing the web applications, working closely with the architect.
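Illustrative only: a minimal sketch of a Spring MVC REST endpoint of the kind referenced above; the controller name, path, and return value are hypothetical, and in the real application the handler would delegate to a service layer backed by MyBatis/MySQL.

```java
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class SubscriptionController {

    // Returns a customer's current service plan as a simple string payload.
    @RequestMapping(value = "/customers/{id}/plan", method = RequestMethod.GET)
    public String getPlan(@PathVariable("id") long customerId) {
        return "music-and-cellular-bundle";
    }
}
```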
Environment: Hadoop (CDH), MapReduce, HDFS, Hive, Pig, Sqoop, Flume, Oozie, Java, SQL, Kafka, Cassandra
Confidential, Newark, NJ
Hadoop Developer
Responsibilities:
- Implemented CDH3 Hadoop cluster on CentOS.
- Designed and developed a daily process for incremental import of raw data from Oracle into Hive tables using Sqoop.
- Launched Amazon EC2 cloud instances using Amazon Machine Images (Linux/Ubuntu) and configured the launched instances for specific applications.
- Launched and set up the Hadoop cluster, which included configuring the different Hadoop components.
- Hands-on experience loading data from the UNIX file system into HDFS.
- Cluster coordination services through ZooKeeper.
- Installed and configured Flume, Hive, Pig, Sqoop, and Oozie on the Hadoop cluster.
- Involved in creating Hive tables, loading data, and running Hive queries on that data.
- Extensive working knowledge of partitioned tables, UDFs, performance tuning, compression-related properties, and the Thrift server in Hive (a minimal Java UDF sketch follows this list).
- Wrote optimized Pig scripts and was involved in developing and testing Pig Latin scripts.
- Working knowledge of writing Pig Load and Store functions.
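Illustrative only: a minimal sketch of a Hive Java UDF of the kind referenced above; the class name and the normalization behavior are hypothetical.

```java
import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

// Simple Hive UDF: normalizes a string column to trimmed lower case.
// Registered in Hive with, for example:
//   ADD JAR normalize-udf.jar;
//   CREATE TEMPORARY FUNCTION normalize_str AS 'NormalizeString';
public class NormalizeString extends UDF {
    public Text evaluate(Text input) {
        if (input == null) {
            return null;
        }
        return new Text(input.toString().trim().toLowerCase());
    }
}
```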
Environment: Apache Hadoop 1.0.1, MapReduce, HDFS, CentOS, Zookeeper, Sqoop, Hive, Pig, Oozie, Java, Eclipse, Amazon EC2, JSP, Servlets, Oracle.
Confidential
JAVA Developer
Responsibilities:
- Installation, Configuration & Upgrade of Solaris and Linux operating system.
- Actively participated in requirements gathering, analysis, design, and testing phases
- Designed use case diagrams, class diagrams, and sequence diagrams as a part of Design Phase
- Developed the entire application implementing the MVC architecture, integrating JSF with the Hibernate and Spring frameworks.
- Developed Enterprise JavaBeans (stateless session beans) to handle transactions such as online funds transfers and bill payments to service providers.
- Implemented Service Oriented Architecture (SOA) using JMS for sending and receiving messages while creating web services
- Developed XML documents and generated XSL files for Payment Transaction and Reserve Transaction systems.
- Developed SQL queries and stored procedures.
- Developed Web Services for data transfer from client to server and vice versa using Apache Axis, SOAP and WSDL.
- Used the JUnit framework for unit testing of all the Java classes.
- Implemented various J2EE Design patterns like Singleton, Service Locator, DAO, and SOA.
- Worked on AJAX to develop an interactive Web Application and JavaScript for Data Validations.
- Developed the application under the JEE architecture and designed dynamic, browser-compatible user interfaces using JSP, custom tags, HTML, CSS, and JavaScript.
- Deployed and maintained the JSP and Servlet components on WebLogic 8.0
- Developed the application server persistence layer using JDBC, SQL, and Hibernate.
- Used JDBC to connect the web applications to databases (a minimal DAO sketch follows this list).
- Implemented test-first unit testing using JUnit.
- Developed and utilized J2EE services and JMS components for messaging communication in WebLogic.
- Configured the development environment using the WebLogic application server for developers' integration testing.
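Illustrative only: a minimal sketch of DAO-style JDBC persistence code of the kind described above; the table, column, and class names are hypothetical, and try-with-resources is used for brevity.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import javax.sql.DataSource;

// DAO pattern: isolates all JDBC access to the ACCOUNT table behind one class.
public class AccountDao {

    private final DataSource dataSource;

    public AccountDao(DataSource dataSource) {
        this.dataSource = dataSource;
    }

    // Looks up the balance for a single account by primary key.
    public double findBalance(long accountId) throws SQLException {
        String sql = "SELECT balance FROM account WHERE account_id = ?";
        try (Connection conn = dataSource.getConnection();
             PreparedStatement stmt = conn.prepareStatement(sql)) {
            stmt.setLong(1, accountId);
            try (ResultSet rs = stmt.executeQuery()) {
                return rs.next() ? rs.getDouble("balance") : 0.0;
            }
        }
    }
}
```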
Environment: Java/J2EE, SQL, Oracle 10g, JSP 2.0, EJB, AJAX, JavaScript, WebLogic 8.0, HTML, JDBC 3.0, XML, JMS, Log4j, JUnit, Servlets, MVC, MyEclipse
Confidential
Jr. JAVA Developer
Responsibilities:
- Worked closely with the development team in the design phase and developed use case diagrams using Rational Rose.
- Designed and developed the web-based application using the Spring MVC module and Spring Boot.
- Worked on backend Java code performing CRUD operations using REST controllers and Spring Data JPA (a minimal sketch follows this list).
- Used AngularJS, HTML5, and CSS as front-end technologies to develop the web application's UI.
- Developed complex applications that send and consume REST web services.
- Developed extensively with RESTful APIs using JSON object data, writing Spring MVC code.
- Developed the web application using Spring Java-based configuration with components, autowiring, and qualifiers.
- Developed user interfaces as single-page applications using JavaScript, CSS, HTML5, and AngularJS.
- Good familiarity with microservice architecture.
- Participated in the planning, analysis, design, and development of a self-service healthcare application. Worked with product owners to understand desired application capabilities and testing scenarios.
- Designed UML sequence diagrams for better understanding of the project from high level to low level.
- Worked on a continuous integration methodology for building and deploying code in different environments using Jenkins.
- Wrote integration tests at the controller level with MockMvc, along with unit tests in JUnit using Mockito and data from HSQL.
- Worked with various databases across modules, such as MySQL, Oracle, and Postgres, along with Liquibase.
- Worked on end-to-end development and wrote end-to-end integration tests covering the service and repository levels.
- Built the application using Maven as the build tool, IntelliJ as the IDE, and Git as the repository.
- Involved in peer reviews and team code reviews throughout the two-week agile sprints.
- Participated in requirements gathering. Designed the application using UML. Elaborated on the use cases based on business requirements and was responsible for creating class diagrams, object interaction diagrams (sequence and collaboration), and activity diagrams for documentation.
- Implemented business logic using Java classes, SQL, and Spring Data JPA.
- Experience wif Agile methodologies and JIRA.
- Used Jira for creating tasks, tracking bugs, and resolving them accordingly.
- Participated in sprint planning to review the user stories and divide the tasks in JIRA.
- Used PuTTY and Git Bash to connect through tunnels to higher-environment servers such as DEV, QA, and Prod.
- Used Postman extensions for testing REST web services and the JSON data format.
- Developed functionalities, generated Java classes and developed frameworks which can be used across various projects.
- Documented DAO objects, RESTful API calls, and object constraints using Spring REST Docs.
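Illustrative only: a minimal sketch of the REST controller plus Spring Data JPA pattern described above; the entity, endpoint, and repository names are hypothetical.

```java
import java.util.List;
import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;
import org.springframework.web.bind.annotation.RestController;

// JPA entity persisted through Spring Data JPA (getters/setters omitted for brevity).
@Entity
class Appointment {
    @Id
    @GeneratedValue
    private Long id;
    private String patientName;
}

// Spring Data JPA repository: CRUD methods are generated at runtime.
interface AppointmentRepository extends JpaRepository<Appointment, Long> {
}

// REST controller exposing basic CRUD operations over the entity.
@RestController
@RequestMapping("/appointments")
class AppointmentController {

    private final AppointmentRepository repository;

    @Autowired
    AppointmentController(AppointmentRepository repository) {
        this.repository = repository;
    }

    @RequestMapping(method = RequestMethod.GET)
    public List<Appointment> list() {
        return repository.findAll();
    }

    @RequestMapping(method = RequestMethod.POST)
    public Appointment create(@RequestBody Appointment appointment) {
        return repository.save(appointment);
    }
}
```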
Environment: Java 1.7, Spring 4.0, RESTful Web Services, JSON, Spring Data JPA, Spring Security, IntelliJ, JUnit, MockMvc, HSQL, Maven, Oracle/MySQL/Postgres, Liquibase, Jenkins, Git, Tomcat Server, JIRA, Agile Software Development