Hadoop Developer Resume
Dublin, OH
SUMMARY
- Hadoop Developer with 8+ years of professional IT experience, including the Big Data ecosystem, related technologies and advanced analytics.
- Hands-on experience in configuring and using ecosystem components like HDFS, MapReduce, Oozie, Hive, Sqoop, Pig, Flume, HBase and ZooKeeper.
- Experience in the AWS cloud environment.
- Experience with advanced analytics techniques like K-Means clustering and high-dimensional data visualization.
- Working experience building and supporting large-scale Hadoop environments, including design, configuration, installation, performance tuning, analytics and monitoring.
- Hands-on experience with VPC, EC2, S3, Redshift and CloudWatch.
- Hands-on experience in writing MapReduce jobs in Java.
- Understanding of Data Structures and Algorithms.
- Analyzed large data sets using Hive queries and Pig scripts.
- Experience in developing and extending serialization frameworks like Avro.
- Very good knowledge of Hadoop architecture, administration, the HDFS file system and the Streaming API, along with data warehousing concepts.
- Working experience with Hadoop Clusters using Cloudera (CDH) distribution.
- Experience in architecting and creating HBase database systems.
- Experience in HBase backup and recovery, and HBase security.
- Experienced in processing Big Data on the Apache Hadoop framework using MapReduce programs.
- Experienced in installation, configuration, supporting and monitoring Hadoop clusters using Apache, Cloudera distributions.
- Imported and exported data between RDBMS and HDFS using Sqoop.
- Extended Hive and Pig core functionality by writing custom UDFs.
- Experienced in analyzing data using HiveQL, Pig Latin, and custom MapReduce programs in Java.
- Worked on NoSQL databases including HBase.
- Knowledge of job workflow scheduling and monitoring tools like Oozie and ZooKeeper.
- Experience in Big Data analysis using Pig and Hive, and understanding of Sqoop.
- Good understanding of HDFS design, daemons, federation and HDFS high availability (HA).
- Experience in installation, configuration, supporting and managing Cloudera's Hadoop platform along with CDH3 & CDH4 clusters.
- Well versed in designing and implementing MapReduce jobs using Java in Eclipse to solve real-world scaling problems.
- Excellent Java development skills using J2EE, J2SE, Servlets, JSP, EJB and JDBC.
- Good knowledge in integration of various data sources like RDBMS, Spreadsheets, Text files and XML files.
- Basic knowledge of UNIX and shell scripting.
- Working experience with testing tools like JUnit.
- Familiarity with popular frameworks like Struts, Hibernate and Spring MVC.
- Worked with software development models including the Waterfall model and the Agile software development methodology.
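The K-Means clustering noted above can be sketched without any libraries; below is a minimal one-dimensional version in plain Java (the data points, starting centroids and iteration count are invented for illustration, not taken from any project):

```java
public class KMeansSketch {
    // Runs a few Lloyd's-algorithm iterations over 1-D points.
    static double[] cluster(double[] points, double[] centroids, int iterations) {
        for (int it = 0; it < iterations; it++) {
            // Assignment step: accumulate each point into its nearest centroid's bucket.
            double[] sums = new double[centroids.length];
            int[] counts = new int[centroids.length];
            for (double p : points) {
                int nearest = 0;
                for (int c = 1; c < centroids.length; c++) {
                    if (Math.abs(p - centroids[c]) < Math.abs(p - centroids[nearest])) nearest = c;
                }
                sums[nearest] += p;
                counts[nearest]++;
            }
            // Update step: move each centroid to the mean of its assigned points.
            for (int c = 0; c < centroids.length; c++) {
                if (counts[c] > 0) centroids[c] = sums[c] / counts[c];
            }
        }
        return centroids;
    }

    public static void main(String[] args) {
        double[] points = {1.0, 1.2, 0.8, 9.0, 9.5, 10.1};   // invented sample data
        double[] centroids = {0.0, 5.0};                     // invented starting centroids
        double[] result = cluster(points, centroids, 10);
        System.out.printf("centroids: %.2f %.2f%n", result[0], result[1]);
    }
}
```

In a real pipeline the assignment step maps naturally onto a MapReduce map phase and the update step onto the reduce phase, which is what makes K-Means a common fit for Hadoop.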
TECHNICAL SKILLS
Big Data Technologies: Hadoop, HDFS, Hive, MapReduce, Pig, Sqoop, ZooKeeper, Oozie, Avro, HBase
Languages: Java, J2EE, Hibernate, Spring, JPA, C/C++, SQL
Web Technologies: JavaScript, JSF, Ajax, jQuery, JSP, Servlets, Java Swing, Java Beans, JSON, EJB, JMS, HTML, XML, CSS
IDE: Eclipse, RSA, VMware, Apache
GUI: Visual Basic 5.0, Oracle, MS Office (Word, Excel, Outlook, PowerPoint, Access)
Browsers: Google Chrome, Mozilla Firefox, IE8, Safari
Testing Tools: JUnit
Application Servers: Tomcat
DB Languages: SQL, PL/SQL
NoSQL Databases: HBase
Operating Systems: Linux/UNIX, Windows, Mac OS X
PROFESSIONAL EXPERIENCE
Confidential, NJ
Hadoop Developer
Responsibilities:
- Built end to end advanced analytics pipeline for customer segmentation using K-Means clustering and high dimensional data visualization.
- Launched and set up a Hadoop cluster on AWS, which included configuring the different Hadoop components.
- Used Sqoop to connect to DB2 and move the pivoted data into Hive tables.
- Managed the Hive database, including data ingestion and indexing.
- Exported data from Avro files and indexed the documents in SequenceFile or SerDe-based file formats.
- Hands-on experience writing custom UDFs as well as custom input and output formats.
- Provided an enterprise-ready data platform with the Cloudera (CDH) distribution.
- Involved in GUI development using JavaScript.
- Developed unit test cases using the JUnit framework and automated the scripts.
- Hands-on experience with Oozie workflows.
- Worked on NoSQL databases including HBase.
- Worked in an Agile environment, using Jira to maintain story points and following the Kanban model.
- Involved in brainstorming JAD sessions to design the GUI.
- Maintained builds in Bamboo and resolved build failures.
Environment: Hadoop, Big Data, Hive, HBase, Sqoop, Oozie, HDFS, MapReduce, Jira, JUnit, Lucene, Unix, SQL, AWS (Amazon Web Services), Cloudera (CDH) distribution.
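As a sketch of the custom-UDF work described in this role: the masking rule below is a made-up example, and the Hive wrapper class (extending org.apache.hadoop.hive.ql.exec.UDF) is omitted so the sketch compiles without Hive on the classpath; only the evaluate() logic a UDF would expose is shown.

```java
public class MaskUdfSketch {
    // evaluate() mirrors the method a Hive UDF exposes to queries;
    // this illustrative rule masks all but the last four characters of an ID.
    public static String evaluate(String id) {
        if (id == null || id.length() <= 4) return id;
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < id.length() - 4; i++) sb.append('*');
        return sb.append(id.substring(id.length() - 4)).toString();
    }

    public static void main(String[] args) {
        System.out.println(evaluate("4111111111111111")); // prints ************1111
    }
}
```

Registered with Hive (CREATE TEMPORARY FUNCTION), such a method could then be called directly inside HiveQL SELECT statements.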
Confidential, Dublin, OH
Hadoop Developer
Responsibilities:
- Involved in the Complete Software development life cycle (SDLC) to develop the application.
- Worked on analyzing the Hadoop cluster and different big data analytic tools, including Pig, the HBase database and Sqoop.
- Used Spark for Parallel data processing and better performances.
- Involved in loading data from LINUX file system to HDFS.
- Exported the analyzed data to the relational databases using Sqoop for visualization and to generate reports for the BI team.
- Importing and exporting data into HDFS and Hive using Sqoop.
- Implemented test scripts to support test driven development and continuous integration.
- Installed and maintained Apache Hadoop, MapReduce, HDFS (Hadoop Distributed File System), Pig, HBase, ZooKeeper and Sqoop.
- Installed and configured Hadoop MapReduce and HDFS, and developed multiple MapReduce jobs in Java for data cleaning and preprocessing.
- Created Pig Latin scripts to sort, group, join and filter the enterprise-wide data.
- Involved in creating Hive tables, loading them with data and writing Hive queries that run internally as MapReduce jobs.
- Supported MapReduce programs running on the cluster.
- Analyzed large data sets by running Hive queries and Pig scripts.
- Tuned the performance of Pig queries.
- Mentored the analyst and test teams in writing Hive queries.
- Installed the Oozie workflow engine to run multiple MapReduce jobs.
- Worked with application teams to install operating system and Hadoop updates, patches and version upgrades as required.
Environment: Hadoop, HDFS, MapReduce, Hive, Pig, Sqoop, Linux, Java, Oozie, HBase
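The data-cleaning MapReduce jobs above follow the classic word-count shape; here is a cluster-free sketch of that map/reduce logic in plain Java (Hadoop's Mapper/Reducer interfaces are omitted so it runs standalone, and the input lines are invented):

```java
import java.util.*;

public class WordCountSketch {
    // Map phase: split each line into words; reduce phase: sum the counts per word.
    static Map<String, Integer> count(List<String> lines) {
        Map<String, Integer> counts = new TreeMap<>();
        for (String line : lines) {
            for (String word : line.toLowerCase().split("\\s+")) {
                if (word.isEmpty()) continue;        // data cleaning: drop blank tokens
                counts.merge(word, 1, Integer::sum); // reduce: accumulate per key
            }
        }
        return counts;
    }

    public static void main(String[] args) {
        Map<String, Integer> result = count(Arrays.asList("hadoop hive pig", "hive hadoop"));
        System.out.println(result); // {hadoop=2, hive=2, pig=1}
    }
}
```

On a real cluster the inner loop becomes a Mapper emitting (word, 1) pairs and the merge becomes a Reducer summing values for each key, with the framework handling the shuffle between them.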
Confidential, Westbrook, ME
Java/J2EE Developer
Responsibilities:
- Analyzed the requirements and communicated them to both the development and testing teams.
- Involved in designing the project using UML.
- Followed J2EE Specifications in the project.
- Designed the user interface pages in JSP.
- Used XML and XSL for mapping the fields in the database.
- Used JavaScript for client-side validations.
- Created the stored procedures and triggers required for the project.
- Created functions and views in Oracle. Responsible for updating database tables and designing SQL queries using PL/SQL.
- Developed the application by using the Spring MVC framework.
- Used Spring IoC to inject values for the dynamic parameters.
- Created bean classes for communicating with the database.
- Involved in documentation of the module and project.
- Prepared test cases and test scenarios as per business requirements.
- Unit tested the coded applications using JUnit.
Environment: Struts, Hibernate, Spring MVC, EJB, JSP, Servlets, JMS, XML, JavaScript, UML, HTML, JNDI, CVS, Log4j, JUnit, Windows 2000, WebSphere Application Server, RAD, Rational Rose, Oracle 9i
Confidential, San Antonio, TX
Java Developer
Description: Worked on the development of a Customer Order Sales Processing and Fulfillment System. The system deals with the sales portal and tracks customer orders after they have been fulfilled; it also tracks defects and customer satisfaction. Built a new Candidate Address System (CAS) that allowed the client to enter a postal code and returned a list of all street names in that postal code.
Responsibilities:
- Participated in different phases of the Software Development Lifecycle (SDLC), including requirements gathering, analysis, design, development and deployment of the application.
- Implemented the Model-View-Controller (MVC) design pattern with Struts MVC, Servlets, JSP, HTML, AJAX, JavaScript and CSS to control the flow of the application across the presentation/web tier, the application/business layer (JDBC) and the data layer (Oracle 10g).
- Performed the analysis, design, and implementation of software applications using Java, J2EE, XML and XSLT.
- Developed Action Forms and Controllers in Struts 2.0/1.2 framework.
- Utilized various Struts features like Tiles, tag libraries and declarative exception handling via XML for the design.
- Created XML Schema, XML template and used XML SAX/DOM API to parse them.
- Implemented design patterns such as Data Access Object (DAO), Value Object/Data Transfer Object (DTO) and Singleton.
- Developed JavaScript validations on order submission forms.
- Designed, developed and maintained the data layer using Hibernate.
- JUnit was used to do the Unit testing for the application.
- Used Apache Ant to compile Java classes and package them into a JAR archive.
- Used Clear Quest to keep track of the application bugs as well as to coordinate with the Testing team.
- Involved in tracking and resolving defects, which arise in QA & production environments.
Environment: Java, J2EE, JSP, Servlets, Struts 2.0/1.2, Hibernate, HTML, CSS, JavaScript, XML, JUnit, Apache Tomcat, PL/SQL, Oracle 11g, Apache Ant, Eclipse, Rational Rose.
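A compact sketch of the DAO and Singleton patterns named in this role, with an in-memory map standing in for the Oracle-backed store; the class and method names are illustrative, not the project's actual code:

```java
import java.util.*;

public class OrderDaoSketch {
    // Singleton: one shared DAO instance, created eagerly and never duplicated.
    private static final OrderDaoSketch INSTANCE = new OrderDaoSketch();
    private final Map<Integer, String> store = new HashMap<>(); // stand-in for Oracle

    private OrderDaoSketch() {} // private constructor enforces the Singleton

    public static OrderDaoSketch getInstance() { return INSTANCE; }

    // DAO methods hide the persistence details from the callers.
    public void save(int id, String order) { store.put(id, order); }
    public String findById(int id) { return store.get(id); }

    public static void main(String[] args) {
        OrderDaoSketch dao = OrderDaoSketch.getInstance();
        dao.save(1, "10 widgets");
        System.out.println(dao.findById(1)); // prints 10 widgets
    }
}
```

The point of the combination is that callers depend only on save/findById, so swapping the map for JDBC calls against Oracle changes nothing above the DAO.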
Confidential
Jr. Java Developer
Responsibilities:
- Used the Hibernate ORM tool as the persistence layer, using the database and configuration data to provide persistence services (and persistent objects) to the application.
- Implemented Oracle Advanced Queuing using JMS and message-driven beans.
- Responsible for developing the DAO layer using Spring MVC, along with the Hibernate configuration XMLs, and for managing CRUD operations (create, read, update, delete).
- Implemented dependency injection using the Spring framework.
- Developed reusable services using BPEL to transfer data.
- Participated in Analysis, interface design and development of JSP.
- Configured log4j to enable/disable logging in application.
- Wrote SPA (Single Page Web Applications) using RESTFUL web services plus Ajax and AngularJS.
- Developed a rich user interface using HTML, JSP, AJAX, JSTL, JavaScript, jQuery and CSS.
- Wrote UNIX Shell scripts and used UNIX environment to deploy the EAR and read the logs.
- Implemented Log4j for logging purpose in the application.
- Involved in code deployment activities for different environments.
- Implemented agile development methodology.
Environment: Java, Spring, Hibernate, JMS, EJB, WebLogic Server, JDeveloper, SQL Developer, Maven, XML, CSS, JavaScript, JSON.
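The Spring dependency injection used above boils down to constructor injection, which can be shown without the framework; the interface and class names below are invented for illustration:

```java
public class DiSketch {
    interface MessageSender { String send(String text); }

    // A concrete implementation the IoC container would normally wire in.
    static class JmsSender implements MessageSender {
        public String send(String text) { return "queued: " + text; }
    }

    // The service receives its dependency through the constructor,
    // so tests can pass a stub instead of a real JMS sender.
    static class OrderService {
        private final MessageSender sender;
        OrderService(MessageSender sender) { this.sender = sender; }
        String placeOrder(String order) { return sender.send(order); }
    }

    public static void main(String[] args) {
        OrderService service = new OrderService(new JmsSender());
        System.out.println(service.placeOrder("order-42")); // prints queued: order-42
    }
}
```

Spring's contribution is doing this wiring from configuration (XML or annotations) instead of the explicit `new` calls in main.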