Hadoop Developer Resume
Ohio
SUMMARY:
- Around 4 years of IT experience in analysis, design, development, implementation, maintenance, and support, with experience in developing strategic methods for deploying big data technologies to efficiently solve Big Data processing requirements.
- Around 3 years of experience in Big Data using the Hadoop framework and related technologies such as HDFS, HBase, MapReduce, Hive, Pig, Flume, Oozie, Sqoop, Spark, and ZooKeeper.
- Experience in data analysis using Hive, Pig Latin, HBase, and custom MapReduce programs in Java.
- Excellent understanding and knowledge of Hadoop (Gen-1 and Gen-2) and its components, such as HDFS, JobTracker, TaskTracker, NameNode, DataNode, and the YARN ResourceManager.
- Experience in managing and reviewing Hadoop log files.
- Excellent understanding and knowledge of NoSQL databases like HBase.
- Experience in importing and exporting data using Sqoop from HDFS to Relational Database Systems (RDBMS) and from RDBMS to HDFS.
- Implemented Hadoop based data warehouses, integrated Hadoop with Enterprise Data Warehouse systems.
- Experience in developing jobs using Spark framework modules like Spark Core, Spark SQL, and Spark Streaming.
- Good experience working with the Hortonworks Distribution and the Cloudera Distribution.
- Experience in Object-Oriented Analysis and Design (OOAD) and software development using UML methodology, with good knowledge of J2EE and Core Java design patterns.
- Experience in designing both time driven and data driven automated workflows using Oozie.
- Experience in writing UNIX shell scripts.
- Experience working with Java, J2EE, JDBC, ODBC, JSP, Eclipse, JavaBeans, EJB, Servlets, and MS SQL Server.
- Experience in all stages of the SDLC (Agile, Waterfall): writing technical design documents, development, testing, and implementation of enterprise-level data marts and data warehouses.
TECHNICAL SKILLS:
Technology: Hadoop Ecosystem, J2SE/J2EE, Databases
Operating Systems: Windows Vista/XP/NT/2000, Linux (Ubuntu, CentOS), UNIX
DBMS/Databases: MySQL, SQL
Programming Languages: C, C++, Core Java, XML, JSP/Servlets.
Big Data Ecosystem: HDFS, MapReduce, Oozie, Hive, Pig, Sqoop, Flume, ZooKeeper, Spark, and HBase.
Methodologies: Agile, Waterfall
NOSQL Databases: HBase
Version Control Tools: SVN, GitHub
PROFESSIONAL EXPERIENCE:
Hadoop Developer
Confidential, Ohio
Responsibilities:
- Involved in loading data from the UNIX file system to HDFS.
- Created Hive tables and worked on them using HiveQL.
- Involved in creating Hive tables, loading them with data, and writing Hive queries that run internally as MapReduce jobs.
- Experience in installing, configuring, and using Hadoop ecosystem components.
- Importing and exporting data into HDFS and Hive using Sqoop.
- Experienced in defining job flows.
- Experienced in managing and reviewing Hadoop log files.
- Provided cluster coordination services through ZooKeeper.
- Imported data using Sqoop to load data from MySQL to HDFS on a regular basis.
- Developed scripts and batch jobs to schedule various Hadoop programs.
- Wrote Hive queries for data analysis to meet business requirements.
- Involved in requirement-gathering sessions together with Business Analysts.
- Worked closely with BAs and vendors to create technical documents and design specifications.
- Writing Hive queries to read from HBase.
- Writing Shell scripts to automate the process flow.
- Serialized JSON data and stored it in Hive tables (see the JSON sketch after this list).
- Experience in loading data from Hive/SQL and performing Spark transformations using Spark RDDs.
- Experience working with DataFrames and Datasets in Spark.
- Involved in converting Hive/SQL queries into Spark transformations using Spark RDDs (see the Spark sketch after this list).
- Applied Hive data sampling, bucketing, and clustering methods to the schema.
- Wrote Hadoop job workflows and scheduled them using Oozie.
- Good experience with the Oozie framework and automating daily import jobs.
- Implemented partitioning, dynamic partitions, and buckets in Hive for efficient data access (see the partitioning sketch after this list).
- Created an e-mail notification service that, upon job completion, notifies the team that requested the data.
- Defined job workflows according to their dependencies in Oozie.
- Developed Hive queries to process data for visualization.
- Involved in HP Application Lifecycle Management and JIRA (Agile methodology) for task distribution and estimates.
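Below is a minimal sketch of how a Hive/SQL query can be expressed as Spark transformations, as referenced above. It assumes a Hive-enabled SparkSession; the database, table, and column names (sales.orders, region, amount, order_date) are illustrative, not taken from the original project.

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import static org.apache.spark.sql.functions.*;

public class HiveToSparkExample {
    public static void main(String[] args) {
        // Hive-enabled session; assumes the cluster's Hive metastore is reachable.
        SparkSession spark = SparkSession.builder()
                .appName("HiveToSparkExample")
                .enableHiveSupport()
                .getOrCreate();

        // Equivalent of: SELECT region, SUM(amount) FROM sales.orders
        //                WHERE order_date >= '2016-01-01' GROUP BY region
        Dataset<Row> orders = spark.table("sales.orders");
        Dataset<Row> totals = orders
                .filter(col("order_date").geq(lit("2016-01-01")))
                .groupBy(col("region"))
                .agg(sum(col("amount")).alias("total_amount"));

        totals.show();
        spark.stop();
    }
}
```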
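The JSON-to-Hive step above can be sketched with Spark's JSON reader, one option within the stack listed here (the original flow may equally have used a Hive JSON SerDe). The input path and target table name are placeholders.

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SaveMode;
import org.apache.spark.sql.SparkSession;

public class JsonToHiveExample {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("JsonToHiveExample")
                .enableHiveSupport()
                .getOrCreate();

        // Spark infers the schema from the JSON documents; path is a placeholder.
        Dataset<Row> events = spark.read().json("hdfs:///data/incoming/events.json");

        // Persist as a managed Hive table so downstream Hive queries can read it.
        events.write().mode(SaveMode.Overwrite).saveAsTable("analytics.events");

        spark.stop();
    }
}
```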
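A minimal sketch of the partitioned, bucketed Hive table mentioned above, issued over Hive JDBC; the HiveServer2 URL, table, and column names are assumptions for illustration.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class HivePartitionedTableExample {
    public static void main(String[] args) throws Exception {
        // Explicit driver load; optional with JDBC 4 auto-discovery.
        Class.forName("org.apache.hive.jdbc.HiveDriver");

        // HiveServer2 endpoint; host, port, and credentials are placeholders.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:hive2://hive-host:10000/default", "hive", "");
             Statement stmt = conn.createStatement()) {

            // Partition by date and bucket by customer id for efficient scans and joins.
            stmt.execute(
                "CREATE TABLE IF NOT EXISTS orders_part (" +
                "  order_id BIGINT, customer_id BIGINT, amount DOUBLE) " +
                "PARTITIONED BY (order_date STRING) " +
                "CLUSTERED BY (customer_id) INTO 16 BUCKETS " +
                "STORED AS ORC");

            // Dynamic partitioning lets Hive derive the partition from the data.
            stmt.execute("SET hive.exec.dynamic.partition.mode=nonstrict");
            stmt.execute(
                "INSERT OVERWRITE TABLE orders_part PARTITION (order_date) " +
                "SELECT order_id, customer_id, amount, order_date FROM orders_staging");
        }
    }
}
```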
Environment: Hadoop Framework, MapReduce, Hive, Sqoop, Pig, HBase, Flume, Oozie, Java (JDK 1.8), UNIX Shell Scripting, Oracle 11g/12c, Windows NT, NiFi.
Hadoop Developer
Confidential, OH
Responsibilities:
- Participated in the development and implementation of Cloudera's Hadoop environment.
- Tracked overall mapping progress by preparing burn-down charts.
- Hands-on experience with Scala for batch processing and Spark Streaming data.
- Evaluated data sources and performed data mapping, data analysis, data profiling, and data quality checks.
- Wrote SQL queries and optimized them on Oracle and the Big Data Hadoop platform.
- Used Scala to write code for all Spark use cases.
- Performed data analysis and data profiling using complex SQL on various source systems, including Oracle and Hadoop; covered data validation, data cleansing, data verification, identification of data mismatches, data quality checks, and data migration using ETL tools.
- Wrote and tested SQL and PL/SQL statements: stored procedures, functions, triggers, and packages.
- Developed MapReduce code to transform unstructured data into structured data, which was later pushed from HDFS into HBase (see the MapReduce sketch after this list).
- Integrated Hive and HBase for effective usage and performed MRUnit testing for the MapReduce jobs.
- Involved in transforming data from Mainframe tables to HDFS and HBase tables using Sqoop and Pentaho Kettle.
- Designed and developed workflows by writing simple to complex Oozie workflow jobs as per the requirements.
- Created Hive Avro tables to read and write data with the appropriate compression.
- Good knowledge on Agile Methodology and the scrum process.
- Hands-on experience in developing web applications using Python on Linux and UNIX platforms.
- Experience in automation testing and the Software Development Life Cycle (SDLC) using the Waterfall model, with a good understanding of Agile methodology.
- Maintained the quality of the project per the standards.
- Implemented secure authentication for the Hadoop cluster using the Kerberos authentication protocol (see the keytab login sketch after this list).
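A minimal map-only sketch of the unstructured-to-structured MapReduce step described above. The input log format (date, time, user id, action) is an assumption, and the real job's parsing rules and the follow-on HBase load are omitted.

```java
import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class LogStructuringJob {

    // Parses raw lines like "2016-03-01 12:00:05 user42 LOGIN" into
    // tab-separated records keyed by user id; the format is illustrative.
    public static class ParseMapper extends Mapper<LongWritable, Text, Text, Text> {
        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            String[] parts = value.toString().trim().split("\\s+");
            if (parts.length < 4) {
                return; // skip malformed lines
            }
            String userId = parts[2];
            String structured = parts[0] + "\t" + parts[1] + "\t" + parts[3];
            context.write(new Text(userId), new Text(structured));
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "log-structuring");
        job.setJarByClass(LogStructuringJob.class);
        job.setMapperClass(ParseMapper.class);
        job.setNumReduceTasks(0); // map-only: just reshape the records
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(Text.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```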
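Programmatic Kerberos login from a Hadoop client is typically done via UserGroupInformation; a short sketch follows, with a placeholder principal and keytab path.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.UserGroupInformation;

public class KerberosLoginExample {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Tell the Hadoop client stack that the cluster is Kerberized.
        conf.set("hadoop.security.authentication", "kerberos");
        UserGroupInformation.setConfiguration(conf);

        // Principal and keytab path are placeholders.
        UserGroupInformation.loginUserFromKeytab(
                "etl-user@EXAMPLE.COM", "/etc/security/keytabs/etl-user.keytab");

        System.out.println("Logged in as: " + UserGroupInformation.getLoginUser());
    }
}
```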
Environment: Apache Hadoop, MapReduce, HDFS, Hive, Pig, ZooKeeper, Java (JDK 1.6), SQL, flat files, Oracle 11g/10g, MySQL, Windows NT, UNIX, Sqoop, Oozie, HBase.
Assoc. Java Developer
Confidential
Responsibilities:
- Designed and developed web services using Java/J2EE in a WebLogic environment.
- Developed web pages using Java Servlets, JSP, CSS, JavaScript, DHTML, and HTML.
- Added extensive Struts validation.
- Wrote Ant scripts to build and deploy the application.
- Involved in the analysis, design, development, and unit testing of business requirements.
- Developed business logic in Java/J2EE technology.
- Implemented business logic and generated WSDL for those web services using SOAP.
- Worked on developing JSP pages.
- Implemented the Struts framework.
- Modified Stored Procedures in Oracle Database.
- Developed the application using Spring Web MVC framework.
- Worked with Spring Configuration files to add new content to the website.
- Worked on the Spring DAO module and ORM using Hibernate. Used HibernateTemplate and HibernateDaoSupport for Spring-Hibernate communication.
- Configured association mappings such as one-to-one and one-to-many in Hibernate (see the mapping sketch after this list).
- Worked with JavaScript calls, as the search is triggered through JS calls when a search key is entered in the search window.
- Worked on analyzing other Search engines to make use of best practices.
- Collaborated with the Business team to fix defects.
- Worked on XML, XSL and XHTML files.
- As part of the team developing and maintaining an advanced search engine, gained exposure to a variety of new software technologies.
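A compact sketch of a Hibernate one-to-many association mapping like those described above, using JPA annotations (the project may have used XML mapping files instead); entity and column names are hypothetical.

```java
import java.util.ArrayList;
import java.util.List;
import javax.persistence.*;

// One-to-many association: one Department has many Employees.
@Entity
@Table(name = "department")
public class Department {
    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private Long id;

    private String name;

    // mappedBy points at the owning side's field; cascade keeps both in sync.
    @OneToMany(mappedBy = "department", cascade = CascadeType.ALL)
    private List<Employee> employees = new ArrayList<>();

    // getters and setters omitted for brevity
}

@Entity
@Table(name = "employee")
class Employee {
    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private Long id;

    private String name;

    // Owning side of the association holds the foreign key column.
    @ManyToOne
    @JoinColumn(name = "department_id")
    private Department department;

    // getters and setters omitted for brevity
}
```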
Environment: Java 1.6, J2EE, Eclipse SDK 3.3.2, Spring 3.x, jQuery, Oracle 10g, Hibernate, JPA, JSON, Apache Ivy, SQL, stored procedures, shell scripting, XML, HTML, JUnit, TFS, Ant, Visual Studio.