
Hadoop Developer Resume


Phoenix, AZ

SUMMARY

  • 9 years of IT experience, including requirement gathering, design, development, and implementation of Hadoop and data warehousing solutions as well as embedded applications.
  • Over 3 years of experience working with Apache Hadoop components such as HDFS, MapReduce, Hive, Pig, Sqoop, and Oozie, and with Big Data analytics.
  • Excellent understanding of Hadoop architecture and its components, such as HDFS, JobTracker, TaskTracker, NameNode, and DataNode, and of the MapReduce programming paradigm.
  • Familiar with Flume and HBase technologies.
  • Experience in analyzing data using HiveQL, Pig Latin, and custom MapReduce programs in Java; extended Hive and Pig core functionality by writing custom UDFs (a sketch follows this list).
  • Experience in importing and exporting data using Sqoop between HDFS and relational database systems.
  • Hands on experience in installing, configuring, and using Hadoop ecosystem components like Hadoop MapReduce, HDFS, HBase, Zookeeper, Oozie, Hive, Sqoop, Pig, and Flume.
  • Experience in data extraction and transformation using MapReduce jobs.
  • Deep understanding of data import and export from relational database into Hadoop cluster
  • Expertise in writing MapReduce programs using Java.
  • Well versed in Core Java, J2EE, Web Services (SOA), JDBC, Swing, and MySQL.
  • Very Good understanding of SQL, ETL and Data Warehousing Technologies.
  • Active Member of the Hadoop Core Team, actively participating in defining various Hadoop Design Standards and getting them implemented.
  • Strong data modeling skills: understanding the data requirements and building the corresponding logical and physical data models.
  • Strong data analysis and data validation skills, with good exposure to the entire Software Development Lifecycle (SDLC).
  • Expert in requirements collection and analysis, with excellent troubleshooting and debugging skills.
  • Expert in Building, Deploying and Maintaining the Applications.
  • Very good understanding and working knowledge of object-oriented programming (OOP) and multithreading.
  • Very Good understanding and working Knowledge in Cloud Technologies.
  • Experienced in the embedded, automotive, avionics, and telecom domains, across both product development and maintenance. Able to contribute individually and work well with minimal supervision.
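As a minimal illustration of the custom Hive UDF work referenced in the summary above, the sketch below shows a hypothetical Java UDF built on the classic hive-exec UDF API; the class name and the trim/upper-case behavior are assumptions for illustration only.

import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

// Hypothetical Hive UDF: normalizes a string column by trimming and upper-casing it.
// Hive calls evaluate() once per input row; returning null preserves SQL NULL semantics.
public final class NormalizeText extends UDF {
    public Text evaluate(final Text input) {
        if (input == null) {
            return null;
        }
        return new Text(input.toString().trim().toUpperCase());
    }
}

Packaged into a JAR, a UDF of this kind would typically be registered in Hive with ADD JAR and CREATE TEMPORARY FUNCTION before being invoked from HiveQL.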

TECHNICAL SKILLS

Open Source Tools & Technologies: Cloudera Hadoop, Java technologies, C, Windows SDK, TNSDL, OpenGL

Development Tools & Operating Systems: Java, J2EE, Pig Latin, Hive, Oozie, HiveQL, Hibernate, Spring, JUnit, Ant, SQL scripting, Shell scripting, R programming, Linux, Mac OS X

Build Tools: Maven, VSTS 2010, JAM, DDK

Databases: Oracle 9i/10g, HDFS, MySQL, XML, Microsoft SQL Server 2000, MS Access

Tools & Utilities: Subversion, Team Foundation Server (TFS), Rational ClearCase, PVCS, Confluence, VMware, Eclipse, Microsoft Visio & Office, TOAD, SQL Developer, CodeFlow, Source Control

Application Servers: Apache Tomcat

PROFESSIONAL EXPERIENCE

Confidential, Phoenix, AZ

Hadoop Developer

Responsibilities:

  • Extracted and transformed data using Hive and MapReduce jobs (a minimal MapReduce sketch follows the Environment line for this role).
  • Responsible for managing data coming from different sources.
  • Supported MapReduce programs running on the cluster.
  • Gathered requirements from the business partners.
  • Understood the data requirements, prepared the requirements document, and obtained agreement from the client that all data requirements were complete and understood.
  • Experience installing Hadoop Ecosystem components.
  • Experience managing and reviewing the Hadoop log files.
  • Involved in HDFS maintenance and in loading data using Sqoop.
  • Wrote Hive queries for data analysis to meet the business requirements.
  • Creating Hive tables and working on them using Hive QL.
  • Used Oozie workflows to load data into Hive tables.
  • Installed and configured Pig and wrote Pig Latin scripts.
  • Wrote MapReduce jobs, using Pig Latin, to move data from MySQL to HDFS on a regular basis.
  • Developed Java UDFs for operational assistance.
  • Used Flume to move large amounts of log data from many different sources to a centralized data store.
  • Developed batch jobs and scripts to schedule various Hadoop programs.
  • Held weekly meetings with technical collaborators and actively participated in code review sessions with senior and junior developers.

Environment: Hadoop, MapReduce, HDFS, Hive, Sqoop, Java, Flume, Oozie, UNIX Shell Scripting.
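As an illustration of the extract-and-transform MapReduce work listed for this role, the following is a hedged Java sketch; the job name, class names, and tab-delimited record layout (with the event type assumed to be in the second column) are illustrative assumptions rather than project specifics.

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

// Illustrative MapReduce job: counts records per event type in tab-delimited log files.
public class EventCountJob {

    // Emits (eventType, 1) for every well-formed input line.
    public static class EventMapper extends Mapper<LongWritable, Text, Text, LongWritable> {
        private static final LongWritable ONE = new LongWritable(1);
        private final Text eventType = new Text();

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            String[] fields = value.toString().split("\t");
            if (fields.length > 1) {                   // assumed layout: event type in column 2
                eventType.set(fields[1]);
                context.write(eventType, ONE);
            }
        }
    }

    // Sums the per-event-type counts.
    public static class SumReducer extends Reducer<Text, LongWritable, Text, LongWritable> {
        @Override
        protected void reduce(Text key, Iterable<LongWritable> values, Context context)
                throws IOException, InterruptedException {
            long sum = 0;
            for (LongWritable v : values) {
                sum += v.get();
            }
            context.write(key, new LongWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "event-count");
        job.setJarByClass(EventCountJob.class);
        job.setMapperClass(EventMapper.class);
        job.setCombinerClass(SumReducer.class);
        job.setReducerClass(SumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(LongWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}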

Confidential

Module Lead

Responsibilities:

  • Responsible for creating the system user interface.
  • Developed and implemented the GUI of the system (Java 2.0, Swing).
  • Developed server-side business logic modules using JDBC.
  • Proficient in database development with MySQL.
  • Assisted in the development of the System Management and Outpatient modules using Java and Swing.
  • Prepared unit test cases.
  • Participated in code reviews conducted by peers.
  • Development of several screens using JSP, HTML and JavaScript.
  • Modified Action classes, utility classes, and Java Server Pages to facilitate a new journey for WLR3 customers.
  • Implemented business logic using JavaBeans and performed integration testing of the application.
  • Implemented database interactions using JDBC with MySQL Server (a minimal sketch follows this list).
  • Performed client-side validations using JavaScript.
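A minimal sketch of the JDBC-with-MySQL interaction described above; the connection URL, credentials, and patient table are placeholders invented for the example, and the modern try-with-resources idiom is used for brevity.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

// Hypothetical data-access class: looks up a patient name by id over JDBC.
public class PatientDao {

    private static final String URL = "jdbc:mysql://localhost:3306/hospital"; // placeholder
    private static final String USER = "appuser";                             // placeholder
    private static final String PASSWORD = "secret";                          // placeholder

    // Returns the patient's name for the given id, or null if no row matches.
    public String findPatientName(int patientId) throws SQLException {
        String sql = "SELECT name FROM patients WHERE id = ?";
        try (Connection conn = DriverManager.getConnection(URL, USER, PASSWORD);
             PreparedStatement ps = conn.prepareStatement(sql)) {
            ps.setInt(1, patientId);
            try (ResultSet rs = ps.executeQuery()) {
                return rs.next() ? rs.getString("name") : null;
            }
        }
    }
}

Using a PreparedStatement rather than string concatenation keeps the query safe from SQL injection and lets the driver reuse the statement.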

Confidential

Module Lead

Responsibilities:

  • Prepared test cases and tested the application
  • Created different kinds of setups for different environments.
  • Documented the setups for next level maintenance.
  • Bug fixing.
  • Created a simulator for test sequences to automate testing.

Confidential

Module Lead

Responsibilities:

  • Implemented the client requirements as per the IS for the MRF module, i.e., the activation phase of the file-based provisioning plan, and performed unit testing for the same.
  • Tele Confidential Standard Descriptive Language (TNSDL) was used for the implementation.
  • The DMX compiler was used to compile the code, and the Hit tool was used to run the scripts for unit testing.
  • Participated in pronto fixing.
  • Implemented the scripts for unit testing.
  • Pronto fixing for existing issues.

Confidential

Sr. Developer

Responsibilities:

  • Worked as a senior developer to implement the simulator for the KSN770 product support application.
  • Responsible for creating a graphics application for the user interface to support applications for KSN products, using OpenGL.
  • Implemented user interface controls and provided click and knob callbacks on the software application window using C++.
  • Worked on the Nucleus real-time operating system.
  • Designed and developed 3D objects using the Format and Menu Designer tools that generate OpenGL (Graphics Library) output.
  • Loading and testing the software application on hardware setup.
  • This was developed on the MPC architecture and runs on Nucleus Plus.
  • Involved in project planning, task creation, estimation and assignments.
  • Responsible for Design of Functional Requirement Specifications.
  • Responsible for bug fixing.

Confidential

Sr. Developer

Responsibilities:

  • Worked as a senior developer to support new printers for forms automation.
  • Integrated the LZW compression algorithm into the driver for bitmaps (a sketch of the core loop follows this list).
  • Released the build to the client with the full checklist completed.
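For context, the core of the LZW compression mentioned above looks roughly like the following Java sketch of the textbook algorithm; it is an illustration only, not the actual driver integration, and treats the input as a string of 8-bit characters for simplicity.

import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Illustrative LZW compressor: maps an input string of 8-bit chars to a list of codes.
public class LzwCompressor {

    public static List<Integer> compress(String input) {
        Map<String, Integer> dictionary = new HashMap<>();
        for (int i = 0; i < 256; i++) {                  // seed with all single-byte strings
            dictionary.put(String.valueOf((char) i), i);
        }
        int nextCode = 256;

        String current = "";
        List<Integer> output = new ArrayList<>();
        for (char c : input.toCharArray()) {
            String candidate = current + c;
            if (dictionary.containsKey(candidate)) {
                current = candidate;                     // keep extending the current match
            } else {
                output.add(dictionary.get(current));     // emit code for the longest match
                dictionary.put(candidate, nextCode++);   // learn the new phrase
                current = String.valueOf(c);
            }
        }
        if (!current.isEmpty()) {
            output.add(dictionary.get(current));         // flush the final match
        }
        return output;
    }
}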
