
Hadoop Developer Resume


SUMMARY

  • Senior Developer with 12+ years of experience in the analysis, design, development, and testing of web and stand-alone applications.
  • Experience in importing and exporting data with Sqoop between HDFS and relational database systems (RDBMS) in both directions (see the Sqoop sketch after this list).
  • Supported a Data Ingestion application with 500+ Hadoop production jobs covering 1600+ tables and 50+ databases.
  • Experience in designing and developing web applications using Java (J2EE), JavaScript, HTML, XML, XSLT, DB2, and web services.
  • In-depth knowledge of J2EE, including the Struts and Spring frameworks.
  • Experience with tools such as RAD, ClearCase, ClearQuest, and PuTTY.
  • Completed a one-week external training on MarkLogic delivered by a professional from MarkLogic Corporation.
  • During the training, gained hands-on practice with XML, XSD, XQuery, and XPath.
  • Hands-on experience with the following:
  • Loading and managing XML, JSON, and binary documents using MarkLogic Content Pump (MLCP), REST, and XQuery APIs (see the MLCP sketch after this list)
  • Using XPath to navigate XML documents, work with element and attribute values, and implement predicates
  • Using the content display framework to display XML and binary documents from a MarkLogic server database
  • Writing XQuery code using FLWOR expressions (see the FLWOR sketch after this list)
  • Working with node, string, number, date, and basic text-search XQuery functions
  • Building search queries using the Java, REST, Search, and CTS APIs
  • Creating reusable functions using MarkLogic library modules
  • Created a Spring Boot application for a POC.
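
For illustration, a minimal Sqoop import/export sketch; the connection string, credentials, table names, and HDFS paths below are hypothetical placeholders:

    # Import a DB2 table into HDFS (hypothetical connection details).
    sqoop import \
      --connect jdbc:db2://db2host:50000/SALESDB \
      --username etl_user --password-file /user/etl/.db2pass \
      --table CUSTOMERS \
      --target-dir /data/raw/customers \
      --num-mappers 4

    # Export processed HDFS data back to the same RDBMS.
    sqoop export \
      --connect jdbc:db2://db2host:50000/SALESDB \
      --username etl_user --password-file /user/etl/.db2pass \
      --table CUSTOMERS_SUMMARY \
      --export-dir /data/processed/customers_summary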
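
A minimal MLCP load sketch along the same lines; host, port, credentials, and paths are hypothetical:

    # Load a directory of documents into MarkLogic via MLCP,
    # rewriting the filesystem path into a cleaner URI prefix.
    mlcp.sh import \
      -host localhost -port 8000 \
      -username admin -password admin \
      -input_file_path /data/docs \
      -input_file_type documents \
      -output_uri_replace "/data/docs,'/content'"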
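
And a FLWOR sketch, shown here sent to a MarkLogic REST instance's /v1/eval endpoint; the server address, credentials, collection, and element names are hypothetical:

    # Evaluate an ad hoc FLWOR expression against the server.
    curl --digest -u admin:admin -X POST http://localhost:8000/v1/eval \
      --data-urlencode 'xquery=
        for $doc in fn:collection("policies")
        let $id := $doc//policy/@id
        where $doc//policy/status = "active"
        order by $id
        return fn:string($id)'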

TECHNICAL SKILLS

Operating System: Windows 98/2000/XP/10.

Data Access Technologies: DB2, JDBC and ODBC.

RDBMS: MS-SQL Server 2000/2005, Oracle, DB2.

Programming Languages: Java (J2EE), SQL.

Web Technology: HTML, CSS, JSP, Servlet.

Frameworks: Spring, Spring Boot, and Struts for Java/J2EE; AEFW Confidential framework.

Servers: Tomcat and WebSphere.

Scripting and Markup: XSLT, XPath, XML, JavaScript, UNIX shell scripting, and Apache Ant.

Tools & Utilities: Eclipse SDK 3.2.2, NetBeans IDE 6.0.1, MyEclipse IDE v7.1.1, Rational Application Developer (RAD 6.0/6.1/7.5), PuTTY, Beyond Compare, PL/SQL, Spring Tool Suite, and Aginity.

XML Parsers: JAXP, SAX, and DOM.

Config Management Tools: Subclipse 1.0, TortoiseSVN, and ClearCase.

Big Data Ecosystem: Hadoop, HDFS, Hive, Sqoop, Pig.

PROFESSIONAL EXPERIENCE

Confidential

Operating System: Windows 10

Tools: Hadoop, Spark, Scala, Hive, Sqoop

Hadoop Developer

Responsibilities:

  • Support integration to/from the Confidential Dataland platform and the Confidential PeopleSoft Enterprise application.
  • Work with the Infrastructure team to set up and integrate the Big Data tools and frameworks required to provide requested capabilities.
  • Review and provide recommendations to the Infrastructure team on management of the Hadoop cluster and all included services, such as Hive, HBase, MapReduce, Spark, Python, Scala, and Sqoop.
  • Clean data per business requirements using streaming APIs or user-defined functions (UDFs).
  • Assess the quality of datasets for a Hadoop data lake.
  • Troubleshoot and debug Hadoop ecosystem runtime issues.
  • Perform data analysis, metadata and data-cataloguing work, and data quality/trending analysis and reporting, working with large databases and external systems including PeopleSoft Enterprise.
  • Provide balancing of data (source-to-target reconciliation) using big data querying tools such as Pig (see the sketch after this list).
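
A minimal sketch of what such balancing can look like, shown here with sqoop eval and beeline rather than Pig; the connection strings, credentials, and table names are hypothetical, and the output parsing is deliberately simple:

    # Row count on the source RDBMS side, via sqoop eval
    # (grab the first number from the tabular output rows).
    SRC=$(sqoop eval \
      --connect jdbc:oracle:thin:@dbhost:1521/ORCL \
      --username etl_user --password-file /user/etl/.orapass \
      --query "SELECT COUNT(*) FROM HR.EMPLOYEES" \
      | grep '^|' | grep -o '[0-9]\+' | head -1)

    # Row count on the ingested Hive side, via beeline.
    TGT=$(beeline -u jdbc:hive2://hivehost:10000/default \
      --silent=true --outputformat=tsv2 --showHeader=false \
      -e "SELECT COUNT(*) FROM hr.employees")

    # Report whether source and target balance.
    [ "$SRC" = "$TGT" ] && echo "BALANCED ($SRC rows)" \
      || echo "MISMATCH: src=$SRC tgt=$TGT"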

Confidential

Operating System: Windows XP/Win7

Tools: Java, DB2, HTML5, CSS, Hadoop, Hive, Python, Git, Eclipse, Spring

Environment: Hue, Aginity Workbench for Hadoop, Zeke, AWS S3, Python, UNIX shell scripting, crontab, Tidal, Rally, JIRA, SNOW, Beeline, Eclipse IDE, GitHub

Senior Developer

Responsibilities:

  • Worked on the end-to-end data ingestion process for requests created and approved by the Data Governance team.
  • Supported a Data Engineering application with 500+ Hadoop data ingestion jobs covering 1600+ tables and 50+ databases.
  • Created various Sqoop import data-ingestion jobs to ingest data from RDBMSs such as DB2 z/OS, DB2 LUW, Teradata, Oracle, Netezza, SQL Server, and MySQL into HDFS and Hive (see the Hive-import sketch after this list).
  • Ingested 100+ internal/external data sources into a secure Hadoop Data Fabric platform.
  • Loaded XML, JSON, TXT, PSV, and CSV fixed-width/delimited files into Hive external partitioned tables (see the DDL sketch after this list).
  • Worked with Data Governance and Data Stewards in reviewing the data dictionaries of both source inputs and target Hadoop Hive tables.
  • Extracted AWS S3 files to the edge node and ingested them further into the Hadoop data lake.
  • Transferred mainframe files in ASCII/EBCDIC format to the HDFS cluster and created shell scripts to fetch the latest GDG files from the mainframe and FTP them to HDFS (see the script sketch after this list).
  • Served as Senior Developer on batch-job-based and web-based development projects.
  • Involved in requirement analysis, development, and defect fixing for the Member Welcome JSON UI using Spring.
  • Managed and developed team capability for resolving complex incidents.
  • Performed peer reviews for all code changes.
  • Worked extensively with the Java application integrated with Hadoop.
  • Reported any problems or faults in the project to the project manager or supervisor.
  • Developed and delivered requirements and defect fixes.
  • Developed parsers in Java for different sets of health policy XMLs and their dependent XMLs.
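
A minimal sketch of one such Sqoop import landing directly in Hive; the connection string, credentials, and table names are hypothetical:

    # Import a SQL Server table directly into a Hive table.
    sqoop import \
      --connect "jdbc:sqlserver://sqlhost:1433;database=SALES" \
      --username etl_user --password-file /user/etl/.mspass \
      --table ORDERS \
      --hive-import --hive-database staging --hive-table orders \
      --num-mappers 8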
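
A DDL sketch for such an external partitioned table, run through beeline; database, column, and path names are hypothetical, and the '|' delimiter matches PSV input:

    beeline -u jdbc:hive2://hivehost:10000/default -e "
    CREATE EXTERNAL TABLE IF NOT EXISTS staging.claims (
      claim_id  STRING,
      member_id STRING,
      amount    DECIMAL(12,2)
    )
    PARTITIONED BY (load_date STRING)
    ROW FORMAT DELIMITED FIELDS TERMINATED BY '|'
    STORED AS TEXTFILE
    LOCATION '/data/raw/claims';

    ALTER TABLE staging.claims ADD IF NOT EXISTS
      PARTITION (load_date='2020-01-15')
      LOCATION '/data/raw/claims/2020-01-15';"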
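
And a sketch of the GDG transfer script; the host, credentials, dataset, and path names are hypothetical, and FTP's ASCII mode handles the EBCDIC-to-ASCII conversion for text datasets:

    #!/bin/bash
    # Fetch the latest generation (0) of a GDG from the mainframe.
    printf '%s\n' \
      'user mfuser mfpass' \
      'ascii' \
      "get 'PROD.CLAIMS.EXTRACT(0)' /tmp/claims_extract.txt" \
      'bye' | ftp -n mainframe.example.com

    # Land the file in HDFS and clean up the local copy.
    hdfs dfs -put -f /tmp/claims_extract.txt /data/landing/claims/
    rm -f /tmp/claims_extract.txt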

Confidential

Operating System: Windows XP/Win7

Tools: Java, JSP, Struts, and Spring Web Services.

Developer

Responsibilities:

  • Served as an onsite/offshore developer (Agile) on this project.
  • Responsible for performing analysis, design, coding, and unit testing for various modules in the project.
  • Performed peer reviews of designs and code.
  • Involved in fixing defects.
  • Mentored new team members on the application's functionality.
  • Responsible for provisioning application access across all onsite environments.

Confidential

Operating System: Windows XP/Win7

Tools: Java, JSP, Spring, WebSphere Portal, and Spring Web Services.

Onsite Tech Lead

Responsibilities:

  • Served as the Onsite Tech Lead for this project.
  • Responsible for performing analysis, design, coding, and unit testing for various modules in the project.
  • Interacted with Confidential business users and conducted feasibility studies of the various requests raised by clients.
  • Performed peer reviews of designs and code.
  • Coordinated between onsite and offshore teams.
  • Involved in fixing defects.

Confidential

Operating System: Windows XP/Win7

Tools: JSP, Struts, Spring 2.5.1, AEFW framework / RAD 7.5.5, DB2 8.1/9.7.3, Web Services.

Team Member

Responsibilities:

  • Involved in all phases of the SDLC: requirement analysis, design, coding, testing, and implementation.
  • Documented the design.
  • Guided and helped team members implement their tasks.
  • Reviewed code and unit test cases.
  • Supported the system testing team.

Confidential

Operating System: Windows XP/Win7

Tools: J2EE, Struts, Spring, AEFW framework / RAD 6.0, DB2 8.1/9.0.

Team Member

Responsibilities:

  • Involved in all phases of the SDLC: requirement analysis, design, coding, testing, and implementation.
  • Documented the design.
  • Guided and helped team members implement their tasks.
  • Reviewed code and unit test cases.
  • Supported the system testing team.

Confidential

Operating System: Windows XP/Win7

Tools: Java 1.6, Struts 1.2, .NET Framework, MyEclipse IDE v7.1.1, Spring Web Flow 2.0.5, Tomcat

Team Member

Responsibilities:

  • Involved in all phases of the SDLC: requirement analysis, design, coding, testing, and implementation.
