
Hadoop Developer Resume


Denver, CO

SUMMARY

  • More than 10 years of work experience across technologies ranging from Legacy Mainframe Applications (3 years) to Java/J2EE technologies (5 years) and Big Data Systems (2 years).
  • Experience in all phases of software development life cycle, including providing L2 Production Support.
  • Sound working knowledge of Big Data technologies like Apache Spark, Apache Hive, Pig, MRv2, HBase and ZooKeeper.
  • Strong knowledge of SQL on Oracle, DB2 and MySQL, and of data modelling.
  • Basic knowledge of NoSQL technologies like HBase and Cassandra.
  • Strong working knowledge of Unix shell scripting and Python programming.
  • Expertise in Object Oriented Programming using Java and J2EE related technologies.
  • Proficiency in developing secure web applications using JSP, Servlets, Java Beans, JavaScript, XML.
  • Working knowledge of developing REST-based web services using Java.
  • Working knowledge of XML technologies JAXP (DOM and SAX parsers) and JSON.
  • Hands-on experience with tools such as Eclipse, RAD, TextPad, EditPlus, Toad, the Linux vi editor, etc.
  • Worked extensively on various flavors of the UNIX operating system, like Linux and Solaris.
  • Hands-on experience with VPN, PuTTY, WinSCP, etc.
  • Experienced in writing Ant/Maven scripts to build and deploy Java applications.
  • Hands-on experience with project management tools like ClearCase and TFS.
  • Strong analytical skills with ability to quickly understand client’s business needs.
  • Involved in meetings to gather information and requirements from the clients.
  • Research-oriented, motivated, proactive, self-starter with strong technical, analytical and interpersonal skills.

TECHNICAL SKILLS

Big Data Frameworks: Apache Spark, Spark SQL, Hive, Pig, MRv2, ZooKeeper

Programming Languages (Expertise): Java, Python, Shell Scripting, SQL

Programming Languages: Scala, C, C++, PL/SQL, Mainframe Technologies

Web/XML Technologies: HTML, CSS, JavaScript, Servlets, JSP, XML, JSON

Apache Projects: Maven, Hadoop, Spark, Pig, Hive

Tools & Utilities: Eclipse, RAD, WSAD, PuTTY, WinSCP

Application/Web Servers: IBM WebSphere, Tomcat

RDBMS: Oracle, IBM DB2, MySQL

Source Control: Rational ClearCase, TFS, VSS, Changeman

Operating Systems: Windows 9x/2000/XP, Linux, UNIX, Sun Solaris, z/OS MVS

PROFESSIONAL EXPERIENCE

Confidential, Denver, CO

Hadoop Developer

Responsibilities:

  • Worked with the Hadoop Infrastructure team on capacity planning of our MapR Cluster across Dev, QA and Production regions.
  • Involved in design, analysis and architectural meetings.
  • Involved in migration of legacy applications to the Hadoop framework.
  • Worked on migrating the transaction pricing and invoicing applications from Oracle to HDFS, and subsequently to HBase, using Sqoop.
  • Worked on writing Pig and Hive queries to fetch and query data in HDFS.
  • Worked on developing the Apache Spark Simulation Pricing application using Python and Spark SQL.
  • As part of the Enterprise Logging Initiative, re-designed the logs of all the Java batch applications and loaded them into HDFS via Flume.
  • Worked on Oozie to schedule scripts across all clusters.

Environment: CDH 5, PuTTY, IPython, Java, Shell Scripting, Hive, Pig, Sqoop, Flume, SQL.
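The Simulation Pricing work above was done with Spark SQL; as an illustrative sketch only, the same tiered-pricing aggregation can be expressed in plain Python. The rate table, account names and amounts below are hypothetical, not from the project:

```python
from collections import defaultdict

# Hypothetical tiered rate schedule: (volume floor, rate per unit of volume).
RATES = [(0, 0.010), (10_000, 0.008), (50_000, 0.005)]

def tier_rate(volume):
    """Return the rate of the highest tier whose floor the volume meets."""
    rate = RATES[0][1]
    for floor, r in RATES:
        if volume >= floor:
            rate = r
    return rate

def price_by_account(transactions):
    """Sum transaction volume per account, then price each account at its tier rate."""
    volumes = defaultdict(float)
    for account, amount in transactions:
        volumes[account] += amount
    return {acct: round(vol * tier_rate(vol), 2) for acct, vol in volumes.items()}
```

In the real job the grouping and rate lookup would be Spark SQL aggregations over data in HDFS rather than an in-memory dictionary.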

Confidential, Denver, CO

Java Batch/J2EE Developer/Production Support

Responsibilities:

  • Extensively worked on back end of Relationship Pricing Tool application using Core Java, Python and Unix Shell Scripting.
  • Have working knowledge of the Java Struts and MVC frameworks.
  • Involved in design, development and testing phases of project.
  • Developed REST web services for internal Schwab downstream applications.
  • Involved in design, analysis and architectural meetings. Created Architecture Diagrams, and Flow Charts using Microsoft Visio.
  • Followed Agile software development practices: pair programming, test-driven development and scrum status meetings.
  • Developed use case diagrams, class diagrams, database tables, and mappings between relational database tables and object-oriented Java objects using Spring.
  • Used JUnit to test persistence and service tiers.
  • Used Eclipse Integrated Development Environment (IDE) in entire project development.
  • Worked on tools like Microsoft VISIO, TFS and JIRA for work and bug tracking.

Environment: Java 7, IBM WebSphere, Hibernate 3.0, Spring 2.0, REST Web Services, Log4j 1.4, Maven, Eclipse, JIRA, Microsoft TFS, Visio, PL/SQL and Linux.
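The REST services above were Java on WebSphere; as a hedged stand-in sketch of the same pattern, here is a minimal read-only JSON endpoint using only Python's standard library. The /accounts resource and sample data are hypothetical:

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical in-memory data store standing in for the real backend.
ACCOUNTS = {"1001": {"id": "1001", "tier": "gold"}}

class AccountHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Route GET /accounts/<id> to a JSON response; anything else is a 404.
        parts = self.path.strip("/").split("/")
        if len(parts) == 2 and parts[0] == "accounts" and parts[1] in ACCOUNTS:
            body = json.dumps(ACCOUNTS[parts[1]]).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):
        # Silence per-request console logging.
        pass

def serve(port=0):
    """Start the server on a background thread; port=0 picks a free port."""
    server = HTTPServer(("127.0.0.1", port), AccountHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

A Java implementation would typically express the same routing with JAX-RS annotations rather than manual path parsing.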

Confidential, Alpharetta, GA

Java Developer/Legacy Migration

Responsibilities:

  • Analyzed the current mainframe legacy applications written in MVS Assembler and COBOL, prepared documentation and designed the migration to Java.
  • Developed a new multi-threaded Core Java application to re-write the Triggers application.
  • Worked on a multi-threaded batch application in Java.
  • Performed L1 support on the legacy application until it was migrated to the new system.
  • Provided warranty support for the new application.

Environment: z/OS Mainframes, Core Java, Linux, IBM DB2.
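The multi-threaded batch work above was done in Core Java; the fan-out pattern it describes can be sketched minimally in Python (process_record is a hypothetical placeholder for the real trigger logic, and a Java version would use an ExecutorService the same way):

```python
from concurrent.futures import ThreadPoolExecutor

def process_record(record):
    # Hypothetical per-record work standing in for the trigger rules.
    return record * 2

def run_batch(records, workers=4):
    """Fan records out across a fixed worker pool; results keep input order."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(process_record, records))
```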

Confidential

Java Developer/Legacy Migration

Responsibilities:

  • Involved in the analysis of the constraints of the legacy application and the design of the new distributed system.
  • Derived logic from COBOL programs on the mainframe and prepared documentation for the same.
  • Involved in database migration from the legacy hierarchical IMS DB database to IBM DB2.
  • Involved in data modelling on the new IBM DB2 system using ER diagrams.
  • Developed a new web application for the legacy Rates/Reservations system using Struts framework.
  • Extensively used Struts server-side validation, Tiles and exception handling.
  • Developed build scripts for EAR and WAR applications to deploy on the WebLogic server.
  • Developed web services using the Axis SOAP engine.
  • Configured WebLogic for connection pools, data sources, JMS connection factories, JMS server queues and deployment of EAR and WAR files.
  • Tested persistence layer and service layer with transactions using JUnit test cases.

Environment: Java, J2EE, COBOL, MVS Assembler, JCL, IMS DB, IBM DB2.
