
Hadoop Developer Resume


Fairfax, VA

SUMMARY

  • 7+ years of professional experience working with Java and Big Data technologies: the Hadoop ecosystem (HDFS, MapReduce framework, Hive, Sqoop) and NoSQL databases (HBase).
  • Experience in installing, configuring, and troubleshooting Hadoop ecosystem components such as MapReduce, HDFS, Sqoop, Impala, Pig, Flume, Hive, HBase, and ZooKeeper.
  • Experience with Hadoop distributions CDH3, CDH4, and CDH5, as well as MapR.
  • Experience in upgrading the existing Hadoop cluster to latest releases.
  • Experienced in using NFS (Network File System) for NameNode metadata backup.
  • Experience in using Cloudera Manager 4.0, 5.0 for installation and management of Hadoop cluster.
  • Experience in supporting data analysis projects using Elastic MapReduce (EMR) on the Amazon Web Services (AWS) cloud.
  • Experience exporting data to and importing data from Amazon S3.
  • Configured NameNode HA on the existing Hadoop cluster using a ZooKeeper quorum.
  • Expertise in writing UNIX shell scripts using ksh and bash.
  • Experienced in developing and implementing MapReduce jobs in Java to process and perform various analytics on large datasets.
  • Good experience in writing Pig Latin scripts and Hive queries.
  • Good understanding of Data Structure and Algorithms.
  • Good experience in developing ETL scripts for data cleansing and transformation.
  • Experience in Data migration from existing data stores and mainframe NDM to Hadoop.
  • Experience in designing both time driven and data driven automated workflows using Oozie.
  • Experience in supporting analysts by administering and configuring HIVE.
  • Hands-on programming experience in various technologies like JAVA, JSP, Servlets, SQL, JDBC, HTML, XML, UNIX.
  • Experience writing SQL queries and working with Oracle and MySQL.
  • Expertise in object-oriented analysis and design (OOAD), UML, and the use of various design patterns.
  • Worked with end users on requirements gathering, user experience, and issue resolution.
  • Experience in preparing deployment packages, deploying to Dev and QA environments, and preparing deployment instructions for the production deployment team.
  • Team player with excellent analytical, communication and project documentation skills.
  • Experienced with Agile methodology and iterative development.
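The Java MapReduce experience listed above centers on map-side parsing of semi-structured records into structured output. As a hedged illustration (the class name, log format, and field positions are hypothetical, not taken from any project described here), the core parsing step can be sketched in plain Java:

```java
// Hypothetical sketch of map-side parsing logic: turn a semi-structured
// web-server log line into structured, tab-separated fields. Real-world
// log formats vary; this assumes a simplified common-log layout.
public class WebLogParser {

    // Expects: ip - - [timestamp] "METHOD /path PROTOCOL" status
    public static String parse(String line) {
        String[] parts = line.split(" ");
        String ip = parts[0];       // client IP
        String page = parts[5];     // request path inside the quoted request
        String status = parts[7];   // HTTP status code
        return ip + "\t" + page + "\t" + status;
    }

    public static void main(String[] args) {
        String line = "10.0.0.1 - - [01/Jan/2015:00:00:01] \"GET /index.html HTTP/1.1\" 200";
        System.out.println(parse(line)); // 10.0.0.1	/index.html	200
    }
}
```

In a real Hadoop job this logic would live inside a `Mapper` implementation; it is shown standalone here for clarity.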

TECHNICAL SKILLS

Programming Languages: Java, C, C++, SQL

Hadoop Ecosystems: HDFS, Hive, Pig, Flume, Impala, Oozie, Zookeeper, HBASE and Sqoop.

Operating Systems: Linux, Windows 7, Windows Server 2003, Windows Server 2008.

Databases: Oracle, MySQL and SQL Server.

Web Technologies: Applets, JavaScript, CSS, HTML and XHTML

Tools: Ant, Maven, TOAD, ArgoUML, WinSCP, PuTTY

Version Control: VSS, SVN, CVS and GIT

Web & Scripting: JSP & Servlets, XML, HTML, JSON, JavaScript, jQuery

IDE: Eclipse, NetBeans

PROFESSIONAL EXPERIENCE

Confidential, Fairfax, VA

Hadoop Developer

Responsibilities:

  • Worked on setting up a 100-node Hadoop cluster for the production environment.
  • Upgraded production Hadoop clusters from CDH4U1 to CDH5.2 and Cloudera Manager 4.x to 5.1.
  • Migrated production Hadoop clusters from MRv1 to YARN, including application migration.
  • Worked on pulling the data from relational databases, Hive into the Hadoop cluster using the Sqoop import for visualization and analysis.
  • Used Flume to collect, aggregate, and store the web log data from different sources like web servers, mobile and network devices and pushed to HDFS.
  • Developed Java MapReduce programs to transform semi-structured data into structured data.
  • Supported 300+ servers and 250+ users on the Hadoop platform, resolved tickets and issues, trained users to make Hadoop easier to use, and kept them updated on best practices.
  • Analyzed web log data using HiveQL to extract the number of unique visitors per day, page views, visit duration, and the most purchased products on the website.
  • Generated reports using Tableau report designer.

Technologies: Hadoop, HDFS, MapReduce, Hive, Sqoop, Pig, Oracle, XML, Cloudera Manager
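The HiveQL analysis described above (unique visitors per day) is essentially a `COUNT(DISTINCT ip)` grouped by day. A plain-Java sketch of the same aggregation, with a hypothetical record layout (the class and field positions are illustrative, not from the actual project):

```java
import java.util.*;

// Hypothetical sketch of the unique-visitors-per-day aggregation,
// roughly: SELECT day, COUNT(DISTINCT ip) FROM logs GROUP BY day.
public class UniqueVisitors {

    // Each record is {day, visitorIp}; returns day -> distinct visitor count.
    public static Map<String, Integer> countUnique(List<String[]> records) {
        Map<String, Set<String>> ipsByDay = new TreeMap<>();
        for (String[] r : records) {
            ipsByDay.computeIfAbsent(r[0], d -> new HashSet<>()).add(r[1]);
        }
        Map<String, Integer> counts = new TreeMap<>();
        for (Map.Entry<String, Set<String>> e : ipsByDay.entrySet()) {
            counts.put(e.getKey(), e.getValue().size());
        }
        return counts;
    }

    public static void main(String[] args) {
        List<String[]> logs = Arrays.asList(
            new String[]{"2015-01-01", "10.0.0.1"},
            new String[]{"2015-01-01", "10.0.0.1"},   // repeat visit, counted once
            new String[]{"2015-01-01", "10.0.0.2"},
            new String[]{"2015-01-02", "10.0.0.3"});
        System.out.println(countUnique(logs)); // {2015-01-01=2, 2015-01-02=1}
    }
}
```

At cluster scale this grouping is what Hive compiles into MapReduce shuffle-and-reduce stages; the in-memory version just makes the logic visible.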

Confidential, San Mateo, CA

Hadoop Developer

Responsibilities:

  • Involved in full life-cycle of the project from Design, Analysis, logical and physical architecture modeling, development, Implementation, testing.
  • Responsible to manage data coming from different sources and involved in HDFS maintenance and loading of structured and unstructured data.
  • Developed Map Reduce programs to parse the raw data and store the refined data in tables.
  • Designed and Modified Database tables and used HBASE Queries to insert and fetch data from tables.
  • Involved in moving all log files generated from various sources to HDFS for further processing through Flume.
  • Involved in loading and transforming large sets of structured, semi structured and unstructured data from relational databases into HDFS using Sqoop imports.
  • Developed Pig Latin scripts to extract the data from the web server output files to load into HDFS.
  • Created Hive tables, loaded data, and wrote Hive queries that run internally as MapReduce jobs.
  • Used OOZIE Operational Services for batch processing and scheduling workflows dynamically.
  • Populated HDFS and Cassandra with huge amounts of data using Apache Kafka.
  • Hands on experience in application development using Java, RDBMS, and Linux shell scripting.
  • Involved in fetching brand data from social media applications such as Facebook and Twitter.
  • Created a complete processing engine based on Cloudera's distribution, tuned for performance.
  • Analyzed web log data using HiveQL to extract the number of unique visitors per day, page views, and visit duration on the website.
  • Generated reports using Tableau report designer.
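The Kafka-based population of HDFS and Cassandra mentioned above follows a producer/consumer hand-off: producers publish records to a buffered topic, and consumers drain them in batches toward the downstream stores. A minimal plain-Java analogy (using a `BlockingQueue` in place of a real Kafka topic; this is an illustration of the pattern, not the actual pipeline):

```java
import java.util.*;
import java.util.concurrent.*;

// Plain-Java analogy of a Kafka-style producer/consumer hand-off.
// Real Kafka adds partitioned, durable, distributed logs on top of this idea.
public class ProducerConsumerSketch {
    public static void main(String[] args) {
        BlockingQueue<String> topic = new ArrayBlockingQueue<>(100);

        // Producer side: publish a few log records to the buffered "topic".
        for (int i = 1; i <= 3; i++) {
            topic.offer("record-" + i);
        }

        // Consumer side: drain a batch, then hand it to the downstream
        // writer (HDFS/Cassandra in the real pipeline).
        List<String> batch = new ArrayList<>();
        topic.drainTo(batch);
        System.out.println(batch); // [record-1, record-2, record-3]
    }
}
```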

Confidential - Philadelphia, PA

Java Developer

Responsibilities:

  • Involved in each stage of the software development life cycle (SDLC) for the development and implementation of the project, and assisted in defining the technical design of the assigned modules.
  • Developed and designed the front end using HTML, CSS and JavaScript with JSF, Ajax and tag libraries
  • Developed the entire application implementing MVC architecture, integrating the Hibernate and Spring frameworks.
  • Involved in development of presentation layer using JSP and Servlets with Development tool Eclipse IDE.
  • Worked on development of Hibernate, including mapping files, configuration file and classes to interact with the database.
  • Implemented object-relational mapping in the persistence layer using the Hibernate framework in conjunction with Spring.
  • Implemented dependency injection (Inversion of Control) using the Spring Core module and its annotations to obtain bean references.
  • Used SQL for fetching and storing data in databases.
  • Designed and developed the validations, controller classes, and JavaBean components.
  • Used XML/XSL, with parsing via both SAX and DOM parsers.
  • Used web services (WSDL and SOAP) to retrieve required information from third parties; implemented web services with Apache Axis.
  • Designed and developed stored procedures and triggers in Oracle to cater to the needs of the entire application.
  • Developed complex SQL queries for extracting data from the database.
  • Used Apache Ant for the build process.

Technologies: Java, J2EE, JDK 1.5, Servlets, JSP, JavaScript, HTML, CSS, jQuery, Ajax, Hibernate, Spring, SOAP, Eclipse, SQL, DB2, XML/XSL, Apache Ant.
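The Spring dependency-injection work described above rests on one idea: a bean receives its collaborators instead of constructing them. A minimal plain-Java sketch of constructor injection (class names are hypothetical; Spring would wire these via its core container, XML configuration, or annotations such as @Autowired):

```java
// Hypothetical sketch of the dependency-injection (Inversion of Control)
// pattern, framework-free. Spring's core container performs this wiring
// automatically; the names below are illustrative only.
public class DiSketch {

    public interface OrderDao {
        String find(int id);
    }

    // A production implementation would use Hibernate/JDBC; stubbed here.
    public static class InMemoryOrderDao implements OrderDao {
        public String find(int id) { return "order-" + id; }
    }

    // The service never instantiates its DAO: the dependency is injected
    // through the constructor, so implementations can be swapped in tests.
    public static class OrderService {
        private final OrderDao dao;
        public OrderService(OrderDao dao) { this.dao = dao; }
        public String describe(int id) { return "Found " + dao.find(id); }
    }

    public static void main(String[] args) {
        OrderService service = new OrderService(new InMemoryOrderDao());
        System.out.println(service.describe(42)); // Found order-42
    }
}
```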

Confidential

Java Developer

Responsibilities:

  • Developed the user interface screens using Swing for accepting various system inputs such as contractual terms, monthly data pertaining to production, inventory and transportation.
  • Involved in designing Database Connections using JDBC.
  • Involved in design and Development of UI using HTML, JavaScript and CSS.
  • Involved in creating tables and stored procedures for data manipulation and retrieval using SQL Server 2000, and in database modifications using SQL, PL/SQL, stored procedures, triggers, and views in Oracle.
  • Developed the business components (in core Java) used for the calculation module (calculating various entitlement attributes).
  • Involved in the logical and physical database design and implemented it by creating suitable tables, views and triggers.
  • Created the related procedures and functions used by JDBC calls in the above components.
  • Involved in fixing bugs and minor enhancements for the front-end modules.

Technologies: JDK 1.3, Swing, JDBC, JavaScript, HTML, Resin, SQL Server 2000, TextPad, TOAD, MS Visual SourceSafe, Windows 2000, HP-UX.
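The core-Java calculation module mentioned above computed entitlement attributes from contractual terms and monthly production data. A hypothetical sketch of such a business component (the formula, parameters, and names are illustrative only, not the original business logic):

```java
// Hypothetical core-Java business component: compute an entitlement
// attribute as a contractual share of monthly production, capped at a
// maximum volume. Illustrative formula only.
public class EntitlementCalculator {

    public static double entitlement(double monthlyProduction,
                                     double sharePercent,
                                     double maxVolume) {
        double raw = monthlyProduction * sharePercent / 100.0;
        return Math.min(raw, maxVolume);   // cap at the contractual maximum
    }

    public static void main(String[] args) {
        System.out.println(entitlement(1000.0, 25.0, 300.0)); // 250.0
        System.out.println(entitlement(1000.0, 40.0, 300.0)); // 300.0 (capped)
    }
}
```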
