
Hadoop Developer Resume


Phoenix, AZ

SUMMARY

  • Data Engineer with 8 years of experience implementing big data projects on the Hadoop ecosystem and developing enterprise web applications on the Java/J2EE technology stack.
  • 3+ years of end-to-end big data implementation experience, with strong exposure to the major Hadoop ecosystem components: MapReduce, HDFS, Hive, Pig, HBase, ZooKeeper, Sqoop, Oozie, Cassandra, Flume, Kafka, Spark and Storm.
  • Capable of processing large sets of structured, semi-structured and unstructured data, and of supporting the surrounding application and systems architecture.
  • Experience in installing, configuring, supporting and managing Hadoop clusters using Apache and Cloudera (CDH3, CDH4) distributions, and on Amazon Web Services (AWS).
  • In-depth understanding of Hadoop architecture and its components, including HDFS, MapReduce, HDFS Federation (Hadoop Gen2), High Availability and YARN, along with a good grasp of workload management, schedulers, scalability and distributed platform architectures.
  • Familiar with data architecture, including data ingestion pipeline design, Hadoop information architecture, data modeling, data mining, machine learning and advanced data processing.
  • Strong experience with Hadoop distributions such as Cloudera and MapR.
  • Experience with the Oozie workflow engine, running workflow jobs whose actions launch Hadoop MapReduce and Pig jobs.
  • Good understanding of NoSQL databases and hands-on experience writing applications against NoSQL stores such as Cassandra and HBase.
  • Strong experience and knowledge of real-time data analytics using Storm, Kafka, Flume and Spark.
  • Experienced in moving data from different sources using Kafka producers and consumers, and in preprocessing that data with Storm topologies.
  • Extensive experience with Spark tooling, including RDD transformations, Spark MLlib and Spark SQL.
  • Experienced in handling semi-structured and unstructured data with complex MapReduce programs.
  • Wrote multiple MapReduce programs in Java for data extraction, transformation and aggregation over multiple file formats, including XML, JSON, CSV and various compressed formats.
  • Extensive experience working with structured data using HiveQL, optimizing queries and incorporating complex UDFs into business logic (a UDF sketch follows this list).
  • Experienced in migrating ETL logic to Pig Latin scripts, including transformations and join operations.
  • Developed simple to complex MapReduce and streaming jobs using Hive and Pig, and optimized MapReduce jobs to use HDFS efficiently via various compression mechanisms.
  • Experienced in working with streaming data using Flume sources and interceptors.
  • Experience importing and exporting data with Sqoop between HDFS and relational database systems.
  • Experienced with cluster-monitoring tools such as Cloudera Manager, Ambari and Ganglia.
  • Extensive experience with SOA-based architectures, building REST web services with JAX-RS and SOAP web services with JAX-WS.
  • Hands-on experience designing and coding web applications using Core Java and J2EE technologies such as Spring, Hibernate, JMS and AngularJS.
  • Good experience designing and developing database objects such as tables, stored procedures, triggers and cursors using PL/SQL.
  • Experience in developing solutions to analyze large data sets efficiently.
  • Experience with web development technologies such as HTML, CSS, JavaScript and jQuery.
  • Extensive experience building and deploying multi-module projects using Ant, Maven and CI servers such as Jenkins.
  • Extensive experience working with Unix/Linux and writing shell scripts.
  • Hands-on experience with IDEs such as Eclipse, MyEclipse and NetBeans.
  • Good working experience installing and maintaining Linux servers.
  • Good experience working with Agile (Scrum) and Waterfall methodologies.
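
A minimal sketch of the kind of Hive UDF mentioned above, using the classic org.apache.hadoop.hive.ql.exec.UDF API; the class name and normalization rule are hypothetical, not taken from this resume:

    import org.apache.hadoop.hive.ql.exec.UDF;
    import org.apache.hadoop.io.Text;

    // Hypothetical UDF: normalizes free-text region codes before aggregation.
    public class NormalizeRegion extends UDF {
        public Text evaluate(Text region) {
            if (region == null) {
                return null;                      // pass NULLs through, as Hive expects
            }
            String cleaned = region.toString().trim().toUpperCase();
            return new Text(cleaned.isEmpty() ? "UNKNOWN" : cleaned);
        }
    }

Such a UDF is packaged in a JAR and registered in Hive with ADD JAR and CREATE TEMPORARY FUNCTION before being called from HiveQL.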

TECHNICAL SKILLS

Hadoop/Big Data: HDFS, MapReduce, HBase, Pig, Hive, Sqoop, Flume, Cassandra, Oozie, Spark, Kafka, Storm.

Java & J2EE Technologies: Core Java, Servlets, JSP, Spring, Hibernate, SOAP/REST web services.

Programming Languages: C, Java, Python, Ant scripts, Linux shell scripts

Databases/NoSQL: Oracle, MySQL, DB2, MS SQL Server, HBase, Cassandra

Web/Application Servers: WebLogic, WebSphere, Apache Tomcat, JBoss

Web Technologies: HTML, XML, JavaScript, AJAX, SOAP, WSDL, AngularJS

PROFESSIONAL EXPERIENCE

Confidential, Phoenix, AZ

Hadoop Developer

Environment: Hadoop, HDFS, Pig, Hive, Flume, Sqoop, Oozie, HBase, ZooKeeper, MySQL, Shell Scripting, Red Hat Linux, Java (JDK 1.6), Eclipse

Responsibilities:

  • Worked on analyzing the Hadoop cluster and different big data analytic tools, including MapReduce, Hive and Spark.
  • Involved in loading data from the Linux file system, servers and Java web services using Kafka producers and partitions (a producer sketch follows this list).
  • Implemented custom Kafka encoders for custom input formats to load data into Kafka partitions.
  • Implemented Storm topologies to preprocess data before moving it into HDFS.
  • Implemented Kafka high-level consumers to pull data from Kafka partitions and move it into HDFS.
  • Migrated complex MapReduce programs to Spark RDD transformations and actions (also sketched after this list).
  • Implemented Spark RDD transformations to express the business analysis, applying actions on top of the transformations.
  • Developed MapReduce programs to parse raw data and store pre-aggregated data in partitioned tables.
  • Installed and configured a Hadoop cluster on Amazon Web Services (AWS) for POC purposes.
  • Loaded and transformed large sets of structured, semi-structured and unstructured data with MapReduce, Hive and Pig.
  • Implemented complex Hive UDFs to execute business logic within Hive queries.
  • Exported result sets from Hive to MySQL using the Sqoop export tool for further processing.
  • Evaluated the use of Oozie for workflow orchestration.
  • Automated all jobs, from pulling data from sources such as MySQL and pushing result data sets to HDFS, through running MapReduce, Pig and Hive jobs, using Kettle and Oozie for workflow management.
  • Worked with NoSQL databases such as HBase, creating HBase tables to load large sets of semi-structured data arriving from various sources.
  • Installed and configured Hive and wrote Hive UDFs.
  • Created partitioned tables in Hive and mentored the analyst and test teams in writing Hive queries.
  • Developed Pig Latin scripts to extract data from web server output files and load it into HDFS.
  • Involved in cluster setup, monitoring and benchmark testing.
  • Involved in building and deploying applications with Maven, integrated with the Jenkins CI server.
  • Involved in agile methodologies, daily scrum meetings and sprint planning.
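
A minimal sketch of the producer-side loading described above, using the org.apache.kafka.clients.producer API; the broker address, topic and record contents are hypothetical, and the custom encoders mentioned in the list would plug in as the key/value serializer classes:

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.Producer;
    import org.apache.kafka.clients.producer.ProducerRecord;

    public class WebLogProducer {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "broker1:9092");   // hypothetical broker
            props.put("key.serializer",
                    "org.apache.kafka.common.serialization.StringSerializer");
            props.put("value.serializer",
                    "org.apache.kafka.common.serialization.StringSerializer");

            try (Producer<String, String> producer = new KafkaProducer<>(props)) {
                // Keying by source host keeps each server's events in one partition.
                producer.send(new ProducerRecord<>("weblogs", "web01", "GET /index.html 200"));
            }
        }
    }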
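And a sketch of the kind of MapReduce-to-Spark migration listed above: a mapper/reducer aggregation re-expressed as RDD transformations, with a single action to trigger execution. The paths and field positions are hypothetical:

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaPairRDD;
    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.JavaSparkContext;
    import scala.Tuple2;

    public class TxnAggregate {
        public static void main(String[] args) {
            JavaSparkContext sc =
                    new JavaSparkContext(new SparkConf().setAppName("txn-aggregate"));

            // Transformations replace the old map/reduce pair; nothing runs
            // until the saveAsTextFile action below.
            JavaRDD<String> lines = sc.textFile("hdfs:///data/txns");   // hypothetical path
            JavaPairRDD<String, Long> totals = lines
                    .mapToPair(line -> new Tuple2<>(line.split(",")[0], 1L))
                    .reduceByKey(Long::sum);
            totals.saveAsTextFile("hdfs:///data/txn-counts");
            sc.stop();
        }
    }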

Confidential, Minneapolis, MN

Hadoop Developer

Environment: Hadoop, HDFS, Pig, Hive, Flume, Sqoop, Oozie, HBase.

Responsibilities:

  • Transferred drug purchase transaction details from legacy systems to HDFS.
  • Installed and configured Flume, Hive, Pig, Sqoop and Oozie on the Hadoop cluster.
  • Wrote MapReduce jobs to parse the web logs stored in HDFS.
  • Developed services to run MapReduce jobs on an as-needed basis.
  • Imported and exported data between HDFS and an Oracle 10.2 database using Sqoop.
  • Extracted files from HBase through Sqoop and placed them in HDFS for processing.
  • Responsible for managing data coming from different sources.
  • Analyzed the data with Pig to extract the number of unique patients per day and the most-purchased medicines.
  • Implemented workflows using the Apache Oozie framework to automate tasks.
  • Wrote UDFs for Hive and Pig that helped spot market trends (a Pig UDF sketch follows this list).
  • Ran Hadoop streaming jobs to process terabytes of XML-format data.
  • Identified target patients.
  • Analyzed the functional specifications.
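
A minimal sketch of the kind of Pig UDF mentioned above, using the standard org.apache.pig.EvalFunc API; the class name and bucketing rule are hypothetical:

    import java.io.IOException;
    import org.apache.pig.EvalFunc;
    import org.apache.pig.data.Tuple;

    // Hypothetical UDF: buckets a purchase timestamp ("HH:mm:ss") into a
    // day part so purchase trends can be grouped by time of day.
    public class DayPart extends EvalFunc<String> {
        @Override
        public String exec(Tuple input) throws IOException {
            if (input == null || input.size() == 0 || input.get(0) == null) {
                return null;
            }
            int hour = Integer.parseInt(((String) input.get(0)).substring(0, 2));
            if (hour < 6)  return "NIGHT";
            if (hour < 12) return "MORNING";
            if (hour < 18) return "AFTERNOON";
            return "EVENING";
        }
    }

In Pig Latin the JAR would be loaded with REGISTER and the function aliased with DEFINE before use in a FOREACH.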

Confidential, ST. Louis, MO

JAVA Developer

Environment: Java, JDBC, Servlets, JSP, XML, Design Patterns, CSS, HTML, JavaScript 1.2, Apache Tomcat, MS SQL Server 2008.

Responsibilities:

  • Extensively used Core Java, Servlets, JSP and XML.
  • Responsible for writing functional and technical documents for the modules developed.
  • Used Struts 1.2 in the presentation tier.
  • Generated the Hibernate XML and Java mappings for the schemas.
  • Used a DB2 database to store the system data.
  • Actively involved in system testing.
  • Involved in fixing bugs and unit testing with JUnit test cases (a test sketch follows this list).
  • Wrote complex SQL queries and stored procedures.
  • Developed user interfaces using JSP and the JSF framework with AJAX, JavaScript, HTML, DHTML and CSS.
  • Used asynchronous JavaScript for a better, faster interactive front end.
  • Used IBM WebSphere as the application server.
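
A minimal sketch of a JUnit 4 test case in the style described above; the helper being tested is hypothetical and is included inline so the example is self-contained:

    import static org.junit.Assert.assertEquals;
    import org.junit.Test;

    public class AccountFormatterTest {

        // Hypothetical helper: masks all but the last four characters
        // of an account number, preserving the dashes.
        static String mask(String acct) {
            StringBuilder sb = new StringBuilder();
            for (int i = 0; i < acct.length(); i++) {
                char c = acct.charAt(i);
                sb.append(c == '-' || i >= acct.length() - 4 ? c : '*');
            }
            return sb.toString();
        }

        @Test
        public void masksAllButLastFourDigits() {
            assertEquals("****-****-****-1234", mask("4111-1111-1111-1234"));
        }
    }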

Confidential, New York City, NY

JAVA Developer

Environment: Spring MVC, Oracle 11g, J2EE, Java, JDBC, Servlets, JSP, XML, Design Patterns, CSS, HTML, JavaScript 1.2, JUnit, Apache Tomcat, MS SQL Server 2008.

Responsibilities:

  • Created design documents and reviewed them with the team, in addition to assisting the business analyst/project manager with explanations to the line of business.
  • Created use case diagrams, sequence diagrams, functional specifications and user interface diagrams.
  • Responsible for understanding the scope of the project and for requirements gathering.
  • Involved in the analysis, design, construction and testing of the application.
  • Developed the web tier using JSP to show account details and summaries.
  • Assisted in the design and development of the Avon m-commerce application from scratch using HTTP, XML, Java, Oracle objects, Toad and Eclipse.
  • Used the Tomcat web server for development purposes.
  • Involved in creating test cases for JUnit testing.
  • Used Oracle as the database and Toad for query execution; also involved in writing SQL scripts and PL/SQL code for procedures and functions.
  • Developed user interfaces using JSP, HTML, XML and JavaScript.
  • Created stored procedures and functions (a JDBC call sketch follows this list).
  • Actively involved in code reviews and bug fixing to improve performance.
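
A minimal sketch of how such a stored procedure might be invoked from the Java side over JDBC; the procedure name, signature and connection details are all hypothetical:

    import java.math.BigDecimal;
    import java.sql.CallableStatement;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Types;

    public class AccountSummaryDao {
        // Calls a hypothetical PL/SQL procedure GET_ACCOUNT_SUMMARY(IN id, OUT balance).
        public static BigDecimal fetchBalance(long accountId) throws Exception {
            try (Connection con = DriverManager.getConnection(
                     "jdbc:oracle:thin:@//dbhost:1521/ORCL", "app", "secret");
                 CallableStatement cs = con.prepareCall("{call GET_ACCOUNT_SUMMARY(?, ?)}")) {
                cs.setLong(1, accountId);
                cs.registerOutParameter(2, Types.NUMERIC);
                cs.execute();
                return cs.getBigDecimal(2);
            }
        }
    }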

Confidential

JAVA Developer

Environment: J2EE (JSPs, Servlets, EJB), HTML, Struts, DB2.

Responsibilities:

  • Developed EJBs (session beans and entity beans) on WebSphere Studio Application Developer.
  • Used design patterns such as MVC, EJB Session Facade and controller servlets while implementing the framework.
  • Built the front end using JSPs, jQuery, JavaScript and HTML.
  • Designed and developed user interfaces using JSP, JavaScript and HTML.
  • Built custom tags for JSPs (a tag handler sketch follows this list).
  • Built the reporting module on reports produced with Crystal Reports.
  • Integrated data from multiple data sources.
  • Generated schema difference reports for the database using Toad.
  • Built prototypes for internationalization.
  • Wrote stored procedures in DB2.
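
A minimal sketch of a custom JSP tag handler like those mentioned above, using the standard javax.servlet.jsp.tagext API; the tag's purpose and attribute are hypothetical:

    import java.io.IOException;
    import javax.servlet.jsp.JspException;
    import javax.servlet.jsp.tagext.TagSupport;

    // Hypothetical tag: writes the current date into the page in a pattern
    // supplied as a tag attribute, e.g. <rpt:reportDate pattern="yyyy-MM-dd"/>.
    public class ReportDateTag extends TagSupport {
        private String pattern = "yyyy-MM-dd";

        public void setPattern(String pattern) { this.pattern = pattern; }

        @Override
        public int doStartTag() throws JspException {
            try {
                pageContext.getOut().print(
                        new java.text.SimpleDateFormat(pattern).format(new java.util.Date()));
            } catch (IOException e) {
                throw new JspException(e);
            }
            return SKIP_BODY;
        }
    }

The tag would be declared in a TLD file and bound to a prefix with a taglib directive in the JSP.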

Confidential

Junior JAVA Developer

Environment: Java 1.3, Servlets, JSPs, JavaMail API, JavaScript, HTML, Spring Batch XML processing, MySQL 2.1, Swing, Java Web Server 2.0, JBoss 2.0, RMI, Rational Rose, Red Hat Linux 7.1.

Responsibilities:

  • Created use case diagrams, sequence diagrams, functional specifications and user interface diagrams using StarUML.
  • Involved in the complete requirements analysis, design, coding and testing phases of the project.
  • Participated in JAD meetings to gather requirements and understand the end users' system.
  • Developed the front-end user interface using J2EE, Servlets, JDBC, HTML, DHTML, CSS, XML, XSL, XSLT and JavaScript per the use case specifications.
  • Generated XML schemas and used XMLBeans to parse XML files.
  • Created stored procedures and functions; used JDBC to process database calls to DB2/AS400 and SQL Server databases.
  • Developed code to create XML files and flat files from data retrieved from databases and XML files (sketched after this list).
  • Responsible for project documentation, status reporting and presentations.
  • Developed a web application called iHUB (integration hub) to initiate all the interface processes, using the Struts framework, JSP and HTML.
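
A minimal sketch of writing retrieved data out as an XML file, using plain JAXP rather than the XMLBeans bindings the project itself used; the element names and row values are hypothetical:

    import java.io.File;
    import javax.xml.parsers.DocumentBuilderFactory;
    import javax.xml.transform.TransformerFactory;
    import javax.xml.transform.dom.DOMSource;
    import javax.xml.transform.stream.StreamResult;
    import org.w3c.dom.Document;
    import org.w3c.dom.Element;

    public class InterfaceFileWriter {
        public static void main(String[] args) throws Exception {
            Document doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder().newDocument();
            Element root = doc.createElement("orders");
            doc.appendChild(root);

            // In the real flow, one element would be emitted per row fetched over JDBC.
            Element order = doc.createElement("order");
            order.setAttribute("id", "1001");
            order.setTextContent("sample payload");
            root.appendChild(order);

            TransformerFactory.newInstance().newTransformer()
                    .transform(new DOMSource(doc), new StreamResult(new File("orders.xml")));
        }
    }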
