Big Data And Hadoop Developer Resume


Atlanta, GA

SUMMARY:

  • Around 16 months of development experience with Hadoop, MySQL and Oracle, covering Big Data ecosystem design, development and administration.
  • Hands-on experience in Big Data with a solid understanding of Hadoop architecture and its components, such as Spark SQL, HDFS, Pig, Hive, Sqoop, Flume, YARN, Kafka and Cassandra.
  • Experience in loading structured, semi-structured and unstructured data into Hadoop from sources such as CSV and XML files, Teradata, MS SQL Server and Oracle.
  • Experience in importing data in various formats from RDBMS sources into HDFS and HBase, and exporting it back.
  • Experience in writing Scala programs.
  • Expertise in working with distributed and global project teams.
  • Experience with various Hadoop distributions, such as Cloudera and Hortonworks.
  • Good exposure to YARN environments with Spark and Kafka, and to file formats such as Avro, JSON, XML and SequenceFiles.
  • Experience in writing workflows and scheduling jobs using Oozie.
  • Involved in project planning, setting up standards for implementation and design of Hadoop based applications.
  • Experience working independently and end to end on projects.
  • Proficiency in creating business and technical project documentation.

TECHNICAL SKILLS:

Hadoop/Big Data: Apache Spark, HDFS, Hive, Pig, Flume, Sqoop

NoSQL Databases: HBase, Cassandra.

Languages: C, C++, Java, Pig Latin, HiveQL, Unix shell scripts, Scala.

ETL: Oracle

Operating Systems: Sun Solaris, UNIX, Red Hat Linux, Ubuntu Linux and Windows XP/Vista/7/8

Web Technologies: HTML, DHTML, XML

Web/Application servers: Apache Tomcat, WebLogic, JBoss

Databases: Oracle, SQL Server, MySQL.

Tools and IDEs: Eclipse, NetBeans, Maven, SBT, JDeveloper, DbVisualizer, SQL Developer.

Version control: SVN, Git, Bitbucket

PROFESSIONAL EXPERIENCE:

Confidential, Atlanta, GA

Big Data and Hadoop Developer

Roles & Responsibilities:

  • Contributed to the architecture and initial framework of the EDS Data Lake project.
  • Worked on data integration and ingestion into Hadoop from Oracle, SQL Server and EDW source systems.
  • Participated in an Agile development lifecycle, using Git and Jenkins for CI/CD.
  • Set up key project components such as Kerberos authentication renewals, a password encryption mechanism in Hadoop, and environment profiles to ease code deployments to higher environments.
  • Worked on data modeling and design of Hive and HBase table structures based on the project's reporting and analytics needs.
  • Developed shell scripts and Spark SQL jobs to handle large volumes of ETL workloads.
  • Worked on development and implementation of incremental (CDC) data loads from source systems into Hadoop using Apache Spark SQL (a sketch follows this list).
  • Worked with Sqoop, Flume and Pig for data integration, importing data from source systems into the Hadoop data lake.
  • Worked extensively with Hive and HBase for data validation and analysis.
  • Designed Oozie workflows and coordinators to enable scheduling and automation of ETL jobs.
  • Worked on an AppOps support project, helping the team with production support activities such as job monitoring, code deployments and creation of run books.
  • Worked on projects involving both on-prem and cloud data integration.
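
Below is a minimal Java sketch of the incremental (CDC) load pattern referenced above, using the Spark SQL API with Hive support. The table name lake.orders, the staging path /data/stage/orders and the last_modified watermark column are hypothetical placeholders chosen to illustrate the pattern, not taken from the actual project.

  import org.apache.spark.sql.Dataset;
  import org.apache.spark.sql.Row;
  import org.apache.spark.sql.SparkSession;
  import static org.apache.spark.sql.functions.col;
  import static org.apache.spark.sql.functions.lit;

  public class IncrementalOrdersLoad {
      public static void main(String[] args) {
          SparkSession spark = SparkSession.builder()
                  .appName("cdc-incremental-load")
                  .enableHiveSupport()
                  .getOrCreate();

          // High-water mark: latest change timestamp already present in the lake table
          Row hwmRow = spark.sql("SELECT max(last_modified) FROM lake.orders").first();
          String hwm = hwmRow.isNullAt(0) ? "1970-01-01 00:00:00" : hwmRow.get(0).toString();

          // Keep only the staged source rows that changed since the last load
          Dataset<Row> delta = spark.read().parquet("/data/stage/orders")
                  .where(col("last_modified").gt(lit(hwm)));

          // Append the delta into the Hive-backed lake table
          delta.write().mode("append").insertInto("lake.orders");
          spark.stop();
      }
  }

The high-water-mark comparison is the simplest CDC variant; handling in-place updates and deletes would require merge or deduplication logic on top of the append.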

Environment: Apache Hadoop, Pig, Hive, Sqoop, Spark, Spark SQL, Kafka, MapReduce, HDFS, Linux, Oozie.

Confidential

Software Engineer (Internship)

Roles & Responsibilities:

  • Developed web components using JSP, Servlets and JDBC
  • Designed tables and indexes
  • Designed, implemented, tested and deployed Enterprise JavaBeans (both session and entity beans) using WebLogic as the application server
  • Developed stored procedures, packages and database triggers to enforce data integrity; performed data analysis and created Crystal Reports per user requirements
  • Implemented the presentation layer with HTML, XHTML and JavaScript
  • Used EJBs to develop business logic and coded reusable components in Java Beans
  • Developed database interaction code using the JDBC API, making extensive use of SQL query statements and advanced PreparedStatements (see the sketch after this list)
  • Used connection pooling through the JDBC interface for optimization
  • Used EJB entity and session beans to implement business logic, session handling and transactions; developed user interfaces using JSP, Servlets and JavaScript
  • Wrote complex SQL queries and stored procedures
  • Actively involved in the system testing
  • Prepared the installation, customer and configuration guides that were delivered to the customer along with the product
  • Responsible for creating a working model in HTML and JavaScript to understand the flow of the web application, and created class diagrams
  • Participated in daily stand-up Scrum meetings as part of the Agile process to report day-to-day development progress
  • Designed and developed user interfaces using HTML and JSP
  • Used J2EE to develop the application based on MVC architecture
  • Created interactive front-end GUIs using JavaScript, jQuery, DHTML and Ajax
  • Used SAX and DOM XML parsers for data retrieval
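
A minimal Java sketch of the JDBC PreparedStatement usage described above; the connection string, credentials and the customers table are hypothetical placeholders. Production code drew connections from a pool rather than from DriverManager directly.

  import java.sql.Connection;
  import java.sql.DriverManager;
  import java.sql.PreparedStatement;
  import java.sql.ResultSet;

  public class CustomerQuery {
      public static void main(String[] args) throws Exception {
          // Hypothetical connection details, shown inline for brevity
          try (Connection con = DriverManager.getConnection(
                  "jdbc:oracle:thin:@//dbhost:1521/ORCL", "app_user", "secret");
               PreparedStatement ps = con.prepareStatement(
                  "SELECT id, name FROM customers WHERE region = ?")) {
              ps.setString(1, "SOUTH"); // bind variable instead of string concatenation
              try (ResultSet rs = ps.executeQuery()) {
                  while (rs.next()) {
                      System.out.println(rs.getInt("id") + " " + rs.getString("name"));
                  }
              }
          }
      }
  }

Binding parameters this way lets the database reuse the compiled statement across calls and avoids SQL injection, which is the main reason to prefer PreparedStatement over plain Statement.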

Environment: Windows NT/2000/2003/XP/7/8, C, Java, JSP, Servlets, JDBC, EJB, XML, DOM, SAX
