
Big Data Developer Resume


PROFESSIONAL SUMMARY:

  • 6 years of professional IT experience, including 3+ years in the Big Data ecosystem.
  • Excellent understanding of Hadoop architecture and its components, including HDFS, Job Tracker, Task Tracker, Name Node, Data Node, and the MapReduce programming paradigm.
  • Hands-on experience installing, configuring, and using Hadoop ecosystem components such as MapReduce, HDFS, HBase, Oozie, Hive, Sqoop, ZooKeeper, Pig, and Flume.
  • Experience in managing and reviewing Hadoop log files.
  • Hands-on experience designing and creating Hive tables in a shared metastore with partitioning and bucketing (see the first sketch after this list).
  • Hands-on experience using Pig for analysis of large unstructured data sets.
  • Experience analyzing data using HiveQL, Pig Latin, and HBase.
  • Involved in project planning and in setting standards for the implementation and design of Hadoop-based applications.
  • Implemented Oozie for writing workflows and scheduling jobs.
  • Familiar with the automation tool AutoSys.
  • Wrote Hive queries for data analysis and to prepare data for visualization.
  • Hands-on experience with shell scripting.
  • Extended Hive and Pig core functionality by writing custom UDFs (see the second sketch after this list).
  • Experience in data management and implementation of Big Data applications using Hadoop frameworks.
  • Experience in designing, developing and implementing connectivity products that allow efficient exchange of data between our core database engine and the Hadoop ecosystem.
  • Experience in importing and exporting data using Sqoop from HDFS to Relational Database Systems and vice-versa.
  • Expert-level knowledge of the ETL tools Informatica PowerCenter and Talend.
  • Familiar with the data visualization tool Tableau.
  • Excellent technical skills, with a record of consistently delivering ahead of schedule; strong interpersonal and communication skills.
  • Experience with Scrum methodology.
  • Ability to adapt to evolving technology, strong sense of responsibility and accomplishment.
  • Versatile team player with excellent analytical, presentation, communication, and interpersonal skills.
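
As a concrete illustration of the partitioned and bucketed Hive tables mentioned above, the HiveQL sketch below uses hypothetical table and column names (flight_events, tail_number, and so on); it shows one common way such a table might be defined, not an excerpt from any project listed here.

    -- Hypothetical names; illustrates partitioning by date and bucketing by key.
    CREATE TABLE IF NOT EXISTS flight_events (
      event_id    BIGINT,
      tail_number STRING,
      event_type  STRING,
      event_ts    TIMESTAMP
    )
    PARTITIONED BY (event_date STRING)
    CLUSTERED BY (tail_number) INTO 32 BUCKETS
    STORED AS ORC;

    -- Older Hive releases also require: SET hive.enforce.bucketing = true;
    INSERT OVERWRITE TABLE flight_events PARTITION (event_date = '2016-01-01')
    SELECT event_id, tail_number, event_type, event_ts
    FROM   staging_flight_events
    WHERE  to_date(event_ts) = '2016-01-01';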
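For the custom Hive UDFs noted above, the second sketch shows only how a UDF is registered and used from HiveQL; the jar path, function name, and class name are placeholders, and the UDF implementation itself (written in Java) is not shown.

    -- Placeholder jar path, function name, and class name.
    ADD JAR /opt/hive/udfs/custom-udfs.jar;
    CREATE TEMPORARY FUNCTION normalize_code AS 'com.example.hive.udf.NormalizeCode';

    SELECT normalize_code(event_type) AS event_type, count(*) AS events
    FROM   flight_events
    GROUP BY normalize_code(event_type);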

TECHNICAL SKILLS:

Hadoop Ecosystem: Hadoop, MapReduce, HDFS, ZooKeeper, Spark, Flume, Sqoop, Hive, Pig, Oozie, Apache Kafka, HBase, HCatalog

SQL Databases: MySQL, Microsoft SQL Server 2000, Spark SQL, Impala

NoSQL: HBase, MongoDB, Cassandra

Languages: Java, Python, Scala (basic), SQL, PL/SQL, Pig Latin, HiveQL

Source Control Tools: Microsoft Visual SourceSafe 6.0

Operating Systems: Windows 2000/XP/2003/Vista/7, Unix, Linux

Domain: Aviation, Manufacturing & Entertainment

PROFESSIONAL EXPERIENCE:

Confidential

Big Data Developer

Responsibilities:

  • Translated complex functional and technical requirements into detailed designs.
  • Designed a star schema using HiveQL (see the sketch after this list).
  • Loaded data into Hadoop using a Hive connector.
  • Built and set up new Hive data tables.
  • Wrote complex HiveQL queries to automate Hive data population.
  • Automated data flow jobs using Control-M.
  • Loaded data from MySQL into HDFS using Sqoop.
  • Loaded data from disparate data sets and supported high-speed querying.
  • Wrote Hive queries for data analysis and to prepare data for visualization.
  • Participated in Scrum and Sprint meetings.
  • Developed data visualization dashboards using Tableau.
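
A minimal HiveQL sketch of the kind of star schema and population query described above; every table and column name here is assumed for illustration (the actual schema is confidential), and the staging table is presumed to have been loaded from MySQL via Sqoop as noted in the bullets.

    -- Illustrative dimension and fact tables for a star schema.
    CREATE TABLE IF NOT EXISTS dim_aircraft (
      aircraft_key BIGINT,
      tail_number  STRING,
      model        STRING
    )
    STORED AS ORC;

    CREATE TABLE IF NOT EXISTS fact_flight_delay (
      aircraft_key   BIGINT,
      origin_airport STRING,
      dest_airport   STRING,
      delay_minutes  INT
    )
    PARTITIONED BY (flight_date STRING)
    STORED AS ORC;

    -- Dynamic-partition insert from a Sqoop-loaded staging table.
    SET hive.exec.dynamic.partition = true;
    SET hive.exec.dynamic.partition.mode = nonstrict;

    INSERT OVERWRITE TABLE fact_flight_delay PARTITION (flight_date)
    SELECT d.aircraft_key, s.origin_airport, s.dest_airport,
           s.delay_minutes, s.flight_date
    FROM   staging_flights s
    JOIN   dim_aircraft d ON d.tail_number = s.tail_number;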

Confidential

Big Data/ Hadoop Developer

Responsibilities:

  • Translated complex functional and technical requirements into detailed designs.
  • Designed, built, installed, configured, and supported Hadoop.
  • Loaded data from disparate data sets and supported high-speed querying.
  • Managed and deployed HBase.
  • Pre-processed data using Hive and Pig.
  • Analyzed data using HiveQL, Pig Latin, HBase, and custom MapReduce programs written in Java.
  • Wrote Hive queries for data analysis and to prepare data for visualization (see the sample aggregation after this list).
  • Implemented Oozie for writing workflows and scheduling jobs.
  • Participated in a POC effort to help build new Hadoop clusters.
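
The sample below sketches the kind of HiveQL aggregation used to prepare data for visualization in this role; the table and columns (sensor_readings, plant_id, sensor_value) are assumptions, not the actual confidential data model.

    -- Daily aggregates per plant, suitable for feeding a dashboard.
    SELECT plant_id,
           to_date(reading_ts)                   AS reading_date,
           avg(sensor_value)                     AS avg_value,
           percentile_approx(sensor_value, 0.95) AS p95_value
    FROM   sensor_readings
    WHERE  sensor_value IS NOT NULL
    GROUP BY plant_id, to_date(reading_ts);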

Confidential 

SQL & Essbase Developer - Analyst Programmer

Responsibilities:

  • Involved in gathering business requirements, defining, designing, developing, testing and deploying the application.
  • Loaded ERP data into the Essbase cube from the data mart using data load rules.
  • Involved in Essbase cube building, loading data into the cube, and writing report scripts.
  • Designed and developed load rules using the Prep Editor.
  • Handled backup and recovery for the Essbase database.
  • Involved in designing and developing corporate financial charts (look and feel) and other ad hoc management financial reports using Hyperion Analyzer.
  • Created ad hoc financial reports that let end users slice and dice the data in Hyperion Analyzer.
  • Involved in tuning the applications for better performance by modifying server-wide, application, and database settings.
  • Involved in designing server and application/database security mechanisms, including developing filters.
  • Developed MS SQL procedures and triggers per design specifications (see the sketch after this list).
  • Provided technical and functional guidance to less experienced team members.
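
A minimal T-SQL sketch of the kind of stored procedure and trigger work described above, written against SQL Server; all object names (Account, AccountAudit, and so on) are placeholders rather than the actual confidential schema.

    -- Placeholder lookup procedure.
    CREATE PROCEDURE dbo.usp_GetAccountBalance
        @AccountId INT
    AS
    BEGIN
        SELECT AccountId, Balance
        FROM   dbo.Account
        WHERE  AccountId = @AccountId;
    END
    GO

    -- Audit trigger recording balance changes on update.
    CREATE TRIGGER dbo.trg_Account_Audit
    ON dbo.Account
    AFTER UPDATE
    AS
    BEGIN
        INSERT INTO dbo.AccountAudit (AccountId, OldBalance, NewBalance, ChangedAt)
        SELECT d.AccountId, d.Balance, i.Balance, GETDATE()
        FROM   deleted d
        JOIN   inserted i ON i.AccountId = d.AccountId;
    END
    GO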

Confidential 

SQL Developer & Programmer

Responsibilities:

  • Analyzed and studied the requirements of the module.
  • Developed MS SQL procedures and triggers.
  • Performed unit and integration testing for the associated module.

Environment: HTML, MS SQL Server 2000
