Sr Hadoop Developer Resume

San Leandro, California

SUMMARY:

  • Over 8 years of experience across all phases of the Software Development Life Cycle (SDLC), including analysis, design, development and testing of client-server and web-based n-tier applications using Big Data and .NET technologies.
  • 4 years of relevant experience with the Hadoop ecosystem and architecture (HDFS, Apache Spark, MapReduce, YARN, Zookeeper, Pig, Hive, HBase, Sqoop, Flume, Oozie).
  • Experience in analyzing data using HiveQL, Pig Latin, HBase and custom MapReduce programs in Python.
  • Wrote MapReduce programs with custom logic and developed custom UDFs in Pig and Hive based on user requirements.
  • Implemented Oozie for workflow and job scheduling. Wrote Hive queries for data analysis and to prepare data for visualization.
  • Experience in importing and exporting data in different formats between HDFS/HBase and various RDBMS databases.
  • Developed applications using Spark for data processing.
  • Replaced existing MapReduce jobs and Hive scripts with Spark DataFrame transformations and actions for faster data analysis (see the sketch after this summary).
  • Experience in different Hadoop distributions like Cloudera (CDH) and Hortonworks (HDP).
  • Developed web pages using C#, ASP.NET, ASP.NET MVC and Visual Studio.
  • Hands-on experience in designing and developing interactive, responsive and dynamic user interfaces using Angular, jQuery, HTML, CSS, JavaScript, XML and AJAX.
  • Hands-on experience in design using object-oriented programming principles such as encapsulation, inheritance, polymorphism and reusability, and design patterns such as Abstract Factory and Singleton as standard solutions to common problems in software design.
  • Expertise in building Web Services/WCF/Web API and consuming/publishing web services.
  • Designed and implemented applications using ADO.NET objects such as DataSet, DataTable and DataAdapter for retrieving, manipulating, storing and displaying data from SQL Server.
  • Hands-on experience with SQL Server 2005, 2008 and 2012.
  • Sound knowledge of writing T-SQL queries, stored procedures, triggers, views and user-defined functions, along with Data Transformation Services (DTS) packages, database performance tuning, indexing, and database backup and restore.
  • Experience using Team Foundation Server 2008/2010, Microsoft Visual SourceSafe and SVN for version control.
  • Strong application design and development skills.
  • Strong debugging and problem-solving skills.
  • Experienced in analysis, design, preparation of High-Level Design (HLD) documents, coding and Acceptance Test Plans (ATP).
  • Team player and self-motivated, hard-working professional with good organizational, leadership, interpersonal and communication skills.
  • Quick learner who adapts quickly to new situations and changes and works proactively toward meeting deadlines; good communication and listening skills.
  • Creates an enthusiastic working environment.
  • Works to self-defined deadlines.
  • Takes ownership of tasks with the utmost dedication.
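
Illustrative sketch (not project code): a minimal PySpark example of replacing a MapReduce/Hive aggregation with Spark DataFrame transformations and actions, as referenced above. Table, column and path names (transactions, customer_id, /data/output/daily_totals) are hypothetical placeholders.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    # Spark session with Hive support so existing Hive tables can be read directly.
    spark = (SparkSession.builder
             .appName("hive-aggregation-on-spark")
             .enableHiveSupport()
             .getOrCreate())

    # Read the source Hive table as a DataFrame (table name is a placeholder).
    transactions = spark.table("default.transactions")

    # DataFrame transformations replacing the equivalent Hive GROUP BY query.
    daily_totals = (transactions
                    .filter(F.col("amount") > 0)
                    .groupBy("customer_id", "txn_date")
                    .agg(F.sum("amount").alias("total_amount"),
                         F.count("*").alias("txn_count")))

    # Action: write the aggregated result back to HDFS as Parquet for downstream analysis.
    daily_totals.write.mode("overwrite").parquet("/data/output/daily_totals")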

TECHNICAL SKILLS:

Hadoop Ecosystem: Hadoop, HDFS, MapReduce, Hive, Pig, Spark Core, Spark SQL, Spark Streaming, Impala, Kafka, YARN, Oozie, Zookeeper, Sqoop

Languages: C#

Databases: MS SQL Server

.NET Technologies: ASP.NET, ADO.NET, ASP.NET MVC, LINQ, Web Forms, Win Forms

Scripting languages: Python, JavaScript, AngularJS, AJAX, jQuery, JSON

Markup languages: XML, HTML, XSL, XSLT, CSS

Web services: WCF, RESTful web services

Methodology: AGILE, SCRUM, Waterfall

Operating Systems: Windows 95/98/XP/Vista/7, Unix and Linux

Software: .NET Framework, IIS 6/7, SSRS, SSIS, Visual SourceSafe, TFS, SVN, Crystal Reports XI

PROFESSIONAL EXPERIENCE:

Confidential, San Leandro, California

Sr Hadoop Developer

Responsibilities:

  • Worked extensively with Sqoop to import and export data between HDFS and relational database systems.
  • Developed Pig scripts to establish relationships between the data sets present in the Hadoop cluster.
  • Experience in analyzing data using Hive, HBase and custom MapReduce programs.
  • Worked with the Oozie workflow engine to run workflow jobs with actions that execute Hadoop MapReduce and Pig jobs.
  • Created Hive tables and partitions and loaded the data for analysis using HiveQL queries.
  • Worked with sequence files, map-side joins, bucketing, and static and dynamic partitioning for Hive performance enhancement and storage improvement.
  • Good understanding of ETL tools and how they can be applied in a Big Data environment.
  • Worked on Spark SQL and Spark Streaming.
  • Good knowledge of writing Spark applications using Python and Scala.
  • Implemented Spark SQL to access Hive tables from Spark for faster data processing.
  • Worked on Spark Streaming with Apache Kafka for real-time data processing (see the sketch after this list).
  • Experience in creating Kafka producers and consumers for Spark Streaming.
  • Used Hive for transformations, joins, filtering and some pre-aggregations after storing the data in HDFS.
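
Illustrative sketch (not project code): a minimal example of the Spark SQL and Spark Streaming usage described above, assuming Spark Structured Streaming with the Kafka source (requires the spark-sql-kafka package on the classpath). Broker address, topic, table and output paths are hypothetical placeholders.

    from pyspark.sql import SparkSession

    # Spark session with Hive support so Spark SQL can query existing Hive tables.
    spark = (SparkSession.builder
             .appName("kafka-spark-streaming")
             .enableHiveSupport()
             .getOrCreate())

    # Spark SQL against a Hive table (table and column names are placeholders).
    accounts = spark.sql("SELECT account_id, region FROM default.accounts")

    # Consume a Kafka topic as a streaming DataFrame (broker and topic are placeholders).
    events = (spark.readStream
              .format("kafka")
              .option("kafka.bootstrap.servers", "broker1:9092")
              .option("subscribe", "events")
              .load()
              .selectExpr("CAST(key AS STRING) AS account_id",
                          "CAST(value AS STRING) AS payload"))

    # Enrich the stream with the Hive reference data and continuously write to HDFS.
    query = (events.join(accounts, on="account_id", how="left")
             .writeStream
             .format("parquet")
             .option("path", "/data/streams/enriched")
             .option("checkpointLocation", "/data/streams/_checkpoints")
             .outputMode("append")
             .start())

    query.awaitTermination()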

Technologies: Hadoop, Big Data, HDFS, MapReduce, Sqoop, Oozie, Spark, Scala, Pig, Hive, Impala, HBase, Flume, Storm, Kafka, Python, CDH, SQL.

Confidential, Burbank, CA

Hadoop Developer

Responsibilities:

  • Wrote transformer/mapper MapReduce pipelines using Python and the Hadoop Streaming API (see the sketch after this list).
  • Involved in creating Hive tables, loading data and writing Hive queries, which invoke and run MapReduce jobs in the backend.
  • Involved in loading data into HBase using HBase Shell, HBase Client API, Pig and Sqoop.
  • Designed and implemented Incremental Imports into Hive tables.
  • Involved in collecting, aggregating and moving data from servers to HDFS using Apache Flume.
  • Wrote Hive jobs to parse the logs and structure them in tabular format to facilitate effective querying of the log data.
  • Migrated ETL jobs to Pig scripts that perform transformations, event joins and some pre-aggregations before storing the data in HDFS.
  • Implemented workflows using the Apache Oozie framework to automate tasks.
  • Loaded data from various data sources into HDFS using Kafka.
  • Created Sqoop jobs and Pig and Hive scripts for data ingestion from relational databases to compare with historical data.
  • Utilized Storm for processing large volumes of data.
  • Used Pig as an ETL tool for transformations, event joins, filtering and some pre-aggregations.
  • Worked extensively with the Cloudera Distribution of Hadoop (CDH) and the Hortonworks Data Platform (HDP).
  • Involved in story-driven agile development methodology and actively participated in daily scrum meetings.
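
Illustrative sketch (not project code): a minimal Hadoop Streaming mapper and reducer in Python that count log lines per HTTP status code, along the lines of the Streaming API work mentioned above. The log layout (space-separated, status code in the ninth field) and all paths are hypothetical assumptions.

    #!/usr/bin/env python
    # Minimal Hadoop Streaming job: count access-log lines per HTTP status code.
    # Example invocation (jar location and HDFS paths are placeholders):
    #   hadoop jar hadoop-streaming.jar \
    #       -input /data/logs/raw -output /data/logs/status_counts \
    #       -mapper "python streaming_job.py map" \
    #       -reducer "python streaming_job.py reduce" \
    #       -file streaming_job.py
    import sys

    def mapper():
        # Emit (status_code, 1) for each log line; the field layout is an assumption.
        for line in sys.stdin:
            fields = line.split()
            if len(fields) > 8:
                print("%s\t1" % fields[8])

    def reducer():
        # Hadoop Streaming sorts mapper output by key, so identical keys arrive together.
        current_key, count = None, 0
        for line in sys.stdin:
            key, value = line.rstrip("\n").split("\t", 1)
            if key != current_key:
                if current_key is not None:
                    print("%s\t%d" % (current_key, count))
                current_key, count = key, 0
            count += int(value)
        if current_key is not None:
            print("%s\t%d" % (current_key, count))

    if __name__ == "__main__":
        mapper() if sys.argv[1] == "map" else reducer()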

Technologies: Hadoop, Big Data, HDFS, MapReduce, Sqoop, Oozie, Pig, Hive, HBase, Flume, Storm, Kafka, Python, CDH, HDP, SQL.

Confidential

Hadoop Developer

Responsibilities:

  • Gathered the business requirements from the Business Improvement Team and Subject Matter Experts.
  • Responsible for loading unstructured data into the Hadoop Distributed File System (HDFS).
  • Involved in managing and reviewing Hadoop log files.
  • Imported and exported data into HDFS and Hive using Sqoop.
  • Supported MapReduce programs running on the cluster.
  • Used Sqoop to load data from RDBMS into HDFS on a regular basis (see the sketch after this list).
  • Developed scripts and batch jobs to schedule various Hadoop programs.
  • Developed Pig Latin scripts to generate reports.
  • Created jobs to load data from HBase into the data warehouse.
  • Wrote Hive queries for data analysis to meet the business requirements.
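
Illustrative sketch (not project code): a minimal Python wrapper around a Sqoop incremental import of the kind that could be scheduled as a batch job, as referenced above. The JDBC URL, credentials, table and HDFS paths are hypothetical placeholders.

    import subprocess
    import sys

    # Build a Sqoop incremental-append import command (connection details are placeholders).
    sqoop_cmd = [
        "sqoop", "import",
        "--connect", "jdbc:mysql://dbhost:3306/sales",
        "--username", "etl_user",
        "--password-file", "/user/etl/.sqoop_pwd",
        "--table", "orders",
        "--target-dir", "/data/raw/orders",
        "--incremental", "append",
        "--check-column", "order_id",
        "--last-value", sys.argv[1] if len(sys.argv) > 1 else "0",
    ]

    # Run the import and fail loudly so the scheduler can detect and retry the job.
    exit_code = subprocess.call(sqoop_cmd)
    if exit_code != 0:
        sys.exit("sqoop import failed with exit code %d" % exit_code)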

Technologies: Hadoop, HDFS, Pig, Hive, HBase, Oozie, MapReduce, Flume, Sqoop, Big Data, SQL and Windows.

Confidential

Senior System Engineer

Responsibilities:

  • Designed and developed data driven Web Forms using ASP.NET, ADO.NET, HTML, JavaScript and CSS technologies.
  • Worked on upgrading the application using AngularJS, HTML, CSS, JavaScript and jQuery.
  • Implemented code according to coding standards and created AngularJS controllers with isolated scopes to perform operations.
  • Developed custom directives and services in AngularJS.
  • Used AngularJS two-way data binding to implement user feedback functionality.
  • Wrote Single Page Applications (SPAs) using RESTful web services, AJAX and AngularJS.
  • Involved in designing and development of REST services.
  • Developed Master Pages and applied them to all content pages using ASP.NET.
  • Created User Controls and Custom Controls to enable reusability and used rich server controls to design ASP.NET pages.
  • Used AJAX controls to minimize server round trips to enhance customer experience and to improve application performance.
  • Created MVC controllers, models and views according to client requirements.
  • Created stored procedures, views, triggers and complex T-SQL queries in SQL Server.
  • Tested the code with the production support team.
  • Involved in maintenance and enhancement of an application using Microsoft .NET Framework 4.0, C#, ASP.NET, LINQ, WCF, Web API, AJAX, JavaScript and web services.
  • Performed query optimization and performance tuning for complex SQL queries.
  • Used Team Foundation Server (TFS) for version controlling and assisted in documentation and creating Help files.
  • Coordinated the build/migration of releases to test and production environment.

Technologies: .NET Framework 4.0, C#.NET, ASP.NET, ADO.NET, MVC, JavaScript, HTML, CSS, AngularJS, jQuery, SSIS, SSRS, SQL Server 2012.

Confidential

System Engineer

Responsibilities:

  • Customized application functionality based on client requirements using ASP.NET, ADO.NET, ASP.NET MVC and AngularJS.
  • Designed and coded application components in an Agile environment utilizing a test driven development approach.
  • Developed controllers, custom directives and custom services in AngularJS.
  • Developed RESTful web services that can be consumed by multiple clients to retrieve data.
  • Experience deploying, configuring and maintaining RESTful services on IIS 7.0.
  • Created and hosted WCF services, using different bindings to make them available to different types of clients.
  • Created user controls and data access layer and business logic layer classes for web pages using C# and .NET 4.0.
  • Used the ASP.NET MVC Page Controller pattern to add functionality to individual pages, accepting input from page requests and invoking the requested actions.
  • Used the ASP.NET MVC framework's support for dependency injection to inject objects into a class instead of relying on the class to create them itself.
  • Used SVN for version controlling and assisted in documentation and creating Help files.
  • Interacted with the project team and client to understand requirements.
  • Upgraded the LMS database from one version to another.
  • Implemented new features in the core LMS application.
  • Communicated with teams while upgrading, customizing and productizing the application.
  • Conducted and attended team meetings to provide updates and track work progress.

Technologies: .NET Framework 4.0, C#.NET, ASP.NET, ADO.NET, MVC, JavaScript, HTML, CSS, AngularJS, Oracle, SQL Server 2005 and SQL Server 2008.

Confidential

System Engineer

Responsibilities:

  • Developed and maintained Web Forms with ASP.NET and C#.
  • Used Master Pages for a consistent application layout.
  • Used JavaScript to show and hide page content for client-side validation.
  • Responsible for designing, developing, enhancing, maintaining various sub-modules.
  • Extensively used ADO.NET DataSets and DataTables to retrieve and manipulate data and display it in ASP.NET pages and user controls.
  • Used DataGrid and DataList controls to display data in a customized format in ASP.NET web pages.
  • Developed a web service to generate documents in PDF format.
  • Created complex stored procedures, triggers, views and functions using SQL Server 2005.
  • Handled all issues regarding the database and its connectivity and maintenance.
  • Responsible for code testing and preparing technical documentation for defects and change orders.
  • Interacted with project managers and business analysts to resolve issues in a timely manner.

Technologies: Visual Studio .NET 2005, C#, MS-SQL Server 2005, AJAX Toolkit, JavaScript, ADO.NET, IIS, HTML, Windows Server 2008.
