Sr. Hadoop Consultant Resume

MO

SUMMARY

  • Senior Hadoop Consultant with 11 years of professional IT experience, including around 3 years of Big Data Ecosystem experience in the ingestion, storage, querying, processing, and analysis of big data
  • Hands-on experience with Hadoop ecosystem components: HDFS, YARN, ETL, MapReduce, HBase, Hive, Impala, Pig, Flume, Kafka, Oozie, Sqoop, and Unix/Linux
  • Experience working with the Cloudera distribution (CDH)
  • Experienced in analyzing data using HiveQL, Pig Latin, and custom MapReduce programs in Java
  • Strong experience in writing MapReduce programs for data cleansing
  • Created partitioned tables and loaded data using static and dynamic partitions
  • Optimized Hive tables using optimization techniques like Partitioning and Bucketing
  • Extended Hive and Pig core functionality by writing custom UDFs
  • Developed Pig Latin scripts for business transformations
  • Imported and exported data using Sqoop
  • Experienced in optimizing ETL workflows
  • Imported streaming data and loaded weblog entries into HDFS using Flume
  • Imported activity data and event messages using Kafka
  • Experienced with the NoSQL HBase data store
  • Experienced in job workflow scheduling and monitoring using Oozie
  • Strong experience in the development of web-based applications using Java
  • Well versed in designing and implementing MapReduce jobs using Java on Eclipse to solve real world scaling problems
  • Good experience in RDBMS like DB2, MySQL, Oracle
  • Extensive experience in creating and maintaining database objects such as tables, views, and indexes
  • Experience in Java/J2EE, Struts, JSF, Spring, Hibernate, and web services
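As a minimal illustration of the partitioning and bucketing techniques listed above, a HiveQL sketch follows; the table, column names, and values are hypothetical, not drawn from an actual engagement.

```sql
-- Hypothetical sales table, partitioned by load date and bucketed by customer id
CREATE TABLE sales (
  order_id    BIGINT,
  customer_id BIGINT,
  amount      DECIMAL(10,2)
)
PARTITIONED BY (load_date STRING)
CLUSTERED BY (customer_id) INTO 32 BUCKETS
STORED AS ORC;

-- Static partition load: the partition value is fixed in the statement
INSERT OVERWRITE TABLE sales PARTITION (load_date = '2015-01-01')
SELECT order_id, customer_id, amount
FROM sales_staging
WHERE dt = '2015-01-01';

-- Dynamic partition load: Hive derives the partition value
-- from the trailing column of the SELECT
SET hive.exec.dynamic.partition.mode = nonstrict;
INSERT OVERWRITE TABLE sales PARTITION (load_date)
SELECT order_id, customer_id, amount, dt
FROM sales_staging;
```

Bucketing by a high-cardinality key such as `customer_id` spreads rows evenly across files and can speed up joins and sampling on that key.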

TECHNICAL SKILLS:

Hadoop/Big Data: HDFS, MapReduce, HBase, Pig, Hive, Impala, Flume, Kafka, Sqoop, Oozie, YARN

Languages: Java, Python, COBOL

J2EE Technologies: JSP, Servlets, JDBC

Frameworks: Struts, Spring, JSF, Hibernate, Hadoop

Java IDE: Eclipse

Scripting: Unix Shell Scripting

Operating Systems: Linux, Unix, Cloudera, MVS/ZOS, Windows

Databases: MySQL, DB2, Oracle

Others: Hue, Control-M, Workbench

PROFESSIONAL EXPERIENCE:

Sr. Hadoop Consultant

Confidential, MO

Responsibilities:

  • Involved in loading structured and semi-structured data from multiple sources into HDFS
  • Developed custom MapReduce programs for cleansing and transforming the data
  • Created staging tables and ingested data as dynamic partitions
  • Created final tables in Parquet format
  • Imported data using Sqoop from MySQL tables
  • Designed and implemented ETL solutions using Apache Sqoop, MapReduce, and Hive
  • Implemented Oozie workflow for automating the complete process
  • Responsible for managing data coming from different sources
  • Imported activity data and event messages using Kafka from different producers
  • Created tables, designed row keys, and stored data in the column-family-oriented NoSQL HBase data store
  • Managed and reviewed Hadoop log files
  • Implemented various optimization techniques
  • Benchmarked the performance of various queries
  • Analyzed large amounts of data sets to determine optimal way to aggregate and report on it
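The staging-to-Parquet ingestion flow described above can be sketched in HiveQL roughly as follows; the table, column names, and HDFS path are hypothetical placeholders.

```sql
-- Hypothetical staging table over delimited files imported by Sqoop
CREATE EXTERNAL TABLE events_staging (
  event_id   BIGINT,
  event_type STRING,
  payload    STRING,
  event_date STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION '/data/staging/events';

-- Final table stored as Parquet, partitioned by event date
CREATE TABLE events (
  event_id   BIGINT,
  event_type STRING,
  payload    STRING
)
PARTITIONED BY (event_date STRING)
STORED AS PARQUET;

-- Ingest from staging as dynamic partitions
SET hive.exec.dynamic.partition.mode = nonstrict;
INSERT OVERWRITE TABLE events PARTITION (event_date)
SELECT event_id, event_type, payload, event_date
FROM events_staging;
```

Storing the final tables as Parquet keeps the data columnar and compressed, which suits the Impala query workloads mentioned in this engagement.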

Environment: Hadoop, HDFS, ETL, YARN, HBase, Sqoop, Hive, Impala, Parquet, Kafka, Oozie, Shell Scripting, Java, and MapReduce.

Sr. Hadoop Consultant

Confidential, FL

Responsibilities:

  • Involved in loading structured and semi-structured data from multiple sources into HDFS
  • Developed custom Pig UDFs for transforming the data
  • Designed and Developed Pig Latin scripts
  • Created staging tables and ingested data as dynamic partitions in Hive
  • Created final tables in Parquet format in Impala
  • Designed and implemented ETL solutions using Apache Sqoop, MapReduce, and Hive
  • Imported data using Sqoop from MySQL tables
  • Implemented Oozie workflow for automating the complete process
  • Responsible for managing data coming from different sources
  • Imported streaming data and loaded weblog entries into HDFS using Flume
  • Created tables, designed row keys, and stored data in the column-family-oriented NoSQL HBase data store
  • Managed and reviewed Hadoop log files
  • Implemented various optimization techniques
  • Benchmarked the performance of various queries
  • Led the team and assigned work to team members
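The custom UDF work mentioned in the summary follows a standard registration pattern on the Hive side; a short sketch is below, with a hypothetical jar path, class name, function name, and table.

```sql
-- Register a custom UDF packaged in a jar
-- (jar path and class name are illustrative)
ADD JAR /apps/udfs/custom-udfs.jar;
CREATE TEMPORARY FUNCTION clean_text AS 'com.example.hive.udf.CleanText';

-- Use the function like any built-in
SELECT clean_text(raw_line)
FROM weblog_staging
LIMIT 10;
```

Pig UDFs are written against a different API but are registered analogously with `REGISTER` and `DEFINE` in a Pig Latin script.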

Environment: HDFS, YARN, ETL, MapReduce, HBase, Java, Hive, Impala, Pig, Flume, SQL, and Sqoop.

Sr. Developer

Confidential, FL

Responsibilities:

  • Preparation/Review of Impact analysis documents
  • Preparation/Review of Test Plan documents
  • Led the team and assigned work to team members
  • Developed code for break-fix enhancements
  • Production Support
  • Integration and Unit Testing

Environment: Java, Struts 2.0, DB2, IntelliJ IDEA, XML, DMS, Author

Sr. Software Engineer

Confidential, OH

Responsibilities:

  • Involved in the development of Functional Requirements Specifications, Technical Specifications, detailed design documents, user guides, test procedures, and test cases for the application components
  • Followed object-oriented analysis and design by preparing use cases, a business domain model, sequence diagrams, and class diagrams, and designed the UML components for the technical specification in Microsoft Visio.
  • Implemented the Software Development Guidelines based on Agile Methodologies.
  • Developed front end of application on MVC architecture employing Struts Framework.
  • Responsible for setting up configuration files (web.xml, struts-config.xml, tiles-defs.xml, and validation.xml); developed the UI layer using JSP, Struts Tag Libraries, JavaScript, AJAX, and HTML/DHTML
  • Developed Action classes and Action Forms, performed form validations using the Struts Validation Framework, and used the Tiles Framework.
  • Used Hibernate in DAO layer to access and update information in Oracle database, developed Hibernate configuration files (hbm.xml) for object relational mapping with database, fine-tuned performance by optimizing query and data caching mechanisms.
  • Developed SQL queries and Procedures using SQL and PL/SQL.
  • Involved in test data creation and unit testing using JUnit

Environment: Java, J2EE, SQL Server, J2EE Web Services, XML, Eclipse, Ajax, HTML, JavaScript, and WebLogic Application Server

Programmer Analyst

Responsibilities:

  • Designed and developed an interactive module in Java
  • Implemented Presentation layer using JSP, Servlets
  • Developed the application using Struts Framework that leverages the classical MVC architecture
  • Worked on query handling, customer support, helpdesk
  • Migrated a poorly performing and outdated application to Struts, Hibernate based system for Sprint
  • Maintained the interface of Oracle using JDBC
  • Wrote procedures and queries to extract data from database
  • Tested the flow of modules using JUnit
  • Monitored the error logs using Log4J and fixed the problems
  • Also handled the JDBC backend operations of the respective modules
  • Code Development
  • Preparation/Review of Impact analysis documents
  • Preparation/Review of Test Plan documents.

Environment: Java, JSTL, EJB 2.0.
