Hadoop Consultant Resume

Fremont, CA

SUMMARY

  • 8+ years of experience in the analysis, design, development, debugging, and deployment of various software applications.
  • Over 3 years of experience in the Hadoop ecosystem and Big Data analytics.
  • Excellent understanding and knowledge of Hadoop architecture and its components, such as HDFS, JobTracker, TaskTracker, NameNode, DataNode, and the MapReduce programming paradigm.
  • Hands-on experience installing, configuring, and using ecosystem components such as Hadoop MapReduce, HDFS, HBase, Oozie, Sqoop, Flume, Pig, and Hive.
  • Expertise in managing and reviewing Hadoop log files.
  • Experience analyzing data using Pig Latin, HQL, HBase, and custom MapReduce programs in Java.
  • Extended Hive and Pig core functionality by writing custom UDFs.
  • Hands-on experience writing MapReduce jobs in Java (a minimal sketch follows this list).
  • Experience importing and exporting data with Sqoop between HDFS and relational database systems.
  • Good understanding of NoSQL databases and hands-on experience writing applications against them.
  • Excellent communication, interpersonal, and problem-solving skills; a strong team player with a can-do attitude and the ability to communicate effectively with all levels of the organization, including technical staff, management, and customers.
  • Ability to perform at a high level, meet deadlines, and adapt to ever-changing priorities.
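
As a concrete illustration of the MapReduce experience noted above, here is a minimal word-count style job in Java against the standard Hadoop MapReduce API; the class names and paths are illustrative assumptions, not details from any engagement.

import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

// Minimal word-count job: the mapper emits (word, 1) pairs and the
// reducer sums the counts for each word.
public class WordCount {
    public static class TokenMapper
            extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();
        @Override
        protected void map(LongWritable key, Text value, Context ctx)
                throws IOException, InterruptedException {
            for (String token : value.toString().split("\\s+")) {
                if (!token.isEmpty()) {
                    word.set(token);
                    ctx.write(word, ONE);
                }
            }
        }
    }

    public static class SumReducer
            extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context ctx)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable v : values) sum += v.get();
            ctx.write(key, new IntWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        // Hadoop 2.x style job setup; on Hadoop 1.x this would be new Job(conf, name).
        Job job = Job.getInstance(new Configuration(), "word count");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenMapper.class);
        job.setReducerClass(SumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}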

TECHNICAL SKILLS

Big Data Technologies: Hadoop, MapReduce, Pig, Hive, HBase, Oozie, Sqoop, Zookeeper

Programming Languages: Java, PHP, Python, Perl, R, SQL, CQL, JavaScript, jQuery, Ajax, HTML, CSS

RDBMS: MySQL, SQL Server, MS Access

NoSQL: HBase

Cloud: Amazon EMR, Google Cloud

Other Tools and Technologies: GIT, SVN, Eclipse, ER Studio

Operating System: Linux, Windows

PROFESSIONAL EXPERIENCE

Confidential, Fremont, CA

Hadoop Consultant

Responsibilities:

  • Involved in the design and development of technical specification documents for the Hadoop solution.
  • Developed MapReduce programs to parse the raw data, populate staging tables, and store the refined data in partitioned tables in the EDW.
  • Installed and configured the entire Hadoop ecosystem with MySQL.
  • Installed, configured, monitored, and maintained HDFS, HBase, Pig, and Hive.
  • Managed and reviewed Hadoop log files.
  • Developed Apache Pig and Hive scripts to process the HDFS data.
  • Designed and created Hive external tables using a shared metastore instead of Derby, with partitioning, dynamic partitioning, and buckets (see the sketch after this list).
  • Monitored Hadoop scripts that take their input from HDFS and load the data into Hive.
  • Migrated the needed data from Oracle and MySQL into HDFS using Sqoop, and imported various formats of flat files into HDFS.
  • Defined job workflows according to their dependencies in Oozie.
  • Maintained system integrity of all sub-components related to Hadoop.
  • Worked extensively on Apache and Cloudera Hadoop clusters.
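
As an illustration of the Hive external-table work above, here is a minimal Java sketch that issues the DDL through the HiveServer2 JDBC driver; the host, port, table name, columns, and HDFS location are assumptions for the example, not details from the project.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

// Illustrative sketch: creates a partitioned Hive external table over HDFS
// data via JDBC. Host, port, schema, and paths are hypothetical.
public class CreateExternalTable {
    public static void main(String[] args) throws Exception {
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        try (Connection conn = DriverManager.getConnection(
                "jdbc:hive2://localhost:10000/default", "", "");
             Statement stmt = conn.createStatement()) {
            // External table: Hive manages metadata only; the data stays in HDFS.
            stmt.execute("CREATE EXTERNAL TABLE IF NOT EXISTS web_logs ("
                    + "  ip STRING, url STRING, ts BIGINT)"
                    + " PARTITIONED BY (dt STRING)"
                    + " ROW FORMAT DELIMITED FIELDS TERMINATED BY '\\t'"
                    + " LOCATION '/data/staging/web_logs'");
            // Register one partition; dynamic partitioning would do this at load time.
            stmt.execute("ALTER TABLE web_logs ADD IF NOT EXISTS"
                    + " PARTITION (dt='2014-01-01')");
        }
    }
}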

Environment: Apache Hadoop, HDFS, Hive, MapReduce, Java, Eclipse, Pig, Sqoop, Flume, Oozie, MySQL, Cloudera Hadoop distribution.

Confidential, Atlanta, GA

Hadoop Consultant

Responsibilities:

  • Involved in gathering business requirements and prepared detailed specifications, following project guidelines, for the programs to be developed.
  • Designed and developed a MapReduce process flow to collect and parse log files and upload them to HDFS.
  • Configured Hadoop system files to accommodate new sources of data and updated the existing configuration of the Hadoop cluster.
  • Developed UDFs in Java as needed for use in Pig and Hive queries (see the sketch after this list).
  • Analyzed the data by running Hive queries and Pig scripts to understand the requirements.
  • Extracted data from Teradata into HDFS using Sqoop.
  • Continuously monitored and managed the Hadoop cluster through Cloudera Manager.
  • Responsible for cluster maintenance: adding and removing cluster nodes, monitoring and troubleshooting, and managing and reviewing data backups and Hadoop log files.
  • Handled importing data from various data sources, performed transformations using Hive and MapReduce, and loaded the data into HDFS.
  • Installed the Oozie workflow engine to run multiple Hive jobs.
  • Extracted feeds from social media sites such as Facebook and Twitter.
  • Involved in loading data from the UNIX file system to HDFS.
  • Actively participated in code reviews and meetings and helped resolve technical issues.
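
A minimal sketch of the kind of custom Java UDF described above, using Hive's classic UDF base class; the class name and its behavior (upper-casing a string) are illustrative assumptions.

import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

// Illustrative Hive UDF: upper-cases its input string.
// It would be registered in Hive along these lines:
//   ADD JAR udfs.jar;
//   CREATE TEMPORARY FUNCTION to_upper AS 'ToUpperUDF';
public class ToUpperUDF extends UDF {
    public Text evaluate(Text input) {
        if (input == null) return null;  // pass nulls through unchanged
        return new Text(input.toString().toUpperCase());
    }
}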

Environment: Java 7, Eclipse, Oracle 10g, Hadoop, Hive, HBase, Oozie, Linux, MapReduce, HDFS, Cloudera Hadoop distribution, SQL, Toad 9.6.

Confidential, Pittsburgh, PA

Hadoop Developer

Responsibilities:

  • Developed and executed custom MapReduce programs and Pig scripts for a Portfolio Valuation and Fraud Detection program involving more than 70 TB of data.
  • Performed Hive test queries on local sample files and HDFS files.
  • Extensively used Hive for data cleansing.
  • Cross-compared large datasets from predictive models with actual market data to forecast portfolio valuations.
  • Ran Hadoop streaming jobs to process more than 50 TB of XML data.
  • Worked closely with the data science team to gather requirements for various reports.
  • Analyzed user request patterns and generated reports for various performance optimization measures.
  • Imported cleaned data from HDFS into MySQL for business intelligence, visualization, and reports (see the sketch after this list).
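
A minimal sketch of the HDFS-to-MySQL load mentioned in the last bullet, assuming tab-delimited cleaned output with two columns; the connection details, paths, and table schema are hypothetical.

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

// Illustrative loader: streams a tab-delimited HDFS file into a MySQL table.
// Assumes each line holds exactly two fields: a symbol and a valuation.
public class HdfsToMySql {
    public static void main(String[] args) throws Exception {
        FileSystem fs = FileSystem.get(new Configuration());
        try (BufferedReader in = new BufferedReader(new InputStreamReader(
                     fs.open(new Path("/data/clean/portfolio/part-r-00000"))));
             Connection conn = DriverManager.getConnection(
                     "jdbc:mysql://localhost:3306/reports", "etl", "secret");
             PreparedStatement ps = conn.prepareStatement(
                     "INSERT INTO portfolio_valuations (symbol, valuation) VALUES (?, ?)")) {
            String line;
            while ((line = in.readLine()) != null) {
                String[] cols = line.split("\t");
                ps.setString(1, cols[0]);
                ps.setDouble(2, Double.parseDouble(cols[1]));
                ps.addBatch();  // batch inserts for throughput
            }
            ps.executeBatch();
        }
    }
}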

Environment: Hadoop, HDFS, Pig, Hive, MySQL, XML, Core Java, Amazon AWS, Linux

Confidential, Hartford, CT

Java Developer

Responsibilities:

  • Prepared the user requirements document and functional requirements document for different modules.
  • Designed the application architecture along the lines of the Struts framework based on MVC-II, with JSP as the view, Action classes as the controller, and a combination of EJBs and Java classes as the model.
  • Used Struts, JSTL, and Struts-EL tag libraries.
  • Responsible for designing and writing code in Action classes, Validators, and ActionForms, and for developing the system flow for the module using the Struts framework (see the sketch after this list).
  • Involved in coding session beans and entity beans to implement the business logic.
  • Designed and developed the presentation layer using JSP and HTML, with client-side form validation in JavaScript and Struts built-in form validations.
  • Used AJAX for asynchronous data transfer (HTTP requests) between the browser and the web server.
  • Used SAX and DOM for parsing XML documents retrieved from different data sources.
  • Prepared SQL scripts for database creation and for migrating existing data to the newer version of the application.
  • Installed and configured the software required for application development (Eclipse IDE, Oracle database, WebSphere, Tomcat, Eclipse plug-ins, and required framework JARs).
  • Developed various Java Beans and helper classes to support server-side programs.
  • Wrote test cases for unit testing using the JUnit testing framework.
  • Involved in developing backend code for email notifications to admin users with multi-sheet Excel attachments using XML.
  • Assisted with the daily clean-up of the Dojo code, and worked with the Dojo toolkit for different purposes according to the requirements.
  • Modified existing backend code for different levels of enhancements.
  • Used Axis to implement Web Services for integration of different systems.
  • Designed the error-handling and error-logging flows.
  • Developed build files for the project using the ANT build tool.
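
A minimal sketch of a Struts 1.x Action of the sort described above; the form class, service class, and forward names are hypothetical and stand in for the project's own.

import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import org.apache.struts.action.Action;
import org.apache.struts.action.ActionForm;
import org.apache.struts.action.ActionForward;
import org.apache.struts.action.ActionMapping;

// Illustrative Struts 1.x controller: reads the validated form,
// invokes the model, and forwards to a JSP view.
// LoginForm (an ActionForm subclass) and UserService are hypothetical
// classes assumed to live in the same package.
public class LoginAction extends Action {
    @Override
    public ActionForward execute(ActionMapping mapping, ActionForm form,
            HttpServletRequest request, HttpServletResponse response)
            throws Exception {
        LoginForm login = (LoginForm) form;
        boolean ok = new UserService().authenticate(
                login.getUsername(), login.getPassword());
        request.setAttribute("user", login.getUsername());
        // "success" and "failure" map to JSPs in struts-config.xml.
        return mapping.findForward(ok ? "success" : "failure");
    }
}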

Environment: Java 1.5, J2EE, JSP, Servlets, Struts 1.3, Dojo, TagLibs, RAD, XML, EJB 3.0, Ant, SQL, CVS, PVCS, Web Services, SOAP, WSDL, MVC, JavaScript, CSS, AJAX, Oracle 10g, WebSphere, Toad, UNIX.

Confidential, Denver, CO

Java/J2EE Developer - Consultant

Responsibilities:

  • Involved in document analysis and technical feasibility discussions for implementing new functionality, and extended the current project with new functionality.
  • Design: developed UML use case, activity, sequence, and class diagrams using Rational Rose.
  • Designed and coded application components in an agile environment using a test-driven development approach.
  • Development: developed the application using the Spring Framework, using Spring MVC for flexibility and integrating frameworks such as Hibernate (see the sketch after this list).
  • Used the Spring framework along with JSP, JSTL, HTML, CSS, JavaScript, and AJAX to construct the dynamic web pages (presentation layer) of the application.
  • Made extensive use of Ajax to update parts of the web page, which improved the performance of the application.
  • Backend persistence: managed object persistence and data retrieval using Hibernate and the Spring framework, with Oracle as the backend.
  • Wrote queries, stored procedures, and functions using SQL and PL/SQL in Oracle.
  • Implemented Log4j to maintain the system log.
  • Version Control: Used SVN for software configuration management and version control.
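
A minimal sketch of a Spring MVC handler backed by Hibernate in the style described above; the DAO, entity, URL, and view names are hypothetical placeholders.

import java.util.List;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Controller;
import org.springframework.ui.Model;
import org.springframework.web.bind.annotation.RequestMapping;

// Illustrative Spring MVC controller: delegates to a Hibernate-backed
// DAO and hands the result to a JSP view via the model.
// PolicyDao and Policy are hypothetical application classes.
@Controller
public class PolicyController {
    @Autowired
    private PolicyDao policyDao;

    @RequestMapping("/policies")
    public String listPolicies(Model model) {
        List<Policy> policies = policyDao.findAll();
        model.addAttribute("policies", policies);
        return "policyList";  // resolved by the view resolver to a JSP
    }
}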

Environment: Java/J2EE, REST Web Services, Spring, Hibernate, JSON, HTML, CSS, Ajax, jQuery

Confidential

Software Developer

Responsibilities:

  • Involved in creating the detailed design by analyzing existing web applications and adding new functionality.
  • Designed the software architecture and coordinated throughout the software development life cycle (Confidential).
  • Designed and deployed the database for the website.
  • Integrated with Facebook Connect for login.
  • Involved in developing the complex backend (see the sketch after this list).
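
A minimal sketch of the servlet/JSP pattern this stack implies; the servlet name, URL, attribute, and JSP path are illustrative assumptions.

import java.io.IOException;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Illustrative servlet: builds model data on the server side and
// forwards to a JSP page for rendering.
public class HomeServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        req.setAttribute("greeting", "Welcome");  // placeholder model data
        req.getRequestDispatcher("/home.jsp").forward(req, resp);
    }
}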

Environment: JSP/Servlets/Beans, MySQL, JavaScript, AJAX, XML, HTML, CSS, Facebook Connect.
