
Hadoop Developer Resume

Malvern, Pennsylvania

SUMMARY

  • 7+ years of experience in all aspects of software design and development, technical documentation, analysis, testing, and deployment of web applications using Hadoop and Java/J2EE technologies, specializing in the Finance, Retail, and Telecom domains.
  • Strong knowledge of the Software Development Life Cycle (SDLC) and the role of a Hadoop/Java developer in development methodologies such as Agile and Waterfall.
  • Expertise in different components of Hadoop Ecosystem - Hive, Pig, Sqoop, Flume, Zookeeper.
  • In-depth knowledge of Hadoop architecture and its components, including HDFS, NameNode, DataNode, and MapReduce; experienced in writing MapReduce programs with Apache Hadoop to analyze large data sets efficiently.
  • Hands-on experience with the YARN (MapReduce 2.0) architecture and its components, such as ResourceManager, NodeManager, Container, and ApplicationMaster, and with the execution of MapReduce jobs.
  • Strong knowledge of Pig and Hive analytical functions; extended Hive and Pig core functionality by writing custom UDFs.
  • Expertise in developing Pig Latin scripts and Hive Query Language (HiveQL) for data analytics.
  • Well-versed in Hive partitioning, dynamic partitioning, and bucketing, and implemented these concepts to compute data metrics.
  • Experience in importing and exporting data using Sqoop between HDFS and relational/non-relational database systems.
  • Used the Oozie job scheduler to schedule MapReduce jobs and automate job flows, and implemented cluster coordination services using Zookeeper.
  • Reviewed the HDFS usage and system design for future scalability and fault-tolerance.
  • Experience in working with different relational databases like MySQL, MS SQL and Oracle.
  • Expertise in various phases of software development, including analysis, design, development, and deployment of applications using Servlets, JSP, the Struts framework, and JDBC.
  • Experienced with development environments such as Eclipse and NetBeans.
  • Proficient in software documentation and technical report writing.
  • Versatile team player with good communication, analytical, presentation and inter-personal skills.
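The Hive partitioning and bucketing techniques mentioned above can be sketched in HiveQL; the table and column names below are illustrative placeholders, not taken from any actual project, and the statements assume a running Hive cluster:

```sql
-- Enable dynamic partitioning (illustrative session settings).
SET hive.exec.dynamic.partition = true;
SET hive.exec.dynamic.partition.mode = nonstrict;

-- A table partitioned by load date and bucketed by customer id.
CREATE TABLE transactions (
  txn_id  BIGINT,
  cust_id BIGINT,
  amount  DECIMAL(10,2)
)
PARTITIONED BY (load_date STRING)
CLUSTERED BY (cust_id) INTO 32 BUCKETS
STORED AS ORC;

-- Dynamic partitioning: partition values come from the last column selected.
INSERT OVERWRITE TABLE transactions PARTITION (load_date)
SELECT txn_id, cust_id, amount, load_date
FROM staging_transactions;
```

Partitioning by a date column keeps queries that filter on `load_date` from scanning the whole table, while bucketing on `cust_id` supports efficient sampling and bucketed map-side joins.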

TECHNICAL SKILLS

Big Data: Hadoop 2.x, Hive, Pig, MapReduce, Sqoop, YARN, Zookeeper, Scala, Spark, Kafka

Programming Languages: Java SE 6, J2EE (Servlets, JSP), HTML, JavaScript, CSS; familiar with jQuery and AngularJS

Database & Connectivity: Oracle 9i, Oracle 11g, JDBC, MS SQL Server

Servers: Apache Tomcat 7.0, IBM WebSphere

Tools: Various IDEs, Database Studios, Version control & build tools

Modeling: Unified Modeling Language

PROFESSIONAL EXPERIENCE

Confidential, Malvern, Pennsylvania

Hadoop Developer

Responsibilities:

  • Wrote multiple Hive UDFs and a script to report Hive database sizes.
  • Built a PoC to demonstrate Hive query optimization using cost-based optimization.
  • Performed troubleshooting and analytical problem solving for production incidents while adhering to SLAs.
  • Provided production support: checked Hive server logs, resolved Hive connection issues, used the Tez view to track time-consuming jobs, and debugged general Hadoop environment issues using Ambari to monitor critical metrics.
  • Created Hive internal/external tables with proper static and dynamic partitions.
  • Performed Sqoop imports, exports, and incremental updates.
  • Tuned performance using partitioning and bucketing of Hive tables.
  • Implemented MapReduce programs using Java.
  • Installed the Oozie workflow engine to run multiple MapReduce, Hive, Zookeeper, and Pig jobs independently, triggered by time and data availability.
  • Designed, developed, maintained, tested, and documented various development projects.
  • Gathered requirements, created test plans, and constructed and executed positive/negative test cases to catch and fix bugs within the QA environment.
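A Sqoop incremental import of the kind described above might look like the following; the connection string, credentials, table, and column names are hypothetical placeholders, and the command assumes Sqoop 1.x on a Hadoop cluster:

```shell
# Illustrative Sqoop incremental import: pull only rows whose txn_id
# exceeds the last value recorded by the previous run.
sqoop import \
  --connect jdbc:mysql://dbhost/sales \
  --username etl_user -P \
  --table transactions \
  --target-dir /data/raw/transactions \
  --incremental append \
  --check-column txn_id \
  --last-value 1000000
```

With `--incremental append`, Sqoop appends only new rows to the HDFS target directory and prints the new `--last-value` to use for the next run; saving the command as a Sqoop job lets that state be tracked automatically.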

Environment: Hortonworks Hadoop 2.x, MapReduce, HDFS, Hive, Java (jdk1.6)

Confidential, St Louis, MO

Hadoop Developer

Responsibilities:

  • Involved in loading and transforming large sets of structured, semi-structured and unstructured data.
  • Adept in the complete implementation lifecycle; specialized in writing custom MapReduce, Pig, and Hive programs.
  • Developed data pipelines using Sqoop, Pig, and Java MapReduce to ingest claim data and financial histories into HDFS for analysis.
  • Imported and exported data between HDFS and a MySQL database using Sqoop.
  • Extensive experience in writing HDFS and Pig Latin commands.
  • Developed UDFs to provide custom Hive and Pig capabilities and to apply business logic to the data.
  • Created Hive internal/external tables with proper static and dynamic partitions.
  • Analyzed unified historical data in HDFS using Hive to identify issues and behavioral patterns.
  • Analyzed data by performing Hive queries and running Pig scripts to study customer behavior.
  • Monitored workload, job performance, and capacity planning using Cloudera Manager.
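The Pig-based customer-behavior analysis described above can be sketched in Pig Latin; the file paths and field names are illustrative assumptions, and the script would run on a Hadoop cluster via the `pig` client:

```pig
-- Illustrative Pig Latin: count events per customer from raw tab-separated logs.
events  = LOAD '/data/raw/events' USING PigStorage('\t')
          AS (cust_id:long, event_type:chararray, ts:chararray);
by_cust = GROUP events BY cust_id;
counts  = FOREACH by_cust GENERATE group AS cust_id, COUNT(events) AS n_events;
STORE counts INTO '/data/out/event_counts' USING PigStorage('\t');
```

Each statement defines a relation; Pig compiles the whole script into one or more MapReduce jobs only when the `STORE` (or a `DUMP`) is reached.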

Environment: HDFS, MapReduce, Hive, Pig, Sqoop, Oozie, Zookeeper, MySQL, and Java.

Confidential

Senior Analyst

Responsibilities:

  • Implemented request and approval modules using AngularJS, JavaScript, and PL/SQL stored procedures; prepared test scenarios, performed manual testing, and conducted code reviews.
  • Established a change management process workflow to ensure all change requests meet organizational compliance policy and that requirements gathering and analysis proceed smoothly.
  • Involved in business requirements gathering and analysis; documented every detail.
  • Engaged in analysis, design, development, and testing.
  • Involved in high-level and low-level design and implemented code-refactoring changes.
  • Interacted with the middleware team to understand business requirements, developed the system design, and participated in technical discussions and design reviews with architects.

Environment: Java, J2EE, AngularJS, Spring JPA, HTML, JavaScript, jQuery, Eclipse, Log4j, RESTful web services, Oracle, Windows.

Confidential

Software Programmer

Responsibilities:

  • Exercised end-to-end control, from requirements gathering and preparation of test data, test cases, and SQL scripts through to front-end changes.
  • Involved significantly in requirement analysis & design.
  • Developed the Phoenix and WEbGdv (Global Demand Visibility) modules for provisioning access.
  • Played the roles of Developer, Test Engineer, and Quality Engineer.

Environment: Java 6, J2EE, UNIX, Eclipse, Apache Tomcat, UML & related SDLC documentation, XSLT.

Confidential

Software Programmer

Responsibilities:

  • Migrated the existing Validate and Save system from PL/SQL to Java.
  • Involved significantly in requirement analysis & design.
  • Performed end-to-end development of the critical Quote, Validate, and Save process modules.
  • Unit tested the modules using HP Quality Center.

Environment: Java 6, J2EE, JSP, Struts 1.x, Oracle 11g PL/SQL, Eclipse, SVN, Apache Tomcat, Related SDLC documentation.
