Hadoop Developer Resume
Houston, TX
SUMMARY
- 7 years of extensive experience in the IT industry, including Big Data analytics, Hadoop implementations, and Java development.
- 2.5 years of experience with Hadoop, HDFS, MapReduce and Hadoop Ecosystem (Pig, Hive, and Sqoop).
- Hands on experience in writing MapReduce jobs, Pig scripts, and Hive scripts.
- Worked with Agile methodology and SOA on many of the applications.
- Excellent analytical, problem-solving, communication, and interpersonal skills, with the ability to interact with individuals at all levels and to work both independently and as part of a team.
- Ability to perform at a high level, meet deadlines, and adapt to ever-changing priorities.
TECHNICAL SKILLS
Big Data: Hadoop, MapReduce, HDFS, Hive, Pig, Oozie, Flume, Zookeeper, Avro, HBase, Cassandra, MongoDB, CDH4, Cloudera Manager, MapR
Languages: Java, C, C++, PHP, Python, HTML, XML, SQL
Platforms: Linux (CentOS, RedHat, Ubuntu), Windows, Mac
Databases: Oracle, Netezza, MySQL, Teradata, MS SQL Server
BI Tools: Crystal Reports, MicroStrategy, SSIS, SSAS, SSRS
PROFESSIONAL EXPERIENCE
Confidential, Houston, TX
Hadoop Developer
Responsibilities:
- Transferred data between Oracle, MySQL, Netezza, and HDFS using Sqoop with database connectors.
- Moved log data periodically into HDFS using Flume; built multi-hop flows, fan-out flows, and failover mechanisms.
- Processed large volumes of GPS messages (XML format) using Avro: defined and compiled schemas, and serialized and deserialized data.
- Wrote MapReduce jobs to read data files and scrub the data.
- Created and populated Hive tables and wrote Hive queries for data analysis to meet the business requirements.
- Developed Pig Latin scripts for data processing.
- Migrated data from an Oracle database to HBase; ran MapReduce jobs to access HBase data from applications using the Java client APIs.
- Exported the analyzed data to Oracle using Sqoop for visualization and report generation for the BI team.
- Automated jobs using Oozie: defined Oozie workflow jobs to chain together Sqoop imports, MapReduce jobs, and Pig scripts (with multiple decision, fork, and join nodes), and defined Oozie coordinator jobs to execute recurring workflow jobs.
- Used SVN for version control.
- Actively participated in software development lifecycle (design, implement, deploy, test), including design and code reviews, test development, test automation.
- Involved in solution-driven Agile development methodology and actively participated in daily scrum meetings.
Environment: Hadoop, HDFS, MapReduce, Sqoop, Hive, Pig, Oozie, Flume, Avro, HBase, Cassandra, MongoDB, SVN, CDH4, Cloudera Manager, Oracle, MySQL, Netezza, Eclipse, Application Lifecycle Management (ALM), CentOS
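The record-scrubbing step above can be illustrated at the record level. This is a minimal, hypothetical Python sketch (the actual jobs were Java MapReduce; the tab-delimited field layout and field names are assumptions for illustration):

```python
def scrub_record(line):
    """Clean one tab-delimited input record, as a mapper might.

    Returns the cleaned fields, or None if the record is unusable.
    Hypothetical sketch: the real jobs were Java MapReduce, and the
    three-field (device_id, timestamp, value) layout is an assumption.
    """
    fields = line.rstrip("\n").split("\t")
    if len(fields) != 3:
        return None          # drop malformed records
    device_id, timestamp, value = (f.strip() for f in fields)
    if not device_id or not timestamp:
        return None          # drop records missing required keys
    return device_id, timestamp, value or "NULL"
```

Filtering at this stage keeps malformed records out of the downstream Hive and Pig analysis.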
Confidential, Peoria, IL
Hadoop/Big Data Consultant
Responsibilities:
- Analyzed the Hadoop cluster and various big data analytics tools, including Pig, the HBase NoSQL database, and Sqoop.
- Imported and exported data between HDFS and Hive using Sqoop.
- Extracted files from MongoDB through Sqoop, placed them in HDFS, and processed them.
- Experience with NoSQL databases.
- Created Hive tables, loaded them with data, and wrote Hive queries that run internally as MapReduce jobs.
- Familiar with job scheduling using the Fair Scheduler so that CPU time is well distributed among all jobs.
- Involved in regular Hadoop cluster maintenance, such as patching security holes and updating system packages.
- Managed Hadoop log files.
- Analyzed the web log data using HiveQL.
Environment: Java 6, Eclipse, Hadoop, Hive, HBase, MongoDB, Linux, MapReduce, HDFS, Shell Scripting, MySQL
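The web-log analysis above ran in HiveQL; the underlying line-parsing step it relies on can be sketched in Python (the Apache combined-log regex and field names here are illustrative assumptions, not the production code):

```python
import re

# Apache-style access-log pattern (an illustrative assumption; the
# actual analysis was written in HiveQL over raw log files).
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<ts>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3}) (?P<size>\S+)'
)

def parse_log_line(line):
    """Parse one Apache-style access-log line into a dict, or None."""
    m = LOG_PATTERN.match(line)
    if not m:
        return None
    rec = m.groupdict()
    rec["status"] = int(rec["status"])
    rec["size"] = 0 if rec["size"] == "-" else int(rec["size"])
    return rec
```

A Hive SerDe or regex-based table definition performs the equivalent split before HiveQL aggregation.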
Confidential, Chicago, IL
Java/J2EE Developer
Responsibilities:
- Worked with Java, J2EE, Struts, web services, and Hibernate in a fast-paced development environment.
- Extensive experience in database design and hands-on experience with large database systems: Oracle 8i and Oracle 9i.
- Involved in design and implementation of web tier using Servlets and JSP.
- Used Apache POI for reading Excel files.
- Wrote build scripts with Ant for deploying WAR and EAR applications.
- Developed user and technical documentation.
Environment: Java, J2EE, JDBC, Struts, SQL, Hibernate, Eclipse, Apache POI, CSS
Confidential, San Diego, CA
Java/J2EE Developer
Responsibilities:
- Developed the user interface using JSP and JavaScript to view all online trading transactions.
- Wrote SQL for DAO access.
- Coded Java Server Pages for the dynamic front-end content that uses Servlets and EJBs.
- Coded HTML pages using CSS for static content generation, with JavaScript for validations.
- Used the JDBC API to connect to the database and carry out database operations.
- Used JSP and JSTL Tag Libraries for developing User Interface components.
- Performing Code Reviews.
- Performed unit testing, system testing and integration testing.
- Followed Agile methodology; interacted directly with the client to provide and receive feedback on features, suggest and implement optimal solutions, and tailor the application to customer needs.
Environment: Java, J2EE, Tomcat, Ant, Eclipse, JavaScript, CSS, Servlets, JSP, XML, HTML, JDBC
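The SQL-behind-a-DAO access described above can be sketched as follows; this is a minimal Python illustration using sqlite3 as a stand-in for the Oracle/JDBC stack, and the `trades` table schema is an assumption:

```python
import sqlite3

class TradeDAO:
    """Minimal DAO sketch: encapsulates all SQL for one entity.

    Illustrative only; the original work used Java with JDBC, and
    the 'trades' schema here is an assumption.
    """

    def __init__(self, conn):
        self.conn = conn
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS trades "
            "(id INTEGER PRIMARY KEY, symbol TEXT, qty INTEGER)"
        )

    def insert(self, symbol, qty):
        # Parameterized statements, as with JDBC PreparedStatement
        cur = self.conn.execute(
            "INSERT INTO trades (symbol, qty) VALUES (?, ?)", (symbol, qty)
        )
        return cur.lastrowid

    def find_by_symbol(self, symbol):
        cur = self.conn.execute(
            "SELECT id, symbol, qty FROM trades WHERE symbol = ?", (symbol,)
        )
        return cur.fetchall()
```

Keeping every query inside the DAO means the JSP/Servlet layer never builds SQL itself.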
Confidential
Java Developer
Responsibilities:
- Involved in the analysis, design, implementation, and testing of the project.
- Implemented the presentation layer with HTML, XHTML, and JavaScript.
- Developed web components using JSP, Servlets, and JDBC.
- Designed and developed Data Access Objects (DAO) to access the database.
- Used the DAO Factory and Value Object design patterns to organize and integrate the Java objects.
- Implemented database using SQL Server.
- Involved in fixing bugs and unit testing with test cases using JUnit.
- Involved in building and deployment of application in Linux environment.
- Deploying application in Development and Production servers.
Environment: Java/J2EE, JSP, Servlets, WebSphere 6.x, JDBC, JavaScript, SQL Server, JUnit, Eclipse IDE
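The DAO Factory and Value Object patterns mentioned above can be sketched in Python as an in-memory stand-in (the original work used Java with SQL Server; the `UserVO`/`UserDAO` names and fields are assumptions for illustration):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class UserVO:
    """Value object: an immutable data carrier passed between layers.

    Hypothetical sketch; the original project used plain Java beans.
    """
    user_id: int
    name: str

class UserDAO:
    """In-memory DAO stand-in for the SQL Server-backed implementation."""

    def __init__(self):
        self._rows = {1: "alice"}  # illustrative seed data

    def find(self, user_id):
        name = self._rows.get(user_id)
        return UserVO(user_id, name) if name else None

class DAOFactory:
    """Factory: callers ask for a DAO by role rather than constructing a
    concrete class, so the storage backend can be swapped without
    touching callers."""

    @staticmethod
    def get_user_dao():
        return UserDAO()
```

The factory plus value objects is what lets the web tier stay ignorant of JDBC and SQL Server specifics.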