Senior Hadoop Developer Resume
San Francisco, CA
SUMMARY:
- Over 7 years of experience in IT, including four years of experience in the Hadoop ecosystem, and good object-oriented programming skills.
- Good knowledge of Hadoop architecture and its various components, such as HDFS, JobTracker, TaskTracker, DataNode, and NameNode, and of MapReduce concepts; responsible for writing MapReduce programs.
- Knowledge of administrative tasks such as installing Hadoop and its ecosystem components.
- Expertise with the tools in the Hadoop ecosystem, including Pig, Hive, HDFS, MapReduce, Sqoop, Kafka, YARN, Oozie, and Zookeeper, as well as the Hadoop architecture and its components.
- Experience in tuning and troubleshooting performance issues in Hadoop clusters.
- Experience in using Cloudera Manager for installation and management of single-node and multi-node Hadoop clusters.
- Worked with Cassandra for non-relational data storage and retrieval on enterprise use cases.
- Extensive experience using message-oriented middleware (MOM) with ActiveMQ, Apache Storm, Apache Spark, Kafka, Apache Solr, Maven, and Zookeeper.
- Wrote Apache Spark Streaming applications on Big Data distributions in an active cluster environment.
- Administered Pig, Hive, and HBase, installing updates, patches, and upgrades.
- Experience in importing and exporting data between HDFS and relational database systems using Sqoop.
- Experienced in managing Hadoop clusters using the Cloudera Manager tool.
- Work experience with cloud infrastructure like Amazon Web Services (AWS).
- Worked on Service Oriented Architecture (SOA) such as Apache Axis web services which use SOAP.
- Proficient in programming with Java/J2EE and strong experience in technologies such as JSP, Servlets, Struts, Spring, Hibernate, EJBs, MDBs, Session Beans, JDBC, Solr, and JNDI.
- Efficient in packaging & deploying J2EE applications using ANT, Maven & Cruise Control on WebLogic, WebSphere & JBoss.
- Worked on the performance & load test related tools like JProfiler and JMeter.
- Extensive experience in developing the SOA middleware based out of Fuse ESB and Mule ESB.
- Expertise in n-tier and three-tier Client/Server development architecture and Distributed Computing Architecture.
- Experience in IP Management (IP Addressing, Sub-netting, Ethernet Bonding, Static IP).
- Excellent Java development skills using J2EE, J2SE, Servlets, JSP, JDBC.
- Analytical thinker who consistently resolves ongoing issues and defects, is often called upon to consult on problems, and is a fast learner.
TECHNICAL SKILLS:
Hadoop/Big Data: HDFS, HBase, MapReduce, YARN, Hive, Pig, Sqoop, Flume, Oozie, Kafka, MongoDB, Zookeeper.
Hadoop Distribution: Cloudera, Hortonworks, Apache
Databases: Oracle 11g, MySQL, MS SQL Server, IBM DB2.
NoSQL Database: Cassandra, MongoDB, HBase.
Java Technologies: Java Servlets, JMS, JUnit.
Programming Languages: C, C++, Java, XML, Unix Shell Scripting, SQL, PLSQL.
IDE/Tools: Eclipse, NetBeans, Oracle.
J2EE Technologies: JSP, JDBC, JNDI, Servlets, Hibernate.
Web Technologies: HTML, DHTML, XML, SOAP, JavaScript, CSS.
Application Servers: WebLogic, WebSphere, Apache Tomcat, JBoss.
Network Protocols: TCP/IP, UDP, HTTP, DNS, DHCP, FTP
Operating Systems: Windows 98/XP/Vista/7/8, UNIX, Linux.
PROFESSIONAL EXPERIENCE:
Senior Hadoop Developer
Confidential, San Francisco, CA
Responsibilities:
- Developed data pipelines using Flume, Sqoop, Pig, and Java MapReduce to ingest customer behavioral data and financial histories into HDFS for analysis.
- Ingested data using Sqoop and HDFS put/copyFromLocal, and developed MapReduce jobs.
- Used Pig for transformations, event joins, filtering bot traffic, and pre-aggregations before storing the data in HDFS.
- Developed Pig UDFs for needed functionality not available out of the box in Apache Pig.
- Used Hive to analyze the partitioned and bucketed data and compute various metrics for reporting.
- Developed Hive DDLs to create, alter, and drop Hive tables.
- Created scalable, high-performance web services for data tracking.
- Managed Solr search engine work, including indexing data, tuning relevance, developing custom tokenizers and filters, and adding functionality such as playlists, custom sorting, and regionalization.
- Loaded data from the UNIX file system into HDFS. Installed and configured Hive, wrote Hive UDFs, and provided cluster coordination services through ZooKeeper.
- Created Hive tables, loaded them with data, and wrote Hive queries that run internally as MapReduce jobs.
- Developed Hive UDFs for needed functionality not available out of the box in Apache Hive.
- Used HCatalog to access Hive table metadata from MapReduce and Pig code.
- Used Java MapReduce to compute metrics that define user experience, revenue, etc.
- Developed data pipelines using Flume, Sqoop, and Pig to extract data from weblogs and store it in HDFS.
- Extracted and updated data in MongoDB using the mongoimport and mongoexport command-line utilities.
- Used Sqoop for importing and exporting data into HDFS.
- Used Eclipse and Ant to build the application. Worked with NoSQL (MongoDB) databases and transformed HDFS data from rows to columns and columns to rows.
- Developed shell scripts to orchestrate execution of the other scripts (Pig, Hive, and MapReduce) and move data files within and outside of HDFS.
Environment: Hadoop, MapReduce, MongoDB, YARN, Hive, Solr, Pig, HBase, Oozie, Sqoop, Flume, Oracle 11g, Core Java, Cloudera, HDFS, Eclipse.
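The Java MapReduce metric computations described above can be sketched outside a Hadoop cluster with plain Java collections; the record layout ("userId,amount") and class name here are hypothetical illustrations, not code from the projects listed.

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

// Minimal map/reduce-style sketch of a per-user revenue metric,
// mirroring the mapper (parse) and reducer (sum) phases in plain Java.
public class RevenueMetric {

    // "Map" phase: parse a raw log line "userId,amount" into a key/value pair.
    static Map.Entry<String, Double> map(String line) {
        String[] parts = line.split(",");
        return Map.entry(parts[0], Double.parseDouble(parts[1]));
    }

    // "Reduce" phase: group by user and sum amounts, as a reducer would.
    static Map<String, Double> reduce(List<String> lines) {
        return lines.stream()
                .map(RevenueMetric::map)
                .collect(Collectors.groupingBy(
                        Map.Entry::getKey,
                        Collectors.summingDouble(Map.Entry::getValue)));
    }

    public static void main(String[] args) {
        List<String> log = List.of("u1,10.0", "u2,5.5", "u1,2.5");
        System.out.println(reduce(log)); // per-user revenue totals
    }
}
```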
Hadoop Developer
Confidential, Dallas TX
Responsibilities:
- Worked extensively in creating Map Reduce jobs to power data for search and aggregation.
- Designed a data warehouse using Hive.
- Analyzed system failures, identified root causes, and recommended courses of action.
- Processed data using Hive and MapReduce and loaded it into HDFS.
- Worked with the systems engineering team to plan and deploy new Hadoop environments and expand existing Hadoop clusters.
- Monitored workload, job performance and capacity planning using Cloudera Manager.
- Worked extensively with Sqoop for importing metadata from Oracle.
- Worked with application teams to install operating system and Hadoop updates, patches, and version upgrades as required.
- Extensively used Pig for data cleansing.
- Created partitioned tables in Hive.
- Worked with business teams and created Hive queries for ad hoc access.
- Evaluated usage of Oozie for Workflow Orchestration.
- Mentored analyst and test teams on writing Hive queries.
- Wrote MapReduce programs with the Java API to cleanse structured and unstructured data.
- Experience with RDBMSs such as Oracle and Teradata.
- Worked on loading the data from MySQL to HBase where necessary using Sqoop.
- Launched Amazon EC2 cloud instances from Amazon Machine Images (Linux/Ubuntu) and configured the launched instances for specific applications.
- Gained very good business knowledge on claim processing, fraud suspect identification, appeals process etc.
Environment: Hadoop, MapReduce, HDFS, Hive, Pig, Sqoop, AWS, Java, Oozie, MySQL.
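The data-cleansing step described above (done with Pig and Java MapReduce in these projects) can be sketched in plain Java; the three-field comma-separated record layout is a hypothetical example, not the actual schema.

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

// Sketch of record cleansing: normalize whitespace, drop blank rows,
// and keep only well-formed three-field records.
public class Cleanser {
    static List<String> cleanse(List<String> rows) {
        return rows.stream()
                .map(String::trim)                          // normalize whitespace
                .filter(r -> !r.isEmpty())                  // drop blank rows
                .filter(r -> r.split(",", -1).length == 3)  // keep well-formed records
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<String> raw = Arrays.asList(" a,b,c ", "", "bad", "x,y,z");
        System.out.println(cleanse(raw)); // only the two well-formed rows remain
    }
}
```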
J2EE Developer
Confidential, San Ramon, CA
Responsibilities:
- Actively participated in requirements gathering, analysis, design, and testing phases.
- Designed use case diagrams, class diagrams, and sequence diagrams as a part of Design Phase.
- Developed the entire application implementing MVC Architecture integrating JSF with Hibernate and Spring frameworks.
- Developed the Enterprise Java Beans (Stateless Session beans) to handle different transactions such as online funds transfer, bill payments to the service providers.
- Implemented Service Oriented Architecture (SOA) using JMS for sending and receiving messages while creating web services.
- Used ANT scripts to build the application and deployed on WebSphere Application Server.
- Developed XML documents and generated XSL files for Payment Transaction and Reserve Transaction systems.
- Developed Web Services for data transfer from client to server and vice versa using Apache Axis, SOAP and WSDL.
- Used JUnit Framework for the unit testing of all the java classes.
- Used DAO and JDBC for database access.
- Implemented various J2EE Design patterns like Singleton, Service Locator, DAO, and SOA.
- Worked on AJAX to develop an interactive Web Application and JavaScript for Data Validations.
Environment: J2EE, JDBC, Java 1.4, Servlets, JSP, Struts, Hibernate, Web services, SOAP, WSDL, Design Patterns, MVC, HTML, WebLogic 9.0, XML, JUnit, Oracle 10g, WebSphere, MyEclipse.
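Of the J2EE design patterns mentioned above, the Singleton can be sketched briefly; the initialization-on-demand holder idiom shown here is one common thread-safe variant, and the class name ConfigRegistry is a hypothetical example.

```java
// Singleton via the initialization-on-demand holder idiom:
// the JVM guarantees Holder is loaded lazily and at most once.
public class ConfigRegistry {
    private ConfigRegistry() { }  // prevent external instantiation

    private static class Holder {
        static final ConfigRegistry INSTANCE = new ConfigRegistry();
    }

    public static ConfigRegistry getInstance() {
        return Holder.INSTANCE;
    }

    public static void main(String[] args) {
        // Both calls return the same object.
        System.out.println(getInstance() == getInstance()); // true
    }
}
```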
Java/J2EE Developer
Confidential
Responsibilities:
- Involved in the analysis, design, development, and testing phases of the Software Development Lifecycle (SDLC) using agile development methodology.
- Developed the application under J2EE architecture; designed dynamic and browser-compatible user interfaces using JSP, Custom Tags, HTML, CSS, and JavaScript.
- Actively participated in requirements gathering, analysis, design, and testing phases.
- Used EJBs (Session beans) to implement the business logic, JMS for communication for sending updates to various other applications and MDB for routing priority requests.
- Implemented Service Oriented Architecture (SOA) using JMS for sending and receiving messages while creating web services.
- Wrote all the business logic in all modules in core Java.
- Deployed and maintained the JSP and Servlet components on WebLogic 8.0.
- Developed the application server persistence layer using JDBC, SQL, and Hibernate.
- Used JDBC to connect the web applications to databases.
- Designed and developed user interface using JSP, HTML and JavaScript.
- Validated the fields of user registration screen and login screen by writing JavaScript validations.
- Worked with field-level engineers and teams to make the application more user-friendly. Performed GUI and back-end testing.
- Developed and utilized J2EE Services and JMS components for messaging communication in WebLogic.
- Configured the development environment using the WebLogic application server for developers' integration testing.
Environment: Java/J2EE, SQL, Oracle 10g, JSP 2.0, EJB, SOA, AJAX, JavaScript, WebLogic 9.0, HTML, JDBC 3.0, XML, JMS, WebSphere, Servlets, MyEclipse.
Java Developer
Confidential
Responsibilities:
- Responsible and active in the analysis, design, implementation and deployment of full Software Development Lifecycle (SDLC) of the project.
- Defined the search criteria, pulled the customer's record from the database, made the required changes, and saved the updated record back to the database.
- Developed Struts action classes, action forms and performed action mapping using Struts framework and performed data validation in form of beans and action classes.
- Designed and developed user interface using JSP, HTML and JavaScript.
- Extensively used Struts framework as the controller to handle subsequent client requests and invoke the model based upon user requests.
- Validated the fields of user registration screen and login screen by writing JavaScript validations.
- Designed and developed XML processing components for dynamic menus in the application.
- Developed build and deployment scripts using Apache Ant to customize WAR and EAR files.
- Developed stored procedures and triggers in PL/SQL to calculate and update tables implementing business logic.
- Involved in postproduction support and maintenance for dynamic menus on the application.
Environment: Oracle 10g, Java 1.4, Struts, Servlets, HTML, XML, SQL, J2EE, JUnit, Tomcat 6.
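The login and registration field validations described above were written in JavaScript in the original application; a plain-Java sketch of equivalent rules is shown here, with the specific patterns (email format, 3-16 character usernames) as hypothetical examples.

```java
import java.util.regex.Pattern;

// Sketch of server-side equivalents of the client-side field validations.
public class FieldValidator {
    // Simple illustrative email pattern, not a full RFC 5322 validator.
    private static final Pattern EMAIL =
            Pattern.compile("^[\\w.+-]+@[\\w-]+\\.[\\w.]+$");

    static boolean validEmail(String s) {
        return s != null && EMAIL.matcher(s).matches();
    }

    static boolean validUsername(String s) {
        // Hypothetical rule: 3-16 letters, digits, or underscores.
        return s != null && s.matches("[A-Za-z0-9_]{3,16}");
    }

    public static void main(String[] args) {
        System.out.println(validEmail("user@example.com")); // true
        System.out.println(validUsername("ab"));            // false: too short
    }
}
```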