Java/Hadoop Developer Resume
SUMMARY
- 7 years of total IT experience in Java development, web application development, and database management.
- 3 years of experience installing, configuring, and managing Hadoop clusters using Cloudera Manager and Apache Ambari.
- Experience with the Hadoop stack (HBase, Oozie, Sqoop, Hive, Flume, Pig).
- Experience integrating HBase with Hive and Pig.
- Experience with Hadoop versions 0.22 and 1.1.
- Well versed in programming in C, Java, and shell scripting.
- In-depth understanding of the HDFS architecture and the MapReduce framework.
- Proficient in using Apache Sqoop to import data into and export data out of HDFS, with an RDBMS or Hive at the other end.
- Well versed in writing Apache Oozie workflows with MapReduce and Hive actions.
- Developed Pig Latin scripts for data cleansing.
- Developed industry-specific UDFs (user-defined functions).
- Worked with the Cloudera Big Data distribution (CDH 3 and 4); also familiar with other distributions.
- Created custom classes to store mapper output in MapReduce.
- Hands-on experience working with sequence files, RC files, combiners, counters, dynamic partitions, and bucketing for best practices and performance improvement.
- Worked with join patterns and implemented map-side and reduce-side joins in MapReduce using the distributed cache.
- Experience creating external tables in Hive.
- Working knowledge of and experience with the Agile methodology; have performed the role of Scrum Master.
- Experienced working with Avro data files using the Avro serialization system.
- Extensive experience creating class diagrams, activity diagrams, and sequence diagrams using the Unified Modeling Language (UML).
- Experience with middleware architectures using Java technologies such as J2EE, JSP, and Servlets, and application servers such as WebSphere and WebLogic.
- Java web services experience using SOAP, WSDL, Axis2, and UDDI in Service-Oriented Architecture (SOA), as well as RESTful web services.
- Extensive knowledge of the XSLT and HTML specifications.
- Experience programming with the Struts and Spring frameworks.
- Experience working with Spring using AOP, IoC, and JdbcTemplate.
- Hands-on experience using the Hibernate and JPA ORM tools for object mapping with databases.
- Working experience with databases such as Oracle and MySQL.
- Good working knowledge of Struts 2, WebLogic, WebSphere, and the JDK.
- Experience writing test cases using the JUnit framework.
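The map-side join pattern listed above can be sketched in plain Java. This is a simplified illustration (the record format and field names are hypothetical): the in-memory lookup table stands in for the small data set that a real mapper would load from the distributed cache in its setup() method.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

/** Simplified map-side join: the small table is held in memory, as it
 *  would be after loading it from the distributed cache in setup(). */
public class MapSideJoinSketch {

    // In a real mapper this map would be populated in setup() from a
    // file shipped via the DistributedCache.
    static Map<String, String> deptById = new HashMap<>();

    /** "map" step: join each employee record against the in-memory table. */
    static List<String> map(List<String> employeeRecords) {
        List<String> out = new ArrayList<>();
        for (String rec : employeeRecords) {
            String[] f = rec.split(",");                     // empId,name,deptId
            String dept = deptById.getOrDefault(f[2], "UNKNOWN");
            out.add(f[1] + "\t" + dept);                     // emit name -> department
        }
        return out;
    }

    public static void main(String[] args) {
        deptById.put("d1", "Claims");
        deptById.put("d2", "Billing");
        System.out.println(map(List.of("1,Alice,d1", "2,Bob,d2")));
    }
}
```

Because the join happens entirely in the map phase, no shuffle of the large table is needed, which is the point of preferring a map-side join when one input fits in memory.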
TECHNICAL SKILLS
Hadoop Ecosystem: HDFS, MapReduce, Hive, Pig, Sqoop, Flume, ZooKeeper
Scripting Languages: Shell Scripting, Python
Relational Databases: Oracle 10g/11g, MySQL
Languages: C++, Java, HTML, CSS, JavaScript, R
Operating Systems: Linux (RedHat, CentOS), Windows
IDE: Visual Studio, Eclipse, IDLE
Web Tools: HTML, XML, JDBC, EJB, JSON, JSP, Servlets, Struts, REST API, JMS, Spring, and Hibernate
Tools: SPSS, SAS, Documentum
PROFESSIONAL EXPERIENCE
Confidential, Kansas City, MO
Java/Hadoop Developer
Responsibilities:
- Developed a Java MapReduce program to extract the required information from semi-structured price claim data.
- Loaded data into the Hive data warehouse and extracted data based on requirements.
- Implemented partitioning, dynamic partitions, and bucketing in Hive for decomposing data sets.
- Created managed and external tables in Hive and used various Hive joins to merge data sets.
- Stored and processed sparse data for real-time access in HBase.
- Created Sqoop scripts to capture structured claims and rate sheet data.
- Created Hadoop workflows and scheduled jobs using Oozie.
- Responsible for building a scalable distributed data solution using Cassandra.
- Created Hive scripts to extract, transform, and load (ETL) the data.
- Used Hive for generating reports from historical accident information.
- Designed the table architecture and developed the DAO layer using the Cassandra NoSQL database.
- Developed the request/response paradigm using Spring controllers with Inversion of Control and Dependency Injection via Spring MVC.
- Used R to cluster the data to find patterns and visualize the data.
- Extensively used the IoC and AOP concepts of the Spring Framework during development.
- Developed application service components and configured beans using Spring; created Hibernate mapping files and generated the database schema.
- Created a web application using the Ext JS JavaScript library to show subscriptions and relay station messages.
- Developed RESTful web services using Jersey and invoked those using Ext JS Ajax requests.
- Developed data access classes using Hibernate.
- Used Ant for compiling and creating WAR files, PVCS for version control, and WebLogic as the application server.
- Used XML technologies such as DOM for transferring data.
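The shape of the claim-extraction MapReduce job described above can be simulated in plain Java. This is a hedged sketch, not the actual program: the record format (claimId|memberId|amount) and field positions are made up, and the map/reduce phases are modeled as ordinary functions rather than Hadoop Mapper/Reducer classes.

```java
import java.util.List;
import java.util.Map;
import java.util.Optional;
import java.util.TreeMap;

/** Plain-Java simulation of the map and reduce phases used to pull
 *  price fields out of semi-structured claim records. */
public class ClaimExtractSketch {

    /** map: parse one record, emit (memberId, amount); skip malformed lines. */
    static Optional<Map.Entry<String, Double>> map(String record) {
        String[] f = record.split("\\|");
        if (f.length != 3) return Optional.empty();          // semi-structured: tolerate bad rows
        try {
            return Optional.of(Map.entry(f[1], Double.parseDouble(f[2])));
        } catch (NumberFormatException e) {
            return Optional.empty();
        }
    }

    /** reduce: sum amounts per member, as a reducer would over grouped values. */
    static Map<String, Double> reduce(List<String> records) {
        Map<String, Double> totals = new TreeMap<>();
        for (String r : records) {
            map(r).ifPresent(e -> totals.merge(e.getKey(), e.getValue(), Double::sum));
        }
        return totals;
    }

    public static void main(String[] args) {
        List<String> records = List.of("c1|m1|100.0", "c2|m1|50.0", "bad-row", "c3|m2|75.0");
        System.out.println(reduce(records)); // {m1=150.0, m2=75.0}
    }
}
```

Tolerating malformed rows in the map step, rather than failing the job, is the usual approach when the input is only semi-structured.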
Environment: Apache Hadoop, Pig, Hive, Sqoop, Cassandra, NoSQL, Big Data, HBase, ZooKeeper, Cloudera, CentOS, Sencha Ext JS, Ajax, JavaScript, Java 6, Spring, Hibernate, JMS, WebLogic Application Server, Eclipse, Ant, PVCS, REST, Jersey, Web Services, HP Service Manager, Project Server, Unix, Windows
Confidential, Houston, TX
J2EE / Hadoop Developer
Responsibilities:
- Involved in implementing a seven-node CDH4 Hadoop cluster on Red Hat Linux.
- Worked on pulling data from Oracle databases into the Hadoop cluster using Sqoop import.
- Worked with Flume to import log data from reaper logs and syslogs into the Hadoop cluster.
- Pre-processed data and created fact tables using Hive.
- Exported the resulting data set to SQL Server for further analysis.
- Created Hive scripts to extract, transform, and load (ETL) the data.
- Automated all jobs, from pulling data out of databases to loading data into SQL Server, using shell scripts.
- Implemented partitioning, dynamic partitions, and buckets in Hive.
- Responsible for developing Pig Latin scripts.
- Used Ganglia to monitor the cluster around the clock.
- Enabled speedy reviews and first-mover advantages by using Oozie to automate data loading into HDFS and Pig to pre-process the data.
- Worked on importing and exporting data into HDFS and Hive using Sqoop.
- Maintained the existing code base, developed in the Struts, Spring, and Hibernate frameworks, by incorporating new features and fixing bugs.
- Created web.xml and validation.xml files to integrate components in the Struts framework.
- Used the Spring Framework to develop the application using the MVC pattern, decoupling the business logic from the data.
- Developed XML files and DTDs, and parsed them using a SAX parser.
- Wrote a message handler adapter for enterprise calls using Message-Driven Beans, JMS, and XML.
- Used the Spring Framework for dependency injection to achieve loose coupling.
- Used the Spring Framework with Hibernate to map objects to the database.
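The SAX parsing mentioned above can be illustrated with the JDK's built-in SAX support. This is a minimal sketch; the `<order>`/`<item>` element names are invented for the example and are not from the actual project.

```java
import javax.xml.parsers.SAXParserFactory;
import org.xml.sax.Attributes;
import org.xml.sax.helpers.DefaultHandler;
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

/** Minimal SAX handler of the kind used to parse enterprise XML messages. */
public class SaxSketch extends DefaultHandler {
    List<String> items = new ArrayList<>();
    StringBuilder text = new StringBuilder();

    @Override public void startElement(String uri, String local, String qName, Attributes a) {
        text.setLength(0);               // reset the buffer at each element start
    }
    @Override public void characters(char[] ch, int start, int len) {
        text.append(ch, start, len);     // SAX may deliver text in multiple chunks
    }
    @Override public void endElement(String uri, String local, String qName) {
        if (qName.equals("item")) items.add(text.toString().trim());
    }

    public static List<String> parse(String xml) {
        try {
            SaxSketch handler = new SaxSketch();
            SAXParserFactory.newInstance().newSAXParser()
                .parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)), handler);
            return handler.items;
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        String xml = "<order><item>widget</item><item>gadget</item></order>";
        System.out.println(parse(xml)); // [widget, gadget]
    }
}
```

Unlike DOM, SAX streams through the document and never builds the whole tree in memory, which is why it suits large enterprise message feeds.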
Environment: Linux, Java, MapReduce, HDFS, Oracle, SQL Server, Tableau, Hive, Pig, Sqoop, Cloudera Manager, Spring, Hibernate, Struts.
Confidential
Hadoop developer
Responsibilities:
- Installed and configured Apache Hadoop to test the maintenance of log files in a Hadoop cluster.
- Installed and configured Hive, Pig, Sqoop, Flume, and Oozie on the Hadoop cluster.
- Installed the Oozie workflow engine to run multiple Hive and Pig jobs.
- Set up and benchmarked Hadoop/HBase clusters for internal use.
- Developed Java MapReduce programs for the analysis of sample log files stored in the cluster.
- Developed simple to complex MapReduce jobs using Hive and Pig.
- Developed MapReduce programs for data analysis and data cleaning.
- Developed Pig Latin scripts for the analysis of semi-structured data.
- Developed Pig UDFs (including UDTFs and UDAFs) for manipulating data according to business requirements, and worked on developing custom Pig loaders.
- Implemented a data pipeline by chaining multiple mappers using ChainMapper.
- Used Hive and created Hive tables and involved in data loading and writing Hive UDFs.
- Used Sqoop to import data into HDFS and Hive from other data systems.
- Migrated ETL processes from Oracle to Hive to test ease of data manipulation.
- Developed Web services using SOAP, JAX-WS and WSDL.
- Developed the persistence layer using the Hibernate ORM to transparently store objects in the database.
- Used Hibernate to map the database tables using hbm.xml files.
- Used both SAX and DOM parser for XML related development.
- Used web services (WSDL and SOAP) for getting credit card information from third parties.
- Created Java interfaces and abstract classes for different functionalities.
- Implemented multithreading concepts in Java classes to avoid deadlocks.
- Developed data access classes using Hibernate.
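The deadlock-avoidance idea mentioned above is commonly implemented by fixing a global lock-acquisition order. This is a generic sketch (the account names and amounts are invented), not the project's actual code: because both transfer directions take the locks in the same order, the circular-wait condition for deadlock can never arise.

```java
/** Sketch of deadlock avoidance via a fixed global lock order. */
public class LockOrderSketch {
    static int balanceA = 100, balanceB = 100;
    static final Object lockA = new Object(), lockB = new Object();

    static void transferAtoB(int amount) {
        // Both transfer directions take lockA first, then lockB.
        synchronized (lockA) {
            synchronized (lockB) {
                balanceA -= amount;
                balanceB += amount;
            }
        }
    }

    static void transferBtoA(int amount) {
        synchronized (lockA) {          // same order as above, NOT lockB first
            synchronized (lockB) {
                balanceB -= amount;
                balanceA += amount;
            }
        }
    }

    public static void main(String[] args) throws InterruptedException {
        Thread t1 = new Thread(() -> { for (int i = 0; i < 1000; i++) transferAtoB(1); });
        Thread t2 = new Thread(() -> { for (int i = 0; i < 1000; i++) transferBtoA(1); });
        t1.start(); t2.start();
        t1.join(); t2.join();
        System.out.println(balanceA + balanceB); // 200 (total conserved, no deadlock)
    }
}
```

Had `transferBtoA` locked `lockB` first, two threads running opposite transfers could each hold one lock while waiting on the other, and the program could hang forever.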
Environment: Apache Hadoop, HDFS, Cloudera Manager, Java, MapReduce, Eclipse Indigo, Hive, Pig, Sqoop, Oozie, SQL, Hibernate, Spring.
Confidential, Warren, NJ
Java/J2EE Developer
Responsibilities:
- Actively involved in UI design for application.
- Used Struts validation logic to generate user-friendly error messages from the application properties.
- Developed all JSP pages for the application.
- Used JSTL and a custom tag library with frameworks such as Ajax and jQuery to build interactive and attractive user interfaces.
- Developed the user interface screens using HTML, JSP, and Ajax.
- Worked on the GUI with the jQuery JavaScript framework and XML.
- Used JSON objects to transfer data between controllers and web services.
- Used the Log4j logging framework; log messages at various levels are written throughout the Java code.
- Used the MVC model to create the object model.
- Developed server-side code using Struts and Servlets.
- Created the struts-config.xml file for the ActionServlet to extract data from the specified ActionForm and send it to the specified Action class instance.
- Extensively used JSP, CSS, XML, XSL, and Servlets for the presentation layer.
- Responsible for testing and deploying the application to pre-production and production servers.
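The level-based log filtering described above can be illustrated without a Log4j dependency; as a stand-in, this sketch uses the JDK's own java.util.logging, which follows the same level hierarchy idea (messages below the configured threshold are suppressed). The log messages themselves are invented for the example.

```java
import java.util.logging.Level;
import java.util.logging.Logger;

/** Level-based filtering: only messages at or above the logger's
 *  configured level are emitted. */
public class LoggingSketch {
    private static final Logger log = Logger.getLogger(LoggingSketch.class.getName());

    public static void main(String[] args) {
        log.setLevel(Level.WARNING);             // messages below WARNING are suppressed
        log.info("user clicked submit");         // filtered out
        log.warning("form validation failed");   // emitted
        log.severe("payment service down");      // emitted
        System.out.println(log.isLoggable(Level.INFO));    // false
        System.out.println(log.isLoggable(Level.WARNING)); // true
    }
}
```

Log4j works the same way with its DEBUG/INFO/WARN/ERROR levels, which is what makes raising the level in production a cheap way to cut log volume.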
Environment: Java, Struts, JSP, jQuery, CSS, HTML, XML, Oracle, Tomcat, Eclipse, Linux, Windows, Spring, Hibernate.
Confidential
JAVA Developer
Responsibilities:
- Involved in the analysis, design, implementation, and testing of the project.
- Implemented the presentation layer with HTML, XHTML and JavaScript.
- Involved in creating object-to-relational mappings using Hibernate.
- Developed web components using JSP, Servlets and JDBC.
- Implemented database using SQL Server.
- Consumed Web Services for transferring data between different applications.
- Designed tables and indexes.
- Wrote complex SQL and stored procedures.
- Involved in fixing bugs and unit testing with test cases using JUnit.
- Developed user and technical documentation.
Environment: Java, SQL, Servlets, HTML, XML, JavaScript, Spring, Hibernate.