Java/J2EE Developer Resume
Boston, MA
SUMMARY
- Overall 8 years of experience in the IT industry, including 4 years of experience with Big Data and Hadoop ecosystem tools.
- 4 years of experience in Java programming with various technologies such as Java, J2EE, JavaScript, and data structures.
- Excellent working knowledge of HBase and data pre-processing using Flume-NG.
- Experience in writing MapReduce jobs using Apache Crunch.
- Hands-on experience installing and configuring Hadoop ecosystem tools such as Flume-NG, HBase, ZooKeeper, Oozie, Hive, Sqoop, Hue, and Pig with CDH 3.x and 4.x.
- Experience in Big Data analysis using Pig and Hive, and understanding of Sqoop and Puppet.
- Experience in developing custom UDFs in Java to extend Hive and Pig Latin functionality.
- Good understanding of HDFS Designs, Daemons, federation and HDFS high availability (HA).
- Strong experience with Test-Driven Development (TDD) using tools such as JUnit and Mockito, along with code coverage tools like Emma.
- Well versed in designing and implementing MapReduce jobs in Java on Eclipse to solve real-world scaling problems.
- Hands-on experience with the Java build tools Apache Maven and Ant.
- Solid understanding of high-volume, high-performance systems.
- Strong Java development skills using J2EE, J2SE, Servlets, JSP, EJB, and JDBC.
- Fair amount of experience using Python with MapReduce.
- Good knowledge of integrating various data sources such as RDBMS, spreadsheets, text files, JSON, and XML files.
- Basic knowledge of UNIX and shell scripting.
- Quick to adapt to new software applications and products; a self-starter with excellent communication skills and a good understanding of business workflow.
TECHNICAL SKILLS
Big Data Technologies: Hadoop, HDFS, Hive, MapReduce, Pig, Sqoop, Flume, Zookeeper, Crunch, Oozie
Scripting Languages: Python, Perl
Programming Languages: C, Java
J2EE Technologies: JSP, Servlets, EJB 3.0, JDBC, Hibernate
NoSQL Databases: HBase, Redis, and Cassandra
Search Engines: Elasticsearch
Build Tools: Maven and Ant
Operating Systems: Linux (CentOS and Ubuntu), Windows XP/7, MS-DOS
Office Tools: MS Office (Excel, Word, PowerPoint)
PROFESSIONAL EXPERIENCE
Confidential, MD
Hadoop Developer
Responsibilities:
- Responsible for loading the customer’s data and event logs from MSMQ into HBase using REST API.
- Responsible for architecting Hadoop clusters with CDH4 on CentOS, managing with Cloudera Manager.
- Involved in installing Hadoop, Hive, HBase, ZooKeeper, Flume, and Oozie.
- Used Oozie to orchestrate the MapReduce jobs that extract the data on a scheduled basis.
- Used Hive to find correlations between customers' browser logs across different sites and analyzed them to build risk profiles for such sites.
- Developed an alert module that triggers alert messages for fraudulent sites.
- Used Maven and Ant to build the application.
- Developed MapReduce jobs in Python and performed unit testing using MRUnit.
- Initiated and successfully completed a proof of concept on Flume for pre-processing, demonstrating increased reliability and easier scalability over traditional MSMQ.
- Responsible for writing code implementing interceptors, serializers, and selectors in Flume to handle PII data.
- Used Redis to store the HBase table mappings to the corresponding attributes and categories.
- Implemented Cassandra to store sensitive fields and masking/encryption values to handle sensitive data.
- Used Elasticsearch to index the HBase entities for an extremely low-latency application.
- Involved in Agile development and Test-Driven Development (TDD) methodology.
Environment: JDK 1.6, CentOS, Flume, HBase, HDFS, Maven, Redis, Cassandra, MapReduce, Hive, Oozie, ZooKeeper.
Confidential, Kansas City, MO
Hadoop Consultant
Responsibilities:
- Installed and configured Cloudera Hadoop on a 24-node cluster.
- Loaded data from Oracle database into HDFS.
- Developed MapReduce pipeline jobs in Apache Crunch to process the data and create the necessary HFiles.
- Loaded the generated HFiles into HBase for faster access to a large customer base without taking a performance hit.
- Used Apache Maven for the project build.
- Performed unit testing of MapReduce jobs on cluster using MRUnit.
- Used Oozie scheduler system to automate the pipeline workflow.
- Actively participated in software development lifecycle (scope, design, implement, deploy, test), including design and code reviews, test development, test automation.
- Implemented data serialization using Apache Avro.
- Involved in story-driven agile development methodology and actively participated in daily scrum meetings.
- Analyzed business requirement documents written in JIRA and participated in peer code reviews in Crucible.
Environment: Cloudera Hadoop, MapReduce, HDFS, Crunch, HBase, Avro, Oozie, Java (JDK 1.6), JIRA, Crucible, GitHub, Maven.
Confidential, Boston, MA
Hadoop Consultant
Responsibilities:
- Installed and configured Hadoop clusters for Dev, QA, and Production environments.
- Installed and configured the Hadoop NameNode HA service using ZooKeeper.
- Installed and configured Hadoop security and access controls using Kerberos and Active Directory.
- Responsible for managing data coming from different sources into HDFS through Sqoop and Flume.
- Troubleshot and monitored Hadoop services using Cloudera Manager.
- Developed multiple MapReduce jobs for preprocessing the data.
- Imported and exported data into HDFS and Hive using Sqoop.
- Developed Oozie workflows to automate the data extraction process from data warehouses.
- Supported, monitored, and tuned MapReduce programs running on the cluster.
Environment: Java 6, Eclipse, Linux, Hadoop, HBase, Sqoop, Pig, Hive, Flume, ZooKeeper.
Confidential, Memphis, TN
Java Developer
Responsibilities:
- Involved in the designing of the project using UML.
- Followed J2EE Specifications in the project.
- Designed the user interface pages in JSP.
- Used XML and XSL for mapping the fields in the database.
- Used JavaScript for client-side validations.
- Created the stored procedures and triggers required for the project.
- Created functions and views in Oracle.
- Enhanced the performance of the whole application using stored procedures and prepared statements.
- Responsible for updating database tables and designing SQL queries using PL/SQL.
- Created bean classes for communicating with database.
- Involved in documentation of the module and project.
- Prepared test cases and test scenarios as per business requirements.
- Involved in bug fixing.
- Unit-tested the coded applications using JUnit.
Environment: Java, JSP, Servlets, J2EE, EJB 3, Java Beans, Oracle, HTML, DHTML, XML, XSL, JavaScript, BEA WebLogic.
Confidential, Minneapolis, MN
Java/J2EE Consultant
Responsibilities:
- Analysis and understanding of business requirements.
- Developed views and controllers for client and manager modules using Spring MVC 3.0 and Spring Core 3.0.
- Implemented business logic using Spring Core 3.0 and Hibernate.
- Performed data operations using Spring ORM wired with Hibernate, and implemented HibernateTemplate and the Criteria API for querying the database.
- Developed an exception handling framework and used Log4j for logging.
- Developed SOAP web services using XML messages, including web services for payment transaction and payment release.
- Developed RESTful web services.
- Created the WSDL and SOAP envelopes.
- Developed and modified database objects as per the requirements.
- Involved in unit and integration testing, bug fixing, acceptance testing with test cases, and code reviews.
Environment: Java/J2EE, JSP, CSS, JavaScript, AJAX, Hibernate, Spring 3.0, XML, Web Services, SOAP, REST, Maven, Rational Rose, HTML, Log4j, JBoss 4.
Confidential, Somerset, NJ
Java J2EE Developer
Responsibilities:
- Coded the business methods according to the IBM Rational Rose UML model.
- Extensively used Core Java, Servlets, JSP and XML.
- Used Struts 1.2 in the presentation tier.
- Generated the Hibernate XML and Java mappings for the schemas.
- Used a DB2 database to store the system data.
- Used Rational Application Developer (RAD) as the Integrated Development Environment (IDE).
- Unit-tested all the components using JUnit.
- Used the Apache Log4j logging framework for trace logging and auditing.
- Used Asynchronous JavaScript and XML (AJAX) for a faster, more interactive front end.
- Used IBM WebSphere as the application server.
- Used IBM Rational ClearCase as the version control system.
Environment: Java 1.6, Servlets, JSP, Struts 1.2, IBM Rational Application Developer (RAD) 6, WebSphere 6.0, iText, AJAX, Rational ClearCase, Rational Rose, Oracle 9i, Log4j.