Java Developer Resume
SUMMARY
- Over 10 years of professional IT experience, including Big Data and Hadoop ecosystem technologies in the Banking, Retail, Insurance, and Communication sectors.
- Well versed in installing, configuring, supporting, and managing Big Data workloads and the underlying infrastructure of Hadoop clusters.
- Hands-on experience with major Hadoop ecosystem components, including Hadoop MapReduce, HDFS, Hive, Pig, HBase, ZooKeeper, Sqoop, Oozie, Flume, and Avro.
- Excellent understanding and knowledge of NoSQL databases such as MongoDB, HBase, and Cassandra, as well as Solr/Lucene.
- Responsible for setting up processes for Hadoop-based application design and implementation.
- Experience importing and exporting data with Sqoop between HDFS and relational database systems, in both directions.
- Good understanding of Data Structures and Algorithms.
- Experience in managing and reviewing Hadoop log files.
- Very good experience with the complete project life cycle (design, development, testing, and implementation) of client-server and web applications.
- Experience in Object-Oriented Analysis and Design (OOAD) and software development using UML methodology; good knowledge of J2EE and Core Java design patterns.
- Experience in managing Hadoop clusters using Cloudera Manager tool.
- Hands on experience in application development using Java, RDBMS, and Linux shell scripting.
- Extensive development expertise in SQL and PL/SQL.
- Extensive expertise working with Oracle, DB2, SQL Server, and MySQL databases.
- Experience in Java, JSP, Servlets, EJB, WebLogic, WebSphere, Hibernate, Spring, JBoss, JDBC, RMI, JavaScript, Ajax, jQuery, XML, and HTML.
- Knowledge of UNIX and database administration.
- Determined, committed, and hardworking, with strong communication, interpersonal, and organizational skills.
- Ability to work in a team and to coordinate and resolve issues with a team of developers and other stakeholders.
TECHNICAL SKILLS
Big Data/Hadoop: HDFS, Hadoop MapReduce, ZooKeeper, Hive, Pig, Sqoop, Flume, Oozie
Project Management Tools: Microsoft Project, MS Excel, MS Access, MS PowerPoint
Requirements Visual Modeling Tools: Rational Rose, MS Visio
Languages: C, C++, Java 6, PHP, SQL, PL/SQL, Python, Perl, Shell Scripting
Other Tools: MS SharePoint, Teradata
Web Technologies: HTML, JavaScript, XML, ODBC, JDBC, JavaBeans, EJB, MVC, Ajax, JSP, Servlets, Struts, JUnit, REST API, Spring, Hibernate
Methodologies: GAP Analysis, Agile, RUP, UML, SDLC
Business Modeling Tools: MS Visio, Rational Rose, MS Project
Databases: HBase, MongoDB, Cassandra, Oracle 10g, MySQL, CouchDB, MS SQL Server
Operating Systems: Windows 98/2000/XP/7, UNIX
PROFESSIONAL EXPERIENCE
Confidential
Responsibilities:
- Analyzed the Hadoop cluster and various big data analytic tools, including Pig, Hive, the HBase database, and Sqoop.
- Responsible for building scalable distributed data solutions using Hadoop; installed and configured Flume, Hive, Pig, Sqoop, and HBase on the Hadoop cluster.
- Managed and scheduled jobs on the Hadoop cluster using Oozie.
- Implemented a nine-node CDH4 Hadoop cluster on Ubuntu Linux.
- Worked on cluster installation, commissioning and decommissioning of DataNodes, NameNode recovery, capacity planning, and slot configuration.
- Set up a Hadoop cluster on Amazon EC2 using Whirr for a proof of concept.
- Managed Hadoop cluster resources, including adding and removing cluster nodes for maintenance and capacity needs.
- Involved in loading data from the UNIX file system into HDFS.
- Created HBase tables to store variable data formats coming from different portfolios.
- Implemented various requirements using Pig scripts.
- Implemented test scripts to support test driven development and continuous integration.
- Responsible for managing data coming from different sources.
- Installed and configured Hive and implemented various business requirements by writing Hive UDFs (a representative UDF sketch follows this section).
- Experienced in loading and transforming large sets of structured, semi-structured, and unstructured data.
- Provided cluster coordination services through ZooKeeper.
- Managed and reviewed Hadoop log files.
- Exported the analyzed data to relational databases using Sqoop for visualization and to generate reports for the BI team.
- Analyzed large data sets to determine the optimal way to aggregate and report on them.
- Supported setting up the QA environment and updating configurations for implementing scripts with Pig and Sqoop.
Environment: Hadoop, HDFS, Hive, Flume, HBase, Sqoop, Pig, Java (JDK 1.6), Eclipse, MySQL, Ubuntu, ZooKeeper
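Code Sketch (illustrative): a minimal example of the kind of Hive UDF mentioned above, using the classic org.apache.hadoop.hive.ql.exec.UDF API; the class name NormalizeCode and the trim/upper-case behavior are hypothetical stand-ins, not the actual project logic.

    import org.apache.hadoop.hive.ql.exec.UDF;
    import org.apache.hadoop.io.Text;

    // Hypothetical Hive UDF: normalizes a free-text code column by trimming
    // whitespace and upper-casing it before downstream aggregation.
    public final class NormalizeCode extends UDF {
        public Text evaluate(Text input) {
            if (input == null) {
                return null; // pass NULLs through unchanged
            }
            return new Text(input.toString().trim().toUpperCase());
        }
    }

Such a UDF is packaged into a JAR, registered in Hive with ADD JAR and CREATE TEMPORARY FUNCTION, and then called from HiveQL like a built-in function.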
Confidential
Responsibilities:
- Involved in review of functional and non-functional requirements.
- Installed and configured Hadoop MapReduce and HDFS; developed multiple MapReduce jobs in Java for data cleaning and preprocessing (a cleaning-job sketch follows this section).
- Installed and configured Pig and wrote Pig Latin scripts.
- Wrote MapReduce jobs using Pig Latin.
- Solid understanding of the REST architectural style and its application to high-performing web sites for global usage.
- Involved in ETL, data integration, and migration; imported data using Sqoop to load data from Oracle into HDFS on a regular basis.
- Developed scripts and batch jobs to schedule various Hadoop programs.
- Wrote Hive queries for data analysis to meet the business requirements.
- Created Hive tables and worked with them using HiveQL; imported and exported data between HDFS and the Oracle database using Sqoop.
- Experienced in defining job flows.
- Gained good experience with the NoSQL databases HBase and MongoDB.
- Built a hybrid implementation using Oracle and MongoDB (a driver sketch also follows this section).
- Involved in creating Hive tables, loading the data, and writing Hive queries that run internally as MapReduce jobs.
- Developed a custom FileSystem plugin for Hadoop so it can access files on the data platform.
- The custom FileSystem plugin allows Hadoop MapReduce programs, HBase, Pig, and Hive to work unmodified and access files directly.
- Designed and implemented a MapReduce-based, large-scale parallel relation-learning system. Extracted feeds from social media sites such as Facebook and Twitter using Python scripts.
- Set up and benchmarked Hadoop/HBase clusters for internal use.
Environment: Hadoop, MapReduce, HDFS, Hive, Cloudera distribution of Hadoop, Pig, HBase, Linux, XML, Java 6, Eclipse, Oracle 10g, PL/SQL, MongoDB, Toad
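Code Sketch (illustrative): a minimal map-only cleaning job of the kind described above, written against the org.apache.hadoop.mapreduce API; the comma delimiter and five-column record width are assumptions for the example, not the actual feed layout.

    import java.io.IOException;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.NullWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    // Map-only job: drops malformed rows and trims every field.
    public class CleanRecordsJob {
        public static class CleanMapper extends Mapper<LongWritable, Text, NullWritable, Text> {
            private static final int EXPECTED_COLUMNS = 5; // assumed record width

            protected void map(LongWritable key, Text value, Context context)
                    throws IOException, InterruptedException {
                String[] fields = value.toString().split(",", -1);
                if (fields.length != EXPECTED_COLUMNS) {
                    return; // skip malformed records
                }
                StringBuilder cleaned = new StringBuilder();
                for (int i = 0; i < fields.length; i++) {
                    if (i > 0) cleaned.append(',');
                    cleaned.append(fields[i].trim());
                }
                context.write(NullWritable.get(), new Text(cleaned.toString()));
            }
        }

        public static void main(String[] args) throws Exception {
            Job job = new Job(new Configuration(), "clean-records");
            job.setJarByClass(CleanRecordsJob.class);
            job.setMapperClass(CleanMapper.class);
            job.setNumReduceTasks(0); // map-only: mapper output is the final output
            job.setOutputKeyClass(NullWritable.class);
            job.setOutputValueClass(Text.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }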
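Code Sketch (illustrative): writing a document to MongoDB with the 2.x-era Java driver, as in the hybrid Oracle/MongoDB work above; the host, database, and collection names are placeholders.

    import com.mongodb.BasicDBObject;
    import com.mongodb.DB;
    import com.mongodb.DBCollection;
    import com.mongodb.Mongo;

    // Inserts one document into a MongoDB collection (2.x-era driver API).
    public class MongoWriteExample {
        public static void main(String[] args) throws Exception {
            Mongo mongo = new Mongo("localhost", 27017); // placeholder host/port
            DB db = mongo.getDB("hybridStore");          // placeholder database
            DBCollection events = db.getCollection("events");
            BasicDBObject doc = new BasicDBObject("source", "oracle")
                    .append("payload", "example record");
            events.insert(doc);
            mongo.close();
        }
    }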
Confidential
Responsibilities:
- Installed and configured Hadoop MapReduce and HDFS; developed multiple MapReduce jobs in Java for data cleaning and preprocessing.
- Importing and exporting data into HDFS and Hive using Sqoop.
- Involved in defining job flows.
- Involved in managing and reviewing Hadoop log files.
- Loaded and transformed large sets of structured, semi-structured, and unstructured data.
- Responsible for managing data coming from different sources.
- Responsible for implementing MongoDB to store and analyze unstructured data.
- Supported MapReduce programs running on the cluster.
- Involved in loading data from the UNIX file system into HDFS.
- Installed and configured Hive and wrote Hive UDFs.
- Involved in creating Hive tables, loading them with data, and writing Hive queries that run internally as MapReduce jobs.
- Implemented a CDH3 Hadoop cluster on CentOS.
- Worked on cluster installation, commissioning and decommissioning of DataNodes, NameNode recovery, capacity planning, and slot configuration.
- Created HBase tables to store variable data formats of PII data coming from different portfolios (a table-creation sketch follows this section).
- Implemented best income logic using Pig scripts.
- Loaded and transformed large sets of structured, semi-structured, and unstructured data.
- Provided cluster coordination services through ZooKeeper.
- Exported the analyzed data to relational databases using Sqoop for visualization and to generate reports for the BI team.
- Supported setting up the QA environment and updating configurations for implementing scripts with Pig and Sqoop.
- Designed the logical and physical data model, generated DDL scripts, and wrote DML scripts for an Oracle 9i database.
- Used the Hibernate ORM framework with the Spring framework for data persistence and transaction management; used the Struts validation framework for form-level validation.
- Wrote test cases in JUnit for unit testing of classes.
- Involved in developing templates and screens in HTML and JavaScript.
Environment: Hadoop, MapReduce, HDFS, Hive, Java, SQL, Pig, ZooKeeper, MongoDB, Sqoop, CentOS, Solr, Struts 1.3, JSP, Servlets 2.5, WebSphere 6.1, HTML, XML, JavaScript
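Code Sketch (illustrative): creating an HBase table of the kind described above with the HBase 0.90-era client API; the table name portfolio_data and the raw/parsed column families are hypothetical.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.HColumnDescriptor;
    import org.apache.hadoop.hbase.HTableDescriptor;
    import org.apache.hadoop.hbase.client.HBaseAdmin;

    // Creates one table with a family for raw records and a family for parsed
    // fields, so differently formatted portfolio feeds can share a schema.
    public class CreatePortfolioTable {
        public static void main(String[] args) throws Exception {
            Configuration conf = HBaseConfiguration.create();
            HBaseAdmin admin = new HBaseAdmin(conf);
            if (!admin.tableExists("portfolio_data")) {
                HTableDescriptor table = new HTableDescriptor("portfolio_data");
                table.addFamily(new HColumnDescriptor("raw"));    // record as received
                table.addFamily(new HColumnDescriptor("parsed")); // normalized fields
                admin.createTable(table);
            }
        }
    }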
Confidential
Responsibilities:
- Involved in the analysis of the existing credit card processing system, the mapping phase according to functionality, and the data conversion procedure.
- Created use-case and sequence diagrams, functional specifications, and user-interface diagrams using StarUML.
- Involved in complete requirement analysis, design, coding and testing phases of the project.
- Participated in JAD meetings to gather requirements and understand the end users' system.
- Developed user interfaces using JSP, HTML, XML and JavaScript.
- Generated XML schemas and used XMLBeans to parse XML files.
- Created stored procedures and functions; used JDBC to process database calls to DB2/AS400 and SQL Server databases (a stored-procedure call sketch follows this section).
- Developed the code that creates XML files and flat files with the data retrieved from databases and XML files.
- Developed web application called iHUB (integration hub) to initiate all the interface processes using Struts Framework, JSP and HTML.
- Developed the interfaces using Eclipse 3.1.1 and JBoss 4.1; involved in integration testing, bug fixing, and production support.
Environment: Java 1.3, Servlets, JSPs, JavaMail API, JavaScript, HTML, MySQL 2.1, Swing, Java Web Server 2.0, JBoss 2.0, RMI, Rational Rose, Red Hat Linux 7.1.
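Code Sketch (illustrative): the shape of a JDBC stored-procedure call against DB2, as in the work above; the connection URL, credentials, and the procedure name GET_CARD_STATUS are placeholders, not actual project values.

    import java.sql.CallableStatement;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Types;

    // Invokes a stored procedure and reads its OUT parameter via JDBC.
    public class CardStatusDao {
        public String fetchStatus(String cardNumber) throws Exception {
            Connection conn = DriverManager.getConnection(
                    "jdbc:db2://dbhost:50000/CARDS", "user", "password"); // placeholders
            try {
                CallableStatement call = conn.prepareCall("{call GET_CARD_STATUS(?, ?)}");
                call.setString(1, cardNumber);               // IN: card number
                call.registerOutParameter(2, Types.VARCHAR); // OUT: status text
                call.execute();
                return call.getString(2);
            } finally {
                conn.close();
            }
        }
    }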
Confidential
Java Developer
Responsibilities:
- Gathered system requirements from various departments through surveys and interviews.
- Designed and developed a Struts-like MVC 2 web framework using the front-controller design pattern, which has been used successfully in a number of production systems (a front-controller sketch follows this section).
- Normalized the Oracle database, conforming to design concepts and best practices.
- Resolved product complications at customer sites and funneled the insights to the development and deployment teams to shape a long-term product development strategy with minimal roadblocks.
- Persuaded business users and analysts to adopt alternative solutions that were more robust and simpler to implement from a technical perspective while still satisfying the functional requirements from the business perspective. Played a crucial role in developing the persistence layer.
- Developed database applications using SQL and PL/SQL.
- Applied design patterns and OO design concepts to improve the existing Java/J2EE-based code base.
- Identified and fixed transactional issues due to incorrect exception handling and concurrency issues due to unsynchronized blocks of code.
Environment: Java 1.2/1.3, Swing, Applet, Servlet, JSP, custom tags, JNDI, JDBC, XML, XSL, DTD, HTML, CSS, JavaScript, Oracle, DB2, PL/SQL, WebLogic, JUnit, Log4J, and CVS.
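Code Sketch (illustrative): the core of a front-controller servlet like the MVC 2 framework described above, sketched with modern collection syntax; the action paths and JSP locations are hypothetical.

    import java.io.IOException;
    import java.util.HashMap;
    import java.util.Map;
    import javax.servlet.ServletException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    // Front controller: one servlet receives every request, resolves the view
    // for the requested action, and forwards to it (MVC 2 style).
    public class FrontControllerServlet extends HttpServlet {
        private final Map<String, String> actionToView = new HashMap<String, String>();

        public void init() {
            // In the real framework these mappings came from configuration.
            actionToView.put("/listAccounts", "/WEB-INF/jsp/accounts.jsp");
            actionToView.put("/showAccount", "/WEB-INF/jsp/account.jsp");
        }

        protected void service(HttpServletRequest req, HttpServletResponse resp)
                throws ServletException, IOException {
            String view = actionToView.get(req.getPathInfo());
            if (view == null) {
                resp.sendError(HttpServletResponse.SC_NOT_FOUND);
                return;
            }
            req.getRequestDispatcher(view).forward(req, resp);
        }
    }

The servlet would be mapped in web.xml to a wildcard pattern such as /app/*, so getPathInfo() yields the action path.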