
Java Developer Resume


Indianapolis, IN

SUMMARY

  • 8+ years of overall experience in applications development, integration, maintenance and support using Java/J2EE, Hadoop platforms.
  • 3+ years of experience in Hadoop platform (Apache, Cloudera and Hortonworks).
  • Experience in setting up Hadoop clusters for the environment.
  • Experience in upgrading existing Hadoop clusters to the latest releases.
  • Experience in Linux file system and its administration.
  • Experience in troubleshooting and maintenance of the cluster.
  • Basic Knowledge of ETL tools.
  • Experienced in using NFS (Network File System) for NameNode metadata backup.
  • Developed Map Reduce programs to perform analysis.
  • Expertise in understanding and solving big data problems; planning, designing and creating Hadoop clusters; and setting up and configuring commonly used Hadoop ecosystem components such as HDFS, MapReduce, Hive, Pig, HBase, Sqoop and Oozie.
  • Strong knowledge in development of object-oriented and distributed applications.
  • Wrote unit test cases using JUnit and MRUnit for MapReduce jobs.
  • Good knowledge in programming JDBC, Servlets and JSP.
  • Experience in working with ETL workflow.
  • Hands-on experience with various database platforms such as Oracle, MySQL, DB2 and MS SQL Server.
  • Expertise in writing automation test cases using Selenium framework.
  • Good knowledge in NoSQL databases including Cassandra and MongoDB.
  • Experienced with the Agile SCRUM methodology; involved in design discussions and work estimations; takes initiative and is proactive in solving problems and providing solutions.
  • Hands on experience in Analysis, Design, Coding and Testing phases of Software Development Life Cycle (SDLC).
  • Provided technology and domain trainings through regular knowledge-sharing sessions; guided team members through hands-on lab sessions that enabled them to quickly learn the essentials.
  • Strong interpersonal skills resulting in exceptional rapport with people; able to meet deadlines and adaptive in learning and working with various technologies.
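As a small illustration of the MapReduce analysis and JUnit/MRUnit testing mentioned above, the map and reduce logic can be kept as plain static methods so it stays unit-testable without a running cluster. This is a hypothetical word-count sketch, with the Hadoop `Mapper`/`Reducer` wiring omitted:

```java
import java.util.AbstractMap;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

// Word-count style map/reduce logic kept as plain static methods so it can
// be unit tested (as with JUnit/MRUnit) without a running Hadoop cluster.
public class WordCountLogic {

    // Emulates the mapper: emits one (word, 1) pair per token in the line.
    public static List<Map.Entry<String, Integer>> map(String line) {
        List<Map.Entry<String, Integer>> out = new ArrayList<>();
        for (String token : line.toLowerCase().split("\\s+")) {
            if (!token.isEmpty()) out.add(new AbstractMap.SimpleEntry<>(token, 1));
        }
        return out;
    }

    // Emulates the reducer: sums the counts emitted for one key.
    public static int reduce(List<Integer> counts) {
        int sum = 0;
        for (int c : counts) sum += c;
        return sum;
    }

    public static void main(String[] args) {
        System.out.println(map("to be or not to be").size());       // 6 pairs
        System.out.println(reduce(java.util.Arrays.asList(1, 1)));  // 2
    }
}
```

In a real job, `map` would run inside `Mapper.map()` and `reduce` inside `Reducer.reduce()`; isolating the logic this way is what makes the MRUnit tests straightforward.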

TECHNICAL SKILLS

Languages: Java, JavaScript, XML, XSL, XSLT, SQL and PL/SQL

Web Frameworks: Servlets, JSP, Spring MVC, Struts, Hibernate, Apache Axis2

Servers: IBM WebSphere Application Server, JBoss & Apache Tomcat

IDE: IBM Rational Application Developer (RAD), Eclipse, Xcode

Databases: IBM DB2, Oracle, MySQL & Microsoft SQL Server

NoSQL Databases: Cassandra, MongoDB

BigData: Hadoop HDFS, MapReduce, HBase, Hive, Sqoop

Hadoop Distributions: Apache, Cloudera, Hortonworks

Operating Systems: CentOS, Mac OS X, RHEL, Ubuntu, Windows 7

Version Control: IBM Rational ClearCase, Microsoft Visual Source Safe (VSS), Microsoft Team Foundation Server (TFS)

Testing Frameworks: Selenium WebDriver, JUnit, Apache MRUnit

PROFESSIONAL EXPERIENCE

Confidential, Houston, TX

Hadoop Consultant

Responsibilities:

  • Involved in building a multi-node Hadoop Cluster.
  • Installed and configured Cloudera Manager.
  • Worked on performing minor upgrade of cluster from CDH3u4 to CDH3u5, major upgrade of cluster from CDH3u5 to CDH4.3.0.
  • Implemented NameNode HA and automatic failover infrastructure, utilizing ZooKeeper services, to overcome the NameNode single point of failure.
  • Worked on commissioning and decommissioning DataNodes and TaskTrackers.
  • Installed and configured Cassandra cluster and CQL on the cluster.
  • Developed multiple MapReduce jobs in Java for data cleaning and processing.
  • Worked on implementing custom Hive and Pig UDF's to transform large amounts of data.
  • Developed custom shell scripts for backing up metadata of Namenode.
  • Analyzed data using Hive queries, Pig scripts and MapReduce programs.
  • Implemented Oozie workflow engine to manage inter-dependent Hadoop jobs and to automate Hadoop jobs such as Hive, Sqoop and system-specific jobs.
  • Exported the business required information to RDBMS using Sqoop to make the data available for BI team to generate reports based on data.
  • Worked on creating Keyspaces and loading data on the Cassandra Cluster.
  • Monitored the nodes, the streaming process between nodes during the startup of new nodes and clearing of keys which are no longer used using the nodetool utility.
  • Experience in working with development team to resolve Cassandra related issues.

Environment: Linux, Java, MapReduce, HDFS, Oracle, SQL Server, Tableau, Hive, Pig, Sqoop, Cloudera Manager, Cassandra
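The custom Hive UDF work above can be sketched as a plain Java method. The normalization rule here is hypothetical, and the wrapper class extending `org.apache.hadoop.hive.ql.exec.UDF` is omitted so the sketch runs without Hive on the classpath:

```java
// Core of a hypothetical Hive UDF that normalizes free-form US phone numbers.
// In a real deployment this body would sit inside evaluate() of a class
// extending org.apache.hadoop.hive.ql.exec.UDF.
public class NormalizePhone {
    public static String evaluate(String raw) {
        if (raw == null) return null;
        String digits = raw.replaceAll("[^0-9]", "");   // strip punctuation
        if (digits.length() == 11 && digits.startsWith("1")) {
            digits = digits.substring(1);               // drop US country code
        }
        return digits.length() == 10 ? digits : null;   // reject malformed input
    }

    public static void main(String[] args) {
        System.out.println(evaluate("(713) 555-0101"));  // 7135550101
    }
}
```

Keeping `evaluate` free of Hive types makes the transformation testable on its own before it is registered with `CREATE TEMPORARY FUNCTION` in Hive.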

Confidential, Houston, TX

Hadoop Consultant

Responsibilities:

  • Installed, configured and deployed a 50-node Cloudera Hadoop cluster for development and production.
  • Worked on setting up high availability for major production cluster and designed automatic failover.
  • Performed maintenance and troubleshooting of the cluster.
  • Configured Hive metastore, which stores the metadata for Hive tables and partitions in a relational database.
  • Maintained the company SQL Server.
  • Ensured SQL database integrity and performed database version upgrades.
  • Managed users and resources of the SQL database.
  • Configured Flume for efficiently collecting, aggregating and moving large amounts of log data.
  • Configured security for the Hadoop cluster (Kerberos, Active Directory).
  • Performed Cloudera administration (performance tuning, commissioning and decommissioning).
  • Responsible for managing data coming from different sources.
  • Installed and configured Zookeeper for Hadoop cluster.
  • Wrote Bash scripts in the Linux shell.
  • Tuned MapReduce programs running on the Hadoop cluster.
  • Involved in HDFS maintenance, Upgrading the cluster to latest versions of CDH.
  • Wrote MapReduce jobs using the Java API.
  • Imported/exported data between RDBMS and HDFS using Sqoop.
  • Wrote Hive queries for data analysis to meet the business requirements.
  • Created Hive tables and worked on them using HiveQL.

Environment: Linux, Java, MapReduce, HDFS, Hive, Pig, Sqoop, FTP, Oozie, Teradata, Cloudera Manager
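One common technique behind the MapReduce tuning mentioned above is in-mapper combining: aggregating counts locally in the map task so far fewer intermediate pairs reach the shuffle. A minimal sketch with the Hadoop wiring stripped out so it runs standalone:

```java
import java.util.HashMap;
import java.util.Map;

// In-mapper combining: instead of emitting (key, 1) per record and letting
// the shuffle sort and transfer every pair, aggregate counts in a local map
// and emit one (key, partialSum) pair per key at the end of the map task.
public class InMapperCombiner {
    public static Map<String, Integer> combine(Iterable<String> keys) {
        Map<String, Integer> partial = new HashMap<>();
        for (String key : keys) {
            partial.merge(key, 1, Integer::sum);  // local aggregation
        }
        return partial;  // in Hadoop this would be emitted from cleanup()
    }

    public static void main(String[] args) {
        System.out.println(combine(java.util.Arrays.asList("a", "b", "a")));
    }
}
```

Compared with a separate combiner class, this guarantees the aggregation happens (a Hadoop combiner is only a hint), at the cost of the mapper holding the partial map in memory.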

Confidential, Atlanta, GA

Hadoop Consultant

Responsibilities:

  • Exported data from DB2 to HDFS using Sqoop and NFS mount approach.
  • Moved data from HDFS to Cassandra using Map Reduce and BulkOutputFormat class.
  • Developed Map Reduce programs for applying business rules on the data.
  • Developed and executed hive queries for denormalizing the data.
  • Worked with ETL workflows, analyzed big data and loaded it into the Hadoop cluster.
  • Installed and configured Hadoop Cluster for development and testing environment.
  • Implemented the Fair Scheduler on the JobTracker to share the cluster's resources among the MapReduce jobs submitted by users.
  • Automated the work flow using shell scripts.
  • Performance-tuned Hive queries written by other developers.

Environment: Linux, Java, MapReduce, HDFS, DB2, Cassandra, Hive, Pig, Sqoop, FTP
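The business-rule transformations applied in the MapReduce programs above can be sketched as a plain filter/transform step. The rules here (drop negative amounts, tier the rest) are purely illustrative stand-ins for the actual business requirements:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of a business-rule step as it might run inside a mapper:
// drop invalid records and tag the rest with a derived category.
public class BusinessRules {
    public static List<String> apply(List<String> csvRows) {
        List<String> out = new ArrayList<>();
        for (String row : csvRows) {
            String[] cols = row.split(",");
            double amount = Double.parseDouble(cols[1].trim());
            if (amount < 0) continue;                       // rule: drop negatives
            String tier = amount >= 1000 ? "HIGH" : "LOW";  // rule: tier the amount
            out.add(row + "," + tier);
        }
        return out;
    }

    public static void main(String[] args) {
        System.out.println(apply(java.util.Arrays.asList("c1,1500", "c2,-5", "c3,10")));
    }
}
```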

Confidential, Indianapolis, IN

Java Developer

Responsibilities:

  • Worked on the front end of the application, which is called CLIQ.
  • Worked on the Quote Summary screen, which displays all the information selected throughout the application.
  • Used Service-Oriented Architecture (SOA) to communicate between different applications.
  • Responsible for source code maintenance using the SourceJammer version control system.
  • Implemented the MVC architecture using the Struts framework; implemented J2EE design patterns such as Service Locator and Business Delegate.
  • Developed the front-end components using Java Server Pages, Servlets, HTML and XML.
  • Coded various Java helper and validation classes for the application logic and utilities.
  • Processed data feeds from external systems, generated feeds from the application database and sent them to other systems.
  • Performed logging of the application using log4j framework.
  • Involved in development of Unit test cases and testing the application using JUnit 3.

Environment: Java, WebSphere 6, RAD 7, Servlets 2.3, JSP, JSTL, Expression Language, custom tags, EJB 2.0, Log4j, JavaScript, JUnit, JMS, XML, XSL, XSLT, HTML, DHTML, JDBC, PL/SQL, Web Services, AJAX, Solaris, Rational ClearCase, Rational ClearQuest
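A hypothetical example of the kind of Java validation helper class mentioned above; the field names and business limits are illustrative, not taken from the CLIQ application:

```java
// Illustrative validation helper of the kind used for quote form input.
public class QuoteValidator {

    // US ZIP or ZIP+4, e.g. "46204" or "46204-1234".
    public static boolean isValidZip(String zip) {
        return zip != null && zip.matches("\\d{5}(-\\d{4})?");
    }

    // Positive integer quantity up to a hypothetical business limit.
    public static boolean isValidQuantity(String qty) {
        try {
            int n = Integer.parseInt(qty.trim());
            return n > 0 && n <= 999;
        } catch (NumberFormatException | NullPointerException e) {
            return false;  // null, blank or non-numeric input is invalid
        }
    }

    public static void main(String[] args) {
        System.out.println(isValidZip("46204") + " " + isValidQuantity("12"));
    }
}
```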

Confidential

Senior Software Engineer

Responsibilities:

  • Developed the code using the Struts framework.
  • Involved in requirement analysis.
  • Developed UI components using JSP and JavaScript.
  • Involved in writing the technical design document.
  • Developed the front end for the site based on the MVC design pattern using the Struts framework.
  • Created a data access layer to make the rest of the code database independent.
  • Developed JSPs, Servlets and Java beans for the application.
  • Developed sample requests and responses for testing web services.
  • Deployed web applications on server using Apache Tomcat.
  • Developed new code for the change requests.
  • Developed complex PL/SQL queries to access data.
  • Coordinated across multiple development teams for quick resolution to blocking issues.
  • Prioritized tasks and coordinated assignments with the team.
  • Performed on call support on a weekly rotation basis.
  • Performed manual and automated testing.
  • Involved in writing and updating test cases in the Quality tool.

Environment: JSP, JavaBeans, Servlets, Oracle, HTML, JavaScript, JDBC, PL/SQL; web tier based on Apache
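The database-independent data access layer mentioned above can be sketched as a DAO interface with swappable implementations. The names are hypothetical; a JDBC/Oracle-backed implementation is described only in comments, and an in-memory one is shown so the sketch runs standalone:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;

// DAO interface: callers depend only on this, never on Oracle/JDBC directly,
// so the backing store can be swapped without touching business code.
interface CustomerDao {
    Optional<String> findNameById(int id);
    void save(int id, String name);
}

// In-memory implementation; a production OracleCustomerDao would issue the
// equivalent JDBC / PL-SQL calls behind the same interface.
class InMemoryCustomerDao implements CustomerDao {
    private final Map<Integer, String> rows = new HashMap<>();
    public Optional<String> findNameById(int id) { return Optional.ofNullable(rows.get(id)); }
    public void save(int id, String name) { rows.put(id, name); }
}

public class DaoDemo {
    public static void main(String[] args) {
        CustomerDao dao = new InMemoryCustomerDao();  // could be any implementation
        dao.save(1, "Acme");
        System.out.println(dao.findNameById(1).orElse("missing"));
    }
}
```

The in-memory variant also doubles as a test stub, which is part of why this layering keeps the rest of the code database independent.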

Confidential

Software Engineer

Responsibilities:

  • Involved in coding and unit testing of the Job Submission/Controller module, consisting of creation, modification and deletion.
  • Involved in coding Java classes and JSPs using the Struts framework.
  • Designed and developed SOA web services components using XML, SOAP and JAX-WS.
  • Developed PL/SQL queries and stored procedures for Oracle.
  • Developed and deployed web services.
  • Built web services based on the Apache Axis distribution for internal publishing and consumption.
  • Mitigated defects and fixed issues/bugs.
  • Set up the build environment by writing the Ant build.xml; produced builds and configured and deployed the application on all servers (dev, stage and production).

Environment: Java, Struts framework, J2EE, JSP, Hibernate, SOA, web services (SOAP), WSDL, SQL Server 2000, MyEclipse, JBoss (4.0/5.0) Application Server
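A minimal sketch of the kind of Ant build.xml described above; the project, property and target names are illustrative, not taken from the actual build:

```xml
<project name="jobsubmission" default="war" basedir=".">
  <property name="src.dir"   value="src"/>
  <property name="build.dir" value="build"/>

  <target name="compile">
    <mkdir dir="${build.dir}/classes"/>
    <javac srcdir="${src.dir}" destdir="${build.dir}/classes" includeantruntime="false"/>
  </target>

  <!-- Package the application for deployment to the dev/stage/production JBoss servers -->
  <target name="war" depends="compile">
    <war destfile="${build.dir}/jobsubmission.war" webxml="web/WEB-INF/web.xml">
      <classes dir="${build.dir}/classes"/>
    </war>
  </target>
</project>
```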
