Sr. Java/J2EE Developer Resume Profile

Northbrook, IL

SUMMARY

  • 7 years of extensive IT experience in the analysis, design, development, implementation, and testing of software applications, including 3 years of experience in Big Data using Hadoop, Hive, Pig, Sqoop, and MapReduce programming.
  • Experience with major components of the Hadoop ecosystem: MapReduce, Hadoop Distributed File System (HDFS), JobTracker, TaskTracker, NameNode, DataNode, YARN, Hive, and Pig.
  • Developed scripts and numerous batch jobs scheduled within the Hadoop ecosystem.
  • Experience in analyzing data using HiveQL, Pig Latin, and custom MapReduce programs in Java (see the sketch after this list).
  • Worked on importing and exporting data from databases like Oracle and Teradata into HDFS and Hive using Sqoop.
  • Strong experience in collecting and storing streaming data, such as log data, in HDFS using Apache Flume.
  • Wrote Hive and Pig queries for data analysis to meet the business requirements.
  • Involved in creating tables, partitioning and bucketing tables, and creating UDFs in Hive.
  • Experienced in implementing security mechanisms for Hive data.
  • Well versed in implementing join operations using Pig Latin.
  • Involved in writing data transformations and data cleansing using Pig operations.
  • Excellent understanding of NoSQL databases like HBase, Cassandra.
  • Experienced in performing CRUD operations using the HBase Java API and REST API.
  • Good knowledge of Cassandra, DataStax Enterprise, DataStax OpsCenter, and CQL.
  • Experience with Oozie Workflow Engine to automate and parallelize Hadoop Map/Reduce, Hive and Pig jobs.
  • Experienced with different file formats such as Avro, XML, JSON, and SequenceFile.
  • Extended Hive and Pig core functionality using custom UDFs.
  • Excellent Java development skills using J2EE Frameworks like Spring, Hibernate, EJBs and Web services.
  • Experienced in implementing SOAP- and REST-based Web Services.
  • Experienced in creating and analyzing Software Requirement Specifications (SRS) and Functional Specification Documents (FSD). Strong knowledge of the Software Development Life Cycle (SDLC).
  • Experience in Object-Oriented Analysis and Design (OOAD) and development of software using UML methodology; good knowledge of J2EE design patterns and Core Java design patterns.
  • Extensive knowledge of creating PL/SQL stored procedures, packages, functions, and cursors against Oracle 9i/10g/11g and MySQL Server.
  • Experienced in preparing and executing unit test plans and unit test cases using JUnit, EasyMock, and MRUnit.
  • Experienced with build tools like Maven and Ant, and CI tools like Jenkins.
  • Worked with version control systems like CVS, SVN, Git, and ClearCase.
  • Experience in Scrum, Agile, and Waterfall models.
  • Committed to professionalism; highly organized; able to work under strict deadlines with attention to detail; excellent written and verbal communication skills.
  • Ability to work effectively in a multi-cultural environment with a team and individually as per the project requirement.
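
As a minimal illustration of the custom MapReduce programs in Java mentioned above, the sketch below shows a word-count job against the org.apache.hadoop.mapreduce API; the class names and tokenization logic are illustrative, not taken from any engagement described in this profile.

    import java.io.IOException;

    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;

    public class WordCount {

        // Tokenizes each input line and emits (word, 1) pairs.
        public static class TokenizerMapper
                extends Mapper<LongWritable, Text, Text, IntWritable> {
            private static final IntWritable ONE = new IntWritable(1);
            private final Text word = new Text();

            @Override
            protected void map(LongWritable key, Text value, Context context)
                    throws IOException, InterruptedException {
                for (String token : value.toString().split("\\s+")) {
                    if (!token.isEmpty()) {
                        word.set(token);
                        context.write(word, ONE);
                    }
                }
            }
        }

        // Sums the counts emitted for each word.
        public static class SumReducer
                extends Reducer<Text, IntWritable, Text, IntWritable> {
            @Override
            protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                    throws IOException, InterruptedException {
                int sum = 0;
                for (IntWritable val : values) {
                    sum += val.get();
                }
                context.write(key, new IntWritable(sum));
            }
        }
    }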

TECHNICAL SKILLS

Hadoop ecosystem

MapReduce, Sqoop, Hive, Oozie, Pig, HBase, HDFS, ZooKeeper, Flume, Cassandra.

Java/J2EE Technologies

Core Java, Servlets, JSP, JDBC, JNDI, Java Beans.

Languages

C, C++, Java, SQL, PL/SQL, Pig Latin, HiveQL, UNIX shell scripting.

Frameworks

MVC, Spring, Hibernate, Struts 1/2, EJB, JMS, JUnit, MRUnit.

Databases

Oracle 11g/10g/9i, MySQL, DB2, MS SQL Server.

Application Server

Apache Tomcat, JBoss, IBM WebSphere, WebLogic.

Web Services

WSDL, SOAP, Apache CXF, Apache Axis, REST, Jersey.

Methodologies

Agile, Waterfall.

PROFESSIONAL EXPERIENCE

Confidential

Title: Sr. Hadoop Developer.

Roles and Responsibilities:

  • Responsible for building scalable distributed data solutions using Hadoop.
  • Handled importing of data from multiple data sources using Sqoop; performed cleaning, transformations, and joins using Pig.
  • Designed and implemented a MapReduce-based large-scale parallel relation-learning system.
  • Exported analyzed data to relational databases using Sqoop for visualization and to generate reports for the BI team.
  • Experience in providing support to data analysts in running Pig and Hive queries.
  • Continuously monitored and managed the Hadoop cluster using Cloudera Manager.
  • Created Hive tables and partitions to store different data formats using SerDes.
  • Implemented Hive generic UDFs for complex business logic (see the sketch after this list).
  • Used UDFs to implement business logic in Hadoop and was responsible for managing data coming from different sources.
  • Created workflows using Oozie coordinator jobs to schedule cron-like recurring jobs that pull data from different data sources.
  • Involved in loading data from UNIX file system to HDFS.
  • Experience in managing and reviewing Hadoop log files.
  • Consolidated all defects, reported them to PMs/leads for prompt fixes by the development teams, and drove them to closure.
  • Wrote shell scripts to monitor the health check of Hadoop daemon services and responded accordingly to any warning or failure conditions using Ganglia.
  • Configured automated version builds using Maven plug-ins and Jenkins.
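
The business logic behind the generic UDFs mentioned above is not spelled out in this profile; as a hedged sketch, the hypothetical GenericUDF below simply trims a string column null-safely, illustrating the initialize/evaluate/getDisplayString contract:

    import org.apache.hadoop.hive.ql.exec.UDFArgumentException;
    import org.apache.hadoop.hive.ql.metadata.HiveException;
    import org.apache.hadoop.hive.ql.udf.generic.GenericUDF;
    import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector;
    import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorFactory;
    import org.apache.hadoop.hive.serde2.objectinspector.primitive.StringObjectInspector;

    // Hypothetical generic UDF: null-safe trim of a string column.
    // Registered in Hive with, e.g.:
    //   CREATE TEMPORARY FUNCTION safe_trim AS 'com.example.SafeTrimUDF';
    public class SafeTrimUDF extends GenericUDF {
        private StringObjectInspector inputOI;

        @Override
        public ObjectInspector initialize(ObjectInspector[] args) throws UDFArgumentException {
            if (args.length != 1 || !(args[0] instanceof StringObjectInspector)) {
                throw new UDFArgumentException("safe_trim expects a single string argument");
            }
            inputOI = (StringObjectInspector) args[0];
            return PrimitiveObjectInspectorFactory.javaStringObjectInspector;
        }

        @Override
        public Object evaluate(DeferredObject[] args) throws HiveException {
            Object value = args[0].get();
            return value == null ? null : inputOI.getPrimitiveJavaObject(value).trim();
        }

        @Override
        public String getDisplayString(String[] children) {
            return "safe_trim(" + children[0] + ")";
        }
    }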

Environment: Apache Hadoop 1.1.2, MapReduce, HDFS, Hive, Pig, Oozie, Sqoop, Flume, Java, SQL, Eclipse, UNIX scripting, MySQL, BO, YARN, and Ganglia.

Confidential

Title: Hadoop Developer.

Roles and Responsibilities:

  • Designed and implemented MapReduce-based large-scale parallel processing.
  • Handled importing of data from various data sources, performed transformations using Hive and MapReduce, and loaded the data into HDFS.
  • Experienced working with real-time analytical operations using HBase.
  • Implemented CRUD operations on HBase data using the Java API and REST API (see the sketch after this list).
  • Integrated HBase with MapReduce to work on bulk data.
  • Continuously monitored and managed the Hadoop cluster through Cloudera Manager.
  • Set up standards and processes for Hadoop-based application design and implementation.
  • Wrote Hive queries for data analysis to meet the business requirements.
  • Created Hive tables and worked with them using HiveQL.
  • Extended Hive and Pig core functionality using UDFs.
  • Experienced with performance tuning of Hive operations.
  • Experience in designing, developing, and implementing connectivity products that allow efficient exchange of data between the core database engine and the Hadoop ecosystem.
  • Experience in importing and exporting data using Sqoop between HDFS and relational database systems/mainframes.
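
As a sketch of those CRUD operations through the HBase Java API (the table, column family, and row key names are hypothetical; the calls match the pre-1.0 client API of this Hadoop era):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.client.Delete;
    import org.apache.hadoop.hbase.client.Get;
    import org.apache.hadoop.hbase.client.HTable;
    import org.apache.hadoop.hbase.client.Put;
    import org.apache.hadoop.hbase.client.Result;
    import org.apache.hadoop.hbase.util.Bytes;

    public class HBaseCrudExample {
        public static void main(String[] args) throws Exception {
            Configuration conf = HBaseConfiguration.create();
            // "events" and column family "d" are placeholder names.
            HTable table = new HTable(conf, "events");

            // Create/Update: write one cell.
            Put put = new Put(Bytes.toBytes("row-1"));
            put.add(Bytes.toBytes("d"), Bytes.toBytes("status"), Bytes.toBytes("NEW"));
            table.put(put);

            // Read: fetch the row and extract the cell value.
            Result result = table.get(new Get(Bytes.toBytes("row-1")));
            byte[] status = result.getValue(Bytes.toBytes("d"), Bytes.toBytes("status"));
            System.out.println("status = " + Bytes.toString(status));

            // Delete: remove the row.
            table.delete(new Delete(Bytes.toBytes("row-1")));
            table.close();
        }
    }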

Environment: Java, Eclipse, Linux, Apache Hadoop 1.0.3, MapReduce, HBase, Sqoop, Pig, Hive, Flume, Oracle 10g.

Confidential

Title: Hadoop Developer.

Roles and Responsibilities:

  • Launched Amazon EC2 cloud instances using Amazon Machine Images (Linux/Ubuntu) and configured the launched instances for specific applications.
  • Launched and set up a Hadoop cluster, which included configuring the different Hadoop components.
  • Hands-on experience in loading data from the UNIX file system to HDFS (see the sketch after this list).
  • Implemented POCs to configure DataStax Cassandra with Hadoop.
  • Experienced in performing Cassandra query operations using the Thrift API for real-time analytics.
  • Implemented a CDH3 Hadoop cluster on CentOS.
  • Provided cluster coordination services through ZooKeeper.
  • Installed and configured Flume, Hive, Pig, Sqoop, and Oozie on the Hadoop cluster.
  • Involved in creating Hive tables, loading data, and running Hive queries on that data.
  • Extensive working knowledge of partitioned tables, UDFs, performance tuning, compression-related properties, and the Thrift server in Hive.
  • Involved in writing, optimizing, developing, and testing Pig Latin scripts.
  • Working knowledge of writing Pig Load and Store functions.
  • Used the Struts validation framework for form-level validation.
  • Wrote test cases in JUnit for unit testing of classes.
  • Developed templates and screens in HTML and JavaScript.
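
Loading data from the UNIX file system into HDFS can be scripted with the hadoop fs shell, or done through the Java FileSystem API as in this minimal sketch (the paths are placeholders, not from the engagement above):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class HdfsLoader {
        public static void main(String[] args) throws Exception {
            // Picks up fs.default.name from the cluster configuration on the classpath.
            Configuration conf = new Configuration();
            FileSystem fs = FileSystem.get(conf);

            // Copy a file from the local UNIX file system into HDFS.
            fs.copyFromLocalFile(new Path("/var/log/app/events.log"),
                                 new Path("/user/hadoop/raw/events.log"));
            fs.close();
        }
    }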

Environment: Apache Hadoop 1.0.1, MapReduce, HDFS, CentOS, ZooKeeper, Sqoop, Cassandra, Hive, Pig, Oozie, Java, Eclipse, Amazon EC2, JSP, Servlets.

Confidential

Title: Sr. Java/J2EE Developer.

Roles and Responsibilities:

  • Participated in use case review meetings with business analysts to completely understand the requirements.
  • Involved in complete requirement analysis, design, coding and testing phases of the project.
  • Fully involved in back-end development of the application's business layer using Java/J2EE technologies.
  • Documented the events, workflows, code changes, and bug fixes related to enhancing new features and correcting code defects.
  • Implemented MVC architecture and Web Flow module using Spring.
  • Worked with JSTL tags and AJAX implementation in developing the JSF pages.
  • Implemented SOAP-based JMS Web Services using Spring and CXF.
  • Used Hibernate to develop the data access layer (see the DAO sketch after this list).
  • Designed and developed Database scripts for further enhancements and requirements.
  • Updated struts-config.xml files to integrate all the components in the Struts framework.
  • Used SVN for software configuration management and version control.
  • Gave demos and status updates of the application.
  • Involved in code reviews of some parts of the application.
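
A minimal sketch of what such a Hibernate data access layer looks like; the Account entity, its owner property, and the SessionFactory wiring are hypothetical stand-ins for the actual domain model:

    import java.util.List;

    import org.hibernate.Session;
    import org.hibernate.SessionFactory;

    public class AccountDao {
        private final SessionFactory sessionFactory;

        public AccountDao(SessionFactory sessionFactory) {
            this.sessionFactory = sessionFactory;
        }

        // Persists a new Account inside a transaction.
        public void save(Account account) {
            Session session = sessionFactory.openSession();
            try {
                session.beginTransaction();
                session.save(account);
                session.getTransaction().commit();
            } finally {
                session.close();
            }
        }

        // Queries with HQL against the mapped entity, not raw SQL.
        @SuppressWarnings("unchecked")
        public List<Account> findByOwner(String owner) {
            Session session = sessionFactory.openSession();
            try {
                return session.createQuery("from Account a where a.owner = :owner")
                              .setParameter("owner", owner)
                              .list();
            } finally {
                session.close();
            }
        }
    }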

Environment: Java, Spring, Web Services, JBoss, Eclipse, Oracle 10g, SQL Server, JDBC, JIRA, SVN, Windows XP, JSP, EJB, Struts 1.3, XSL, XSLT, JSTL, JavaScript, AJAX, HTML, CSS.

Confidential

Title: JAVA Developer.

Roles and Responsibilities:

  • Involved in business requirements analysis.
  • Built the application using Struts framework with JSP as view part.
  • Developed Dispatch Actions, Action Forms and Custom taglibs in Struts framework.
  • Designed JSP pages as view in Struts for frontend templates.
  • Developed Session Beans for handling the back-end business requirements.
  • Used the RSD IDE for development and ClearCase for versioning.
  • Involved in configuring the resources and administering WebSphere Application Server 6.
  • Wrote stored procedures in DB2 (see the JDBC invocation sketch after this list).
  • Developed code to handle web requests involving Request Handlers, Business Objects, and Data Access Objects.
  • Coded different package structures based on the purpose and security concerns handled by each package, which assists developers in future enhancements or modifications of the code.
  • Involved in making the client side validations with JavaScript.
  • Involved in code reviews, system integration and testing. Developed unit test cases using JUnit framework.
  • Involved in deploying the application on UNIX boxes in the DEV, QA, and Prod environments.
  • Used the change management tool Service Center to promote the WAR file from one environment to another.
  • Involved in user acceptance testing, fixing bugs, and production support.
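
The procedures themselves are not reproduced in this profile; the sketch below shows the standard JDBC CallableStatement pattern for invoking a hypothetical DB2 stored procedure from the Java tier (the procedure name, parameters, and connection details are placeholders):

    import java.sql.CallableStatement;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Types;

    public class StoredProcClient {
        public static void main(String[] args) throws Exception {
            String url = "jdbc:db2://dbhost:50000/APPDB"; // placeholder connection details
            try (Connection conn = DriverManager.getConnection(url, "user", "password");
                 CallableStatement cs = conn.prepareCall("{call GET_ORDER_STATUS(?, ?)}")) {
                cs.setInt(1, 1001);                        // IN parameter: order id
                cs.registerOutParameter(2, Types.VARCHAR); // OUT parameter: status
                cs.execute();
                System.out.println("status = " + cs.getString(2));
            }
        }
    }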

Environment: Java, J2EE, Apache Struts, WebSphere 5/6, JNDI, JDBC, JSP, UNIX and Windows NT, DB2 and SQL Server.

Confidential

Title: JAVA Developer.

Roles and Responsibilities:

  • Worked on both WebLogic Portal 9.2 for portal development and WebLogic 8.1 for Data Services programming.
  • Worked on creating EJBs that implemented business logic.
  • Developed the presentation layer using JSP, HTML, CSS and client validations using JavaScript.
  • Involved in designing and development of the ecommerce site using JSP, Servlet, EJBs, JavaScript and JDBC.
  • Used MyEclipse 6.0 as the IDE for application development.
  • Validated all forms using Struts validation framework and implemented Tiles framework in the presentation layer.
  • Configured Struts framework to implement MVC design patterns.
  • Designed and developed GUI using JSP, HTML, DHTML and CSS.
  • Worked with JMS for the messaging interface (see the sketch after this list).
  • Used Hibernate for handling database transactions and persisting objects.
  • Deployed the entire project on WebLogic application server.
  • Used AJAX for interactive user operations and client side validations.
  • Used XML for ORM mappings between the Java classes and the database.
  • Used XSL transforms on certain XML data.
  • Developed Ant scripts for compilation and deployment.
  • Performed unit testing using JUnit.
  • Extensively used log4j for logging.
  • Used Subversion as the version control system.
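
As a sketch of the JMS messaging interface, the snippet below follows the standard JMS 1.1 point-to-point send; the JNDI names and message payload are hypothetical, not taken from the application described above:

    import javax.jms.Connection;
    import javax.jms.ConnectionFactory;
    import javax.jms.MessageProducer;
    import javax.jms.Queue;
    import javax.jms.Session;
    import javax.jms.TextMessage;
    import javax.naming.InitialContext;

    public class OrderPublisher {
        public static void main(String[] args) throws Exception {
            // Look up the factory and destination from the app server's JNDI tree.
            InitialContext ctx = new InitialContext();
            ConnectionFactory factory = (ConnectionFactory) ctx.lookup("jms/ConnectionFactory");
            Queue queue = (Queue) ctx.lookup("jms/OrderQueue");

            Connection connection = factory.createConnection();
            try {
                // Non-transacted session with automatic acknowledgement.
                Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
                MessageProducer producer = session.createProducer(queue);
                TextMessage message = session.createTextMessage("<order id=\"42\"/>");
                producer.send(message);
            } finally {
                connection.close();
            }
        }
    }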

Environment: Java/J2EE, Oracle 10g, SQL, PL/SQL, JSP, EJB, Struts, Hibernate, WebLogic 8.0, HTML, AJAX, JavaScript, JDBC, XML, JMS, XSLT, UML, JUnit, log4j, MyEclipse 6.0.
