Java Developer Resume Profile

MD

Confidential

Java Developer

PROFESSIONAL EXPERIENCE

  • I am a software engineer with 8 years of Java experience, including Hadoop/Big Data and relational databases. I am a Java developer with both knowledge and practical experience of Hadoop, Big Data and NoSQL; J2EE components such as Servlets, JavaServer Pages (JSP), Java Message Service (JMS), Spring, EJB, Struts and Web Services; cloud computing; JavaBeans; Java Cryptography Extension (JCE); Java Database Connectivity (JDBC); and web-client programming with XML, XSLT, HTML, DHTML and JavaScript. I have extensive experience in the development of software systems. Recent work has included design, development and coding in Java, Oracle PL/SQL, and C++. I have worked on everything from Web/Windows GUI front ends to PostgreSQL/Oracle back ends, across all phases of the software development life cycle on BFSI-domain projects.

Java Developer

  • Consulting on the Hadoop ecosystem: Hadoop, MapReduce, HBase, Sqoop, Amazon Elastic MapReduce (EMR)
  • Managed Hadoop clusters: setup, installation, monitoring and maintenance; distributions: Cloudera CDH, Apache Hadoop
  • Certified Cloudera Developer

ROLE AND RESPONSIBILITIES

  • Hands-on experience in application development using Java, Hadoop, J2EE, JFC/Swing, EJB 2.0, Hibernate, Castor JDO, JDBC, WebWork, Web Services, Spring, cloud computing, Jakarta Struts, JSP, Servlets, HTML, XML, RMI, SOAP, WebLogic and JBoss.
  • 4 years of experience with Hadoop, including a 1-master/2-slave cluster set up on Cloudera. Good experience in big data analytics.
  • Built business solutions on the Hadoop platform for machine/device logs and banking transaction logs.
  • Built solutions for customer behavior analytics and sentiment analysis.
  • Architected and planned a next-generation EDW architecture for a banking customer using Hadoop and Hive.
  • Delivered multiple deep-dive technology and sales presentations to customers on these platforms.
  • Hands-on experience with HDFS, Hadoop 1 and 2, MapReduce, Hive, HBase, Sqoop, Flume, Pig, Talend and SQL-MR.
  • Experience with Hadoop clusters, Aster Data clusters and IBM BigInsights.
  • Hands-on experience with the reporting tools Pentaho and Tableau and the statistical language R.
  • Worked with cloud services such as Amazon Web Services (AWS) and Google Cloud.
  • Hands-on experience with Google BigQuery, connecting to it via its REST API.
  • Proficient in database development: Oracle, DB2, MS SQL Server, MySQL, PostgreSQL, MS Access.
  • Experienced in software development and business modeling of web applications, client/server systems, distributed applications and other custom-built projects on UNIX and Windows.
  • Object-oriented design/analysis, UML modeling, classic design patterns and J2EE patterns.
  • Experience in Java GUI development using JFC, Swing, JavaBeans and AWT.
  • Strong web development skills: experience in N-tier client-server Internet technology, intranet portal design/development, web-based data reporting systems, and framework development for Internet applications.
  • Thorough knowledge of J2EE application platform configuration and performance optimization. Hands-on experience with J2EE application deployment technology and EJB transaction implementation (CMP, BMP, Message-Driven Beans).
  • Hands-on experience in varied fields such as CRM and ERP systems, back-office brokerage, workflow, and B2B and B2C applications.
  • Good knowledge of Python, Perl, PHP, JavaScript, VBScript, Visual Basic, CGI, HTML, DHTML, XML, CSS.
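As an illustration of the XML handling listed above, here is a minimal sketch using only the JDK's built-in JAXP DOM parser (the class and element names are hypothetical, chosen for the example):

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.NodeList;

public class XmlParseSketch {
    // Parses an XML string and returns the text content of the first
    // element with the given tag name, or null if no such element exists.
    public static String firstTagText(String xml, String tag) throws Exception {
        DocumentBuilder builder =
            DocumentBuilderFactory.newInstance().newDocumentBuilder();
        Document doc = builder.parse(
            new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
        NodeList nodes = doc.getElementsByTagName(tag);
        return nodes.getLength() > 0 ? nodes.item(0).getTextContent() : null;
    }

    public static void main(String[] args) throws Exception {
        // Hypothetical transaction record, as might appear in a BFSI feed.
        String xml = "<txn><id>42</id><amount>100.50</amount></txn>";
        System.out.println(firstTagText(xml, "amount")); // 100.50
    }
}
```

The same approach extends to XSLT via `javax.xml.transform`, also bundled with the JDK.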

Technical Summary:

  • File Systems: HDFS, GFS
  • Big Data Technologies: Hadoop, Google BigQuery
  • Data Processing Languages: MapReduce, SQL-MR, Impala
  • Databases and Big Data ETL/ELT: DB2, Teradata, Hive, HBase, Pig
  • Relational Databases: Oracle 10g and 11g (SQL and PL/SQL)
  • Operating Systems: Windows, Ubuntu, CentOS, Red Hat
  • Tools Used: Eclipse, IBM InfoSphere BigInsights, PuTTY, Cygwin
  • Hadoop on Cloud: Amazon Web Services (AWS), IBM SmartCloud, Google Cloud
  • BI and Reporting Tools: Pentaho, Tableau
  • Testing: Hadoop testing, Hive testing, Quality Center (QC)

  • Languages: Java/Java 2, Hadoop, J2EE, Python, Perl, PHP, JavaScript, VBScript, Visual Basic, C++, C, SQL, DHTML, HTML, XML, CSS, UML, SOAP, Web Services, REST
  • Java Skills: JFC/Swing, AWT, Applets, JavaBeans, JMS, JavaMail, RMI, Servlets, JSP, EJB, JNDI, JDBC, SOAP, Multi-threading, Java Networking, Socket Programming, JUnit
  • Object Persistence Tools: Hibernate, Castor JDO, ObjectRelationalBridge
  • RDBMS: Oracle, MS SQL Server, MySQL, PostgreSQL, DB2, Cloudscape
  • Web Frameworks: WebWork, Spring, Jakarta Struts
  • Web/Application Servers: Apache, Jakarta Tomcat, BEA WebLogic, IBM WebSphere, JBoss, Resin
  • CASE Tools: Rational Rose, WithClass
  • Version Control: CVS, WinCVS
  • Operating Systems: MS Windows 7/XP/2000/NT4/9x, Linux, Solaris

PROJECTS UNDERTAKEN

Hadoop/BigData Developer

Technologies Used: Hadoop (IBM BigInsights), Pig, Hive, Hadoop timeline, Hadoop Vaidya, IBM BigSheets and Tableau; Java, Java 2, J2EE, Java Servlets, JSP, HTML, Apache, Oracle 7.0 and 11g, JBoss 3.0.6, Blazze 1.0, JFC/Swing, Hibernate, WebWork, WebLogic, Spring, Struts, and Web Services (SOAP)

Successfully virtualized Hadoop

  • Developed a custom File System plugin for Hadoop so it can access files on Hitachi Data Platform. This plugin allows Hadoop MapReduce programs, HBase, Pig and Hive to work unmodified and access files directly. The plugin also provided data locality for Hadoop across host nodes and virtual machines.
  • Advised the file system team on optimizing I/O for Hadoop/analytics workloads.
  • Set up and benchmarked Hadoop/HBase clusters for internal use.
  • Wrote data ingesters and MapReduce programs, and scripting to provision and spin up virtualized Hadoop clusters.
  • Implemented Hadoop and big data analytics.
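The MapReduce programs mentioned above can be sketched in miniature as a single-JVM word count. This is a hedged illustration of the map phase (emit a count per word) and the reduce phase (sum counts per key), not actual Hadoop `Mapper`/`Reducer` API code, which requires the Hadoop libraries:

```java
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class WordCountSketch {
    // "Map" step: tokenize each line into words; "reduce" step: merge
    // the per-word counts into a single map (merge sums the values).
    public static Map<String, Integer> wordCount(List<String> lines) {
        Map<String, Integer> counts = new HashMap<>();
        for (String line : lines) {
            for (String word : line.toLowerCase().split("\\s+")) {
                if (!word.isEmpty()) {
                    counts.merge(word, 1, Integer::sum);
                }
            }
        }
        return counts;
    }

    public static void main(String[] args) {
        List<String> lines = Arrays.asList("to be or not", "to be");
        // to=2, be=2, or=1, not=1 (map iteration order may vary)
        System.out.println(wordCount(lines));
    }
}
```

In real Hadoop code the tokenizing loop would live in a `Mapper` and the `merge` in a `Reducer`, with the framework handling the shuffle between them.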

Hadoop/BigData Developer

Technologies Used: Hadoop (Cloudera), Pig, Hive, HBase, Sqoop, DB2, Teradata, Teradata Connector, Revolution R, Hadoop timeline, Hadoop Vaidya, HCatalog, Tableau

Responsibilities:

  • Provided a Big Data solution for a time-based use case.
  • Provided a solution to store historical data at a 1:15 hardware cost comparison between Teradata and HP Hadoop boxes.
  • Worked with Cloudera Enterprise Edition 4 and implemented automatic failover in the cluster.
  • Installed and integrated Revolution R for statistical analysis on a 200-node Hadoop cluster on CentOS 6.2.
  • Created statistical models in R.
  • Integrated Hive with a web server to auto-generate Hive queries for non-technical business users.
  • Integrated data from multiple sources (SAS, DB2, Teradata) into the Hadoop cluster and analyzed it via Hive-HBase integration.
  • Created components such as Hive UDFs to supply functionality missing from Hive for analytics.
  • Generated business reports using Tableau.
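A custom Hive UDF of the kind mentioned above typically wraps a single `evaluate` method. As a hedged sketch, the logic below masks all but the last four characters of an account number (a hypothetical `mask_account` function; a real UDF would extend `org.apache.hadoop.hive.ql.exec.UDF` from the `hive-exec` dependency and be registered with `CREATE TEMPORARY FUNCTION`, so only the standalone evaluate logic is shown here):

```java
public class MaskAccountSketch {
    // evaluate() body of a hypothetical mask_account UDF: replace all but
    // the last 4 characters with 'X'. Hive calls evaluate() once per row.
    public static String evaluate(String account) {
        if (account == null || account.length() <= 4) {
            return account; // too short to mask meaningfully
        }
        int keep = 4;
        StringBuilder masked = new StringBuilder();
        for (int i = 0; i < account.length() - keep; i++) {
            masked.append('X');
        }
        masked.append(account.substring(account.length() - keep));
        return masked.toString();
    }

    public static void main(String[] args) {
        System.out.println(evaluate("1234567890")); // XXXXXX7890
    }
}
```

Once registered, such a function could be used directly in a query, e.g. `SELECT mask_account(acct_no) FROM txns;`.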

Confidential

Java Developer

Responsibilities:

  • Handled development and implementation of the project.
  • Analyzed user needs and planned information streams (UML, Rational Rose).
  • Developed and implemented the database structure and software modules (SQL, MS SQL Server, MS Access).
  • Designed server applications for financial analysis and real-time representation of stock quotations.
  • Developed the client environment for financial analysis (Java 1.1, Applets, HTML, JavaScript, IIS 3.0-4.0).

Confidential

Used technologies:

  • Java 2, J2EE, JSP, Jakarta Tomcat, BEA WebLogic, and MySQL.
  • Responsible for the design and implementation of the server-side software, business logic and database structure of the project (UML).
  • Implemented business logic on the server side as Entity Beans and Session Beans (EJB, JDBC, WebLogic).
  • Developed web-based software modules (JSP, HTML).
  • Implemented software utilities for site administration.
  • Developed front-end software modules (JSP, HTML).
