Senior Java Architect Resume

San Jose, CA

PROFESSIONAL SUMMARY:

  • Over 13 years of experience as a software developer designing, developing, deploying, and supporting large-scale distributed systems.
  • Primary technical skills in HDFS, MapReduce, YARN, Pig, Hive, Impala, Sqoop, HBase, and Cloudera.
  • Good experience extracting data and generating statistical analyses with the business intelligence tool Tableau for better analysis of the data.
  • Experience importing and exporting data between Hadoop and RDBMS using Sqoop and SFTP.
  • Excellent understanding of the Hadoop architecture and its components, such as HDFS, JobTracker, TaskTracker, NameNode, and DataNode, and of the MapReduce programming paradigm.
  • Good experience in Core Java, J2EE, JavaScript, Servlets, Struts, Spring, Hibernate, JDBC, EJB, XML, and PL/SQL, and in working with Agile methodologies.
  • Extensive experience with databases such as MySQL and Oracle 11g.
  • Experience writing SQL queries, stored procedures, triggers, cursors, and packages.
  • Good experience writing optimized MapReduce jobs in Java.
  • Experience implementing user-defined functions (UDFs) for Pig and Hive.
  • Experience working with web services: REST, JAX-WS, SOAP, and AWS.
  • Good knowledge of and hands-on experience with Cassandra, Flume, and Spark on YARN.
  • Good knowledge of the distributed coordination service ZooKeeper and the search platform Solr.
  • Expertise in preparing test cases, documenting, and performing unit and integration testing.
  • In-depth understanding of data structures, algorithms, and optimization.
  • Strong knowledge of the software development life cycle and expertise in detailed design documentation.
  • Fast learner with strong analytical, interpersonal, and communication skills; interested in problem solving and troubleshooting.
  • Self-motivated, excellent team player with a positive attitude who adheres to strict deadlines.

TECHNICAL SKILLS:

Languages: Java, C, C++

Big Data Technologies: Hadoop, HDFS, MapReduce, Hive, Pig, HBase, Storm, Kafka, Impala, Sqoop, Oozie, ZooKeeper, Spark, Cassandra, Cloudera CDH4/CDH5, HiveQL, Pig Latin, Git, Elasticsearch, Maven

RDBMS: Oracle, MySQL, SQL Server

NoSQL: HBase, Cassandra

Scripting & Query Languages: Python, Shell, SQL, PL/SQL

Web/Application Servers: IBM WebSphere, Apache Tomcat, LDAP

Middleware: RMI, EJB, JMS, Hibernate

Technologies: J2EE, JDBC, Multi-threading, JSP, Servlets, Struts, JSF, AJAX, SOAP, XSLT, DOM, CSS, DTD and Schema

PROFESSIONAL EXPERIENCE:

Senior Java Architect

Confidential, San Jose, CA

Responsibilities:

  • Involved in the high-level design of the Hadoop architecture for the existing data structures and business processes.
  • Created indices in Elasticsearch and wrote filtered aggregation queries.
  • Collected the JSON generated by the Data Collector from Oracle and mapped it to objects for processing and persistence.
  • Involved in code promotion and migration from the WebAction environment to Storm.
  • Developed code to read the data stream from Kafka and route each message to the appropriate bolt over its own named stream (a minimal sketch follows this list).
  • Wrote Python scripts for internal testing that read data from a file and push it into the Kafka queue, which is in turn consumed by the Storm application.
  • Used Vagrant to provision an internal VM as a running environment for testing, and Maven for builds.
  • Wrote QA test cases for data processing and persistence as part of the Agile unit-testing process.
  • Exposed Elasticsearch queries as REST services for rendering time-series data as graphs (sketched below, after the environment line).
  • Integrated the code with the real-time Data Collector feed and made the results available in the UI for monitoring and querying.
  • Mentored new team members on the workflow and tools involved.
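
The Kafka-to-bolt routing mentioned above refers to a Storm topology of roughly the following shape. This is a minimal sketch rather than the project's code: the ZooKeeper host, topic, stream names, and routing rule are hypothetical, and the package names assume the pre-1.0 storm-core and storm-kafka artifacts.

```java
import backtype.storm.Config;
import backtype.storm.StormSubmitter;
import backtype.storm.spout.SchemeAsMultiScheme;
import backtype.storm.topology.BasicOutputCollector;
import backtype.storm.topology.OutputFieldsDeclarer;
import backtype.storm.topology.TopologyBuilder;
import backtype.storm.topology.base.BaseBasicBolt;
import backtype.storm.tuple.Fields;
import backtype.storm.tuple.Tuple;
import backtype.storm.tuple.Values;
import storm.kafka.KafkaSpout;
import storm.kafka.SpoutConfig;
import storm.kafka.StringScheme;
import storm.kafka.ZkHosts;

public class KafkaRoutingTopology {

    /** Reads raw messages from the Kafka spout and routes each one to a named stream. */
    public static class RouterBolt extends BaseBasicBolt {
        @Override
        public void execute(Tuple tuple, BasicOutputCollector collector) {
            String json = tuple.getString(0);                         // message decoded by StringScheme
            String stream = json.contains("\"alert\"") ? "alerts" : "metrics";
            collector.emit(stream, new Values(json));                 // route to the matching stream
        }

        @Override
        public void declareOutputFields(OutputFieldsDeclarer declarer) {
            declarer.declareStream("alerts", new Fields("json"));
            declarer.declareStream("metrics", new Fields("json"));
        }
    }

    /** Stand-in for the downstream processing/persistence bolts. */
    public static class SinkBolt extends BaseBasicBolt {
        @Override
        public void execute(Tuple tuple, BasicOutputCollector collector) {
            System.out.println(tuple.getStringByField("json"));
        }

        @Override
        public void declareOutputFields(OutputFieldsDeclarer declarer) {
            // terminal bolt: nothing to declare
        }
    }

    public static void main(String[] args) throws Exception {
        // Hypothetical ZooKeeper host, topic, zkRoot, and consumer id.
        SpoutConfig spoutConf = new SpoutConfig(
                new ZkHosts("zk-host:2181"), "events", "/kafka", "router-consumer");
        spoutConf.scheme = new SchemeAsMultiScheme(new StringScheme());

        TopologyBuilder builder = new TopologyBuilder();
        builder.setSpout("kafka-spout", new KafkaSpout(spoutConf), 1);
        builder.setBolt("router", new RouterBolt(), 2).shuffleGrouping("kafka-spout");
        // Each downstream bolt subscribes only to the stream it cares about.
        builder.setBolt("alert-bolt", new SinkBolt(), 1).shuffleGrouping("router", "alerts");
        builder.setBolt("metric-bolt", new SinkBolt(), 1).shuffleGrouping("router", "metrics");

        StormSubmitter.submitTopology("kafka-routing", new Config(), builder.createTopology());
    }
}
```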

Environment: Elasticsearch, Storm, Kafka, Git, Maven, Java, J2EE, REST, Python scripting, Vagrant, Chef
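
The REST exposure of Elasticsearch queries could look roughly like the JAX-RS resource below. This is a hedged sketch: the index, field, and endpoint names are hypothetical, and it assumes an Elasticsearch Java `Client` (1.x/2.x transport client) is constructed and injected elsewhere in the application.

```java
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.PathParam;
import javax.ws.rs.Produces;
import javax.ws.rs.core.MediaType;

import org.elasticsearch.action.search.SearchResponse;
import org.elasticsearch.client.Client;
import org.elasticsearch.index.query.QueryBuilders;
import org.elasticsearch.search.aggregations.AggregationBuilders;

@Path("/metrics")
@Produces(MediaType.APPLICATION_JSON)
public class MetricsResource {

    private final Client client;        // Elasticsearch client, provided by the application

    public MetricsResource(Client client) {
        this.client = client;
    }

    /** Counts documents per status for one host, e.g. GET /metrics/by-status/host-42 */
    @GET
    @Path("/by-status/{host}")
    public String countByStatus(@PathParam("host") String host) {
        SearchResponse response = client.prepareSearch("metrics")            // index name (hypothetical)
                .setQuery(QueryBuilders.termQuery("host", host))             // filter to a single host
                .addAggregation(AggregationBuilders.terms("by_status").field("status"))
                .setSize(0)                                                  // aggregations only, no hits
                .execute().actionGet();
        return response.toString();                                          // JSON body of the search response
    }
}
```

The UI can poll such an endpoint and plot the returned aggregation buckets as graphs.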

Senior Java Architect

Confidential, Columbus, OH

Responsibilities:

  • Acted as the lead resource and built the entire Hadoop platform from scratch.
  • Evaluated the suitability of Hadoop and its ecosystem for the project, implementing and validating various proof-of-concept (POC) applications before adopting them as part of the Big Data Hadoop initiative.
  • Estimated the software and hardware requirements for the NameNode and DataNodes in the cluster.
  • Extracted the needed data from the source servers into HDFS and bulk-loaded the cleaned data into HBase using MapReduce.
  • Wrote the MapReduce programs and Hive UDFs in Java (minimal sketches of both appear below).
  • Unit-tested the MapReduce code with JUnit.
  • Developed Hive queries for the analysts.
  • Created an e-mail notification service that alerts the requesting team when its job completes.
  • Defined job workflows in Oozie according to their dependencies.
  • Played a key role in productionizing the application after testing by the BI analysts.
  • Maintained system integrity of all Hadoop-related sub-components.
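
The MapReduce programs mentioned above were of the general shape below. This is a minimal, self-contained sketch rather than the original code: a map-only job that drops malformed delimited records and normalizes the rest before they are bulk loaded; the delimiter, field count, and paths are hypothetical.

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class CleanRecordsJob {

    public static class CleanMapper extends Mapper<LongWritable, Text, NullWritable, Text> {
        private static final int EXPECTED_FIELDS = 12;   // hypothetical record width

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            String[] fields = value.toString().split("\\|", -1);      // pipe-delimited input (assumed)
            if (fields.length != EXPECTED_FIELDS) {
                context.getCounter("clean", "malformed").increment(1); // track rejected records
                return;                                                // drop malformed record
            }
            String cleaned = value.toString().trim().toUpperCase();    // trivial normalization
            context.write(NullWritable.get(), new Text(cleaned));
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "clean-records");
        job.setJarByClass(CleanRecordsJob.class);
        job.setMapperClass(CleanMapper.class);
        job.setNumReduceTasks(0);                        // map-only job
        job.setOutputKeyClass(NullWritable.class);
        job.setOutputValueClass(Text.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```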

Environment: Apache Hadoop, HDFS, Hive, MapReduce, Java, Cloudera CDH4, Oozie, Oracle, MySQL, Amazon S3
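
A custom Hive UDF of the kind referenced above might look like the sketch below, using the classic org.apache.hadoop.hive.ql.exec.UDF API; the function name and masking rule are hypothetical.

```java
import org.apache.hadoop.hive.ql.exec.Description;
import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

@Description(name = "mask_id",
             value = "_FUNC_(id) - masks all but the last four characters of an identifier")
public class MaskIdUDF extends UDF {

    public Text evaluate(Text id) {
        if (id == null) {
            return null;                               // preserve NULL semantics
        }
        String s = id.toString();
        if (s.length() <= 4) {
            return new Text(s);                        // too short to mask
        }
        // Hypothetical rule: star out everything except the last four characters.
        String masked = s.substring(0, s.length() - 4).replaceAll(".", "*")
                      + s.substring(s.length() - 4);
        return new Text(masked);
    }
}
```

Once packaged into a JAR, such a function would be registered in Hive with ADD JAR and CREATE TEMPORARY FUNCTION before being used in queries.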

Sr. Java Developer

Confidential, Rocky Hill, CT

Responsibilities:

  • Set up the 64-node cluster and configured the entire Hadoop platform.
  • Migrated the needed data from Oracle and MySQL into HDFS using Sqoop, and imported flat files of various formats into HDFS.
  • Worked mainly on Hive queries to categorize data for the different claims (a JDBC-based query sketch follows this list).
  • Integrated the Hive warehouse with HBase.
  • Wrote custom Hive UDFs in Java where the required functionality was too complex for built-in functions.
  • Implemented partitioning, dynamic partitions, and buckets in Hive.
  • Designed and created Hive external tables, using a shared metastore instead of Derby, with partitioning, dynamic partitioning, and buckets.
  • Generated final reporting data in Tableau for testing by connecting to the corresponding Hive tables through the Hive ODBC connector.
  • Maintained system integrity of all Hadoop sub-components (primarily HDFS, MapReduce, HBase, and Hive).
  • Monitored system health and logs and responded to any warning or failure conditions.
  • Presented data and data flows using Talend for reusability.
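
The claim-categorization queries above were run from the Hive CLI and from Tableau over ODBC; the sketch below shows the same kind of partition-pruned HiveQL query issued from Java over the HiveServer2 JDBC driver. The connection details, table, and partition column are hypothetical.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class ClaimsByCategory {

    public static void main(String[] args) throws Exception {
        Class.forName("org.apache.hive.jdbc.HiveDriver");                // HiveServer2 JDBC driver
        // Hypothetical HiveServer2 host, database, and credentials.
        try (Connection conn = DriverManager.getConnection(
                     "jdbc:hive2://hive-host:10000/default", "etl_user", "");
             Statement stmt = conn.createStatement()) {
            // claim_date is the partition column, so this query scans only one partition.
            ResultSet rs = stmt.executeQuery(
                    "SELECT claim_type, COUNT(*) AS n "
                  + "FROM claims WHERE claim_date = '2014-06-01' "
                  + "GROUP BY claim_type");
            while (rs.next()) {
                System.out.println(rs.getString("claim_type") + "\t" + rs.getLong("n"));
            }
        }
    }
}
```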

Environment: UNIX, Apache Hadoop, HDFS, Hive, Java, Sqoop, Cloudera CDH4, Oracle, MySQL, Tableau, Talend, Elasticsearch, Kibana, SFTP

Senior Java Developer

Confidential, Windsor, CT

Responsibilities:

  • Acted as the lead for a development team of 8, gathering the requirements and designing the flow of the project.
  • Analyzed and documented all test cases, based on the gathered requirements, for both unit and integration testing.
  • Designed the portal user interface with all the components needed for plan selection.
  • Designed RESTful web services to populate the details of the individual plans available for customers to choose from.
  • Programmed the functionality for all user-interface components, interacting with the database through Enterprise JavaBeans and MySQL Server.
  • Developed various controller classes and business logic using the Spring libraries, which interact with the middle tier to perform the business operations (a minimal controller sketch follows this list).
  • Responsible for developing custom tools per client needs.
  • Developed the DTDs finalized by the business.
  • Tested the application by programming JUnit test cases for both unit and integration testing, and tracked bugs across the entire application.
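
A controller of the kind described above could be sketched as follows. The URLs, service interface, and DTO are hypothetical; in the real application the controller delegated to the middle tier, which is stubbed here as a nested interface so the sketch stays self-contained.

```java
import java.util.List;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Controller;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;
import org.springframework.web.bind.annotation.ResponseBody;

@Controller
@RequestMapping("/plans")
public class PlanController {

    /** Hypothetical middle-tier service; the real implementation sat behind EJB/Spring beans. */
    public interface PlanService {
        List<PlanSummary> findAvailablePlans(long customerId);
    }

    /** Minimal DTO for a plan row shown in the selection UI. */
    public static class PlanSummary {
        public String planId;
        public String name;
        public double monthlyPremium;
    }

    @Autowired
    private PlanService planService;

    /** Returns the plans available to a customer, e.g. GET /plans/customer/1001 */
    @RequestMapping(value = "/customer/{customerId}", method = RequestMethod.GET)
    @ResponseBody
    public List<PlanSummary> plansForCustomer(@PathVariable("customerId") long customerId) {
        return planService.findAvailablePlans(customerId);
    }
}
```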

Environment: Core Java, JDK 1.7, JSP, Struts, EJB, Hibernate, MySQL, SOAP, REST, JUnit, Eclipse, HTML, JavaScript, XML

Senior Java Developer

Confidential

Responsibilities:

  • Prepared the design document for the flow of each module and its dependencies on other modules.
  • Documented all test cases, based on the gathered requirements, for both unit and integration testing.
  • Developed the user-interface components beyond the ready-to-use ones, using CSS to maintain uniformity across the application.
  • Developed the front-end interface using JavaScript and Ajax.
  • Implemented SOAP web services exposing total-equipment and mechanic details to all departments (a minimal endpoint sketch follows this list).
  • Participated in and delivered various work products in the development and implementation of software deliverables and software configuration management (SCM).
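
A JAX-WS endpoint of the kind mentioned above might be sketched as follows. The service name, operations, and return values are hypothetical stand-ins for the real persistence-backed logic, and the main method publishes the service locally for testing (the project deployed to WebSphere Application Server instead).

```java
import javax.jws.WebMethod;
import javax.jws.WebParam;
import javax.jws.WebService;
import javax.xml.ws.Endpoint;

@WebService(serviceName = "EquipmentService")
public class EquipmentService {

    /** Total number of pieces of equipment registered for a department. */
    @WebMethod
    public int totalEquipment(@WebParam(name = "departmentCode") String departmentCode) {
        return 42;                       // stand-in: the real service delegated to the persistence layer
    }

    /** Name of the mechanic currently assigned to a piece of equipment. */
    @WebMethod
    public String assignedMechanic(@WebParam(name = "equipmentId") long equipmentId) {
        return "UNASSIGNED";             // stand-in value
    }

    public static void main(String[] args) {
        // Publishes the endpoint on the JDK's built-in HTTP server for local testing.
        Endpoint.publish("http://localhost:8080/ws/equipment", new EquipmentService());
    }
}
```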

Environment: Core Java, JDK 1.6, JSP, Spring Framework, EJB, Hibernate, Oracle 10g, JUnit, Eclipse, HTML, CSS, JavaScript, REST, XML, WebSphere Application Server.

Java Developer/ Programmer

Confidential, Bentonville, USA

Responsibilities:

  • Interacted with the client to gather requirements and prepared test cases for every requirement for use in testing.
  • Developed the UI using HTML, JavaScript, CSS, and JSP for interactive, cross-browser functionality and a complex user interface.
  • Implemented the end-to-end functionality of the client requirements during the development phase.
  • Mapped entities to the database using Hibernate (an example mapping follows this list).
  • Wrote the SQL queries used over the JDBC connection in accordance with the business logic (a JDBC sketch appears below).
  • Performed various levels of unit testing across the entire application using the prepared test cases, including detailed documentation of the results.
  • Actively participated in client meetings and took input on additional functionality.
  • Suggested improvements to the user interface from the user's perspective.
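
An entity mapping of the kind referred to above could look like the sketch below, using JPA annotations understood by Hibernate; the table and column names are hypothetical, and the matching Hibernate configuration and session wiring are assumed to exist elsewhere.

```java
import java.math.BigDecimal;

import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.GenerationType;
import javax.persistence.Id;
import javax.persistence.Table;

@Entity
@Table(name = "customer_order")                            // hypothetical table
public class CustomerOrder {

    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)    // MySQL auto-increment key
    @Column(name = "order_id")
    private Long id;

    @Column(name = "customer_id", nullable = false)
    private Long customerId;

    @Column(name = "status", length = 20)
    private String status;

    @Column(name = "total", precision = 12, scale = 2)
    private BigDecimal total;

    public Long getId() { return id; }
    public Long getCustomerId() { return customerId; }
    public void setCustomerId(Long customerId) { this.customerId = customerId; }
    public String getStatus() { return status; }
    public void setStatus(String status) { this.status = status; }
    public BigDecimal getTotal() { return total; }
    public void setTotal(BigDecimal total) { this.total = total; }
}
```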

Environment: J2EE, Spring, Hibernate, JavaScript, CSS, Servlets, MySQL
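
The SQL-over-JDBC bullet above refers to queries of roughly this shape; the connection URL, credentials, table, and columns below are hypothetical, and a standalone main method stands in for the servlet/business-logic context.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class OrderLookup {

    public static void main(String[] args) throws Exception {
        // Hypothetical MySQL connection details.
        try (Connection conn = DriverManager.getConnection(
                     "jdbc:mysql://db-host:3306/store", "app_user", "secret");
             PreparedStatement ps = conn.prepareStatement(
                     "SELECT order_id, total FROM orders WHERE customer_id = ? AND status = ?")) {
            ps.setLong(1, 1001L);                          // bind parameters per the business logic
            ps.setString(2, "OPEN");
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    System.out.println(rs.getLong("order_id") + " -> " + rs.getBigDecimal("total"));
                }
            }
        }
    }
}
```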
