Java Developer Resume Profile
Littleton, CO
PROFESSIONAL SUMMARY:
- Overall 10 years of experience as a Java Programmer and Data Analyst, with domain expertise in Java, J2EE, and Hadoop technologies.
- 2 years of experience with Hadoop HDFS, MapReduce, GIS Tools for Hadoop, Pig, Hive, and Sqoop.
- SCJP and Oracle DBA-1 certified, with hands-on experience in enterprise J2EE applications and RESTful web services, including AWS, JBoss AS administration and deployment, Swing GUI, JUnit, JSP, JMX, Servlets, Portlets, XML, JSON, Maven, Oracle, PostgreSQL, Neo4j, and MS SQL, and frameworks such as Spring (IoC, MVC integration, and transaction management), while adhering to Agile (JIRA) and XP paradigms with Subversion (SVN).
- Data modeling with SQL Power Architect, and ETL implementation with Informatica and custom tools for capturing metadata of ETL transformations, data profiling, data quality, and DDL generation.
- Machine learning and data analysis algorithms for pattern recognition: the Apache Mahout machine learning library, high-dimensional analysis of heterogeneous and unstructured data, supervised and unsupervised classification, dimensionality reduction, graph theory, Linked Data, data fusion, and parallel processing on GPUs in CUDA C.
- Excellent knowledge of Big Data and rapid assimilation of new skills, such as spatial analytics and geo-processing tools for Hadoop MapReduce and ArcGIS Server on Amazon EC2, used to manage and exploit spatial intelligence from large-scale data sets, with practical expertise in implementing Geo-Spatial Linked Data.
TECHNICAL SKILLS AND EXPERTISE:
Programming Development:
Excellent programming skills in core Java and J2EE technologies, including JUnit for testing, JBoss AS 7, Tomcat, JSP, JMS, JSON (GSON and Jackson APIs), JMX MBeans, EJB, Servlets, JSR Portlets, Maven, Oracle 10g, PostgreSQL, MyBatis, Hibernate, AJAX, JavaScript, jQuery, amCharts, jsTree, XML, CSS, HTML5, Python, and CUDA C on GPUs, plus data modeling tools such as SQL Power Architect and ETL tools such as Informatica.
Web and Enterprise GIS:
ArcGIS Server, ArcSDE, and JavaScript plug-ins such as the ESRI JS API, Google Maps API, and OpenLayers; PostGIS, Neo4j Spatial, and Oracle Spatial. Sound knowledge of OGC standards from an implementation standpoint: WMS, WFS, data-interchange formats such as GML, XML, and GeoJSON, GIS Big Data with Hadoop MapReduce, and GeoSPARQL.
Big Data and Hadoop:
Confidential MapReduce, Pig, Pigeon, Hive, Sqoop, Flume, ESRI GIS Tools for Hadoop, and the ESRI Geometry API for big data.
Frameworks:
Spring (MVC, IoC), Apache Jena (Semantic Web), and integration of core technologies.
Operating Systems:
Windows, Unix/Linux
Data Modelling, DDL, ETL, Databases, SQL:
Informatica and SQL Power Architect; Oracle 10g, PostgreSQL, Neo4j, MySQL, Oracle Spatial, and PL/SQL.
Continuous Integration, Refactoring and Agile Practice:
JUnit, Subversion (SVN), GitHub, JIRA
Development Environments:
Eclipse, IntelliJ IDEA, MS Visual Studio 2010, MATLAB, Python shell.
Machine learning techniques:
Apache Mahout, artificial neural networks, SVMs, Bayesian machine learning techniques, graphical models, and development and implementation of parallel algorithms on GPUs in CUDA C.
PROFESSIONAL EXPERIENCE:
Confidential
Java Developer
- Responsible for developing applications that improve platform performance by running automated backups, driven by an XML configuration file, for client projects on the platform, which supports multiple DBMS vendors (Oracle, MySQL, and PostgreSQL).
- Involved in moving the application to the cloud so users can integrate the platform with Hadoop and support a Hadoop data lake model, providing scalability for the platform's analytical capabilities. Redesigned the data layer to support HDFS.
- Redesigned WFS and WMS map-service connectivity to ESRI MapServer, ensuring OGC compliance and RESTful service integration.
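The XML-driven, multi-vendor backup selection described above could be sketched roughly as follows; the element and attribute names (`<db vendor=... name=.../>`) and the per-vendor dump commands are illustrative assumptions, not the actual project configuration.

```java
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;

/** Minimal sketch: pick a backup command per DBMS vendor from an XML config. */
public class BackupConfigSketch {
    /** Map a vendor name to its (hypothetical) dump command for a database. */
    static String backupCommand(String vendor, String db) {
        switch (vendor) {
            case "postgresql": return "pg_dump " + db;
            case "mysql":      return "mysqldump " + db;
            case "oracle":     return "expdp schemas=" + db;
            default: throw new IllegalArgumentException("unsupported vendor: " + vendor);
        }
    }

    public static void main(String[] args) throws Exception {
        // In the real platform this XML would come from the client's config file.
        String xml = "<backups>"
                   + "<db vendor='postgresql' name='projects'/>"
                   + "<db vendor='mysql' name='metrics'/>"
                   + "</backups>";
        Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder()
                .parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
        NodeList dbs = doc.getElementsByTagName("db");
        for (int i = 0; i < dbs.getLength(); i++) {
            Element e = (Element) dbs.item(i);
            System.out.println(backupCommand(e.getAttribute("vendor"), e.getAttribute("name")));
        }
    }
}
```

A real implementation would execute the commands and handle credentials and scheduling; the sketch only shows the config-driven vendor dispatch.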
Technologies: Eclipse OSGi, Eclipse SWT, JFace, Java, JAXB, XSD, XML, PostgreSQL, Oracle, MySQL, Git, Ant, ESRI MapServer, etc.
Software Developer
Responsibilities:
- Actively contributed to development of a Data-as-a-Service (DaaS) platform product, and was responsible for building RESTful web services and visualization; GateIn portal customization and configuration; JBoss AS 7 administration; MyBatis mappings; JMX MBeans; user registration services; relevant JUnit test cases; JavaScript, XML, CSS, HTML5, jQuery, jsTree, amCharts, Tableau, and D3.js for portlet navigation, search, and display per the JSR 286 portlet specification; and data modeling and ETL tools such as SQL Power Architect and Informatica PowerCenter.
- Contributed to introducing Linked Data and federated queries using DB2 Information Integrator, InfoSphere DataStage, and OpenLink Dataspaces, with further customization to leverage SPARQL and the Semantic Web for public data, and Hadoop Pig and HiveQL for unstructured data from internal and external sources, integrated into the DaaS platform architecture.
- Data modeling and profiling with SQL Power Architect, and ETL implementation with Informatica and custom tools for capturing metadata of ETL transformations and data profiling, for data quality and DDL generation.
- The primary goal was to let users, including analysts, rapidly build and test queries on sample data for any business objects, using custom sandboxes for analysis with the business-glossary and data-dictionary terms of the enterprise metadata; link those terms to diverse data sources (RDBMS and HDFS); allow the federated data engine to extract such open data, then transform and load it into an RDBMS for further analysis; and create RDF links in the Semantic Web so users can discover applicable services and other relationships between those terms and entities.
- Also contributed to customizing both versions of this product for demonstration at the Federal Communications Commission; the product has since been successfully adopted at the Internal Revenue Service. Computech provides data strategy for both clients.
Technologies: GateIn, JBoss AS 7, JSR 286 Portlets, Spring (MVC, IoC, transaction management), MyBatis, JMX MBeans, JSON, GSON, JAX-RS, JUnit, Apache Jena, Maven, Eclipse, JavaScript, XML, jQuery, jsTree, amCharts, Tableau, AJAX, JSP, SQL Power Architect, Informatica PowerExchange, DB2 Information Integrator, InfoSphere DataStage, Pig, HiveQL, MapReduce, SPARQL, and JIRA.
GIS Big Data Consultant
Responsibilities:
- Confidential and image data analysis: processed more than 100 TB of LiDAR data and fused heterogeneous data, including crowd-sourced geotags and GeoNames. Geo-processed coordinate data using Python, PostGIS, xlrd, and other tools in a Linux environment; performed shapefile overlay and georeferencing.
- 3D point cloud classification using machine learning techniques (SVM), and analysis of spatial big data using ESRI GIS Tools for Hadoop and the ESRI Geometry API, PostgreSQL, PostGIS, SpatialHadoop, Pigeon (the spatial extension of Pig), and Amazon Elastic MapReduce.
- Handled importing data from various sources, performed transformations using Hive and MapReduce, and loaded the data into HDFS.
- Working on parallelizing heuristics for active machine learning algorithms for high-dimensional classification using GPU computing in CUDA C, which should improve speed and efficiency and reduce idle time between successive iterations. Active learning reduces labeling requirements by identifying the optimum number of most informative pixels needed to accurately classify a scene, which can reduce the cost of expert annotation.
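As a rough illustration of the selection step active learning performs (not the project's GPU implementation, which runs in CUDA C), uncertainty sampling picks the unlabeled sample whose predicted class probability is closest to 0.5, i.e. the one the classifier is least sure about:

```java
/** Illustrative sketch of uncertainty sampling, the core active-learning step
 *  of choosing the most informative unlabeled sample to send for labeling. */
public class UncertaintySampling {
    /** Return the index of the sample whose predicted probability of the
     *  positive class is closest to 0.5 (maximum classifier uncertainty). */
    static int mostInformative(double[] probs) {
        int best = 0;
        for (int i = 1; i < probs.length; i++) {
            if (Math.abs(probs[i] - 0.5) < Math.abs(probs[best] - 0.5)) {
                best = i;
            }
        }
        return best;
    }

    public static void main(String[] args) {
        // Hypothetical classifier outputs for four unlabeled pixels.
        double[] probs = {0.95, 0.52, 0.10, 0.80};
        System.out.println(mostInformative(probs)); // index 1: p=0.52 is most uncertain
    }
}
```

The parallelization mentioned above amounts to evaluating this argmin (and the classifier itself) over millions of pixels concurrently on the GPU.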
Technologies: ESRI Geometry Tools for Hadoop, Pig, Hive, GitHub, Sqoop, PostgreSQL, PostGIS, Python, MapReduce, Amazon EC2 cluster, EMR, and Redshift.
Confidential
Responsibilities:
- Worked on contract as a programmer for Confidential (Java, J2EE) using the MVC 2 framework, customizing, configuring, and testing the enterprise Java application ERDAS APOLLO / Landmap Kaia: tiling and vector extraction, storage, and native integration with Oracle Spatial.
- Performed various GIS analysis tasks using ArcGIS and Quantum GIS, and image-processing tasks using ENVI and ERDAS, with pre-processing software such as ATCOR and PARGE for geometric rectification.
- Engaged clients and prospects in the oil and gas industry, including Shell Technologies, Aramco, and Rolls-Royce, to demonstrate and deliver geospatial solutions.
- Developed MATLAB code for high-dimensional data analysis to determine soil composition and identify hydrocarbon resources from hyperspectral image data for an oil exploration company.
Technologies: J2EE, JSP, JDBC, EJB, Spring, JMS, ActiveMQ, SOAP (JAX-WS), Maven, Oracle Spatial, ArcGIS Server 9.2, ArcSDE, ArcIMS, Model Builder, ESRI BusinessMAP, ArcGIS JavaScript API, ERDAS APOLLO, ATCOR, and PARGE.
Junior Software Developer
Responsibilities:
- Used Rational Rose to design class diagrams, use case diagrams, and sequence diagrams
- Implemented the project using Spring MVC
- Used Hibernate to communicate with the Oracle Database
- Developed GUI using JSP, JavaScript, CSS, HTML, AJAX
- Integration of GIS with financial services, adding a spatial component to enterprise data to increase productivity in sales and marketing, assets and facilities management, client property, etc.
- Primarily responsible for code testing and for integrating business logic with geospatial technologies, mainly in the middleware.
- Actively assisted the team manager in evaluating outcomes of Scrum meetings and tracking project progress.
Technologies: J2EE, JSP, JDBC, EJB, Spring, JMS, ActiveMQ, SOAP (JAX-WS), Maven, Tomcat, Oracle, Hibernate, ArcGIS Server 9.2, ArcSDE, ArcIMS, Model Builder, ESRI BusinessMAP, ArcGIS JavaScript API.
Java Developer
Confidential and maintains information on railcars, VINs, ramps, facilities, shippers, dealers, haulaways, and vehicle damage. FM contains the mapping information for shippers, locations, and haulaways. FM handles security and credentials information for AMS users, authenticated SPLCs, shippers, and dealers. It feeds alert, event, and vehicle-model information to AMS as part of EDI message processing, which helps create railcar and VIN cycles.
Roles and Responsibilities:
- Designed mock-ups using SHTML in a WebLogic environment.
- Developed DAOs using Spring JDBC.
- Implemented ORM applications using Hibernate and integrated them with Spring.
- Development, enhancement, bug fixing, and writing JUnit test cases for Java applications.
Technologies: MVC, Hibernate 3.1, JavaScript, J2EE, CVS, Ant, Spring 2.0, AJAX, XMLBeans, WebLogic 8, Oracle 10g, Windows 2000/NT, and Unix.