- More than 8 years of Java/web development across several industries and projects, covering cross-platform design and development (web-based and client-server applications) using object-oriented programming and Java/J2EE technologies.
- Expertise in implementing the Spring framework for Dependency Injection, support for the Data Access Object (DAO) pattern, and integration with Hibernate.
- Expertise in object-oriented technologies using Java, J2EE, design patterns, JSP, Servlets, Struts, Spring, Hibernate, JDBC, EJB, JMS, JavaMail, RMI/IIOP, JAXB, and Jackson APIs.
- Strong exposure to object-oriented analysis and design (OOAD/OOP) and design patterns.
- Excellent hands-on experience with the Struts framework, including extensive use of the Validator and Tiles frameworks.
- Expertise in developing on the Hadoop/Big Data ecosystem, with experience developing and testing Java/J2EE and web-based technologies against different back-end databases.
- Good knowledge of and exposure to Big Data processing using the Hadoop ecosystem, including Pig, Hive, HDFS, MapReduce (MRv1 and YARN), ZooKeeper, Spark, and Impala.
- Excellent knowledge of and experience in core Java, J2EE, web design, and client-server applications.
- Designed and implemented distributed systems, including exploration and customization of Kafka, Hadoop, Spark, Storm, HBase, Presto, etc.
- Experience using automation tools such as Puppet for deploying Cassandra clusters.
- Experienced in front-end development using Ajax, JSON, jQuery, JSP, AngularJS, Bootstrap, and ExtJS.
- Experienced in deploying Internet/intranet applications on web and application servers.
- Designed and developed a big data querying system similar to Presto.
- Worked with Pig, Hive, HBase, Cassandra, ZooKeeper, Oozie, Spark, Storm, Impala, AWS, and Kafka.
- Expertise in deploying Hadoop, YARN, Spark, and Storm, integrated with Cassandra, Ignite, RabbitMQ, Kafka, etc.
- Expertise in J2EE and MVC architecture/implementation, web services, SOA, analysis, design, object modeling, data modeling, integration, validation, implementation, and deployment.
- Experience with Spring modules such as MVC, AOP, JDBC, ORM, JMS, and Web Services, using the Eclipse and STS IDEs.
- Experienced in the full Software Development Life Cycle (SDLC), from gathering business specifications through analysis, design, development, testing, and documentation.
- Experienced in working on SOAP and RESTful Web Services.
- Proficient in deploying n-tier enterprise/web applications on IBM WebSphere, BEA WebLogic, and Apache Tomcat.
- Used Apache Axis to develop and configure SOAP/WSDL-based web services accessed by numerous clients running both Java and non-Java applications.
- Hands-on experience with production-support and bug-tracking tools such as Jira, Remedy, and Quality Center.
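The DAO pattern referenced in the bullets above can be sketched in plain Java. This is an illustrative sketch only: `User`, `UserDao`, and `InMemoryUserDao` are hypothetical names, and the in-memory map stands in for a Hibernate- or JDBC-backed store that would be swapped in via dependency injection.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;

// Hypothetical domain object for illustration.
class User {
    final int id;
    final String name;
    User(int id, String name) { this.id = id; this.name = name; }
}

// The DAO contract: callers depend on this interface,
// not on the storage technology behind it.
interface UserDao {
    void save(User user);
    Optional<User> findById(int id);
}

// In-memory stand-in for a real data store; a Hibernate- or
// JDBC-backed DAO would implement the same interface.
class InMemoryUserDao implements UserDao {
    private final Map<Integer, User> store = new HashMap<>();
    public void save(User user) { store.put(user.id, user); }
    public Optional<User> findById(int id) { return Optional.ofNullable(store.get(id)); }
}

public class DaoDemo {
    public static void main(String[] args) {
        UserDao dao = new InMemoryUserDao();
        dao.save(new User(1, "Ada"));
        System.out.println(dao.findById(1).map(u -> u.name).orElse("missing"));
    }
}
```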
Java/J2EE Technologies: J2EE, JDK 1.8, EJB, JSP, Servlets, JDBC, JNDI, RMI, JavaMail API
Web/Application Servers: IBM WebSphere 8.0/5.0, BEA WebLogic 6.1/6.0/5.1, Apache Tomcat 4.0.3, IIS 4.0, Apache 1.3
EAI Server: IBM WebSphere MQ 5.3
RAD Tools: WebSphere Studio Application Developer (WSAD 5.0), Borland JBuilder, Microsoft Visual InterDev, XML Spy, IBM Rational Application Developer (RAD), Eclipse
Utilities & Frameworks: Ant, Jakarta Struts 1.0/1.1, Hibernate, Spring, JUnit, Accenture's GRNDS
Markup Languages: HTML, DHTML, XML, XSLT
Databases: DB2, Microsoft SQL Server, Confidential 11g
OS: Windows 7/8/10, Unix, Linux
Source Code Control: Rational ClearCase, Rational ClearQuest, CVS, Visual SourceSafe, Git, SVN
Confidential, Minneapolis, MN
Sr. Java/Big Data Developer
- Developed controllers and business logic using the Spring MVC module.
- Designed and developed persistence-layer components using Spring and Hibernate to store and fetch data from the database.
- Experience working with big data and real-time/near-real-time analytics on big data platforms such as Hadoop and Spark, using programming languages like Scala and Java.
- Experience working with HBase, Pig, Hive, and MapReduce programs under Cloudera Manager on YARN to parallel-process data files in an AWS environment.
- Used NoSQL on MongoDB and DB2 BLU; great understanding and implementation experience of Big Data architecture at the enterprise database level.
- Mentored several client engineers onsite on all aspects of the Presto framework, development, and best practices.
- Installed and configured Hive, Pig, Spark, Presto, and RStudio on AWS EMR clusters.
- Experience in Kafka deployment and integration with Confidential databases.
- Involved in designing the Cassandra architecture.
- Experience working in an AWS cloud computing environment with Elastic MapReduce and Amazon S3.
- Performed advanced procedures such as text analytics and processing, using the in-memory computing capabilities of Spark with Scala.
- Created and maintained frameworks, utilities, and shared libraries for the processing and maintenance of internal and external data stores across a number of different interfaces and conventions (REST, SOAP, SQL, MongoDB, Hadoop/HDFS, Hive, and Presto).
- Configured inter-node communication between Cassandra nodes and clients using SSL encryption.
- Designed and built a reporting application that uses Spark SQL to fetch and generate reports on HBase table data.
- Worked mainly on core Java, STL, data structures, UNIX, and multithreading.
- Installed and configured a Cassandra cluster and CQL on the cluster.
- Created producers, consumers, and the ZooKeeper setup for Confidential-to-Kafka replication.
- Involved in WebSphere Application Server environment setup, including IBM HTTP Server, Apache Web Server, and IBM Directory Server on UNIX and Windows.
- Built application logic using Python 2.7.
- Used Apache CouchDB (NoSQL) on an AWS Linux instance, in parallel with RDS MySQL, to store and analyze data.
- Worked with the core Spring Framework for Dependency Injection and with Spring Context to provide message sources.
- Installed and configured MongoDB sharding components (shards, mongos routers, and config servers) for a large deployment.
- Worked on analyzing the Hadoop stack and different big data analytic tools, including Pig, Hive, the HBase database, and Sqoop.
- Created various back-end APIs using Python and Django.
- Worked in Python to build distributable libraries for others to use.
- Delivered Big Data products, including re-platforming a legacy global risk-management system with Big Data technologies such as Hadoop, Hive, and HBase.
- Developed complex SQL queries and stored procedures using SQL Developer.
- Used a test-driven development (TDD) approach for building the services required by the application.
- Implemented the logging mechanism using Log4j.
- Used Log4j appenders for local and remote logging, and for writing logs to the database.
- Used the Spring Framework 3.2.2 for transaction management.
- Implemented validation rules using JBoss BRMS (Business Rule Management System), a version of Drools.
- Created web services using Apache Axis2 for communication with other applications.
- Set up MongoDB backups using different methods: the mongodump tool, the fsync utility, master/slave deployment, and hidden replica servers.
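The Log4j appender setup described above (a local file appender plus a JDBC appender that writes log events to a database) can be sketched in Log4j 1.x properties syntax; the file name, JDBC URL, credentials, and table are placeholders, not values from the project:

```properties
log4j.rootLogger=INFO, file, db

# Local rolling file appender
log4j.appender.file=org.apache.log4j.RollingFileAppender
log4j.appender.file.File=app.log
log4j.appender.file.MaxFileSize=10MB
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=%d %-5p [%c] %m%n

# JDBC appender writing log events to a database table (placeholder URL/credentials)
log4j.appender.db=org.apache.log4j.jdbc.JDBCAppender
log4j.appender.db.URL=jdbc:mysql://localhost:3306/logs
log4j.appender.db.driver=com.mysql.jdbc.Driver
log4j.appender.db.user=loguser
log4j.appender.db.password=secret
log4j.appender.db.sql=INSERT INTO app_log (ts, level, message) VALUES ('%d', '%p', '%m')
```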
Sr. Java/Big Data Developer
- Used design patterns such as MVC, Façade, Controller Servlet, Business Delegate, Service Locator, Singleton, Value Object, and Factory while implementing the framework.
- Wrote the Hibernate configuration file and mapping files, and defined persistence classes to persist data into the Confidential database.
- Configured the Hibernate session factory to integrate Hibernate with Spring.
- Employed Spring JDBC to implement batch jobs to pull organization structure related data.
- Installed, configured, and maintained Apache Hadoop clusters for application development, along with Hadoop tools like Hive, Pig, Sqoop, and Flume.
- Experience upgrading the existing Cassandra cluster to the latest releases.
- Used the Spring MVC framework to eliminate complexity and achieve faster, better results.
- Wrote a Kafka REST API to collect events from the front end.
- Experience developing Hadoop applications on Spark using Scala as a functional and object-oriented programming language.
- Used Hibernate for the database framework.
- Configured authorization for the Cassandra cluster using PasswordAuthenticator.
- Wrote various PL/SQL queries for transactional data and created JDBC connections.
- Used Ajax for form validation in JSPs.
- Good experience using Apache Spark, Storm, and Kafka.
- Used Struts validation framework for performing front end validations.
- Developed the project using Agile/Scrum methodologies.
- Used Ant build tool for building and deploying the application.
- Used Log4J utility to log error, info and debug messages.
- Used Spring annotations to create controller and service-layer classes.
- Experience working with scheduling tools such as AutoSys and JAMS, the DataStage ETL tool, and Unix shell and Perl scripts.
- Expertise in Apache Kafka for stream filtering and the Microsoft Unified Communications Web API (UCWA).
- Involved in Big Data project implementation and support.
- Created a relational data model for normalized data.
- Involved in the end-to-end implementation of the Big Data design.
- Used Web services (SOAP) for transmission of large blocks of XML data over HTTP.
- Developed and debugged servlets and EJBs with WebSphere Application Server.
Environment: JDK 1.6, HTML, Backbone, Hibernate, Struts, SOAP, REST, JSP, Eclipse, WebLogic Application Server 10.3, Spring, Spring JDBC, Big Data, Sybase, AngularJS, SQL, AJAX, jQuery, CSS, WSDL, Confidential, Rapid SQL, Log4j, Maven, Ant, JUnit, MVC
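The Hibernate configuration file mentioned in this role typically takes a shape like the following minimal sketch; the driver, connection URL, dialect, and mapping resource shown here are placeholders rather than project values:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE hibernate-configuration PUBLIC
    "-//Hibernate/Hibernate Configuration DTD 3.0//EN"
    "http://hibernate.sourceforge.net/hibernate-configuration-3.0.dtd">
<hibernate-configuration>
  <session-factory>
    <!-- Placeholder connection settings; real values were project-specific -->
    <property name="hibernate.connection.driver_class">com.mysql.jdbc.Driver</property>
    <property name="hibernate.connection.url">jdbc:mysql://localhost:3306/appdb</property>
    <property name="hibernate.dialect">org.hibernate.dialect.MySQLDialect</property>
    <property name="hibernate.show_sql">true</property>
    <!-- One mapping file per persistent class -->
    <mapping resource="com/example/Employee.hbm.xml"/>
  </session-factory>
</hibernate-configuration>
```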
Confidential, Bellevue, Washington
Sr. Java/Big Data Developer
- Involved in writing POJOs, .hbm.xml mapping files, and hibernate.cfg.xml, and configured them for application development.
- Extensively used SQL and PL/SQL in constructing views, indexes, stored procedures, triggers, cursors, functions, and relational database models.
- Experience with data migration from DB2 to Apache Cassandra.
- Developed the Cassandra data model considering current functionality and the business needs of the application.
- Developed Spark code using Scala and Spark SQL/Streaming for faster testing and processing of data.
- Developed Kafka producers and consumers, HBase clients, and Spark and Hadoop MapReduce jobs, along with components on HDFS and Hive.
- Used the Spark API over Hortonworks Hadoop YARN to perform analytics on data in Hive.
- Coded SQL, PL/SQL, and views using IBM DB2 for the database.
- Solution planning with Java EE, Cassandra, and SOA platforms, and Cassandra data modeling.
- Extensive experience developing, maintaining, and implementing EDWs, data marts, ODSs, and data warehouses with star and snowflake schemas.
- Built a recommendation engine for unstructured and semi-structured data.
- Developed web components using JSP, Servlets, and JDBC.
- Implemented database using MySQL.
- Implemented J2EE standards, MVC2 architecture using Struts Framework.
- Worked on importing the unstructured data into the HDFS using Flume.
- Involved in big data analysis using Pig, Presto, and user-defined functions (UDFs).
- Worked with HiveQL on big data logs to perform trend analysis of user behavior across various online modules.
- Worked on analyzing the Hadoop cluster using different big data analytic tools, including Pig, Hive, and MapReduce.
- Developed and implemented the MVC architectural pattern using the Struts framework, including JSP, Servlets, and Action classes.
- Set up a JBoss server, configured a MySQL cluster on Linux, and installed OpenCall XDMS.
- Built client pages using HTML, JSP, and XML to interact with users, with the business logic implemented using Servlets and JavaBeans.
- Used shell scripting to develop scripts for integration and deployment.
- Experience handling WebSphere for server management and application deployment.
- Extensively used Factory Pattern, Builder Pattern, Singleton and Facade design patterns.
- Used IBM WebSphere 7.5 as the application server for this application.
- Worked in an Agile methodology for developing the application.
- Developed XSD schemas adhering to CSI standards, with all specified elements defined with proper data types, giving preference to Cingular Data Model (CDM) types.
- Created AID (Application Interface Document) for each interface and API.
- Used Eclipse IDE for code development along with SVN for managing the code.
- Successfully ran the service using the m2e plugin, developing the Maven project in Eclipse with all the necessary attributes.
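Of the design patterns listed for this role (Factory, Builder, Singleton, Facade), the Builder is easy to sketch in plain Java; `ReportRequest` and its fields are hypothetical names used only for illustration, not taken from the project.

```java
// Immutable value class assembled step-by-step with the Builder pattern.
class ReportRequest {
    final String title;
    final String format;
    final boolean compressed;

    private ReportRequest(Builder b) {
        title = b.title;
        format = b.format;
        compressed = b.compressed;
    }

    static class Builder {
        private String title = "untitled";
        private String format = "pdf";      // default when not specified
        private boolean compressed = false;

        Builder title(String t) { title = t; return this; }
        Builder format(String f) { format = f; return this; }
        Builder compressed(boolean c) { compressed = c; return this; }
        ReportRequest build() { return new ReportRequest(this); }
    }
}

public class BuilderDemo {
    public static void main(String[] args) {
        ReportRequest r = new ReportRequest.Builder().title("Q1 risk").compressed(true).build();
        System.out.println(r.title + " as " + r.format);
    }
}
```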
Confidential, Chicago, IL
Java/Big Data Developer
- Developed an intranet web application using the J2EE architecture, with JSP for designing the user interfaces, JSP tag libraries for defining custom tags, and JDBC for database connectivity.
- Used JPA (Java Persistence API) with Hibernate as the persistence provider for object-relational mapping.
- Coded in the AngularJS MVC framework to build the single-page and allocation-price-configuration pages.
- Reviewed business use cases, analyzed business and technical requirements, and developed high-level and low-level design documents.
- Implemented Log4j for the project; used Ant and Maven to compile and package the application and to automate build and deployment scripts.
- Involved in importing data from Excel sheets to the database through an OAF page.
- Apart from construction, was involved in system testing (ST), bug fixing, UAT, and QC rounds of this phase of the project.
- Created Ant tasks to support application deployment in development, test and production environments.
- Worked on data conversion by extracting data from DB2, transforming it, and loading it into Cassandra nodes.
- Expertise in running Hadoop streaming jobs to process terabytes of data.
- Worked on PL/SQL and SQL queries.
- Knowledge of Cassandra security.
- Involved in both high-level design and detailed design.
- Extensive experience developing applications using agile methodologies like Test Driven Development (TDD), SCRUM and KANBAN.
- Used the Sqoop tool to extract data from a relational database into Hadoop.
- Developed screens for the Correspondence, Insured Personal Information, Reports, Help Center using data tables and search functionality.
- Rational Application Developer (RAD 7.5) and WebSphere 8 were used for development and deployment.
- Proficient in RDBMS concepts; worked with MySQL 5.0, Confidential 9i/10g, and SQL Server.
- Good team player with ability to solve problems, organize and prioritize multiple tasks.
- Experienced in writing DTDs for document-exchange XML, and in generating, parsing, and displaying XML in various formats using XSLT and CSS.
Environment: Java, J2EE, Java SE 6, UML, JSP 2.1, JSTL 1.2, Servlets 2.5, Spring MVC, Hibernate, JSON, RESTful web services, Big Data, jQuery, AJAX, AngularJS, JAXB, RAD, WebSphere Integration Developer, WebSphere 7.0, Unix, JUnit, DB2, Confidential.
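Transforming XML with XSLT, as described in this role, can be sketched with the JDK's built-in `javax.xml.transform` API; the document and stylesheet here are toy examples, not project artifacts.

```java
import java.io.StringReader;
import java.io.StringWriter;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;

public class XsltDemo {
    // Applies an inline XSLT stylesheet to an XML string and
    // returns the transformed text.
    static String transform(String xml, String xslt) {
        try {
            Transformer t = TransformerFactory.newInstance()
                    .newTransformer(new StreamSource(new StringReader(xslt)));
            StringWriter out = new StringWriter();
            t.transform(new StreamSource(new StringReader(xml)), new StreamResult(out));
            return out.toString();
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        String xml = "<doc><title>Status</title></doc>";
        String xslt =
            "<xsl:stylesheet version='1.0' "
          + "xmlns:xsl='http://www.w3.org/1999/XSL/Transform'>"
          + "<xsl:output method='text'/>"
          + "<xsl:template match='/doc'>Title: <xsl:value-of select='title'/></xsl:template>"
          + "</xsl:stylesheet>";
        System.out.println(transform(xml, xslt));  // prints "Title: Status"
    }
}
```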
- Built an end-to-end vertical slice for a JEE-based billing application using popular frameworks including Spring, Hibernate, JSF, Swing, JavaBeans, Facelets, XHTML, Flex, AngularJS, JSON, Ivy, and Ajax, applying OO design concepts, JEE design patterns, and best practices.
- Developed the application using the Spring framework, leveraging a model-view-layer architecture, and configured dependency injection.
- Knowledge of setting up and configuring Apache Solr: core creation, data indexing, and searching.
- Involved in configuring a multi-node, fully distributed Hadoop cluster.
- Used Hibernate ORM tools, which automate the mapping between SQL databases and Java objects; integrated the Spring and Hibernate frameworks.
- Experienced in managing and reviewing Hadoop log files.
- Extensively used Hibernate in data access layer to access and update information in the database.
- Knowledge of Cassandra maintenance and tuning, both database and server.
- Worked on TDD (Test Driven Development).
- Used JMS for the asynchronous exchange of critical business data and events among J2EE components and legacy system.
- Extensive experience implementing J2EE design patterns such as Visitor, Singleton, MVC, and Data Access Object.
- Supported and provided important feedback to various development teams regarding grid computing and caching technology.
- Designed and developed a RESTful-style web service layer and WSDL.
- Developed unit test cases using JUnit, Mockito, and Selenium.
- Implemented a generic interface to Hibernate criteria API for UI search functionality.
- Developed SQL, PL/SQL, stored procedures along with Shell Scripting- database application scripts.
- Deployed web and enterprise Java components, messaging components, and multi-threaded code.
- Developed XML schemas (XSD, DTD) for validating XML documents; also used Subversion.
- Used XML web services with SOAP to transfer amounts to the transfer application.
- Used Swing for GUI-based coding.
Environment: Java 1.4, J2EE, JSP 2.0, PL/SQL, Spring 2.0, EJB 2.0, JMS, JNDI, Confidential, XML, DOM, SOAP, JUnit, Apache Camel, WebSphere 8.5, Hibernate 3.0, Big Data, JDBC, MS SQL Server 2012, JESS, RESTful web services, WebLogic 8, JBoss Drools, SOA design patterns, Cassandra, NoSQL.
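The asynchronous exchange via JMS mentioned in this role requires a JMS provider to run; the underlying producer/consumer idea can be sketched provider-free with a `BlockingQueue` as a stand-in, purely for illustration. A real system would use a JMS `Queue` with a `MessageListener` instead.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// Toy stand-in for JMS-style point-to-point messaging: a producer
// thread enqueues an event and returns immediately; the consumer
// takes the message when it is ready.
public class AsyncExchangeDemo {
    static String runOnce() {
        try {
            BlockingQueue<String> queue = new ArrayBlockingQueue<>(10);
            Thread producer = new Thread(() -> queue.offer("order-created"));
            producer.start();
            String msg = queue.take();  // consumer blocks until a message arrives
            producer.join();
            return msg;
        } catch (InterruptedException e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println(runOnce());
    }
}
```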