- Software professional with 6+ years of IT experience designing, developing, and delivering complex multi-tiered enterprise applications for web-based and client-server domains using Java 7/8, J2EE, Unix, SQL, and MySQL, with additional experience in interactive mobile application development.
- Experience creating embedded Linux distributions using OpenEmbedded and Yocto; Linux package management; and BSD and Linux system and kernel configuration, debugging, board driver customization, and systems software.
- Exposure to CI/CD tools: Jenkins for continuous integration and Ansible for continuous deployment; familiar with other continuous integration tools such as Bamboo and TeamCity.
- Experienced in fast-paced SAFe Agile development environments, including Test-Driven Development (TDD) and Scrum. Experience building DevOps pipelines using OpenShift and Kubernetes for microservices architectures.
- Expertise in writing JUnit test cases using mocking frameworks such as Mockito and JMock. Used a microservices architecture, with Spring Boot-based services interacting through a combination of REST and the Apache Kafka message broker.
- Proficient with container systems such as Docker and container orchestration platforms such as EC2 Container Service and Kubernetes; worked with Terraform.
- Experienced with version control systems such as Git, GitHub, and Bitbucket, as well as S3 for artifact storage. Deployed applications using Lambda, EC2, and Docker containers, utilizing CI/CD processes. Experience implementing APIs in multi-threaded Java environments.
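As a generic illustration of the multi-threaded Java API work mentioned above (a hypothetical sketch, not code from any of the engagements listed; the class and method names are invented), work can be fanned out over a fixed thread pool and the results collected in order:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.stream.Collectors;

// Hypothetical example: fetch several items in parallel and collect results.
public class ParallelFetcher {
    private final ExecutorService pool = Executors.newFixedThreadPool(4);

    // Simulated per-item work; a real API would call a downstream service here.
    private String fetchOne(int id) {
        return "item-" + id;
    }

    public List<String> fetchAll(List<Integer> ids) {
        // Submit all tasks first so they run concurrently...
        List<Future<String>> futures = ids.stream()
                .map(id -> pool.submit(() -> fetchOne(id)))
                .collect(Collectors.toList());
        // ...then collect results in submission order.
        List<String> out = new ArrayList<>();
        for (Future<String> f : futures) {
            try {
                out.add(f.get());
            } catch (InterruptedException | ExecutionException e) {
                throw new RuntimeException(e); // surface worker failures
            }
        }
        return out;
    }

    public void shutdown() {
        pool.shutdown();
    }

    public static void main(String[] args) {
        ParallelFetcher fetcher = new ParallelFetcher();
        System.out.println(fetcher.fetchAll(Arrays.asList(1, 2, 3))); // prints [item-1, item-2, item-3]
        fetcher.shutdown();
    }
}
```

Collecting the `Future`s in submission order keeps the output deterministic even though the tasks themselves run concurrently.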
- Used PL/SQL and SQL*Loader to create ETL packages for flat-file loading and error capturing into log tables. Oracle PL/SQL developer responsible for writing ETL scripts in an Agile Scrum/SAFe environment.
- Extensive experience developing microservices using Spring Boot and Netflix OSS (Zuul, Eureka, Ribbon, Hystrix), following domain-driven design. Performed unit testing and migrated the ETL code to the QA environment.
- Experience implementing REST and SOAP web services using technologies such as JSON, XML, JAXB, Jackson, and Jersey. Configured Spark Streaming to receive real-time data from Apache Kafka and store the streamed data to HDFS using Scala.
- Used Node Package Manager (NPM) to manage and install Node.js modules such as webpack, Grunt, Gulp, Browserify, Express, Underscore.js, Require.js, crypto.js, Mongoose, and mongo.js.
- Experience designing websites using J2EE technologies and handling design/implementation using Rational Application Developer (RAD), Eclipse, and NetBeans.
- Knowledge of partitioning Kafka messages and setting replication factors in a Kafka cluster; profound experience creating real-time data streaming solutions using Apache Spark/Spark Streaming, Kafka, and Flume.
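The key-based partitioning mentioned above can be sketched as follows. This is a simplified illustration, not the real Kafka partitioner: the Kafka Java client's default partitioner uses murmur2 hashing rather than `String.hashCode()`, but the principle is the same — the same key always maps to the same partition, so per-key ordering is preserved.

```java
import java.util.Arrays;

// Simplified sketch of key-based partitioning (illustrative only; Kafka's
// default partitioner uses murmur2 hashing, not String.hashCode()).
public class KeyPartitioner {
    public static int partitionFor(String key, int numPartitions) {
        // Mask off the sign bit so the result is always non-negative,
        // then take the modulus to land in [0, numPartitions).
        return (key.hashCode() & 0x7fffffff) % numPartitions;
    }

    public static void main(String[] args) {
        int partitions = 6; // e.g. a topic created with 6 partitions
        for (String key : Arrays.asList("order-1", "order-2", "order-1")) {
            // The two "order-1" messages always map to the same partition.
            System.out.println(key + " -> partition " + partitionFor(key, partitions));
        }
    }
}
```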
- Worked with SVN, Team Foundation Server, Bitbucket, Git, and GitHub for version control, and Quality Center and Jira for bug tracking. Hands-on experience fetching live stream data and ingesting it into HBase tables using Spark Streaming and Apache Kafka.
- Extensively worked with IDEs such as Eclipse and Visual Studio Code, tools such as Jira and Postman, testing frameworks such as JUnit, Jasmine, Karma, and Selenium, and Log4j for logging.
- Expertise in developing microservices using Spring Boot and Node.js to build physically separated, modular applications, improving the scalability, availability, and agility of the application.
- Good working knowledge of Ant/Maven for project build/test/deployment, Log4j for error logging and debugging, JUnit for unit and integration testing, and XML Spy for XML validation.
- Expertise in design, development, and testing of various web and enterprise applications using type-safe technologies such as Scala, Akka, Play Framework, and Slick; experienced with Scala and Java tools such as IntelliJ IDEA and Eclipse.
- Extensively worked on Spark using Scala on a cluster for computational analytics; installed it on top of Hadoop and performed advanced analytics by using Spark with Hive and SQL/Oracle.
- Proficient in test automation using UFT and Selenium tools, WebDriver, MySQL, NoSQL, JUnit, JaCoCo, and TestNG. Expertise in using J2EE application servers such as IBM WebSphere and JBoss, and web servers such as Apache Tomcat.
- Experience implementing web-based projects using web/application servers such as Apache Tomcat 6.0.18/7.0.42 and JBoss Application Server 4.2; used Apache Tomcat, which implements the Java Platform, Enterprise Edition (Java EE).
- Knowledge of deploying application updates to AWS EC2 using Docker, Rancher, Maven, and Jenkins; monitored the load balancing of different instances and used AWS IAM for access management.
- Expertise in using J2EE frameworks and technologies such as JDBC, Jakarta Struts, JMS, Spring JDBC, Spring Batch, Apache Kafka, ZooKeeper, Hibernate, and SOAP and REST web services.
- Knowledge of cloud providers and services, including Amazon AWS EC2/S3/AMI, Cassandra, MongoDB, microservices, and CloudSigma. Used JBoss Fuse for integration of web services and Apache Tomcat.
Cloud: Microsoft Azure, Amazon Web Services, PCF, OpenStack
DevOps Tools: Chef, Ansible, Puppet, Jenkins, Docker, ELK, Ant, AppDynamics, Maven
Tools & Services: GitHub, Bitbucket, REST Services, Docker, OpenStack, Kubernetes, DevExpress tools, Crystal Reports, UML, JetBrains tools, JSON, Swagger, Shell scripting, CI/CD, SOLR
Version Control Tools: Git, GitHub, SVN, VSS
Web Technologies: J2EE, Servlets, JDBC, XML, JSON
Debugging Tools: MS Visual Studio debugging tools, Windows Debugger (Windbg) and GNU Linux/Unix debugger (GDB)
Full Stack Java Developer
Confidential - Menlo Park, CA
- Prepared ETL standards and naming conventions, and wrote ETL flow documentation for the Stage, ODS, and Mart layers; used Change Data Capture (CDC) to simplify ETL in data warehouse applications.
- Using Confidential Cloud Functions to integrate applications and run backend processes according to schedules without provisioning/managing servers.
- Involved in the CI/CD process using Git, Nexus, Jenkins job creation, and Maven builds; created Docker images and deployed them in the Confidential cloud environment; created and managed Maven scripts to build and deploy the application. Developed Java APIs for ETL transformations: using Informatica, developed custom ETL Java programs.
- Performed application deployment and debugging in a Linux environment, used PuTTY/WinSCP to access the server logs, and worked on code debugging using the memscape debugging tool. Optimized microservices using Node.js and ES6 as interfaces to a Cassandra NoSQL database.
- Used Git for source control management, giving a large speed advantage over centralized systems that have to communicate with a server; wrote Kafka producers to stream data from external REST APIs to Kafka topics.
- Implemented REST microservices using Spring Boot; generated metrics with method-level granularity and persistence using Spring AOP and Spring Boot Actuator.
- Implemented web page layouts using the Struts Tiles libraries and performed validations using the Struts validation framework. Collaborated with the team using Git, GitHub, and the SourceTree version control platform.
- Developed integration solutions and prepared programming code. Solid experience in Java Database Connectivity: the JDBC API, Entity Beans, the DAO pattern, and configuring data sources on WebSphere and WebLogic application servers.
- Involved in batch processing using the Spring Batch framework to validate feed files and load data into the corresponding EBX5 tables; used Maven for dependency management and building the project.
- Managed the entire ETL process involving the access, manipulation, analysis, interpretation and presentation of information from both internal and secondary data sources to customers.
- Created Tiles definitions, struts-config files, validation files, and resource bundles for all modules using the Struts framework. Built a REST web service with a Node.js server on the back end to handle requests sent from front-end jQuery Ajax calls.
- Developed Spark applications using Scala and Java, and implemented an Apache Spark data processing project to handle data from various RDBMS and streaming sources.
- Experience in Service-Oriented Architecture (SOA) and web services design, development, and deployment using WSDL, SQL, REST, UDDI, JAXP, JAXB, SAX, DOM, XSLT, Apache Axis, and SOAP web services.
- Well experienced in design and development of database systems using RDBMS concepts, including Oracle, PostgreSQL, and MySQL; experienced writing SQL queries, PL/SQL, T-SQL, stored procedures, prepared statements, and triggers. Experience working with NoSQL databases such as MongoDB, Cassandra, and HBase.
- Involved in the design and development of SQL queries, MySQL, multithreading, RESTful web services, JaCoCo, functions, and stored procedures for performing database operations. Worked on the Hadoop ecosystem (Pig, Hive, Impala, Spark, Scala), Java, and J2EE.
- Strong experience using open-source technologies including React, Node.js, Spring, Angular, jQuery, Apache Storm, Elasticsearch, NoSQL, MongoDB, Bootstrap, JUnit, and Eclipse.
Sr. Java Developer
Confidential - Bellevue, WA
- Managed large datasets using the Pandas API ecosystem to analyze customer segments based on location. Designed and developed common frameworks within the organization using the Spring Batch, Apache Kafka, and BeanIO frameworks.
- Created common reusable objects for the ETL team; oversaw coding standards and reviewed high-level design specifications, ETL coding, and mapping standards. Developed a native Scala/Java library using JSch to remotely execute Auto Logs Perl scripts.
- Used Gulp as a task runner with the SASS/SCSS pre-processor, Git and GitHub for source code management and tracking, and Visual SourceSafe for version control. Built a continuous integration environment with Jenkins, TFS, and SVN, and a continuous delivery environment.
- Worked on the AWS SDK gem, including utilities that initialize local application environments mirroring Beanstalk environments, and implemented AWS solutions using DynamoDB, EBS, Elastic Load Balancer, and Auto Scaling groups.
- Built and maintained Docker container clusters managed by Kubernetes on GCP (Confidential Cloud Platform) using Linux, Bash, Git, and Docker. Utilized Kubernetes and Docker as the runtime environment of the CI/CD system to build, test, and deploy.
- Implemented Spark using Scala and Spark SQL for faster testing and processing of data, utilizing DataFrames and the Spark SQL API.
- Utilized Jenkins and Git for deployment on test servers and to move toward CI/CD. Designed and deployed the full SDLC of a Hadoop cluster based on the client's business needs. Created Lambda functions to automate the AWS environments.
- Used AWS CloudWatch for monitoring, customized metrics, and file logging, and successfully used AWS Lambda to manage servers and run code in AWS.
- Worked on creating the Docker containers and Docker consoles for managing the application lifecycle. Set up Docker on Linux and configured Jenkins to run under a Docker host.
- Supported an Agile CI/CD environment with DevOps, maintaining the Atlassian tools (Jira and Bitbucket) and providing layer-3 support for those tools when issues arise. Responsible for ingesting large volumes of data into the Hadoop data lake pipeline on a daily basis.
- Used Amazon Elastic Beanstalk with Amazon EC2 instances to deploy the project into AWS. Configured continuous integration (CI) with Jenkins on AWS EC2. Created continuous integration builds using Maven and Harvest version control.
- Developed algorithms to analyze gene expression data using Bayesian statistical techniques, ran the algorithms on a Linux debugging server, and implemented code porting from Solaris to Linux.
- Developed a Java-based ETL tool that extracts data from sources such as IBM Cognos (XML) and MySQL and loads it into the target tables in a MySQL database.
- Designed and developed asynchronous RESTful microservices using Spring Boot, Couchbase, and RxJava that are consumed by other microservices with an average response time of 30 milliseconds.
- Implemented and exposed microservices based on a RESTful API utilizing Spring Boot with Spring MVC and Apache Kafka. Developed REST APIs with MongoDB for the front-end team.
- Analyzed the Node.js server structure in a legacy project, then mimicked the REST service using the Java JAX-WS API and made the corresponding configurations.
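A minimal sketch of the kind of REST endpoint work described in these bullets, stripped of all frameworks for illustration. This uses the JDK's built-in `com.sun.net.httpserver` server rather than JAX-WS, Jersey, or Spring from the projects above; the path and payload are invented:

```java
import com.sun.net.httpserver.HttpServer;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.net.URL;
import java.nio.charset.StandardCharsets;

// Minimal REST-style JSON endpoint with no frameworks (illustrative only).
public class MiniRestServer {
    // Builds the JSON body; separated out so it is easy to test in isolation.
    static String statusJson() {
        return "{\"status\":\"ok\"}";
    }

    public static void main(String[] args) throws IOException {
        // Port 0 asks the OS for any free port.
        HttpServer server = HttpServer.create(new InetSocketAddress(0), 0);
        server.createContext("/api/status", exchange -> {
            byte[] body = statusJson().getBytes(StandardCharsets.UTF_8);
            exchange.getResponseHeaders().set("Content-Type", "application/json");
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(body);
            }
        });
        server.start();
        int port = server.getAddress().getPort();

        // Self-request to demonstrate the endpoint, then shut down cleanly.
        URL url = new URL("http://localhost:" + port + "/api/status");
        try (InputStream in = url.openStream()) {
            System.out.println(new String(in.readAllBytes(), StandardCharsets.UTF_8));
        }
        server.stop(0);
    }
}
```

A framework such as Spring MVC or Jersey replaces the hand-wired handler registration and serialization shown here, but the request/response shape is the same.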
- Created a client website running on Node.js, using the Express framework, Jade, AngularJS, and Backbone.js for front-end MVC/templating.
- Worked closely with the Kafka admin team to set up Kafka clusters in the QA and production environments, and implemented Spring Boot microservices to process the messages into the Kafka cluster.
- Developed microservices based on RESTful web services using Akka Actors and the Akka HTTP framework in Scala, handling high concurrency and high volumes of traffic.
- Built real-time data pipelines by developing Kafka producers and Spark Streaming applications for consumption. Created a Scala- and Java-based ETL streaming framework using Spark, Cassandra, Hadoop, and Hive.
- Developed end-to-end data processing pipelines that begin with receiving data via the distributed messaging system Kafka and end with persisting the data into HBase.
- Involved in configuring the Git repository and maintaining version control using Git, keeping the code base current through frequent updates.
- Used Struts tag libraries such as html, bean, and logic in the JSP pages; used the html form tag for auto-populating the related ActionForm as specified in the action mapping.
- Configured Jenkins as a continuous integration server with GitHub and Maven, and generated test reports using ReportNG and Extent Reports in the Maven project.
- Worked closely with web administrators to set up automated deployment for SharePoint applications using SVN and Git tools. Developed and tested extraction, transformation, and load (ETL) processes.
- Implemented Amazon EMR for processing big data across a Hadoop cluster of virtual servers on Amazon Elastic Compute Cloud (EC2) and Amazon Simple Storage Service (S3). Worked on Apache Flume to collect and aggregate large amounts of log data, storing it on HDFS for further analysis.
- Worked with the Eclipse Indigo IDE, deployed to the Apache Tomcat web server, and used the Maven build tool to add functionality to the build process.
- Developed Spring Boot-based microservices using annotations, implemented architecture patterns, and used Spring REST/JSON to expose microservice APIs.
- Experienced in object-oriented analysis, design, and application development using Java 6, J2EE, XML, JSON, RAML, JMS, BRMS, and NoSQL technologies; implemented AngularJS controllers with isolated scopes to perform operations.
- Used the Spring Boot framework to build cloud microservices and to develop Spring-based applications radically faster with minimal configuration.
- Worked on integration of AWS cloud configuration management and centralized logging using Spring Boot and a Java application. Used Kubernetes to orchestrate the deployment, scaling, and management of Docker containers.
- Redesigned the current ETL project architecture to make it more scalable using Apache Camel and Spring beans. Handled configuration management and delivery of code via VSS (Visual SourceSafe).
- Used the Eclipse IDE for application development and deployment, and Periscope debuggers and Nohau emulators for distributed system-interface debugging.
- Developed a REST-based Scala service to pull data from Elasticsearch/Lucene dashboards, Splunk, and Atlassian Jira; used Spark and Spark SQL to read the Parquet data and create the tables in Hive using the Scala API.
- Developed RESTful services using Jersey, JAX-RS, and RESTEasy to call third-party vendors; responsible for developing the sequential and conditional batch jobs using the Spring Batch framework.
- Developed the automation environment for build, deployment, and configuration of Portlets onto the IBM WebSphere portal server using ANT scripts, XML Access scripts and JACL scripts.
- Designed and developed various modules of the application with J2EE design architecture and Spring MVC architecture using IoC and AOP concepts.
- Created EC2 and RDS instances, deployed the application on AWS Elastic Beanstalk, and developed a Java API to connect with AWS S3 services for storing and retrieving data in S3 buckets.
- Involved in writing SQL and PL/SQL: stored procedures, functions, sequences, triggers, cursors, and object types.
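As a hedged illustration of the PL/SQL objects listed above (a hypothetical sketch; the `employees` and `emp_audit` tables, columns, and names are invented, not taken from any project described here), a stored procedure with an explicit cursor and a simple audit trigger might look like:

```sql
-- Hypothetical sketch: procedure with a cursor, plus an audit trigger.
-- Table and column names are invented for illustration.
CREATE OR REPLACE PROCEDURE raise_salaries(p_pct IN NUMBER) AS
  CURSOR c_emp IS SELECT emp_id, salary FROM employees FOR UPDATE;
BEGIN
  FOR r IN c_emp LOOP
    UPDATE employees
       SET salary = r.salary * (1 + p_pct / 100)
     WHERE emp_id = r.emp_id;
  END LOOP;
  COMMIT;
END raise_salaries;
/

CREATE OR REPLACE TRIGGER trg_emp_audit
AFTER UPDATE OF salary ON employees
FOR EACH ROW
BEGIN
  -- Record every salary change for auditing.
  INSERT INTO emp_audit (emp_id, old_salary, new_salary, changed_at)
  VALUES (:OLD.emp_id, :OLD.salary, :NEW.salary, SYSDATE);
END;
/
```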