Microservice Developer Resume
Atlanta
PROFILE:
Java Microservices
SUMMARY:
- 8+ years of experience across microservices and Java, including extensive work on microservices technologies and the development of web applications in multi-tiered environments using Confluent Kafka 5.1.8, Apache Kafka, Spring Boot 2.0+, and the Spring Framework.
- Experience working on PCF (Pivotal Cloud Foundry), Azure, and AWS with Docker.
- Experience with Azure services such as Azure Blob Storage, Azure Cosmos DB, Azure SQL, Azure Kubernetes Service (AKS), and Azure DevOps.
- Experience with AWS services such as Amazon S3 and Amazon EC2.
- Strong experience with the Spring Framework, including Spring MVC, IoC, AOP, JPA, and Spring JDBC, along with prepared statements and the HikariCP data source.
- Experience developing data pipelines with the Kafka-Spark API, Spring Kafka, and Confluent Kafka (Kafka APIs, Kafka Connect, Kafka Streams).
- Experience with Spring IO and Spring Boot with Thymeleaf and ReactJS.
- Experience in core Java multithreading with Java 1.7 and 1.8.
- Experience in RDBMS technologies such as Oracle, MySQL, and PostgreSQL, using functions, triggers, and stored procedures.
- Good knowledge of Apache NiFi and Apache Flink.
- Implemented microservices architectures with RESTful APIs and OAuth2.
- Experienced with file formats including Avro, JSON, ORC, Parquet, fixed-width, CSV, text, sequence files, and XML.
- Experience in object-oriented programming with Java and Core Java.
- Experience in NoSQL database technologies such as HBase, MongoDB, and Cassandra.
- Worked within Agile and Scrum software development frameworks for managing product development.
- Experience with JUnit, Mockito, functional testing, regression testing, and smoke testing.
- Good experience using AWS S3.
- Experience building microservices with Spring Boot (a minimal sketch follows this list).
- Experience using Jenkins for continuous integration and continuous deployment.
- Implemented a continuous delivery pipeline with Docker, Jenkins, and GitHub.
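A minimal sketch of the kind of Spring Boot microservice described above. The service name, endpoint, and JSON payload are hypothetical illustrations, not artifacts of an actual project.

```java
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;

// Entry point for a single-purpose microservice exposing one REST resource
@SpringBootApplication
public class OrderServiceApplication {
    public static void main(String[] args) {
        SpringApplication.run(OrderServiceApplication.class, args);
    }
}

// Hypothetical resource; a real service would return a domain object serialized as JSON
@RestController
class OrderController {
    @GetMapping("/orders/{id}")
    public String getOrder(@PathVariable String id) {
        return "{\"id\": \"" + id + "\", \"status\": \"SHIPPED\"}";
    }
}
```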
RELEVANT EXPERIENCE:
Microservice Developer
Confidential, Atlanta
Responsibilities:
- Implemented Spring Boot microservices to process messages on a Confluent Kafka cluster.
- Implemented a Kafka Connect JDBC connector to an Oracle database using GoldenGate.
- Implemented reprocessing of failed messages in Kafka using offset IDs (see the sketch after this list).
- Implemented Kafka producer and consumer applications on a Kafka cluster set up with the help of ZooKeeper.
- Used the Spring Kafka API to process messages reliably on the Kafka cluster.
- Implemented Avro schemas for processing events.
- Used Avro serialization and deserialization.
- Deployed the services to AKS.
- Handled unstructured data using Azure Blob Storage.
- Developed a microservice to save data to MongoDB.
- Connected to a MongoDB cluster to fetch records.
- Created CI/CD pipelines using GitHub, Jenkins, and AKS (Azure Kubernetes Service).
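A minimal sketch of offset-based reprocessing using the plain Kafka consumer API, assuming the failed offset was recorded when the message originally failed. The broker address, topic, partition, and offset value are hypothetical.

```java
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.TopicPartition;

public class FailedMessageReprocessor {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // hypothetical broker
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "reprocess-group");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG,
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG,
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, "false");

        long failedOffset = 42L; // offset ID recorded when the message originally failed

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            TopicPartition partition = new TopicPartition("events", 0); // hypothetical topic
            consumer.assign(Collections.singletonList(partition));
            consumer.seek(partition, failedOffset); // rewind to the failed offset
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(5));
            for (ConsumerRecord<String, String> record : records) {
                System.out.printf("Reprocessing offset %d: %s%n", record.offset(), record.value());
            }
            consumer.commitSync(); // advance the committed offset once reprocessing succeeds
        }
    }
}
```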
Environment: Java 1.8, RESTful Web Services (Spring Boot), Azure Kubernetes Service, Jenkins, Confluent Kafka 5.1.2, Spring Config Server, Swagger, YAML, JSON, Maven, Bitbucket, JUnit, Splunk, New Relic, Spring Security, Azure Blob Storage, MongoDB, Cosmos DB.
Java Developer
Confidential, Atlanta
Responsibilities:
- Responsible for building scalable distributed data solutions using Hadoop.
- Experienced in loading and transforming large sets of structured, semi-structured, and unstructured data.
- Developed Spark jobs and Hive jobs to summarize and transform data.
- Responsible for developing a data pipeline by implementing Kafka producers and consumers and configuring brokers.
- Involved in converting Hive/SQL queries into Spark transformations using Spark DataFrames, Scala, and Python (a DataFrame sketch follows this list).
- Experienced in developing Spark scripts for data analysis in both Python and Scala.
- Wrote Scala scripts to make Spark Streaming work with Kafka as part of Spark-Kafka integration efforts.
- Handled importing data from different data sources into HDFS using Sqoop, performing transformations using Hive and MapReduce, and then loading the data back into HDFS.
- Exported result sets from Hive to MySQL using the Sqoop export tool for further processing.
- Used Maven for building and testing the project.
- Fixed defects as needed during the QA phase, supported QA testing, troubleshot defects, and identified the source of defects.
- Used Mingle and later JIRA for task/bug tracking.
- Used Git for version control.
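A minimal sketch of expressing a Hive summarization query as a Spark DataFrame transformation. It uses the newer SparkSession API for brevity (on the Spark 1.6 line listed below, HiveContext played this role), and the table and column names are hypothetical.

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class HiveToSparkJob {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("HiveToSparkJob")
                .enableHiveSupport() // read and write Hive tables directly
                .getOrCreate();

        // Equivalent of a Hive GROUP BY summarization query, as a DataFrame transformation
        Dataset<Row> events = spark.table("events"); // hypothetical Hive table
        Dataset<Row> summary = events.groupBy("event_type").count();

        summary.write().mode("overwrite").saveAsTable("event_summary"); // hypothetical output table
        spark.stop();
    }
}
```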
Environment: Cloudera 5.8, Hadoop 2.7.2, HDFS 2.7.2, Pig 0.16.0, Hive 2.0, Impala, Drill 1.9, Spark SQL 1.6.1, MapReduce 1.x, Flume 1.7.0, Sqoop 1.4.6, Oozie 4.1, Storm 1.0, Docker 1.12.1, Kafka 0.10, Spark 1.6.3, Scala 2.12, HBase 0.98.19, ZooKeeper 3.4.9, MySQL, Tableau, shell scripting, Java.
Microservice Developer
Confidential, Illinois
Responsibilities:
- Built Spring Boot microservices for the delivery of software products across the enterprise.
- Experience building RESTful applications hosted in the cloud.
- Developed RESTful resources using RestTemplate for user account management, and consumed REST services from various vendors.
- Designed and developed OAuth 2.0-based RESTful services for connecting to third-party APIs.
- Implemented REST microservices using Spring Boot and collected metrics via PCF, New Relic, and Splunk.
- Used a microservice architecture, with Spring Boot-based services interacting through a combination of REST and Apache Kafka endpoints.
- Responsible for providing POS software/hardware support for corporate stores.
- Connected to S3 buckets from a Spring-based application.
- Implemented CompletableFuture in Java 8 to make parallel calls across microservices (see the sketch after this list).
- Used Spring Config Server for centralized configuration and Splunk for centralized logging. Used Concourse and Jenkins for microservices deployment.
- Integrated Swagger UI (JSON/YAML) and wrote integration tests along with REST documentation.
- Implemented a fault-tolerance system for distributed services. Wrote controllers and services.
- Implemented authorization and authentication using Spring Security.
- Worked with the Spring Quartz framework and developed various Quartz jobs.
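A minimal sketch of the CompletableFuture pattern for issuing downstream microservice calls in parallel. The service URLs and the use of RestTemplate here are hypothetical illustrations.

```java
import java.util.concurrent.CompletableFuture;

import org.springframework.web.client.RestTemplate;

public class ParallelServiceCalls {
    private final RestTemplate restTemplate = new RestTemplate();

    public String aggregate() {
        // Fire both downstream microservice calls in parallel; URLs are hypothetical
        CompletableFuture<String> accounts = CompletableFuture.supplyAsync(() ->
                restTemplate.getForObject("http://account-service/accounts/1", String.class));
        CompletableFuture<String> orders = CompletableFuture.supplyAsync(() ->
                restTemplate.getForObject("http://order-service/orders?accountId=1", String.class));

        // Block only once, after both calls have been issued concurrently
        return accounts.thenCombine(orders, (a, o) -> a + " " + o).join();
    }
}
```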
Environment: Java 1.8, RESTful Web Services (Spring Boot), PCF, Jenkins, OAuth2, Spring Config Server, Swagger, YAML, JSON, Maven, Bitbucket, JUnit, Splunk, New Relic, Spring Security.
Java Developer
Confidential, Tampa, FL
Responsibilities:
- Worked on Confluent Kafka core (APIs and Connect), Kafka Streams, and KSQL.
- Designed a Spring Kafka producer client using Confluent Kafka and produced events to a Kafka topic (a producer sketch follows this list).
- Subscribed to the topic with a Spring Kafka consumer client and processed the events in real time using Spark.
- Developed an analytical engine using Apache Spark and Scala.
- Built a real-time pipeline for streaming data using Kafka and Spark Streaming.
- Implemented batch processing and stream processing using Apache Spark.
- Wrote test cases using the Embedded Kafka API.
- Developed RESTful APIs using Spring Framework 5.0.
- Generated code automatically using the Apache Avro plug-in.
- Used the Avro serializer and deserializer when developing the Kafka clients.
- Good knowledge of defining Avro schemas.
- Implemented Python data analysis using Pandas, Matplotlib, and NumPy.
- Worked on improving the performance and optimization of existing algorithms in Hadoop using Spark Context, Spark SQL, DataFrames, and pair RDDs.
- Implemented a microservice architecture.
- Involved in Agile methodologies, daily Scrum meetings, and sprint planning.
- Developed an event-driven architecture.
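A minimal sketch of a Spring Kafka producer configured with the Confluent Avro serializer. The broker address, schema registry URL, and topic name are hypothetical.

```java
import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.producer.ProducerConfig;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;

public class EventProducer {
    public static void main(String[] args) {
        // Producer configuration; the broker and schema registry addresses are hypothetical
        Map<String, Object> config = new HashMap<>();
        config.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        config.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG,
                "org.apache.kafka.common.serialization.StringSerializer");
        config.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
                "io.confluent.kafka.serializers.KafkaAvroSerializer");
        config.put("schema.registry.url", "http://localhost:8081");

        KafkaTemplate<String, Object> template =
                new KafkaTemplate<>(new DefaultKafkaProducerFactory<>(config));

        // With the Avro serializer the value is normally a generated Avro record;
        // a String (an Avro primitive) is used here only to keep the sketch self-contained
        template.send("events", "key-1", "payload"); // hypothetical topic
        template.flush();
    }
}
```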
Environment: Docker, Confluent Kafka 4.0.0, Apache Kafka, Spring Kafka, Spring Framework 5.0, Spring Boot 2.0, Spring WebFlux, Bitbucket, OpenShift, Insomnia, JUnit, Mockito, Spark, Cassandra, MongoDB, Scala, Python (Pandas, Matplotlib, NumPy).
Java Full Stack Developer
Confidential, Tampa, FL
Responsibilities:
- Responsible for requirements analysis, technical design, implementation and testing.
- Implemented the service layer using Spring IoC and annotations, and controllers using Spring MVC.
- Implemented DAOs and entities using JPA.
- Implemented all functionality using Spring IO/Spring Boot, Thymeleaf, JPA, and JDBC Template. Implemented Java EE components using Spring MVC, Spring IoC, Spring transactions, and Spring Security modules.
- Implemented all components following a test-driven development methodology using JUnit.
- Designed and developed SSO to verify users using OAuth 2.0.
- Configured Single Sign-On (SSO) between SAS applications spanning different domains.
- Used Bitbucket as the version control system.
- Created build and deployment scripts using Maven.
- Developed the sign-up and login flow for stores.
- Implemented product listing and management functionality, with which stores can add new users and update or delete SAS users.
- Worked on a live 8-node Hadoop cluster running CDH 4.
- Used Sqoop to import data from RDBMSs into the Hadoop Distributed File System (HDFS).
- Developed several MapReduce programs to analyze and transform the data to uncover insights into customer usage patterns (a mapper/reducer sketch follows this list).
- Used Pig as an ETL tool for transformations, event joins, and some pre-aggregations before storing the data in HDFS.
- Responsible for creating Hive external tables, loading data into the tables, and querying the data using HiveQL.
- Used the Hive data warehouse tool to analyze unified historic data in HDFS to identify issues and behavioral patterns.
- Enabled concurrent access for Hive tables with shared and exclusive locking, enabled in Hive with the help of the ZooKeeper implementation in the cluster.
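A minimal sketch of the kind of MapReduce program described above, counting events per user from delimited log lines. The input layout, field positions, and job name are hypothetical.

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class UsageCount {
    // Emits (userId, 1) for each log line; the comma-delimited layout is hypothetical
    public static class UsageMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            String[] fields = value.toString().split(",");
            if (fields.length > 0) {
                context.write(new Text(fields[0]), ONE);
            }
        }
    }

    // Sums the usage events per user
    public static class UsageReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable v : values) sum += v.get();
            context.write(key, new IntWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "usage-count");
        job.setJarByClass(UsageCount.class);
        job.setMapperClass(UsageMapper.class);
        job.setReducerClass(UsageReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```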
Environment: Apache Hadoop, Hortonworks, Sqoop, MapReduce, Pig, Hive, ZooKeeper, HBase, Python.