Java Developer Resume
SUMMARY:
- 14 years of extensive experience in design, development and testing of applications.
- Currently working as a Java Developer with a Big Data stack at Confidential.
- Over 14 years of hands-on experience with Object-Oriented Analysis and Design, development, testing, and maintenance of distributed and client/server applications.
- Good experience in microservice development as a Java Developer.
- Proficiency in Data Structures, Algorithms, Object-Oriented Design, and Service-Oriented Architecture.
- Experience in developing REST microservices with Spring Boot (a minimal sketch follows this list).
- Experience in developing microservices using Spring Cloud framework components such as Config Server and Service Discovery.
- Proficiency in Java, J2EE, and C++, with working knowledge of Python.
- Familiar with designing RESTful web services using Spring MVC.
- Expertise in big data technologies such as Elasticsearch, Kibana, Apache Kafka, Apache Spark, MongoDB, and ZooKeeper.
- Experience in data ingestion, distribution, and processing using big data technologies.
- Experience handling relational data with RDBMSs such as Oracle, MySQL, MS SQL Server, and MariaDB.
- Expertise in NoSQL databases such as MongoDB and custom NoSQL stores.
- Experience in developing reports with Apache PDFBox.
- Experience in writing test cases using JUnit.
- Proficiency in API testing with the Cucumber framework.
- Experience in development of multithreaded applications.
- Experience in network programming and inter process communication.
- Knowledge of IPv4/IPv6, network security, SSL, and cryptography.
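The following is a minimal, illustrative sketch of a Spring Boot REST microservice of the kind listed above; the OrderController class, the /orders endpoint, and the in-memory store are hypothetical names chosen for the example, not code from any project described below.

```java
// Minimal Spring Boot REST microservice sketch; names are illustrative only.
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.web.bind.annotation.*;

import java.util.List;
import java.util.concurrent.CopyOnWriteArrayList;

@SpringBootApplication
@RestController
@RequestMapping("/orders")
public class OrderController {

    // In-memory store standing in for a real repository/database layer.
    private final List<String> orders = new CopyOnWriteArrayList<>();

    @GetMapping
    public List<String> listOrders() {
        return orders;
    }

    @PostMapping
    public String createOrder(@RequestBody String order) {
        orders.add(order);
        return "created: " + order;
    }

    public static void main(String[] args) {
        SpringApplication.run(OrderController.class, args);
    }
}
```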
PROFESSIONAL EXPERIENCE:
Confidential
Java Developer
Responsibilities:
- Developed microservices using Spring Boot and Elasticsearch.
- Designed Live Dashboards using Kibana.
- Developed a microservice using Spring Boot and Apache Kafka to enrich the data (see the sketch after this list).
- Developed a Java application to automate API testing using the Cucumber framework.
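A minimal sketch of the Spring Boot and Kafka enrichment pattern referenced above, assuming Spring for Apache Kafka; the topic names (raw-events, enriched-events) and the enrichment step are illustrative placeholders, not the actual project code.

```java
// Spring Kafka listener that consumes raw events, enriches them, and republishes.
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;

@Service
public class EnrichmentService {

    private final KafkaTemplate<String, String> kafkaTemplate;

    public EnrichmentService(KafkaTemplate<String, String> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    // Consume from the raw topic, enrich the payload, and publish to the enriched topic.
    @KafkaListener(topics = "raw-events", groupId = "enricher")
    public void enrich(String rawEvent) {
        String enriched = rawEvent + ",processedAt=" + System.currentTimeMillis();
        kafkaTemplate.send("enriched-events", enriched);
    }
}
```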
Confidential
Software Engineering Lead
Responsibilities:
- Optum Care supports providers in improving patient care and quality. It collects patient data from providers, normalizes it, performs analytics to derive insights, and delivers those insights back to providers for better health care.
- Developed microservices using Spring Boot and HBase to track job status.
- Involved in creating Hive Tables, loading with data and writing Hive queries.
- Developed Spark applications for data normalization (a minimal sketch follows below).
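A minimal sketch of a Spark (Java API) normalization job like the one referenced above; the input/output paths and column names (patient_id, first_name) are hypothetical examples, not the project's actual schema.

```java
// Spark SQL (Java API) sketch: normalize name fields and de-duplicate patient records.
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import static org.apache.spark.sql.functions.col;
import static org.apache.spark.sql.functions.lower;
import static org.apache.spark.sql.functions.trim;

public class NormalizationJob {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("patient-data-normalization")
                .getOrCreate();

        Dataset<Row> raw = spark.read().parquet("/data/raw/patients");

        // Trim and lower-case name fields, drop rows without a patient id,
        // and remove duplicate patient records.
        Dataset<Row> normalized = raw
                .withColumn("first_name", lower(trim(col("first_name"))))
                .filter(col("patient_id").isNotNull())
                .dropDuplicates(new String[]{"patient_id"});

        normalized.write().mode("overwrite").parquet("/data/normalized/patients");
        spark.stop();
    }
}
```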
Confidential
Consulting Software Engineer
Responsibilities:
- Designed and implemented real-time data collection infrastructure using Kafka, Flume, and Apache Spark on the Amazon cloud for a data analytics and adaptive learning platform.
- Worked with Elasticsearch to build a big data search platform for indexing and searching millions of log events (an indexing sketch follows this list).
- Developed a framework to collect CloudWatch/CloudTrail logs from the Amazon cloud.
- Developed a framework to collect Office 365 data (Active Directory/Exchange audit data) from the Azure cloud using the Office 365 Management API.
- Developed a custom Splunk app to collect triggered-alert information from Splunk.
- Developed a framework to collect triggered alerts from the Splunk Enterprise application.
- Developed a reporting engine framework to provide aggregated reports through the Elasticsearch engine, generating reports in PDF, CSV, and HTML formats.
- Worked on the EIQ SOC portal, a cloud-based platform that abstracts complexity from the end user, in turn helping them focus only on actionable information and understand the organization's security posture.
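A minimal sketch of indexing a log event into Elasticsearch, assuming the 7.x-era REST High Level Client; the host, the log-events index name, and the field values are illustrative, not the actual platform code.

```java
// Index a single log-event document into Elasticsearch.
import org.apache.http.HttpHost;
import org.elasticsearch.action.index.IndexRequest;
import org.elasticsearch.action.index.IndexResponse;
import org.elasticsearch.client.RequestOptions;
import org.elasticsearch.client.RestClient;
import org.elasticsearch.client.RestHighLevelClient;

import java.io.IOException;
import java.util.HashMap;
import java.util.Map;

public class LogIndexer {
    public static void main(String[] args) throws IOException {
        try (RestHighLevelClient client = new RestHighLevelClient(
                RestClient.builder(new HttpHost("localhost", 9200, "http")))) {

            // Example log event; fields here are placeholders.
            Map<String, Object> logEvent = new HashMap<>();
            logEvent.put("timestamp", System.currentTimeMillis());
            logEvent.put("source", "cloudtrail");
            logEvent.put("message", "ConsoleLogin succeeded");

            IndexRequest request = new IndexRequest("log-events").source(logEvent);
            IndexResponse response = client.index(request, RequestOptions.DEFAULT);
            System.out.println("Indexed with id: " + response.getId());
        }
    }
}
```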
Confidential
Senior Software Engineer
Responsibilities:
- Developed a Monitoring Engine to monitor events and changes occurring within a network over a period of time, categorized by threat level.
- Developed an Alerting Engine, a rule-based system that correlates data collected across a range of devices over a period of time and notifies users via email and SNMP alerts (a minimal rule sketch follows this list).
- Worked on data extraction from a range of devices including Windows, Linux, firewalls, routers, IDS, IPS, vulnerability scanners, and LDAP, extracting configurations, assets, and vulnerabilities.
- Developed a Unified Log Format, a common log format to represent different log data collected from a range of devices.
- Implemented a watchdog system for file and registry monitoring using WMI APIs and Linux shell scripts.
- Suggested and implemented ideas to scale the application to process hundreds of millions of log records using a multi-level distributed approach with high availability.
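A minimal, plain-Java sketch of the kind of threshold rule an alerting engine might apply, correlating events per device within a time window; the ThresholdRule class and the failed-login scenario are hypothetical examples, not the original design.

```java
// Raise an alert when a device reports N matching events within a time window.
import java.time.Duration;
import java.time.Instant;
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.HashMap;
import java.util.Map;

public class ThresholdRule {
    private final int threshold;
    private final Duration window;
    private final Map<String, Deque<Instant>> eventsByDevice = new HashMap<>();

    public ThresholdRule(int threshold, Duration window) {
        this.threshold = threshold;
        this.window = window;
    }

    // Record an event for a device; return true if the rule fires.
    public boolean onEvent(String deviceId, Instant at) {
        Deque<Instant> events =
                eventsByDevice.computeIfAbsent(deviceId, k -> new ArrayDeque<>());
        events.addLast(at);
        // Drop events that have fallen outside the correlation window.
        Instant cutoff = at.minus(window);
        while (!events.isEmpty() && events.peekFirst().isBefore(cutoff)) {
            events.removeFirst();
        }
        return events.size() >= threshold;
    }

    public static void main(String[] args) {
        // Example: three failed logins from one device within five minutes triggers an alert.
        ThresholdRule failedLogins = new ThresholdRule(3, Duration.ofMinutes(5));
        Instant now = Instant.now();
        for (int i = 0; i < 3; i++) {
            boolean fired = failedLogins.onEvent("firewall-01", now.plusSeconds(i * 30));
            System.out.println("event " + (i + 1) + " -> alert: " + fired);
        }
    }
}
```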