- Experienced, fast-learning, self-starting, and highly motivated IT professional with excellent communication skills.
- Outstanding design, programming, integration, debugging and testing skills.
- Experienced in Big Data, Java, Scala, Python, Spring Framework, Hibernate, relational databases, and NoSQL databases.
- Easily adaptable to change, with an eagerness towards learning and expanding capabilities.
- Ability to work autonomously or as a team player.
- Viewed as a strong developer/lead who writes efficient, maintainable code.
TECHNICAL SKILLS / PROGRAMMING LANGUAGES / DATABASES:
- 13+ years of experience in software development, delivering solutions in Cloud, Big Data, and Java web applications.
- Apache Hadoop, Apache NiFi, Apache Storm, Apache Accumulo
- Apache Spark, AWS, HDFS, DBFS, S3, ElastiCache, Elasticsearch, Kafka, Mesosphere DC/OS, Mesos.
- Java EE, Spring Framework, Spring Boot, Spring MVC, Hibernate, JPA, Spring Data, Web Services, WebLogic
- Angular, JSP, HTML, jQuery, XML
- WebLogic, Tomcat, Jetty
- Microservices, Docker
- Geoserver, Zeppelin, Jupyter
- Scala, Python, R, AI, Machine Learning, Scikit-learn
- SVN, GitHub, Maven, Nexus, TFS, Perforce
- Oracle 11g, SQL Server 2008 - schema design, stored procedures, views, functions, triggers.
Development Platform: AWS, Java, Scala
Development process: Agile Software Development
Frameworks: Apache Hadoop, Apache Storm, Apache Kafka
Database: Accumulo, PostgreSQL
Messaging Systems: Kafka
Miscellaneous: Jira, Mockito, Junit, Elasticsearch, Kibana, Jenkins, Maven, Puppet
Big Data Senior Software Engineer
Confidential, Germantown, MD
- Designed, implemented and tested several NiFi flows for ingesting data (CSV, JSON, Log files) from AWS S3 buckets into Hadoop cluster on AWS.
- Designed and implemented Custom NiFi processor to sample data within the NiFi flow to limit the amount of data ingested for unique scenarios during testing in Dev environment.
- Designed and implemented Custom NiFi processor to merge additional data into a regular FlowFile.
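The core of the sampling processor is the keep/drop decision. A minimal sketch of that decision in plain Java (the real processor wraps this in the NiFi API's `onTrigger`/relationship plumbing; the class and method names here are hypothetical):

```java
import java.util.concurrent.atomic.AtomicLong;

/**
 * Rate-based sampling decision: keep one FlowFile out of every N seen,
 * so only a bounded subset of the data is ingested in Dev testing.
 */
class RateSampler {
    private final long rate;                     // keep 1 of every `rate` items
    private final AtomicLong counter = new AtomicLong();

    RateSampler(long rate) {
        if (rate < 1) throw new IllegalArgumentException("rate must be >= 1");
        this.rate = rate;
    }

    /** Returns true when the current item should be routed downstream. */
    boolean shouldKeep() {
        return counter.incrementAndGet() % rate == 0;
    }
}
```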
- Enhanced Splunk parser (Json Parser) to categorize data based on type field.
- Designed and implemented JSON parser enhancements to extract more fields from an embedded CSV string.
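The embedded-CSV extraction amounts to splitting the CSV value against a known header and merging the columns into the record's field map. A self-contained sketch (field names are hypothetical):

```java
import java.util.LinkedHashMap;
import java.util.Map;

/**
 * Split a CSV string embedded in a JSON field against a known header
 * list and return the resulting column-name -> value map.
 */
class EmbeddedCsvExtractor {
    static Map<String, String> extract(String csvValue, String[] header) {
        Map<String, String> fields = new LinkedHashMap<>();
        String[] cols = csvValue.split(",", -1);   // -1 keeps trailing empties
        for (int i = 0; i < header.length && i < cols.length; i++) {
            fields.put(header[i], cols[i].trim());
        }
        return fields;
    }
}
```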
- Designed, implemented and tested several custom analytics using Apache Spark.
- Monitored, troubleshot, and fixed data ingestion issues in the production Hadoop cluster hosted on AWS.
- Tested data ingestion using NiFi flows in the Dev/Staging environments across several software releases to verify parser functionality: new parsers, enhancements, and bug fixes.
- Helped the team set up Jenkins to run Maven code-analysis plugins for the Confidential codebase.
Confidential, Alexandria, VA
- Designed and implemented several ETL pipelines for ingesting data from Kafka, CSV, JSON, and Postgres into Parquet format using Databricks Spark.
- Designed and implemented Clustering Analytics as a microservice.
- Implemented DBSCAN and K-means analytic containers using scikit-learn.
- Designed and implemented the Analytics Registration module as a microservice.
- Implemented persistence for analytic registration using Hibernate/PostgreSQL.
- Implemented a RESTful web service for analytic registration using Spring Boot.
- Designed and implemented a Custom Scheduler as a microservice.
- The scheduler launches analytic containers on the cluster when analytics are registered.
- The scheduler uses the Mesos resource manager to allocate resources on the cluster.
- Implemented the scheduler using RxJava and HTTP Scheduler API of Mesos.
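At its core, the scheduler matches Mesos resource offers against a pending analytic's resource needs. A minimal plain-Java sketch of that decision (the real implementation drives this from the Mesos HTTP Scheduler API with RxJava event streams; the offer layout and names here are hypothetical):

```java
/**
 * Offer-matching core: accept the first offer with enough CPU and
 * memory for the analytic container, decline the rest.
 * Each offer is {cpus, memMb}.
 */
class OfferMatcher {
    /** Returns the index of the first offer that fits, or -1 if none do. */
    static int firstFit(double[][] offers, double needCpus, double needMemMb) {
        for (int i = 0; i < offers.length; i++) {
            if (offers[i][0] >= needCpus && offers[i][1] >= needMemMb) {
                return i;
            }
        }
        return -1;
    }
}
```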
- Designed and implemented Metrics API as a microservice for autoscaling Docker containers deployed on DC/OS cluster.
- Metricbeat collects CPU/memory stats from all analytics containers and stores them in Elasticsearch.
- The Metrics API analyzes the metrics for the last minute and scales the analytics accordingly.
- Implemented the Elasticsearch queries to aggregate metrics data.
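The scaling decision reduces to averaging the recent CPU readings and comparing against scale-up/scale-down thresholds. A self-contained sketch of that rule (the threshold values and names are hypothetical; the real service pulls the readings via Elasticsearch aggregations):

```java
/**
 * Autoscaling rule: average the CPU readings collected over the last
 * window and scale up, scale down, or hold based on thresholds.
 */
class AutoScaleDecider {
    enum Action { SCALE_UP, SCALE_DOWN, HOLD }

    static Action decide(double[] cpuPercents, double upThreshold, double downThreshold) {
        double sum = 0;
        for (double c : cpuPercents) sum += c;
        double avg = cpuPercents.length == 0 ? 0 : sum / cpuPercents.length;
        if (avg > upThreshold) return Action.SCALE_UP;
        if (avg < downThreshold) return Action.SCALE_DOWN;
        return Action.HOLD;
    }
}
```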
- Designed and implemented the DIM (Data Integration Module) as a microservice.
- The DIM extracts data from Elasticsearch in a format similar to a DataFrame, based on a user-selected data source and CQL query.
- Implemented CQL filtering in the DIM.
- Implemented Sampling of data from Elasticsearch to reduce the amount of data retrieved.
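A self-contained sketch of the DIM's post-query step: a simple field-equality filter (standing in for the CQL predicate) followed by keeping every Nth row to bound the data handed to an analytic. The row representation and names are hypothetical:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

/**
 * Filter rows on a field value, then sample every Nth match to reduce
 * the amount of data retrieved from Elasticsearch.
 */
class DimFilter {
    static List<Map<String, String>> filterAndSample(
            List<Map<String, String>> rows, String field, String value, int nth) {
        List<Map<String, String>> out = new ArrayList<>();
        int seen = 0;
        for (Map<String, String> row : rows) {
            if (!value.equals(row.get(field))) continue;   // CQL-style predicate
            if (++seen % nth == 0) out.add(row);           // keep every Nth match
        }
        return out;
    }
}
```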
- Designed and Implemented Drive Manager as a library to be used in other microservices.
- Output from running analytics is stored to Amazon S3 via Drive Manager.
- The AWS Java SDK is used to interact with S3.
- Designed and implemented JUnit and integration tests for all the modules: DIM, Metrics API, Custom Scheduler, and Analytic Registration.
Senior Java Developer
Confidential, Alexandria, VA
- Worked extensively on integrating PALM (Patent Application Location and Monitoring), one of Confidential's backend services, into our application.
- Mapped data elements from the web service (patent application information) into our data model.
- Implemented the service layer to retrieve business objects by consuming the web service.
- Implemented business workflows using a hierarchical finite state machine (HFSM).
- This module helps track the Patent Examiner activities and the supervisor activities related to Correspondence between Patent Office and Patent applicants.
- Designed and implemented a unit-testing framework for the team using an embedded database (HSQLDB), Spring Framework, Hibernate, JUnit, Maven, and Jenkins. This added extensive unit-testing, debugging, and refactoring capabilities to the codebase. The unit tests run against the freshly exported Hibernate schema in HSQLDB (an in-memory database), so they run extremely fast.
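The in-memory setup boils down to pointing Hibernate at an HSQLDB memory URL and re-exporting the mapped schema on startup. A hypothetical test-profile configuration fragment (standard Hibernate property names; values are illustrative):

```properties
# Test profile: in-memory HSQLDB, schema exported fresh for each run
hibernate.connection.driver_class=org.hsqldb.jdbc.JDBCDriver
hibernate.connection.url=jdbc:hsqldb:mem:testdb
hibernate.connection.username=sa
hibernate.connection.password=
hibernate.dialect=org.hibernate.dialect.HSQLDialect
hibernate.hbm2ddl.auto=create-drop
```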
- Part of the Solr team (3 developers) implementing search capabilities for the Official Correspondence project.
- Designed and implemented the Solr schema.
- Implemented incremental and full indexing of Official Correspondence documents using Tika.
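The incremental pass differs from the full pass only in which documents it touches: a document is re-indexed when it changed after the last indexing run. A minimal sketch of that selection rule (names are hypothetical; the real pipeline extracts text with Tika before posting to Solr):

```java
/**
 * Incremental-indexing selection: re-index a document only if it was
 * modified after the last successful indexing run; a full index pass
 * simply ignores the watermark.
 */
class IndexScheduleRule {
    static boolean needsIndexing(long docLastModifiedMillis,
                                 long lastIndexRunMillis,
                                 boolean fullReindex) {
        return fullReindex || docLastModifiedMillis > lastIndexRunMillis;
    }
}
```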
- Conducted code reviews for the team based on Confidential coding standards.
- Setup Sonar with the Confidential rule set to detect code violations.
- Conducted training sessions for the developers on coding standards.
Senior Java Developer/Lead Developer
Confidential, Alexandria, VA
- Actively worked with business analysts during each sprint-planning phase (3-week sprints) on requirements gathering, and developed and implemented several user stories for every release.
- Worked with the team to develop a Data Model and implement the schema in Oracle Database.
- Setup Hibernate entity classes and implemented the entity mapping to Oracle tables. Used Spring Data JPA to implement this.
- Wrote the Service layer functionality for the modules I worked on.
- Designed and implemented message exchange between the US and EU using web services. Planned, coordinated, and worked with the EU development team to design and implement the solution based on an ICD (Interface Control Document). Used JAX-WS, WSDL, and SoapUI to implement and test this.
- Designed and implemented the Task Infrastructure Module to support tasking of users on the portal. Task templates (around 50) are kept in the database and are used to create and manage tasks for the different users of the portal. Supported the team with the service layer interface for this module.
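Turning a stored template into a concrete task is essentially placeholder substitution. A self-contained sketch (the placeholder syntax and names here are hypothetical):

```java
/**
 * Render a database-stored task template by substituting runtime
 * values for ${...} placeholders.
 */
class TaskTemplate {
    static String render(String template, java.util.Map<String, String> values) {
        String out = template;
        for (java.util.Map.Entry<String, String> e : values.entrySet()) {
            out = out.replace("${" + e.getKey() + "}", e.getValue());
        }
        return out;
    }
}
```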
- Lead developer for performance related issues on the PROD portal. Several Hibernate related problems (too many queries being fired) were investigated and resolved successfully.
- Designed and implemented the email messaging infrastructure module on the portal. Around 75 email templates are kept in the database; emails are created from them and inserted into a queue, and a Quartz scheduler dispatches the messages. Supported the team with the service layer interface for this module.
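The module's shape is enqueue-then-drain: rendered emails go into a queue, and a scheduled job drains it each cycle. A self-contained sketch of that cycle (the project scheduled the job with Quartz; names here are hypothetical):

```java
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.List;
import java.util.Queue;

/**
 * Enqueue/drain cycle of the email infrastructure: messages are queued
 * as they are created, and the scheduled job dispatches all pending
 * messages on each run.
 */
class EmailDispatcher {
    private final Queue<String> queue = new ArrayDeque<>();

    void enqueue(String message) {
        queue.add(message);
    }

    /** Runs on the schedule; returns the messages sent this cycle. */
    List<String> dispatchPending() {
        List<String> sent = new ArrayList<>();
        String msg;
        while ((msg = queue.poll()) != null) {
            sent.add(msg);   // real code hands msg to the mail gateway here
        }
        return sent;
    }
}
```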
- Ran custom queries against Oracle DB for the client for business needs.
- Lead Developer/ POC for production issues. Actively investigated/debugged/resolved production issues on the portal. Also coordinated work related to PROD issues with other developers in the team.
- Lead developer for the migration from SQL Server 2005 to Oracle 11g, a six-month effort to test and migrate schema objects with data.
- Completed 11 releases in 4.5 years following Agile methodologies.
Senior Developer/Lead Developer
Confidential, Chantilly, VA
- Contributed to patent 20090205036 - Secure information storage and delivery system and method.
- Designed the Confidential Financial Alerts module for delivering financial alerts to customers (alerts for deposits, debits, credits, etc.) based on user-configurable thresholds for bank and credit card accounts. This was implemented using an SOA architecture.
- Since the project was to be deployed on Oracle 10g, PL/SQL was used to program and deploy stored procedures, packages and triggers.
- Coordinated and worked closely with a third party that performed the financial data aggregation and exposed an XML API for interacting with their system.
- Designed the Data Model for storing Account data (transactions, account names, account type, alert types, etc) and user threshold configurations for individual accounts.
- Developed an Alerts engine using Oracle packages and stored procedure to analyze the transaction data and the user alert configuration to generate the alerts and deliver the same to the user Vault. The Alerts engine runs every hour to process new transactions for every user and generate alerts if the threshold is met.
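The engine's central rule is the threshold comparison applied to each new transaction. The project implemented this in Oracle packages and stored procedures; a plain-Java sketch of the rule for illustration (names are hypothetical):

```java
/**
 * Threshold rule of the alerts engine: each new transaction whose
 * magnitude meets or exceeds the user-configured threshold for the
 * account generates an alert for delivery to the user's Vault.
 */
class AlertEngine {
    static int countAlerts(double[] transactionAmounts, double userThreshold) {
        int alerts = 0;
        for (double amount : transactionAmounts) {
            if (Math.abs(amount) >= userThreshold) alerts++;
        }
        return alerts;
    }
}
```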
- Worked on the Security Committee, which sets the guidelines for secure coding at Confidential. Made several recommendations for deploying countermeasures: network policies for a 3-tier production environment, and defenses against SQL injection, session hijacking, etc.