Lead Developer / App Architect Resume
PROFILE:
- Over 16 years of experience architecting, designing, and developing microservices-based, 12-factor cloud-native applications across sectors including capital markets, retail banking, mutual funds, insurance, telecommunications, government, healthcare, and software.
- 4 years in capital markets (fixed income, equities, mortgage-backed securities) and 4-5 years in the banking sector.
- Hold M.S. and B.S. degrees in Computer Science.
- Confluent Certified Kafka Developer; hands-on cloud platform experience with AWS, Azure, GCP, and Pivotal Cloud Foundry.
- Proficient in implementing cloud-native, real-time data streams using the Confluent Kafka platform, AWS Kinesis, AWS Lambda, ElasticSearch, Redis, Dynatrace (monitoring/alerting), and Akana and Apigee (API gateways).
- Expert in microservices patterns including Event Sourcing, CQRS, distributed transactions, Circuit Breaker, Saga, and Backend-for-Frontend.
- Hands-on with Container Orchestration platforms - Kubernetes, OpenShift, Pivotal Container Service (PKS).
- Full-stack experience with Java on the backend and JavaScript frameworks for front-end UIs.
- Java frameworks - Spring5 / Spring Boot 2.x / JEE7
- JavaScript frameworks - React 16.x, Angular 8.x / Ember 2.x, Node.js 8.x/10.x/12.x, Express.js
- Web technologies - HTML5/CSS3/JavaScript (ES2015)
- Expertise with Database products
- NoSQL - ElasticSearch, Cassandra, Redis, MongoDB
- RDBMS - MySQL, PostgreSQL, Oracle, SQL Server, IBM DB2
- Award-winning public speaker at the university level, with strong interpersonal and leadership skills developed while leading mid-sized teams on various client assignments.
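The Circuit Breaker pattern noted above can be sketched as a minimal, framework-free state machine in plain Java (class and method names are illustrative, not the Hystrix or Resilience4j API):

```java
import java.time.Duration;
import java.time.Instant;

// Minimal circuit-breaker sketch: fail fast while OPEN, probe once when HALF_OPEN.
class CircuitBreaker {
    enum State { CLOSED, OPEN, HALF_OPEN }

    private final int failureThreshold;
    private final Duration openTimeout;
    private int consecutiveFailures = 0;
    private State state = State.CLOSED;
    private Instant openedAt;

    CircuitBreaker(int failureThreshold, Duration openTimeout) {
        this.failureThreshold = failureThreshold;
        this.openTimeout = openTimeout;
    }

    /** Returns true if a call may proceed; flips OPEN -> HALF_OPEN after the timeout. */
    synchronized boolean allowRequest() {
        if (state == State.OPEN) {
            if (Duration.between(openedAt, Instant.now()).compareTo(openTimeout) >= 0) {
                state = State.HALF_OPEN;   // allow a single trial call through
                return true;
            }
            return false;                  // fail fast without hitting the backend
        }
        return true;
    }

    synchronized void recordSuccess() {
        consecutiveFailures = 0;
        state = State.CLOSED;
    }

    synchronized void recordFailure() {
        consecutiveFailures++;
        if (state == State.HALF_OPEN || consecutiveFailures >= failureThreshold) {
            state = State.OPEN;
            openedAt = Instant.now();
        }
    }

    synchronized State state() { return state; }
}
```

Production implementations add sliding-window failure rates and metrics, but the three-state core is the same.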
TECHNICAL SKILLS:
- Enterprise/Cloud Application Development
- Play framework, Spring Framework/SpringBoot/Spring Cloud - 10 years
- React.js, Node.js, Jest, Mocha, Enzyme, Cypress, Protractor - 5 years
- Python 2.x/3.x, Flask - 5 years
- Multi-threading/JEE/J2EE/JMS/Servlets/JSP/Thymeleaf - 12 years
- Pivotal Cloud Foundry/Docker/OpenShift 3.11 (Kubernetes) - 4 years
- GCP, Azure Cloud, AWS Cloud - 2 years
- Spring Framework (IoC, AOP, JDBC, Spring Data JPA, Hibernate) - 12 years
- Web Services (JSON, Spring REST, SOAP, JAX-WS, JAX-RS, Spring-WS, Apache CXF)
PROFESSIONAL EXPERIENCE:
Confidential
Lead Developer / App Architect
Responsibilities:
- Leading the commercial banking Customer/Agent Chat project and Report Builder projects.
- Built several React-based front ends with Node.js and Spring Boot backends. Integrated a Hazelcast distributed cache to improve microservice response times; in a separate project, used a Redis cache as the backbone of the chat engine. Integrated with a Kafka broker to handle asynchronous report-generation requests, and with Amazon S3 to upload cheque images for Report Builder.
- Implemented the edge-gateway pattern for the APIs using Node.js, handling CSRF- and CORS-level security for the APIs.
- Participated in the SAML-based security setup integrating the Citi Business Online portal with the Citi Velocity portal that hosted the Report Builder app.
- Deployed the projects via CI/CD pipelines built with TeamCity and UrbanCode Deploy. Performed monthly production deploys and resolved issues as needed (related to 2-way SSL setup, certificates, and WIP load-balancer configuration, among others).
- Collaborated with teams in India, China, and US to resolve several production issues and managed ongoing sprint issues as well.
- Fixed several issues in the Commercial Banking and Citi Business Online portals related to OAuth and SAML authentication flows.
Technologies Used: Java 8, Node.js 10.x, Spring Boot 2.4, React 16.x, Redux (for state management), React Hooks, Spring Kafka, Quartz, Spring REST, Spring Integration tests, Spring Data JPA, MongoDB, Confluent Kafka, Swagger, PCF (Pivotal Cloud platform), OAuth, SAML, Redis cache.
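The caching used above (Hazelcast/Redis in front of slower backends) follows a read-through pattern; a minimal sketch, assuming a local ConcurrentHashMap as a stand-in for the distributed cache (all names illustrative):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;

// Read-through cache sketch: the first lookup hits the loader (e.g. a database
// or downstream API); subsequent lookups for the same key are served from memory.
class ReadThroughCache<K, V> {
    private final Map<K, V> store = new ConcurrentHashMap<>();
    private final Function<K, V> loader;

    ReadThroughCache(Function<K, V> loader) {
        this.loader = loader;
    }

    V get(K key) {
        // computeIfAbsent invokes the loader at most once per key under contention
        return store.computeIfAbsent(key, loader);
    }

    int size() { return store.size(); }
}
```

A distributed cache adds eviction, TTLs, and replication on top of the same lookup contract.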
Confidential
Lead Developer / App Architect
Responsibilities:
- This project was part of a multi-year program supporting the migration of SDM's financial systems, aligning with Loblaw's strategy of making SAP ECC and CAR POSDTA the official systems for revenue posting, sales audit, and enterprise reporting.
- Built several PCF/GCP-based data-transformation pipelines using Spring Kafka-based, MQ-based, and Node.js-based microservices to support various input sources (such as ePOS and pharmacy systems like HealthWatch, KROLL, AssistRx, and AMPLE). The data feeds spanned files, queues, and APIs, in a variety of formats and frequencies. These pipelines, built with Spring Cloud Data Flow, fed the transformed data into SAP systems via RFC (SAP's proprietary Remote Function Call), REST, and SOAP interfaces. Wrote data-maintenance stored procedures in PL/SQL for the SAP ECC (Oracle) platform, and tuned SQL query performance using indexing and Explain Plan analysis.
- Built a React-based SPA for business users to analyze and fix data issues and release stuck data records. Implemented OIDC authentication against Azure AD to provide an SSO experience for the users.
- Built several OIDC/OAuth-based integrations with Azure AD (the OIDC provider), on-prem solutions, and off-prem SaaS apps such as FormHero. Provided an SSO experience for Shoppers Drug Mart pharmacists and pharmacy assistants across several Angular-based pharmacy web applications. Used a Node.js edge gateway and the PCF SSO tile to implement the full OIDC authorization code flow, intercepting all calls to the Angular app and the Node.js and Spring Boot APIs. Wrote the Node.js API tests with Jest and Mocha, and the Spring Boot API tests with JUnit and Mockito.
- Did extensive work with Docker containers, deploying them to Pivotal Cloud and GCP. Built Jenkins/Blue Ocean-based CI/CD pipelines to deploy the apps using the Fabric8 docker-maven-plugin, and used Blue Ocean authorization workflows to route deployment approvals through the different environment managers. Used Kubernetes/OpenShift extensively as orchestration platforms, and the Maven Dekorate plugin to generate Kubernetes artifacts.
- This AngularJS-based web application provided a holistic platform for IT colleagues to monitor store profile information (location, devices, software, hardware, scales, and IoT devices such as ESL tags, i.e., Electronic Shelf Labels). It also supported planning of store activities such as store upgrades and device refresh cycles, and helped with analyzing major store incidents and root causes.
- Built a Node.js-based data-ingestion app that consumes data feeds (files and Apache Kafka topics) from various Loblaw and Shoppers' systems.
- Built a Node.js/Express-based API to serve the AngularJS-based front end.
- Deployed the apps using Jenkins based CI/CD pipelines running on VMs.
- Used Postman and SoapUI for local client testing. Wrote unit and integration tests with the Spring Boot testing framework and RestTemplate (using the REST-Assured and Spring MockMvc libraries) to provide over 80% test coverage (reported via JaCoCo) for the microservices codebase. Employed IntelliJ/SonarLint for static code analysis and coding-standards compliance, and used plugins such as Lombok and custom MapStruct mappers for cleaner, terser Java code.
- Employed the OpenSSL toolkit to generate certificates for mutual authentication (2-way TLS) between the various calling clients and SAP.
- Followed agile methodology using on-prem Atlassian Jira/Confluence. Attended daily standups and participated in story-mapping sessions to break features into user stories, with the main focus on delivering working software over elaborate documentation.
Technologies Used: Node.js, Express, Jest, Mocha, Java 8, Spring Boot 2.x, Angular 9, Spring Kafka, Quartz, Spring REST, Spring Integration tests, tcServer, Spring Data JPA, Spring Cloud Data Flow, MongoDB, Confluent Kafka, Swagger, PCF (SSO tile, Spring Cloud Gateway tile, CloudAMQP, Redis On Demand), GCP Cloud Storage, GCP Cloud SQL, GCP external http(s) Load Balancer, GCP App Engine, GCP Cloud Pub/Sub, Apigee, RabbitMQ, IBM MQ, IBM MQ AMQP plugin, Kubernetes/OpenShift, Maven Dekorate (for Kubernetes), Fabric8 (for Docker) plugins.
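The OIDC authorization code flow implemented above is commonly hardened with PKCE (RFC 7636); the project text does not mention PKCE, so this is purely an illustrative Java sketch of the verifier/challenge step:

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;
import java.security.SecureRandom;
import java.util.Base64;

// PKCE sketch: the client keeps code_verifier secret and sends code_challenge
// on the authorization request; the token endpoint later checks the pair.
class Pkce {
    private static final SecureRandom RANDOM = new SecureRandom();

    /** Random URL-safe code_verifier (43 chars here; spec allows 43-128). */
    static String newVerifier() {
        byte[] bytes = new byte[32];
        RANDOM.nextBytes(bytes);
        return Base64.getUrlEncoder().withoutPadding().encodeToString(bytes);
    }

    /** S256 method: code_challenge = BASE64URL(SHA-256(code_verifier)). */
    static String challenge(String verifier) {
        try {
            MessageDigest sha256 = MessageDigest.getInstance("SHA-256");
            byte[] digest = sha256.digest(verifier.getBytes(StandardCharsets.US_ASCII));
            return Base64.getUrlEncoder().withoutPadding().encodeToString(digest);
        } catch (NoSuchAlgorithmException e) {
            throw new IllegalStateException(e); // SHA-256 is always available in the JDK
        }
    }
}
```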
Confidential
Lead Developer / App Architect
Responsibilities:
- Built scalable, resilient microservices for the RBC Defined Contribution Pension Plan (DCPP) and Enterprise Client Identifier (ECI) programs. The ECI program builds on the One RBC initiative to collate all enterprise-wide customer consents in one data source, harden those consents using machine learning and customer feedback, and serve them enterprise-wide via APIs and Kafka data streams. Employed Spring Boot, Spring Data, Kafka (with Kerberos security and Avro support), ElasticSearch, Cassandra, SQData, and Kibana, along with Spring Batch-based data pipelines for cloud-scaled batch file processing (loaded from mainframe systems via the Unix System Services (USS) module).
- Worked closely with the CCP group's solution architects to architect and build the client-identity enrichment pipeline solution for CCP. It involved several RBC systems including Sales Platform, Wealth Management, CCP, ECR, SRF, and CPC. Presented the Kafka-based data-streaming platform architecture to RBC lead architects and IBM executive architects while evaluating it against typical ETL-based pipelines.
- Worked on the Node.js/Express-based Enterprise Client Search API and new-client onboarding solution, and its integration with IBM MDM APIs. Implemented OAuth security patterns using Google's Apigee Gateway and PCF cloud environments. Also worked on the migration from Akana to the Apigee Gateway.
- Improved the performance of our Day-0 data-onboarding solution 20x using ElasticSearch batch APIs, and analyzed and fixed hard data problems in production.
- Built OpenShift containers-based CI/CD pipelines using Jenkins (Pipelines as code feature)/IBM Urban Code deploy tools to deploy code in all environments after running unit, integration and performance tests.
- Used a Redis-based distributed cache service (for APIs running in the PCF cloud) to hold North America-wide branch transit information, giving the APIs millisecond response times when querying it.
- Worked on ETL POCs based on Apache Nifi and IBM DataStage suite of products to consume Avro messages (from Kerberos/SSL secured Kafka brokers). Used Kafka Rest Proxy to build data test pipelines and Kafka ElasticSearch/Cassandra Connectors to push data from Kafka to ElasticSearch/Cassandra.
- Set up all our APIs to use the PCF-based Config Server and Vault to securely store and retrieve sensitive information, with hot refresh of properties (no service restart) via RabbitMQ service-bus integration.
- Provisioned several new ESXi nodes running RHEL VMs, installed SQData (a CDC solution) for the data pipelines, and served as the main point of contact with the Cloud Adoption team to provision Kafka pipelines from dev through production. Deployed all our application components to AWS Elastic Beanstalk.
- Worked with the Dynatrace team to configure alerts for all APIs, enabling push notifications for API issues in the cloud environment.
- Led the dev team to take around 14 components to production following RBC's go-live procedures and change management processes.
Technologies Used: Node.js/Express, Jest, Mocha, Java 8/Spring 5/Spring Boot 2.x, JMeter, Swagger, Dynatrace, IBM DB2, RabbitMQ, IBM MQ, IBM DataStage, Apache NiFi, Pivotal Cloud Foundry 2.5, AWS, ElasticSearch 6.x, Cassandra 3.11, Redis 5.x, Confluent Kafka 5.1, SQData (Change Data Capture solution), Spring Cloud design patterns (Config Server, Eureka, Zuul, Ribbon, Spring Cloud Sleuth, Zipkin, Cloud Data Stream, Hystrix circuit breakers), Spring Cloud Data Flow, Spring Cloud Data Streams, Spring Batch.
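The Day-0 speedup above came from replacing per-record calls with ElasticSearch batch APIs; the core batching step can be sketched in plain Java (illustrative helper, not the ElasticSearch client):

```java
import java.util.ArrayList;
import java.util.List;

// Batching sketch: group records so a bulk endpoint is called once per chunk
// instead of once per record, cutting network round trips dramatically.
class Batcher {
    /** Splits items into consecutive batches of at most batchSize elements. */
    static <T> List<List<T>> partition(List<T> items, int batchSize) {
        List<List<T>> batches = new ArrayList<>();
        for (int i = 0; i < items.size(); i += batchSize) {
            // subList is a view; copy it if batches outlive the source list
            batches.add(items.subList(i, Math.min(i + batchSize, items.size())));
        }
        return batches;
    }
}
```

Each batch would then be sent as one bulk request; batch size is tuned against payload limits and memory.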
Confidential
Sr. Full-Stack Software Engineer
Responsibilities:
- Architected and developed several microservices for the CIBC Cardinal project: the Single Sign-On API, Session Extend API, and CIBC Cross-Border Account Open API.
- Worked closely with the CIBC e-Banking team on architectural discussions to ensure their existing services (CustomerProfile, EligibilityCheck, Data API) served the front-end needs.
- Worked as one of the lead front-end developers on the CIBC Cardinal Angular-based project. Developed several screens and features in the application: the authentication screen, the application form for applicant/co-applicant, analytics reporting to Adobe Digital Insights, OTVC functionality (2-factor authentication), and global error handling/reporting.
- Wrote end-to-end acceptance and component tests to provide test coverage for the application using Protractor and Jasmine. Used TSLint for static code analysis.
Tools/Frameworks used: Angular 5.x, TSLint, TypeScript, Protractor, Jasmine, CSS3/SCSS, HTML5, several CIBC Angular component kits, Spring Boot 2.x, MongoDB 3.x, Swagger, Pivotal Cloud Foundry, Spring Cloud design patterns (Config Server, Eureka, Zuul, Ribbon, Spring Cloud Sleuth, Zipkin, Hystrix circuit breaker)
Confidential
Sr. Microservices Developer/Architect
Responsibilities:
- Worked under aggressive timelines to build Big Data Analytics PoC for Technology & Operations Head in RBC. This employed SpringData/Kafka (with Kerberos Security/Avro support)/ElasticSearch/Cassandra, Kibana analytics dashboards showing live customers' purchase behavior and demographics information. It also employed Spring-Batch based parallel partitioned pipelines for Cloud-scaled batch file processing (loaded from Mainframe systems using Unix system services (USS) module).
- Built scalable, resilient Microservices using Node.js/Express frameworks for RBC Online Banking, RBC Insurance, Credit Services Bus - built and productionized various services which include - Core banking services for Multi-Bill Pay services, TransUnion Bureau Credit Check services, New Auto-Save Account Open Services. Handled the offline and online processing scenarios.
- Architected and Built Transaction Description enrichment pipeline solution for RBC online banking. It employed Kafka/ElasticSearch/Cassandra and involved several RBC systems including COLT (Canadian Online Teller), DDA, CCP, PTB/CIS, SRF, CPC. I also gained in-depth business knowledge about all these systems.
- Built an Angular-based Swagger reverse-engineering portal, enabling easier collaboration with business analysts: it allowed setting up microservice contracts, test data, and published endpoints for quick prototyping, all without developers having to write any code.
- Built and architected LDAP/Global Directory APIs to be used across the bank for Client Profile/Client Appointments Booking services.
- Designed Tokenization based micro-service - an HSM Cryptography solution. This solution is getting used across the bank by a variety of micro-services.
- Worked closely with Pivotal and their advisory architects during a 6-week Dojo exercise and identified the best design patterns to employ at RBC for microservice architectures. Attended Confluent Kafka training to help identify best practices for RBC Kafka implementations. Certified as a Confluent Certified Kafka Developer.
Tools/Frameworks used: Java 8/Spring 4.x/Spring Boot 1.5, JavaScript ES2015/Angular 2.x, TSLint, Protractor, Jasmine, Node.js, JMeter, Swagger, Dynatrace, SQL Server, AWS Lambda, IBM Bluemix, ElasticSearch 5.x, Cassandra 3.0, AWS Elastic Beanstalk, Confluent Kafka 4.x, Pivotal Cloud Foundry, Spring Cloud design patterns (Config Server, Eureka, Zuul, Ribbon, Spring Cloud Sleuth, Zipkin, Cloud Data Stream, Hystrix circuit breaker)
Confidential
Sr. Java Consultant / Architect
Responsibilities:
- Worked in Citi Market Operations Technology Division. Architected and developed two high performance back-office trade processing applications.
- Broadridge Realtime Trading Platform (BR RT): a high-performance trade-processing system for mortgage-backed securities that receives trades from Broadridge (Impact/BPSA) and builds real-time journals, firm positions, and settlement ladders (projections), publishing them downstream to Citi's Liquidity Desk and Collateral Optimization Platform.
- Global Stock Record (GSR) processes fixed income products like borrow/loans and repos/reverse-repos, from upstream systems like LOANET, MATRIX and GFDTS. It also builds real time positions/activities and uses Oracle Golden Gate technology to send it to our Global Positions Data Warehouse which acts as the central data lake for all Citi systems.
- Architected our latest initiative: the GPDW data warehouse will serve real-time positions and trade updates via a secure REST API to ASPEN, the Citibank-wide asset-servicing platform. We chose the binary Google Protocol Buffers message format for its high serialization performance and compact size compared to JSON.
- Helped architect and develop an in-house web-app-based solution for automating our end-to-end integration test cases, which QA resources can launch and validate from a lightweight front end.
- Built single click (via Jenkins automated jobs) production load automation utilities that run a given timeframe’s production trades load in UAT environments.
Tools/Frameworks used: Java 8, REST, Quartz, JEE/JMS, Jenkins, ELK Stack, Spring Integration, Oracle (SQL, PL/SQL), MongoDB 2.x, Spring (Core, AOP, Batch), Core Java (with multithreading)
Confidential
Sr. Designer / Developer
Responsibilities:
- This project involved designing and developing a brand-new Lottery Gateway web-services system that integrated with the existing (OpenVMS-based) legacy backend systems that run lottery operations. An iGaming (online) vendor used this Lottery Gateway as the entry point to OLG infrastructure to buy lottery tickets and validate winning tickets. The PlayOLG.ca portal is live, using this Lottery Gateway web-service backend in production.
- Worked as the Java/web-services SME in the design process with the lottery specialists to design a multi-threaded Lottery Gateway system that would support 250 transactions/second.
- Evaluated several technology stacks and decided the tool stack for the Lottery Gateway system (considering client budget and functional requirements).
- Designed and developed the (JAX-WS-based) web services for the Buy and Validate functionality, along with the supporting backend infrastructure: Communication Manager, Device Manager, format converters, and Random Number Generator. JAXB was used as the data-binding framework.
- Used Hazelcast to maintain a high performance, distributed cache of backend handles (limited number of mainframe connections) for the front-end to call.
- Employed Java concurrency API & multithreading design patterns like Producer-Consumer pattern, and also did server-side performance tuning (like http thread pools, EJB instance pools) to achieve desired throughput (~250 tps) from the system.
- Employed Glassfish’s container managed lifecycle listener feature for running important business threads to make sure these threads get notified about server startup and shutdown events.
Tools/Frameworks used: Java 7, LoadUI/TCPMon/SoapUI Pro, JAXB data-binding framework, JAX-WS web services, Oracle GlassFish 3.1.2/NetBeans 7.2
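The Hazelcast-backed cache of limited backend handles described above behaves like a bounded pool shared by producer/consumer threads; a minimal sketch using a local BlockingQueue in place of the distributed structure (names illustrative):

```java
import java.util.Collection;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// Bounded handle-pool sketch: a fixed set of backend connections is borrowed
// and returned, so the front end never exceeds the mainframe's connection limit.
class HandlePool<T> {
    private final BlockingQueue<T> handles;

    HandlePool(Collection<T> initial) {
        // fair queue so waiting threads obtain handles in FIFO order
        this.handles = new ArrayBlockingQueue<>(initial.size(), true, initial);
    }

    /** Returns a handle immediately, or null when all handles are checked out. */
    T tryBorrow() {
        return handles.poll();
    }

    /** Returns a borrowed handle to the pool for the next caller. */
    void release(T handle) {
        handles.offer(handle);
    }

    int available() {
        return handles.size();
    }
}
```

A blocking variant would use `poll(timeout, unit)` so callers wait briefly under load instead of failing, which is how the producer-consumer throughput tuning described above typically plays out.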
Confidential
Tech Java Team Lead
Responsibilities:
- This project involved designing and developing a brand-new Notional Allocation module for the Affordable Housing project. The module allows Finance ministry resources to allocate and administer funding (almost a billion dollars every financial year) for all service managers and ministry users across Ontario. In addition, all five existing modules in the AIMS project had to be modified to integrate with the Notional Allocation module for cross-validation of complex business rules.
- Worked as the lead Java developer on the Notional Allocation project with a team of 3 developers and developed 40% of the application.
- Developed the middle-tier business logic using EJB (employing various JEE design patterns).
- Designed and developed the persistence layer using JPA persistence framework and PL/SQL stored procedures.
Confidential
Sr. Java Consultant
Responsibilities:
- This project involved designing and developing a brand-new, web-based GICServ application for FundSERV using cutting-edge web development technologies such as JSF2 and the jQuery JavaScript framework.
- The application is used by various product issuers, intermediaries, and distributors to place product requests and track their progress.
- Dealers could place orders for the published products using the JMS queue-based order-processing engine. The middle tier was written in Spring, business rules were authored using the Drools rules engine, and user information was retrieved via the Java LDAP API.
- A separate project involved adding new features and fixing existing ones in the client's GRC (Governance, Risk & Compliance) product, BPS Server.
- Designed/Developed several product features as Sr. Java Consultant.
Confidential
Sr. Programmer Analyst
Responsibilities:
- Completed the following business projects while working as a Sr. Programmer Analyst in the Rogers Dealers Web Applications Group:
- Integrated Customer Management (ICM).
- Fido Redeem Dollars and Fido (In-store) Renewals.