Engineering Manager / Delivery Lead Resume
Beaverton, OR
SUMMARY:
- Experience in AWS, blockchain/DLT, capital markets, and pension funds
- Experience in process management, change management, and release management
- Experience includes requirements definition, analysis, design, implementation, and maintenance of large-scale systems
- Experience developing cloud-based systems using Java, Node.js, Go, Elixir, C#, and C++
PROFESSIONAL EXPERIENCE:
Confidential, Beaverton, OR
Engineering Manager / Delivery Lead
Tools: Solidity, Network, Ethereum Wallet, Bitcoin TestNet, Lightning Wallet, Confidential Merchant API
Responsibilities:
- Designed and developed a Confidential innovation project to demonstrate a blockchain-specific use case for Confidential.
- Created a Confidential coin on the Confidential test network and airdropped it to a small group of users, who could then redeem the coin to unlock specific items on the Confidential website. Members could also earn coin credit by completing specific challenges, which increased member activity and participation.
- The system also establishes a use case for cryptocurrencies in the checkout process in the Canadian market, where a user can pay for a product using any cryptocurrency supported by Confidential.
- Coin payments are converted to fiat currency using the Confidential merchant API to eliminate currency-value volatility.
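The earn-and-redeem flow above can be sketched in miniature. This is an illustrative sketch only, not the production system: the catalog, balances, and names (`UNLOCK_COSTS`, `redeem`) are hypothetical stand-ins for the token contract and site backend.

```python
# Hypothetical sketch: redeeming coin balance to unlock items on the site.
# In the real system, balances live on the test network and unlock costs in a catalog.

UNLOCK_COSTS = {"member-jersey": 50, "early-access-drop": 120}  # assumed catalog

def redeem(balances: dict, user: str, item: str) -> bool:
    """Deduct the item's coin cost from the user's balance if they can afford it."""
    cost = UNLOCK_COSTS[item]
    if balances.get(user, 0) < cost:
        return False
    balances[user] -= cost
    return True

balances = {"alice": 100}  # e.g. credited by an airdrop or a completed challenge
print(redeem(balances, "alice", "member-jersey"))  # True; balance drops to 50
```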
Confidential
Tools: Java, Spring Boot, REST, DynamoDB, Node.js, React, BMX CI/CD pipeline
Responsibilities:
- Designed and developed the auditing subsystem and reporting functionality for the Confidential member access tool.
- The auditing subsystem captures all user activity on an invitation and reports on it based on multiple criteria.
- Implementing this feature required changes to the RESTful Spring Boot microservice written in Java, to DynamoDB, and to the user interface written in React/Node.js.
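The auditing idea, capturing every action on an invitation as an event and filtering the log by criteria, can be sketched as follows. This is a hypothetical Python illustration (the real service is a Java/Spring Boot microservice backed by DynamoDB); `AuditEvent`, `record`, and `report` are invented names.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Hypothetical sketch of the auditing subsystem: each action on an invitation
# becomes an immutable event, and reports filter the event log by any criteria.

@dataclass(frozen=True)
class AuditEvent:
    invitation_id: str
    actor: str
    action: str          # e.g. "created", "resent", "revoked"
    at: datetime

log: list[AuditEvent] = []

def record(invitation_id: str, actor: str, action: str) -> None:
    log.append(AuditEvent(invitation_id, actor, action, datetime.now(timezone.utc)))

def report(**criteria) -> list[AuditEvent]:
    """Return events matching every given field, e.g. report(actor="admin")."""
    return [e for e in log if all(getattr(e, k) == v for k, v in criteria.items())]

record("inv-1", "admin", "created")
record("inv-1", "admin", "resent")
record("inv-2", "ops", "created")
print(len(report(invitation_id="inv-1")))  # 2
```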
Confidential, West Lake, CA
Delivery Lead
Responsibilities:
- The Data Aggregation System collects, processes, and distributes data through an internal pipeline to various destinations.
- Collects data from 1,700 sources, puts it into a Kinesis pipeline for standardization and normalization using Pentaho, and delivers it to downstream systems.
- Interviewed, hired, and onboarded a 10-member team in Bangalore; coached the team on AWS services, Realtor systems, and the delivery process.
- Integrated the Bangalore team with the Realtor team across the design, development, and deployment process.
- Conducted weekly dashboard meetings with the Realtor VP and director on project status.
- Coordinated distributed agile standup meetings, sprint planning, and grooming sessions.
- Created the software delivery process for the distributed team, including document management (wiki), JIRA maintenance, automated testing, and continuous deployment.
- Produced a weekly report on source-code quality using GitHub and SonarQube.
- Designed and developed a data warehouse for Confidential. Incoming data files are dropped in S3, which triggers a Lambda producer that shards the data and sends it to a raw Kinesis stream.
- The first Kinesis consumer picks up the file, transforms it into CSV files, and drops them into a transform Kinesis stream; a second consumer picks up the transformed files and copies them to a Redshift staging table. The Redshift data model uses a star schema, with the distribution and sort keys selected based on performance metrics.
- Finally, Tableau is used for visualization.
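The producer's sharding step can be illustrated with a short sketch. This is an assumption-laden stand-in, not the production Lambda: `partition_key` and `to_kinesis_records` are hypothetical helpers, though the `Data`/`PartitionKey` record shape matches what the Kinesis `put_records` API expects.

```python
import hashlib
import json

# Hypothetical sketch of the Lambda producer's sharding step: each incoming
# record gets a partition key so Kinesis can spread load across shards.

def partition_key(source_id: str, num_buckets: int = 8) -> str:
    """Derive a stable bucket from the source id via an MD5 hash (Kinesis itself
    also uses MD5 to map partition keys to shards)."""
    digest = hashlib.md5(source_id.encode()).hexdigest()
    return f"shard-{int(digest, 16) % num_buckets}"

def to_kinesis_records(rows: list[dict]) -> list[dict]:
    """Shape rows the way put_records expects: bytes Data plus a PartitionKey."""
    return [
        {"Data": json.dumps(row).encode(), "PartitionKey": partition_key(row["source_id"])}
        for row in rows
    ]

records = to_kinesis_records([{"source_id": "mls-0001", "price": 450000}])
print(records[0]["PartitionKey"])
```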
- Realtor Image Processing System
- Designed and developed a new image processing system for Confidential.
- Realtor images are uploaded via AWS API Gateway, which triggers a Step Function that manages the workflow.
- The system uses AWS Rekognition to categorize images and eliminate inappropriate content. Finally, images are resized for different client devices using the Pillow API and uploaded to the Edgecast CDN for caching across the CDN network.
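The per-device resize step can be sketched as a pure dimension calculation. `DEVICE_WIDTHS` and `fit_within` are hypothetical names and assumed breakpoints; in the real system Pillow performs the actual pixel scaling (its `Image.thumbnail` preserves aspect ratio in a similar way).

```python
# Hypothetical sketch: compute per-device target dimensions while preserving
# aspect ratio, the calculation behind the Pillow resize step.

DEVICE_WIDTHS = {"mobile": 640, "tablet": 1024, "desktop": 1920}  # assumed profiles

def fit_within(width: int, height: int, max_width: int) -> tuple[int, int]:
    """Scale (width, height) down so width <= max_width, keeping aspect ratio."""
    if width <= max_width:
        return width, height
    scale = max_width / width
    return max_width, round(height * scale)

for device, max_w in DEVICE_WIDTHS.items():
    print(device, fit_within(4000, 3000, max_w))
# mobile (640, 480) / tablet (1024, 768) / desktop (1920, 1440)
```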