
Sr. Cloud Engineer/Architect Resume


Long Beach, CA

SUMMARY

  • An accomplished software DevOps engineer specializing in API design and end-to-end solutions across products, with extensive analysis and full Software Development Lifecycle (SDLC) experience covering application software specification, design, security, development, automation, testing, debugging, and maintenance.
  • Almost 9 years of IT experience in the analysis, design, security, development, and implementation of application architectures.
  • Migrated a column-organized IBM DB2 for z/OS database to AWS RDS (Aurora).
  • Enabled Multi-AZ deployments for AWS RDS instances for enhanced availability and durability.
  • Enabled RDS read replicas for better read performance.
  • Coded AWS Lambda functions in Node.js and wrote unit tests for the source code (see the sketch after this list).
  • Enabled encryption between components by using AWS KMS.
  • Coded a Python script for a one-time data load using Informatica.
  • Coded a daily data-ingestion job for sync updates to the database.
  • Created Route 53 records for DNS management and a private traffic route to API Gateway.
  • Experience with all components of the Apigee Edge environment, including Apigee BaaS, Apigee Sense, and the Apigee Developer Portal.
  • Moderate-level skills in jQuery, JavaScript, JSON, HTML5, DHTML, CSS3, Tiles, and tag libraries.
  • Designed REST and SOA architectures applying REST principles and security.
  • Experience with Apigee anti-patterns, such as caching error responses from the backend, and with Quota policies for traffic management.
  • Experience with flow hook and shared flow concepts, including building and deploying flow hooks and shared flows.
  • Used Backbone.js and React.js to create controllers that handle client-triggered events and send requests to the server, and Highcharts.js for generating reports.
  • Experience using a load balancer to manage traffic as an Edge Microgateway administrator.
  • Hands-on experience creating API proxies in Apigee Edge using Node.js and JavaScript.
  • Owned and supported a smooth transition from legacy products to newer versions of Apigee Edge, and successfully migrated customers who were using competing products.
  • Extensive experience with Apigee tools and creating developer portals.
  • Expert-level skills in jQuery, JavaScript, ActionScript, JSON, HTML, DHTML, CSS, Tiles, and tag libraries.
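
Illustrative sketch (not project code): a minimal Node.js Lambda handler with a Mocha unit test of the kind referenced above. The handler logic, event shape, and file names are hypothetical.

    // handler.js -- hypothetical Lambda that validates an incoming record (sketch only)
    exports.handler = async (event) => {
      const record = event.detail || {};
      if (!record.id) {
        return { statusCode: 400, body: JSON.stringify({ error: 'id is required' }) };
      }
      // ... persist the record to RDS/DynamoDB here ...
      return { statusCode: 200, body: JSON.stringify({ id: record.id, status: 'accepted' }) };
    };

    // handler.test.js -- Mocha unit tests for the handler, run with `npx mocha`
    const assert = require('assert');
    const { handler } = require('./handler');

    describe('handler', () => {
      it('rejects events without an id', async () => {
        const res = await handler({ detail: {} });
        assert.strictEqual(res.statusCode, 400);
      });
      it('accepts events with an id', async () => {
        const res = await handler({ detail: { id: '42' } });
        assert.strictEqual(res.statusCode, 200);
      });
    });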

TECHNICAL SKILLS

API MANAGEMENT: Apigee Edge UI, Apigee Analytics, Apigee Developer Portal

WEB TECHNOLOGIES: JavaScript, XML, JSON, XSLT

WEB SERVICES: JAX-WS, JAX-RS, SOAP, REST

DATABASES: IBM DB2 z/OS, PostgreSQL, Cassandra, MySQL, SQL Server, AWS RDS, AWS DynamoDB

MODELING TOOLS: UML, Visio, draw.io

TESTING TOOLS: Postman, SOAP UI

VERSION CONTROL: GitLab, GitHub, Azure Git

PROFESSIONAL EXPERIENCE

Confidential, Long Beach, CA

Sr. Apigee Engineer

Environment: Apigee Edge UI, Apigee Developer Portal, Postman

Responsibilities:

  • Gathered Apigee requirements in pre-initiative meetings to make sure the minimum requirements to start were in place.
  • Worked with security teams to resolve IP and networking issues.
  • Implemented and was involved in end-to-end development projects.
  • Worked on various façade patterns; publishers included Salesforce, Pega, and CVS.
  • Implemented IP authorization at the DNS level and end-to-end network-level testing.
  • Implemented projects in Apigee end to end through production.
  • Implemented Azure OAuth 2.0, including three-legged and two-legged OAuth flows (see the sketch after this list).
  • Implemented and used all types of API proxies, with request validation using schemas.
  • Developed REST APIs using Swagger specifications.
  • Implemented RESTful web services to retrieve data from the client side, made REST API calls, parsed and projected the data back to clients, and handled security and authorization using OAuth 2.0.
  • Built API gateway-specific logic into proxies or wrapper APIs.
  • Used Git for version control of common source code shared by developers.
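
Illustrative sketch (not project code): a minimal two-legged (client credentials) OAuth 2.0 token request of the kind referenced above, in Node.js 18+ using the built-in fetch. The token endpoint URL and environment variable names are placeholders.

    // getToken.js -- obtain an access token via the OAuth 2.0 client credentials grant (sketch)
    async function getToken() {
      const creds = Buffer.from(
        `${process.env.CLIENT_ID}:${process.env.CLIENT_SECRET}`
      ).toString('base64');

      const res = await fetch('https://api.example.com/oauth/v1/token', {
        method: 'POST',
        headers: {
          Authorization: `Basic ${creds}`,
          'Content-Type': 'application/x-www-form-urlencoded',
        },
        body: 'grant_type=client_credentials',
      });

      if (!res.ok) throw new Error(`Token request failed: ${res.status}`);
      const { access_token } = await res.json();
      return access_token; // sent as "Authorization: Bearer <token>" on later API calls
    }

    module.exports = { getToken };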

Confidential, Irving, Texas

Sr. Cloud Engineer/Architect

Environment: AWS, React, Kubernetes, Docker

Responsibilities:

  • Analyzed and understood the requirements of the project work and the delivery approach.
  • Implemented a single-page web app in React.js with multiple components and API calls (see the component sketch after this list).
  • Created multiple proxies in Apigee for REST and SOAP calls, and managed multiple APIs across different platforms.
  • Coded and deployed several AWS Lambda functions in Node.js.
  • Used Docker to containerize the application.
  • Used Kubernetes (K8s) to orchestrate the Docker containers.
  • Implemented and used multiple AWS services such as Secrets Manager, Lambda, and RDS.
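
Illustrative sketch (not project code): a minimal React component of the kind used in the single-page app above, fetching data through an API call. The endpoint path and field names are hypothetical.

    // OrderList.jsx -- hypothetical React component that fetches and renders data (sketch)
    import React, { useEffect, useState } from 'react';

    export default function OrderList() {
      const [orders, setOrders] = useState([]);
      const [error, setError] = useState(null);

      useEffect(() => {
        fetch('/api/orders') // placeholder endpoint behind the API proxy
          .then((res) => (res.ok ? res.json() : Promise.reject(res.status)))
          .then(setOrders)
          .catch(setError);
      }, []);

      if (error) return <p>Failed to load orders.</p>;
      return (
        <ul>
          {orders.map((o) => (
            <li key={o.id}>{o.name}</li>
          ))}
        </ul>
      );
    }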

Confidential

Sr. Cloud Engineer/Architect

Environment: AWS

Responsibilities:

  • Determined the on-prem requirements and learned how data behaves in the real-time environment.
  • Used AWS RDS with the latest PostgreSQL engine version.
  • Assessed the on-prem data and migrated it into AWS RDS using Informatica ETL.
  • Configured security for the use case, including security groups, VPC connections, NAT IP addresses, and ingress/egress rules.
  • Configured route rules in the VPC and carefully mapped them to the security groups.
  • Worked on networking inside AWS and gained diverse experience securing AWS instances.
  • Secured and tested Amazon Web Services instances by assigning VPCs and subnets.
  • Created and updated CloudFormation stacks and templates for almost every instance.
  • Redirected traffic to new HTTPS URLs whenever a new version was released.
  • Held multiple architecture reviews of each use case covering various aspects of data security.
  • Used KMS policies to encrypt and decrypt data at rest and in transit.
  • Dealt with structured and unstructured data and data streaming, loading data into S3 buckets and pulling that data into the various databases available in Amazon Web Services.
  • Determined roles and policies, and configured all IAM roles and policies for the whole use case.
  • Worked on multiple logins into AWS using Azure tools.
  • Worked on IAM groups, inline policies, custom policies, and custom policy versions.
  • Valuable experience coding Lambda functions in the Node.js framework.
  • Created a data-synchronization path using a Lambda function, providing a continuous flow of database updates (see the sketch after this list).
  • Worked on AWS API Gateway and configured a private gateway for the application (transit gateway).
  • Configured Route 53 for private traffic and DNS for this migration only.
  • Performed data conversion in Lambda functions per the IBM conversion documentation, using delimiters for repeated data values.
  • Experience troubleshooting with CloudTrail and logging with CloudWatch.
  • Used AWS SNS to send email alerts and text messages whenever there is a change in the database.
  • Created an EC2 instance and assigned a security group that allows only HTTPS (443) and database (5432) calls to the database (see the security-group sketch after this list).
  • Coded the whole AWS project in the latest Terraform version and used Terrascan for static analysis of the Terraform code.
  • Configured and executed Git and Jenkins pipelines along with an AWS CodeCommit and CodeBuild pipeline.
  • Determined which database was needed per the business requirements.
  • Implemented scalable session management in the relational database.
  • Used Terraform for infrastructure code and a Jenkins pipeline for automation.
  • Used Node.js for coding Lambda functions and Mocha.js for writing their test cases.
  • Used Python for the initial data load and the \copy command in EC2 to transfer data to AWS RDS.
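
Illustrative sketch (not project code): a minimal Node.js Lambda, using the AWS SDK for JavaScript v3, showing the pattern described above of decrypting a KMS-encrypted database credential and publishing an SNS alert after a sync update. The topic ARN, environment variables, and the applyUpdate helper are hypothetical.

    // syncHandler.js -- Lambda sketch: decrypt a DB credential with KMS, apply a sync
    // update, then publish an SNS alert. ARN, env vars, and applyUpdate() are placeholders.
    const { KMSClient, DecryptCommand } = require('@aws-sdk/client-kms');
    const { SNSClient, PublishCommand } = require('@aws-sdk/client-sns');

    const kms = new KMSClient({});
    const sns = new SNSClient({});

    exports.handler = async (event) => {
      // Decrypt the base64-encoded DB password stored in an environment variable.
      const { Plaintext } = await kms.send(new DecryptCommand({
        CiphertextBlob: Buffer.from(process.env.ENCRYPTED_DB_PASSWORD, 'base64'),
      }));
      const dbPassword = Buffer.from(Plaintext).toString('utf8');

      // applyUpdate() is a hypothetical helper that writes the event record to RDS.
      // await applyUpdate(dbPassword, event);

      // Notify subscribers that the database changed.
      await sns.send(new PublishCommand({
        TopicArn: process.env.ALERT_TOPIC_ARN,
        Subject: 'Database sync update applied',
        Message: JSON.stringify({ source: event.source, time: new Date().toISOString() }),
      }));

      return { statusCode: 200 };
    };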
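
Illustrative sketch (not project code): the security-group rule described above (HTTPS 443 in, PostgreSQL 5432 to the database), written here with the AWS SDK for JavaScript v3 rather than the Terraform used on the project. The VPC ID and CIDR ranges are placeholders.

    // securityGroup.js -- create a security group allowing only HTTPS (443) in and
    // PostgreSQL (5432) out to the database subnet (sketch only).
    const {
      EC2Client,
      CreateSecurityGroupCommand,
      AuthorizeSecurityGroupIngressCommand,
      AuthorizeSecurityGroupEgressCommand,
    } = require('@aws-sdk/client-ec2');

    const ec2 = new EC2Client({});

    async function createAppSecurityGroup() {
      const { GroupId } = await ec2.send(new CreateSecurityGroupCommand({
        GroupName: 'app-sg',
        Description: 'Allow HTTPS in and PostgreSQL out only',
        VpcId: 'vpc-0123456789abcdef0', // placeholder VPC
      }));

      await ec2.send(new AuthorizeSecurityGroupIngressCommand({
        GroupId,
        IpPermissions: [{
          IpProtocol: 'tcp', FromPort: 443, ToPort: 443,
          IpRanges: [{ CidrIp: '0.0.0.0/0', Description: 'HTTPS only' }],
        }],
      }));

      await ec2.send(new AuthorizeSecurityGroupEgressCommand({
        GroupId,
        IpPermissions: [{
          IpProtocol: 'tcp', FromPort: 5432, ToPort: 5432,
          IpRanges: [{ CidrIp: '10.0.0.0/16', Description: 'PostgreSQL to DB subnet' }],
        }],
      }));

      return GroupId;
    }

    module.exports = { createAppSecurityGroup };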
