
Sr. Cloud Engineer Resume

SUMMARY

  • Proficient across the DevOps life cycle on public and private cloud platforms, using containerization, configuration management, version control, and automation tools along with various scripting languages.
  • Experience designing, deploying, maintaining, and operating AWS resources: EC2, EBS, IAM, S3, ELB, RDS, VPC, Route 53, OpsWorks, CloudWatch, Kinesis, KMS, CloudFormation and Terraform templates, Auto Scaling groups (ASG), Lambda, EMR, Redshift, etc.
  • Created and managed virtual machines in Microsoft Azure, set up communication with the help of endpoints, and migrated VMs from transitional VMware hosts.
  • Experience administering and troubleshooting Azure IaaS components (VMs, Storage, VNet, OMS, NSG, site-to-site VPN, RBAC, Load Balancers, Availability Sets).
  • Good knowledge of REST, JSON, APIs, microservices, JavaScript, jQuery, Node.js, Perl, Bash, YAML, Shell, Python, and PowerShell.
  • Expert in deploying code through web application servers such as WebSphere, WebLogic, Apache Tomcat, and JBoss, including their installation, configuration, management, and troubleshooting.
  • Experience installing, configuring, supporting, and troubleshooting Unix/Linux networking services and protocols such as NIS, NIS+, OSPF, LDAP, DNS, NFS, DHCP, SAN, NAS, FTP, SSH, and Samba.
  • Experienced in custom VPC configurations and CloudFormation templates for rapid deployments, high availability, and Well-Architected Framework designs for the cloud.
  • Experienced in using Docker in swarm mode and Kubernetes for container orchestration, writing Dockerfiles and setting up automated builds on Docker Hub.
  • Implemented several Continuous Integration and Continuous Delivery (CI/CD) pipelines for various products using Hudson, Bamboo, and Jenkins, with build tools such as Ant, Maven, and Gradle.
  • Experience with version control tools such as Git, TFS, Subversion (SVN), and CVS, using GitHub and Bitbucket as repositories.
  • Integrated Docker into various infrastructure tools, including Ansible, Puppet, and VMware vSphere Integrated Containers.
  • Hands-on experience with Puppet objects such as resources, classes, manifests, modules, and Puppet Forge.
  • Experienced with AWS IAM: policies, roles, users, groups, access keys, multi-factor authentication (MFA), and SAML SSO.
  • Administered Git repositories, including account administration, branching, merging, patch fixes, and snapshots.
  • Hands-on experience monitoring application and server logs with tools such as Splunk, Nagios, AppDynamics, and the ELK stack (Elasticsearch, Logstash, Kibana).
  • Experience using modern storage and infrastructure services such as Redis, Cassandra, MongoDB, and RDS.
  • Strong grasp of current and emerging networking technologies, including TCP/IP, IPv4/v6, RIP, EIGRP, OSPF, BGP, Frame Relay, VPN, wireless LAN, and configuration of VLANs.
  • Experience installing firmware upgrades and kernel patches, and performing system configuration and performance tuning on Unix/Linux systems.
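As a sketch of the custom VPC and CloudFormation work described above, a minimal template can be assembled with Python's standard library; the logical names (AppVPC, PublicSubnet) and CIDR ranges here are illustrative placeholders, not from any actual deployment.

```python
import json

# Minimal CloudFormation template: a custom VPC with one public subnet.
# Logical names and CIDR blocks are illustrative only.
template = {
    "AWSTemplateFormatVersion": "2010-09-09",
    "Description": "Custom VPC with a single public subnet (sketch)",
    "Resources": {
        "AppVPC": {
            "Type": "AWS::EC2::VPC",
            "Properties": {
                "CidrBlock": "10.0.0.0/16",
                "EnableDnsSupport": True,
                "EnableDnsHostnames": True,
            },
        },
        "PublicSubnet": {
            "Type": "AWS::EC2::Subnet",
            "Properties": {
                # Ref ties the subnet to the VPC declared above.
                "VpcId": {"Ref": "AppVPC"},
                "CidrBlock": "10.0.1.0/24",
                "MapPublicIpOnLaunch": True,
            },
        },
    },
}

# Emit the JSON that would be handed to CloudFormation.
print(json.dumps(template, indent=2))
```

A real template would add an internet gateway, route tables, and outputs; this only shows the resource-and-Ref structure.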

TECHNICAL SKILLS

Operating Systems: Red Hat, CentOS, SUSE, Ubuntu, Windows, Solaris, Amazon Linux.

Cloud Platforms: AWS, Azure, OpenStack, PCF.

Containerization Tools: Docker, Kubernetes, Mesos, OpenShift.

Version Control Tools: Git, Subversion, CVS, Bitbucket, Gerrit.

CI/CD Tools: Jenkins, TFS, Bamboo.

Configuration Management Tools: Chef, Puppet, Ansible.

Build Tools: Ant, Maven.

Language/Scripting: Python, YAML, JSON, Ruby, Perl, C, C++, Java/J2EE, JavaScript, PowerShell, Groovy, Bash.

Application Servers: Apache Tomcat, JBoss, WebSphere.

Database: SQL, MySQL, NoSQL (DynamoDB, MongoDB, Cassandra), Amazon RDS, Redshift, Oracle.

Monitoring Tools: Nagios, New Relic, ELK, CloudWatch, Splunk, Alertsite, Azure Monitor.

PROFESSIONAL EXPERIENCE

Confidential

Sr. Cloud Engineer

Responsibilities:

  • Built, maintained, and administered AWS and Azure infrastructure delivering PaaS, IaaS, and SaaS services, supporting the business with cost-effective, high-performance environments.
  • Automated environment builds by creating CloudFormation, ARM, and Terraform templates to provision services such as VPC, EMR clusters, Elastic Kubernetes Service (EKS), RDS, Redshift, EC2, Route 53, and ELB in AWS, and Azure Kubernetes Service (AKS), VM scale sets, Snowflake DB, and Cosmos DB in Azure.
  • Created Docker containers in Jenkins pipeline release builds, pushing custom Docker images to Elastic Container Registry (ECR), then deploying and maintaining them in Elastic Container Service (ECS).
  • Supported and maintained data management applications in the AWS environment, such as Spark, HBase, Hive, ETL tools, Elasticsearch, and Kibana, to provide centralized and unified data.
  • Created pods in AKS using Helm charts to monitor live streaming webserver data, utilizing Kafka and ZooKeeper services.
  • Created pipelines in Azure VSTS/DevOps to automate data transfer from Snowflake DB to the webserver, automating the process with Azure Functions.
  • Supported and coordinated with network teams to enable secure VPN tunnels traversing a VPN device (FortiGate or Palo Alto), using route tables, subnets, security groups, and Secure Copy Protocol (SCP) to provide access from on-premises to the cloud.
  • For security and quality compliance, monitored all open-source applications using Black Duck to provide comprehensive software composition analysis.
  • Built VPC peering connections to enable bidirectional communication and data transfer between VPCs in different AWS accounts using private IPv4 or IPv6 addresses.
  • Secured data processed within the AWS environment using services such as key pairs, security groups, network access control lists, AWS Key Management Service, EBS encryption, database encryption, and enforced SSL.
  • Provided durable and available data using S3 with versioning and lifecycle policies, configuring an SFTP server, and using Glacier as a disaster recovery option.
  • Worked with AWS CloudWatch and CloudTrail to set up monitoring and alerts, log all activity within AWS, and trigger ETL processes that push messages across environments.
  • Created Lambda functions, CloudFormation templates, and automation scripts in languages such as Python, YAML, JSON, Shell, and Groovy.
  • Supported different teams as a Linux administrator, maintaining, configuring, and building new servers on request.
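The CloudWatch-alert-driven Lambda work above can be sketched as a plain Python handler that parses an alarm notification delivered via SNS; the event shape follows the standard SNS-to-Lambda payload, and the alarm names in the example are hypothetical.

```python
import json

def lambda_handler(event, context):
    """Parse CloudWatch alarm notifications delivered through SNS and
    return a compact summary suitable for forwarding to a downstream
    ETL trigger. Field names follow the SNS/CloudWatch payload format."""
    results = []
    for record in event.get("Records", []):
        # The SNS message body is itself a JSON-encoded alarm document.
        message = json.loads(record["Sns"]["Message"])
        results.append({
            "alarm": message["AlarmName"],
            "state": message["NewStateValue"],
            "reason": message.get("NewStateReason", ""),
        })
    return {"statusCode": 200, "alarms": results}
```

Locally, the handler can be exercised by passing a sample SNS event dict and a `None` context, which is how such functions are commonly unit-tested before deployment.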

Confidential

Sr. Cloud and DevOps Engineer

Responsibilities:

  • Administered AWS environments on the multi-tier Datica platform to achieve HIPAA compliance.
  • Created Elastic Load Balancers (ELB) and EC2 instances using Terraform templates for auto scaling, monitoring Dev, QA, and Stage environments running on different VPCs.
  • Created Ansible modules with Python and shell scripting to wrap code deployments from JFrog Artifactory.
  • Deployed containerized applications from custom Docker images and maintained Kubernetes clusters using pods, replication controllers, labels, and health checks.
  • Used YAML node anchors and aliases in Jenkins pipeline configuration to reuse a single definition across different environments.
  • Configured applications such as New Relic and SFTP web clients to enable authentication protocols like Security Assertion Markup Language (SAML) for single sign-on (SSO).
  • Migrated PHI data from the network shared drive to an S3 bucket and granted user access through SSO authorization.
  • Used the Gradle dependency management system to build source code and deploy snapshot and release versions to JFrog Artifactory.
  • Configured an Nginx service proxy and associated SSL certificates to securely proxy web traffic to specific services.
  • Monitored application performance by integrating New Relic with applications, and used the ELK stack to collect, search, and analyze log files from the servers.
  • Configured monitoring alerts for production sites in AlertSite and used DejaClick transaction recordings for single-page sites.
  • Used Atlassian tools: JIRA for task/bug/request tracking, Confluence for collaboration, and Bitbucket for sharing code.
  • Processed database records in Percona and PostgreSQL by writing Python and shell scripts for the compilation and deployment process.
  • Coordinated with the Datica support team to create services and provide access per developers' needs.
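Custom Ansible modules like the ones mentioned above are Python scripts that receive JSON arguments and print a JSON result. The sketch below shows that shape with stdlib only; the `src`/`dest` argument names and the copy-from-Artifactory scenario are illustrative, and a production module would use Ansible's `AnsibleModule` helper instead.

```python
import json
import sys

def run_module(args):
    """Core logic of a deployment module: decide whether copying an
    artifact from an Artifactory cache path to a target path would
    change the host. Paths and names here are illustrative only."""
    src = args["src"]
    dest = args["dest"]
    # A real module would fetch the artifact (e.g. via the Artifactory
    # REST API) and compare checksums; here we only report the intent.
    changed = src != dest
    return {"changed": changed, "src": src, "dest": dest}

if __name__ == "__main__" and len(sys.argv) > 1:
    # Old-style Ansible modules receive their arguments as a JSON file
    # path and must print a single JSON document to stdout.
    with open(sys.argv[1]) as f:
        module_args = json.load(f)
    print(json.dumps(run_module(module_args)))
```

Keeping the logic in a plain function (`run_module`) separate from the argument-file plumbing makes the module testable without an Ansible controller.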

Confidential

Cloud Engineer

Responsibilities:

  • Performed automated deployments in AWS by creating IAM roles, using the CodePipeline plugin to integrate Jenkins with AWS, and creating EC2 instances to provide virtual servers.
  • Built and customized AWS infrastructure for various applications from scratch using IAM, security groups, VPCs, OpsWorks, ELB, Auto Scaling, RDS, public/private subnets, roles, and policies, and established connectivity between AWS resources.
  • Modified CloudFormation templates to reroute NAT from an instance to a NAT gateway, improving VPC performance and reducing inbound traffic to the instances.
  • Created and configured Lambda functions to access resources such as RDS and DynamoDB from within a VPC whenever a CloudWatch alert triggered.
  • Configured workstations, bootstrapped nodes, generated cookbooks, and wrote recipes, uploading them to the Chef server; managed on-site application packages using Chef, as well as AWS EC2/S3/ELB, with Chef cookbooks.
  • Automated the build process for core AMIs used by all application deployments, including Auto Scaling, and incorporated Chef into the JSON CloudFormation scripts for configuration deployment onto the nodes.
  • Integrated Jenkins with JSON through a locally designed JSON plugin used to create AWS CloudFormation templates.
  • Performed source code analysis before deployment to assess code quality and security using Fortify Static Code Analyzer, SonarQube, and Black Duck.
  • Assisted in deploying applications on multiple WebSphere servers and maintained load balancing, high availability, and failover functionality.
  • Installed and configured Oracle WebLogic and JBoss on Red Hat Enterprise Linux, CentOS, and Ubuntu; supported Apache and Tomcat; and generated SSL keys to renew SSL certificates on webservers.
  • Assigned monitoring tools such as Wily, Datadog, and the ELK (Elasticsearch, Logstash, Kibana) stack depending on the application's type, functionality, and usage.
  • Worked on the Unix platform, including installation, configuration, and maintenance of applications across Development, Staging, and Production environments.
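The NAT rerouting described above (switching a CloudFormation route from a NAT instance to a NAT gateway) can be sketched as a small template transform; the resource names `NatGateway` and `DefaultRoute` are hypothetical placeholders.

```python
def reroute_nat(template):
    """Rewrite every AWS::EC2::Route in a CloudFormation template dict
    so that routes pointing at a NAT instance (InstanceId) point at a
    NAT gateway (NatGatewayId) instead. 'NatGateway' is a placeholder
    logical name assumed to exist elsewhere in the template."""
    for resource in template["Resources"].values():
        if resource["Type"] == "AWS::EC2::Route":
            props = resource["Properties"]
            if "InstanceId" in props:
                # Drop the NAT-instance target and substitute the gateway.
                props.pop("InstanceId")
                props["NatGatewayId"] = {"Ref": "NatGateway"}
    return template
```

A one-time transform like this lets existing templates be migrated mechanically instead of editing each route by hand.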

Confidential

Cloud Engineer

Responsibilities:

  • Used the Azure portal and Azure Stack resources such as Azure Cosmos DB, Azure Active Directory, Cloud Services, Container Instances, Azure DNS, HDInsight, and Traffic Manager for IaaS, PaaS, and SaaS.
  • Installed and configured Azure AD, AD Connect, ADFS, and the ADFS proxy component; set up ADFS for SSO to support authentication protocols such as Security Assertion Markup Language (SAML) and token-based Kerberos.
  • Automated Docker container builds with a Continuous Integration (CI) server and converted the staging and production environments from a handful of Azure nodes to a single bare-metal host running Docker.
  • Implemented a Continuous Integration and Continuous Delivery (CI/CD) pipeline with Docker, Jenkins, GitHub, and Azure Container Service.
  • Worked with the blobstore to store and manage application code packages and buildpacks, and managed the lifecycle of containers and processes using the Diego cell rep in PCF.
  • Deployed Spring Boot microservices to Pivotal Cloud Foundry (PCF) using buildpacks and Jenkins for CI/CD, and worked with PCF Dev to push and scale apps for debugging applications locally on a PCF deployment.
  • Automated infrastructure activities such as continuous deployment, application server setup, and stack monitoring using Ansible playbooks.
  • Created and maintained content for the Ansible community and implemented Ansible modules based on customer and community requirements.
  • Worked on Azure Service Fabric, microservices, IoT, and Docker containers in Azure, and was involved in setting up a Terraform continuous build integration system.
  • Managed servers on the Microsoft Azure platform, provisioning Azure Virtual Machine instances with the Chef configuration management tool by bootstrapping nodes and creating cookbooks and recipes to automate system operations.
  • Integrated Azure Container Registry with Docker and Docker Compose, and was actively involved in Docker deployments using Kubernetes.
  • Installed and configured Jenkins CI/CD pipelines, installed plugins, configured security, and created a master and slaves to run multiple parallel builds.
  • Created and maintained containerized microservices, configuring and maintaining a private container registry on Microsoft Azure for hosting images, using Windows Active Directory.
  • Developed Chef cookbooks to manage system configuration for Tomcat, MySQL, and Windows applications, versioning them in Git repositories and on the Chef server.
  • Defined sets of resources and deployment parameters in Azure Resource Manager (ARM) templates to create resource groups of virtual machines.
  • Wrote automation scripts to provision Azure resources such as virtual machines, virtual networks, Traffic Manager, Storage, Service Bus, and Scheduler, making calls to scripts that provision or configure the servers on the instances.
  • Debugged Chef recipes and their execution, pulling logs into Splunk and the Elasticsearch, Logstash, Kibana (ELK) stack to monitor deployments.
  • Installed and upgraded Red Hat and SUSE Linux server kernels, applying patches on a regular schedule.
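The ARM template work above can be sketched by generating the template document programmatically; the VM name, size, and the trimmed-down property set below are illustrative, and a deployable template would also declare network interfaces, OS profile, and storage.

```python
import json

# Empty ARM deployment template skeleton ($schema and contentVersion
# are the standard required top-level fields).
ARM_SKELETON = {
    "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {},
    "resources": [],
}

def add_vm(template, name, size):
    """Append a minimal virtual-machine resource to an ARM template
    dict. Only the hardware profile is shown; a real VM resource needs
    osProfile, storageProfile, and networkProfile as well."""
    template["resources"].append({
        "type": "Microsoft.Compute/virtualMachines",
        "apiVersion": "2021-03-01",
        "name": name,
        # ARM expression: inherit the location of the resource group.
        "location": "[resourceGroup().location]",
        "properties": {"hardwareProfile": {"vmSize": size}},
    })
    return template
```

Generating templates this way keeps repeated VM definitions in one loop instead of hand-edited JSON, with `json.dumps` producing the file handed to the ARM deployment.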
