
Cloud DevOps Engineer Resume

Dallas, Texas

SUMMARY

  • 8 years of IT experience in roles including Cloud/DevOps Engineer, Linux Administrator and Middleware Engineer across Linux/Unix and Windows 7 & 10 environments. Implemented container orchestration and Continuous Integration, Delivery and Deployment (CI/CD) pipelines for on-premises and cloud (AWS) infrastructures.
  • Worked with various AWS cloud services such as EC2, S3, VPC, ELB, IAM, CloudWatch, Elastic Beanstalk, Security Groups, EC2 Container Service (ECS), CodeCommit, CodePipeline, CodeDeploy, DynamoDB and Auto Scaling.
  • Hands-on experience creating and customizing IAM policies and roles and managing users within AWS.
  • Customized AWS CloudFormation templates to size and provision resources such as VPCs, subnets, EC2 instances, ELBs and security groups.
  • Used Linux and Unix based systems such as Red Hat Enterprise Linux (RHEL) 6.x/7.x, CentOS and Ubuntu.
  • Designed and implemented Continuous Integration, Continuous Delivery, Continuous Deployment and DevOps processes for Agile projects.
  • Used configuration management tools such as Ansible and Ansible Tower to automate repetitive tasks, rapidly deploy critical applications, manage environment configuration files, users, mount points and packages, and proactively manage changes.
  • Hands-on experience with Ansible playbooks in YAML for maintaining roles, inventory files and group variables.
  • Installed Chef Enterprise server on premises, set up workstations, bootstrapped nodes using Knife, and used Test Kitchen to test automated Chef recipes/cookbooks.
  • Hands-on experience with Chef and other configuration management tools to deploy consistent infrastructure code across multiple environments; automated deployment-related services by designing Chef cookbooks, recipes, roles and data bags.
  • Used Puppet and PuppetDB for configuration management of existing infrastructure and designed Puppet 3.8 manifests and modules to deploy builds to Dev, QA and production.
  • Hands-on expertise creating AWS Lambda functions and assigning roles to run Python scripts (a brief Lambda sketch follows this list).
  • Worked with Docker components such as Docker Engine, Hub, Machine, Compose and Docker Registry; built custom Docker container images and tagged and pushed them to Docker Hub.
  • Used Docker tools such as Docker Swarm and Compose to provide native clustering for Docker containers and to run multi-container applications.
  • Hands-on experience using Ant, Maven, Perl, Ruby, MSBuild and shell scripts to automate builds and deployments.
  • Experience in troubleshooting and automating deployments to web and application servers such as WebSphere, WebLogic and Tomcat.
  • Ownership of the release domain for multiple applications and servers.
  • Experience in branching, merging, tagging and maintaining versions across environments using SCM tools such as Git and Bitbucket.
  • Used Kubernetes to automate deployment, scaling and operation of application containers across a cluster of hosts. Managed deployment of microservices using Kubernetes, Docker, Helm charts and Istio as a service mesh to create and configure production environments.
  • Used RPM and YUM package management for Installing, Upgrading and Managing packages.
  • Hands-on experience setting up and tracking issues for different clients using JIRA.
  • Configured quality gates in SonarQube to enforce quality profiles and standards.
  • Expertise in designing and creating volume groups using LVM, creating partitions on logical and physical volumes, creating swap space, and monitoring and extending volume groups and file systems.
  • Expertise in installing and configuring hardware/software RAID for data backup and storage, and in designing, mounting and unmounting file systems.
  • Hands-on experience designing and configuring Apache HTTP, SMTP, DHCP, NFS, NIS, NIS+, LDAP, DNS, Samba, Squid, Postfix, Sendmail, FTP and remote access, with security management and security troubleshooting skills.
  • Have a deep knowledge of Protocols like HTTP, DHCP, DNS, SSL/TLS.
  • Expertise in Linux distributions including Red Hat Enterprise Linux, CentOS, Ubuntu and Debian, and in configuring and administering Red Hat virtual machines in a VMware environment.
  • Understand the standards and best practices of Software Configuration Management (SCM) in Agile/Scrum and Waterfall methodologies.
  • Knowledge of workflows in JIRA, ServiceNow and BMC Remedy, and worked within an Agile SDLC using these tools.
  • Communicated changes, enhancements and modifications, verbally and in written documentation, to various groups in the organization to facilitate new or improved business processes and systems via a change log.
  • Worked with multiple clients across domains such as telecommunications, insurance and financial services to provide technology consulting on their business solutions.
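
As an illustration of the Python-based AWS Lambda automation referenced above, the sketch below shows the general shape of such a function. It is a minimal example only; the tag key, tag value and stop-instances use case are hypothetical and not taken from any specific engagement.

    # Minimal sketch: a Lambda handler that stops EC2 instances carrying a
    # hypothetical Environment=dev tag. All names here are illustrative.
    import boto3

    ec2 = boto3.client("ec2")

    def lambda_handler(event, context):
        # Find running instances tagged as dev (hypothetical tagging convention).
        reservations = ec2.describe_instances(
            Filters=[
                {"Name": "tag:Environment", "Values": ["dev"]},
                {"Name": "instance-state-name", "Values": ["running"]},
            ]
        )["Reservations"]

        instance_ids = [
            inst["InstanceId"]
            for res in reservations
            for inst in res["Instances"]
        ]

        if instance_ids:
            ec2.stop_instances(InstanceIds=instance_ids)

        return {"stopped": instance_ids}

The execution role assigned to such a function would need ec2:DescribeInstances and ec2:StopInstances permissions, which is the kind of Lambda role assignment described above.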

Professional Experience

Confidential, Dallas, Texas

Cloud DevOps Engineer

Responsibilities:

  • Hands-on experience with cloud infrastructure, configuration management (Ansible), continuous integration (Jenkins) and container management using Docker with Kubernetes.
  • Custom designed the VPC and subnet sizing with AWS CloudFormation templates for deployment of web application and database templates.
  • Developed and managed the virtual private cloud (VPC), wrote CloudFormation templates and deployed them as AWS resources, and configured IAM policies and security groups in the public and private subnets.
  • Worked on the AWS cloud platform using services such as EC2, VPC, EBS, CloudWatch, CloudTrail, CloudFormation, AWS Config, Auto Scaling, CloudFront, IAM and S3.
  • Identified problems and halted pipeline deployments by using Splunk to monitor system log essentials.
  • Created AWS Route 53 records to route traffic between different regions. Installed and managed cloud virtual machines with the AWS EC2 command line clients and management console.
  • Experience creating AWS CloudWatch alarms and notifications for EC2 instances, and using CloudTrail, CloudFormation and CloudFront to set up and manage cached content delivery (a CloudWatch alarm sketch follows this list).
  • Hands-on experience whitelisting IP addresses in CloudFormation: obtained the new office IP addresses from the IT team and whitelisted them alongside the old IP blocks across all the CloudFormation templates.
  • Analyzed a Jenkins build failure after the deployment of fixtures failed and found that a resource in the CloudFormation stack was failing while updating the Elasticsearch domain, because one of the resources in the Elasticsearch domain was still in processing status rather than active status.
  • Installed and configured a new CI pipeline, and automated testing and deployment with Docker, Ansible and Jenkins for continuous integration.
  • Experienced with Jenkins integrated with Maven to generate builds, using the JUnit plugin and Selenium for unit and regression tests respectively, storing JAR, WAR and EAR files in a Nexus artifact repository, JIRA for documentation and Splunk & Nagios for monitoring.
  • Experienced in using Docker and Vagrant to stage virtual environments, created and maintained various DevOps related tools for the teams such as provisioning scripts and deployment tools.
  • Hands-on experience installing, configuring and using monitoring tools such as Nagios and Splunk for resource, network and log trace monitoring. Used SonarQube in the CI pipeline for Sonar metrics and code coverage.
  • Experienced with Maven build scripts, developing pom.xml files, and used Atlassian products such as JIRA and Confluence.
  • Hands-on experience creating Transit Gateway attachments: the Transit Gateway had already been created in Terraform and shared via Resource Access Manager, so Transit Gateway attachments were created for two accounts in CloudFormation; the Transit Gateway route table was created by default, and VPC routes to the Transit Gateway were added for both accounts in CloudFormation so that the Jenkins scan, which ran in a public subnet, could reach services in a private subnet in a different account. Experience creating Terraform modules for Athena and DynamoDB.
  • Hands on Experience in Creating Parameters and Conditions for each environment.
  • Experienced in configuring MySQL at Confidential to perform basic database administration. For continuous integration and automated server build-out, used Jenkins and Maven for continuous deployment and build management respectively.
  • Hands-on experience writing Lambda services in Python and Java to automate various tasks; also wrote shell, Ruby, Perl, Python and PowerShell scripts for task automation.
  • Used Maven and Nexus repositories to download artifacts during builds. Wrote Ansible playbooks, the entry point for Ansible provisioning, in which the automation is defined through tasks in YAML format.
  • Configured AWS nodes and tested playbooks on AWS instances, scripting Ansible playbooks and using Python over SSH as a wrapper to manage the configuration of AWS nodes.
  • Configured Ansible Tower to provide an easy-to-use dashboard and role-based access control, making it easy to give individual teams access to Ansible for deployments.
  • Hands-on experience with scripting languages such as Bash, Perl, shell and Python.
  • Used Dockerfiles to create Docker images, and used Kubernetes on a node cluster to deploy, schedule and manage containers, as well as to remove images and manage Docker volumes and container snapshots.
  • Configured a continuous delivery pipeline with GitHub, Jenkins, Docker and AWS AMIs: a new GitHub branch kicked off the process, Jenkins then ran continuous integration and automatically attempted to build a new Docker container from it, and the Docker containers leveraged Linux containers with the AMI baked in.
  • Used Docker containers with Kubernetes orchestration to clone the production servers.
  • Created Docker images using a Dockerfile, worked on Docker container snapshots, removed images and managed Docker volumes, and deployed containers using Kubernetes.
  • Experience using Docker together with AWS to take applications to the next level, with two-way SSL support.
  • Experience configuring Docker services to start automatically when instances are rebooted in order to perform ZAP scans.
  • Hands-on experience with a POC implementation of a continuous deployment pipeline with Jenkins and Jenkins workflows on Kubernetes.
  • Worked on Kubernetes to manage containerized applications using nodes, ConfigMaps, Helm charts and selector-based services, and deployed application containers as Pods.
  • Hands-on experience managing Splunk system log monitoring to find problems and halt pipeline deployments.
  • Used Python scripting to automate various tasks, and used Python on AWS Lambda to interact with applications deployed onto AWS compute and storage services such as EC2 instances and S3 buckets.
  • Hands-on experience converting an Elastic Load Balancer to an Application Load Balancer: the internal load balancer was serving HTTP and was changed to an ALB with HTTPS access.
  • Used APIs with Python scripts to pull JSON data, converted it into dictionaries to validate the scripts, and used REST APIs to create an API Gateway.
  • Installed and used monitoring tools such as Grafana, Prometheus, AppDynamics and Splunk.
  • Installed and configured LAMP stacks (Apache/Tomcat/MySQL/PHP) and reverse-proxy servers (Nginx).
  • Installed Java applications onto web application servers such as Apache Tomcat and was also involved in configuring Nginx as a web server.
  • Used Apache web servers for application deployment, along with Nginx and application servers such as Tomcat and JBoss.
  • Configured AWS security groups, which act as firewalls controlling the traffic allowed to reach one or more AWS EC2 instances.
  • Good understanding of CCPA requests; worked with the Security and Compliance team to create backups of the required data and to set up a new dev account for the team.
  • Planned and performed the auditing process and recommended improvements to it. Subsequently worked on optimizing the process so that work flowed uninterrupted, improving turnaround and resolution time.
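
The CloudWatch bullet above refers to alarm and notification setup for EC2 instances; the following is a minimal boto3 sketch of that kind of task. The instance ID, SNS topic ARN and threshold are hypothetical placeholders, not values from the actual environment.

    # Minimal sketch: create a CPU utilization alarm for one EC2 instance.
    import boto3

    cloudwatch = boto3.client("cloudwatch")

    def create_cpu_alarm(instance_id, sns_topic_arn, threshold=80.0):
        # Fires when average CPU stays above the threshold for two 5-minute periods.
        cloudwatch.put_metric_alarm(
            AlarmName=f"high-cpu-{instance_id}",
            Namespace="AWS/EC2",
            MetricName="CPUUtilization",
            Dimensions=[{"Name": "InstanceId", "Value": instance_id}],
            Statistic="Average",
            Period=300,
            EvaluationPeriods=2,
            Threshold=threshold,
            ComparisonOperator="GreaterThanThreshold",
            AlarmActions=[sns_topic_arn],
        )

    create_cpu_alarm("i-0123456789abcdef0",
                     "arn:aws:sns:us-east-1:123456789012:ops-alerts")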

Confidential, Whitehouse Station, NJ

Cloud DevOps Engineer

Responsibilities:

  • Experience designing and deploying a large number of applications to make the most of the available AWS resources; the stack included, but was not limited to, EC2, Route 53, RDS, S3, DynamoDB, Aurora, SNS, SQS and IAM, focused mainly on high availability, fault tolerance and auto scaling in AWS CloudFormation.
  • Hands-on experience deploying a Content Delivery Network (CDN) and developing systems in cloud environments, which also involved working with cloud/storage systems along with a few SaaS applications.
  • Designed the AWS cloud environment using Terraform to customize the size of the VPC, subnets and NAT, and to ensure successful deployment of web application and database templates.
  • Worked on AWS storage and database services such as S3, EBS, Glacier (including automated data sync to Glacier), RDS, DynamoDB, Aurora, Elastic Transcoder, CloudFront and Elastic Beanstalk.
  • Designed AWS Identity and Access Management (IAM) groups and users to improve login authentication.
  • Designed inbound and outbound access to the network interfaces (NICs), VMs and subnets with the help of Network Security Groups (NSGs).
  • Installed, configured and administered Atlassian tools such as JIRA, Confluence, Bitbucket and Bamboo.
  • Integrated Bamboo with ServiceNow via its API to create a schema to store build information, and implemented the ITIL change management process.
  • Hands-on experience automating infrastructure tasks such as continuous deployment, application server setup and stack monitoring using Chef cookbooks, and integrated Chef with Bamboo.
  • Experience customizing S3 bucket policies to provide an additional layer of security for S3 buckets, enabling multi-factor authentication to avoid accidental deletes, and tracking API calls to audit all AWS resources by enabling CloudTrail (an S3 bucket-policy sketch follows this list).
  • Designed and integrated Bamboo with Bitbucket to pull code and Ant to generate builds and push artifacts to AWS S3; also used the Bamboo AWS CodeDeploy plugin for deployments and Chef for unattended bootstrapping in AWS.
  • Used Terraform modules to automate the creation of VPCs and launch AWS EC2 instances. Modules were used to create the VPC, the VPN connection from the data center to the production environment, and cross-account VPC peering.
  • Experience using Terraform to manage infrastructure from terminal sessions, executing scripts to create alarms and notifications for EC2 instances with AWS CloudWatch, and using Ansible playbooks for various applications deployed to AWS with Terraform.
  • Configured the ELK stack to monitor system logs and integrated it with existing applications for real-time log aggregation, analysis and querying.
  • Designed Terraform templates, Chef cookbooks, recipes and pushed them onto Chef server for configuring EC2 Instances.
  • Configured Chef Cookbook for sudo users and network configurations using chef server.
  • Created branches and tags, merged changes, and resolved conflicts using the Git version control system.
  • Used a CI/CD system with Jenkins on Docker as the runtime environment to build, test and deploy.
  • Used several Docker components such as Docker Engine, Hub, Compose and Docker Registry for storing Docker images and files, and ran multiple containers in staging and production environments.
  • Hands-on experience installing Jenkins as a service inside the Docker Swarm cluster to reduce failover downtime to minutes, and automated Docker container deployments without using any configuration management tools.
  • Installed Java applications onto application servers in an agile continuous integration environment at Confidential and automated the whole process.
  • Experience automating deployment of all microservices to pull images from the private Docker registry and deploy them to the Docker Swarm cluster using Chef; also experienced in deploying containerized code onto Docker Swarm clusters for high availability.
  • Worked on various network management services such as DNS, NIS, NFS, LDAP and TFTP, with system troubleshooting skills.
  • Automated deployments to the servers using JBoss, Tomcat and Apache web servers.
  • Hands-on experience with application performance monitoring tools such as AppDynamics to monitor Java, .NET and PHP applications, and New Relic to monitor browser performance and track issues with SQL statements.
  • Worked with Docker services for rolling updates and was involved in implementing blue-green deployment to attain zero downtime.
  • Hands-on experience installing and configuring automation tools, including installation and configuration of the Puppet master, agent nodes and admin control workstations.
  • Designed and updated Puppet manifests and modules, with files and packages stored in the Git repository.
  • Used JFrog Artifactory to create local and virtual repositories and integrated it with TeamCity.
  • In-depth knowledge of computer applications and scripting in shell, Python, PowerShell, Bash and Groovy.
  • Worked closely with development, QA and other teams to ensure automated test efforts were integrated with the build system, and helped fix errors during builds and deployments.
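
As a companion to the S3 bucket-policy bullet above, here is a minimal boto3 sketch of adding a deny-non-TLS statement to a bucket policy. The bucket name and the specific statement are illustrative assumptions, not the policies actually deployed.

    # Minimal sketch: deny any request to a bucket that is not made over HTTPS.
    import json
    import boto3

    s3 = boto3.client("s3")

    def deny_insecure_transport(bucket_name):
        policy = {
            "Version": "2012-10-17",
            "Statement": [{
                "Sid": "DenyInsecureTransport",
                "Effect": "Deny",
                "Principal": "*",
                "Action": "s3:*",
                "Resource": [
                    f"arn:aws:s3:::{bucket_name}",
                    f"arn:aws:s3:::{bucket_name}/*",
                ],
                "Condition": {"Bool": {"aws:SecureTransport": "false"}},
            }],
        }
        s3.put_bucket_policy(Bucket=bucket_name, Policy=json.dumps(policy))

    deny_insecure_transport("example-audit-logs-bucket")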

Confidential, Charlotte, NC

Build and Release Engineer

Responsibilities:

  • Involved in different phases of the SDLC: requirements, analysis, design, documentation, testing and implementation.
  • Identified and analyzed build errors and issues in the system and escalated them to the concerned team. Coordinated with the teams to get builds fixed before release.
  • Worked with build teams to reduce build times and succeeded in reducing compilation time and redundancies by using caching.
  • Hands-on experience with Bamboo, a continuous integration tool, using it for official nightly builds, testing and managing change lists; installed multiple plugins in it for smooth release pipelines.
  • Used Ant to maintain build-related scripts.
  • Designed and Maintained nightly builds, hot-fix builds/branches, custom and private builds.
  • Administered SVN: created branches and tags, handled user and group account requirements, resolved user access issues and was responsible for data security; also used SVN repositories to handle releases and perform branching and merging operations.
  • Experience deploying to multiple environments: testing, staging and production.
  • Generated project artifacts by configuring Bamboo.
  • Compiled source code using Ant, packaged it into distributable formats such as JAR, WAR and EAR, and deployed the packages to the WebSphere application server (a build-wrapper sketch follows this list).
  • Worked on WebSphere deployments and created deployment scripts and settings for production releases.
  • Used Apache Tomcat servers to deploy application packages.
  • Responsible for managing hosting plans for AWS infrastructure, implementing and deploying workloads on Amazon EC2 instances and implementing storage; good knowledge of implementing images/disks, Hyper-V, VMware technologies and System Center components.
  • Hands-on experience in administration, configuration and support for Application Lifecycle Management (ALM); used tools such as JIRA and TeamForge to track progress and as ticketing tools.
  • Involved in creating release notes for every scheduled release.
  • Continuously participated in software configuration and change management processes to improve build accuracy, build times and version control, and to deliver scheduled releases on time.
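
The Ant build-and-package bullet above can be scripted; the sketch below shows a small Python wrapper of that kind. The project layout, Ant target name and artifact directory are assumptions for illustration, and Ant is assumed to be on the PATH.

    # Minimal sketch: run an Ant target and stage the resulting JAR/WAR/EAR artifacts.
    import glob
    import os
    import shutil
    import subprocess

    def run_build(project_dir, target="dist", artifact_dir="artifacts"):
        # Invoke Ant against the project's build.xml; fail on a non-zero exit code.
        subprocess.run(
            ["ant", "-f", os.path.join(project_dir, "build.xml"), target],
            check=True,
        )
        # Copy every packaged artifact into a staging directory for deployment.
        os.makedirs(artifact_dir, exist_ok=True)
        for pattern in ("*.jar", "*.war", "*.ear"):
            for artifact in glob.glob(os.path.join(project_dir, "dist", pattern)):
                shutil.copy2(artifact, artifact_dir)

    if __name__ == "__main__":
        run_build("example-app")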

Confidential

Middleware Engineer

Responsibilities:

  • Hands-on experience with Kickstart servers for installing, upgrading and configuring Red Hat Linux.
  • Used RPM and Yum, which are used in several Linux distributions such as Red Hat Enterprise Linux, for package management.
  • Configured various servers such as FTP, web, NFS, Samba, Sendmail and autofs.
  • Experience using the command line to install and configure SELinux, and set permissions on files and directories according to requirements.
  • Experience with routine system backups, scheduling jobs (such as disabling and enabling cron jobs), and enabling system logging and network logging of servers for maintenance.
  • Used services such as JDBC, JNMC, JNMI, SNMP and J2EE to configure and maintain WebLogic; deployed applications as WAR, JAR and EAR files in domain and clustered environments. Used an Oracle 8i database to configure JDBC connection pools and data sources.
  • Hands-on experience with integration technologies such as MuleSoft, Apache Camel, JBoss Fuse and Fuse Fabric8, and created ActiveMQ brokers with different topologies for enterprise integration.
  • Performed WebLogic Server 8.x administration tasks such as installation, configuration, monitoring and tuning.
  • Hands-on experience installing, configuring and maintaining application servers such as WebSphere and WebLogic and web servers such as Apache HTTP and Tomcat on UNIX and Linux (a health-check sketch follows this list).
  • Used crontab to schedule tuning, security, backup, recovery and upgrade/patch tasks on Linux and UNIX servers.
  • Hands-on experience installing and administering a VMware vSphere ESXi environment; implemented new hardware and software solutions benefiting architecture and operations, implemented VMware SRM and troubleshot performance issues.
  • Performed virtual server builds, increased memory and CPU, added disks, performed maintenance on ESX host VMs and carried out server/storage migrations using VMware.
  • Used VMware tools to install and upgrade virtual machine hardware versions, performed cloning to provision virtual machines, deployed virtual machines from templates, and created and deleted VM snapshots.
  • Created datastores and assigned LUNs to ESX/ESXi servers; added virtual machines and vCPUs.
  • Worked with VMware vSphere vCenter Update Manager to apply patches to virtual machines, and used Virtual Private Networking (VPN) tools and techniques to maintain these VMs.
  • Used Samba and Apache web services to perform software changes on customers' servers in the VMware environment, and followed up with data center personnel for hardware-related changes.
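
A simple way to verify the application and web server installations described above is an HTTP health check; the sketch below shows one in Python. Hostnames, ports and the /health context path are hypothetical placeholders.

    # Minimal sketch: poll a health endpoint on each application server.
    import urllib.error
    import urllib.request

    SERVERS = [
        "http://app-server-1:8080/myapp/health",   # e.g. Tomcat
        "http://app-server-2:7001/myapp/health",   # e.g. WebLogic
    ]

    def check(url, timeout=5):
        # Treat any 2xx response as healthy; report the failure reason otherwise.
        try:
            with urllib.request.urlopen(url, timeout=timeout) as resp:
                return 200 <= resp.status < 300, f"HTTP {resp.status}"
        except urllib.error.URLError as exc:
            return False, str(exc.reason)

    if __name__ == "__main__":
        for url in SERVERS:
            healthy, detail = check(url)
            print(f"{url}: {'OK' if healthy else 'DOWN'} ({detail})")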

Confidential

Linux Administrator

Responsibilities:

  • Worked on Installing and configuring JBoss application server on various Linux servers.
  • Experience in configuring, installing and troubleshooting of Red Hat, Ubuntu and HP-UX on various hardware platforms.
  • Expertise in RHEL, CentOS System administration in an enterprise environment.
  • Worked with Red Hat Linux tools such as RPM to install packages and patches on Red Hat Linux servers.
  • Hands-on experience configuring Kickstart for RHEL (4 and 5) with FTP, HTTP and NFS, and Jumpstart for Solaris, including image installation over the network.
  • Experience installing, configuring, maintaining and administering HTTP, DNS, FTP, NFS, Sendmail, Apache and JBoss on Linux.
  • Used VMware ESX/ESXi to create Linux Virtual Machines and to install operating system on the guest servers.
  • Developed scripts to automate tasks such as customizing user environments, and performed monitoring and tuning with nfsstat, netstat, iostat, vmstat and ss (a monitoring sketch follows this list).
  • Installed and upgraded packages with RPM and Yum package management, created and managed Yum repositories, and patched Red Hat Linux servers.
  • Created new disk partitions under Logical Volume Management (physical volumes, volume groups and logical volumes) and configured disk mirroring in HP-UX and Linux.
  • Created and managed users, added user and group accounts, and managed disk space usage, security, profiles and file permissions.
  • Managed file systems, and created partitions and new file systems.
  • Worked on Configuring Firewall/IP-tables rules on new servers.
  • Used Linux- and Windows-based systems, including hardware, software and applications.
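
The monitoring scripts mentioned above typically wrap tools such as vmstat; the following is a minimal Python sketch of that pattern. The free-memory threshold is an arbitrary illustrative value, and the column position assumes the usual Linux vmstat layout.

    # Minimal sketch: sample vmstat once and flag low free memory.
    import subprocess

    def free_memory_kb():
        # "vmstat 1 2" prints headers plus two samples; the last line is the fresh one.
        out = subprocess.run(["vmstat", "1", "2"],
                             capture_output=True, text=True, check=True)
        last_sample = out.stdout.strip().splitlines()[-1].split()
        # On Linux, "free" is the 4th column of a vmstat sample row.
        return int(last_sample[3])

    if __name__ == "__main__":
        free_kb = free_memory_kb()
        status = "LOW" if free_kb < 100_000 else "ok"
        print(f"free memory: {free_kb} kB ({status})")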

Environment: RHEL, CentOS, Solaris, JBoss, Ubuntu, VMware ESX/ESXi, VDI, Logical Volume Management, DNS, FTP, NFS, NIS.
