
Cloud/DevOps Engineer Resume


Tracy, CA

SUMMARY:

  • IT professional with 7+ years of experience as a DevOps/Cloud Engineer and in build and release management. Experienced across the complete Software Development Life Cycle (SDLC) in Waterfall and Agile environments, and in server-side deployment at the application and middleware layers.
  • UNIX system administration on server operating systems, including kernel configuration on Red Hat Linux, CentOS, Debian 7, and Ubuntu 12.x-16.x in a DevOps environment with CI/CD as an iterative process; proficient with YUM and RPM package management for RHEL/CentOS.
  • Experience in AWS services such as EC2, ELB, Auto Scaling, S3, IAM, VPC, RDS, DynamoDB, CloudWatch, CloudTrail, Lambda, ElastiCache, Glacier, SNS, SQS, Trusted Advisor, CloudFormation, CloudFront, Elastic Beanstalk, KMS, EMR, ECS, and AWS WorkSpaces.
  • Experience in writing CloudFormation templates to provision infrastructure in the AWS cloud environment.
  • Experience in creating Terraform modules for a two-tier architecture comprising AWS resources such as VPCs, subnets, security groups, EC2 instances, load balancers, Auto Scaling groups, CloudWatch alarms, ECS clusters, and S3 buckets for logs.
  • Wrote templates for AWS infrastructure as code using Terraform to build staging and production environments.
  • Planned and implemented data and storage management solutions in Azure (SQL Azure, Blob storage, Table storage, Queue storage, File storage) and deployed Azure SQL Server.
  • Working experience with Microsoft Azure cloud services and storage accounts.
  • Hands-on expertise with source code management tools Git, Bitbucket, and SVN; created branching strategies for developers and admins, as well as for hotfixes, features, and releases.
  • Implemented continuous integration and deployment with automation scripts in a pipeline process.
  • Experience in writing shell scripts to push code from Jenkins, and in Continuous Integration (CI) and Continuous Deployment (CD) using Jenkins.
  • Created a Jenkins job that runs an Ansible Playbook to deploy an Elasticsearch cluster using troposphere and CloudFormation.
  • Expertise in integrating the CI/CD server with the SCM repository and build tools like Maven/Ant for build automation, followed by JUnit for unit tests and SonarQube for code coverage analysis.
  • Virtualized servers using Docker for test and development environment needs, and automated configuration using Docker containers.
  • Worked on Kubernetes to orchestrate Docker containers for new and existing applications, as well as deployment and management of complex runtime environments.
  • Experience in managing clusters using Kubernetes; created pods, replication controllers, services, deployments, labels, and health checks.
  • Experience supporting technical requirements for automating deployments to cloud environments using Jenkins, AWS, OpenStack, Docker, and Kubernetes.
  • Experience creating and maintaining various DevOps tools for the team, such as provisioning scripts, orchestration and deployment tools, and staged virtual environments using Docker and Vagrant.
  • Hands-on experience with configuration tools like Chef and Ansible. Created several Cookbooks, Manifests and Playbooks to automate infrastructure maintenance & configuration.
  • Experience in installing and configuring configuration management tools like Chef and Ansible: adding nodes as required, writing Chef recipes and cookbooks, creating run-lists, and building Ansible playbooks and roles to accomplish tasks.
  • Experience developing Ansible playbooks, integrating them with the source code repository, and deploying them onto servers to reduce downtime.
  • Worked on SQL, Redshift, MongoDB, and DynamoDB databases; migrated databases and wrote queries, stored procedures, and triggers as required.
  • Experience monitoring server performance with tools like Nagios, Splunk, Datadog, New Relic, and Dynatrace; resolved network-related issues with manual commands and built a Splunk cluster environment with high-availability resources.
  • Experience with bug tracking tools like JIRA and Remedy to track issues and the progress of assigned tasks.
  • Created Lambda functions for serverless provisioning using the Boto3 module in Python (a short sketch follows this summary).
  • Skilled in building deployment and automation solutions using scripting languages like Bash, Shell, and Python.
  • Automated various day-to-day administration tasks by developing Bash, Ruby, Perl, PowerShell, and Python scripts, along with JSON-based configuration.
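
A minimal sketch of the kind of Boto3-based Lambda provisioning noted above, assuming a handler that launches a tagged EC2 instance; the AMI ID, instance type, tag values, and environment variable names are illustrative placeholders, not details from an actual project.

    import os

    import boto3

    ec2 = boto3.client("ec2")

    def lambda_handler(event, context):
        # Parameters would normally come from the triggering event or environment;
        # the defaults below are placeholders.
        ami_id = os.environ.get("AMI_ID", "ami-0123456789abcdef0")
        instance_type = os.environ.get("INSTANCE_TYPE", "t3.micro")

        response = ec2.run_instances(
            ImageId=ami_id,
            InstanceType=instance_type,
            MinCount=1,
            MaxCount=1,
            TagSpecifications=[{
                "ResourceType": "instance",
                "Tags": [{"Key": "provisioned-by", "Value": "lambda"}],
            }],
        )
        instance_id = response["Instances"][0]["InstanceId"]
        return {"statusCode": 200, "body": f"Launched {instance_id}"}

The function's execution role is assumed to allow ec2:RunInstances and ec2:CreateTags.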

TECHNICAL SKILLS:

Operating Systems: Linux, Unix, Windows.

Cloud Services: AWS, OpenStack, Azure, PCF.

Virtualization Platforms: Oracle VirtualBox, Vagrant, VMware ESXi.

Version Control: Git, SVN, Bitbucket.

Build Systems: Maven, Ant.

CI Tools: Jenkins, Bamboo.

Networking: TCP/IP, NFS, Telnet, FTP, DNS, DHCP, NAT, NetStat, HTTP, SAMBA, IPTABLES.

Containerization Tools: Docker, Kubernetes.

Configuration Management: Chef, Puppet, Ansible.

Application Servers: Oracle WebLogic, Tomcat, WebSphere.

Web Servers: Apache, Nginx.

Databases: Oracle, MySQL, DynamoDB, PostgreSQL.

Scripting Languages: Shell, Ruby, Python.

Bug tracking and Ticketing tool: JIRA, Rally.

WORK EXPERIENCE:

Confidential, Tracy, CA

Cloud/ DevOps Engineer

Responsibilities:

  • Handled migration of on-premises applications to the cloud, created resources in the cloud, and deployed on-premises-hosted applications on the AWS platform.
  • Built and configured a virtual data center in the Amazon Web Services cloud to support an Enterprise Data Warehouse.
  • Created Python scripts to fully automate AWS services, including web servers, ELB, CloudFront distributions, databases, EC2 and database security groups, S3 buckets, and application configuration; these scripts create stacks or single servers, or join web servers to existing stacks.
  • Responsible for defining and creating API architecture for a large-scale digital transformation using AWS API Gateway and Lambda functions.
  • Created microservices using AWS Lambda and API Gateway with REST APIs; deployed Lambda functions and configured them to receive events from Amazon S3 buckets.
  • Created Lambda functions for tasks such as taking EBS volume snapshots at regular intervals and copying files between S3 buckets only when the corresponding events are triggered (a short sketch follows this list).
  • Used Azure Storage services: Blob storage for document and media files, Table storage for structured datasets, Queue storage for reliable messaging in workflow processing, and File storage to share file data.
  • Owned Azure SQL Server database deployment and managed continuous integration and continuous deployment.
  • Designed the data model and created the schema on SQL Azure.
  • Used HashiCorp Packer to create and manage AWS AMIs and Vault to manage AWS secret keys.
  • Performed remediation of CIS Benchmark and Trusted Advisor findings for Dev, Stage, Prod, and Test accounts, including disabling public access on S3 buckets, disabling unused credentials, and ensuring VPC flow logging is enabled in all VPCs.
  • Used Snowflake on AWS, leveraging its SQL database on top of the existing data analytics and enabling flexible, secure data sharing between teams.
  • Managed the installation of tools such as Jenkins and Artifactory by running Helm charts.
  • Created four staged CI/CD pipelines using the AWS CodePipeline plugin for Jenkins, a GitHub repository, a Jenkins build server on an EC2 instance with an IAM instance role, proxy and firewall settings to allow inbound connections to the server, and AWS CodeDeploy.
  • Wrote the Maven build scripts and maintained the pom.xml configuration file for continuous integration.
  • Integrated the Jenkins CI/CD tool with SonarQube, JUnit, and Nexus to run unit tests, review and analyze code quality, and then push artifacts to the repository server.
  • Deployed various IAM policies and roles to AWS users through Ansible playbooks with the help of the Boto3 library.
  • Wrote playbooks in YAML to automate the entire deployment process as well as infrastructure admin tasks; used Ansible for continuous delivery, managed the CI/CD process, and delivered all applications as RPMs.
  • Implemented Packer with Ansible playbooks to bake AMIs, Terraform Enterprise to provision infrastructure across AWS workloads and Kubernetes clusters, and Vault Enterprise for secrets and certificate management using various back ends.
  • Good knowledge of setting up Kubernetes (k8s) clusters with CloudFormation templates, deploying them in the AWS environment, monitoring the health of pods running microservices, and pushing microservices into production on Kubernetes-backed infrastructure with automation through Ansible playbooks.
  • Created Docker images using Dockerfiles; worked on Docker container snapshots, removing images, and managing Docker volumes.
  • Virtualized servers using Docker for test and development environment needs and automated configuration using Docker containers.
  • Created Kubernetes pods, clusters, replication controllers, services, labels, health checks, and ingresses by writing YAML files, and deployed microservices in Docker containers.
  • Used the Maven build tool to produce build artifacts (.war) from source code, and installed and configured the Nexus repository manager to store artifacts within the company, integrated with the Jenkins continuous integration tool.
  • Installed, configured, and managed monitoring tools such as New Relic, CloudWatch, and Nagios for resource monitoring, network monitoring, and log trace monitoring.
  • Configured and managed an ELK stack, setting it up to collect, search, and analyze log files from across the servers.
  • Configured Elasticsearch, Logstash, and Kibana and monitored application and server logs using the ELK stack.
  • Used JIRA as a project management tool for issue and bug tracking; reported performance-related issues to management through analysis and tracking of existing systems, and used Confluence to create, share, and discuss content and projects.
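
A minimal sketch of the scheduled EBS-snapshot Lambda mentioned above, assuming volumes are selected by a "Backup" tag; the tag key/value and the snapshot description format are illustrative placeholders, and an EventBridge (CloudWatch Events) schedule rule is assumed to invoke the handler at the desired interval.

    import boto3

    ec2 = boto3.client("ec2")

    def lambda_handler(event, context):
        # Find volumes opted in to backups (tag key/value are assumptions).
        volumes = ec2.describe_volumes(
            Filters=[{"Name": "tag:Backup", "Values": ["true"]}]
        )["Volumes"]

        snapshot_ids = []
        for volume in volumes:
            snapshot = ec2.create_snapshot(
                VolumeId=volume["VolumeId"],
                Description=f"Scheduled snapshot of {volume['VolumeId']}",
            )
            snapshot_ids.append(snapshot["SnapshotId"])

        return {"snapshots_created": snapshot_ids}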

Environment: AWS, Terraform, Azure SQL, Kubernetes, Ansible, Docker, Jenkins, Git, Jira, Maven, ELK, Snowflake, Java, Shell, Bash, Python, Nexus, DynamoDB, WebLogic, Tomcat, Linux.

Confidential, Milwaukee, WI

Cloud /DevOps Engineer

Responsibilities:

  • Involved in building and maintaining highly secure, multi-zone AWS cloud infrastructure using Chef with AWS CloudFormation and Jenkins for continuous integration.
  • Managed several IAM accounts in AWS for users, with specific policies attached to each, and implemented Multi-Factor Authentication to meet security compliance requirements.
  • Designed and deployed AWS solutions using EC2, EBS, Elastic Load Balancer (ELB), Auto Scaling groups, and OpsWorks.
  • Created several customized CloudFormation templates in AWS with specific configurations for subnets, security groups, NACLs, NAT gateways, VPCs, EC2, ELB, and other services as required.
  • Wrote CloudFormation templates in JSON format and Python scripts for Chef automation, and contributed source code to the GitHub repository (a short sketch of driving such a template from Python follows this list).
  • Built a deployment pipeline for deploying tagged versions of applications to AWS Beanstalk using Jenkins CI.
  • Worked on branching, merging, tagging, and maintaining versions across environments using SCM tools like Git.
  • Configured various jobs in Jenkins for the deployment of Java-based applications and running test suites. Set up Ant script-based jobs and worked with Jenkins Pipelines.
  • Configured Jenkins to run nightly builds and generate a change log that includes daily commits.
  • Utilized the configuration management tool Chef, and created and managed Chef cookbooks with recipes to automate system operations.
  • Highly involved in configuring and monitoring multi-platform servers, defining the Chef server from the workstation to manage and configure Chef nodes.
  • Implemented environments, roles, and data bags in Chef for better environment management.
  • Wrote Chef cookbooks and recipes to provision several pre-prod environments consisting of Cassandra DB installations, WebLogic domain creation, and several proprietary middleware installations.
  • Extensively used Docker to virtualize, run, ship, and deploy applications securely, speeding up build/release engineering.
  • Created container-based virtualized deployments using Docker, working with Docker images, Docker Hub, and Docker registries.
  • Created dashboards to monitor servers using Splunk. Debugged and solved Splunk Integration challenges and Splunk configuration issues.
  • Monitored and tracked server performance problems, administration tasks, and open tickets with Splunk.
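
A minimal sketch of driving a JSON CloudFormation template from Python with Boto3, in the spirit of the scripting described above; the stack name, template path, parameter key, and key pair name are illustrative placeholders.

    import boto3

    cfn = boto3.client("cloudformation")

    def deploy_stack(stack_name, template_path, key_name):
        """Create a CloudFormation stack from a local JSON template and wait for it."""
        with open(template_path) as f:
            template_body = f.read()

        cfn.create_stack(
            StackName=stack_name,
            TemplateBody=template_body,
            Parameters=[{"ParameterKey": "KeyName", "ParameterValue": key_name}],
            Capabilities=["CAPABILITY_NAMED_IAM"],  # needed when the template creates IAM resources
        )
        # Block until stack creation finishes (or the waiter times out).
        cfn.get_waiter("stack_create_complete").wait(StackName=stack_name)
        return cfn.describe_stacks(StackName=stack_name)["Stacks"][0]["StackStatus"]

    if __name__ == "__main__":
        print(deploy_stack("demo-vpc", "vpc-template.json", "my-keypair"))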

Environment: AWS, EC2, S3, RDS, EBS, IAM, Blob storage, Table storage, Queue storage, File storage, Elastic Beanstalk, Jenkins, Ant, Chef, Docker, Git, Splunk

Confidential

DevOps Engineer

Responsibilities:

  • Utilized VMware Virtual Client 3.5 to create and clone Linux virtual machines, migrate servers between ESX hosts, and build KVM hypervisors.
  • Involved in setting up and configuring the install server, configuration server, and boot server using PXE booting for the Kickstart process, and performed Kickstart installations of the OS on Linux boxes.
  • Worked on multiple AWS instances; defined security groups, Elastic Load Balancers, AMIs, and Auto Scaling to design cost-effective, fault-tolerant, and highly available systems.
  • Configured AWS S3 buckets with various lifecycle policies to archive infrequently accessed data to appropriate storage classes as required (a short sketch follows this list).
  • Hands-on experience with AWS IAM to set up user roles with corresponding user and group policies using JSON.
  • Configured the rules for each security group associated with AWS EC2 instances.
  • Designed AWS CloudFormation templates to create custom-sized VPCs, subnets, and NAT to ensure successful deployment of web application and database templates.
  • Created roles, groups for users and resources using AWS IAM.
  • Installed and configured Apache and Nginx web servers on AWS EC2 instances.
  • Configured and networked the AWS Virtual Private Cloud, and wrote CloudFormation templates to deploy AWS resources.
  • Used AWS CloudWatch and ELK for maintaining system Logs.
  • Installed, configured, and administered the continuous integration tool Bamboo on Linux machines.
  • Set up a master-slave architecture to improve performance and used Bamboo for CI/CD into the Tomcat application server.
  • Deployed build artifacts such as WAR, JAR, and EAR packages into JFrog Artifactory using shell/Python scripts.
  • Managed and architected more than 3,500 virtual servers and monitored application servers and web servers through Nagios.
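
A minimal sketch of applying an S3 lifecycle policy with Boto3, as in the lifecycle bullet above; the bucket name, prefix, transition days, and storage classes are illustrative placeholders.

    import boto3

    s3 = boto3.client("s3")

    def apply_archive_lifecycle(bucket, prefix="logs/"):
        """Transition infrequently accessed objects to cheaper storage classes."""
        s3.put_bucket_lifecycle_configuration(
            Bucket=bucket,
            LifecycleConfiguration={
                "Rules": [{
                    "ID": "archive-infrequent-access",
                    "Filter": {"Prefix": prefix},
                    "Status": "Enabled",
                    "Transitions": [
                        {"Days": 30, "StorageClass": "STANDARD_IA"},
                        {"Days": 90, "StorageClass": "GLACIER"},
                    ],
                    "Expiration": {"Days": 365},
                }]
            },
        )

    if __name__ == "__main__":
        apply_archive_lifecycle("example-log-bucket")  # bucket name is a placeholder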

Environment: Solaris 8/9/10, AWS EC2, IAM, S3, VPC, NAT, CloudFormation, Terraform, Tomcat, Nginx, Git, Linux/Unix, JFrog Artifactory, Bamboo, Ant, Maven, Python, Ruby, Chef, Docker, Nagios.

Confidential

Build Engineer

Responsibilities:

  • Deployed and implemented Perforce across a software development organization developing a business-critical application in a mixed Solaris/Windows environment.
  • Installed and configured Perforce server, administered Solaris OS, designed the architecture of CM libraries.
  • Created Perforce triggers, wrote Perl scripts and shell scripts to support trigger functionality.
  • Migrated existing code base from CVS and Visual SourceSafe into Perforce.
  • Installed Perforce client software, developed training examples, and trained users.
  • Served as a configuration management representative to the CCB.
  • Built release candidates for testing. Developed and tested installation scripts for automated deployment.
  • Installed and troubleshot Atlassian Jira and the Crucible code review tool, including customizing workflows and e-mail notification features.
  • Maintained and upgraded Jira issue tracker, Crucible code review tool.
  • Created Ansible scripts to automate the deployment process, deploying the application and restarting the servers.
  • Involved in migrating the application from Ant to Maven 2 by analyzing dependencies and creating POMs to implement the build process using Maven.
  • Managed all the dependencies and plugins for Maven in an Artifact repository.
  • Deployed Ant- or Maven-generated artifacts to WebSphere application servers.
  • Designed and implemented GUI modifications, stored procedure changes, and report changes; created documentation for design, review, and installation. Provided support for internal customers.
  • Created Perl scripts and SQL stored procedures for nightly batch job streams, data loads, and corporate reporting.

Environment: Anthill Pro, Ant, Maven, Perforce, WebSphere, Artifactory, Jira, Crucible

Confidential

Jr. System Administrator

Responsibilities:

  • Installed, configured, and upgraded operating systems including Red Hat, CentOS, Ubuntu, and Sun Solaris.
  • User account management and support.
  • Worked on operating system installations and BIOS upgrades and maintenance.
  • Installed and configured LAMP servers on Debian and CentOS.
  • Administered files and directories with basic file permissions.
  • Configured the Linux firewall with ipchains and iptables.
  • Worked on installing and configuring VMware ESXi servers for virtualization.
  • Configured network services such as NFS, DHCP, DNS, Samba, FTP, HTTP, TCP/IP, SSH, and the firewall running on Red Hat Linux, Sun Solaris, and AIX.
  • Worked with LVM for the management of Volumes including the creation of physical and logical volumes on Linux.
  • Resolved daily hardware and software problems on the organization's machines.
  • Worked on the configuration of server monitoring tools like Nagios.
  • Monitored network traffic using packet analyzer tools such as tcpdump and Wireshark.

Environment: Red Hat, Debian, CentOS, Ubuntu, Sun Solaris, AIX, NFS, DHCP, DNS, Samba, FTP, HTTP, TCP/IP, SSH, LVM, LAMP, Nagios, Wireshark, VMware ESXi servers.
