
Sr. Cloud/DevOps Engineer Resume


New Haven, CT

SUMMARY:

  • Core Qualifications: Proactive, results-oriented IT professional with 7+ years of experience focused on developing, building, deploying, and releasing code on cloud platforms such as Amazon Web Services and other public and private clouds; implemented DevOps environments to achieve Continuous Integration and Continuous Deployment (CI/CD) and to automate infrastructure as code.
  • Experience with Amazon Web Services (AWS) including EC2, VPC, S3, IAM, ECS, RDS, ELB, Route 53, DynamoDB, Auto Scaling, CloudFront, CloudWatch, CloudFormation templates, Lambda, SNS, and SQS.
  • Excellent knowledge of Azure compute services, Azure Web Apps, Azure Data Factory & Storage, Azure Media & Content Delivery, Azure Networking, Azure Hybrid Integration, and Azure Identity & Access Management.
  • Managed containerization with Docker, including creating Docker images and containers, storing images in a Docker registry and the cloud-based Docker Hub, and managing containers with Docker Swarm.
  • Experience creating Kubernetes clusters and writing YAML files for pods, replication controllers, deployments, labels, health checks, and ingress.
  • Worked with Kubernetes to provide a platform for automating deployment, scaling, and operations of application containers across clusters of hosts, managing containerized applications using nodes, ConfigMaps, selectors, and services.
  • Worked with Ansible and Ansible Tower as configuration management tools to automate repetitive tasks and deploy critical applications quickly.
  • Used Chef to manage web applications, config files, databases, commands, users, mount points, and packages.
  • Developed cookbooks and coded recipes to automate deployments and administer the infrastructure of the nodes.
  • Worked with Jenkins for Continuous Integration and Continuous Deployment (CI/CD) and for end-to-end automation of all builds and deployments.
  • Hands-on experience using Maven and Ant to build deployable artifacts (WAR and EAR) from source code.
  • Developed Ant and Maven scripts to automate the compilation, deployment, and testing of web and J2EE applications to these platforms.
  • Experience in branching, tagging, and maintaining versions across environments using SCM tools such as Subversion (SVN), Git, and GitHub on Linux and Windows.
  • Managed Git configuration for different projects and worked on branching, versioning, labeling, and merging strategies to maintain Git repositories on GitHub.
  • Worked with tracking tools such as Remedy and JIRA for reporting and managing bugs.
  • Installed and configured Splunk to monitor applications deployed on application servers by analyzing server log files.
  • Experienced with system health and monitoring tools such as Nagios, Zabbix, Splunk, CloudWatch, New Relic, and Elasticsearch, Logstash, & Kibana (ELK).
  • Experience writing Python, YAML, Perl, shell, and Bash scripts to automate deployments.
  • Experience in Java/J2EE-based enterprise application development along with system integration.
  • Good understanding of all phases of the Software Development Life Cycle (SDLC) under methodologies such as Agile and Waterfall.

PROFESSIONAL EXPERIENCE:

Confidential, New Haven, CT

Sr. Cloud/DevOps Engineer

Responsibilities:

  • Worked with the cloud team to design the architecture and solutions for the AWS cloud environment, set up the entire infrastructure in the cloud, and migrate services from on-premises to the cloud.
  • Wrote AWS CloudFormation templates to provision the infrastructure and deployed the templates using the AWS CLI.
  • Launched EC2 instances on both Linux and Windows, enabled Auto Scaling groups for these instances, and attached EBS volumes and security groups.
  • Worked with storage services such as S3 (with versioning enabled) to store artifacts pushed from on-premises, and Glacier to store historical data.
  • Launched CloudFormation templates that create databases in DynamoDB and RDS Aurora MySQL as well as the Amazon Redshift data warehousing service.
  • Used Amazon Kinesis Data Streams and Kinesis Data Firehose to stream real-time data coming from on-premises and load it into Redshift, with S3 as intermediate storage.
  • Worked with SQS for message queuing and SNS for sending event notifications to the respective email groups.
  • Wrote Lambda functions that copy artifacts from one S3 bucket to another S3 bucket in a different account, and that copy data from an Amazon S3 bucket to DynamoDB tables (a minimal sketch follows this list).
  • Worked on a POC for deploying Docker images to ECS using both EC2 instances and Fargate.
  • Built Docker images and deployed them using EKS (Elastic Kubernetes Service), creating ConfigMaps, namespaces, deployments, and pods.
  • Used Kubernetes for automated deployment, scaling, and management of containerized applications across a cluster of hosts.
  • Created Docker images using Dockerfiles, worked on Docker container snapshots, removed images, and managed Docker volumes; experienced with the Docker container service.
  • Used shell scripting to deploy artifacts built by Ant.
  • Set up the Horizontal Pod Autoscaler (HPA), which lets pods autoscale according to given metrics such as memory utilization and CPU utilization.
  • Set up a CI/CD pipeline in the AWS environment using CodePipeline, building artifacts with CodeBuild and deploying the application with CodeDeploy.
  • Deployed applications onto Linux EC2 instances as part of the CI/CD pipeline using CodeDeploy.
  • Utilized CloudWatch to monitor resources such as EC2 CPU and memory, Amazon RDS database services, DynamoDB tables, and EBS volumes; to set alarms for notifications or automated actions; and to monitor logs for a better understanding and operation of the system.
  • Used security groups, network ACLs, internet gateways, NAT instances, and route tables to ensure a secure zone for the organization in the AWS public cloud.
  • Worked on a POC for Azure to see whether it is more compatible and approachable than AWS.
  • Selected the appropriate Azure services based on compute, data, and security requirements and leveraged Azure SDKs to interact with Azure services from the application.
  • Created and maintained containerized microservices, configured and maintained a private container registry on Microsoft Azure for hosting images, and used Windows Active Directory.
  • Used Azure ExpressRoute to set up a private connection to Microsoft cloud services such as Microsoft Azure and Dynamics 365. Configured Azure virtual networks, subnets, DHCP address blocks, Azure network settings, DNS settings, security policies, and routing.
  • Designed Terraform templates to create custom-sized VPCs, subnets, and NAT to ensure successful deployment of web application and database templates and migration from traditional to cloud environments.
  • Used a GitHub repository for storing Terraform files and maintaining versioning.
  • Implemented infrastructure automation through Ansible for auto-provisioning, code deployments, software installation, and configuration updates.
  • Created CI/CD pipelines in Jenkins and TeamCity and ran builds by integrating with the GitHub repository using build and deploy scripts; stored the build artifacts in an S3 bucket in the AWS cloud.
  • Used Git for branching, tagging, and maintaining versions across environments, as well as for recovering files, stashing changes, creating tags, and viewing logs.
  • Worked with Development and QA teams to establish a build schedule, execute the builds, and troubleshoot build failures.
  • Used AppDynamics to monitor the performance of the application and servers, and Datadog to monitor the performance of the EKS cluster and worker nodes.
  • As a POC, used CloudWatch Container Insights and Fluentd to monitor the performance of the application deployed in EKS, to see whether it provides more functionality than Datadog.
  • Deployed Java/J2EE applications through Tomcat and JBoss application servers, and Python applications using Django and Gunicorn.
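
A minimal sketch (Python with boto3) of the kind of cross-bucket copy Lambda described above; the destination bucket name is a placeholder, and the cross-account permissions are assumed to be granted via the function's IAM role and a bucket policy in the other account:

    import boto3

    s3 = boto3.client("s3")

    # Placeholder name for the bucket owned by the other account.
    DEST_BUCKET = "shared-artifacts-bucket"

    def handler(event, context):
        """Triggered by S3 PutObject events; server-side copies each new object."""
        records = event.get("Records", [])
        for record in records:
            src_bucket = record["s3"]["bucket"]["name"]
            key = record["s3"]["object"]["key"]
            # The Lambda role needs s3:GetObject on the source and s3:PutObject
            # on the destination bucket for the copy to succeed.
            s3.copy_object(
                Bucket=DEST_BUCKET,
                Key=key,
                CopySource={"Bucket": src_bucket, "Key": key},
            )
        return {"copied": len(records)}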

Environment: Git, GitHub, Jenkins, Chef, Ansible, Java/J2EE, Terraform, Docker, Kubernetes, EC2, Route 53, S3, VPC, SQS, Auto Scaling, CloudWatch, ELB, Azure ExpressRoute, Dynamics 365.

Confidential, Norwalk, CT

Cloud/DevOps Engineer

Responsibilities:

  • As a Cloud and DevOps Engineer, worked with cloud services such as AWS and Azure and various DevOps tools to migrate applications to the cloud.
  • Implemented AWS solutions such as EC2, S3, IAM, EBS, Elastic Load Balancer (ELB), security groups, Auto Scaling, and RDS in CloudFormation YAML templates.
  • Used IAM to create new accounts, roles, groups, and policies, and developed critical modules such as generating Amazon Resource Names (ARNs) and integration points with S3, DynamoDB, RDS, Lambda, and SQS queues.
  • Launched Amazon EC2 instances (Linux/Ubuntu/RHEL) and configured the launched instances for specific applications.
  • Configured Elastic Load Balancer to distribute incoming application traffic across multiple EC2 instances and configured Auto Scaling groups to spin up more instances under heavy load.
  • Used the Python boto3 module in Lambda and effectively used cron jobs along with CloudWatch alarms to send SNS notifications.
  • Designed AWS Lambda functions in Python to administer provisioning and scaling and to automate repetitive tasks such as reacting to changes in S3 buckets, updating DynamoDB, and responding to API calls (see the DynamoDB sketch after this list).
  • Configured CloudWatch alarms with ELB and Auto Scaling rules (see the alarm sketch after this list), took EBS snapshots, and created S3 versioning and lifecycle policies for archiving.
  • Worked on AWS EKS to deploy Docker images already used by the on-premises application as part of migrating the application to the cloud.
  • Worked on Terraform to automate VPCs, ELBs, security groups, SQS queues, and S3 buckets, continuing to replace the rest of the infrastructure and migrate from traditional to cloud environments.
  • Worked on Terraform to manage the infrastructure through terminal sessions and executed scripts to create CloudWatch alarms and notifications for EC2 instances.
  • Helped migrate an application to the cloud using Azure and deployed the servers to Azure using the Azure portal as part of proof-of-concept work.
  • Designed Network Security Groups (NSGs) to control inbound and outbound access to network interfaces (NICs), VMs, and subnets, and designed several templates using Azure Resource Manager.
  • Involved in managing and maintaining the CI/CD pipeline using a DevOps toolset that includes GitHub, Jenkins, and JFrog Artifactory for continuous integration, with deployment through Chef and Chef cookbooks.
  • Designed and implemented the CI/CD architecture and automation solutions; developed build scripts and maintained and scaled infrastructure for Dev, QA, and Production environments.
  • Implemented a CI/CD pipeline with Docker, Jenkins, and GitHub, virtualizing the Dev and Test environment servers with Docker and meeting their needs through containerized automation.
  • Wrote wrapper scripts to automate the deployment of cookbooks on nodes and run the Chef client on them in a Chef Solo environment, and helped convert production support scripts to Chef recipes.
  • Worked on branching, tagging, and maintaining versions across environments using Git and GitHub on Linux and Windows platforms.
  • Integrated Git with Jenkins using the Git plugin to automate source code checkout by providing the URL and credentials of the Git repository.
  • Configured JIRA workflows according to the needs of the CM team and integrated the project management features of JIRA with the build and release process.
  • Configured Splunk to build and maintain log analysis for various systems; developed Splunk queries and dashboards for understanding application performance and capacity analysis, and set up various dashboards, reports, and alerts in Splunk.
  • Experienced using log monitoring tools such as ELK (Elasticsearch, Logstash, Kibana) to view log information, monitor nodes, and receive health and security notifications.
  • Consistently used the ELK (Elasticsearch, Logstash, and Kibana) stack to develop an end-to-end transaction processing system, analyzed log data, and filtered the required columns through Logstash configuration before sending them to Elasticsearch.
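
A minimal sketch of the kind of Python Lambda function described above that reacts to S3 changes by updating DynamoDB; the table name ("artifact-index") and its key schema are hypothetical, not taken from the project:

    import boto3

    dynamodb = boto3.resource("dynamodb")
    table = dynamodb.Table("artifact-index")  # hypothetical table name

    def handler(event, context):
        """On each new S3 object, record its bucket, key, and size in DynamoDB."""
        for record in event.get("Records", []):
            s3_info = record["s3"]
            table.put_item(
                Item={
                    "object_key": s3_info["object"]["key"],   # assumed partition key
                    "bucket": s3_info["bucket"]["name"],
                    "size_bytes": s3_info["object"].get("size", 0),
                }
            )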
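
Similarly, a hedged boto3 sketch of creating a CloudWatch alarm that notifies an SNS topic, of the sort referenced above; the instance ID, thresholds, and topic ARN are placeholders:

    import boto3

    cloudwatch = boto3.client("cloudwatch")

    INSTANCE_ID = "i-0123456789abcdef0"                          # placeholder
    TOPIC_ARN = "arn:aws:sns:us-east-1:123456789012:ops-alerts"  # placeholder

    # Alarm when average CPU stays above 80% for two consecutive 5-minute periods.
    cloudwatch.put_metric_alarm(
        AlarmName=f"high-cpu-{INSTANCE_ID}",
        Namespace="AWS/EC2",
        MetricName="CPUUtilization",
        Dimensions=[{"Name": "InstanceId", "Value": INSTANCE_ID}],
        Statistic="Average",
        Period=300,
        EvaluationPeriods=2,
        Threshold=80.0,
        ComparisonOperator="GreaterThanThreshold",
        AlarmActions=[TOPIC_ARN],  # send the notification through SNS
    )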

Environment: AWS, Azure, Git, GitHub, Jenkins, JFrog Artifactory, Chef, Docker, Linux, JIRA, Splunk, ELK.

Confidential, Mountain View, CA

DevOps Engineer

Responsibilities:

  • Created Amazon EC2 instances using command-line calls, troubleshot common instance problems, and monitored the health of Amazon EC2 instances and other AWS services (a minimal sketch follows this list).
  • Configured Docker and created different containers to run different application instances for the DEV and PROD environments.
  • Created a robust and scalable Jenkins cluster with multiple nodes that helped orchestrate many pipelines, including build, release, and deployment.
  • Built, configured, and administered the Jenkins continuous integration tool on Linux machines, adding and updating plugins such as SVN, Git, Maven, and Ant.
  • Involved in automating regular administration tasks with shell scripting and configuration management tools such as Chef.
  • Set up and maintained an automated environment using Chef recipes and cookbooks within the Azure environment.
  • Created Chef cookbooks to deploy new software and plugins as well as to manage deployments to the production Jenkins server.
  • Created and implemented Chef cookbooks for deployment and used Chef recipes to deploy directly to Amazon EC2 instances.
  • Set up the Chef infrastructure, bootstrapped nodes, created and uploaded Chef recipes, and handled Chef node convergence in Chef SCM.
  • Built Chef manifests and bootstrap scripts that allowed us to bootstrap instances into various roles without having to maintain AMIs.
  • Designed and implemented Chef, including internal best practices, cookbooks, and an automated cookbook CI/CD system.
  • Managed users and databases on the MySQL database, granting distinct levels of permissions.
  • Installed, configured, and maintained Linux servers, NIS, DNS, NFS, mailing lists, Sendmail, Apache, and FTP.
  • Extensively involved in writing Bash, shell, and Ruby scripts and in installing and configuring the Apache web server.
  • Monitored the system using Nagios and worked on installing and updating software, firmware, and security patches for all applications in the installed infrastructure via Red Hat Network.
  • Performed memory, CPU, and Apache process tuning and reconfigured the Apache (httpd) server.
  • Managed servers on the Rackspace cloud server platform using Puppet configuration management.
  • Used JIRA as the ticket tracking, change management, and Agile/Scrum tool.
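
A minimal sketch, in Python with boto3, of the kind of EC2 health check described in the first item above; it is a stand-in for the command-line calls actually used, and only illustrates listing running instances whose status checks are not passing:

    import boto3

    ec2 = boto3.client("ec2")

    def unhealthy_instances():
        """Return IDs of running instances whose system or instance status checks are not 'ok'."""
        paginator = ec2.get_paginator("describe_instance_status")
        failing = []
        for page in paginator.paginate():  # by default only running instances are reported
            for status in page["InstanceStatuses"]:
                system_ok = status["SystemStatus"]["Status"] == "ok"
                instance_ok = status["InstanceStatus"]["Status"] == "ok"
                if not (system_ok and instance_ok):
                    failing.append(status["InstanceId"])
        return failing

    if __name__ == "__main__":
        print("Instances failing status checks:", unhealthy_instances())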

Environment: Red Hat Linux, AWS, Rackspace, Git, Jenkins, Maven, Ant, Docker, JIRA, Chef, Bash shell, Ruby, NIS, DNS, NFS, FTP

Confidential

Build and Release Engineer

Responsibilities:

  • As a Build & Release Engineer, was responsible for continuous delivery, working with different teams to deliver high-quality applications that satisfy growing customer and business demands.
  • Coordinated different tasks with different teams to create usage models for different projects.
  • Managed source control systems using Git and SVN.
  • Designed, created, and maintained Git repositories to client specifications and was involved in setting up the Subversion (SVN) server, server maintenance, and client machine setup.
  • Performed regular builds and deployment of packages for testing in different environments (DEV, QA, CERT, UAT, and PROD).
  • Implemented ClearCase and Subversion branching and merging operations for Java source code.
  • Performed smoke tests to ensure the integrity of code deployments.
  • Performed builds on Java projects using Ant and Maven as build tools.
  • Initiated regular builds using the Jenkins continuous integration tool.
  • Configured Jenkins for builds in all non-production and production environments.
  • Implemented Maven builds to automate the creation of artifacts such as JAR, WAR, and EAR files.
  • Served as release engineer for a team that involved different development teams and multiple simultaneous software releases.
  • Developed and implemented software release management strategies for various applications according to the Agile process.
  • Managed Sonatype Nexus repositories used to download artifacts during builds.
  • Used configuration management tools to deploy consistent infrastructure code across multiple environments.
  • Worked with the Scrum methodology to maintain software development and coordinated with all teams involved.
  • Deployed Java enterprise applications to the Apache web server and JBoss application server.
  • Created complete release process documentation that explains all the steps involved in the release process.

Environment: Ant, Maven, Apache & Tomcat, shell scripting, Subversion, Git, Puppet, Jenkins, Windows, Linux (RHEL)

Confidential

Web Application Developer

Responsibilities:

  • Took part in the Agile Scrum development process to develop the application and was involved in setting up the application with various frameworks.
  • Developed the application web pages using Spring MVC, HTML, CSS, Bootstrap, JSP, JavaScript, and jQuery.
  • Implemented client-side (front-end) validations using Angular, JavaScript, and jQuery.
  • Designed the Java Server Pages (JSP) using Cascading Style Sheets (CSS), HTML, and XML.
  • Involved in integrating Spring to implement Dependency Injection (DI/IoC) and developed code for obtaining bean references in the Spring IoC framework.
  • As a software developer, assessed the time required to finish the application, shared developer roles, and discussed the techniques and programming methodology to be used in the project.
  • Customized RESTful web services using REST APIs, sending data packets in JSON format.
  • Wrote SQL to read data from databases, including through stored procedures.
  • Built Angular filters, controllers, service calls, directives, and regular expressions to validate input calls.
  • Worked on Jasper report enhancements and performed client-side form validations using JavaScript and jQuery.
  • Worked with core Java technologies including collections, serialization, generics, annotations, and exception handling to implement back-end business logic, including entity beans and session beans.
  • Used Git/SourceTree for version control and source code repository, and JIRA for issue/defect tracking and project management.

Environment: Java, Spring Boot, Spring MVC, Spring IoC, JSP, JSTL, Apache Tiles, Spring REST, Jackson, JDBC, JPA, Hibernate, jQuery, JavaScript, AngularJS, Ajax, Jasper, Oracle, Git/SourceTree, JIRA, Tomcat, Maven, Jenkins, JUnit, Eclipse.
