
AWS DevOps Engineer Resume

Minneapolis, MN

SUMMARY

  • Experienced in DevOps/Agile operations processes and tooling (build & release automation; environment, service, incident, and change management).
  • Experience in the implementation and administration of Continuous Integration (CI) and Continuous Deployment (CD) processes using CloudBees Jenkins, Bamboo, and configuration management tools in an Agile environment.
  • Experience with version control tools such as Git Bash, Git GUI, GitHub, Stash, Bitbucket, and SVN, and build tools such as Ant, Maven, and Gradle for Java, .NET, and React applications with Node.js.
  • Extensive experience deploying applications on Tomcat, WebLogic, WebSphere, Apache, JBoss, HAProxy, and Nginx proxy servers.
  • Experience with container-based deployments using Docker (master), Docker Hub, external private registries, and the Kubernetes container orchestration tool for management of applications/containers.
  • Proficient with AWS services such as VPC, CloudFront, EC2, ECS, EKS, Elastic Beanstalk, Lambda, S3, Storage Gateway, RDS, DynamoDB, Redshift, ElastiCache, DMS, SMS, Data Pipeline, IAM, WAF, Artifact, API Gateway, SNS, SQS, SES, Auto Scaling, CloudFormation, CloudWatch, and CloudTrail.
  • Experience installing and setting up Ansible infrastructure, and customizing/creating Ansible playbooks for continuous deployment configurations integrated with Jenkins.
  • Used Ansible and Ansible Tower as configuration management tools to automate repetitive tasks such as day-to-day server administration, deploy critical applications quickly, and manage changes proactively.
  • Experience setting up Chef infrastructure (workstation and Chef server) and bootstrapping multiple nodes by assigning roles on the Chef server as well as on the CLI with knife commands.
  • Involved in writing cookbooks, recipes, and roles to update configurations on nodes such as Jenkins servers, and writing unit tests using Test Kitchen, Vagrant, and Rakefiles.
  • Worked on provisioning infrastructure in clouds such as AWS and Azure using Terraform, and CloudFormation templates in AWS.
  • Well experienced in configuring Azure Web Apps, Azure App Services, Azure Application Insights, Azure Application Gateway, Azure DNS, Azure Traffic Manager, and Azure Network Watcher, and in implementing Azure Site Recovery, Azure Backup, and Azure Automation.
  • Expertise in writing shell scripts to automate pipeline jobs using Groovy and declarative pipeline syntax, and in managing Jenkins jobs with Jenkins Job Builder (JJB).
  • Experience deploying applications on OpenShift with Docker containerization, and creating services and routes with the help of the AppViewX team for route automation of multiple applications.
  • Analytical skills in assessing latency of data exchange across cloud services over the internet, and experience managing users of multiple cloud services with role-based access control (RBAC) using Puppet master and Puppet agents.
  • Experience in automating AWS infrastructure with Python scripts using the boto3 library.
  • Experience with Atlassian tools such as Jira, Confluence, Bitbucket, and Bamboo, as well as Rally.
  • Good knowledge of Pivotal Cloud Foundry for development and deployment, which provides a highly scalable and available architecture.
  • Working experience with multiple monitoring tools in multi-tenant projects: AppDynamics, Kibana, New Relic, Nagios, Splunk, Sumo Logic, SolarWinds (network), and Datadog.
  • Expertise in architecting and deploying fault-tolerant, highly available, cost-effective, and secure servers in AWS. Used AWS Elastic Beanstalk, EKS, and ECS for deploying and scaling web applications and service containers developed with Java and PHP.
  • Experience working with command-line tools such as the AWS CLI for AWS, kubectl for Kubernetes, oc for OpenShift, and rsync to transfer files between local/remote Linux machines and containers.
  • Experience integrating Bamboo and Jenkins with JaCoCo, SonarQube, Selenium, and Cucumber reports for unit and integration tests in a pipeline using multiple plugins.
  • Expertise in managing bug changes, bug reviews, and verification of defects with the help of Jira and Bugzilla; used Elasticsearch to analyze data and ELK/Kibana to create dashboards and filter logs based on application needs.
  • Experience working with Application Load Balancers and Auto Scaling groups for high availability and fault tolerance with zero downtime.
  • Good knowledge of networking concepts including firewalls, VPNs, tunnelling across data centres, zones, and security on-premise as well as in the cloud.
  • Familiar with microservices strategies to speed up development using distributed environments, collaborating with all stakeholders: product owners, Dev, QA, Performance, and Security teams.
  • Experienced in migrating data from on-premise machines (Linux and Windows)/VMs to cloud platforms, testing and verifying precisely, with zero downtime, with the help of pipelines.
  • Creating Confluence documentation on a daily basis, including diagrammatic representations of complete end-to-end pipeline structures.
  • Provide responsive off-hours support in a 24/7 environment and ensure maximum availability of all servers and applications.
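The boto3 automation mentioned above can be sketched as a small helper. The function name and VPC ID below are hypothetical, and the EC2 client is passed in rather than created inside the function so the logic can be exercised without live AWS credentials:

```python
def list_nat_gateways(ec2, vpc_id):
    """Return the NAT gateway IDs in a VPC.

    `ec2` is a boto3 EC2 client (e.g. boto3.client("ec2")); injecting it
    keeps the function testable with a stub client.
    """
    resp = ec2.describe_nat_gateways(
        Filters=[{"Name": "vpc-id", "Values": [vpc_id]}]
    )
    return [gw["NatGatewayId"] for gw in resp["NatGateways"]]
```

The same dependency-injection pattern applies to any of the day-to-day boto3 jobs (Route 53, S3, IAM) described in this summary.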

TECHNICAL SKILLS

Configuration Management: Ansible, Chef, Puppet

Continuous Integration: Jenkins, Bamboo.

SCM Tools: GIT, SVN, Bit Bucket/Stash.

Artifact Management: JFrog, Nexus.

Virtualization: VMware ESX, vSphere.

Build Management Tools: Ant, Maven, Gradle

Project management: Jira, Rally, Confluence.

Monitoring Tools: AppDynamics, Kafka, Kibana, Nagios, Datadog, Splunk.

Containerization: Docker, Docker Swarm, Kubernetes, Openshift.

Log Management: ELK (Elasticsearch, Logstash, Kibana) stack.

SDLC: Agile, Scrum, Waterfall Methodology.

Languages: C, C++, Ruby, Objective C, Java.

Scripts: Shell, Groovy, Python, Ruby.

Database: Oracle, MySQL, PostgreSQL, MongoDB.

Testing Tools: Selenium, JUnit, HP Quality Centre, TestFlight.

Cloud Platforms: AWS, Azure, and Pivotal Cloud Foundry

Operating Systems: CentOS, Ubuntu, Windows variants.

Web Technologies: WebLogic, WebSphere, Tomcat, Apache, Nginx, and F5 load balancers.

PROFESSIONAL EXPERIENCE

Confidential - Minneapolis, MN

AWS DevOps Engineer

Responsibilities:

  • Supporting the Continuous Integration (CI) of automated builds and the Continuous Deployment (CD) pipeline with Jenkins and Ansible across multiple environments, from development to production.
  • Implementing a continuous delivery pipeline using Git, Maven, Nexus, Docker, Jenkins, and Elastic Container Service.
  • Deployed Java (Spring Boot, Vert.x), .NET, and React applications as containers on EC2 instances through Elastic Container Service.
  • Implemented multibranch pipeline jobs: release-branch pipelines deploy to the pre-prod and prod environments, and development pipelines deploy to the development and QA environments.
  • Created pipelines with a Jenkinsfile for continuous integration tests built with the Karate framework, collaborating with the QA testing team on test automation.
  • Generated Cucumber reports for the integration tests, including graphical representation of application load performance.
  • Creating AWS infrastructure with Terraform scripts, updating configurations with Ansible playbooks and Ansible Tower for clear visibility, and maintaining Terraform state files in S3 buckets.
  • Wrote Ansible playbooks, with Python SSH as the wrapper, to manage node configurations, and tested the playbooks on AWS instances using Python.
  • Developed Python scripts to automate day-to-day AWS jobs, such as creating zone delegations in Route 53 and retrieving the NAT gateways of AWS accounts, with the help of the boto3 library.
  • Updating Confluence documents to track the daily procedures, life cycles, and scenarios implemented across the organization.
  • Worked with the awscli command-line tool and ECS commands to create tasks, register task definitions with incremental revisions, and update services with Docker Compose.
  • Worked with IAM to manage users, roles, and policies to maintain user permissions.
  • Wrote and managed Ansible playbooks in YAML to provision Nginx, Apache Spark, Apache web servers, Tomcat servers, cron jobs, HDFS, PostgreSQL, Jenkins, and other servers.
  • Collaborating with application team leads to gather requirements and recommend suitable tools to implement deployments in a secure and reliable manner.
  • Built images using Docker commands on the Jenkins server and pushed them to Elastic Container Registry (ECR), versioned with tags.
  • Integrating Jenkins with Ansible and Elastic Container Service to deploy applications on AWS by creating and updating services and task definitions with incremental revisions.
  • Troubleshot and fixed issues, working with the respective application teams, with the help of the monitoring tools AppDynamics and Kibana integrated with CloudWatch.
  • Creating dashboards and alerts (e.g., CPU and memory usage) in ELK, and tracking infrastructure changes with the help of CloudTrail.
  • Integrated with Vault to retrieve application secrets at runtime, using the dex token helper to access multiple Kubernetes clusters without a kubeconfig file.
  • Tracking and updating tasks in Jira, with continuous stand-ups to discuss items in grooming sessions.
  • Supporting developers with container, image, and pod failures while deploying and upgrading Kubernetes clusters on EC2 instances with no downtime.
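The incremental task-definition revisions described above can be sketched as a pure helper. The function below is hypothetical: it copies an ECS task definition's container list with one image swapped, ready to hand to `register_task_definition`, which creates the next revision of the same family:

```python
import copy

def bump_image(task_def, container_name, new_image):
    """Return containerDefinitions with `container_name`'s image replaced.

    `task_def` is the dict under describe_task_definition()["taskDefinition"];
    the input is deep-copied so the original definition stays untouched.
    """
    containers = copy.deepcopy(task_def["containerDefinitions"])
    for c in containers:
        if c["name"] == container_name:
            c["image"] = new_image
    return containers

# Usage (sketch, names hypothetical):
#   ecs.register_task_definition(family=td["family"],
#                                containerDefinitions=bump_image(td, "web", "repo/web:v2"))
```

Keeping the mutation pure makes each pipeline deploy step easy to unit-test before it touches AWS.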

Environment: Git, Terraform, Maven, Jenkins, Nexus, Docker, AWS (IAM, S3, EC2, VPC, Elasticsearch, API Gateway, DynamoDB, SNS, SQS, ECR, ECS, Route 53, Application Load Balancer, Auto Scaling), Apache web server, Tomcat, JIRA, Confluence, Python, Bash shell scripts, Linux, Windows.

Confidential - Wilmington, DE

AWS/DevOps Engineer

Responsibilities:

  • Worked on Okta, a single sign-on (SAML) service, integrating applications with Okta and assigning and removing users.
  • Working with the source control management tool Bitbucket, using development, feature, and release branches, and tagging releases.
  • Created Jenkins jobs using Jenkins Job Builder (JJB) by updating YAML files, and automated the jobs triggered in a CI/CD pipeline to build and deploy code from Stash repositories.
  • Updated Groovy scripts in pipeline jobs with multiple stages, using declarative pipeline syntax and Bash shell scripts.
  • Built applications using Gradle and bumped versions using Jenkins plugins with the help of Gradle commands in the pipeline.
  • Integrated with SonarQube to get reports of bugs and code vulnerabilities ahead of merges, integrating with Bitbucket with the help of JaCoCo.
  • Supported developers by collaborating with OpenShift teams on creating routes and fixing deployment issues, load balancer issues, and latency issues in image pulls and pod creation.
  • Implemented deployments of different versions of a single application, so multiple features could be deployed simultaneously with their own services and routes in OpenShift.
  • Worked with the oc OpenShift command-line tool to build images, build configs, and deployments, using YAML files for deployment configs, services, and pod configs.
  • Worked with XL Deploy on some projects to automate and standardize complex deployments in cloud, container, and legacy environments, with easy tracking of releases.
  • Captured metrics, generated reports, and analyzed the deployment process to improve continuous deployments using the XL Deploy console, with centralized auditing and a secured infrastructure.
  • Worked on container-based deployments using Docker, pods, and containers in the OpenShift console, managing configurations and secrets.
  • Created and maintained Python deployment scripts for WebSphere web applications, and wrote shell scripts to automate repetitive tasks.
  • Troubleshot deployments of Java and React applications in the development, testing, and production environments using OpenShift logs and the monitoring tool Datadog.
  • Worked with monitoring teams to get detailed, time-based application logs, including graphical representations, using Datadog and Sumo Logic.
  • Maintained a separate repository for application configurations and feature-toggle information for applications running as containers.
  • Integrated with CyberArk Vault to maintain the secrets of applications hosted in the OpenShift environment.
  • Supported production teams during application releases going live, managing change tickets through ServiceNow, with easy rollbacks via XL Deploy.
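The OpenShift deployment configs above can be illustrated with a minimal manifest builder. The helper and its defaults are hypothetical; the field names follow the `apps.openshift.io/v1` DeploymentConfig schema, and a real manifest would add triggers, strategy, resource limits, and probes:

```python
def deployment_config(name, image, replicas=2):
    """Build a minimal OpenShift DeploymentConfig manifest as a plain dict
    (sketch only -- serialize with yaml/json before feeding it to `oc`)."""
    return {
        "apiVersion": "apps.openshift.io/v1",
        "kind": "DeploymentConfig",
        "metadata": {"name": name},
        "spec": {
            "replicas": replicas,
            "selector": {"app": name},
            "template": {
                "metadata": {"labels": {"app": name}},
                "spec": {"containers": [{"name": name, "image": image}]},
            },
        },
    }
```

Generating manifests as data makes it easy to keep one template per application and vary only the name, image, and replica count per environment.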

Environment: Bitbucket, Gradle, Jenkins, Docker, OpenShift, shell scripting, Python, Atlassian tools (JIRA, Confluence), Stash, Okta, Eclipse, Outlook, Visual Studio, ServiceNow, XL Deploy, Datadog, Sumo Logic, CyberArk Vault, Java, ASP.NET, JaCoCo, SonarQube, AppViewX, Groovy.

Confidential

AWS/DevOpsEngineer

Responsibilities:

  • Deployed and configured SVN repositories with branching, forks, tagging, and merging of pull requests.
  • Built artifacts using Maven and published the artifacts/binary files to JFrog Artifactory.
  • Created EC2 instances, Elastic Load Balancers, and subnets on AWS using CloudFormation templates with Puppet integration for server management, adding IAM roles and security groups to allow traffic.
  • Administered Bamboo servers, including installs, upgrades, backups, adding users, creating plans, installing local/remote agents, adding capabilities, performance tuning, troubleshooting, and maintenance.
  • Created new custom AMIs using Confidential Packer templates in AWS and shared the AMIs with multiple AWS accounts in the non-prod and production environments with the approval of security teams.
  • Set up continuous integration and formal builds using Bamboo with the Artifactory repository, and resolved update, merge, and password-authentication issues in Bamboo and JIRA.
  • Designed and implemented a Puppet-based configuration management system for all new Linux machines (physical and virtual).
  • Set up monitoring tools such as Nagios and Amazon CloudWatch to monitor major metrics like network packets, CPU utilization, and load balancer latency.
  • Worked with the third-party Rackspace team to onboard new applications for alerting and to fix networking issues immediately.
  • Involved in architecting, building, and maintaining a highly available, secure, multi-zone AWS cloud infrastructure utilizing Puppet, AWS CloudFormation, and Bamboo for continuous integration.
  • Reduced latency and troubleshot the unpredictable behavior of network connections between on-premise and cloud applications.
  • Extensively involved in maintaining large amounts of structured, semi-structured, and unstructured data across multiple data centres and the cloud using Cassandra.
  • Utilized the AWS CLI to automate backups of ephemeral data stores to S3 buckets and EBS, and to create AMIs and backups of mission-critical production servers.
  • Responsible for maintaining and expanding AWS infrastructure using the AWS stack, especially database setup and maintenance on AWS EC2.
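The CloudFormation provisioning above can be sketched in Python. The function below is hypothetical and trimmed to the essentials: an EC2 instance behind a security group allowing inbound HTTP; a real stack would also declare a VPC, key pair, IAM roles, and outputs:

```python
import json

def ec2_stack(ami_id, instance_type="t3.micro"):
    """Minimal CloudFormation template, as a dict, for one EC2 instance
    behind a security group that allows inbound HTTP (sketch only)."""
    return {
        "AWSTemplateFormatVersion": "2010-09-09",
        "Resources": {
            "WebSecurityGroup": {
                "Type": "AWS::EC2::SecurityGroup",
                "Properties": {
                    "GroupDescription": "Allow inbound HTTP",
                    "SecurityGroupIngress": [
                        {"IpProtocol": "tcp", "FromPort": 80,
                         "ToPort": 80, "CidrIp": "0.0.0.0/0"}
                    ],
                },
            },
            "WebServer": {
                "Type": "AWS::EC2::Instance",
                "Properties": {
                    "ImageId": ami_id,
                    "InstanceType": instance_type,
                    "SecurityGroups": [{"Ref": "WebSecurityGroup"}],
                },
            },
        },
    }

# json.dumps(ec2_stack("ami-0abcdef"), indent=2) produces a template body
# for `aws cloudformation deploy` (the AMI ID here is a placeholder).
```

Building the template as a dict and serializing it keeps the resource graph reviewable in code review, the same benefit Terraform modules provide.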

Environment: Puppet, SVN, Artifacts, Maven, EC2, JFrog, Bamboo, Jira, Elastic Load Balancer, Rackspace, Nagios, Amazon CloudWatch, Cassandra, AWS CLI, S3, EBS, AMI, subnets, CloudFormation, IAM.

Confidential

Build & Release Engineer

Responsibilities:

  • Worked on developing Terraform modules for Azure services such as SQL, NoSQL, Storage, network services, Active Directory, API Management, Auto Scaling, and virtual machines, for deployment in multiple environments.
  • Provisioned various resources in Azure using the Azure Portal and PowerShell on Azure Resource Manager deployment models; deployed servers as Infrastructure as Code (IaC) using ARM templates (JSON).
  • Deployed Azure IaaS virtual machines and cloud services (PaaS role instances) into secure virtual networks and subnets, configuring DHCP address blocks, Azure network settings, DNS settings, security policies, and routing.
  • Hands-on experience with Azure point-to-site VPN, virtual networks, Azure custom security, endpoint security, and firewalls.
  • Wrote Maven pom.xml files and Ant build.xml files as build scripts, and used WAR and JAR files for deployment of enterprise apps.
  • Configured virtual networks, load balancing, and subnets, modified network configurations, and designed and implemented hybrid and multisite networks with Azure.
  • Deployed and managed Pivotal Cloud Foundry (PCF) environments across development, testing, and production.
  • Developed Chef cookbooks for configuration management of the Jenkins server and nodes, including base images of RHEL and CentOS.
  • Updated cookbooks with the help of Berkshelf for dependencies, templates for installation files, and data bags for storing security credentials, handling distributed environments and organizations on the Chef server.
  • Implemented unit tests of cookbooks using ChefSpec, Test Kitchen, RuboCop (for syntax), Vagrant, and Rakefiles.
  • Pushed cookbooks from the workstation to the Chef server and Chef Supermarket with regular updates of metadata files, and bootstrapped nodes depending on their run-list of roles and recipes.
  • Implemented Chef recipes for deployment builds on internal data centre servers, and reused and modified the same recipes to deploy directly into Azure VMs. Set up Chef Infra, bootstrapped nodes, created and uploaded recipes, and handled node convergence in Chef SCM.
  • As a member of the Release Engineering group, redefined processes and implemented tools for software builds, patch creation, source control, and release tracking and reporting on the Linux platform. Led the Jenkins continuous integration server installation and configuration for all Git repositories.
  • Involved in periodic archiving and storage of the source code for disaster recovery; prepared JUnit test cases and executed server configurations.
  • Applied best-practices knowledge in the setup and deployment of cluster database technologies using Postgres, MySQL, MongoDB, Memcached, and Redis.
  • Established and tested disaster recovery policies and procedures, and maintained documentation.
  • Carried out administrative tasks such as system startup/shutdown, backup strategy, printing, documentation, user management, security, network management, and managing dumb terminals and devices.
  • Wrote Bash shell scripts to automate repeated daily manual tasks, such as creating cron jobs.
  • Wrote functions, conditional executions, parameter expansions to pass variables dynamically, loops for iteration, arrays, and grep checks, as required, using Bash shell scripting.
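The ARM-template (JSON) work above can be sketched the same way. The skeleton below is illustrative only, with the VM name hypothetical and the resource properties trimmed far below what a deployable VM needs (no network profile, OS profile, or storage):

```python
import json

def arm_template(vm_name, vm_size="Standard_B2s"):
    """Skeleton ARM deployment template with a single VM resource (sketch)."""
    return {
        "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
        "contentVersion": "1.0.0.0",
        "resources": [
            {
                "type": "Microsoft.Compute/virtualMachines",
                "apiVersion": "2023-03-01",
                "name": vm_name,
                "location": "[resourceGroup().location]",
                "properties": {"hardwareProfile": {"vmSize": vm_size}},
            }
        ],
    }

# json.dumps(arm_template("ci-agent-01"), indent=2) yields the JSON body
# that `az deployment group create --template-file ...` expects
# (VM name and size here are placeholders).
```

In practice each environment would supply parameters (admin credentials, subnet IDs) rather than hard-coding them, matching the Terraform-module approach used for the Azure services above.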

Environment: Azure, SQL, NoSQL, API Management, Auto Scaling, Storage, ARM Templates, Chef, MySQL, MongoDB, Memcached, Redis, Postgres, cookbooks, Chef Supermarket, Vagrant, Rakefile, data bags, templates, Bash shell scripting.
