AWS DevOps Engineer Resume
SUMMARY
- IT professional with 5+ years of experience as a DevOps Automation Engineer and Application Developer, focusing on automation, Infrastructure as Code (IaC), and Continuous Integration/Continuous Delivery pipelines in the Finance, Health, and Telecom domains.
- Hands-on with AWS cloud migration strategies (the 6 R's), including services such as EC2, S3, Lambda, ECS, EKS, VPC, CloudFront, SNS, SQS, ELB, EBS, Route53, DynamoDB, RDS, API Gateway, CodePipeline, CodeBuild, and CodeDeploy.
- Accomplished savings on infrastructure provisioning and management using Terraform, cutting monthly cost estimates by 40.76% (from $9,245 to $5,476), and reduced workload with cron jobs that remove processes older than 6 hours to improve server performance.
- Redefined existing branch strategies, along with tagging, merging, and maintaining code versions in Source Code Management (SCM) tools such as GitLab, SVN, and Bitbucket, to support Scaled Agile deployment.
- Designed and created Continuous Integration and Continuous Deployment pipelines on the Jenkins automation server with Ant, Maven, and Gradle plugins, and configured Jenkinsfiles using the Groovy scripting language.
- Handled Nexus and JFrog Artifactory repositories to store binaries (JAR, WAR, EAR files) of successful builds across Development, Staging, and Test environments, along with DNS configuration using Route53 and AWS cloud migration.
- Created Ansible playbooks using YAML scripts to configure remotely deployed instances with Apache web server dependencies at specified paths on Windows 2016 servers hosted in the AWS cloud environment.
- Containerized and secured the FIM application framework using Docker images, Docker Hub, a private Docker registry, Docker Swarm, Docker Service, Docker Compose, and Dockerfiles, and performed Twistlock scans to assess vulnerabilities.
- Managed Docker orchestration and containerization using Kubernetes, including container deployment, scaling, and security.
- Bootstrapped Chef nodes; created and uploaded Chef recipes using Ruby; performed static code analysis using Foodcritic and RuboCop, unit tests using ChefSpec, and integration tests using Test Kitchen and InSpec.
- Monitored both on-prem and cloud infrastructure using Nagios, CloudWatch, Stackdriver, and the Elasticsearch, Logstash, Kibana (ELK) stack by deploying APM agents on remote Linux and Windows servers and gathering logs on a regular basis.
- Implemented multi-cloud architecture solutions integrating AWS services such as EC2, S3, Lambda, and API Gateway with GCP services such as Bigtable and CloudSQL using the Python scripting language.
- Deployed vendor cloud solutions based on Compute Engine, App Engine, and Google Kubernetes Engine in Google Cloud Platform.
- Developed enterprise applications using Java/J2EE, JDBC, Spring MVC, and RESTful web services, with web technologies such as XML and JSON and databases such as RDS, MySQL, and DynamoDB.
- Experienced with IT governance (COBIT, ITIL, CMMI, and FAIR) and compliance frameworks such as ISO 27001/27002.
TECHNICAL SKILLS
Programming and Scripting Languages: Java, SQL, Groovy, Python, Ruby, Bash, YAML, JSON, XML
Version Control Systems: Git, Gitlab, GitHub, Apache Subversion, Bitbucket, IBM ClearCase
Continuous Integration/Continuous Delivery: Jenkins, Bamboo, IBM UrbanCode Deploy, TeamCity, GoCD
Configuration Management: Ansible, Chef, Puppet, Terraform, Vagrant
Containerization: Docker, Docker Swarm, Kubernetes, GKE, ECS, EKS
Monitoring: Nagios, New Relic, CloudWatch, ELK, Stackdriver, Prometheus, Grafana
Web Servers: Apache, Tomcat, Nginx, IIS, LiteSpeed, Lighttpd
Databases: Oracle, MySQL, DynamoDB, RDS, PostgreSQL, MongoDB, Bigtable, SQL
Cloud Platforms: Amazon Web Services (AWS), Google Cloud Platform (GCP), Azure
Networking & Operating Systems: IP, ARP, TCP, UDP, SMTP, FTP, TFTP, DNS, DHCP, Windows, Linux
PROFESSIONAL EXPERIENCE
Confidential
AWS DevOps Engineer
Responsibilities:
- Created AWS cloud infrastructure and customer-facing Virtual Private Clouds (VPCs) using the Terraform DSL, reducing cost estimates by 40.76%; versioned Terraform templates and improved infrastructure provisioning with terraform plan.
- Hands-on with multiple VCP (Verizon Cloud Platform) services such as IAM, S3, VPC, EC2, Lambda, SNS, SQS, RDS, DynamoDB, Kinesis, Redshift, Config, KMS, ECS, EKS, ECR, EBS, Cognito, CodeCommit, CodeBuild, CodeDeploy, and CodePipeline.
- Redefined the release strategy by creating a release branch at the beginning of each sprint and merging it to master after release using GitLab.
- Designed and created CI/CD Jenkins pipelines for the API automation runner in the cloud data-migration domain, supporting migration of telecom markets, using the Groovy scripting language; also monitored and extracted logs from the Jenkins automation server to pinpoint performance bottlenecks.
- Built a cron scheduler to remove processes older than 6 hours after a Jenkins job runs and to clean up the working directory, improving server performance. Created automated snapshots of Linux-based instances.
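The cleanup job described above can be sketched in Python; the workspace path, the file-age check (by mtime, standing in for the process cleanup), and the cron schedule are illustrative assumptions, not the actual job:

```python
"""Minimal sketch of the 6-hour workspace cleanup run from cron after a
Jenkins job. Paths and the file-based age check are illustrative."""
import os
import time

MAX_AGE_SECONDS = 6 * 60 * 60  # anything older than 6 hours is removed

def clean_workspace(root, now=None):
    """Delete files under *root* whose mtime is older than 6 hours.

    Returns the list of removed paths so the cron log can record them.
    """
    now = time.time() if now is None else now
    removed = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            if now - os.path.getmtime(path) > MAX_AGE_SECONDS:
                os.remove(path)
                removed.append(path)
    return removed
```

A hypothetical crontab entry such as `0 * * * * python3 /opt/scripts/clean_workspace.py` would run the sweep hourly.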
- Managed code deployments in large-scale production environments using Jenkins, with Groovy as the scripting language, across AWS Lambda, EBS, and EC2 instances.
- Shortened remote configuration timelines by integrating Jenkins with Ansible via the Ansible plugin, delivering artifacts and dependencies to 536 remote AWS staging Linux/Windows servers through playbooks written in YAML.
- Provisioned non-prod and staging EC2 instances to run Selenium test cases within nodes; created and configured encrypted AMIs using the Ansible configuration management tool and provisioned them as Jenkins jobs.
- Configured the Selenium Grid hub-node architecture, established Route53 records within AWS non-prod and PLE environments, and pointed them to Auto Scaling EC2 instances.
- Created Windows batch scripts to run agent JAR, WAR, EAR, and Selenium standalone server files on Windows servers, connecting to the mobility platform to pull testing jobs onto Windows instances and deploy binaries to the Apache web server.
- Rebuilt existing schemas and tables with RDS in the AWS staging environment for release management of the data-migration domain and offloaded storage into new staging tables.
- Devised integration of AWS Elastic Load Balancing, attaching EC2 instances to an application ELB for rolling deployments. Established cross-account access within AWS using IAM, enabling multi-account solutions for vendors to access sensitive data.
- Integrated Elasticsearch and Kibana with CloudWatch logs, streaming data to Kibana with Python for visualization; created dashboards and developed Python APIs to automate the process.
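The CloudWatch-to-Elasticsearch streaming step can be sketched as a pure transform; the subscription payload format (base64 + gzip) is the standard CloudWatch Logs shape, while the `app-logs` index name is an illustrative assumption, and the actual bulk POST to Elasticsearch is omitted to keep the sketch self-contained:

```python
"""Sketch of the CloudWatch Logs -> Elasticsearch streaming transform.
The index name is illustrative; the HTTP bulk request is omitted."""
import base64
import gzip
import json

def cloudwatch_event_to_docs(awslogs_data):
    """Decode a CloudWatch Logs subscription payload into ES documents."""
    payload = json.loads(gzip.decompress(base64.b64decode(awslogs_data)))
    return [
        {
            "_index": "app-logs",  # illustrative index name
            "log_group": payload["logGroup"],
            "log_stream": payload["logStream"],
            "timestamp": event["timestamp"],
            "message": event["message"],
        }
        for event in payload["logEvents"]
    ]
```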
- Integrated JFrog Artifactory with Jenkins to store builds, binaries, dependencies and packages of successful jobs.
- Managed Docker orchestration and containerization using the Kubernetes orchestrator, including deployment, scaling, and clustering of application runtime environments, and created Docker images by writing Dockerfiles.
- Customized Docker image scanning and Twistlock policies to scan Docker images for vulnerabilities; encrypted passwords using Kubernetes Secrets.
- Created Kubernetes object definition files such as deployment.yaml, development.yaml, svc.yaml, config.yaml, and ingress.yaml, and configured Helm charts to package and deploy application updates on the Kubernetes cluster.
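The object definitions above follow the standard Kubernetes schema; a minimal sketch of a deployment.yaml, expressed as a Python dict so the required fields are explicit (the app name, image, and replica count are illustrative, not actual cluster values):

```python
"""Sketch of a Kubernetes Deployment object (deployment.yaml) as a dict.
Name, image, and replica count are illustrative placeholders."""

def make_deployment(name, image, replicas=2):
    labels = {"app": name}
    return {
        "apiVersion": "apps/v1",
        "kind": "Deployment",
        "metadata": {"name": name, "labels": labels},
        "spec": {
            "replicas": replicas,
            # selector must match the pod template labels
            "selector": {"matchLabels": labels},
            "template": {
                "metadata": {"labels": labels},
                "spec": {"containers": [{"name": name, "image": image}]},
            },
        },
    }
```

Serializing this dict with a YAML library yields the file a Helm chart would template.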
- Designed and versioned AWS cloud infrastructure using CloudFormation templates for disaster recovery; template-based Infrastructure as Code (IaC) is stored in GitLab source code management repositories as versions.
- Configured security groups and NACLs between the on-prem and AWS staging environments to run the ArcGIS map service application.
- Involved in troubleshooting and debugging of failovers across the AWS non-prod environment and mobility platform.
- Implemented and maintained monitoring and alerting of production and corporate servers using the ELK stack (Elasticsearch, Logstash, Kibana) for application logs.
- Implemented automated JIRA ticket creation after unsuccessful builds in the Jenkins automation pipeline using Python wrappers.
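The JIRA automation can be sketched as a payload builder; the field names follow the JIRA REST v2 issue shape, while the `DEVOPS` project key, issue type, and labels are illustrative assumptions, and the authenticated HTTP POST is omitted:

```python
"""Sketch of the Python wrapper that files a JIRA ticket after a failed
Jenkins build. Project key, issue type, and labels are illustrative; the
authenticated POST to /rest/api/2/issue is omitted."""

def build_failure_ticket(job_name, build_number, build_url):
    """Build the JSON payload for JIRA issue creation."""
    return {
        "fields": {
            "project": {"key": "DEVOPS"},  # illustrative project key
            "issuetype": {"name": "Bug"},
            "summary": f"Build failure: {job_name} #{build_number}",
            "description": f"Jenkins build failed.\nConsole: {build_url}",
            "labels": ["jenkins", "auto-filed"],
        }
    }
```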
Confidential
DevOps/Platform Engineer
Responsibilities:
- Designed and created a clustered environment in Google Cloud Platform using the Terraform DSL.
- Integrated Slack with GCP Pub/Sub and Cloud Functions, displaying messages from GCP disks and CloudSQL.
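The Pub/Sub-to-Slack relay can be sketched as the transform inside such a Cloud Function; the base64 `data` field follows the standard Pub/Sub event shape, while the message formatting is illustrative and the webhook POST is omitted:

```python
"""Sketch of a Cloud Function body relaying GCP Pub/Sub messages to Slack.
Message formatting is illustrative; the webhook POST is omitted."""
import base64

def pubsub_to_slack_payload(event):
    """Turn a Pub/Sub event dict into a Slack incoming-webhook payload."""
    text = base64.b64decode(event["data"]).decode("utf-8")
    return {"text": f"GCP alert: {text}"}
```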
- Integrated the Bitbucket source code management tool with Bamboo using Bamboo Specs and automated the deployment process.
- Built database models, APIs, and views using Python to create an interactive web-based solution. Wrote Python modules to extract/load asset data from a CloudSQL database.
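The extract/load module can be sketched with sqlite3 standing in for CloudSQL (a real deployment would use a MySQL or Postgres driver instead); the `assets` table and column names are illustrative assumptions:

```python
"""Sketch of the asset extract/load module. sqlite3 stands in for CloudSQL;
table and column names are illustrative."""
import sqlite3

def load_assets(conn, rows):
    """Load (id, name) rows into the assets table."""
    conn.execute("CREATE TABLE IF NOT EXISTS assets (id INTEGER, name TEXT)")
    conn.executemany("INSERT INTO assets VALUES (?, ?)", rows)
    conn.commit()

def extract_assets(conn):
    """Extract all assets as a list of dicts, ordered by id."""
    cur = conn.execute("SELECT id, name FROM assets ORDER BY id")
    return [{"id": i, "name": n} for i, n in cur.fetchall()]
```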
- Analyzed data in the Spark and Kafka frameworks for both real-time and batch processing, along with Hadoop integration for data ingestion, mapping, and processing.
- Configured Hiera files and created and managed roles and profiles for various technology stacks in Puppet.
- Designed a continuous delivery pipeline with Puppet modules, deploying to lower UAT environments using Puppet Enterprise.
- Deployed a remote Docker cluster using the Puppet orchestrator through eventual consistency by pushing code to the Puppet master.
- Created a production pipeline Jenkinsfile comprising Deploy to Staging, Staging acceptance tests, Promote to Production, Noop Production, and Deploy to Production stages using the Groovy scripting language, with plugins such as the Puppet Enterprise pipeline plugin, the Docker pipeline plugin, and NexusArtifactUploader.
- Deployed multi-threaded, low-latency, high-throughput applications across highly distributed cloud infrastructure using the Docker Swarm engine for container orchestration.
- Managed and integrated code quality tools such as SonarQube, managing Sonar rules and quality gates.
- Deployed container-based builds using Docker, working with Docker images, Docker Hub, Docker registries, and Kubernetes.
- Achieved continuous delivery goals in a highly scalable environment using Docker coupled with Nginx as a load-balancing tool.
- Integrated big data analytics solutions based on Hadoop, Solr, Spark, Kafka, Storm, and webMethods.
- Integrated Maven with shell scripts and automated deployments of a Java-based enterprise application test suite to staging servers in a timely manner using Jenkins jobs, deploying JAR files of successful builds into the Nexus repository.
- Integrated the Bugzilla bug-tracking tool for interaction with testing team updates within the dev environment.
- Built infrastructure monitoring solutions based on Splunk, Nagios, and New Relic for various projects.
- Performed blue-green deployments into Google Cloud Platform while migrating data from an on-premises Kubernetes cluster to Google Kubernetes Engine.
- Deployed custom-made applications in non-prod and staging environments to JBoss, Apache, Tomcat, and Nginx web servers, automating the process with Bamboo.
- Set up web services for scheduling installations and validating configuration services that obtain coordinates for a given input address.
- Migrated an existing legacy Oracle-based OLTP system onto Google Cloud Platform solutions such as Cloud Spanner.
Confidential
AWS Devops Engineer
Responsibilities:
- Created VPCs, subnets, and security groups to provision customized AMI launch configurations.
- Provisioned and maintained multiple EC2 nodes with various services installed as required by the cloud infrastructure in AWS.
- Used JIRA for creating bug tickets and storyboards, pulling reports from dashboards, and planning sprints; used Git as the version control tool and maintained source code in a GitHub repository.
- Integrated Jenkins with Ant and Maven for builds and wrote Shell and Python scripts to automate various tasks in the Continuous Integration/Continuous Deployment process.
- Constructed build-tool jobs to store build binaries in the Nexus repository, which are further used in integration testing.
- Configured Elastic Beanstalk by setting up different layers for deploying and scaling web applications in pre-production and staging environments.
- Bootstrapped Chef nodes; created and uploaded Chef recipes using Ruby; performed static code analysis using Foodcritic and RuboCop, unit tests using ChefSpec, and integration tests using Test Kitchen and InSpec.
- Created roles and environments for continuous deployment within groups of Chef nodes using run-lists.
- Created and stored DB snapshots in Amazon Simple Storage Service (S3) for backup and restore. Configured DynamoDB tables for application support, and configured lifecycle policies and versioning on S3 buckets for data storage.
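The lifecycle configuration can be sketched as a policy builder; the prefix, transition days, and GLACIER target are illustrative assumptions, while the dict shape matches the lifecycle configuration that S3 accepts:

```python
"""Sketch of an S3 lifecycle configuration for DB-snapshot storage.
Prefix, transition days, and storage class are illustrative."""

def snapshot_lifecycle(prefix="db-snapshots/"):
    """Build a lifecycle policy: archive after 30 days, expire after 365."""
    return {
        "Rules": [
            {
                "ID": "archive-old-snapshots",
                "Filter": {"Prefix": prefix},
                "Status": "Enabled",
                "Transitions": [{"Days": 30, "StorageClass": "GLACIER"}],
                "Expiration": {"Days": 365},
            }
        ]
    }
```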
- Registered and maintained domain names for the application using the Route53 service by creating hosted zones and record sets, along with an Application Elastic Load Balancer (ELB) to direct application traffic to specific endpoints based on microservice type.
- Implemented and maintained monitoring and alerting of production and corporate servers using the ELK stack (Elasticsearch, Logstash, Kibana) for application logs.
- Worked with Development Team Leads and testing teams to establish build schedule, execute builds and troubleshoot build failures, if any.