AWS Cloud Engineer Resume
Foster City, CA
SUMMARY
- AWS Certified Solutions Architect - Associate with 7+ years of IT industry experience, with progressive responsibility in the deployment, implementation, automation, troubleshooting, and maintenance of AWS cloud environments.
- Experience in administration, development, operations, maintenance, and support of AWS cloud resources, including launching, maintaining, and troubleshooting EC2 instances, S3 buckets, VPC (Virtual Private Cloud), ELB (Elastic Load Balancing), RDS (Relational Database Service), AMI, SNS, SES, SQS, IAM, Route 53, Elastic Container Service, Elastic Beanstalk, Auto Scaling, CloudFormation, CloudFront, CloudWatch, and other AWS services; designed AWS cloud models for Infrastructure-as-a-Service (IaaS) and Platform-as-a-Service (PaaS).
- Hands-on experience with DevOps tools such as GitHub, Maven, Jenkins, Docker, Kubernetes, Chef, Ansible, Puppet, Vagrant, Packer, and Terraform.
- Hands-on experience configuring and managing cloud infrastructure in AWS using Terraform and CloudFormation, including creating custom-sized VPCs, NACLs, and NAT subnets to deploy web application and database templates using custom Terraform templates (a minimal Terraform sketch follows this summary).
- Proficient in HashiCorp DevOps tooling such as Terraform, Packer, Vault, and Consul.
- Expertise working with Terraform, Terraform Cloud, and Vault, and with key Terraform features such as Infrastructure as Code (IaC), execution plans, resource graphs, and change automation. Experience writing new plugins to support additional functionality in Terraform and writing Terraform templates in JSON to deploy infrastructure such as EC2 instances and ELBs.
- Expertise in integrating Terraform with Ansible and Packer to create and version AWS infrastructure, and in designing, automating, implementing, and sustaining Amazon Machine Images (AMIs) across the AWS cloud environment. Used Terraform to automate VPCs, ELBs, security groups, SQS queues, and S3 buckets, and continued replacing the remaining infrastructure with code.
- Worked with Terraform templates and modules to automate IaaS virtual machines and deployed virtual machine scale sets in production; created Terraform templates for provisioning virtual networks, subnets, VM scale sets, load balancers, and NAT rules, and used the terraform graph command to visualize execution plans.
- Hands-on experience creating multiple accounts with AWS Organizations and configuring AWS SSO.
- Performed blue/green deployments by building new environments indistinguishable from production in Elastic Beanstalk and swapping traffic from the old environment to the new one.
- Experienced with Elasticsearch, Logstash, and Kibana.
- Worked on creating custom Docker container images and tagging and pushing the images to registries.
- Created Docker images for domain configurations and middleware installations and used Docker Swarm to manage the resulting containers.
- Working knowledge of Kubernetes.
- Hands-on experience using SCM tools such as GitHub, Bitbucket, and Azure DevOps for versioning, branching, and tagging across environments.
- Created Bash, Python, and PowerShell scripts to automate tasks and deployments.
- Managed AWS infrastructure and automation using the CLI and APIs; managed data center inbound and outbound services. Hands-on with automation tools such as Ansible, Chef, and Puppet.
- Designed, installed, and implemented an Ansible configuration management system.
- Managed applications, operating systems, services, and packages on AWS EC2, S3, Route 53, and ELB using Chef cookbooks.
- Strong experience creating Maven build scripts with Puppet for deployment artifacts; experience converting build.xml into pom.xml to build applications with Maven.
- Worked with multiple DevOps methodologies and Continuous Integration (CI)/Continuous Delivery (CD) tooling such as Jenkins.
- Knowledge of Jenkins plugin management, securing and scaling Jenkins, and integrating code analysis, performance, analytics, and test phases into complete CI/CD pipelines within Jenkins.
- Responsible for configuring Jenkins master/agent nodes and creating builds for CI/CD; created and automated Jenkins pipelines for applications using Groovy pipeline scripts.
- Experience with installation, configuration and administration of various Application Servers like WebLogic, WebSphere, and Tomcat.
- Hands-on with JIRA, HP Service Manager, and ServiceNow across various projects.
- Experience integrating unit testing and code quality analysis tools such as JUnit and Selenium.
- Strong understanding of network design, operational support, and hands-on implementation and configuration of routers, hubs, switches, and controllers.
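A minimal Terraform sketch of the kind of custom VPC layout referenced above, assuming a recent HashiCorp AWS provider; the region, CIDR ranges, and resource names are hypothetical placeholders, and route tables and NACL rules are omitted for brevity:

```hcl
# Sketch: custom-sized VPC with public/private subnets and a NAT gateway.
# Region, CIDRs, and names are placeholders; routing and NACLs omitted.
provider "aws" {
  region = "us-west-2"
}

resource "aws_vpc" "app" {
  cidr_block           = "10.20.0.0/16"
  enable_dns_hostnames = true
  tags                 = { Name = "app-vpc" }
}

resource "aws_subnet" "public" {
  vpc_id                  = aws_vpc.app.id
  cidr_block              = "10.20.1.0/24"
  map_public_ip_on_launch = true
  tags                    = { Name = "app-public" }
}

resource "aws_subnet" "private" {
  vpc_id     = aws_vpc.app.id
  cidr_block = "10.20.2.0/24"
  tags       = { Name = "app-private" }
}

resource "aws_eip" "nat" {
  domain = "vpc"
}

resource "aws_nat_gateway" "nat" {
  allocation_id = aws_eip.nat.id
  subnet_id     = aws_subnet.public.id # NAT gateway sits in the public subnet
}
```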
TECHNICAL SKILLS
Languages: Python, Shell, JSON, YAML, and HTML.
Version Control & Artifact Repositories: Git, GitHub, Bitbucket, Subversion (SVN), Nexus, Artifactory
Databases: Oracle, MySQL, Aurora, PostgreSQL
Build Tools: Maven, Bamboo, AWS CodeCommit
IaC: Terraform, CloudFormation
CI Tools: Jenkins, Bamboo
Ops Tools: Docker, Kubernetes, Ansible
Operating Systems: Windows, Linux
Application Servers: Apache Tomcat, WebLogic, WebSphere, JBoss
Bug Tracking Tools: JIRA, HP Service Manager, ServiceNow.
AWS: EC2, Elastic Block Store (EBS), ECS, CloudFormation, VPC, Subnets, IAM Roles and Policies, SNS, S3, API Gateway, ElastiCache, Elasticsearch, DynamoDB
Networking Tools: Wireshark, PuTTY.
PROFESSIONAL EXPERIENCE
Confidential, Foster City, CA
AWS Cloud Engineer
Responsibilities:
- Managed cloud services using Terraform, giving developers and the business an easy way to define collections of related AWS resources and provision them in an orderly and predictable fashion.
- Responsible for ITIL processes, including request and incident tickets related to AWS services, and for many production deployments of various internal projects.
- Member of the Cloud Services team, which governs cloud usage across the organization and acts as administrator for implementing cloud activities organization-wide.
- Responsible for creating new AWS accounts and setting up all mandatory features, such as SSO, Config rules, VPCs, subnets, TFS repositories, and Jenkins pipelines, so every account meets RGA standards and compliance.
- Acted as a point of contact for many application development teams regarding cloud issues and day-to-day cloud activities.
- Experience creating AWS Config rules that flag resources that fall out of policy compliance, such as RDS clusters missing SSL/TLS encryption.
- Responsible for creating new IAM roles, groups, users, and security groups, opening required traffic ports using Terraform, and resolving other access-related permission issues.
- Experience implementing Amazon Inspector and enabling Security Hub and GuardDuty in all AWS accounts using Terraform.
- Responsible for creating resources and deploying changes and updates to production accounts across different projects and environments.
- Experience creating S3 buckets with versioning, lifecycle policies, and access logging, as well as DynamoDB tables, using Terraform (a sketch follows this list).
- Experience using ServiceNow to create and handle change, request, and incident tickets and keep tasks on track.
- Experience using TFS as a version control tool and Azure DevOps for creating user stories and tracking work progress.
- Experience with Jenkins for continuous integration and continuous delivery: creating Jenkins pipelines and service/web hooks, and handling branching, merging, and deploying changes to different environments.
- Experience creating VPCs, route tables, NACLs, NAT gateways, and VPC peering connections, setting up VPN configurations, and handling other infrastructure tasks using Terraform.
- Experience working with Amazon FSx for Windows File Server, setting up CloudWatch alarms, and creating Windows AMIs without causing licensing issues.
- Responsible for setting up Cloudability integration with AWS and Azure and maintaining all Cloudability operational activities.
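A hedged Terraform sketch of the S3 configuration described above (versioning, lifecycle, and access logging), assuming AWS provider v4+ split resources; bucket names, prefixes, and retention periods are placeholders:

```hcl
# Sketch only: bucket names, prefixes, and expiration days are placeholders.
resource "aws_s3_bucket" "app_data" {
  bucket = "example-app-data-bucket"
}

resource "aws_s3_bucket_versioning" "app_data" {
  bucket = aws_s3_bucket.app_data.id
  versioning_configuration {
    status = "Enabled"
  }
}

resource "aws_s3_bucket_lifecycle_configuration" "app_data" {
  bucket = aws_s3_bucket.app_data.id
  rule {
    id     = "expire-old-logs"
    status = "Enabled"
    filter {
      prefix = "logs/"
    }
    expiration {
      days = 90
    }
  }
}

resource "aws_s3_bucket_logging" "app_data" {
  bucket        = aws_s3_bucket.app_data.id
  target_bucket = "example-access-log-bucket" # assumed to exist already
  target_prefix = "s3-access/"
}
```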
Environment: AWS, Terraform, CloudFormation, DynamoDB, S3, Docker, Git, Jenkins, Jira.
Confidential - Baltimore, MD
Infrastructure Engineer
Responsibilities:
- Responsible for designing, implementing, and supporting cloud-based infrastructure and solutions.
- Responsible for writing Python scripts to tag thousands of AWS resources and to pull resources that are not tagged with a given tag key (see the boto3 sketch after this list).
- Designed Python scripts for production and non-production environments and provided immediate solutions for issues.
- Responsible for creating Terraform scripts to deploy various AWS resources across environments.
- Worked on Amazon RDS databases and created instances as per requirements.
- Implemented continuous integration from scratch using Jenkins and Git, and performed branching, tagging, and release activities in version control tools such as Git and Bitbucket.
- Created and configured Elastic Load Balancers and Auto Scaling groups to distribute traffic and provide a cost-efficient, fault-tolerant, and highly available environment.
- Provided technical support and developed solutions for issues that arose during the execution of each wave of the cloud migration process.
- Documented solutions and processes for configuration, installation, setup, data flows, and changes occurring in the infrastructure.
- Well versed in Atlassian tools: JIRA for Agile and Kanban, Bitbucket and Git for code repositories, and Confluence as a wiki.
- Coordinated with all teams and worked alongside them to align processes before, during, and after the migration.
- Responsible for configuring, setting up, and running a proof of concept of AWS Application Discovery Service, used to gather server details and dependencies as part of the cloud migration and for regular internal maintenance.
- Responsible for creating and customizing Python scripts to import data from servers and move it from the Migration Hub console to S3 buckets.
- Experience writing Python automation scripts for a continuous flow of data into S3 buckets and scheduling cron jobs for recurring events.
- Created Docker containers for StreamSets, Elasticsearch, Kibana, and Grafana; pulled Docker images from the official repository and pushed them to a trusted private registry.
- Worked extensively on branching, merging, tagging, and version maintenance across environments using Git on Linux and Windows platforms.
- Rationalized the on-premises servers to be migrated to the AWS cloud.
- Responsible for creating StreamSets data pipelines and configuring them to move data from S3 buckets into Elasticsearch.
- Responsible for shaping data in Elasticsearch using Kibana so that it could be visualized in Grafana.
- Created data sources and tables in Grafana dashboards to visualize the data, and created dependency graphs in Kibana.
- Participated in risk assessment meetings about the Application Discovery Service, providing technical solutions and answers to the team.
- Created drafts for tools such as Database Migration Service (DMS), Neptune, QuickSight, and others for use during the cloud migration process.
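A minimal boto3 sketch of the tagging audit described above: it reports EC2 instances missing a required tag key. The tag key and region are hypothetical, and real scripts covered many more resource types:

```python
"""Sketch: report EC2 instances missing a required tag key (tag key and region are assumptions)."""
import boto3

REQUIRED_TAG_KEY = "CostCenter"  # hypothetical tag key used for the audit

def find_untagged_instances(region="us-east-1"):
    """Return instance IDs that do not carry the required tag key."""
    ec2 = boto3.client("ec2", region_name=region)
    untagged = []
    paginator = ec2.get_paginator("describe_instances")
    for page in paginator.paginate():
        for reservation in page["Reservations"]:
            for instance in reservation["Instances"]:
                tags = {t["Key"]: t["Value"] for t in instance.get("Tags", [])}
                if REQUIRED_TAG_KEY not in tags:
                    untagged.append(instance["InstanceId"])
    return untagged

if __name__ == "__main__":
    for instance_id in find_untagged_instances():
        print(f"Missing {REQUIRED_TAG_KEY}: {instance_id}")
```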
Environment: Tomcat, Apache, Java/J2EE, Subversion, Jenkins, JIRA, OpenStack, Maven, Git, Puppet, AWS, Azure, Python, Unix shell scripting.
Confidential, Foster City, CA
Cloud Engineer
Responsibilities:
- Involved extensively in redesigning the EE/ER Portal, ISSO, and POGH using updated technologies.
- As part of automation, developed REST APIs that integrate with Lambda functions and other AWS services (a minimal handler sketch follows this list).
- Well versed with Amazon Route 53, which connects user requests to infrastructure running on Amazon EC2 instances and Amazon S3 buckets.
- Developed AWS WAF rules to protect web applications from attacks such as SQL injection, cross-site scripting, and file inclusion.
- Created EC2 instances in AWS and worked with services such as IAM, S3, and CloudWatch.
- Used IAM to create new accounts, roles, groups, and policies, and developed critical modules such as generating Amazon Resource Names (ARNs) and integration points with S3, DynamoDB, RDS, Lambda, and SQS queues.
- Created a custom domain that can be accessed from the virtual private cloud through a VPC endpoint.
- Enhanced Amazon Virtual Private Cloud in a scalable environment, using advanced security features such as security groups and network access control lists for inbound and outbound filtering at the instance and subnet levels.
- Proficient in writing AWS CloudFormation templates to create custom-sized VPCs, subnets, NAT, EC2 instances, ELBs, and security groups.
- Experience in writing Lambda functions.
- Provided highly durable and available data using the S3 data store, versioning, and lifecycle policies.
- Managed source code using Git, applying branching, merging, and tagging for release management.
- Worked on resolving build failures related to environments, tools and scripts.
- Used Terraform to map more complex dependencies and identify network issues.
- Initiated deployment automation to WebSphere servers by developing Python scripts.
- Created various .gitlab-ci.yml templates with build, code scan, Veracode, Docker, Quay, and deploy stages.
- Used Kubernetes for automated deployments, scaling and management of containerized applications across clusters of hosts.
- Troubleshot the automation of installing and configuring applications in the test environments.
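A minimal sketch of a Lambda handler behind a REST API, in the spirit of the bullets above; the payload fields and table name are hypothetical, and the response shape assumes API Gateway's Lambda proxy integration:

```python
"""Sketch: Lambda handler behind an API Gateway REST endpoint (proxy integration assumed)."""
import json
import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("example-items")  # hypothetical table name

def lambda_handler(event, context):
    # Proxy integration delivers the request body as a JSON string (or None).
    body = json.loads(event.get("body") or "{}")
    item_id = body.get("id")
    if not item_id:
        return {"statusCode": 400, "body": json.dumps({"error": "missing id"})}

    # Persist the payload, then return an API Gateway-compatible response.
    table.put_item(Item={"id": item_id, "payload": body})
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"stored": item_id}),
    }
```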
Environment: Git, AWS EC2, S3, Route 53, VPC, Python, CloudWatch, REST APIs, custom domains, AWS Elastic Load Balancing, AWS Lambda, IAM, Terraform, PowerShell, Bash.
Confidential, Burlington, MA
AWS Cloud Engineer
Responsibilities:
- Used Terraform to provision infrastructure in AWS.
- Wrote Terraform code to create AMIs and build EC2 instances and VPCs as per requirements (see the sketch after this list).
- Raised pull requests in Bitbucket once the Terraform code was pushed to the repository.
- Used Lucidchart to design the application infrastructure as per requirements.
- Worked on CI/CD once code was merged to the master branch, building Jenkins jobs with Groovy pipeline scripts as well as freestyle jobs.
- Worked on automation using Terraform and Python.
- Worked on developing and testing APIs for front-end integration involving AngularJS, AJAX, HTML, and CSS.
- Wrote integrations with JavaScript, JSP, AJAX, AngularJS, and jQuery for dynamic manipulation of on-screen elements and input validation.
- Managed version control repositories and branching in Git integrated with the Eclipse IDE.
- Daily routine included deploying code to lower environments and automating the deployments.
- Coordinated with teams across the globe to deploy builds to different environments during parallel development for multiple projects.
- Supported development, testing, and production support teams (24x7) with configuration and deployments across environments.
- Coordinated with Release Management regarding appropriate system releases across development platforms.
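A hedged Terraform sketch of building an EC2 instance from a baked AMI as described above; the AMI filter, instance type, and subnet variable are placeholders:

```hcl
# Sketch: look up a baked AMI and launch an EC2 instance from it (values are placeholders).
variable "subnet_id" {
  description = "Target subnet for the instance (supplied per environment)"
  type        = string
}

data "aws_ami" "app" {
  most_recent = true
  owners      = ["self"] # assumes the AMI was built in this account
  filter {
    name   = "name"
    values = ["app-base-*"]
  }
}

resource "aws_instance" "app" {
  ami           = data.aws_ami.app.id
  instance_type = "t3.medium"
  subnet_id     = var.subnet_id
  tags          = { Name = "app-server" }
}
```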
Environment: Git, AWS, EC2, S3, IAM, Terraform, PowerShell, Bash, HTML5, JavaScript, jQuery, Bootstrap, AJAX, CSS, Adobe Dreamweaver, JSON.
Confidential - NC
Cloud Operations Engineer
Responsibilities:
- Worked in a hybrid cloud environment spanning public and private clouds; used Terraform (IaC) to build ECS infrastructure for Hygieia.
- Provided the infrastructure support needed for enterprise tools and DevOps tools in production and development accounts.
- Architected infrastructure designs for enterprise tools (Jira, Confluence, GitHub) and DevOps tools (SonarQube, Artifactory, Xray, and Hygieia).
- Implemented ECS Docker container networking and load balancer dynamic port mapping techniques (see the task definition sketch after this list).
- Worked on ECS task definitions, container definitions, NACLs, security groups, ACM, and WAF.
- Set up RDS and MongoDB for different applications using Terraform.
- Built Terraform for an EKS (Elastic Kubernetes Service) cluster to deploy the Jenkins application.
- Deployed JIRA and Confluence applications using CloudFormation (IaC) with a highly available and fault-tolerant architecture.
- Worked on Transit VPC, VPN, Direct Connect, Directory Service, and DNS routing with Route 53.
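A hedged Terraform sketch of the ECS dynamic port mapping mentioned above: with bridge networking and hostPort 0, ECS assigns an ephemeral host port that the load balancer target group picks up. The cluster name, image, and target group ARN are hypothetical:

```hcl
# Sketch: ECS task definition + service with dynamic host port mapping (placeholders throughout).
resource "aws_ecs_cluster" "tools" {
  name = "devops-tools" # hypothetical cluster name
}

variable "target_group_arn" {
  description = "ARN of the ALB target group fronting the service"
  type        = string
}

resource "aws_ecs_task_definition" "hygieia" {
  family       = "hygieia"
  network_mode = "bridge"
  container_definitions = jsonencode([
    {
      name      = "hygieia"
      image     = "example-registry/hygieia:latest" # placeholder image
      memory    = 1024
      essential = true
      portMappings = [
        { containerPort = 8080, hostPort = 0 } # hostPort 0 => dynamic host port
      ]
    }
  ])
}

resource "aws_ecs_service" "hygieia" {
  name            = "hygieia"
  cluster         = aws_ecs_cluster.tools.id
  task_definition = aws_ecs_task_definition.hygieia.arn
  desired_count   = 2

  load_balancer {
    target_group_arn = var.target_group_arn
    container_name   = "hygieia"
    container_port   = 8080
  }
}
```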
Environment: Java, Hygieia, AWS, Git, Subversion, Shell & Python scripting.
Confidential
DevOps Engineer
Responsibilities:
- Detailed technical knowledge of, and hands-on experience with, DevOps, automation, build engineering, and configuration management.
- Worked on the DevOps Platform team responsible for specialization areas related to cloud automation, and created CloudFormation templates (CFTs) to launch stacks.
- Developed installer scripts using Ant, Python, and UNIX shell for various products hosted on application servers.
- Migrated the existing code base to a CI/CD pipeline, moving from Ant to Maven and from AnthillPro to Jenkins.
- Designed and implemented the CI (continuous integration) system: configured Jenkins servers and nodes, created required scripts (Perl and Python), and created/configured VMs (Windows/Linux).
- Managed Elastic Compute Cloud (EC2) instances utilizing Auto Scaling, Elastic Load Balancing, and Glacier for QA and UAT environments.
- Estimated AWS usage costs and identified operational cost control mechanisms (a cost-reporting sketch follows).
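A minimal boto3 sketch of the cost-estimation work described above, pulling month-to-date unblended cost grouped by service via Cost Explorer; the date handling and grouping are assumptions:

```python
"""Sketch: month-to-date AWS cost by service via Cost Explorer (dates and grouping are assumptions)."""
from datetime import date
import boto3

def cost_by_service():
    """Print month-to-date unblended cost per AWS service."""
    ce = boto3.client("ce", region_name="us-east-1")  # Cost Explorer API is served from us-east-1
    today = date.today()
    start = today.replace(day=1).isoformat()
    response = ce.get_cost_and_usage(
        TimePeriod={"Start": start, "End": today.isoformat()},  # End is exclusive
        Granularity="MONTHLY",
        Metrics=["UnblendedCost"],
        GroupBy=[{"Type": "DIMENSION", "Key": "SERVICE"}],
    )
    for group in response["ResultsByTime"][0]["Groups"]:
        service = group["Keys"][0]
        amount = group["Metrics"]["UnblendedCost"]["Amount"]
        print(f"{service}: ${float(amount):.2f}")

if __name__ == "__main__":
    cost_by_service()
```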
Environment: Linux, Puppet, Agile, PostgreSQL, Jenkins, Shell/Python scripting, SVN, Ant, J2EE, Python, Apache Tomcat, RHEL, Nexus.