
AWS Cloud Engineer Resume

SUMMARY:

  • 6+ years of experience in the IT industry, with expertise in AWS cloud and DevOps configuration, and experience in the installation, configuration, and troubleshooting of Red Hat Enterprise Linux, Ubuntu, and Windows on various hardware platforms.
  • Hands-on experience and an in-depth understanding of the strategy and practical implementation of AWS cloud technologies such as Elastic Compute Cloud (EC2), Simple Storage Service (S3), Route 53, CloudFormation, Elastic IPs, Virtual Private Cloud (VPC), RDS, CloudWatch, SNS, and SES.
  • Configured Elastic Load Balancers with EC2 Auto Scaling groups. Optimized volumes and EC2 instances and created multi-AZ VPC instances. Good understanding of AWS Elastic Block Store (EBS), its volume types, and selecting the appropriate EBS volume type for a given requirement.
  • Configured S3 lifecycle policies for application and database logs, including deleting old logs and archiving logs based on the retention policy of the apps and databases. Implemented and maintained monitoring and alerting of production and corporate servers/costs using CloudWatch.
  • Created detailed AWS Security Groups, which behaved as virtual firewalls that controlled the traffic allowed to reach one or more AWS EC2 instances. Handled operations and maintenance support for AWS cloud resources which includes launching, maintaining and troubleshooting EC2 instances, S3 buckets, Virtual Private Clouds (VPC), Elastic Load Balancers (ELB) and Relational Database Services (RDS).
  • Deployed applications on AWS by using Elastic Beanstalk.
  • Understanding of secure cloud configuration (CloudTrail, AWS Config), cloud security technologies (VPC, Security Groups, etc.), and cloud permission systems (IAM). Created snapshots and Amazon Machine Images (AMIs) of instances for backup and for creating clone instances.
  • Experienced with AWS CloudFormation templates, creating IAM roles and deploying total architectures end to end (EC2 instances and their supporting infrastructure).
  • Built and configured a virtual data center in the Amazon cloud to support enterprise hosting, including VPCs, public and private subnets, Security Groups, and route tables.
  • Implemented DNS through Route 53 on ELBs to achieve secure connections via HTTPS. Experienced in architecting and configuring secure VPCs with private and public networks in AWS.
  • Good working knowledge of AWS Redshift. Experience administering various AWS services using the AWS Console and the CLI in Linux and Windows environments, and via the Amazon APIs using Java and Python.
  • Experience in deploying and monitoring applications on various platforms using Elastic Beanstalk.
  • Utilized CloudWatch to monitor resources such as EC2 CPU and memory, Amazon RDS services, and EBS volumes; to set alarms for notification or automated actions; and to monitor logs for a better understanding and operation of the system.
  • High exposure to REMEDY and JIRA defect tracking tools for tracking defects and changes for change management.
  • Experienced in setting up Chef Server and Workstation, bootstrapping nodes/clients, and creating cookbooks, recipes, and data bags; utilized community cookbooks as well.
  • Good knowledge of Amazon WorkSpaces and WorkSpaces application management.
  • Worked with Ansible playbooks for virtual and physical instance provisioning, configuration management, patching, and software deployment. Scheduled, deployed, and managed container replicas onto a node cluster using Kubernetes.
  • Worked on cloud-based services and software for managing connected products and machines and for implementing Machine-to-Machine (M2M) and Internet of Things (IoT) applications such as Axeda and ThingWorx.
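The S3 log-lifecycle work described above can be sketched as a small Python helper. This is an illustrative example only: the prefix, bucket name, and day counts are placeholders, not values from the resume, and the boto3 call is shown commented out because it requires AWS credentials.

```python
# Hypothetical sketch of an S3 lifecycle policy like the one described above:
# transition logs to Glacier after 30 days and delete them after 365 days.

def build_log_lifecycle(prefix: str, glacier_after_days: int, expire_after_days: int) -> dict:
    """Build an S3 lifecycle configuration that archives and then expires logs."""
    return {
        "Rules": [
            {
                "ID": f"archive-then-expire-{prefix.strip('/')}",
                "Status": "Enabled",
                "Filter": {"Prefix": prefix},
                "Transitions": [
                    {"Days": glacier_after_days, "StorageClass": "GLACIER"}
                ],
                "Expiration": {"Days": expire_after_days},
            }
        ]
    }

config = build_log_lifecycle("app-logs/", 30, 365)

# Applying it would use boto3 (requires AWS credentials), e.g.:
# import boto3
# s3 = boto3.client("s3")
# s3.put_bucket_lifecycle_configuration(
#     Bucket="example-log-bucket", LifecycleConfiguration=config)
```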

TECHNICAL SKILLS:

Cloud Computing services: Amazon Web Services: EC2, S3, ELB, Auto Scaling, Glacier, storage lifecycle rules, Elastic Beanstalk, CloudFront, ElastiCache, RDS, VPC, EBS, Route 53, CloudWatch, CloudTrail, OpsWorks, CloudFormation, IAM & roles, SQS, SNS, CodeCommit, Redshift, DynamoDB, Lambda, CodeDeploy, EFS, EMR, AD.

Web/Application Servers: WebLogic, WebSphere, Apache Tomcat, JBOSS

Operating Systems: RedHat Linux, CentOS, Ubuntu, UNIX, AIX, Windows

Scripting Languages: Batch Script, Ruby, Python, Shell Script, PowerShell, Perl.

CI/CD Provisioning Tools: Git, Bitbucket, Jenkins, Ant, Ansible, Maven, Docker, Kubernetes.

Monitoring/Performance tools: Splunk, ELK Stack (Elasticsearch, Logstash, Kibana, Beats), Grafana.

Networking/Protocols: TCP/IP, HTTP/HTTPS, DHCP, LAN, SSL/TLS certificate verification.

Internet of Things (IoT) applications: Axeda, ThingWorx

PROFESSIONAL EXPERIENCE:

Confidential

AWS Cloud Engineer

Responsibilities:

  • Experience in designing and deploying AWS solutions using EC2, S3, EBS, Elastic Load Balancer (ELB), and Auto Scaling groups.
  • Responsible for Design and architecture of different Release Environments for new projects.
  • Worked on optimizing volumes and EC2 instances and created multiple VPC instances.
  • Wrote Maven and Ant scripts for application-layer modules.
  • Implemented build frameworks for new projects using Jenkins and Maven as build tools.
  • Set up and built various AWS infrastructure resources (VPC, EC2, S3, IAM, EBS, Security Groups, Auto Scaling, and RDS) in CloudFormation JSON templates.
  • Designed AWS CloudFormation templates to create custom-sized VPCs, subnets, and NAT to ensure successful deployment of web applications and database templates.
  • Implementing a Continuous Delivery framework using Jenkins, Chef, Maven & Nexus as tools.
  • Experience configuring S3 versioning and lifecycle policies to back up files and archive them in Glacier.
  • Utilized Amazon Glacier for archiving data.
  • Created alarms in the CloudWatch service to monitor server performance, CPU utilization, disk usage, etc.
  • Developed, deployed, and managed AWS Lambda functions triggered in response to events from various AWS sources, including logging, monitoring, and security-related events, and invoked on a schedule to take backups.
  • Deployed code using blue/green deployments with AWS CodeDeploy to reduce downtime due to application deployment; if something unexpected happens with the new (green) version, traffic can immediately be rolled back to the last (blue) version.
  • Used Terraform for infrastructure as code, modifying Terraform scripts as and when configuration changes happened.
  • Responsible for the operation, maintenance and integrity of a distributed networked Linux environment.
  • Wrote Chef cookbooks and recipes in Ruby to provision several pre-prod environments consisting of Cassandra DB installations, WebLogic domain creations, and several proprietary middleware installations.
  • Developed scripts for AWS orchestration.
  • Created Amazon Workspaces for employees.
  • Worked on a cloud-based service and software for managing connected products and machines and implementing Machine-to-Machine (M2M) and Internet of Things (IoT) applications such as Axeda iSupport.
  • Responsible for creating and deploying an Agent Gateway and Agent Connector in Axeda iSupport.
  • Worked with ThingWorx, a platform for the rapid development of applications for smart, connected sensors, devices, and products (the Internet of Things).
  • System monitoring with Nagios & Graphite.
  • Installed, configured and maintained web servers like HTTP Web Server, Apache Web Server and WebSphere Application Server on Red Hat Linux.
  • Business data analysis using Big Data tools like Splunk, ELK.
  • Experience in CI and CD with Jenkins.
  • Used Puppet server and workstation to manage and configure nodes.
  • Experience in writing Puppet manifests to automate configuration of a broad range of services.
  • Designed tool APIs and MapReduce job workflows using AWS EMR and S3.
  • Implemented Hadoop clusters to process big data pipelines using Amazon EMR.
  • Prepared projects, dashboards, reports and questions for all JIRA related services.
  • Generated scripts for effective integration of JIRA applications with other tools.
  • Defined release process and policy for projects early in the SDLC.
  • Branched and merged code lines in Git and resolved all conflicts raised during merges.
  • Designed highly available, cost-effective, and fault-tolerant systems using multiple EC2 instances, Auto Scaling, Elastic Load Balancing, and AMIs.
  • Highly skilled in the use of data center automation and configuration management tools such as Docker.
  • Perform Deployment of Release in various QA & UAT environments.
  • Responsible for installation and upgrade of patches and packages on RHEL 5/6 using RPM & YUM.
  • Supported build and release SCM efforts for different projects, e.g. branching, tagging, merging, etc.
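The CloudWatch alarm work in this role can be illustrated with a short Python sketch. The alarm name, instance ID, threshold, and SNS topic ARN below are all hypothetical, and the boto3 call is commented out because it needs live AWS credentials.

```python
# Hedged sketch of a CloudWatch CPU-utilization alarm like those described
# above: alert via SNS when average CPU stays above a threshold.

def build_cpu_alarm(instance_id: str, threshold_pct: float, topic_arn: str) -> dict:
    """Build keyword arguments for cloudwatch.put_metric_alarm()."""
    return {
        "AlarmName": f"high-cpu-{instance_id}",
        "Namespace": "AWS/EC2",
        "MetricName": "CPUUtilization",
        "Dimensions": [{"Name": "InstanceId", "Value": instance_id}],
        "Statistic": "Average",
        "Period": 300,                 # evaluate 5-minute averages
        "EvaluationPeriods": 2,        # require two consecutive breaches
        "Threshold": threshold_pct,
        "ComparisonOperator": "GreaterThanThreshold",
        "AlarmActions": [topic_arn],   # notify an SNS topic
    }

alarm = build_cpu_alarm(
    "i-0123456789abcdef0", 80.0,
    "arn:aws:sns:us-east-1:123456789012:ops-alerts",
)

# Creating the alarm would use boto3 (requires AWS credentials):
# import boto3
# boto3.client("cloudwatch").put_metric_alarm(**alarm)
```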

Environment: AWS, S3, EBS, Elastic Load Balancer (ELB), Auto Scaling groups, VPC, IAM, CloudWatch, Glacier, DynamoDB, ElastiCache, Directory Services, EMR (Elastic MapReduce), Route 53, Puppet, Jenkins, Maven, Subversion, Ant, Bash scripts, Git, Docker, JIRA, Chef, and Nexus in a Linux environment; OpenStack, Axeda, ThingWorx.

Confidential

AWS Engineer

Responsibilities:

  • Created Terraform scripts to move existing on-premises applications to cloud.
  • Had an extensive role in migrating on-premises mid-tier applications to AWS infrastructure (lift and shift).
  • Onboarded and migrated test and staging use cases for applications to the AWS cloud with public and private IP ranges to increase development productivity by reducing test-run times. Created monitors, alarms, and notifications for EC2 hosts using CloudWatch.
  • Implemented DNS through Route 53 on ELBs to achieve secure connections via HTTPS. Utilized Amazon Route 53 to manage DNS zones and assign public DNS names to Elastic Load Balancer IPs.
  • Involved in reviewing and assessing current infrastructure to be migrated to the AWS cloud platform. Created new servers in AWS using EC2 instances, configured security groups and Elastic IPs for the instances.
  • Led many critical on-premises data migrations to the AWS cloud, assisting with performance tuning and providing a successful path towards Redshift clusters and RDS DB engines.
  • Set up an Elastic Load Balancer to balance and distribute incoming traffic to multiple servers running on EC2 instances. Performed maintenance to ensure reliable and consistently available EC2 instances. Built DNS system in EC2 and managed all DNS related tasks.
  • Created ElastiCache for the database systems to ensure quick access to frequently requested data. Created backups of database systems using the S3, EBS, and RDS services of AWS.
  • Set up Route 53 to ensure traffic distribution among different AWS regions. Set up a content delivery system using AWS CloudFront to distribute content such as HTML and graphics files faster.
  • Created an Amazon VPC with a public-facing subnet for web servers with internet access, and backend databases and application servers in a private subnet with no internet access.
  • Experience with VPC peering for data transfer between VPCs.
  • Responsible for building out and improving the reliability and performance of cloud applications and cloud infrastructure deployed on Amazon Web Services.
  • Created and attached volumes to EC2 instances.
  • Provided highly durable and available data using the S3 data store, versioning, and lifecycle policies, and created AMIs of mission-critical production servers for backup.
  • Set up and built various AWS infrastructure resources (VPC, EC2, S3, IAM, EBS, Security Groups, Auto Scaling, and RDS) in CloudFormation JSON templates.
  • Built servers using AWS: imported volumes, launched EC2 and RDS instances, and created Security Groups, Auto Scaling, and load balancers (ELBs) in the defined Virtual Private Cloud.
  • Created new instances from the latest AMI with the same IP address and hostname.
  • Maintained CloudFormation templates, uploaded them to S3, and automatically deployed them into entire environments.
  • Implemented, supported, and maintained all network, firewall, storage, load balancer, operating system, and software components in Amazon's Elastic Compute Cloud.
  • Used Python scripts to store data in S3 and retrieve those files into Redshift via programmatic access with the AWS CLI.
  • Tested and configured AWS WorkSpaces (a Windows virtual desktop solution) for custom application requirements.
  • Managed Ansible Playbooks with Ansible modules, implemented CD automation using Ansible, managing existing servers and automation of build/configuration of new servers.
  • Worked on building serverless web pages using API Gateway and Lambda.
  • Managed Amazon Redshift clusters, including launching clusters and specifying the node type.
  • Security reference architecture spanned Security Groups, NACLs, IAM groups and custom roles, Key Management Service (KMS), CloudHSM, and Web Application Firewall (WAF).
  • Wrote various automation scripts to help developers interact with SQS and SNS, and performance-tuned various processes driven by SQS queues.
  • Configured and verified connections to RDS databases running the MySQL engine.
  • Solid experience with onsite and offshore model. Directed build and deployment teams remotely, technically and effectively.
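The S3-to-Redshift loading pattern mentioned above (staging data in S3, then pulling it into Redshift) typically hinges on a COPY command. A minimal Python sketch, with a placeholder table, bucket, and IAM role ARN (none of these are values from the resume):

```python
# Sketch of the S3-to-Redshift load pattern: data is staged in S3 and
# pulled into Redshift with a COPY command.

def build_copy_statement(table: str, s3_uri: str, iam_role_arn: str) -> str:
    """Build a Redshift COPY command that loads gzipped CSV files from S3."""
    return (
        f"COPY {table} "
        f"FROM '{s3_uri}' "
        f"IAM_ROLE '{iam_role_arn}' "
        "FORMAT AS CSV GZIP"
    )

sql = build_copy_statement(
    "events",
    "s3://example-staging-bucket/events/2019-01-01/",
    "arn:aws:iam::123456789012:role/redshift-load",
)

# The statement would then be executed against the cluster (e.g. via a
# PostgreSQL driver such as psycopg2), after uploading the files with
# boto3.client("s3").upload_file(...).
```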

Environment: EC2, Elastic IPs, CloudFormation, SQS, SNS, Elastic Load Balancer, S3, EBS, RDS, CloudWatch, Route 53, CloudFront, CloudTrail, Active Directory, Jenkins, NACL, Security Groups, AWS Config, AWS CLI, WAF, Terraform, Redshift, Python scripts, ELB, Auto Scaling, KMS, CloudHSM.

Confidential

Cloud Engineer

Responsibilities:

  • Designed and implemented VPC services to extend the customer's on-premises data center into the AWS cloud using AWS VPC, VPN, and Direct Connect services.
  • Implemented Amazon Web Services (AWS) provisioning and worked with AWS services such as EC2, S3, Glacier, ELB (load balancers), RDS, SNS, SWF, and EBS.
  • Configured S3 to host static web content, and used versioning and lifecycle policies to back up files and archive them in Amazon Glacier.
  • Created custom-sized VPCs, subnets, and NAT to ensure successful deployment of web applications and database templates.
  • Created an AWS RDS MySQL DB cluster and connected to the database through an Amazon RDS MySQL DB instance using the Amazon RDS console.
  • Deployed highly available applications using Elastic Load Balancers with EC2 Auto scaling groups.
  • Created monitoring alarms and notifications for EC2 hosts using CloudWatch and SNS.
  • Utilized CloudWatch to monitor resources such as EC2 CPU and memory utilization, and designed highly available applications across Availability Zones.
  • Enabled Continuous Delivery through Deployment into several environments of Test, QA, Stress and Production using Jenkins.
  • Worked on the team that developed web services using XML messages over REST.
  • Refined automation components with scripting and configuration management (Ansible).
  • Implemented Ansible to manage all existing servers and automate the build/configuration of new servers. All server types were fully defined in Ansible, so a newly built server could be up and ready for production within 30 minutes of OS installation.
  • Wrote Ansible playbooks with a Python SSH wrapper to manage configurations of AWS nodes, and tested playbooks on AWS instances using Python.
  • Ran Ansible scripts to provision dev servers.
  • Configured security and system settings in Jenkins, added multiple nodes to Jenkins, and configured SSH for continuous deployments.
  • Built the code using Maven and moved builds into Git.
  • Launched Java Applications on the servers like Tomcat.
  • Analyzed and tracked sprint performance and bugs using JIRA.
  • Implemented rapid-provisioning and lifecycle management for Ubuntu Linux using Amazon EC2 and custom Bash scripts.
  • Configured and monitored distributed and multi-platform servers using Nagios.
  • Used Git as the source code management tool: creating a local repo, cloning repos, adding, committing, and pushing changes, stashing changes for later, recovering files, branching, creating tags, viewing logs, etc.
  • Actively involved in architecture of DevOps platform and cloud solutions.
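The rapid EC2 provisioning described in this role (launching instances that bootstrap themselves via custom scripts) can be sketched in Python. The AMI ID, key name, security group, and user-data package are all illustrative placeholders, and the boto3 launch call is commented out since it requires AWS credentials.

```python
# Hedged sketch of rapid EC2 provisioning: build the arguments for
# ec2.run_instances() with a small cloud-init user-data bootstrap script.

USER_DATA = """#!/bin/bash
apt-get update -y
apt-get install -y nginx    # illustrative package, not from the resume
"""

def build_launch_request(ami_id: str, instance_type: str,
                         key_name: str, sg_id: str) -> dict:
    """Build keyword arguments for ec2.run_instances()."""
    return {
        "ImageId": ami_id,
        "InstanceType": instance_type,
        "KeyName": key_name,
        "SecurityGroupIds": [sg_id],
        "MinCount": 1,
        "MaxCount": 1,
        "UserData": USER_DATA,   # runs once at first boot
        "TagSpecifications": [
            {
                "ResourceType": "instance",
                "Tags": [{"Key": "Provisioned-By", "Value": "bootstrap-script"}],
            }
        ],
    }

request = build_launch_request("ami-00000000", "t2.micro", "ops-key", "sg-00000000")

# Launching would use boto3 (requires AWS credentials):
# import boto3
# boto3.client("ec2").run_instances(**request)
```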

Environment: Identity and Access Management (IAM), EC2, S3, Virtual Private Cloud (VPC), Security Groups, Auto Scaling groups, Elastic Load Balancer (ELB), Route 53, CloudWatch, Chef, Git, Jenkins, WebLogic Server, Unix/Linux, Shell Scripting

Confidential

Software Engineer

Responsibilities:

  • Developed and maintained applications and databases by evaluating client needs, analyzing requirements, and developing software systems.
  • Involved in programming using C and C++.
  • Confirmed program operation by conducting tests and modifying program sequences and/or code. Involved in business meetings with users to gather report requirements and translate them into reports.
  • Created reports on a biweekly and monthly basis and exported them in different formats (PDF and Excel) depending on user requirements.
  • Prepared release documentation for test, UAT (user acceptance testing), and pre-production environments.
  • Analyzed queries using statistics, and optimized and fine-tuned SQL using Explain Plan utilities for better performance.
  • Effectively involved in unit testing and deployed reports from the development server to testing and production environments.
  • Analyzed, developed, and evaluated issues and developed recommendations to resolve substantive problems affecting the effectiveness and efficiency of work operations.
  • Experienced with all phases of the Software Development Life Cycle (SDLC), involving systems analysis, design, development, and implementation.
  • Enthusiastic and project-focused team player with good communication and leadership skills and the ability to develop creative solutions for challenging client needs.
