
AWS Engineer Resume


Austin, TX

SUMMARY:

  • Over 7 years of experience as a System/AWS Engineer, with extensive experience hosting and managing servers on AWS and Windows.
  • Experience in deploying, managing, and operating scalable, highly available, and fault-tolerant systems on AWS, including Elastic Load Balancing and Auto Scaling.
  • Utilized CloudWatch to monitor resources such as EC2 (CPU and memory), Amazon RDS database services, and EBS volumes; to set alarms for notifications or automated actions; and to monitor logs for a better understanding and operation of the system.
  • Experience in designing and deploying AWS solutions using EC2, S3, EBS, Elastic Load Balancing (ELB), and Auto Scaling groups.
  • Worked under Waterfall, RAD, and Agile software development lifecycle methodologies.
  • Expert in designing parallel jobs using stages such as Join, Merge, Lookup, Remove Duplicates, Filter, Dataset, Lookup File Set, Complex Flat File, Modify, Aggregator, and XML.
  • Experienced in integrating various data sources (IBM DB2, SQL Server, PL/SQL, Oracle, Teradata, and XML) into the data staging area.
  • Excellent experience in troubleshooting DataStage jobs and addressing production issues such as performance tuning and enhancements.
  • Good working experience on operating systems such as UNIX and Windows.
  • Strong experience with the AWS IAM service: creating IAM users and groups, defining roles, policies, and identity providers, and using them to restrict access to specific services.
  • Created threshold-based alarms and trigger points in CloudWatch and monitored server performance, CPU utilization, and disk usage (see the sketch after this list).
  • Worked with Red Hat OpenShift Container Platform for Docker and Kubernetes. Used Kubernetes to manage containerized applications via nodes, ConfigMaps, selectors, and Services, and deployed application containers as Pods.
  • Created and maintained Chef recipes and cookbooks to simplify and expedite application deployment and mitigate user error.
  • Rewrote many Puppet modules to modern code-quality standards; used Docker to manage microservices for development and testing.
  • Worked on installing Docker using Docker Toolbox, creating custom Docker container images, tagging and pushing images, removing images, and managing Docker volumes.
  • Performed automation tasks on various Docker components such as Docker Hub, Docker Engine, Docker Machine, Docker Compose, and Docker Registry.
  • Automated application deployment in the cloud using Docker with the Elastic Container Service (ECS) scheduler.
  • Experience setting up databases in AWS using RDS, provisioning storage with S3 buckets, and configuring instance backups to S3.
  • Worked on Amazon Web Services (AWS) infrastructure with automation and configuration management tools such as Chef and Puppet.
  • Hands-on experience configuring and managing services such as AWS EC2 using available AMIs.
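
As an illustration of the threshold-based CloudWatch alarms mentioned above, the minimal sketch below uses boto3 to create a CPU alarm that notifies an SNS topic; the instance ID and topic ARN are hypothetical placeholders, not values from these projects.

    import boto3

    cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

    # Hypothetical values for illustration only
    INSTANCE_ID = "i-0123456789abcdef0"
    ALARM_TOPIC_ARN = "arn:aws:sns:us-east-1:123456789012:ops-alerts"

    # Alarm when average CPU utilization stays above 80% for two 5-minute periods
    cloudwatch.put_metric_alarm(
        AlarmName=f"high-cpu-{INSTANCE_ID}",
        Namespace="AWS/EC2",
        MetricName="CPUUtilization",
        Dimensions=[{"Name": "InstanceId", "Value": INSTANCE_ID}],
        Statistic="Average",
        Period=300,
        EvaluationPeriods=2,
        Threshold=80.0,
        ComparisonOperator="GreaterThanThreshold",
        AlarmActions=[ALARM_TOPIC_ARN],  # notify an SNS topic when the alarm fires
    )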

TECHNICAL SKILLS:

Operating Systems: RHEL/CentOS 5.x/6.x/7, Ubuntu/Debian/Fedora, Windows XP/2000/2003/2008

Languages: C, C++, Python, Ruby, Java/J2EE

CI Tools: Jenkins, Hudson, Bamboo, AnthillPro, Nexus

CM Tools: Chef, Puppet, Ansible

Databases: MySQL, MongoDB, SQL Server

Scripts: Shell, Ant, Batch, Perl, PowerShell, Groovy

Version Control Tools: Git, SVN, Bitbucket, GitHub

Web Technologies: Servlets, JDBC, JSP, HTML, JavaScript, XML

Web/App Servers: Apache, IIS, IHS, Tomcat, WebSphere Application Server, JBoss

RDBMS: Oracle, SQL Server, MySQL

Build Tools: Ant, Maven, Gradle, MSBuild

PROFESSIONAL EXPERIENCE:

Confidential, Austin, TX

AWS Engineer

Responsibilities:

  • Extensively involved in gathering business requirements, analysis, design, development, and testing.
  • Created SQS standard queues for reading messages and processing them through Python code.
  • Created success and error queues for posting messages and updating the status in a DynamoDB table (see the first sketch after this list).
  • Built CloudFormation templates (CFTs) in YAML and JSON to provision AWS services following the Infrastructure as Code paradigm.
  • Configured the New Relic Infrastructure agent on each system for resource monitoring and the New Relic APM agent on the application servers.
  • Created ECS clusters, task definitions, and Docker repositories with Auto Scaling groups.
  • Worked on S3 for reading secrets and environment files and for archiving messages.
  • Created an AWS Lambda function to extract data from a SAP database and post it to an S3 bucket on a schedule (every 4 hours) using a CloudWatch Events rule (see the second sketch after this list).
  • Worked on validating XML messages and converting them to JSON for further processing.
  • Worked on error handling in Python code and created a common format for handling various errors.
  • Created log collection with the ELKB stack (Elasticsearch, Logstash, Kibana, Filebeat); installed Filebeat on all cluster nodes to ship log data to Logstash and applied filters on the log data before sending it to Elasticsearch.
  • Created SNS topics to send notifications to subscribers as required.
  • Created Terraform configurations for building AWS SQS queues, ECS clusters, Auto Scaling groups, SNS topics, Lambda functions, DynamoDB tables, and CloudWatch Events rules.
  • Deployed code to AWS CodeCommit using Git commands (pull, fetch, push, and commit) from the CLI.
  • Experience using Docker and setting up ELK with Docker and Docker Compose; actively involved in Docker deployments using Kubernetes.
  • Involved in monitoring CloudWatch events and fixing performance issues raised during Python HTTP service scale-up.
  • Fixed a performance issue raised while reading data from an S3 bucket and processing it in Python by using a Python cache.
  • Using the Python unittest library, created various unit test cases for testing Python code and generating detailed reports.
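
First sketch: a minimal example of the kind of SQS consumer described above, which routes results to success/error queues and records status in DynamoDB. The queue URLs, table name, key schema, and process_message helper are hypothetical placeholders, not the original implementation.

    import json
    import boto3

    sqs = boto3.client("sqs")
    table = boto3.resource("dynamodb").Table("message-status")  # hypothetical table

    # Hypothetical queue URLs for illustration
    INPUT_Q = "https://sqs.us-east-1.amazonaws.com/123456789012/input"
    SUCCESS_Q = "https://sqs.us-east-1.amazonaws.com/123456789012/success"
    ERROR_Q = "https://sqs.us-east-1.amazonaws.com/123456789012/error"

    def process_message(body: dict) -> dict:
        """Placeholder for the real business logic."""
        return {"id": body["id"], "result": "ok"}

    def poll_once() -> None:
        resp = sqs.receive_message(QueueUrl=INPUT_Q, MaxNumberOfMessages=10, WaitTimeSeconds=20)
        for msg in resp.get("Messages", []):
            body = json.loads(msg["Body"])
            try:
                result = process_message(body)
                sqs.send_message(QueueUrl=SUCCESS_Q, MessageBody=json.dumps(result))
                status = "SUCCESS"
            except Exception as exc:  # route failures to the error queue
                sqs.send_message(
                    QueueUrl=ERROR_Q,
                    MessageBody=json.dumps({"id": body.get("id"), "error": str(exc)}),
                )
                status = "ERROR"
            # Record the outcome in DynamoDB, then remove the message from the input queue
            table.update_item(
                Key={"id": body.get("id", "unknown")},
                UpdateExpression="SET #s = :s",
                ExpressionAttributeNames={"#s": "status"},
                ExpressionAttributeValues={":s": status},
            )
            sqs.delete_message(QueueUrl=INPUT_Q, ReceiptHandle=msg["ReceiptHandle"])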
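
Second sketch: the shape of a scheduled Lambda handler that pulls data and writes it to S3 every 4 hours, assuming a CloudWatch Events rule with a rate(4 hours) expression as the trigger. The bucket name and fetch_from_sap helper are illustrative stand-ins for the actual SAP extraction.

    import json
    import datetime
    import boto3

    s3 = boto3.client("s3")
    BUCKET = "example-extract-bucket"  # hypothetical bucket name

    def fetch_from_sap() -> list:
        """Placeholder for the real SAP extraction call."""
        return [{"order": 1}, {"order": 2}]

    def handler(event, context):
        # Invoked every 4 hours by a CloudWatch Events rule: rate(4 hours)
        records = fetch_from_sap()
        key = "sap/extract-" + datetime.datetime.utcnow().strftime("%Y%m%dT%H%M%SZ") + ".json"
        s3.put_object(Bucket=BUCKET, Key=key, Body=json.dumps(records).encode("utf-8"))
        return {"written": key, "count": len(records)}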

Environment: AWS (SQS, ECS, EC2, ELB, S3, CloudWatch, Auto Scaling, Lambda, SNS, DynamoDB, CodeCommit, CodeBuild, CodePipeline), Git, Jira, AWS CLI, Unix/Linux, Python 3.6, shell scripting, Terraform, YAML, JSON, XML, SAP WSDL.

Confidential - Denver, CO

AWS Engineer / Support

Responsibilities:

  • Work closely with development teams to integrate their projects into the production AWS environment and ensure their ongoing support.
  • Automated the front-end platform into highly scalable, consistent, repeatable infrastructure with a high degree of automation using Chef, Puppet, and CloudFormation.
  • Created network architecture on AWS: VPC, subnets, Internet Gateway, route tables, and NAT setup.
  • Designed a high-availability environment for application and database servers on EC2 using ELB and Auto Scaling.
  • Used CloudWatch to monitor AWS cloud resources and applications deployed on AWS by creating alarms and enabling notifications.
  • Selected the appropriate AWS services based on data, compute, database, or security requirements.
  • Added project users to the AWS account with multi-factor authentication enabled and least-privilege permissions.
  • Used AWS Import/Export to accelerate moving large amounts of data into and out of AWS with portable storage devices for transport.
  • Performed S3 bucket creation and configured storage on S3 buckets, bucket policies, and IAM role-based policies (see the sketch after this list).
  • Performed APM monitoring and deployment using AppDynamics, CA Wily, and Splunk.
  • Automated the continuous process by writing code in shell and Python scripting languages.
  • Developed a system to monitor agile teams and performed log analysis on the ELK stack.
  • Handled installation, administration, and configuration of the ELK stack on AWS and performed log analysis.
  • Used S3 buckets with Elasticsearch for data backup and restore.
  • Configured an auto-scaling website platform to build secure, highly scalable, and flexible systems that handled expected and unexpected load bursts.
  • Performed server configuration management via Chef and Puppet.
  • Worked with the development team to create appropriate cloud solutions for client needs.
  • Developed a hybrid cloud delivery model allowing customers to choose the mix of public and private clouds that meets their individual needs.
  • Built servers on AWS: importing volumes, launching EC2 instances, and creating security groups, Auto Scaling groups, and load balancers (ELBs) in the defined Virtual Private Cloud.
  • Migrated applications to the AWS cloud.
  • Built and configured virtual data centers in the Amazon Web Services cloud to support Enterprise Data Warehouse hosting, including Virtual Private Cloud (VPC), public and private subnets, security groups, route tables, and Elastic Load Balancing.
  • Managed Amazon Redshift clusters, including launching clusters and specifying node types.
  • Used AWS Elastic Beanstalk for deploying and scaling web applications and services developed in Java.
  • Deployed the entire infrastructure using CloudFormation.
  • Designed AWS CloudFormation templates to create custom-sized VPCs, subnets, and NAT to ensure successful deployment of web application and database templates.
  • Implemented continuous integration using Jenkins and CloudFormation stack updates.
  • Used EC2 Container Service (ECS) to run Docker containers on a managed cluster of Amazon EC2 instances.
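
A minimal sketch of the S3 bucket setup with a role-restricted bucket policy described above, using boto3; the bucket name and role ARN are hypothetical placeholders chosen for illustration.

    import json
    import boto3

    s3 = boto3.client("s3", region_name="us-east-1")
    BUCKET = "example-app-artifacts"                              # hypothetical bucket name
    APP_ROLE_ARN = "arn:aws:iam::123456789012:role/app-server"    # hypothetical role

    # Create the bucket (us-east-1 needs no CreateBucketConfiguration)
    s3.create_bucket(Bucket=BUCKET)

    # Bucket policy restricting object reads to a single IAM role
    policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "AllowAppRoleRead",
            "Effect": "Allow",
            "Principal": {"AWS": APP_ROLE_ARN},
            "Action": ["s3:GetObject"],
            "Resource": f"arn:aws:s3:::{BUCKET}/*",
        }],
    }
    s3.put_bucket_policy(Bucket=BUCKET, Policy=json.dumps(policy))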

Environment: Amazon Web Services, IAM, S3, EBS, AWS SDK, CloudWatch, CloudFormation, Chef, Puppet, Apache HTTPD, Apache Tomcat, JSON, Shell

Confidential - Richardson, TX

DevOps AWS Cloud Engineer

Responsibilities:

  • Implemented large-scale cloud infrastructure using AWS services (S3, EC2, ELB, EBS, Route 53, VPC, Auto Scaling, etc.), deployment services (OpsWorks and CloudFormation), security services (IAM, CloudWatch, and CloudTrail), and additional services including Lambda, EMR, Redshift, ECS, Elastic Beanstalk, and X-Ray.
  • Managed Elastic Compute Cloud (EC2) instances utilizing Auto Scaling, Elastic Load Balancing, and Glacier for our dev and test environments, as well as infrastructure servers for Git and Chef.
  • Migrated the existing Linux environment to AWS (CentOS/RHEL) by creating and executing a migration plan on the scheduled timeline.
  • Worked on Terraform for managing infrastructure from terminal sessions and executed scripts for creating alarms and notifications for EC2 instances using CloudWatch.
  • Hands-on experience using Network Load Balancers, security groups, firewalls, and Route 53.
  • Deployed a JSON template to create a CloudFormation stack that included Amazon EC2, S3, RDS, Elastic Load Balancing, VPC, SQS, and other AWS services.
  • Created functions and assigned roles in AWS Lambda to run Python scripts, and used AWS Lambda with Java for event-driven processing; created Lambda jobs and configured roles using the AWS CLI (see the sketch after this list).
  • Used Identity and Access Management (IAM) to assign roles, create and manage AWS users and groups, and grant user permissions to AWS resources.
  • Worked with DevOps best practices using AWS, Elastic Beanstalk, and Docker with Kubernetes; migrated AWS infrastructure from Elastic Beanstalk to Docker and used Kubernetes for orchestration.
  • Wrote Chef recipes for various applications and deployed them in AWS using Terraform.
  • Worked with both hosted and on-premises Chef Enterprise; installed workstations, bootstrapped nodes, and wrote recipes and cookbooks and uploaded them to the Chef server.
  • Wrote Chef recipes for deployment on internal data center servers and later modified them to deploy directly to AWS EC2 instances.
  • Wrote Chef cookbooks and recipes in Ruby to provision several pre-prod environments consisting of Cassandra installations, WebLogic domain creation, and several proprietary middleware installations.
  • Wrote Terraform templates and Chef cookbooks and recipes and pushed them to the Chef server for configuring EC2 instances.
  • Used Ansible and Ansible Tower as configuration management tools to automate repetitive tasks, quickly deploy critical applications, and proactively manage change.
  • Implemented Ansible to manage existing servers and automate the build and configuration of new servers.
  • Performed workload analysis on the prod system based on APM/RUM tools and defined the workload to be simulated in the pre-production environment.
  • Wrote Ansible playbooks with a Python SSH wrapper to manage configurations of AWS nodes and tested playbooks on AWS instances using Python.
  • Deployed applications using a Jenkins server and troubleshot build and release job failures.
  • Implemented continuous deployment using Jenkins in a Linux environment; wrote Jenkins pipeline code for automation, maintained build jobs, and installed various marketplace plugins.
  • Created and maintained continuous build and continuous integration environments in Scrum and Agile projects; used Red Hat Satellite Server for deploying and managing instances.
  • Performed high-level troubleshooting to fix hardware and software issues on UNIX/VMware platforms.
  • Heavily utilized the LAMP stack (Linux, Apache, MySQL, PHP/Perl) to meet customer needs.
  • Worked closely with different projects on build and release SCM efforts such as branching, tagging, and merging.
  • Expert in the ticketing tools Jira and ServiceNow for tracking defects and changes for change management.
  • Ensured deployments followed a blue/green model, ensured business continuity and site reliability, and managed applications by providing 24x7 monitoring.
  • Participated in daily Scrum meetings to ensure accurate project forecasting and to discuss roadblocks.
  • Excellent client relations skills and the drive to complete tasks effectively and efficiently where customer service and technical skills are demanded.
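
The roles above were configured with the AWS CLI; as a rough illustration of the same Lambda creation and role assignment, the sketch below uses the boto3 equivalent. The function name, role ARN, handler, and deployment package are hypothetical placeholders.

    import boto3

    lambda_client = boto3.client("lambda", region_name="us-east-1")

    # Hypothetical execution role and deployment package for illustration
    ROLE_ARN = "arn:aws:iam::123456789012:role/lambda-exec-role"
    with open("function.zip", "rb") as fh:
        package = fh.read()

    lambda_client.create_function(
        FunctionName="event-processor",   # hypothetical function name
        Runtime="python3.9",
        Role=ROLE_ARN,                    # execution role assigned to the function
        Handler="handler.handler",        # module.function inside the zip
        Code={"ZipFile": package},
        Timeout=30,
    )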

Environment: AWS (EC2, S3, VPC, ELB, RDS, EBS, CloudFormation, CloudWatch, CloudTrail, Route 53, AMI, SQS, SNS, Lambda, CLI, CDN), Docker, Chef, Jenkins, Ant, Maven, Git, SVN, Cron, Jira, Bash, Shell, Perl, Python, Ruby, Tomcat, WebLogic, Auto Scaling, WebSphere, DNS, Bamboo, Nagios, Cassandra, RHEL 5.11/6.x

Confidential

Build & Release Engineer

Responsibilities:

  • Created branches and labels and performed merges in GitHub.
  • Worked on integrating Git with Jenkins, scheduling jobs using Poll SCM, and integrating code checkout.
  • Supported local system administrators in troubleshooting configuration management and network issues. Used Agile practices and Test-Driven Development (TDD) techniques to provide reliable, working software early and often.
  • Created and maintained continuous build and continuous integration environments in Scrum and Agile projects.
  • Wrote Ant and Maven scripts to automate the build process using Jenkins, Artifactory, and Gradle.
  • Worked on building and deploying Java code through Jenkins to automate builds and deployments.
  • Performed integration of code quality analysis using SonarQube with Jenkins.
  • Experienced in writing shell and Python scripts and maintaining virtualization environments such as VMware and Vagrant.
  • Troubleshot build issues in Jenkins and generated metrics on the master's performance and job usage (see the sketch after this list).
  • Integrated an Artifactory repository server to handle multiple external and internal project binaries.
  • Active member of the DevOps team; developed Puppet modules to automate IaaS on both Windows and Linux, including SQL Server, Patrol, New Relic, etc., with HPE/VMware as the cloud platform.
  • Worked on configuring baselines, branches, and merges in SVN, and on automation processes using shell and batch scripts.
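
One way to generate the Jenkins job metrics mentioned above is to query the Jenkins JSON API; the short Python sketch below does that with the requests library. The Jenkins URL, credentials, and job name are hypothetical placeholders, and the metrics shown (failure count, average duration) are only one plausible choice.

    import requests

    # Hypothetical Jenkins URL and credentials (user + API token)
    JENKINS_URL = "https://jenkins.example.com"
    AUTH = ("builduser", "api-token")

    def job_summary(job_name: str) -> dict:
        """Pull recent build results and durations for one job via the JSON API."""
        url = f"{JENKINS_URL}/job/{job_name}/api/json"
        params = {"tree": "builds[number,result,duration]{0,20}"}
        data = requests.get(url, auth=AUTH, params=params).json()
        builds = data.get("builds", [])
        failures = sum(1 for b in builds if b.get("result") == "FAILURE")
        avg_ms = sum(b.get("duration", 0) for b in builds) / max(len(builds), 1)
        return {"job": job_name, "builds": len(builds), "failures": failures, "avg_duration_ms": avg_ms}

    if __name__ == "__main__":
        print(job_summary("example-build-job"))  # hypothetical job name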

Confidential

LINUX Administrator

Responsibilities:

  • Responsible for installation, configuration, and maintenance of Apache Tomcat, WebSphere, and JBoss servers in a Sun/Linux environment.
  • Responsible for system administration, system planning, coordination, and group- and user-level management.
  • Installed Red Hat Enterprise Linux 7 partitions on VMware ESXi 6.0.
  • Created and cloned Linux virtual machines and templates using VMware Virtual Client 3.5 and migrated servers between ESX hosts.
  • Synchronized and migrated data and database objects between development and production environments.
  • Hands-on experience with Linux shell scripting and PowerShell scripting.
  • Linux firewall/security/unified threat management: iptables configuration, and Cyberoam UTM configuration and administration.
  • Experience troubleshooting all aspects of Unix/Red Hat Linux operating environments.
  • Configured FTP and Apache web servers on Red Hat Enterprise Linux.
  • Experience installing, configuring, and troubleshooting networking services and protocols such as NIS, LDAP, DNS, NFS, DHCP, FTP, SSH, and SAMBA.
  • Very good understanding of the 7-layer OSI model and network communication protocols such as TCP/IP, UDP, and HTTP.
  • Performed SAN migration on RHEL.
  • Configured automounting on NFS clients.
  • Experience working with HP OpenView client and server configuration.
  • Responsible for patching Linux servers; installed, removed, updated, and queried packages using RPM and YUM on Red Hat Linux.
  • Installed and supported SQL Server and Oracle databases.
  • Worked 24/7 production and application support for NAS and EMC SAN storage in a Linux environment.
  • Configured and enabled SSH and SCP services for access from remote terminals on Linux, Unix, and Solaris.
  • Used performance monitoring tools such as topas, vmstat, and netstat.
  • Routed connections between local and remote systems and servers.
  • Allocated space for users, added proxy entries on servers, and added route entries on Linux, Unix, and Solaris servers.

Environment: VMware, Red Hat Linux, JBoss server, virtual machines, Unix, shell scripting
