Sr. AWS Cloud Engineer Resume
TX
SUMMARY
- 9+ years of IT industry experience in Software Configuration Management, Change Management, build automation, Release Management, and DevOps in large and small software development organizations.
- Experience with build automation and Continuous Integration using tools such as ANT, Maven, and Jenkins.
- Experience in using Configuration Management tools like Puppet, Chef, Ansible.
- Developed Puppet modules to automate application installation and configuration management.
- Expertise in all aspects of Chef: server, workstations, nodes, clients, and components such as Ohai, push jobs, and Supermarket.
- Extensively worked on Vagrant- and Docker-based container deployments to create environments for development teams and to containerize environment delivery for releases.
- Experience in working on Docker Hub, creating Docker images and handling multiple images primarily for middleware installations and domain configuration.
- Knowledge of various Docker components such as Docker Hub, Machine, Compose, and Docker Registry.
- Maintained Jenkins masters with 80+ jobs for 10+ applications and supported several quarterly and project releases in parallel.
- Experienced in GitLab CI and Jenkins for continuous integration and end-to-end build and deployment automation.
- Expertise in using Nexus and Artifactory repository servers for Maven and Gradle builds.
- Ability to build deployment and build scripts and automated solutions using shell scripting.
- Experience in using monitoring tools like Icinga, Nagios.
- Experienced in branching, tagging, and maintaining versions across environments using Software Configuration Management tools such as Git/GitHub, Subversion (SVN), and Team Foundation Server (TFS) on Linux and Windows platforms.
- Experienced in migrating SVN repositories to Git.
- Ensured successful architecture and deployment of enterprise-grade PaaS solutions using Pivotal Cloud Foundry (PCF), as well as proper operation during initial application migration and new development.
- Worked on Git implementations containing multiple remote repositories for a single application.
- Experienced in handling AWS and OpenStack cloud environments.
- Working knowledge of Docker components such as Docker Engine, Hub, and Machine, as well as Kubernetes.
- Well experienced in setting up VPC peering between VPCs and remote VPN connections.
- Worked in all areas of Jenkins: setting up CI for new branches, build automation, plugin management, securing Jenkins, and setting up master/slave configurations.
- Analyzed and evaluated existing architecture in customer on-premises data centers, and designed, configured, and migrated complex network architectures to the AWS public cloud.
- Proficient in AWS services: EC2, IAM, S3, Elastic Beanstalk, VPC, ELB, RDS, EBS, and Route 53.
- Provisioned EC2 instances, with knowledge of all EC2 resource areas: instances, dedicated hosts, volumes, key pairs, Elastic IPs, snapshots, load balancers, and security groups.
- Hands-on experience with various Hadoop distributions: Cloudera (CDH 4/CDH 5), Hortonworks, MapR, IBM BigInsights, Apache Hadoop, and Amazon EMR.
- Experience with infrastructure-as-code scripting using tools such as Terraform, CloudFormation, and SaltStack.
- Managed virtual machines in Amazon using AWS EC2.
- Hands-on experience in AWS provisioning and good knowledge of AWS services such as EC2, S3, Glacier, ELB, and RDS.
- Good knowledge of Bash, Ruby, Python, and Perl scripting.
- Stay up to date with current web application and development technologies and services.
- Responsible for delivery of new environments with various middleware configurations for newly assigned projects and performed backfill activities to bring all environments up to current release cycles.
- Created AWS EBS volumes to store application files for use with mounted EC2 instances, and installed Pivotal Cloud Foundry (PCF) on EC2 to manage containers created by PCF (see the sketch following this summary).
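A minimal sketch of the kind of EBS provisioning and attachment mentioned above, written with boto3; the region, Availability Zone, volume size, and instance ID are hypothetical placeholders, not values from an actual engagement:

```python
import boto3

# Hypothetical region and instance ID, for illustration only.
ec2 = boto3.client("ec2", region_name="us-east-1")

# Create a 100 GiB gp2 volume in the same Availability Zone as the instance.
volume = ec2.create_volume(
    AvailabilityZone="us-east-1a",
    Size=100,
    VolumeType="gp2",
    TagSpecifications=[{
        "ResourceType": "volume",
        "Tags": [{"Key": "Purpose", "Value": "app-files"}],
    }],
)

# Wait until the volume is available, then attach it to the EC2 instance.
ec2.get_waiter("volume_available").wait(VolumeIds=[volume["VolumeId"]])
ec2.attach_volume(
    VolumeId=volume["VolumeId"],
    InstanceId="i-0123456789abcdef0",   # placeholder instance ID
    Device="/dev/sdf",
)
```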
TECHNICAL SKILLS
Cloud: Amazon Web Services, GCP, Docker, Vagrant, Puppet, Chef, Kubernetes, Terraform, and Ansible.
Databases: MySQL, SQL Server, RDS, DynamoDB
Languages: OpenCL, C, PHP, Python, Java, C++, HTML, CSS, XML, JavaScript, Node.js
Operating Systems: UNIX, Ubuntu/Linux, Red Hat 5.x/6.x/7.x, Windows Server
Server services: DHCP, DNS, Active Directory, FTP, Apache, WebLogic
Tools/Methodologies: AWS CLI, PuTTY, MS Project, SQL Profiler, Git, SVN
Protocols: TCP/IP, HTTPS
Network Security: Firewall & NAT, MS ISA server, IPTables
Remote Management: RDP, Symantec PC Anywhere, VNC
PROFESSIONAL EXPERIENCE
Confidential, TX
Sr. AWS Cloud Engineer
Responsibilities:
- Worked on migrating monolithic applications to microservices, adopting a CI/CD/CO framework.
- Developed CI/CD/CO pipeline libraries to deploy Helm charts to AWS EKS using Fargate profiles.
- Automated infrastructure as code with Terraform (GCP, AWS).
- Created Terraform templates to provision AWS resources such as EKS clusters and namespace configurations, KMS, S3, Aurora DB, and SQS, and synced them with CI/CD pipelines.
- Developed multiple CI/CD and CO templates for development teams to adopt for their pipeline enablement.
- Implemented PEM (acquired from IBM), including architecting the project in AWS/GCP from scratch.
- Used AWS EMR to transform and move large amounts of data into and out of other AWS data stores and databases, such as Amazon Simple Storage Service (S3) and Amazon DynamoDB.
- Developed multiple automation scripts in Python and Bash.
- Developed GitLab CI/CD pipelines using pipeline as code.
- Developed several Terraform modules to enable continuous observability with Datadog and Splunk.
- Developed multiple Helm charts to deploy applications from Docker images into Kubernetes environments.
- Performed end-to-end architecture and implementation assessments of AWS services such as Amazon EMR, Redshift, and S3.
- Enforced Docker Content Trust to improve security in the CD pipelines.
- Followed the twelve-factor app methodology for better application delivery.
- Developed highly reliable, scalable, and robust DevOps pipelines with failure recovery.
- Developed multiple Helm templates and used Confidential's corporate Vault to store and retrieve secrets by creating policies and service roles.
- Ran Apache Hadoop, CDH, and MapR distributions, dubbed Elastic MapReduce (EMR), on EC2.
- Developed Python scripts to enforce role-based access control when promoting or deploying artifacts into environments (see the sketch after this list).
- Experienced with container-based deployments using Docker, working with Docker images, Docker Hub, Docker registries, and Kubernetes.
- Configured Rancher Kubernetes cluster for the teams to use and deploy applications into the cluster.
- Configured several virtual and remote repositories in JFrog Artifactory for storing artifacts and pulling them into CI/CD pipelines for application deployment.
- Developed automation scripts to scan source code and artifacts with Veracode and SonarQube for vulnerabilities.
- Maintained Hadoop clusters on AWS EMR.
- Developed automation scripts to deploy artifacts and promote them through multiple environments based on environment lifecycles.
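A minimal, hypothetical sketch of the kind of role-based artifact promotion script described above. The role map, repository names, environment variables, and REST endpoint are illustrative assumptions, not the actual implementation:

```python
import os
import sys

import requests  # assumed available in the CI environment

# Hypothetical role map: which CI roles may promote into which environments.
PROMOTION_ROLES = {
    "dev": {"developer", "release-engineer"},
    "qa": {"release-engineer"},
    "prod": {"release-manager"},
}

ARTIFACTORY_URL = "https://artifactory.example.com/artifactory"  # placeholder


def can_promote(role: str, target_env: str) -> bool:
    """Return True if the given role is allowed to promote into target_env."""
    return role in PROMOTION_ROLES.get(target_env, set())


def promote(artifact_path: str, target_env: str) -> None:
    """Copy an artifact into the repository for the target environment."""
    role = os.environ.get("CI_ROLE", "")
    if not can_promote(role, target_env):
        sys.exit(f"Role '{role}' is not allowed to promote to {target_env}")

    # Artifactory copy REST API; repository names here are placeholders.
    src = f"{ARTIFACTORY_URL}/api/copy/libs-dev-local/{artifact_path}"
    dst = f"libs-{target_env}-local/{artifact_path}"
    resp = requests.post(
        src,
        params={"to": f"/{dst}"},
        auth=(os.environ["ART_USER"], os.environ["ART_TOKEN"]),
    )
    resp.raise_for_status()


if __name__ == "__main__":
    promote(sys.argv[1], sys.argv[2])
```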
Confidential
Sr. Cloud DevOps Engineer
Responsibilities:
- Implemented scalable, secure, disaster-recovery-ready cloud architecture on Amazon Web Services.
- Involved in deploying multi-tier applications utilizing AWS stack (EC2, Route53, S3, RDS, DynamoDB, SNS, SQS, IAM) focusing on fault tolerance and auto-scaling.
- Managed EC2 instances using launch configurations, Auto Scaling, and Elastic Load Balancing; automated infrastructure provisioning using CloudFormation and Ansible templates; and created CloudWatch alarms for monitoring.
- Designed and worked with the team to implement the ELK (Elasticsearch, Logstash, and Kibana) stack on AWS.
- In addition to AWS, also worked on GCP and Azure to implement the same for other applications.
- Managed storage in AWS using Elastic Block Store and S3; created volumes and configured snapshots.
- Implemented a serverless microservice architecture using API Gateway, Lambda, and DynamoDB.
- Deployed AWS Lambda code from Amazon S3 buckets; created a Lambda deployment function and configured it to receive events from the S3 bucket (see the Lambda sketch after this list).
- Used AWS Elastic Beanstalk for deploying and scaling web applications and services developed with PHP, Node.js, Python, Ruby, and Docker on familiar servers such as Apache and IIS.
- Experience designing, architecting, and implementing scalable cloud-based web applications using AWS and GCP.
- Used Boto and Fabric for launching and deploying instances in AWS.
- Set up an Elasticsearch cluster using Terraform scripts to block spam and phishing attacks.
- Used Terraform in AWS Virtual Private Cloud to automatically set up and modify settings by interfacing with the control layer.
- Handled various platforms such as Linux, Windows, and GCP for automation at the same time.
- Experience migrating legacy applications to the GCP platform.
- Responsible for deploying artifacts to the GCP platform using Packer.
- Responsible for managing GCP services such as Compute Engine, App Engine, Cloud Storage, VPC, Load Balancing, BigQuery, firewalls, and Stackdriver.
- Responsible for managing Docker orchestration for transferring data from the store database to the Redis cache server.
- Worked on Terraform for provisioning environments on the GCP platform.
- Experience with Unix servers for Confidential's corporate systems, Windows servers for Confidential's stores, and the GCP platform for SNB and CNC to support the TVS application.
- Developed AWS infrastructure-as-code templates using Terraform to build staging and production environments.
- Installed, configured, deployed, administered, maintained, tuned, and upgraded databases on the Amazon Web Services platform; databases currently reside in Redshift and Oracle. Configured database- and table-level encryption of data at rest, performed data type conversion from Oracle to Redshift, and created database objects in AWS Redshift.
- Installed and configured SQL clients such as Agility Workbench and SQL Workbench; loaded data from S3 into Redshift using the AWS COPY command and followed AWS best practices to convert data types from Oracle to Redshift.
- Worked on Container Platform for Docker and Kubernetes.
- Used Kubernetes to manage containerized applications using nodes, ConfigMaps, selectors, and Services, and deployed application containers as Pods.
- Deployed CoreOS Kubernetes clusters to manage Docker containers in the production environment with lightweight Docker images as base files.
- Created Docker images using Dockerfiles; worked on Docker container snapshots, removing images, and managing Docker volumes; and implemented a Docker automation solution for the Continuous Integration/Continuous Delivery model.
- Worked on creating Docker containers and Docker consoles for managing the application lifecycle and set up automated builds on Docker Hub.
- Responsible for maintaining Git/SVN repositories and access control strategies.
- Coordinated and assisted developers with establishing and applying appropriate branching and labeling/naming conventions using Git source control.
- Built scripts using Maven build tools in Jenkins to move artifacts from one environment to other environments.
- Maintained build-related shell scripts for Maven builds; created and modified build configuration files including pom.xml.
- Used ANT and Maven as build tools on Java projects to produce build artifacts from source code.
- Set up Jenkins master/slave architecture to offer Jenkins pipelines as a service.
- Integrated Gradle builds into Jenkins and configured Git parameterized builds; also installed many custom plugins along with ANT and Maven plugins.
- Used various Jenkins plugins: the Global Build Stats plugin; the Job Generator plugin to help developers create new jobs; the Hudson Post-Build Task plugin to publish artifacts to repositories once a build succeeds and perform other tasks depending on build output; and the Amazon EC2 plugin to create slaves on EC2 servers.
- Integrated ANT/Nexus, Jenkins, and UrbanCode Deploy with Patterns/Release, Git, Confluence, Jira, and Cloud Foundry.
- Managed configurations of multiple servers using Ansible.
- Worked with Ansible Tower to manage multiple nodes and inventory for different environments, and developed Python modules for Ansible customizations (see the module sketch after this list).
- Automated various infrastructure activities such as continuous deployment, application server setup, and stack monitoring using Ansible playbooks.
- Worked on creating inventories, job templates, and scheduled jobs using Ansible Tower and writing Python modules for Ansible customizations.
- Wrote Ansible playbooks with Python SSH as the wrapper to manage configurations of AWS nodes and tested playbooks on AWS instances using Python.
- Ran Ansible scripts to provision dev servers and was responsible for writing/modifying scripts using Bash shell.
- Backed up and restored MongoDB databases using LVM snapshots and Ops Manager backups; migrated MongoDB instances from MMAPv1 to the WiredTiger storage engine.
- Established Chef best-practice approaches to systems deployment with tools such as Vagrant, Berkshelf, and Test Kitchen, treating each Chef cookbook as an independently version-controlled unit of software deployment.
- Developed Chef cookbooks, recipes, and resources in Ruby; managed run lists and Chef client nodes; and uploaded cookbooks to the Chef server.
- Implemented monitoring and logging of different application logs using ELK and Nagios.
- Used JIRA as a ticketing tool to track issues related to Dockerization of legacy apps and implemented strategies to reduce common problems.
- Installed, monitored, and configured applications on Nginx and Apache Tomcat servers, established database connectivity, and troubleshot issues on the fly.
- Scripted in multiple languages on UNIX, Linux, and Windows, including Bash and Python.
- Worked on group/user administration, startup and shutdown scripts, crontabs, file system maintenance, backup scripts and automation, and package management.
- Resolved system issues and inconsistencies in coordination with quality assurance and engineering teams.
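A minimal sketch of a Lambda handler receiving S3 object-created events, as described in the Lambda bullet above; the processing logic is a placeholder (it just reads each new object and logs its size):

```python
import json
import urllib.parse

import boto3

s3 = boto3.client("s3")


def lambda_handler(event, context):
    """Triggered by S3 object-created events; reads and logs each new object."""
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])

        # Placeholder processing: fetch the object and report its size.
        obj = s3.get_object(Bucket=bucket, Key=key)
        print(json.dumps({"bucket": bucket, "key": key, "size": obj["ContentLength"]}))

    return {"statusCode": 200}
```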
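And a minimal, hypothetical custom Ansible module in Python of the sort mentioned in the Ansible Tower bullets above; the module's purpose (ensuring a line exists in a config file) and its parameters are illustrative only:

```python
#!/usr/bin/python
# Hypothetical custom Ansible module: ensure a given line exists in a file.
from ansible.module_utils.basic import AnsibleModule


def main():
    module = AnsibleModule(
        argument_spec=dict(
            path=dict(type="str", required=True),
            line=dict(type="str", required=True),
        ),
        supports_check_mode=True,
    )

    path, line = module.params["path"], module.params["line"]
    try:
        with open(path) as f:
            present = any(existing.rstrip("\n") == line for existing in f)
    except FileNotFoundError:
        present = False

    changed = not present
    if changed and not module.check_mode:
        # Append the missing line only when not running in check mode.
        with open(path, "a") as f:
            f.write(line + "\n")

    module.exit_json(changed=changed, path=path)


if __name__ == "__main__":
    main()
```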
Environment: AWS (EC2, S3, VPC, ELB, RDS, EBS, CloudFormation, CloudWatch, CloudTrail, Route 53, AMI, SQS, SNS, Lambda, CLI, CDN), Azure, GCP, Docker, Chef, Jenkins, ANT, Maven, Git, SVN, Cron, Jira, Bash, Shell, Perl, Python, Ruby, Tomcat, WebLogic, Auto Scaling, Route 53, DNS, Nagios, RHEL 6.8/7.x.
Confidential, IN
AWS Cloud Engineer
Responsibilities:
- Experience in automation and continuous integration processes with Jenkins, Chef.
- Managed code repositories, code merges, and quality checks with various tools, especially Git and Nexus.
- Architected Development, Test, Integration, and Production AWS environments.
- Worked with AWS EC2, EBS, Trusted Advisor, S3, CloudWatch, CloudFront, IAM, security groups, and Auto Scaling.
- Created and updated Auto Scaling and CloudWatch monitoring via the AWS CLI.
- Solid understanding of Linux OS, including security, compilation, and installation of third-party software and networking.
- Performed continuous integration and automated deployment and management using Jenkins, Chef, Maven, Ant, and comparable tools.
- Set up and built AWS infrastructure resources (VPC, EC2, S3, IAM, EBS, security groups, Auto Scaling, and RDS) in CloudFormation JSON templates.
- Experience with web deployment technology, specifically Linux/Nginx/Apache/Tomcat.
- Redesigned infrastructure for high availability using multiple AWS availability zones.
- Managed Development, Acceptance, Integration, and Production AWS endpoints.
- Responsible for mentoring, cross-resource platform standardization of web stack technology and development, and implementation of policies and procedures.
- Experience analyzing and monitoring performance bottlenecks and key metrics to optimize software and system performance.
- Configured Route 53.
- Created Python scripts to fully automate AWS services, including web servers, ELB, CloudFront distributions, databases, EC2 and database security groups, S3 buckets, and application configuration; these scripts create stacks, single servers, or join web servers to stacks (see the sketch after this list).
- Experience running LAMP (Linux, Apache, MySQL, and PHP) systems in an agile, quick-scaling cloud environment.
- Dynamically added and removed servers from the AWS production environment.
- Automated backups with shell scripts on Linux to transfer data to S3 buckets.
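A minimal sketch of the kind of boto3 automation described in the Python-scripts bullet above, here creating a CloudFormation stack; the region, stack name, template URL, and parameters are placeholders, not the original scripts:

```python
import boto3

# Placeholder region, stack name, and template URL, for illustration only.
cfn = boto3.client("cloudformation", region_name="us-east-1")

response = cfn.create_stack(
    StackName="web-stack-demo",
    TemplateURL="https://s3.amazonaws.com/example-bucket/web-stack.json",
    Parameters=[
        {"ParameterKey": "InstanceType", "ParameterValue": "t3.micro"},
        {"ParameterKey": "KeyName", "ParameterValue": "demo-keypair"},
    ],
    Capabilities=["CAPABILITY_IAM"],  # required if the template creates IAM roles
)

# Block until the stack finishes creating, then print its ID.
cfn.get_waiter("stack_create_complete").wait(StackName="web-stack-demo")
print("Created stack:", response["StackId"])
```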
Environment: AWS Cloud, RHEL 6.x, Solaris and Windows, Chef, Shell, Python, AWS EC2, WLST, Tomcat 7.x, ScienceLogic, Zabbix, Jira, PuTTY, Jenkins, Unix, Linux.
Confidential
AWS Cloud Engineer
Responsibilities:
- Implemented and maintained monitoring and alerting of production and corporate servers (such as EC2) and storage (such as S3 buckets) using AWS CloudWatch (see the alarm sketch after this list).
- Defined dependencies and plugins in Maven pom.xml for various activities and integrated Maven with Git to manage and deploy project-related tags.
- Configured local Maven repositories and multi-component Ant projects with Nexus repositories and scheduled projects in Jenkins for continuous integration.
- Integrated Subversion (SVN) into Jenkins to automate the code check-out process; configured the SonarQube code quality tool and integrated it with Jenkins.
- Software Build and Deployment: Performed regular software release build and deployment based on defined process and procedure.
- Designed highly available, cost effective and fault tolerant systems using multiple EC2 instances.
- Developed and scheduled Bash shell scripts for various activities (deployed environment verification, running database scripts, file manipulation, and Subversion (SVN) operations).
- Created Shell scripts for automation of build and release process.
- Implemented Chef recipes for deploying builds on internal data center servers.
- Reused and modified the same Chef recipes to deploy directly onto Amazon EC2 instances.
- Wrote wrapper scripts to automate deployment of cookbooks to nodes and running the Chef client on them in a Chef Solo environment.
- Automated infrastructure in AWS with Chef and Ruby; created EC2 instances and VPC networks and assigned roles and permissions via IAM key management.
- Implemented rapid-provisioning and life-cycle management for Ubuntu Linux using Amazon EC2, Chef, and custom Ruby/Bash scripts.
- Bootstrapped instances using Chef and integrated them with Auto Scaling.
- Designed and implemented Chef, including internal best practices, cookbooks, and an automated cookbook CI/CD system; wrote multiple cookbooks in Chef.
- Developed Chef modules for installation and auto-healing of various CI/CD tools such as Jenkins, MSSQL, and Nexus; these modules are designed to work on both Windows and Linux platforms.
- Expert in installing and configuring Continuous Integration tools such as Bamboo, Build Forge, Cruise Control and Hudson for build and deployment automation.
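A minimal sketch of the kind of CloudWatch alerting described in the first bullet of this list, again with boto3; the instance ID, thresholds, and SNS topic ARN are hypothetical placeholders:

```python
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

# Alarm when average CPU on a (placeholder) instance exceeds 80% for 10 minutes.
cloudwatch.put_metric_alarm(
    AlarmName="ec2-high-cpu-demo",
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],
    Statistic="Average",
    Period=300,
    EvaluationPeriods=2,
    Threshold=80.0,
    ComparisonOperator="GreaterThanThreshold",
    AlarmActions=["arn:aws:sns:us-east-1:111122223333:ops-alerts"],  # placeholder topic
)
```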
Environment: SVN (Subversion), Anthill Pro, ANT, Maven, Chef, DevOps, Jenkins, ClearCase, MS Build, Unix, Linux, Perl, Bash, Ruby, CruiseControl, AWS, SonarQube, SharePoint, Bamboo, Hudson, JIRA, Shell Script, WebSphere.