Sr AWS/DevOps Engineer Resume
San Jose, CA
SUMMARY
- Around 8 years of experience working as a DevOps Engineer, with strong knowledge of the build and release management process, including end-to-end code configuration, building binaries, and deployment of artifacts.
- Experienced in automating, configuring, and deploying instances on AWS and Azure environments and in data centers; also familiar with EC2, CloudWatch, CloudFormation, and managing security groups on AWS.
- Configuration management and source code repository management using tools like Git and SVN.
- Continuous integration and automated deployment and management using Jenkins, Chef, Maven, Ant, Docker, Ansible, or comparable tools.
- Extensively worked on CI/CD pipelines for code deployment, engaging different tools like Git, Jenkins, and CodePipeline in the process, right from developer code check-in to production deployment.
- Experience working with version control tools like GitHub (Git) and Subversion (SVN) and software build tools like Apache Maven and Apache Ant.
- Strong foundational knowledge of the Software Development Life Cycle (SDLC). In-depth knowledge of Agile, Waterfall, and Scrum methodologies.
- Worked on optimization of server resources, Amazon EC2 instances, and website security; web services/REST, Amazon AWS, Chef, and Puppet.
- Expertise in DevOps, configuration management, cloud infrastructure, and automation, including Amazon Web Services (AWS), Docker, Jenkins, Puppet, GitHub, Terraform, and Nagios.
- Implemented, supported and maintained all network, firewall, storage, load balancers, operating systems, and software in Amazon's Elastic Compute Cloud.
- Performed system monitoring and maintained logs using CloudWatch.
- Experience working on several Docker components like Docker Engine, Hub, Machine, Compose, and Docker Registry.
- Configured and installed servers in different environments such as Windows, Linux, CentOS, and Ubuntu.
- Strong analytical, diagnostics, troubleshooting skills to consistently deliver productive technological solutions.
TECHNICAL SKILLS
Cloud Computing: Amazon Web Services EC2, IAM, Elastic Beanstalk, Elastic Load Balancer (ELB), RDS (MySQL), S3, Glacier, Route 53, SES, VPC, Monitoring
Configuration Management: Ant, Maven, Git, SVN (Subversion), Jenkins, Chef, Ansible, Sonar, Nexus.
Tools/Web Servers: WebSphere Application Server 3.5/4.0, Netscape, MQSeries, WebLogic Server, Jira, JBoss, Apache Tomcat server
Scripting/Languages: Shell scripting, Python, Ruby, and Perl scripting.
Database: MySQL, DB2
Networking/ Protocols: DNS, TCP/IP, FTP, HTTPS, SSH, SFTP, SCP, SSL, ARP and DHCP
Operating Systems: Sun Solaris 7/8/9/10, Linux (Red Hat 5.x/6.x, SUSE Linux 10), AIX, VMware ESX, Windows NT/2000/2003/2012, CentOS, Ubuntu.
PROFESSIONAL EXPERIENCE
Confidential, San Jose, CA
Sr AWS/DevOps Engineer
Responsibilities:
- Proficient in AWS services like VPC, EC2, S3, ELB, Auto Scaling Groups, EBS, RDS, IAM, CloudFormation, Route 53, CloudWatch, CloudFront, API Gateway and CloudTrail.
- Implemented Amazon Web Services solutions to support both production and non-production workloads for the client.
- Developed and maintained automation and orchestration software and scripts to integrate with all underlying public and private cloud technologies.
- Wrote templates for AWS infrastructure as code using Terraform to build staging and production environments.
- Implemented domain name service (DNS) through Infoblox to have highly available and scalable applications.
- Used security groups, network ACL's, internet gateways and route tables to ensure a secure zone for the organization in AWS public cloud.
- Created NAT gateways and instances to allow communication from the private instances to the internet through bastion hosts.
- Defined all server types in Ansible, so that a newly built server could be up and ready for production within 30 minutes of OS installation.
- Experience in automating the image build and test process using tools like Packer and Jenkins.
- Configured S3 buckets with various lifecycle policies to archive the infrequently accessed data to storage classes based on requirement.
- Designed and implemented a 17a-4 compliant archival solution for the client to store electronic records in WORM format.
- Implemented fully automated server build, management, monitoring and deployment solutions spanning multiple platforms, tools and technologies.
- Designed, implemented and maintained Continuous Integration (Jenkins) and Delivery environments.
- Deployed and configured Git repositories with branching, tagging, and notifications.
- Developed custom Integrations with Splunk and Amazon web services.
- Experienced in integrating cloud services into DevOps framework (GitHub, Jenkins, Ansible, Splunk) which enables API request to provision and configure infrastructure through infrastructure as code capability.
- Maintained appropriate controls and documentation to ensure compliance of audit requirements and qualifications.
- Engaged vendor support (AWS) constantly for effective solutions according to business requirement.
- As part of the Disaster Recovery Plan (DRP), developed and implemented a server-based replication solution for the client, with AWS used as the DR target for Zerto Virtual Replication.
- Gained expertise in virtual firewalls, AWS Direct Connect, VPN tunnels, and DNS load balancing, with a general understanding of the need for and use of multi-tier architectures, load balancers, caching, web servers, application servers, databases, and networking.
- Involved in scrum meetings, product backlog and other scrum activities and artifacts in collaboration with the team.
- Performed capacity and performance management analysis.
Environment: Linux, IBM Mainframes, Artifactory, Docker, Ansible, Jenkins, Maven, Chef, GitHub, AWS, Shell and Ruby scripts, Apache Tomcat.
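The S3 lifecycle policies mentioned above, which archive infrequently accessed data to cheaper storage classes, can be sketched in Python in the shape boto3's `put_bucket_lifecycle_configuration` expects. The prefix and transition thresholds here are illustrative assumptions, not values from an actual deployment:

```python
# Minimal sketch of an S3 lifecycle configuration, assuming a log-data
# prefix and typical transition windows (30 days to Standard-IA, 90 to
# Glacier). A real bucket would have its own requirements.

def build_lifecycle_config(prefix="logs/", ia_days=30, glacier_days=90):
    """Build a lifecycle rule that tiers objects to cheaper storage."""
    return {
        "Rules": [
            {
                "ID": f"archive-{prefix.rstrip('/')}",
                "Filter": {"Prefix": prefix},
                "Status": "Enabled",
                "Transitions": [
                    # Move to Standard-IA once access becomes infrequent...
                    {"Days": ia_days, "StorageClass": "STANDARD_IA"},
                    # ...then archive to Glacier for long-term retention.
                    {"Days": glacier_days, "StorageClass": "GLACIER"},
                ],
            }
        ]
    }

config = build_lifecycle_config()
# Applying it would look like:
#   boto3.client("s3").put_bucket_lifecycle_configuration(
#       Bucket="my-bucket", LifecycleConfiguration=config)
```

Keeping the rule as a plain dict also makes it easy to version-control alongside Terraform templates and unit-test before it is applied.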
Confidential, Houston, TX
DevOps Engineer
Responsibilities:
- Experience in setting up infrastructure using AWS services including ELB, EC2, Elastic Container Service (ECS), Auto Scaling, S3, IAM, VPC, RDS, Redshift, DynamoDB, CloudTrail, CloudWatch, ElastiCache, Lambda, SNS, Glacier, CloudFormation, SQS, EFS, and Storage Gateway.
- Experience in various Linux platforms (Red Hat, Ubuntu, Centos) providing YUM and RPM package installations and patches.
- Extensive experience in the areas of build/release/deployment/operations, with in-depth knowledge of SCM principles and best practices in Agile and Scrum operations.
- Gained good working experience in a DevOps environment, working on various technologies/applications like Puppet, Chef, Git, SVN, Jenkins, Docker, AWS, and Maven.
- Experienced in installation, configuration, tuning, security, backup, recovery and upgrades of Linux (Red hat, CentOS, Ubuntu).
- Utilized build tools such as Maven, ANT for creating jar, war and ear files.
- Proficient with DevOps release tools such as Chef (creating and maintaining recipes and cookbooks), Puppet (Puppet Master and Puppet agents), Ansible (playbooks with Python), and AWS OpsWorks.
- Planned and executed releases from the initial stages of QA to a production environment. Managed code branches from development teams for current and future releases.
- Automated resource creation process using Python, Bash, and JSON scripts utilizing bootstrap processes.
- Utilized Nagios in a monitoring environment to identify and resolve infrastructure problems prior to critical processes being affected.
- Created Custom Puppet modules for bootstrapping new servers with a required application on dependencies and packages.
- Created and built a pipeline for application versions, using Jenkins Continuous Integration.
- Automated application deployment in the cloud using Docker technology and using Elastic Container Service scheduler.
- Created and managed a Docker deployment pipeline for custom application images in the cloud using Jenkins.
- Experienced with container-based deployments using Docker, working with Docker images, Docker Hub and Docker registries and Kubernetes.
- Experience in writing Python scripts to automate AWS services, including web servers, ELB, CloudFront distributions, databases, EC2, database security groups, S3 buckets, IAM, and SNS.
- Created and monitored events using CloudBerry Explorer and the s3cmd CLI.
- Built and customized Amazon Machine Images (AMIs) and deployed these customized images based on requirements.
- Configured and migrated applications to AWS Route 53, providing traffic management and high application availability.
- Created Snapshots and Amazon Machine Images (AMIs) of the instances for backup and creating clone instances.
- Experienced in creating and managing User Accounts, Security Rights, Disk Space, Quotas and Process Monitoring in Red Hat Enterprise Linux.
- Experienced in creating the company's DevOps strategy in a mixed environment of Linux (RHEL, CentOS) servers, along with creating and implementing a cloud strategy based on Amazon Web Services.
- Worked on version control tools like Git and SVN.
- Familiarity with NoSQL technologies, MongoDB, Redis.
- Working knowledge/exposure in Apache Tomcat, WebLogic, and WebSphere.
- Knowledge on hosting and deploying applications by using WebSphere Application Servers.
- Created User Administration and Hardware setup and support of Storage and managed paging space.
- Expanded Experience in Network Management like DNS, NIS, NFS, LDAP, TFTP and system troubleshooting skills.
- Expert in the Implementation of TCP/IP & related Services-DHCP/DNS/WINS.
- Gained sound knowledge in product deployment in servers, email servers, monitoring tools & shell scripts, networking, SQL/MySQL.
- Hands-on experience with performance monitoring tools like AppDynamics, CloudWatch, and related AWS services.
- Gained good knowledge in Linux command line & Bash Shell scripting.
- Worked on Virtualization Products VMware ESX Server/ Virtualization Client 2.5.
- Performed all the Maintenance and Auditing tasks during Maintenance window.
- Experienced in setting up a PXE boot environment with Red Hat Linux.
- Experienced in supporting 24x7 production computing environments, on-call and weekend support.
Environment: VMware, SGI servers, CentOS, Git, Ubuntu, Jenkins, Maven, Chef, Jira, AWS, Ansible, Sonar, Nexus.
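The Python/Bash/JSON-driven resource-creation automation described above can be sketched as a function that renders the parameters an `ec2.run_instances(...)` call would take, including a user-data bootstrap script. The AMI ID, instance type, and package list are hypothetical placeholders:

```python
# Hedged sketch of automated EC2 resource creation: build the keyword
# arguments for boto3's ec2.run_instances(...) as a plain dict, with a
# bootstrap script assembled from a package list. No AWS call is made
# here; the values are illustrative.

def render_launch_spec(ami_id, role, packages, instance_type="t3.micro"):
    """Build a run_instances parameter dict with a bootstrap script."""
    user_data = "#!/bin/bash\nyum -y update\n" + "\n".join(
        f"yum -y install {pkg}" for pkg in packages
    )
    return {
        "ImageId": ami_id,
        "InstanceType": instance_type,
        "MinCount": 1,
        "MaxCount": 1,
        "UserData": user_data,  # boto3 base64-encodes this automatically
        "TagSpecifications": [
            {
                "ResourceType": "instance",
                "Tags": [{"Key": "Role", "Value": role}],
            }
        ],
    }

spec = render_launch_spec("ami-12345678", "webserver", ["httpd", "git"])
```

Generating the spec separately from the API call keeps the bootstrap logic testable and lets the same JSON feed either boto3 or a CloudFormation-style template.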
Confidential, West Chester, PA
Sr DevOps Engineer
Responsibilities:
- Launched Amazon EC2 cloud instances using Amazon Web Services (Linux/Ubuntu) and configured launched instances for specific applications.
- Built and configured highly available and fault-tolerant infrastructure in AWS.
- Highly motivated and committed Cloud and DevOps Engineer experienced in automation and configuration.
- Cloned and configured AWS EC2 instances and added Elastic IPs, Elastic Load Balancers, and DNS (Route 53) in the AWS environment.
- Used AWS Import/Export to accelerate moving large amounts of data into and out of AWS using portable storage devices for transport.
- Implemented and maintained the branching and build/release strategies utilizing Subversion/Git.
- Manage configuration of Web App and Deploy to AWS cloud server through Chef.
- Experience in setting up the Chef SCM using Chef Enterprise and Chef Solo.
- Experience in setting up Chef Workstation, creating Chef Cookbooks and Recipes.
- Using GIT as source code management tool: creating local repo, cloning the repo, adding, committing, pushing the changes in the local repo, saving changes for later (Stash), and recovering files.
- Created alarms in the CloudWatch service for monitoring server performance, CPU utilization, and disk usage.
- Creating and configuring Jenkins jobs, build and delivery pipelines.
- Experience in installing, administering, patching, upgrading, performance tuning, and troubleshooting various Linux-based servers like Red Hat Linux 5/6/7 and CentOS 5/6.
- Develop build and deployment scripts using ANT and MAVEN as build tools in Jenkins to move from one environment to other environments.
- Use Splunk to monitor server and application performance.
- Have good experience of creating and maintaining the Docker containers.
- Experience involving versioning for AWS S3 along with the lifecycle policies for backup and archiving data in Glacier.
- Involved in migrating physical Linux/Windows servers to cloud (AWS).
- Designed and worked with the team to implement the ELK (Elasticsearch, Logstash, and Kibana) stack on AWS.
- Used the AWS console and AWS CLI to launch and manage VMs with public/private subnets, setting up security groups, load balancing, etc.
- Built servers using AWS: importing volumes, launching EC2, RDS, S3, IAM, Route 53, VPC, and CodeDeploy; creating security groups, Auto Scaling, Lambda, and load balancers (ELBs) in the defined virtual private cloud.
- Implement, deploy and maintain cloud infrastructure using AWS. Automating backups by shell for Linux to transfer data in S3 bucket.
Environment: Git, GitHub, Shell, Bamboo, Chef, AWS (services like EC2, S3, RDS, EBS, Elastic Load Balancer, Auto Scaling), IAM, Ant, Maven, Ubuntu, Windows Server and Linux, XML, Java.
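The CloudWatch alarms for server performance and CPU utilization mentioned above can be sketched as the parameters boto3's `cloudwatch.put_metric_alarm(...)` accepts. The 80% threshold, 5-minute period, and SNS topic ARN are illustrative assumptions:

```python
# Minimal sketch of a CloudWatch CPU-utilization alarm definition,
# assuming a typical threshold (80%) and two 5-minute evaluation
# periods. Only the parameter dict is built; nothing is sent to AWS.

def cpu_alarm(instance_id, threshold=80.0, sns_topic_arn=None):
    """Alarm when average CPU over two 5-minute periods exceeds threshold."""
    alarm = {
        "AlarmName": f"high-cpu-{instance_id}",
        "Namespace": "AWS/EC2",
        "MetricName": "CPUUtilization",
        "Dimensions": [{"Name": "InstanceId", "Value": instance_id}],
        "Statistic": "Average",
        "Period": 300,            # seconds per evaluation window
        "EvaluationPeriods": 2,   # must breach twice in a row
        "Threshold": threshold,
        "ComparisonOperator": "GreaterThanThreshold",
    }
    if sns_topic_arn:
        # Notify an SNS topic when the alarm fires.
        alarm["AlarmActions"] = [sns_topic_arn]
    return alarm
```

Requiring two consecutive breaches avoids paging on a single transient CPU spike; the same shape works for disk-usage or memory metrics by swapping the namespace and metric name.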
Confidential
AWS/Linux Administrator
Responsibilities:
- Extensively worked on AWS services like EC2, ECS, Elastic Beanstalk, S3, Auto Scaling, Elastic Load Balancing, EBS, RDS, VPC, CloudFront, CloudWatch, CloudTrail, SQS, SNS, Route 53, IAM, CloudFormation, AWS SSO, Trusted Advisor, Certificate Manager, and DynamoDB.
- Launched EC2 instances with various AMIs and configured application servers on those instances by deploying code in Elastic Beanstalk. Integrated EC2 instances with various AWS services by using IAM roles. Created images of critical EC2 instances and used those images to spin up new instances in different AZs.
- Ensured high availability, scalability, reliability, and visibility of EC2 instances by using Auto Scaling to scale in and scale out as per the requirement, and ELBs to route traffic to a different instance in case of failover.
- Used Elastic Container Services for container orchestration. Docker containers have been setup and deployed application server in it. ECS provides continuous monitoring, alerts and logs.
- Created and managed S3 buckets to store files. Hosted a static website using S3 and enabled public access. Used cross-region replication to replicate the same files in a different region. Experience in creating lifecycle policies in AWS S3 to move infrequently accessed files to Glacier.
- Provided authenticated access to AWS resources using Multi-Factor Authentication; created and managed user accounts, roles, groups, and policies using Identity and Access Management (IAM).
- Created Custom VPC with private and public subnets. Installed webservers in public subnets and enabled route out through internet gateways, installed database servers and application servers in private subnets and used NAT gateways to route out the traffic. Configured HTTP, HTTPS and SSH ports using security groups and Network Access Control Lists (NACL).
- Created health checks on Route 53 and configured different routing policies like Simple, Weighted, Geo location and Fail-over.
- Experience in creating and managing databases like MySQL, PostgreSQL, Oracle, and SQL Server using the Relational Database Service (RDS) in AWS, performing basic database administration and managing virtual cloud resources as required, with the overall objective of improving the scalability, performance, reliability, and high availability of fault-tolerant cloud infrastructure.
- Migrated on premises database to Amazon Web Services Cloud platform using Data Migration Service (DMS) Tool.
- Utilized CloudWatch to monitor resources such as EC2 instances' CPU utilization, memory utilization, and disk space, as well as Amazon RDS DB services, DynamoDB tables, and EBS volumes; set alarms and used SNS for notifications; and monitored logs for a better understanding and operation of the system.
- Used CloudTrail to audit the logs of AWS account activity, including actions taken through the AWS Management Console, command line tools, AWS SDKs, and other AWS services.
- Involved in designing and deploying a multitude of applications utilizing almost all of the AWS stack (including EC2, Route 53, S3, RDS, DynamoDB, SNS, SQS, and IAM), focusing on high availability, fault tolerance, and auto scaling in AWS CloudFormation.
- Administered local and remote servers on a daily basis, troubleshooting and correcting errors.
- Experienced with networking using TCP/IP and resolving network connectivity using tools like dig, nslookup, ping.
- Monitoring of web servers and other services using Nagios monitoring tool.
- Administered Linux servers for several functions, including managing Apache Tomcat servers, mail servers, and Oracle and MySQL databases in both development and production.
- Troubleshot backup and restore problems; created LVMs on SAN using Linux utilities; and resolved Linux network and security-related issues, capturing packets using tools such as iptables, firewalls, TCP wrappers, and Nmap.
- Performed UNIX system administration: fine tuning, kernel debugging, process scheduling, disk and file system I/O, kernel internals, and TCP/IP communications.
Environment: Linux, Docker, Ansible, Jenkins, Maven, Chef, GitHub, AWS, Shell and Ruby Scripts, Apache Tomcat, XCode.
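The custom VPC layout described above, with web servers in public subnets and database/application servers in private subnets, can be sketched with Python's standard `ipaddress` module. The /16 VPC range, /24 subnet size, and two-subnets-per-tier layout are assumptions for illustration:

```python
import ipaddress

# Sketch of a public/private subnet plan: carve a VPC CIDR into equal
# subnets and split them between the two tiers. A real design would
# also spread subnets across Availability Zones.

def plan_subnets(vpc_cidr="10.0.0.0/16", subnet_prefix=24, per_tier=2):
    """Return the first N public and next N private subnet CIDRs."""
    subnets = list(
        ipaddress.ip_network(vpc_cidr).subnets(new_prefix=subnet_prefix)
    )
    public = [str(s) for s in subnets[:per_tier]]
    private = [str(s) for s in subnets[per_tier:per_tier * 2]]
    return {"public": public, "private": private}

layout = plan_subnets()
# Public subnets route 0.0.0.0/0 to an internet gateway; private
# subnets route it to a NAT gateway, matching the design above.
```

Computing the CIDRs up front (rather than hand-picking them) prevents overlapping subnets and makes the layout easy to feed into CloudFormation or Terraform variables.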
Confidential
Jr AWS/DevOps Engineer
Responsibilities:
- Integrated AWS CloudWatch with AWS EC2 instances for monitoring the log files, store them and track metrics.
- Created AWS S3 buckets, performed folder management in each bucket, and managed CloudTrail logs and objects within each bucket.
- Created Highly Available Environments using Auto-Scaling, Load Balancers, and SQS.
- Hands-on experience with various AWS services such as Redshift clusters and Route 53 domain configuration.
- Installed Jenkins plugins for the Git repository, set up SCM polling for immediate builds with Maven and a Maven repository (Nexus), and deployed apps using custom modules through Puppet.
- Managed Amazon Web Services (AWS) infrastructure with automation and configuration management tools such as uDeploy, Puppet, or custom-built tools; designed cloud-hosted solutions with specific AWS product-suite experience.
- Installed, configured, and managed Puppet Master/Agent. Wrote custom modules and manifests, downloaded pre-written modules from Puppet Forge, and performed upgrades and migrations of Puppet Community and Enterprise.
- Responsible for Continuous Integration (CI) and Continuous Delivery (CD) process implementation from Dev to Eval, Eval to Pre-Prod/ Pre-Prod to Production systems using Jenkins, GIT, Chef automation tool.
- Created Chef automation tools and builds, and made overall process improvements to any manual processes.
- Wrote Chef cookbooks for various DB configurations to modularize and optimize end-product configuration, converting production support scripts to Chef recipes and provisioning AWS servers using Chef recipes.
- Used a Puppet server and workstation to manage and configure nodes; experienced in writing Puppet manifests to automate configuration of a broad range of services.
- Defined branching, labeling, and merge strategies for all applications in Git.
- Set up and configured Puppet configuration management; worked on the CentOS operating system to build Puppet modules.
- Administered and maintained the CI/CD tool stack (Jenkins, Puppet, Chef).
- Configured Elastic Load Balancers with EC2 Auto Scaling groups.
- Configured S3 to host Static Web content.
- Used Ansible to manage Web applications, Environments configuration Files, Users, Mount points and Packages.
- Created monitors, alarms and notifications for EC2 hosts using CloudWatch.
- Well versed in configuring inbound and outbound traffic access for RDS DB services, DynamoDB tables, and EBS volumes, and in setting alarms for notifications or automated actions.
- Responsible for Continuous Integration (CI) and Continuous Delivery (CD) process implementation using Jenkins along with Shell scripts to automate routine jobs.
- Configured AWS Identity Access Management (IAM) Group and users for improved login authentication.
- Implemented Autoscaling for scaling out to ensure availability and scalability of customer website and applications.
- Configured complex middleware environments with several variations of Tomcat installations consisting of 3-5 instances in each installation.
- Worked with Custom AMI's, created AMI tags and modified AMI permissions.
- Created Security Groups, configuring Inbound /Outbound rules, creating and importing Key Pairs.
- Leveraged AWS S3 service as Build Artifact repository and created release-based buckets to store various modules/branch-based artifact storage.
- Experience in installation of Oracle and MySQL.
- Configured and ensured connections to RDS databases running on MySQL engines.
- Utilized CloudWatch service to monitor the QA/on-demand instances, S3 metrics, configuring alarms for performance environments during load testing.
Environment: Linux, Shell, Ansible, AWS, Git, JIRA, Python, Jenkins, Amazon IAM, S3 buckets, EC2, EBS, Terraform, Elasticsearch, Logstash, Kibana, Hive, Hadoop.
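The release-based S3 artifact buckets and the IAM groups and policies described above come together in an IAM policy document. This is a hedged sketch with a hypothetical bucket name, showing the usual split between bucket-level listing and object-level reads:

```python
import json

# Sketch of an IAM policy granting read-only access to a build-artifact
# bucket. The bucket name is a placeholder; note that s3:ListBucket
# targets the bucket ARN while s3:GetObject targets the objects
# (bucket ARN + "/*"), which is a common source of policy bugs.

def artifact_read_policy(bucket="release-artifacts"):
    """Build an IAM policy document for read access to one bucket."""
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "ListArtifactBucket",
                "Effect": "Allow",
                "Action": ["s3:ListBucket"],
                "Resource": [f"arn:aws:s3:::{bucket}"],
            },
            {
                "Sid": "ReadArtifacts",
                "Effect": "Allow",
                "Action": ["s3:GetObject"],
                "Resource": [f"arn:aws:s3:::{bucket}/*"],
            },
        ],
    }

policy_json = json.dumps(artifact_read_policy(), indent=2)
```

Attaching this policy to a deploy group (rather than individual users) keeps access auditable, and the release-based bucket naming lets each branch's artifacts get its own scoped policy.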