DevOps/Cloud Engineer Resume
Sioux Falls, SD
SUMMARY:
- DevOps/Cloud Engineer with over 4 years of experience in the IT industry, specializing in building automated cloud infrastructure on AWS and flexible pipelines that automate everything from code integration to deployment. I excel at automating repetitive tasks and defining clear, efficient processes so that developers can focus on constructive, creative development work.
- Expertise in infrastructure development and operations involving AWS services such as EC2, S3, IAM, EBS, VPC, ELB, Route 53, Auto Scaling, Security Groups, CloudWatch, API Gateway, SNS, SQS, RDS, DynamoDB, CloudFront, Elastic Storage, NAT, AWS Lambda, and firewalls, with experience in cloud automation.
- Experience in DevOps tools such as Git, Maven, SonarQube, Jenkins, Nexus, Ansible, Docker, Kubernetes, Tomcat.
- Experience with blue/green deployments: creating new environments identical to the existing production environment with CloudFormation templates and redirecting traffic from the old environment to the new one using Route 53 weighted record sets (a minimal sketch of such record sets appears after this summary).
- Configured, monitored, and automated Google Cloud services; involved in deploying a content cloud platform using Google Compute Engine and Cloud Storage buckets.
- Knowledge of designing, architecting, and implementing a multi-tenant Platform as a Service (PaaS) on hybrid cloud infrastructure spanning AWS and GCP.
- Experience with Docker containers and Docker Swarm, creating and managing multiple Docker images, primarily for middleware installations and domain configurations.
- Experience in branching, tagging, and maintaining versions across environments using SCM tools such as Git and Bitbucket in UNIX/Linux environments.
- Expertise in all areas of Jenkins, including plugin management, securing and scaling Jenkins, and integrating code analysis, performance, analytics, and test phases into complete CI/CD pipelines.
- Experienced in Jenkins master/slave administration, access control, and report generation.
- Authored Ansible playbooks using SSH as the wrapper to manage configuration of AWS nodes and tested playbooks on AWS instances using shell scripts.
- Managed distributed builds generated by Maven using binary repositories like Nexus and Artifactory.
- Worked on web and application servers such as Apache, Tomcat, and JBoss to deploy code.
- Expertise in working with bug-tracking tools such as JIRA.
- Skilled in monitoring servers using Nagios, New Relic, and CloudWatch.
- Experience in integrating Unit Tests and Code Quality Analysis Tools like JUnit, SonarQube.
- Knowledge of Docker-based container deployments to create isolated environments for development teams and to containerize environment delivery for releases.
- Experience with Packer to build Amazon EC2 AMIs with pre-installed software; wrote Packer scripts to build and provision custom AMIs.
- Expertise in Atlassian Confluence for team collaboration and technical documentation.
- Experienced in building scripts, deployments, and automated solutions using scripting languages such as Bash and Python.
- Efficient at working closely with core product teams to ensure high-quality, timely delivery of builds.
- Experienced in build and release engineering: automating, building, deploying, and releasing code from one environment to another. Excellent comprehension of SDLC methodologies such as Agile and Waterfall.
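To illustrate the blue/green traffic shifting mentioned above, here is a minimal CloudFormation sketch of Route 53 weighted record sets; the hosted zone, record name, weights, and ELB endpoints are placeholders, not details from any specific project.

```yaml
# Illustrative sketch only: zone, record name, weights, and ELB DNS names are placeholders.
Resources:
  BlueRecordSet:
    Type: AWS::Route53::RecordSet
    Properties:
      HostedZoneName: example.com.
      Name: app.example.com.
      Type: CNAME
      TTL: '60'
      SetIdentifier: blue
      Weight: 90                      # keep most traffic on the existing (blue) stack
      ResourceRecords:
        - blue-elb.us-east-1.elb.amazonaws.com
  GreenRecordSet:
    Type: AWS::Route53::RecordSet
    Properties:
      HostedZoneName: example.com.
      Name: app.example.com.
      Type: CNAME
      TTL: '60'
      SetIdentifier: green
      Weight: 10                      # shift a small share to the new (green) stack, then raise it
      ResourceRecords:
        - green-elb.us-east-1.elb.amazonaws.com
```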
TECHNICAL SKILLS:
Platforms: UNIX, Linux (Red Hat 5.x, 6.x, 7.x), CentOS, Ubuntu, Windows 8/7/Vista/XP
Version Control Tools: GIT, Bitbucket
CI/CD Tools: Jenkins, Ansible
Build Tools: Maven
Programming Languages: C, Shell, Groovy, YAML, Python, Java, XML, HTML, CSS, JavaScript, C#
Container Platforms: Docker, Kubernetes, ECS
Monitoring Tools: Nagios, CloudWatch, Grafana, New Relic
Cloud Platform: AWS, GCP
Bug Tracking Tools: JIRA, Confluence
Artifactories: Nexus, Artifactory
Web/Application Servers: JBOSS, Apache Tomcat, Nginx, Apache HTTP server
Network Protocols: TCP/IP, DNS, SNMP, SMTP, Ethernet, NFS
Database Systems: SQL Server 2000/2005, MySQL, Oracle DB
Web Services: REST
PROFESSIONAL EXPERIENCE:
DevOps/ Cloud Engineer
Confidential, Sioux Falls, SD
Responsibilities:
- Set up and built AWS infrastructure resources such as VPC, EC2, S3, IAM, EBS, Security Groups, Auto Scaling, SNS, and RDS using CloudFormation YAML templates (a minimal sketch of such a template follows this section).
- Used AWS Lambda to run code without managing servers, triggered code from S3 events, and orchestrated an EC2 scheduler to start and stop instances.
- Developed automated processes that run daily to check disk usage and clean up file systems in Linux environments using PowerShell scripting; created shell, Python, and PowerShell scripts for setting up baselines, branching, merging, and automation processes across environments using SCM tools such as Git and Subversion (SVN) on Linux.
- Automated cloud deployments using Python (boto and Fabric) and AWS CloudFormation templates.
- Created AWS Security Groups to act as firewalls, deployed many AWS resources using CloudFormation, and deployed GCP instances using Deployment Manager.
- Configured and managed New Relic for monitoring the existing AWS cloud platform and Stackdriver monitoring for Google Cloud.
- Built infrastructure on both AWS and GCP, migrated on-prem servers to Google Cloud, and handled server administration and monitoring.
- Configured various Jenkins plugins to automate the workflow and keep build jobs running smoothly, and enabled restoring Jenkins jobs whenever needed by backing them up regularly.
- Experience using version control systems including Git/GitHub; integrated Git with Jenkins to automate the code checkout process.
- Involved in Linux backup/restore with tar including formatting and disk partitioning.
- Responsible for setting up cron job scripts on production servers for reverse lookups, as well as start/stop scripts for the servers.
- Created and modified users and groups with SUDO permission and applied appropriate support packages/patches to maintain system integrity.
- Used Jenkins to build, test, and publish project artifacts; implemented per-branch builds as part of CI procedures to run SonarQube for code coverage and to run unit and integration tests, improving pipeline efficiency.
- Updated Jobs.groovy for the migration of Jenkins jobs from AWS to Google Cloud.
- Migrated a .NET application to a Microsoft Azure Cloud Service project as part of cloud deployment.
- Worked on .NET applications, automating their builds end to end and integrating test tools such as SonarQube and Visual Studio with Jenkins.
- Built Java and .NET code on different Jenkins servers per the schedule.
- Built and improved automation tools for all stages of the operational pipeline, including build systems, deployment, and infrastructure orchestration.
- Redesigned infrastructure for high availability using multiple Google Cloud availability zones; configured and managed daily and hourly scheduled snapshot backups.
- Responsible for defining branching & merging strategy, check-in policies, improving code quality, automated gated check-ins, defining backup and archival plans.
- Knowledge of cloud-hosted services such as RabbitMQ, Activiti, and Redis that help manage application state.
- Performed autoscaling in Google Cloud based on increases in the number of RabbitMQ queues, using metrics generated in Stackdriver Monitoring.
- Experience monitoring a rule-processing engine that analyzes loan information to generate exceptions.
- Experience in setting up a local docker environment by integrating Docker with IntelliJ.
- Implemented a build stage to build the microservice and push the Docker container image to the private Docker registry.
- Executed a Kubernetes POC (proof of concept) to demonstrate the technical viability of container orchestration and handled large volumes of containers with Docker Swarm and Kubernetes.
- Recovered a locked RDP jump host in the production cloud environment by enabling serial-port access to the VM with a PowerShell script.
- Installation and administration of RHEL 7.0 and SUSE 11.x.
- Scheduled and monitored Cosmos jobs using XFLOW and Airflow; troubleshot failed Cosmos jobs and fixed issues in a timely manner.
- Scheduled jobs to pull raw data from the source into our VC.
- Developed scripts that push data from Cosmos output streams into SQL Server databases.
- Worked primarily on user requests via JIRA related to system access, logon issues, home directory quotas, filesystem repairs, directory permissions, disk failures, and hardware and software issues.
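As a rough illustration of the CloudFormation YAML templates referenced in the first bullet of this section, the sketch below stands up a VPC, subnet, security group, and EC2 instance; resource names, CIDR ranges, the AMI ID, and the instance type are placeholders.

```yaml
# Illustrative sketch only: CIDR ranges, AMI ID, and instance type are placeholders.
AWSTemplateFormatVersion: '2010-09-09'
Resources:
  AppVpc:
    Type: AWS::EC2::VPC
    Properties:
      CidrBlock: 10.0.0.0/16
  AppSubnet:
    Type: AWS::EC2::Subnet
    Properties:
      VpcId: !Ref AppVpc
      CidrBlock: 10.0.1.0/24
  AppSecurityGroup:
    Type: AWS::EC2::SecurityGroup
    Properties:
      GroupDescription: Allow inbound HTTPS only
      VpcId: !Ref AppVpc
      SecurityGroupIngress:
        - IpProtocol: tcp
          FromPort: 443
          ToPort: 443
          CidrIp: 0.0.0.0/0
  AppInstance:
    Type: AWS::EC2::Instance
    Properties:
      ImageId: ami-0123456789abcdef0   # placeholder AMI
      InstanceType: t3.micro
      SubnetId: !Ref AppSubnet
      SecurityGroupIds:
        - !Ref AppSecurityGroup
```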
DevOps/ Cloud Engineer
Confidential, Dublin, Ohio
Responsibilities:
- Extensively worked on Amazon Web Services (AWS) cloud instances: created EC2 instances, key pairs, dedicated hosts, volumes, load balancers, and security groups.
- Created S3 buckets and configured them for use as storage, and created file systems with EFS in AWS to provide common storage for different teams.
- Created roles and groups for users and resources using AWS Identity and Access Management (IAM).
- Designed AWS CloudFormation templates in JSON to create customized VPCs, subnets, and NAT to ensure successful deployment of web applications and databases.
- Created dashboards with widget views for each application in Cloudability to list servers and their performance metrics, with separate views subdivided to a granular application level in PMOD2 for usage and cost optimization.
- Created an environment for Attunity in AWS and built the Attunity environment in GCP using CloudFormation.
- Configured a Google cloud Virtual Private Cloud (VPC) and Database Subnet Group for isolation of resources.
- Implemented enterprise-wide automated shutdown so that AWS and GCP instances not exempted by the standard shutdown tags and labels are brought down every day at standard times.
- Created custom images for longer retention and devised and tested a procedure to retain custom images for use as gold copies.
- Installed, configured, and managed Docker containers, images, repositories, and registries, and applied Docker best practices to create images from clear, readable, maintainable Dockerfiles and upload them to Docker Hub.
- Designed infrastructure for clustering and scheduling Docker containers with Docker Swarm, running and deploying applications with load balancing and scaling containers across nodes; worked on all major Docker components such as the Docker daemon, Hub, images, and registry.
- Integrated the SonarQube GitHub plugin to analyze code at the developer stage, before it is committed to GitHub, giving developers reports and reducing faulty builds.
- Created CI/CD pipelines, upstream/downstream projects, and multi-branch pipelines with Jenkins; set up Maven repositories to automate everyday builds; and integrated SonarQube with Jenkins to improve source code quality, automating builds and testing and deploying to an artifact repository such as Nexus.
- Authored Ansible playbooks for configuring instances and creating security baselines, roles, and templates for more reliable deployments, and automated cloud deployments using AWS CloudFormation templates and Ansible (a minimal playbook sketch follows this section).
- Developed Nagios plugin scripts, reports, and project plans in support of initiatives to maintain the distributed Nagios system.
- Performed a POC on creating a tunnel between two clouds so they can act as a hybrid cloud and run applications on both: created VPN tunnels between AWS and GCP, performed SSH between all instances to check connectivity, used Elastic IP addresses for both VPNs (AWS and GCP), and ran Kubernetes on both clouds with Weave for pod communication.
- Worked with containerization tools; able to implement a transition to Chef and develop distributed cloud systems using Kubernetes.
- Worked primarily on user requests via JIRA related to system access, logon issues, home directory quotas, file system repairs, directory permissions, disk failures, and hardware and software issues.
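As a rough illustration of the Ansible security-baseline playbooks referenced above, a minimal sketch is shown below; the inventory group, packages, and baseline settings are assumptions for illustration, not the actual project playbook.

```yaml
# Illustrative sketch only: inventory group, packages, and settings are placeholders.
- name: Apply a security baseline to AWS nodes
  hosts: aws_nodes
  become: true
  tasks:
    - name: Ensure time sync and fail2ban are installed
      ansible.builtin.yum:
        name:
          - chrony
          - fail2ban
        state: present
    - name: Disable password authentication over SSH
      ansible.builtin.lineinfile:
        path: /etc/ssh/sshd_config
        regexp: '^PasswordAuthentication'
        line: PasswordAuthentication no
      notify: restart sshd
  handlers:
    - name: restart sshd
      ansible.builtin.service:
        name: sshd
        state: restarted
```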
DevOps Engineer
Confidential, Woburn, MA
- Created and managed S3 buckets and used S3 and Glacier for storage and backup on AWS.
- Launched EC2 instances and configured AWS IAM roles for them.
- Implemented Elastic Load Balancers and auto scaling groups in AWS on production EC2 instances to build fault tolerant and highly available applications.
- Designed the project pipelines using Jenkins for Continuous Integration and deployment into different Web/Application Servers.
- Developed Chef Recipes for automating the Infrastructure, deployment process.
- Used Maven as a build tool on java projects for the building of deployable artifacts (war and ear) from source code.
- Integrated GIT with Jenkins to automate the code checkout process with the help of Jenkins DSL plugin.
- Developed Bash scripts for automation of build and release process.
- Experienced in using Ansible to manage web applications, config files, databases, commands, users, mount points, and packages, and to assist in building automation policies.
- Wrote Ansible playbooks with Python SSH as the wrapper to manage configurations of AWS nodes and tested playbooks on AWS instances using Python.
- Implemented Docker containers to create various environments for deploying applications (an illustrative sketch follows this section).
- Used Amazon Route53 to manage DNS zones globally as well as to give public DNS names to ELB’s.
- Used PowerShell, JSON, and XML for DevOps on Windows systems; performed Linux kernel and memory upgrades, swap area configuration, and Kickstart installations.
- Deployed applications on multiple WebLogic servers and maintained load balancing, high availability, and failover functionality.
- Used Dynatrace to monitor server metrics and performed in-depth analysis to isolate points of failure in the application.
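As one way to picture the containerized environments mentioned above, here is a minimal Docker Compose sketch (Compose is used here purely for illustration); the image names, port, and environment values are placeholders.

```yaml
# Illustrative sketch only: image names, port, and environment values are placeholders.
version: "3.8"
services:
  app:
    image: example/app:latest          # hypothetical application image
    ports:
      - "8080:8080"
    environment:
      - APP_PROFILE=dev
    depends_on:
      - db
  db:
    image: mysql:8.0
    environment:
      - MYSQL_ROOT_PASSWORD=changeme   # placeholder; use a secret store in practice
```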
Jr. DevOps Engineer
Confidential
Responsibilities:
- Built infrastructure on AWS by importing volumes, launching EC2 and RDS, and creating Security Groups, Auto Scaling, and Elastic Load Balancers (ELBs) in the defined Virtual Private Cloud (VPC); configured security groups and Auto Scaling to design cost-effective, fault-tolerant, and highly available systems.
- Configured Elastic Load Balancers with EC2 Auto Scaling groups based on memory and CPU to adapt to unforeseen spikes without outages or manual intervention; architected and designed AWS private cloud subnets, security groups, and network access controls, and configured load balancing for application high availability and performance.
- Created S3 buckets, managed bucket policies, and used IAM policies and groups to control access to S3 buckets, encrypting data in the buckets with KMS.
- Created CloudWatch alarms for instances and used them in Auto Scaling launch configurations; implemented and maintained monitoring and alerting of production and corporate servers/storage using AWS CloudWatch (a minimal alarm sketch follows this section).
- Created shell and Python scripts to automate AMI creation through pre-boot and bootstrapping techniques; familiar with shell scripting and with leveraging the PowerShell ISE for development and debugging.
- Integrated Jenkins CI/CD with Git version control, implemented continuous builds on check-in for various cross-functional applications, and created Git webhooks to set up triggers for commit, push, merge, and pull request events.
- Authored the Jenkins pipeline framework and wrote Jenkinsfiles to create build, test, and deployment pipelines across different application environments.
- Created the automated build and deployment process for applications, leading up to a continuous integration system for all products using Jenkins, and implemented a continuous delivery framework using Jenkins, Maven, and Nexus in a Linux environment.
- Documented every release and successful installation walkthrough to manage configurations and automate installation processes.
- Maintained and troubleshot system performance and network monitoring using tools such as Nagios and AWS services such as CloudWatch.
- Coordinated with development and third-party teams to perform PCI penetration testing and Veracode scanning on the source code every year.
- Installed, configured, modified, tested, and deployed applications on the Apache Tomcat application server.
- Used JIRA as the defect tracking system; configured various workflows, customizations, and plug-ins for the JIRA bug/issue tracker; and integrated Jenkins with JIRA and GitHub.
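To illustrate the CloudWatch alarms referenced above, a minimal CloudFormation sketch of a CPU alarm publishing to an SNS topic is shown below; the instance ID, threshold, and topic are placeholders.

```yaml
# Illustrative sketch only: instance ID, threshold, and topic are placeholders.
Resources:
  OpsAlertTopic:
    Type: AWS::SNS::Topic
  HighCpuAlarm:
    Type: AWS::CloudWatch::Alarm
    Properties:
      AlarmDescription: Alert when average CPU exceeds 80% for 10 minutes
      Namespace: AWS/EC2
      MetricName: CPUUtilization
      Dimensions:
        - Name: InstanceId
          Value: i-0123456789abcdef0   # placeholder instance ID
      Statistic: Average
      Period: 300
      EvaluationPeriods: 2
      Threshold: 80
      ComparisonOperator: GreaterThanThreshold
      AlarmActions:
        - !Ref OpsAlertTopic
```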
Build & Release Engineer
LNC IT Solutions, Hyderabad, India
- As a Build & Release Engineer, was responsible for continuous delivery, working with different teams to deliver high-quality applications to satisfy growing customer and business demands.
- Coordinated tasks with different teams to create usage models for different projects.
- Designed, created, and maintained Git repositories to client specifications and was involved in setting up the Subversion (SVN) server, server maintenance, and client machine setup.
- Performed regular builds and deployment of the packages for testing in different Environments (DEV, QA, CERT, UAT and PROD).
- Performing smoke tests to ensure the integrity of code deployment.
- Performed builds on Java projects initiated through Jenkins as the continuous integration tool.
- Configured Jenkins to run builds in all non-production and production environments.
- Release Engineer for a team that involved different development teams and multiple simultaneous software releases.
- Developed and implemented software release management strategies for various applications according to agile process.
- Managed Sonatype Nexus repositories used to download artifacts during the build.
- Used Puppet and other configuration management tools to deploy consistent infrastructure code across multiple environments.
- Worked with the Scrum methodology to manage software development and coordinated with all teams before and after production deployments to ensure smooth releases.
- Created complete release process documentation explaining all the steps involved in the release process.