
Sr. Cloud/devops Engineer Resume


MI

PROFESSIONAL SUMMARY:

  • Accomplished, proactive Cloud/DevOps Engineer with around 8 years of experience in software configuration, development, deployment, and support using public cloud platforms for clients in major industrial sectors such as Retail, Banking, and Life Sciences. Well versed in implementing DevOps environments to achieve CI/CD and automation of infrastructure.
  • Experienced in all phases of the Software Development Life Cycle (SDLC) with specific focus on the build and release and quality of software and involved with teams that worked on Scrum, Agile Software Development and Waterfall Methodologies.
  • Highly proficient in AWS cloud architecture design and development using various component services like EC2, Auto Scaling, ECS, ELB, S3, EBS, EFS, EKS, Glacier, Aurora, RDS, DynamoDB, VPC, CloudFront, Route53, CloudWatch, Redshift, Lambda, etc.
  • Experienced in cloud automation using AWS CloudFormation and Terraform templates, Amazon EC2 hosting, and AWS administration including S3 and IAM services.
  • Experienced in migrating on-premises applications and data onto AWS Cloud, leveraging services such as AWS Direct Connect, S3 Transfer Acceleration, AWS Snowball Edge, Server Migration Service, and Database Migration Service, with live migration of applications implementing a hybrid migration strategy.
  • Experienced in development with Microsoft Azure services such as Azure web applications, App Services, Azure Premium Storage, Azure SQL Database, Virtual Machines, and the Fabric Controller.
  • Planned, organized, and maintained a full-stack Kubernetes environment running on Google Cloud Platform (GCP) and set up alerting and monitoring for it using Stackdriver.
  • Designed POC work for new and upcoming automation-pipeline processes using cloud technologies like Azure and Google Cloud Platform (GCP).
  • Developed a CI/CD system with Jenkins on a Kubernetes container environment, utilizing Kubernetes and Docker as the runtime environment for the CI/CD system to build, test, and deploy. Experienced working with Docker Hub, creating Docker images, and handling multiple images, primarily for middleware installations and domain configuration.
  • Designed a distributed private cloud system solution using Kubernetes on CoreOS and used it to deploy, scale, load balance, and manage Docker containers with multiple namespaced versions.
  • Worked with Docker Compose and Docker Machine to create Docker containers for testing applications in the QA environment, and automated the deployment, scaling, and management of containerized applications across clusters of hosts using Kubernetes.
  • Extensive experience with continuous integration tools like Jenkins, Bamboo, and AnthillPro for building Java and J2EE based applications. Experienced in installing, configuring, and administering continuous integration tools such as Jenkins, TeamCity, and Bamboo for end-to-end automation of all builds and deployments.
  • Used Jenkins and configuration management tools to drive all microservices builds out to the Docker registry and then deployed to Kubernetes; created and managed Pods using Kubernetes and used the Heapster monitoring platform on Kubernetes to report detailed resource usage information.
  • Proficient in automating various infrastructure activities in the CI/CD pipeline, application server setup, and stack monitoring using Ansible playbooks, and in integrating Ansible with Rundeck and Jenkins.
  • Expertise in using Ansible to manage web applications, config files, databases, commands, users, mount points, and packages to assist in building automation policies.
  • Worked extensively on Ansible playbooks and inventories, created custom playbooks written in YAML, encrypted data using Ansible Vault, and maintained role-based access control using Ansible Tower.
  • Wrote Chef recipes for deployment on internal data-center servers, and reused and modified the same recipes to deploy directly onto Amazon EC2 instances. Also set up and maintained an automated environment using Chef recipes and cookbooks within the Azure environment.
  • Automated installation of Puppet Enterprise and configured the Puppet master and Puppet agents using Puppet scripts.
  • Hands-on experience in branching, tagging, and maintaining versions across environments using SCM tools like GitHub, Subversion (SVN), and TFS on Linux and Windows platforms. Installed and configured Git and communicated with repositories on GitHub.
  • Coordinated with developers for establishing and applying appropriate branching, labelling/naming conventions using Subversion (SVN) and GIT source control.
  • Pulled code from Bitbucket to local Git and kicked off testing from Bitbucket through build jobs. Experienced in using build automation tools like Ant, Maven, and MSBuild scripts for build and deployment.
  • Expertise in using build tools like Maven and Ant to build deployable artifacts such as war and jar files from source code. Created and maintained Ant build.xml and Maven pom.xml files for performing build procedures.
  • Able to develop and execute XML, Ruby, shell, Perl, PowerShell, batch, and Bash scripts.
  • Hands-on experience in building GUIs using JavaScript, AJAX, HTML, DHTML, CSS2, JSP, JSON, XML, and DTD.
  • Experience building streaming data pipelines that reliably move data between applications, and involvement in streaming applications that transform or react to streams of data using Kafka, Splunk, Nagios, and Logstash.
  • Created 5 Kibana Dashboards and 20+ Kibana visualizations which provided metrics on collected near real time logs of Fannie Mae Loan Management Systems.
  • Experience creating stored procedures and functions in SQL Server to import data into Elasticsearch and convert relational data into documents.
  • Experienced in administration of production, development, and test environments comprising Windows, Ubuntu, Red Hat Enterprise Linux (RHEL), and CentOS servers.
  • Familiar with relational database design, RDBMS, and NoSQL stores such as MongoDB and Cassandra.
  • Involved in setting up JIRA as defect tracking system and configured various workflows, customizations and plugins for the JIRA bug/issue tracker.
  • Updated a Hapi.js service worker to emit a new RabbitMQ message for the SNS worker plugin.
  • Created a microservice plugin for the Hapi.js service worker to read information from the RabbitMQ message bus in order to create and send messages to an AWS SNS topic.
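The last two bullets describe a RabbitMQ-to-SNS bridge. The actual plugin was written for a Hapi.js (Node.js) service worker; below is a minimal Python sketch of the same pattern for illustration, with the SNS client injected so the logic needs no real broker. The message envelope format is an assumption, not the original one.

```python
import json

def bridge_message(raw_body, sns_client, topic_arn):
    """Forward one RabbitMQ message body to an SNS topic.

    raw_body: bytes payload consumed from the message bus.
    sns_client: any object exposing publish(TopicArn=..., Message=...),
    e.g. a boto3 SNS client (injected here so the logic is testable).
    """
    event = json.loads(raw_body.decode("utf-8"))
    # Hypothetical envelope: tag the event with its source before publishing.
    message = json.dumps({"source": "rabbitmq", "event": event})
    sns_client.publish(TopicArn=topic_arn, Message=message)
    return message
```

In the real service this function would run inside the bus consumer's delivery callback; injecting the client keeps broker and AWS concerns out of the transformation logic.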

PROFESSIONAL EXPERIENCE:

SR. CLOUD/DEVOPS ENGINEER

Confidential, MI

Responsibilities:

  • As a DevOps & Cloud Engineer, worked in the AWS environment, instrumental in utilizing Compute services (EC2, ELB), Storage services (S3, Glacier, Block Storage, and lifecycle management policies), CloudFormation (JSON templates), Elastic Beanstalk, Lambda, VPC, RDS, Trusted Advisor, and CloudWatch.
  • Created customized AMIs from existing AWS EC2 instances using the create-image functionality, then used these snapshots for disaster recovery.
  • Created AWS launch configurations based on customized AMIs, used these launch configurations to configure Auto Scaling groups, and implemented AWS solutions using EC2, S3, RDS, DynamoDB, Route53, EBS, Elastic Load Balancer, and Auto Scaling groups.
  • Created a Python script, run as a Lambda function, to stop all instances with a specific tag, and scheduled it nightly with a CloudWatch scheduler.
  • Created GCP firewall rules to restrict access to sensitive data stored on virtual machines in a VPC, through the GCP console and REST API, defined at the VPC network along with the default and implied rules.
  • Used Google Cloud SDK to easily create and manage resources on Cloud Platform, including App Engine, Compute Engine, Cloud Storage, BigQuery, Cloud SQL, and Cloud DNS.
  • Used GCP HTTP(S) load balancing with Google Cloud Storage buckets, adding a Cloud Storage bucket as a backend of the load balancer.
  • Used Google Cloud Platform (GCP) to build, test, and deploy applications on Google's highly scalable and reliable infrastructure for web, mobile, and backend solutions.
  • Used Stackdriver Monitoring in GCP to check alerts for applications running on Google Cloud Platform, and deployed on GCP using Google Cloud Deployment Manager.
  • Responsible for setting up the ELK (Elasticsearch, Logstash, Kibana) platform, parsing unstructured logs with regular expressions into structured JSON.
  • Used the ELK (Elasticsearch, Logstash, Kibana) stack for log management and created Kibana visualizations to analyze the logs. Worked with Python, setting up Python in PyCharm and creating .py files for Appinit.
  • Configured network and server monitoring using Grafana and the ELK stack with Logspout, and Nagios for notifications. Used Puppet to deploy ELK for automating continuous deployment (CD), and configured slave nodes and deployment-failure reporting.
  • Worked on the ELK architecture and its components (Elasticsearch, Logstash, and Kibana) for log analytics. Handled installation, administration, and configuration of the ELK stack on AWS, and performed log analysis such as full-text search and application monitoring in integration with AWS Lambda and CloudWatch.
  • Built servers using AWS: importing volumes, launching EC2 and RDS, creating security groups, auto scaling, and load balancers (ELBs) in the defined virtual private cloud, and used OpenStack to provision new machines for clients.
  • Collected build metrics and test-case metrics from Jenkins to showcase as visualizations in a Kibana dashboard using ELK. Experience using Docker and setting up ELK with Docker and Docker Compose. Actively involved in deployments on Docker using Kubernetes.
  • Implemented a CI/CD pipeline using Jenkins 2.3, Ansible playbooks, and Ansible Tower. Managed containers using Docker by writing Dockerfiles, and set up automated builds on Docker Hub.
  • Worked with Docker and Kubernetes on multiple cloud providers, from helping developers build and containerize their application (CI/CD) to deploying either on public or private cloud.
  • Used Kubernetes to orchestrate the deployment, scaling, and management of Docker containers, and created the framework for seamless integration.
  • Used Jenkins pipelines to drive all microservices builds out to the Docker registry and then deployed to Kubernetes, Created Pods and managed using Kubernetes.
  • Integrated Kubernetes with network, storage, and security to provide comprehensive infrastructure and orchestrated container across multiple hosts.
  • Managed Kubernetes charts using Helm to create reproducible builds of Kubernetes applications, and managed Kubernetes manifest files and releases of Helm packages. Implemented a production-ready, load-balanced, highly available, fault-tolerant Kubernetes infrastructure; Kubernetes 1.9.0 was used to orchestrate the deployment, scaling, and management of Docker containers.
  • Implemented Ansible to manage all existing servers and automate the build/configuration of new servers and used Ansible Playbooks to setup Continuous Delivery Pipeline. Deployed micro services, including provisioning AWS environments using Ansible Playbooks.
  • Performed rolling upgrades of the Kafka cluster without any downtime or data loss during the upgrade. Developed Kafka consumers to consume data from Kafka topics. Deployed applications to Tomcat application servers and static content to Apache web servers.
  • Managed Ansible Playbooks with Ansible modules, implemented CD automation using Ansible, managing existing servers and automation of build/configuration of new servers.
  • Worked with Ansible for orchestration of deployments for various servers and wrote Ansible playbooks, replacing the dependency on Chef cookbooks and recipes to automate infrastructure as code.
  • Managed the configurations of multiple servers using Ansible, and was involved in support and upgrade of Ansible from 1.x to 2.x on those servers. Used Ansible to orchestrate software updates and verify functionality, and created Maven scripts to create multiple deployment profiles and deploy applications to Apache Tomcat.
  • Responsible for writing various scripts in Jenkins to monitor server health and self-heal if necessary. Also configured the email plugin to send alerts based on a Groovy script.
  • Setting up and maintaining GitHub infrastructure and supporting a continuous delivery model by automating software build and package migration processes using GIT source control.
  • Expertise in Branching, Merging, Tagging and maintaining the version across the environments using SCM tools like GIT and Subversion (SVN) on Linux platforms.
  • Built Jenkins jobs to create AWS infrastructure from GitHub repos containing Terraform code and administered/engineered Jenkins for managing weekly builds.
  • Used Terraform to migrate legacy and monolithic systems to Amazon Web Services, provisioned highly available EC2 instances using Terraform and CloudFormation, and wrote new plugins to support new functionality in Terraform.
  • Converted existing AWS infrastructure to a serverless architecture (AWS Lambda, Kinesis) deployed via Terraform and AWS CloudFormation.
  • Developed Dev, Test, and Prod environments for different applications on AWS by provisioning Kubernetes clusters on EC2 instances using Docker, Ruby/Bash, Chef, and Terraform.
  • Used the ticketing tool JIRA to track defects and changes for change management, along with monitoring tools like Splunk and CloudWatch, across different work environments, both physical and containerized.
  • Implemented intuitive dashboards with a variety of graphical visualizations, efficient and reusable Splunk searches, custom platform related features and system integrations and apps.
  • Developed Splunk searches and information extraction from device logs for cyber-intrusion detection and monitoring, structured after the kill-chain model; worked with parsing, indexing, and searching concepts (hot, warm, cold, and frozen buckets).
  • Used Splunk ES (SIEM) to analyze tens of gigabytes of security point-solution data and credentialed user-activity data, and to bring in contextual data locked in key business systems.
  • Engineered Splunk to build, configure and maintain heterogeneous environments and maintained log analysis generated by various systems including security products.
  • Experience with Splunk Searching and Reporting modules, Knowledge Objects, Administration, Add-On's, Dashboards, Clustering and Forwarder Management.
  • Expertise in Installation, Configuration, Migration, Trouble-Shooting and Maintenance of Splunk & Created and Managed Splunk DB connect Identities, Database Connections, Database Inputs, Outputs, lookups, access controls.
  • Created shell scripts to install Splunk forwarders on all servers and configure them with common configuration files such as bootstrap scripts, outputs.conf, and inputs.conf.
  • Expert in using rex, sed, erex, and IFX to extract fields from log files, with in-depth knowledge of setting up alerts and monitoring recipes from machine-generated data. Used MySQL, DynamoDB, and ElastiCache to perform basic database administration.
  • Experience with rolling updates using the Deployments feature in Kubernetes, and implemented blue-green deployment to maintain zero downtime.
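The nightly instance-stopping Lambda described above can be sketched as follows. This is an illustrative reconstruction, not the original script: the tag key AutoStop is an assumed example, and the EC2 client is injected so the logic is testable without AWS credentials (in Lambda it would come from boto3.client("ec2")).

```python
def tag_filters(tag_key, tag_value):
    """EC2 describe_instances filters: match the tag and the running state."""
    return [
        {"Name": f"tag:{tag_key}", "Values": [tag_value]},
        {"Name": "instance-state-name", "Values": ["running"]},
    ]

def stop_tagged_instances(ec2, tag_key="AutoStop", tag_value="true"):
    """Lambda handler body: stop every running instance carrying the tag.

    `ec2` is a boto3 EC2 client, injected for testability. A paginator
    would be needed for fleets larger than one describe_instances page.
    """
    page = ec2.describe_instances(Filters=tag_filters(tag_key, tag_value))
    ids = [inst["InstanceId"]
           for res in page["Reservations"]
           for inst in res["Instances"]]
    if ids:
        ec2.stop_instances(InstanceIds=ids)
    return ids
```

A CloudWatch Events (EventBridge) schedule rule such as `cron(0 2 * * ? *)` would then invoke the Lambda every night.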

CLOUD/DEVOPS ENGINEER

Confidential, TX

Responsibilities:

  • Performed application server builds in the EC2 environment and monitored them using CloudWatch; created private networks and sub-networks and brought instances under them based on requirements in AWS.
  • Built a DNS system in EC2 and managed all DNS-related tasks. Managed the configuration of applications using Chef; to achieve the continuous delivery goal in a highly scalable environment, we used Docker coupled with the load-balancing tool Nginx.
  • Leveraged AWS cloud services such as EC2, Auto Scaling, and VPC to build secure, highly scalable, and flexible systems that handled expected and unexpected load bursts.
  • Installed and configured various migration tools such as AWS Server Migration Service, ATADATA, and AWS Database Migration Service, and migrated VMs from VMware to AWS.
  • Experience developing and integrating against APIs; vRA/vRO engineering, development, and support; Terraform engineering and integration into AWS scripting.
  • Set up scalability for application servers using the command-line interface, set up and administered the DNS system in AWS using Route53, and managed users and groups using AWS Identity and Access Management (IAM).
  • Created snapshots and Amazon Machine Images (AMIs) of instances for backup and for creating clone instances.
  • Used EC2 as virtual servers to host Git, Jenkins, and the configuration management tool Ansible. Converted slow, manual procedures to dynamic, API-generated procedures.
  • Used AWS Elastic Beanstalk for deploying and scaling web applications and services developed with Java, PHP, Node.js, Python, Ruby, and Docker on familiar servers such as Apache and IIS.
  • Expertise in migrating existing v1 (Classic) Azure infrastructure to v2 (ARM), scripting and templating the end-to-end process as much as possible so that it is customizable for each area being migrated, and in using Google Cloud Platform (GCP) to build, test, and deploy applications on Google's highly scalable and reliable infrastructure for web, mobile, and backend solutions.
  • Scheduled Jenkins to automate most build-related tasks and tested using Selenium; Jenkins was used as the continuous integration tool to automate daily processes.
  • Performed server configuration management via Puppet and implemented Transparent Data Encryption (TDE). Implemented multi-tier application provisioning in Amazon cloud services, integrating it with Puppet.
  • Implemented a continuous delivery framework using Git, Jenkins, Maven, Nexus, and Puppet in a Linux environment, as well as Git, Maven, AWS S3, Jenkins, and Docker.
  • Coordinated with developers to resolve TFS build failures and issues. Developed a continuous deployment pipeline using Jenkins and shell scripts, and integrated SonarQube with Jenkins to test code quality.
  • Used the Jenkins AWS CodeDeploy plugin to deploy to AWS, and automated deployment of builds to different environments using Jenkins.
  • Created Pods using Kubernetes and worked with Jenkins pipelines to drive all microservices builds out to the Docker registry and then deploy to Kubernetes.
  • Created microservices applications with integrations to AWS services using Amazon EKS, while providing access to the full suite of Kubernetes functionality.
  • Installed and configured Kubernetes to manage Docker containers, helping convert a VM-based application to microservices deployed as containers managed by Kubernetes.
  • Deployed Java applications into web application servers like WebLogic. Developed and supported the Red Hat Enterprise Linux based infrastructure in the cloud environment.
  • Built servers using AWS: importing volumes, launching EC2 and RDS, creating security groups, auto scaling, and load balancers (ELBs) in the defined virtual private cloud.
  • Installed and configured the Amazon command-line interface tools, and performed migrations of virtual servers from Ubuntu OpenVZ physical servers to AWS EC2.
  • Wrote Ansible playbooks with Python SSH as the wrapper to manage configurations of AWS nodes, tested the playbooks on AWS instances using Python, and ran Ansible scripts to provision dev servers.
  • Developed automation scripts using EC2 commands and worked on monitoring tools like Splunk. Managed the configurations of instances using Opscode Chef, and wrote and modified various cookbooks and recipes for better management of the systems.
  • Installed, configured, and managed ELK (Elasticsearch, Logstash, and Kibana) for log management within EC2/Elastic Load Balancer for Elasticsearch, and worked with Logstash to visualize key OpenStack environment log metrics in Kibana.
  • Automated the process of creating Grafana dashboards for monitoring Keystone data source. Installed and configured Grafana dashboard to display keystone data logs.
  • Built Elasticsearch, Logstash, and Kibana (ELK) for centralized logging, storing logs and metrics in an S3 bucket via a Lambda function for retention beyond two weeks.
  • Implemented Nagios and integrated it with Ansible for automatic monitoring of servers. Designed and implemented Cobbler infrastructure and integrated it with Ansible for Linux provisioning.
  • Configured Source Code Management tools with Bamboo and executed triggers in SVN.
  • Implemented Continuous Integration and deployment using various CI Tools like Jenkins, Bamboo, Chef, and Puppet (Configuration Management Tools).
  • Installed, configured and administered Splunk Enterprise Server and Splunk Forwarder on Red hat Linux and Windows servers.
  • Worked on migration of Splunk to AWS (cloud) instances, and monitored Amazon ECS logs in Splunk, enabling SSL for security.
  • Monitored server applications, using monitoring tools such as OEM and AppDynamics along with Splunk log files to troubleshoot and resolve problems.
  • Maintained system log and CloudTrail collection using Splunk, including Splunk installation, collector configuration, and multi-indexer setup.
  • Worked with administrators to ensure Splunk was actively and accurately running and monitoring the current infrastructure implementation.
  • Expertise in building and monitoring software projects continuously with CI tools such as Bamboo, Hudson, CruiseControl, BuildForge, and Visual Build Professional.
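Parsing unstructured logs into JSON documents for the ELK stack, as described above, typically comes down to a named-group regular expression. A minimal Python sketch, assuming a hypothetical log-line format (the real formats handled are not shown in the resume):

```python
import json
import re

# Hypothetical line format assumed for illustration:
#   2019-04-02 11:05:33 ERROR payment-svc Timeout calling gateway
LOG_PATTERN = re.compile(
    r"(?P<timestamp>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}) "
    r"(?P<level>[A-Z]+) (?P<service>\S+) (?P<message>.*)"
)

def parse_log_line(line):
    """Turn one unstructured log line into a JSON document for Elasticsearch.

    Returns None for non-matching lines so callers can route them to a
    dead-letter index instead of dropping them silently.
    """
    match = LOG_PATTERN.match(line)
    if not match:
        return None
    return json.dumps(match.groupdict())
```

The same named groups map directly onto a Logstash grok pattern; doing it in Python is useful for one-off backfills or Lambda-based shippers.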

DEVOPS ENGINEER

Confidential, TX

Responsibilities:

  • Responsible for day to day Build and deployments in pre-production and production environments.
  • Deployed Puppet and Puppet Dashboard for configuration management of existing infrastructure.
  • Designed and coded business logic and database layers in C#, XML, C++, and Python.
  • Created private networks and sub-networks and brought instances under them based on requirements. Built a DNS system in EC2 and managed all DNS-related tasks. Managed the configuration of applications using Chef.
  • Performed the automation using Chef Configuration management & Scheduled automated nightly builds using Jenkins.
  • Kept information organized and accessible with a flexible page hierarchy using Atlassian Confluence pages.
  • Worked on documentation: Chef basics, initial setup of Chef, data bags implementation, coding standards, cookbook documentation, and testing docs, working with application deployment automation using Chef.
  • Installed applications and load-balancer packages on different servers using Chef.
  • To achieve Continuous Delivery goal on high scalable environment, used Docker coupled with load balancing tool Nginx.
  • Developing Ant and Maven scripts to automate the compilation, deployment and testing of Web and J2EE applications.
  • Developed GIT hooks for the local repository, code commit and remote repository, code push functionality.
  • Created snapshots and Amazon Machine Images (AMIs) of instances for backup and for creating clone instances.
  • Worked on creation of Docker containers and Docker consoles for managing the application life cycle & Used Docker containers for eliminating a source of friction between development and operations & Implemented Docker machine as a virtualization between systems.
  • Developed automation and deployment utilities using Ruby, Bash, PowerShell, Python, and Rundeck.
  • Integrated builds with code quality tools like Coverture, PMD, and Checkstyle.
  • Used BuildForge for continuous integration and deployment into WebSphere application servers.
  • Assisted customers in implementing DevOps strategies using BuildForge as the automation engine.
  • Worked on Splunk architecture and various components (indexer, forwarder, search head, deployment server), Universal and Heavy forwarder.
  • Helped the Unix and Splunk administrators deploy Splunk across the UNIX and Windows environments.
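The Git hooks mentioned above (commit and push checks on local and remote repositories) can be sketched as a commit-msg hook. This is an illustrative example, not the original: the ticket-prefix convention below is an assumption.

```python
import re
import sys

# Hypothetical convention assumed for illustration: every commit message
# must start with a JIRA-style ticket key, e.g. "OPS-123: fix deploy script".
TICKET_RE = re.compile(r"^[A-Z][A-Z0-9]+-\d+: ")

def commit_message_ok(message):
    """Return True if the first line follows the ticket-prefix convention."""
    first_line = message.splitlines()[0] if message else ""
    return bool(TICKET_RE.match(first_line))

def main(path):
    """commit-msg hook entry point: Git passes the message file's path."""
    with open(path, encoding="utf-8") as fh:
        ok = commit_message_ok(fh.read())
    sys.exit(0 if ok else 1)  # non-zero exit aborts the commit

if __name__ == "__main__" and len(sys.argv) > 1:
    main(sys.argv[1])
```

Saved as `.git/hooks/commit-msg` (executable), Git invokes it with the message file path; a non-zero exit status rejects the commit.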

BUILD AND RELEASE ENGINEER

Confidential

Responsibilities:

  • As a Build & Release Engineer responsible for continuous delivery, working with different teams to deliver high-quality applications to satisfy growing customer and business demands.
  • Coordinating different tasks with different teams for creating usage models for different projects.
  • Managed source control systems using GIT and SVN.
  • Designed, created, and maintained Git repositories to client specifications, and was involved in setting up the Subversion (SVN) server, server maintenance, and client machine setup.
  • Performed regular builds and deployment of the packages for testing in different Environments (DEV, QA, CERT, UAT and PROD).
  • Implemented ClearCase and Subversion branching and merging operations for Java source code.
  • Performing smoke tests to ensure the integrity of code deployment.
  • Performed builds on Java projects using ANT and MAVEN as build tools.
  • Initiated regular builds using the continuous integration tool Jenkins.
  • Configured Jenkins for doing the build in all the non-production and production environments.
  • Implemented Maven builds to automate the creation of artifacts such as jar, war, and ear files.
  • Release Engineer for a team that involved different development teams and multiple simultaneous software releases.
  • Developed and implemented software release management strategies for various applications according to agile process.
  • Managed Sonatype Nexus repositories to download artifacts during the build.
  • Used Puppet and other configuration management tools to deploy consistent infrastructure code across multiple environments.
  • Worked on Scrum methodology to maintain software development and coordinated with all the teams before and after the production deployments for the smooth production releases.
  • Deploying Java Enterprise applications to Apache Web Server, JBoss Application server.
  • Created a complete release process documentation, which explains all the steps involved in the release process.
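Release management of the kind described above usually involves computing the next artifact version for each release branch. A small sketch of a semantic-version bump helper; the actual versioning scheme used on these projects is not stated in the resume, so this mirrors the common MAJOR.MINOR.PATCH convention:

```python
def bump_version(version, part):
    """Return the next semantic version for a release.

    version: a "MAJOR.MINOR.PATCH" string, e.g. "2.4.1".
    part: which component to bump: "major", "minor", or "patch".
    Bumping a component resets everything to its right, per semver.
    """
    major, minor, patch = (int(p) for p in version.split("."))
    if part == "major":
        return f"{major + 1}.0.0"
    if part == "minor":
        return f"{major}.{minor + 1}.0"
    if part == "patch":
        return f"{major}.{minor}.{patch + 1}"
    raise ValueError(f"unknown part: {part!r}")
```

In a release pipeline a helper like this would feed the Maven `<version>` element and the Git/SVN release tag name.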

BUILD RELEASE ENGINEER

Confidential

Responsibilities:

  • Provided centralized software configuration management for enterprise application projects in a multi-tiered high-availability environment.
  • Integrated the Eclipse IDE with different versioning tools such as ClearCase, SVN, CVS, and Git.
  • Managed SVN branching and merging for older projects in multiple simultaneous releases.
  • Provided configuration services on multiple platforms in the test environment, running on one or more IT platforms such as client/server, Jenkins, MSBuild, Microsoft Windows NT, OS/390, and UNIX; completed software builds and elevations, created directories and security groups, and recreated prior versions.
  • Monitored software, hardware, and/or middleware updates utilizing technologies like Jenkins/Hudson, MSBuild, TFS Team Explorer, and SVN.
  • Worked on SVN and CVS administration, including user management, repository migration and creation, repository hook script implementation, integration with Jira/Git/Fisheye, and troubleshooting.
  • Created and configured jobs, script builders, custom command builders, and agents in Bamboo. Worked with SVN, CVS, and Git software configuration (source control) tools.
  • Managed all the environment and application level config using GIT.
  • Documented and published the complete migration process from Subversion (SVN admin dumps) to UCM ClearCase (VOBs).
  • Developed build and deployment standards with input from development, IT/operations, and IT security.
  • Evaluated build automation tools (OpenMake and AnthillPro) and recommended AnthillPro. Configured and deployed AnthillPro in a Solaris 10 environment with multiple zones/containers, using an Oracle database and multiple CVS servers and repositories, and configured services using SMF and XML.
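Repository hook scripts like those mentioned above often enforce a minimum log-message quality. A sketch of an SVN pre-commit hook in Python; the ten-character minimum is an arbitrary assumption, and svnlook is the standard tool for reading an in-flight transaction's log message:

```python
import subprocess
import sys

MIN_LOG_LENGTH = 10  # assumption: shortest acceptable commit message

def log_message_ok(message, min_length=MIN_LOG_LENGTH):
    """Reject empty or trivially short commit log messages."""
    return len(message.strip()) >= min_length

def main(repo, txn):
    """SVN pre-commit hook: svnlook fetches the transaction's log message."""
    message = subprocess.run(
        ["svnlook", "log", "-t", txn, repo],
        capture_output=True, text=True, check=True,
    ).stdout
    if not log_message_ok(message):
        sys.stderr.write("Commit rejected: log message too short.\n")
        sys.exit(1)  # non-zero exit aborts the commit

if __name__ == "__main__" and len(sys.argv) > 2:
    main(sys.argv[1], sys.argv[2])
```

SVN invokes `hooks/pre-commit` with the repository path and transaction name; anything written to stderr is shown to the committing client.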

SYSTEM ADMINISTRATOR

Confidential

Responsibilities:

  • Installed, configured, and maintained Red Hat Linux and CentOS servers, plus DNS, LDAP, and NFS.
  • Installed WebSphere Application Server 6.0 on Red Hat Linux boxes. Created Subversion repositories and imported projects into the newly created repositories per the standard directory layout.
  • Monitoring day-to-day administration and maintenance operations of the company network and systems working on Linux Systems.
  • Imported and managed multiple corporate applications into Subversion (SVN).
  • Systems performance monitoring.
  • Knowledge in VMware Virtualization.
  • Responsible for troubleshooting end user and application problems.
  • Assembling the systems and installing operating system and application software.
  • Designing computer displays to accomplish goals using flowcharts and diagrams.
  • Developed automated processes that run daily to check disk usage and perform cleanup of file systems on LINUX environments using shell scripting.
  • Monitored CPU, memory, physical disk, hardware and software RAID, multipath, file systems, and network using the Clo 4.0 monitoring tool.
  • Wrote shell scripts for automation of daily tasks; documented changes in the environment and on each server; analyzed error logs, user logs, and /var/log/messages.
  • Planned, scheduled, and implemented OS patches on Linux boxes as part of proactive maintenance.
  • Installed and upgraded packages and patches, handled configuration management, version control, and service packs, and reviewed connectivity issues related to security problems.
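The daily disk-usage check and file-system cleanup described above were written as shell scripts; an equivalent Python sketch of the two core pieces (usage percentage and stale-file discovery) could look like this. The age threshold is an assumed example.

```python
import os
import shutil
import time

def disk_usage_percent(path="/"):
    """Percentage of the filesystem at `path` that is currently in use."""
    usage = shutil.disk_usage(path)
    return 100.0 * usage.used / usage.total

def stale_files(root, max_age_days, now=None):
    """List files under `root` not modified in the last `max_age_days` days.

    A cleanup job would log and then remove these; discovery is kept
    separate from deletion so a dry run is trivial.
    """
    cutoff = (now if now is not None else time.time()) - max_age_days * 86400
    old = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            full = os.path.join(dirpath, name)
            if os.path.getmtime(full) < cutoff:
                old.append(full)
    return old
```

Run from cron, the script would alert when `disk_usage_percent` crosses a threshold and pass `stale_files` results to `os.remove` after logging them.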
