
Cloud Engineer Resume

Chicago, IL

SUMMARY:

  • Over 8 years of professional experience as a DevOps/Build and Release Engineer, automating, building, deploying, managing, and releasing code from one environment to another and maintaining Continuous Integration, Continuous Delivery, and Continuous Deployment across multiple environments such as Development, Testing, Staging, and Production.
  • Worked as a DevOps Engineer on automating, configuring, and deploying instances on AWS, GCP, and in data centers.
  • Experience in Amazon Web Services (AWS) cloud, including services such as EC2, S3, VPC, ELB, EBS, Glacier, RDS, Aurora, CloudFront, CloudWatch, Security Groups, Lambda, CodeCommit, CodePipeline, CodeDeploy, DynamoDB, Auto Scaling, Route53, Redshift, CloudFormation, CloudTrail, OpsWorks, Kinesis, IAM, SQS, SNS, and SES.
  • Experience with cloud computing service models: Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS).
  • Experience in writing CloudFormation templates in YAML and JSON to build AWS services following the Infrastructure as Code paradigm.
  • Experience in provisioning highly available EC2 instances using Terraform and CloudFormation, and wrote new plugins to support new functionality in Terraform.
  • Experience using Terraform to create stacks of VPCs, ELBs, security groups, SQS queues, and S3 buckets in AWS, and updated Terraform scripts regularly as requirements changed.
  • Expertise in creating Docker containers, building Docker images, pushing those images to a Docker registry, and deploying and maintaining microservices using Docker.
  • Experience configuring the Terraform Kubernetes provider to interact with cluster resources and create objects such as ConfigMaps, Namespaces, Volumes, and Autoscalers.
  • Experience working with Docker components such as Docker Engine, Docker Hub, Docker Swarm, and Docker Registry; Docker Swarm provides clustering functionality for Docker containers.
  • Experience in configuring Chef Server and Chef Workstation, bootstrapping various enterprise nodes, and automating cloud deployments using Chef, Ruby, and AWS CloudFormation templates.
  • Experience in designing, installing, and implementing the Ansible configuration management system, and writing Ansible Playbooks in YAML to maintain roles and deploy applications.
  • Experience with Ansible and Ansible Tower as configuration management tools to automate repetitive tasks, quickly deploy critical applications, and manage environment configuration files.
  • Expertise in deploying servers using Puppet and PuppetDB for configuration management of existing infrastructure, and implemented Puppet 3.8 manifests and modules to deploy builds for Dev, QA, and Production.
  • Experience working with the EC2 Container Service plugin in Jenkins, which automates the Jenkins master/slave configuration by creating temporary slaves.
  • Expertise in configuring CI/CD pipelines and setting up auto-trigger, auto-build, and auto-deployment with CI/CD tools such as Jenkins.
  • Expertise in build and release management, with experience in CI/CD tools such as Jenkins, Hudson, BuildForge, and Bamboo, streamlining the code delivery process from developers' workstations to production systems.
  • Experience in branching, tagging, and maintaining versions across environments using SCM tools such as Subversion (SVN), CVS, Bitbucket, and Git on UNIX and Windows.
  • Knowledge of the principles and best practices of Software Configuration Management (SCM) in Agile, Scrum, and Waterfall methodologies.
  • Extensive experience using Maven, Gradle, and Ant as build tools for building deployable artifacts (JAR, WAR & EAR) from source code.
  • Experience with virtualization technologies such as VMware, VirtualBox, and Vagrant for creating virtual machines and provisioning environments.
  • Expertise in using webhooks to integrate with Continuous Integration tools such as Jenkins, TeamCity, and Bamboo, and with Ant, Maven, and Gradle for generating builds; designed quality profiles and enforced standards by installing Quality Gates in SonarQube.
  • Experience in setting up end-to-end environments by defining DNS records, load balancer VIPs, Apache proxies, and backend Tomcat/WebLogic servers, registering SiteMinder authentication services.
  • Supported deployments into Prod and Pre-Prod environments with multiple application server technologies such as WebLogic, JBoss, GlassFish, and Apache Tomcat.
  • Experience in developing UIs using JSP, HTML, CSS, and JavaScript, and with NoSQL databases such as Cassandra and MongoDB.
  • Experience in designing applications using HTML5, AngularJS, CSS, ng-grid, Bootstrap, Web API, and responsive web design for mobile access.
  • Served the ELK (Elasticsearch, Logstash, Kibana) stack community with use cases and a Logstash plugin, and actively participated in blogs and Q&A.
  • Experience with monitoring tools such as Nagios, Splunk, and AppDynamics, and task scheduling tools such as cron.
  • Experience implementing Nagios and Keynote for monitoring and analysing network loads on machines, enforcing custom Nagios monitoring, notifications, and dashboards to exhibit various metrics using shell scripting.
  • Experience setting up JIRA as a defect tracking system and configuring various workflows.
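The CloudFormation templating described above can be illustrated with a minimal sketch: building a template as a Python dict and serializing it to JSON. The logical ID and properties here are illustrative placeholders, not taken from any actual stack.

```python
import json

def make_s3_template(bucket_logical_id="AppArtifactBucket"):
    """Build a minimal CloudFormation template as a Python dict.

    The logical ID and the versioning property are illustrative,
    not taken from any real stack described in this resume.
    """
    template = {
        "AWSTemplateFormatVersion": "2010-09-09",
        "Description": "Minimal S3 bucket stack (illustrative)",
        "Resources": {
            bucket_logical_id: {
                "Type": "AWS::S3::Bucket",
                "Properties": {
                    # Versioning keeps prior object revisions recoverable
                    "VersioningConfiguration": {"Status": "Enabled"},
                },
            }
        },
    }
    return json.dumps(template, indent=2)

if __name__ == "__main__":
    print(make_s3_template())
```

The emitted JSON can be fed directly to `aws cloudformation create-stack --template-body`.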

TECHNICAL SKILLS:

Cloud services: AWS, GCP

Operating Systems: Linux, CentOS, Red Hat, Windows, Ubuntu

CI/CD Tools: Jenkins, Hudson, Bamboo, Nexus, JFrog Artifactory and SonarQube

Scripting Languages: Shell, Perl, Python, Bash; JSON and YAML

Containerization Tools: Docker, Packer

Build Tools: Maven, ANT, MS Build and Gradle

App Servers: JBoss, WebLogic and WebSphere

Methodologies: Agile, V-Model, Waterfall and UML

Version Control Tools: ClearCase, Git, Stash and Bitbucket

Bug Tracking Tools: Jira, Fisheye, Crucible, Rally, and Remedy

Web Servers: Apache, Apache Tomcat, Nginx, WebSphere and JBoss

Orchestration Tools: Kubernetes, Docker Swarm, Mesos

PROFESSIONAL EXPERIENCE:

Confidential, Chicago, IL.

Cloud Engineer

Responsibilities:

  • Configured, monitored, and automated GCP services; deployed the content cloud platform using Google Compute Engine and Google Cloud Storage, creating buckets and maintaining their policy management for storage and backup on Google Cloud.
  • Built a custom data-feed automation tool to read Stackdriver metrics (Volume, Availability, Latency, Errors, and Tickets) onto an on-premises VALET dashboard, using Google APIs to retrieve and create the metrics.
  • Configured a Google Cloud Virtual Private Cloud (VPC) and database subnet group for isolation of resources, and created several cyber security compliance processes for the organization.
  • Configured a hybrid cloud setup on GCP using VPN across two regions and used the Google Cloud Console to create and manage GCP and GKE workloads; wrote a Python script to send Stackdriver logs using a Cloud Function integrated with Pub/Sub and BigQuery, and automated all infrastructure workflows using Terraform.
  • Exported Stackdriver logs to Pub/Sub, and from Pub/Sub sent data to a GCS bucket and BigQuery.
  • Created a Composer environment and worked with Airflow to schedule jobs using DAGs; completed a POC triggering DAGs via REST API calls from an on-premises Unix server.
  • Used Dataprep to convert raw data into refined data, stored it in a Cloud Storage bucket, and exported it from Cloud Storage to PostgreSQL and BigQuery.
  • Created an Identity-Aware Proxy with OAuth authentication for triggering DAGs from the on-premises server via REST API calls, and used Python scripts in DAGs for job scheduling.
  • Installed a syslog-ng server on the local machine and, with the help of Logstash, exported logs from Pub/Sub to the syslog-ng server.
  • Created a Dataflow pipeline to continuously send logs from Stackdriver to a GCS bucket.
  • Automated Project creation, Network Firewall and Compute Instance creation using Terraform.
  • Managed IAM policies with active directory integration to manage security in GCP and AWS.
  • Configured, supported, and maintained all network, firewall, load balancer, and operating system components in AWS EC2, and created detailed AWS Security Groups which behave as virtual firewalls controlling the traffic allowed to reach one or more AWS EC2 instances.
  • Integrated Amazon CloudWatch with Amazon EC2 instances to monitor log files, store them, and track metrics. Created AMIs to implement automatic deployments of application components, bootstrapping AWS EC2 instances by passing user data to download files from S3.
  • Configured a Kubernetes Cluster on GKE and managed, production-ready environment for deploying containerized applications and deployed the Kubernetes dashboard to access the cluster via its web-based user interface.
  • Created Clusters using Kubernetes, kubectl and worked on creating many Pods, Replication controllers, Services deployments, Labels, Health checks and ingress by writing YAML files.
  • Developed and tested environments on different operating system platforms including Windows, Ubuntu, Red Hat Linux, CentOS, and Unix.
  • Used JIRA for creating bug tickets, storyboarding, pulling reports from dashboards, and creating and planning sprints.
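The Stackdriver-to-BigQuery export described above amounts to reshaping each log entry into a row matching the table schema. A minimal sketch of that transform step follows; the field names are illustrative assumptions, since the actual table schema is not given in the resume.

```python
import json

def log_entry_to_row(entry_json):
    """Flatten a Cloud Logging (Stackdriver) entry into a flat dict
    suitable for streaming into a BigQuery table.

    Field names here are assumptions for illustration; a real
    pipeline would match the destination table's actual schema.
    """
    entry = json.loads(entry_json)
    return {
        "timestamp": entry.get("timestamp"),
        "severity": entry.get("severity", "DEFAULT"),
        "message": entry.get("textPayload", ""),
        "resource_type": entry.get("resource", {}).get("type"),
    }
```

In a real deployment this function body would sit inside the Pub/Sub-triggered Cloud Function, with the returned dict handed to the BigQuery streaming-insert client.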

Environment: GCP, GCE, GKE, App Engine, Stackdriver, Pub/Sub, Cloud Function, BigQuery, Dataflow, Cloud Shell, VPC, AWS, EC2, Python, CloudWatch, Kubernetes, Windows, Linux, Jira.

Confidential, Ann Arbor, MI.

Sr. DevOps/Cloud Engineer

Responsibilities:

  • Set up CI/CD pipelines using continuous integration tools such as CloudBees Jenkins, automated the entire AWS EC2, VPC, S3, SNS, Redshift, and EMR based infrastructure using Terraform, Chef, Python, Shell, and Bash scripts, and managed security groups and custom CloudWatch monitoring on AWS.
  • Created an AWS RDS Aurora DB cluster and connected to the database through an Amazon RDS Aurora DB instance using the Amazon RDS Console; used Boto3 and Fabric for launching and deploying instances in AWS, and configured inbound and outbound rules in AWS security groups according to the requirements.
  • Used Amazon Elastic Container Registry integrated with Amazon ECS and the Docker CLI for development and production workflows; created various subscriptions and topics using SNS and SQS based services and automated the complete deployment environment on AWS.
  • Implemented Packer-based scripts for continuous integration with the Jenkins server and deployed them to Amazon EC2 instances; customized AMIs from existing AWS EC2 instances using the create-image functionality, using these snapshots for disaster recovery.
  • Implemented Cloud Infrastructure as a Service (IaaS) Automation across AWS Public Cloud using Packer & Terraform and implemented Terraform Enterprise to Provision Infrastructure across AWS Workloads and OpenShift Clusters.
  • Leveraged AWS S3 service as Build Artifact repository and created release-based Buckets to store various modules/branch based Artifact storage.
  • Used Terraform templates along with Packer to build images for application deployment in AWS.
  • Designed AWS CloudFormation Templates to create custom sized VPC, Subnets, NAT to ensure successful deployment of Web applications and database templates and created scripts in Python which integrated with Amazon API to control instance operations.
  • Created Kubernetes YAMLs using objects like Pods, Deployments, Services and ConfigMaps and created reproducible builds of the Kubernetes applications, managed Kubernetes manifest files and Helm packages.
  • Utilized Kubernetes as a platform to automate the deployment, scaling, and operation of application containers across a cluster of hosts, and worked closely with development teams and test engineers on EC2 size optimization and Docker build containers.
  • Used Docker to containerize custom web application and deploy them on Ubuntu instance through SWARM Cluster and to automate the application deployment in cloud using Vagrant.
  • Created a microservice environment on cloud by deploying services as a Docker container and used Amazon ECS as a container management service to run micro services on a managed cluster of EC2 instances.
  • Implemented Docker Containers to create images of applications and dynamically provision slaves to Jenkins CI/CD pipelines and reduced build and deployment times by designing and implementing Docker workflow.
  • Used Git for source code version control and integrated with Jenkins for CI/CD pipeline, code quality tracking and user management with build tools Maven and Gradle.
  • Involved in writing Jenkinsfiles using Groovy scripts to build CI/CD pipelines automating shell scripts.
  • Configured Jenkins jobs to automate builds, create artifacts, and execute unit tests as part of the build process; also integrated the build process with SonarQube for code quality analysis.
  • Automated tasks using Ansible Playbooks, migrated servers with the required configuration changes, and tested and deployed machines using Ansible commands.
  • Enhanced automation to make configuration management repeatable and consistent using Ansible-based YAML scripts, and worked on deployment automation of all the microservices to pull images from the private Docker Registry and deploy to the Kubernetes cluster using Ansible.
  • Used Knife and Chef Bootstrap processes and worked on Chef Server management console with proficient knowledge on all different components like Nodes and Workstations.
  • Worked with Chef Enterprise Hosted as well as On-Premise, Installed Workstation, Bootstrapping the Nodes and used Chef Ohai to collect system configuration data, which is provided to the Chef-Client for use within the Cookbooks to determine the System State.
  • Used Ruby scripting on Chef Automation for creating Cookbooks comprising all resources, templates, attributes and used Knife commands to manage Nodes.
  • Used an Ansible role to create an ELK cluster for non-log purposes: search and analytics over product and pricing data.
  • Designed an ELK (Elasticsearch, Logstash, Kibana) system to monitor and search enterprise alerts; installed, configured, and managed the ELK stack for log management within EC2/Elastic Load Balancer for Elasticsearch.
  • Used AWS Elastic Beanstalk to deploy and scale web applications and services developed with Java, PHP, Node.js, Python, Ruby, and Docker on familiar servers such as Apache and IIS.
  • Wrote new Nagios plugins to monitor resources, and worked on the implementation team to build and engineer servers on Ubuntu and RHEL Linux, provisioning virtual servers on VMware ESX.
  • Involved in setting up application servers such as Tomcat and WebLogic across Linux platforms, and wrote Shell, Bash, Perl, Python, and Ruby scripts on Linux.
  • Used JIRA for creating bug tickets, storyboarding, pulling reports from dashboards, and creating and planning sprints.
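The Kubernetes manifest work above (Pods, Deployments, Services, ConfigMaps) can be sketched in Python, the resume's main scripting language: `kubectl` accepts JSON manifests as well as YAML. The name, image, and labels here are placeholders, not taken from any actual workload.

```python
import json

def deployment_manifest(name, image, replicas=2):
    """Render a minimal Kubernetes Deployment manifest as JSON.

    kubectl accepts JSON as well as YAML; the name, image, and
    label values are illustrative placeholders.
    """
    return json.dumps({
        "apiVersion": "apps/v1",
        "kind": "Deployment",
        "metadata": {"name": name, "labels": {"app": name}},
        "spec": {
            "replicas": replicas,
            # The selector must match the pod template's labels
            "selector": {"matchLabels": {"app": name}},
            "template": {
                "metadata": {"labels": {"app": name}},
                "spec": {"containers": [{"name": name, "image": image}]},
            },
        },
    }, indent=2)
```

The output can be applied with `kubectl apply -f -` by piping the rendered JSON.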

Environment: AWS, Packer, Cloud Bees, Jenkins, Terraform, Kubernetes, Docker, Docker Swarm, Ansible, Chef, Python, Bash Scripts, Shell Scripts, YAML, Groovy Script, Git, Maven, ELK, Splunk, Nagios, Ubuntu, RHEL, Java, PHP, Ruby, Jira.

Confidential, Stamford, Connecticut

AWS Engineer

Responsibilities:

  • Involved in designing and deploying a multitude of applications utilizing almost the entire AWS stack, including EC2, Route53, S3, RDS, DynamoDB, SNS, SQS, and IAM, focusing on high availability, fault tolerance, and auto-scaling with AWS CloudFormation.
  • Migrated production infrastructure into Amazon Web Services utilizing AWS CloudFormation, CodeDeploy, EBS, and OpsWorks, and deployed and migrated applications using AWS CI/CD tools such as CodePipeline and CodeCommit.
  • Used Amazon ECR for hosting images in a highly available and scalable architecture, allowing container deployment for applications with integration into AWS Identity and Access Management.
  • Created AWS Multi-Factor Authentication (MFA) for instance RDP/SSH logon, worked with teams to lock down security groups, and created IAM roles so AWS resources can securely interact with other AWS services.
  • Utilized the AWS CLI to automate backups of ephemeral data stores to S3 buckets and EBS, and created AMIs of mission-critical production servers as backups.
  • Set up private networks and sub-networks using Virtual Private Cloud (VPC), created security groups to associate with the networks, and set up and administered the DNS system in AWS using Route53.
  • Configured AWS Identity and Access Management (IAM) groups and users for improved login authentication; also handled federated identity access using IAM to enable access to our AWS account.
  • Used Elastic Container Service (ECS) to support Docker containers to easily run applications on a managed cluster of Amazon EC2 instances and used Terraform in AWS Virtual Private Cloud to automatically setup and modify settings by interfacing with control layer.
  • Wrote Terraform templates and pushed them onto Chef for configuring EC2 instances; solved a gateway timeout issue on the ELB and moved all logs to an S3 bucket using Terraform.
  • Worked with Terraform to manage infrastructure through terminal sessions, executing scripts and creating alarms and notifications for EC2 instances using CloudWatch.
  • Converted existing Terraform modules that had version conflicts to CloudFormation templates for deployments and stack creation in AWS, and updated these scripts regularly as requirements changed.
  • Initiated a microservices application through Docker and Kubernetes cluster formation for application scalability, creating Docker images to upload to and download from Docker Hub.
  • Created Clusters using Kubernetes, kubectl and worked on creating many Pods, Replication controllers, Services deployments, Labels, Health checks and ingress by writing YAML files.
  • Built Docker images and deployed RESTful API microservices in containers managed by Kubernetes; developed a CI/CD system with Jenkins on a Docker container environment, utilizing Kubernetes and Docker as the runtime environment for the CI/CD system to build, test, and deploy.
  • Used Docker for container management, writing Dockerfiles, placing automated builds in Docker Hub, and managing deployments using Kubernetes; created local clusters and deployed application containers.
  • Built Jenkins pipelines to drive all microservice builds out to the Docker Registry, then deployed to Kubernetes, creating and managing Pods with Kubernetes.
  • Worked on infrastructure with microservice models such as Docker containerization, and collaborated with development support teams to set up a Continuous Delivery environment using Docker.
  • Used Git for creating and cloning local repositories, adding, committing, and pushing changes, creating and maintaining Git repositories, and analysing and resolving conflicts related to merging source code.
  • Installed and administered Artifactory repository to deploy the Artifacts generated by Maven and to store the dependent jars which are used during the Build.
  • Worked with Jenkins for any automation builds which are integrated with GIT as part of infrastructure automation under continuous integration (CI).
  • Developed version control of Chef Cookbooks, tested Cookbooks using Test Kitchen, and ran recipes on nodes managed by an on-premises Chef Server.
  • Deployed and configured Chef Server and Chef Solo, including bootstrapping Chef Client nodes for provisioning, and created Roles, Cookbooks, Recipes, and Data Bags for server configuration.
  • As an ELK developer, worked on all the internal tools; designed, deployed, and coordinated with different teams to enhance the ELK platform, and took ownership of new technologies.
  • Configured Apache webserver in the Linux AWS Cloud environment using Chef Automation and evaluated Chef Framework tools to automate the cloud deployment and operations.
  • Experienced in authoring pom.xml files, performing releases with the Maven release plugin in Java projects and managing Maven repositories.
  • Developed cron jobs and Shell and Python scripts to automate administration tasks such as file system management, process management, and backup and restore.
  • Developed Splunk Queries and dashboards targeted at understanding application performance and capacity analysis and worked on setup of various reports and alerts in Nagios.
  • Designed and administered databases for Oracle, MySQL to support various web programming tasks.
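The Terraform scripting described above can be sketched using Terraform's JSON configuration syntax (`*.tf.json`), generated here from Python for a self-contained example. The resource label, AMI ID, and instance type are placeholder values, not taken from any real environment.

```python
import json

def instance_tf(label="web", ami="ami-12345678", instance_type="t3.micro"):
    """Emit a Terraform configuration in its JSON syntax (*.tf.json)
    declaring one EC2 instance.

    The resource label, AMI ID, and instance type are illustrative
    placeholders; Terraform treats this JSON as equivalent to HCL.
    """
    config = {
        "resource": {
            "aws_instance": {
                label: {
                    "ami": ami,
                    "instance_type": instance_type,
                    # Tagging makes Terraform-managed resources auditable
                    "tags": {"ManagedBy": "terraform"},
                }
            }
        }
    }
    return json.dumps(config, indent=2)
```

Writing the output to `main.tf.json` lets the usual `terraform init` / `terraform plan` workflow pick it up alongside hand-written HCL.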

Environment: AWS, GKE, Terraform, Kubernetes, SWM, Docker, Git, ANT, BitBucket, Maven, Jenkins, Chef, Ruby, Nagios, Cacti, Zabbix, Splunk, Shell Scripts, Python, Nginx, Apache, JSON, Vagrant, WebLogic, Oracle, MySQL, Java.

Confidential, Mooresville, North Carolina

DevOps Engineer

Responsibilities:

  • Worked on AWS CloudWatch for monitoring the application infrastructure and used AWS email services for notifications; configured S3 versioning and lifecycle policies to back up files and archive them in Glacier.
  • Configured, supported, and maintained all network, firewall, load balancer, and operating system components in AWS EC2, and created detailed AWS Security Groups which behave as virtual firewalls controlling the traffic allowed to reach one or more AWS EC2 instances.
  • Created monitors, alarms and notifications for EC2 hosts using Cloud Watch and monitored system performance, managed Disk Space LVM (Logical Volume Manger) and performed system Backup and Recovery.
  • Built S3 buckets and managed their policies, using S3 and Glacier for storage and backup on AWS; created snapshots and Amazon Machine Images (AMIs) of EC2 instances for backup and for creating clone instances.
  • Migrated production infrastructure into Amazon Web Services utilizing AWS CloudFormation, CodeDeploy, EBS, and OpsWorks.
  • Created a Dockerfile for each microservice and changed the Tomcat configuration files required to deploy the Java-based application to the Docker container.
  • Worked on end-to-end setup of Artifactory Pro as a Docker container with a secure private Docker Registry and local Docker repositories for storing built Docker images.
  • Supported deployment, scaling, and load balancing of the application from dev through prod, easing the code development and deployment pipeline by implementing Docker containerization with multiple namespaces, with sources managed in Git.
  • Created, tested and deployed an End to End CI/CD pipeline for various applications using Jenkins as the main Integration server for Dev, QA, Staging, UAT and Prod environments.
  • Automated deployments using Puppet along with Hiera data on the MCollective orchestration engine, writing manifests and modules for different microservices.
  • Created Puppet automation with a module per component (MySQL, HTTP collectors, Schema Registry) to install and configure EC2 instances; implemented release schedules, created rollout plans, and tracked project milestones.
  • Built and managed a highly available monitoring infrastructure to monitor different application servers and their components using Nagios with Puppet automation.
  • Installed, configured, upgraded, and managed Puppet Master, Agents, and databases; integrated Puppet with Apache and Passenger, with experience in both Puppet Enterprise and Puppet Open Source.
  • Worked closely with development organizations to implement the tools and processes needed to automate builds, deployments, testing, and infrastructure (Infrastructure as Code) using Chef.
  • Wrote Chef Cookbook recipes to automate installation of middleware infrastructure such as Apache Tomcat and the JDK, plus configuration tasks for continuous management of new environments.
  • Worked with the Chef Ohai plugin and push jobs, with exposure to the Chef Supermarket to leverage existing Cookbooks for quick automation of general deployment and infrastructure tasks.
  • Involved in developing JUnit Test Cases to validate the type of data in the XML Files. Used Log4J for logging and tracing the messages.
  • Implemented and maintained the branching of build/release strategies utilizing Clear Case.
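The per-microservice Dockerfiles for Tomcat-based Java services mentioned above might look like the following sketch, generated here as text so the example runs anywhere. The base image tag, WAR name, and config file are assumptions for illustration.

```python
def tomcat_dockerfile(war_file="app.war", base="tomcat:9-jdk11"):
    """Sketch the kind of Dockerfile used to containerize a Java/Tomcat
    microservice.

    The base image tag, WAR name, and server.xml override are
    placeholder assumptions, not taken from the actual project.
    """
    lines = [
        f"FROM {base}",
        "# Replace the default server config with the service's own",
        "COPY server.xml /usr/local/tomcat/conf/server.xml",
        f"COPY {war_file} /usr/local/tomcat/webapps/ROOT.war",
        "EXPOSE 8080",
        'CMD ["catalina.sh", "run"]',
    ]
    return "\n".join(lines)
```

Writing the result to `Dockerfile` and running `docker build` would produce the image to push to the private registry.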

Environment: AWS, Docker, Chef, Puppet, Git, Maven, Ant, Jenkins, Java, Kafka, ZooKeeper, MySQL, Nagios, XML, Log4J, JUnit, ClearCase, Apache Tomcat, JDK, Spark

Confidential, Medley, Florida

Build/Release Engineer

Responsibilities:

  • Developed build and deployment scripts using Ant and Gradle as build tools in Jenkins to promote builds from one environment to another.
  • Involved in setting up Puppet Master/Client to automate installation and configuration across the environment.
  • Created, automated, and managed builds, and was responsible for continuous integration of builds using SVN, UNIX, Tomcat, and IBM Message Broker.
  • Created analytical metrics reports and dashboards for release services based on JIRA tickets.
  • Troubleshot the automated installation and configuration of .NET applications in test and production environments.
  • Installed and configured Jenkins to automate deployments and provide a complete automation solution.
  • Reviewed patch management via PowerShell scripts to discover current patch status and deploy patches to affected systems, and implemented Windows Server Update Services (WSUS) to schedule updates.
  • Configured TCP/IP for servers, workstations, and setup of complete network.
  • Extensively worked with software build tools like Apache Maven, Apache Ant to write pom.xml and build.xml respectively.
  • Developed UNIX and Perl scripts for manual deployment of code to the different environments, emailing the team when a build completed.
  • Created UNIX scripts for build and Release activities in QA, Staging and Production environments.
  • Managed and installed software packages using YUM and RPM and created repository files for offline servers.
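Build-and-release scripts like those above typically record what was promoted where. A minimal Python analogue of that bookkeeping step follows; the manifest field names are illustrative, not from the actual scripts.

```python
import hashlib
import os

def release_manifest(artifact_path, environment):
    """Record an artifact's checksum and target environment, mirroring
    the bookkeeping a release script does before promoting a build.

    Field names are illustrative placeholders.
    """
    with open(artifact_path, "rb") as f:
        # A content hash lets later stages verify the exact bits deployed
        digest = hashlib.sha256(f.read()).hexdigest()
    return {
        "artifact": os.path.basename(artifact_path),
        "sha256": digest,
        "environment": environment,
    }
```

A QA promotion script would call this before copying the artifact, then compare the recorded hash after the copy to confirm integrity.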

Environment: Ant, Gradle, Jenkins, Puppet, Unix, Tomcat, Jira, .Net, PowerShell, TCP/IP, Maven, Perl, Yum, Rpm

Confidential

Java Developer

Responsibilities:

  • Analysis, design, and development of the application based on J2EE using Spring and Hibernate.
  • Involved in interacting with the Business Analyst during the Sprint Planning Sessions.
  • Used Hibernate for Object Relational mapping with Oracle database.
  • Used Spring IOC for injecting the beans and reduced the coupling between the classes.
  • Created RFP (Request for Proposal) micro-services to provide RESTful API utilizing Spring Boot with Spring MVC.
  • Used Spring IOC (Inversion of Control)/DI (Dependency Injection) for wiring the object dependencies across the application.
  • Used Eclipse IDE as development environment to design, develop and deploy Spring components on Tomcat server.
  • Implemented connection Pooling for database connectivity, transaction and retrieval queries using SQL with the backend Database.
  • Designed and developed several web pages using HTML/CSS and JavaScript to perform validations at Client's side.
  • Created Web Pages using XML, XSLT, JSP, HTML and JavaScript.
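The connection pooling mentioned above can be sketched with a tiny pool. To keep the example self-contained and runnable, sqlite3 stands in for the Oracle database the project actually used; the class and method names are illustrative.

```python
import queue
import sqlite3

class ConnectionPool:
    """Minimal connection pool illustrating the pattern from the
    bullet above.

    sqlite3 is used here only so the sketch runs anywhere; the
    real application pooled connections to a production RDBMS.
    """

    def __init__(self, dsn, size=3):
        # Pre-open a fixed number of connections; callers borrow and return
        self._pool = queue.Queue(maxsize=size)
        for _ in range(size):
            self._pool.put(sqlite3.connect(dsn))

    def acquire(self):
        # Blocks if every connection is currently checked out
        return self._pool.get()

    def release(self, conn):
        self._pool.put(conn)
```

Borrow with `acquire()`, run queries, and always `release()` in a `finally` block so the pool never leaks connections.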

Environment: J2EE, Hibernate, Spring IOC, Eclipse IDE, Spring Boot, Spring MVC, HTML, JavaScript, XML, XSLT, JSP.

Confidential

Linux/Unix Admin

Responsibilities:

  • Managed and administered all UNIX servers, including Linux operating systems, applying relevant patches and packages during regular maintenance periods using Red Hat Satellite server, YUM, and RPM tools.
  • Installed, maintained, managed, and regularly upgraded Red Hat Linux, Ubuntu, CentOS, and Fedora servers in both standalone and virtual environments.
  • Created and managed virtual memory (swap space) and filesystems, while also supporting data management through on-site and off-site storage and retrieval services.
  • Configured and worked on scripts for DNS lookup tests on netgroups, and for auto-mounting and unmounting shares on the Linux side.
  • Provided production support of Apache HTTPD, JBoss, Tomcat, and Oracle WebLogic 10.3 application servers, including installation, configuration, management, and troubleshooting.
  • Strong understanding of process automation written in shell scripts with Bash and Python.
  • Maintained all computer, server, network, and wireless hardware.
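The DNS lookup test scripts mentioned above boil down to resolving a name and checking the result. A minimal Python version is sketched below (the original scripts were shell-based; this is an analogue, not the actual tooling):

```python
import socket

def resolve(hostname):
    """Return the sorted set of addresses a hostname resolves to,
    a tiny analogue of the DNS lookup checks described above."""
    # getaddrinfo covers both IPv4 and IPv6 answers
    infos = socket.getaddrinfo(hostname, None)
    return sorted({info[4][0] for info in infos})
```

A cron-driven check could call `resolve()` for each critical hostname and alert when the answer set changes or comes back empty.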

Environment: Unix, Red Hat, DNS, YUM, RPM, Ubuntu, CentOS, Fedora, Apache HTTPD, JBoss, Tomcat, Oracle, WebLogic, kernel, Shell, Bash, Python.
