
AWS DevOps Developer Resume


MN

PROFESSIONAL SUMMARY

  • 9+ years of experience building highly available business applications for enterprise customers, spanning Cloud Computing, Linux system administration, and network and security management.
  • Experience working with public and private cloud platforms, primarily Amazon Web Services (AWS).
  • Worked on Amazon Web Services (AWS) infrastructure with automation and configuration management tools such as Chef and Puppet.
  • Proficient with core Amazon Web Services including EC2, EBS, IAM, S3, ELB, RDS, VPC, Route 53, CloudWatch, and CloudFormation.
  • Defined AWS security groups which acted as virtual firewalls to control the incoming traffic onto one or more AWS EC2 instances (a short sketch follows this list).
  • Experience in real-time monitoring and alerting of applications deployed in AWS using CloudWatch, CloudTrail and Simple Notification Service (SNS).
  • Experience in deploying and monitoring applications on various platforms using Elastic Beanstalk.
  • Configured AWS Identity and Access Management (IAM) users and groups for improved login authentication.
  • Experience working on several Docker components like Docker Engine, Hub, Machine, Compose and Docker Registry.
  • Expertise in managing VPC configurations for organizations and maintaining networks and subnet ranges.
  • Implemented AWS high availability using Elastic Load Balancing (ELB), which balanced traffic across instances in multiple Availability Zones.
  • Configured and managed AWS Glacier to move old data to archives, based on the retention policy of databases/applications.
  • Worked on the v1 release of VEMS RESTful microservices developed in Java to be consumed by various consumers.
  • Proficient with container systems like Docker and container orchestration like EC2 Container Service, Kubernetes, worked with Terraform.
  • Migrated database objects - staging tables, final tables, aggregates, and consumption views - to Snowflake.
  • Migrated ETL - refactored the ETL process implemented in Informatica to load data into Snowflake staging tables (ETL jobs built to ensure full restartability).
  • Loaded final tables - created SQL scripts to load data from staging tables to final tables and orchestrated these scripts on an ongoing basis to keep the tables up to date.
  • Utilized the Teradata export framework (developed using the TPT tool) to load historical data into Snowflake.
  • Tested and ran the migrated jobs in parallel in the new environment and compared data between Teradata and Snowflake.
  • Managed Docker orchestration and Docker containerization using Kubernetes.
  • Used Kubernetes to orchestrate the deployment, scaling and management of Docker containers.
  • Assigned AWS elastic IP addresses to work around host or availability zone failures by quickly re-mapping the address to another running instance.
  • Designed, configured and deployed Amazon Web Services for a multitude of applications utilizing the Amazon services focusing on high-availability, fault tolerance and Auto Scaling.
  • Experience in installation, configuration and maintenance of Red Hat, CentOS and Ubuntu across multiple data centers.
  • Strong experience implementing projects using the JHipster framework and customizing it per project requirements.
  • Designed and developed the application using the Angular 2 framework along with JHipster, HTML5, CSS3, TypeScript, JavaScript, Bootstrap, Node.js, NPM, and MongoDB.
  • Ability to design applications on AWS taking advantage of disaster recovery design guidelines.
  • Hands-on experience with Chef Enterprise: installed the workstation, bootstrapped nodes, wrote recipes and cookbooks, and uploaded them to the Chef server.
  • Managed On-site OS/Applications/Services/Packages using Chef as well as AWS for EC2/S3/Route53 and ELB with Chef Cookbooks.
  • Worked with version control systems like Git and continuous integration tools like Jenkins.
  • Continuous integration, automated deployment and management using Jenkins.
  • Managed the Git version control system: created branches and tags and performed merges.
  • Practical experience in Linux administration and troubleshooting.
  • Experience in installation, configuration, backup, recovery, maintenance and support.
  • Experience in creating and managing user accounts, security, disk space and process monitoring in Red Hat Linux.
  • Installed and upgraded packages, managed version control, and reviewed connectivity issues related to security problems.
  • Performed Disk Management with the help of LVM (Logical Volume Manager)
  • Experience in writing Bash shell scripts to automate administrative tasks and scheduling them with cron jobs.
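
Below is a minimal sketch, assuming boto3 and purely illustrative names, ids and CIDR ranges (none of them come from the resume), of the kind of security-group definition referred to in the bullet above about virtual firewalls for EC2 instances.

    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    # Hypothetical security group; the VPC id below is a placeholder.
    sg = ec2.create_security_group(
        GroupName="web-tier-sg",
        Description="Allow HTTPS from the corporate network only",
        VpcId="vpc-0123456789abcdef0",
    )

    # Open inbound 443 to a restricted CIDR; all other inbound traffic stays denied by default.
    ec2.authorize_security_group_ingress(
        GroupId=sg["GroupId"],
        IpPermissions=[{
            "IpProtocol": "tcp",
            "FromPort": 443,
            "ToPort": 443,
            "IpRanges": [{"CidrIp": "10.0.0.0/16", "Description": "corporate network"}],
        }],
    )

Instances launched with this group accept only the traffic the rules allow, which is what makes the group behave like a virtual firewall.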

TECHNICAL SKILLS:

Programming Languages & Databases: C, C++, Java, C#, .NET, Python, Oracle 11g, PL/SQL, MySQL; ORM Framework: Hibernate; Java Technologies: Servlets, JSP

Cloud Services: EC2, S3, VPC, CloudFormation, CloudWatch, CloudTrail, Redshift, EMR, RDS, DynamoDB, SQS, IAM, SNS, SES.

Cloud: Amazon Web Services

Web Technologies: HTML, JavaScript, CSS, jQuery, XML, Ajax, JSON, DOM, XHTML

Web Services: SOAP, RESTful

Networking: TCP/IP Fundamentals, IP Addressing, IP Services (DNS, DHCP, NTP etc), NAT, Access-Control Lists

Operating Systems: Windows, Linux, Mac OS

IDEs: Python IDE, Eclipse

Infrastructure as Service: AWS, OpenStack

Containerization Tools: Docker

Application Deployment: uDeploy

Virtualization Platforms: Vagrant, VirtualBox

Configuration Management: Chef, Puppet, Ansible, SaltStack (working knowledge), JHipster

Build Systems: Maven, Ant, Gradle, NANT

Continuous Integration/Delivery: Jenkins, AnthillPro, TeamCity, Bamboo, Chef, Puppet.

Development Methodologies: Waterfall, Agile.

Application & Web Servers: WebLogic, Apache Tomcat, JBoss, Apache, IIS

Version Control: Git, SVN, Perforce

Scripting Languages: Shell, Ruby, Perl

Logging: Logstash, Splunk

Operating Systems: Linux, UNIX, Windows XP, 2000, 2003, 2008, 2010.

PROFESSIONAL EXPERIENCE

Confidential, MN

AWS DevOps Developer

Responsibilities:

  • Involved in designing and deploying applications utilizing almost all of the AWS stack (including EC2, Route 53, S3, RDS, DynamoDB, SNS, SQS, IAM), focusing on high availability, fault tolerance and auto scaling with AWS CloudFormation.
  • Migrated the current Linux environment to an AWS/RHEL Linux environment and used the auto scaling feature; also involved in remediation and patching of Unix/Linux servers.
  • Created AWS Route 53 records to route traffic between different regions.
  • Configured AWS IAM and Security Group in Public and Private Subnets in VPC.
  • Used MySQL, DynamoDB and ElastiCache to perform basic database administration.
  • Used GZIP with AWS CloudFront to forward compressed files to destination nodes/instances.
  • Built out server automation with Continuous Integration/Continuous Deployment tools like Jenkins and Maven for the deployment and build management system.
  • Installed and configured configuration management tooling such as the Chef server, workstation and nodes, bootstrapping AWS nodes via CLI tools.
  • Supported build and deployment issues for many other projects with Jenkins CI/CD pipelines.
  • Created a Function App on the portal using a Maven build project via a Jenkins CI/CD pipeline.
  • Automated CI/CD with Jenkins, the build-pipeline plugin, Maven and Git; set up Jenkins master/slave to distribute builds across slave nodes.
  • Architected, planned, developed and maintained infrastructure as code with CI/CD deployments using Terraform.
  • Installed VMware ESXi 5.5 and 6, vSphere Server and VMware vCenter Server on rack servers.
  • Created templates for main services like Nova, Swift and Neutron to reuse the current environment or to easily modify it.
  • Leveraged AWS cloud services such as EC2, auto-scaling and VPC to build secure, highly scalable and flexible systems that handled expected and unexpected load bursts.
  • Managed Amazon Redshift clusters, including launching clusters and specifying node types.
  • Used AWS Elastic Beanstalk for deploying and scaling web applications and services developed with Java, PHP, Node.js, Python, Ruby and Docker on familiar servers such as Apache and IIS.
  • Designed AWS Cloud Formation templates to create custom sized VPC, subnets, NAT to ensure successful deployment of Web applications and database templates.
  • Set up the JIRA "JQL Tricky" plugin to help users track work on their dashboards using JQL or manual configuration.
  • Discussed and designed the architecture of an EDW data migration framework using Spark and Redshift, orchestrated with Airflow and DynamoDB.
  • Built full-stack applications, microservices and RESTful APIs deployed in Docker using Node.js and Express.
  • Proficient with Agile software development experience as a Full Stack Developer (MEAN) and with microservices.
  • Created JIRA projects, templates, workflows, screens, fields and other administrative activities.
  • Provided ongoing support and configuration for JIRA project, workflows, Screens, fields, permissions, and other Admin tasks.
  • Involved in a POC to build an Airflow operator using Python which fetched data from Teradata and then enabled features like writing the data to S3 for Athena, generating result CSVs, and writing data to Snowflake (a sketch follows this list).
  • Implemented automated local user provisioning instances created in AWS cloud.
  • Set up and built AWS infrastructure resources - VPC, EC2, S3, IAM, EBS, security groups, Auto Scaling and RDS - in CloudFormation JSON templates.
  • Extracted the data from MySQL, Oracle, SQL Server using Sqoop and loaded data into Cassandra.
  • Provide highly durable and available data by using S3 data store, versioning, lifecycle policies, and create AMIs for mission critical production servers for backup.
  • Strong experience implementing projects using the JHipster framework and customizing it per project requirements.
  • Maintained the user accounts (IAM), RDS, Route 53, VPC, RDB, DynamoDB, SES, SQS and SNS services in the AWS cloud.
  • Developed incremental and full Sqoop jobs to import data from Oracle/Teradata to S3 and scheduled the jobs in production using Airflow.
  • Created a complex nightly ETL process using Snowflake SQL (supports the ANSI SQL standard) that collects the reverse inventory data (with supporting information such as material attributes and the total 4-wall inventory) from various base tables and objects created in the previous steps; this aggregated reverse inventory information is displayed in the Disposition Application for users to categorize into one of twelve disposition values.
  • Data extraction using the Snowflake COPY command and load using Teradata MLOAD/TPT (a sketch follows this list).
  • Data extraction and load using Apache Spark SQL and Hive.
  • Data extraction using Snowflake utility and load using Teradata TPT with AWS S3 access module.
  • Capacity planning of host servers to hold the data in between extraction and load period.
  • Creation of various Snowflake objects (tables, aggregate and consumption views) and Snowflake stage areas.
  • Worked on a POC for deploying the AWS infrastructure using Terraform and CloudFormation; created and configured a Redshift cluster for data warehousing and was responsible for security, including opening different ports on security groups and network ACLs.
  • Implemented infrastructure as code and automated infrastructure and EC2 deployments with Ansible and Terraform.
  • Created high-quality Spring Boot + Angular/React projects using JHipster.
  • Implemented Terraform modules for deployment of various applications across multiple cloud providers and managing infrastructure.
  • Implemented Cloud Infrastructure (IaaS) Automation across AWS Public Cloud using Packer & Terraform.
  • Implemented Terraform Enterprise to Provision Infrastructure across AWS Workloads and OpenShift Clusters.
  • Large-scale data modeling in Teradata and Snowflake environments, including creation of various database objects (tables, views, complex aggregate views for reporting and presentation layers), schemas, user accounts, roles, permission statements and Snowflake stage areas.
  • Responsible for the development of Docker images on internal suite of hardware platforms.
  • Converted legacy production virtual machine to Docker environment.
  • Implemented build stage- to build the Microservices and push the Docker container image to the private Docker registry.
  • Used the full MEAN stack (Express, AngularJS, Node.js and MongoDB) to store and present assessments.
  • Application backend implemented using Node.js application server.
  • Performed RESTful routing using Node.js, which submits form data to the MongoDB database.
  • Designed and implemented scalable, secure cloud architecture based on Amazon Web Services.
  • Defined branching, labeling, and merge strategies for all applications in Git.
  • Built a Continuous Integration environment with Jenkins and a Continuous Delivery environment. Utilized the Configuration Management tool Chef and created Chef cookbooks using recipes to automate system operations.
  • Built servers using AWS: imported volumes, launched EC2 and RDS instances, and created security groups, auto scaling groups and load balancers (ELBs) in the defined virtual private cloud.
  • Deployed applications on AWS using Elastic Beanstalk.
  • Used Ansible server and workstation to manage and configure nodes.
  • Configured plugins for the integration tools to the version control tools.
  • Manage source code, software builds, software versioning, & defect tracking on software maintenance tasks/projects.
  • Used Node.js, AngularJS and Bootstrap in creating web applications and cross-platform runtime in a fast-paced environment.
  • Administered and Engineered Jenkins for managing weekly Build, Test and Deploy chain, SVN/GIT with Dev/Test/Prod Branching Model for weekly releases.
  • Created monitors, alarms and notifications for EC2 hosts using CloudWatch.
  • Migrated applications to the AWS cloud.
  • Developed several modern web applications using JHipster.
  • Involved in DevOps processes for build and deploy systems.
  • Created Python scripts to fully automate AWS services, including web servers, ELB, CloudFront distributions, databases, EC2 and database security groups, S3 buckets and application configuration; these scripts create stacks, single servers, or join web servers to stacks (a sketch follows this list).
  • Managed version control tool Git to version code changes to help developers/programmers branch/merge/revert code.
  • Created users and groups, and monitored and maintained logs for system status/health using Linux commands and the Nagios system monitor.
  • Good understanding of ELB, security groups, NACLs, NAT, firewalls and Route 53.
  • Designed and developed automation test scripts using Python.
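
A simplified sketch of the Python automation described in the bullet above about scripts that create stacks and single servers. boto3 is assumed, and every resource name, AMI id and security group id is a placeholder rather than the actual project configuration.

    import boto3

    ec2 = boto3.resource("ec2", region_name="us-east-1")
    s3 = boto3.client("s3", region_name="us-east-1")

    # Application configuration bucket (hypothetical name).
    s3.create_bucket(Bucket="example-app-config-bucket")

    # Launch a web server instance into an existing security group (placeholder ids).
    instances = ec2.create_instances(
        ImageId="ami-0123456789abcdef0",
        InstanceType="t3.micro",
        MinCount=1,
        MaxCount=1,
        SecurityGroupIds=["sg-0123456789abcdef0"],
        TagSpecifications=[{
            "ResourceType": "instance",
            "Tags": [{"Key": "Role", "Value": "web-server"}],
        }],
    )
    print("Launched:", instances[0].id)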
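
A hedged sketch of the POC-style Airflow operator mentioned above that fetches data from Teradata and writes it to S3. The class name, the teradatasql driver, the connection details and the bucket/key are assumptions for illustration only.

    from airflow.models import BaseOperator


    class TeradataToS3Operator(BaseOperator):
        """Fetch rows from Teradata and write them to S3 as CSV (illustrative only)."""

        def __init__(self, sql, s3_bucket, s3_key, **kwargs):
            super().__init__(**kwargs)
            self.sql = sql
            self.s3_bucket = s3_bucket
            self.s3_key = s3_key

        def execute(self, context):
            import csv, io
            import boto3
            import teradatasql  # assumed Teradata driver

            # Placeholder connection details; a real DAG would use an Airflow connection.
            with teradatasql.connect(host="td-host", user="etl_user", password="***") as conn:
                cur = conn.cursor()
                cur.execute(self.sql)
                rows = cur.fetchall()

            buf = io.StringIO()
            csv.writer(buf).writerows(rows)
            boto3.client("s3").put_object(
                Bucket=self.s3_bucket, Key=self.s3_key, Body=buf.getvalue()
            )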
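
An illustrative sketch of the Snowflake COPY-based extraction referenced above, using the snowflake-connector-python package; the account, stage and table names are placeholders, and the downstream Teradata MLOAD/TPT load is not shown.

    import snowflake.connector

    # Connection parameters are placeholders, not real project values.
    conn = snowflake.connector.connect(
        account="example_account", user="etl_user", password="***",
        warehouse="ETL_WH", database="EDW", schema="STAGING",
    )
    try:
        cur = conn.cursor()
        # Unload table data to an S3-backed external stage; a Teradata TPT job
        # (not shown) then loads the staged files, as described in the bullet.
        cur.execute("""
            COPY INTO @EDW.STAGING.S3_UNLOAD_STAGE/reverse_inventory/
            FROM EDW.CORE.REVERSE_INVENTORY
            FILE_FORMAT = (TYPE = CSV FIELD_OPTIONALLY_ENCLOSED_BY = '"')
            OVERWRITE = TRUE
        """)
    finally:
        conn.close()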

Environment: AWS, EC2, S3, VMware, Apache Tomcat, CloudWatch, CI/CD, Microservices, CloudFormation, DynamoDB, VPC, IAM, Nagios, Git, Chef, Linux, Data Center Migration, JHipster, Jenkins, Maven.

Confidential, MD

AWS Developer

Responsibilities:

  • Launched Amazon EC2 cloud instances using Amazon Web Services (Linux/Ubuntu) and configured launched instances with respect to specific applications.
  • Created the stack using a CloudFormation template to launch multiple instances.
  • Installed the application on AWS EC2 instances and configured storage on S3 buckets.
  • Deployed EC2 instances, adding EBS block-level storage volumes to increase the availability of the website.
  • Took encrypted snapshots from the encrypted volumes to create new volumes of the required size.
  • Used Simple Storage Service (S3) for snapshots and configured the S3 lifecycle of application and database logs, including deleting old logs and archiving logs based on the retention policy of the apps and databases (a sketch follows this list).
  • Set up the Elastic Load Balancer (ELB) to send traffic to all instances in the Availability Zones.
  • Designed a security group for maintaining the inbound and outbound traffic.
  • Managed hosted Zone and domain name service using Route 53.
  • Used various routing policies in Amazon Route 53 (an illustrative weighted-routing sketch follows this list).
  • Focused on automating manual, CI/CD tasks, finding bottlenecks and tool assessment at corporate level.
  • Designed and implemented auto provisioning, testing, monitoring and build pipelines for CI/CD.
  • Participated in CI/CD pipeline development and resolved blockers along the way.
  • Worked with Python ORM libraries including Django ORM and SQLAlchemy.
  • Used several Python libraries like wxPython, NumPy and Matplotlib.
  • Participated in the complete SDLC process and used Python to develop website functionality.
  • Built various graphs for business decision making using the Python Matplotlib library.
  • Worked with Python OO design code for manufacturing quality, monitoring, logging, and debugging code.
  • Design and implement Spring Boot Microservices using ATDD and TDD to support highly customizable and scalable APIs.
  • Installed big data components such as Event Hubs, Stream Analytics, Cosmos DB, Blob Storage and an HDInsight Spark cluster on the cloud using a Jenkins CI/CD pipeline.
  • Implemented solutions for ingesting data from various sources and processing data at rest utilizing big data technologies such as Hadoop, MapReduce frameworks, HBase, Hive, Oozie, Flume, Sqoop, etc.
  • Promoted microservices robustness through adoption of resiliency patterns such as the circuit breaker pattern.
  • Architected microservices solutions to meet NFR requirements, including service discovery, zero-downtime (rolling upgrades), service versioning, resiliency and failover.
  • Defined policies and architecture for ensuring scalable, resilient and portable microservices to be run in containers.
  • Installed and managed plug-ins for JIRA and confluence in production environment.
  • Worked with various teams on Setting new JIRA & Confluence instances for new teams.
  • Worked on integrating (migrating) JIRA with Confluence, Fisheye and Crucible.
  • Generated scripts for effective integration of JIRA applications with other tools.
  • Conducted analysis and evaluation of existing systems to upgrade them to the latest version.
  • Helped users subscribe to the timesheet report to receive email notifications.
  • Developed a dashboard interface and used JIRA queries to populate it.
  • Setup of Virtual Private Networks across Departments with strong Network ACLs at both the Subnet and the Instance level.
  • Worked on installation of Docker using Docker toolbox.
  • Worked on creating the Docker containers, Docker images, tagging and pushing the images and Docker consoles for managing the application life cycle.
  • Deployed Docker Engine on Docker virtualized platforms for containerization of multiple applications.
  • Configured Docker containers for automated testing purposes.
  • Experienced in SOX audits with internal and external auditors and Defined Control objects for Risk and Control items.
  • Experienced in transferring data from different data sources into HDFS systems using producers, consumers and Kafka brokers.
  • Involved in developing Hive DDLs to create, alter and drop Hive tables, and worked with Storm and Kafka.
  • Auto scaled web application instances based on CloudWatch alarms during sudden increases in network traffic.
  • Involved in analyzing the existing architecture of on-premise datacenters and designing the migration of complex network architectures to the AWS public cloud.
  • Worked with networking teams in configuring AWS Direct Connect to establish dedicated connection to datacenters.
  • Designed roles and groups for users and resources using AWS Identity and Access Management (IAM).
  • Used Tomcat and WebLogic as standard application servers to deploy web applications.
  • Scripting in multiple languages on Linux - Bash shell and Korn shell.
  • Analyzed automated weekly instance usage reports and chose the right instance type for applications based on network I/O, CPU utilization and RAM.
  • Monitoring & metrics - used Amazon CloudWatch to monitor infrastructure and applications such as EBS, EC2, ELB and S3 (a sketch follows this list).
  • Configured notifications for the alarms generated based on defined events.
  • Conduct incident review and root cause analysis, and escalate incidents as appropriate.
  • Took the lead on troubleshooting most of the AWS services.
  • On-call support for issues related to Linux VMs hosted in AWS and network troubleshooting.
  • Managed a cost analysis tool to monitor AWS costs.
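
A rough sketch of the S3 lifecycle configuration described in the bullet above about archiving and deleting logs. boto3 is assumed, and the bucket name, prefix, transition window and expiration are placeholders rather than the actual retention policy.

    import boto3

    s3 = boto3.client("s3")
    s3.put_bucket_lifecycle_configuration(
        Bucket="example-app-logs",  # placeholder bucket
        LifecycleConfiguration={
            "Rules": [{
                "ID": "archive-and-expire-logs",
                "Filter": {"Prefix": "logs/"},
                "Status": "Enabled",
                # Archive logs to Glacier after 30 days, delete after a year (illustrative values).
                "Transitions": [{"Days": 30, "StorageClass": "GLACIER"}],
                "Expiration": {"Days": 365},
            }]
        },
    )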
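
An illustrative CloudWatch alarm of the kind used for the monitoring, alarm-notification and auto scaling bullets above. boto3 is assumed, and the alarm name, threshold, Auto Scaling group and SNS topic ARN are made up.

    import boto3

    cloudwatch = boto3.client("cloudwatch")
    cloudwatch.put_metric_alarm(
        AlarmName="web-cpu-high",  # hypothetical alarm
        Namespace="AWS/EC2",
        MetricName="CPUUtilization",
        Dimensions=[{"Name": "AutoScalingGroupName", "Value": "web-asg"}],
        Statistic="Average",
        Period=300,
        EvaluationPeriods=2,
        Threshold=75.0,
        ComparisonOperator="GreaterThanThreshold",
        # Notify an SNS topic; the same alarm could also trigger a scaling policy.
        AlarmActions=["arn:aws:sns:us-east-1:123456789012:ops-alerts"],
    )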
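
One possible Route 53 routing policy (weighted records splitting traffic between two ELBs), given as a hedged example for the routing-policy bullet above. boto3 is assumed, and the hosted zone id, record name, weights and ELB DNS names are placeholders.

    import boto3

    route53 = boto3.client("route53")
    # Send roughly 80% of traffic to "blue" and 20% to "green" (illustrative split).
    for identifier, weight, dns_name in [("blue", 80, "blue-elb.example.com"),
                                         ("green", 20, "green-elb.example.com")]:
        route53.change_resource_record_sets(
            HostedZoneId="Z0000000000EXAMPLE",
            ChangeBatch={"Changes": [{
                "Action": "UPSERT",
                "ResourceRecordSet": {
                    "Name": "app.example.com",
                    "Type": "CNAME",
                    "SetIdentifier": identifier,
                    "Weight": weight,
                    "TTL": 60,
                    "ResourceRecords": [{"Value": dns_name}],
                },
            }]},
        )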

Environment: AWS EC2, S3, Identity Access Management (IAM), CI/CD, EBS, Elastic Load Balancers, Route 53, CloudWatch, Git, WebLogic, Apache Tomcat, JHipster, Microservices, CloudFormation, Apache HTTPD, Groovy, Kafka, JSON, Bash, Shell

Confidential, NJ

AWS DevOps Engineer

Responsibilities:

  • Implemented the Agile development process across the Software Development Life Cycle.
  • Deployed NodeJS web applications into Elastic Beanstalk Environment.
  • Used NodeJS code in AWS Lambda Functions.
  • Design roles and groups using AWS Identity and Access Management (IAM).
  • Helped migrate and manage multiple applications from on-premise to the cloud using AWS services like S3, Glacier, EC2, RDS, SQS, SNS and SES.
  • Configured and maintained user accounts for dev, QA, and production teams and created roles for EC2, RDS, S3, Cloud Watch.
  • Launched and configured Amazon EC2 (AWS) cloud servers using AMIs (Linux) and configured the servers for specified applications.
  • Managed custom AMIs, created AMI tags and modified AMI permissions.
  • Created S3 buckets for EC2 instances to store all content, including HTML pages, images, CSS files and JavaScript files.
  • Configured, supported and maintained all network, firewall, storage, load balancer, operating system and software components in AWS EC2, and created detailed AWS security groups which behaved as virtual firewalls controlling the traffic allowed to reach one or more AWS EC2 instances.
  • Managing multiple AWS instances, assigning the security groups, Elastic Load Balancer and AMIs.
  • Auto scaling the instances to design cost effective, fault tolerant and highly reliable systems.
  • Provided production support by debugging system issues.
  • Repeatedly worked on the AWS cloud platform and its features, which include EC2, VPC, AMI, RDS, SES, S3, Route 53, IAM, CloudFormation, CloudFront and CloudWatch.
  • Configured the S3 lifecycle of application and database logs, including deleting old logs and archiving logs based on the retention policy of the apps and databases.
  • Configured and managed AWS Glacier to move old data to archives based on the retention policy of databases/applications.
  • Configured custom metrics in AWS CloudWatch for detailed monitoring (a sketch follows this list).
  • Used AWS SDK for JavaScript in NodeJS.
  • Implemented a Continuous Delivery framework using Jenkins, Maven and Nexus in a Linux environment.
  • Used Amazon Route 53 to manage public and private hosted zones.
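
A minimal sketch of publishing a custom CloudWatch metric for detailed monitoring, as referenced above. boto3 is assumed, and the namespace, metric name and dimensions are illustrative.

    import boto3

    cloudwatch = boto3.client("cloudwatch")
    # Publish a hypothetical application-level metric that dashboards or alarms can use.
    cloudwatch.put_metric_data(
        Namespace="Custom/App",
        MetricData=[{
            "MetricName": "QueueBacklog",
            "Dimensions": [{"Name": "Environment", "Value": "prod"}],
            "Value": 42,
            "Unit": "Count",
        }],
    )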

Environment: Amazon Web Services (AWS) EC2, Route 53, S3, VPC, IAM, CloudWatch Alarms, CloudFormation, SNS, SES, SQS, Git, GitHub, RDS, JUnit, Jenkins, Maven, NodeJS.

Confidential, MI

DevOps Engineer

Responsibilities:

  • Participated in weekly release meetings to identify and mitigate potential risks associated with the releases.
  • Imported and managed multiple corporate applications into GIT.
  • Integrated GIT into Jenkins to automate the code check-out process.
  • Performed regular builds and deployment of the packages for testing in different Environments.
  • Created branches and tags in Git for each release and particular environments.
  • Experience working with Git for branching, tagging and merging.
  • Implementing a Continuous Integration and Continuous Deployment framework using Jenkins, and Maven in Linux environment.
  • Performed merges between different branches and resolved all merge conflicts successfully by working with development teams.
  • Built servers using AWS: imported volumes, launched EC2 instances, and created security groups and load balancers (ELBs) in the defined virtual private cloud.
  • Created Jenkins Jobs for Build and Deployment of the application code on to the AWS Instances.
  • Maintained build and deployment procedures and resolved configuration management issues; created branches for each release for a particular environment, made baselines and merged branches.
  • Built and Deployed Java/J2EE to a web application server in an Agile continuous integration environment and automated the whole process.
  • Implemented a CI/CD pipeline involving Git, Jenkins, Chef, and Docker, for complete automation from commit to deployment.
  • Involved in Configuration Automation and Centralized Management with Ansible; implemented Ansible to manage all existing servers and automate the build/configuration of new servers using JHipster.
  • Involved in writing various Custom Ansible Playbooks for deployment orchestration and developed Ansible Playbooks to simplify and automate day-to-day server administration tasks.
  • Installed and Configured Chef Enterprise and Chef Workstation hosted as well as On-Premise; Bootstrapped Nodes; Wrote Recipes, Cookbooks and uploaded them to Chef-server.
  • Installed and created CI pipeline using Jenkins tool.
  • Created Jenkins Automated Pipeline for CI and CD with Maven Scripts along with GIT Version control.
  • Installed Chef Server and wrote custom cookbooks and recipes.
  • Configured the Chef nodes by using custom chef recipes and cookbooks.
  • Created Docker images for the entire application and pushed them to the designated Git repository (a sketch follows this list).
  • Ran the Docker containers in the test environment as well as the prod environment.
  • Creating fully automated CI build and deployment infrastructure and processes for multiple projects.
  • Developing scripts for build, deployment, maintenance and related tasks in Jenkins, using Bash.
  • Documented how code is developed, how the developed code is tested, and how data is integrated with that code.
  • Designed roles and groups for users and resources using AWS Identity Access Management (IAM).
  • Built Servers using AWS, importing volumes, launching EC2, creating security groups, load balancers in the defined virtual private connection.
  • Experience with AWS instances spanning Dev, Test and Pre-Production environments, and with cloud automation.
  • Trained on and worked with open source DevOps tools like Chef, Jenkins and Docker.
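
A hedged sketch of the Docker image build/tag/push/run flow described above, written with the Docker SDK for Python (the docker package) purely for illustration; the registry, image name and tag are placeholders, and the project may equally have used the Docker CLI.

    import docker

    client = docker.from_env()

    # Build the image from the application's Dockerfile (path and tag are placeholders).
    image, _ = client.images.build(path=".", tag="registry.example.com/app:1.0")

    # Push the tagged image to the private registry.
    client.images.push("registry.example.com/app", tag="1.0")

    # Run the container for a quick smoke test.
    container = client.containers.run("registry.example.com/app:1.0", detach=True)
    print(container.id)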

    Environment: Tomcat, Chef, Oracle, Icinga, Ansible, Bamboo, Perl, Jenkins, Nolio, Python, Ruby, JIRA, Maven, Artifactory, Git, Ubuntu, CentOS, AWS, Windows, Docker, CI/CD, Cloud Foundry, Shell Script.

Confidential

Jr. DevOps Engineer

Responsibilities:

  • Installation of patches, security fixes, e-fixes, packages on AIX, Sun Solaris, and Red Hat Linux.
  • Installation and configuration of Red Hat Linux, Sun Solaris and NT servers.
  • Update latest patches and upgrade all applications to the latest versions.
  • Scheduling jobs in Crontab, modifying them on regular basis according to requirement.
  • Set up a LAN consisting of ten RS/6000 servers running AIX 4.3, 5.0 and 5.1, and five Solaris name servers (E450).
  • Supported the network on connectivity issues relating to TCP/IP, IPX and AppleTalk.
  • Administered user accounts: added users and user groups, and removed users and groups.
  • Primary responsibilities of the job included monitoring all the servers to verify availability.
  • Troubleshooting of DNS and DHCP related problems.
  • Creation of Linux Virtual Machines using VMware ESX Virtual Client 5.1.0
  • Installation of Red Hat Linux Virtual Machines through kickstart and interactive installation.
  • Performance of RPM and YUM package installations, Yum repository management.
  • Managed and configured LVM (Logical Volume Manager).
  • Administration of DHCP, FTP, NTP, DNS, and NFS services in Linux.
  • Creation and management of user and group accounts, passwords, profiles, security (ACL, disk quota and PAM), permissions, disk space usage and processes (a disk-usage sketch follows this list).
  • Managed and scheduled cron jobs, system logging and network logging.
  • Regular disk management such as partitioning according to requirements, creating new file systems or growing existing ones over the hard drives and managing file systems.
  • Experience with network monitoring and performance management tools.
  • Offered technical support by troubleshooting Day-to-Day issues with various Servers on different platforms.
  • Provided customer services as requested.
  • Earned a reputation among co-workers and supervisors as a valuable and competent employee.
  • Trusted with closing responsibilities.
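
A small standard-library Python sketch of the kind of disk-space check referenced in the disk usage bullet above, suitable for running from cron; the mount points and threshold are assumptions, not values from the resume.

    import shutil

    THRESHOLD = 90  # percent used (illustrative)

    # Check a few common mount points and warn when usage crosses the threshold.
    for mount in ("/", "/var", "/home"):
        usage = shutil.disk_usage(mount)
        pct_used = usage.used / usage.total * 100
        if pct_used > THRESHOLD:
            print(f"WARNING: {mount} is {pct_used:.1f}% full")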

Environment: Red Hat Linux 3.0, SUN Solaris 2.6, NT/2000, MS Operating Systems
