
Linux Administrator Resume


Waltham, MA

SUMMARY

  • Over 10 years of experience as an ELK Engineer and in Build/Release management, SCM, Environment Management, and Build/Release Engineering, automating, building, releasing, and promoting configuration changes from one environment to another.
  • SME on Elasticsearch, recognized for customer obsession, ownership of operational issues, and developing solutions to address scalability issues.
  • Experience administering, maintaining, and monitoring ELK (Elasticsearch, Logstash, Kibana) and Kafka clusters in dev, stage, and prod environments using Ansible.
  • Experience in managing ELK Stack and Kafka clusters deployed in Amazon EKS (Kubernetes clusters).
  • Experience performing CRUD operations on Kubernetes resources such as Deployments, StatefulSets, and Ingresses.
  • Experience in upgrading Elasticsearch clusters from 5.x to 6.x to 7.x using rolling/full upgrade methods.
  • Experience adding/removing brokers in the cluster and setting up monitoring with AppDynamics and Metricbeat.
  • Experience setting up SSL, configuring SCRAM for the SASL_SSL listener, and creating the Kafka ACLs required to follow the principle of least-privilege access.
  • Experience configuring RBAC in Confluent Platform 5.5.0 for core components such as brokers, Schema Registry, and Kafka REST Proxy.
  • Experience with configuring multiple Logstash pipelines and filtering the data before pushing to Elasticsearch.
  • Expertise in Application Deployments & Environment configuration using Ansible.
  • Experience in automating repetitive tasks with python/shell.
  • Experience writing Ansible scripts; wrote many Ansible playbooks, from installing Java/Tomcat and copying files to configuring and installing applications.
  • Experience integrating applications with AD using LDAP and handling/updating SSL certs in ACM/IAM.
  • Experience writing Dockerfiles to build application-specific images, deploying the application on the built image using docker-compose, and deploying apps into EKS clusters.
  • Responsible for designing and deploying new ELK clusters (Elasticsearch, Logstash, Kibana, Beats, Kafka, ZooKeeper, etc.).
  • Designed, built, and managed the ELK (Elasticsearch, Logstash, and Kibana) cluster for centralized logging and search functionality for the application.
  • Wrote and maintained automated Salt scripts for Elasticsearch, Logstash, Kibana, and Beats. Expertise in repository-management tools: JFrog Artifactory and Nexus.
  • Good working knowledge of monitoring tools like Nagios; real-time streaming of data using Spark with Kafka.
  • Skilled in monitoring servers using Nagios, Datadog, and CloudWatch, and using the EFK stack (Elasticsearch, Fluentd, Kibana).
  • Excellent hands-on experience configuring and managing monitoring tools.
  • Configured application monitoring in ELK and AppDynamics.
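The Ansible playbooks mentioned above can be sketched roughly as follows; the inventory group, package version, and file paths here are illustrative assumptions, not the original automation:

```yaml
# Hypothetical playbook: install Java and Elasticsearch on a node group.
- hosts: elasticsearch_nodes        # assumed inventory group
  become: true
  vars:
    es_version: "7.17.0"            # illustrative version
  tasks:
    - name: Install Java
      yum:
        name: java-11-openjdk
        state: present
    - name: Install Elasticsearch RPM
      yum:
        name: "https://artifacts.elastic.co/downloads/elasticsearch/elasticsearch-{{ es_version }}-x86_64.rpm"
        state: present
    - name: Deploy elasticsearch.yml from a template
      template:
        src: elasticsearch.yml.j2   # assumed template in the role
        dest: /etc/elasticsearch/elasticsearch.yml
      notify: restart elasticsearch
  handlers:
    - name: restart elasticsearch
      service:
        name: elasticsearch
        state: restarted
```

The handler pattern keeps restarts conditional: Elasticsearch only bounces when the rendered config actually changes.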

TECHNICAL SKILLS

Programming Languages: Python, C++, Java, C, LR script

Web Technologies/Frameworks: HTML, CSS, JavaScript, Bootstrap, JSP, JDBC, Spring ORM with Hibernate, Spring MVC, EJB, Angular

Web Service Specification: XML, REST/JSON, JAX-RS, JAX-WS

Database: MySQL, MongoDB, Firebase

Build Tools: Maven, Ant and MS Build

Operating System: Windows and Linux

Tools: Eclipse, Android Studio, HP LoadRunner 11.0, Selenium, Maven, HP Quality Center, Postman, JIRA

Version Control Tools: SVN, GIT, TFS, CVS, Bitbucket and IBM Rational Clear Case.

Reporting Tools: Elasticsearch, Logstash, Kibana (ELK), Splunk

Configuration Tools: Chef, Puppet, Salt Stack and Ansible

Virtualization Tools: Docker, Oracle VirtualBox and VMware

Automation Tools: Jenkins/Hudson, Build Forge and Bamboo, Salt Stack, GitHub

Web/Application Servers: Web Logic, Apache Tomcat, Web Sphere and JBOSS

PROFESSIONAL EXPERIENCE

Sr ELK Stack Engineer

Confidential

Responsibilities:

  • Migrated data from Elasticsearch 2.x clusters to 5.6.4, then from 5.x to 6.5 and 6.5 to 7.x, using Logstash, Filebeat, and Kafka for all environments.
  • Provided design recommendations and thought leadership to improve review processes and resolve technical problems; worked with product managers to architect the next generation of Workday searches.
  • Responsible for designing and deploying new ELK clusters (Elasticsearch, Logstash, Kibana, Beats, Kafka, ZooKeeper, etc.).
  • Worked on configuring the EFK stack and used it for analyzing the logs from different applications.
  • Involved in creating the cluster, and implemented cluster backups by taking snapshots with Curator.
  • Spun up environments using Chef cookbooks and modified them per our requirements.
  • Created users for application teams to view their logs using curl statements, granting them read-only access.
  • Configured X-Pack for security and monitoring of the cluster, and created Watches to check the health and availability of the nodes.
  • Used AWS Elastic Beanstalk for deploying and scaling web applications and services developed with Java, PHP, Node.js, Python, Ruby, and Docker on familiar servers such as Apache and IIS.
  • Worked on Cloud automation using AWS Cloud Formation templates.
  • Implemented Continuous Delivery framework using Jenkins, Chef, and Maven in Linux environment.
  • Responsible for Continuous Integration (CI) and Continuous Delivery (CD) process implementation using Jenkins along with Shell scripts to automate routine jobs.
  • Manage AWS EC2 instances utilizing Auto Scaling, Elastic Load Balancing and Glacier for our QA and UAT environments as well as infrastructure servers for GIT and Chef.
  • Installed and deployed Kafka, Zookeeper, ELK, and Grafana using Ansible playbooks
  • Benchmarked Elasticsearch 5.6.4 for the required scenarios.
  • Involved in temporarily enabling cluster logs and search slow logs via REST API calls, and analyzing the collected logs to troubleshoot Elasticsearch functional and performance issues.
  • Involved in updating the cluster settings using both API calls and configuration-file changes.
  • Prepared an Elasticsearch operations guide and trained the operations team on day-to-day operations such as backup, restore, re-indexing, and troubleshooting frequently occurring problems.
  • Worked on cluster maintenance, data migration from one server to another, and ELK stack upgrades.
  • Merged data into shards to avoid data loss and support load balancing.
  • Used X-Pack for monitoring and security on the Elasticsearch 5.6.4 cluster.
  • Elasticsearch and Logstash performance and configuration tuning.
  • Identify and remedy any indexing issues, crawl errors, SEO penalties, etc.
  • Used Curator on Elasticsearch for data backup and restore.
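The snapshot-based backups above reduce to a REST call against the snapshot API; the repository name, snapshot name, and index patterns below are illustrative assumptions:

```python
import json

def snapshot_request(repo, snapshot, indices):
    """Build the URL path and JSON body for an Elasticsearch snapshot PUT request."""
    path = f"/_snapshot/{repo}/{snapshot}?wait_for_completion=false"
    body = {
        "indices": ",".join(indices),     # comma-separated index patterns
        "ignore_unavailable": True,       # skip missing indices instead of failing
        "include_global_state": False,    # snapshot data only, not cluster state
    }
    return path, json.dumps(body)

# Hypothetical nightly snapshot of two application index patterns.
path, body = snapshot_request("s3_backup", "nightly-2020-01-01", ["app-logs-*", "audit-*"])
print(path)
print(body)
```

Restores work the same way against `/_restore` on the stored snapshot, which is what makes the process easy to hand off in an operations guide.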

Environment: Docker, Jenkins, Ansible, Elasticsearch, Logstash, Kibana, Kafka, Grafana, elasticdump, Filebeat, Kubernetes, Kafka Streams, Cassandra.

Elastic Stack Consultant

Confidential - Charlotte NC

Responsibilities:

  • Responsible for working with the technical team to design, document, build, secure, and maintain Elastic Stack Enterprise solutions (Elasticsearch, Logstash, Kibana, and Beats).
  • Responsible for capacity planning, sizing, and performance optimization of the Elastic Stack.
  • Troubleshot production incidents requiring detailed analysis of issues; created internal JIRA issues for bug reports and SOPs for bug fixes and workarounds.
  • Worked on complex problems spanning a wide range of AWS services at all layers of the stack, such as EC2, networking, security, storage, deployment, and databases, with a focus on Big Data and Analytics.
  • Involved in Elastic stack packaging, configuration, and integration.
  • Ran bulk-loading jobs through Spark and various Python-based bulk loads into the Elastic environment.
  • Designed and tailored Elasticsearch indices for highly relevant search results and hit highlighting.
  • Set up and configured LDAP, SAML, and PKI authentication mechanisms for Elasticsearch, and SSO for Kibana.
  • Utilized role mappings, RBAC, and Kibana spaces to meet data security requirements.
  • Worked closely with architects, engineers, developers, and integrators to assess customer requirements and to design and support Elastic Stack solutions.
  • Configured and maintained Linux-based operating systems in support of Elasticsearch.
  • Set up data tiers (hot-warm-frozen) on Elastic with appropriate ILM policies.
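The hot-warm-frozen tiering above is driven by an ILM policy; the rollover size, age thresholds, and repository name below are illustrative assumptions rather than the production values:

```python
import json

def ilm_policy(hot_max_size_gb, warm_after_days, frozen_after_days):
    """Build an ILM policy body that rolls indices through hot -> warm -> frozen."""
    return {
        "policy": {
            "phases": {
                "hot": {
                    "actions": {
                        # roll over once a primary shard reaches the size cap
                        "rollover": {"max_primary_shard_size": f"{hot_max_size_gb}gb"}
                    }
                },
                "warm": {
                    "min_age": f"{warm_after_days}d",
                    "actions": {"shrink": {"number_of_shards": 1}},
                },
                "frozen": {
                    "min_age": f"{frozen_after_days}d",
                    # assumed snapshot repository name
                    "actions": {"searchable_snapshot": {"snapshot_repository": "backup_repo"}},
                },
            }
        }
    }

# Hypothetical policy: roll over at 50gb, warm after 7 days, frozen after 30.
policy = ilm_policy(50, 7, 30)
print(json.dumps(policy, indent=2))
```

The body would be PUT to `/_ilm/policy/<name>` and attached to indices through an index template.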

Environment: AWS, EC2, Kubernetes, Kibana, RedHat, Windows, Ubuntu, MySQL, Oracle, Docker, Beats, Ansible, Elastic Load Balancer, Route 53, S3, CloudWatch, CloudTrail.

ELK Stack Engineer

Confidential - Minneapolis, MN

Responsibilities:

  • Analyzed structured and unstructured data points to design data-architecture solutions for scalability, high availability, fault tolerance, and elasticity.
  • Architected, designed, and implemented high-performance, large-volume data-integration processes, databases, storage, and other back-end services in fully virtualized environments.
  • Improved the performance of the Kafka cluster by fine-tuning the Kafka configurations at the producer, consumer, and broker levels.
  • Implemented disaster recovery by creating Elasticsearch clusters in two data centers and configuring Logstash to send the same data to both clusters from Kafka.
  • Installed and deployed Kafka, Zookeeper, ELK, and Grafana using Ansible playbooks
  • Wrote and maintained wiki documents covering planning, installation, and deployment of the ELK Stack and Kafka.
  • Wrote custom plugins to enhance/customize open-source code as needed.
  • Wrote Python scripts to parse JSON documents and load the data into a database.
  • Used Python and Django to interface with the jQuery UI and manage the storage and deletion of content. Wrote automation Salt scripts for managing, expanding, and replacing nodes in large clusters.
  • Synced Elasticsearch data between the data centers using Kafka and Logstash; managed the Kafka cluster and integrated Kafka with Elasticsearch.
  • Snapshotted Elasticsearch index data and archived it in the repository every 12 hours.
  • Strong expertise in implementing Kinesis, Elasticsearch, Logstash, and Kibana plugins; deployed Elasticsearch clusters on EKS.
  • Used Kibana to illustrate the data with various dashboard displays such as metrics, graphs, pie charts, and aggregation tables.
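A JSON-parsing loader like the Python scripts above can be sketched as a converter to the Elasticsearch `_bulk` wire format; the record fields and index name are illustrative assumptions:

```python
import json

def to_bulk_actions(json_lines, index):
    """Convert newline-delimited JSON records into Elasticsearch _bulk API lines."""
    lines = []
    for raw in json_lines:
        doc = json.loads(raw)
        # action line, then the document itself on the next line
        lines.append(json.dumps({"index": {"_index": index, "_id": doc.get("id")}}))
        lines.append(json.dumps(doc))
    # the _bulk endpoint requires a trailing newline
    return "\n".join(lines) + "\n"

# Hypothetical input records (e.g., read from a file or a Kafka topic).
records = ['{"id": "1", "msg": "login ok"}', '{"id": "2", "msg": "login failed"}']
payload = to_bulk_actions(records, "app-events")
print(payload)
```

The resulting payload is POSTed to `/_bulk` with `Content-Type: application/x-ndjson`, which amortizes indexing overhead across many documents.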

Environment: Load Balancer, Elasticsearch, Logstash, Kibana, Kafka, X-pack, Ansible, MySQL.

Elk Stack Engineer/Admin

Confidential - Bellevue, WA

Responsibilities:

  • Worked with AWS CloudWatch to monitor the application infrastructure and used AWS email services for notifications; configured S3 versioning and lifecycle policies to back up files and archive them in Glacier.
  • Handled operations and maintenance support for AWS cloud resources, including launching, maintaining, and troubleshooting EC2 instances, S3 buckets, Virtual Private Clouds (VPC), Elastic Load Balancers (ELB), and Relational Database Services (RDS).
  • Set up a self-managed Elasticsearch cluster on the AWS cloud.
  • Responsible for migrating data and ingest pipelines from Splunk to Elasticsearch.
  • Set up snapshot and restore processes for archival purposes.
  • Performed a DR exercise to fall back from the us-east-1 to the us-west-2 AWS region.
  • Migrated Splunk artifacts to Kibana visualizations and dashboards.
  • Created complex Watchers and Kibana alerts for observability.
  • Provided global search with Elasticsearch.
  • Wrote Watcher alerts based on required scenarios.
  • Developed APIs for integration with various data sources.
  • Implemented cloud-based integrations with elastic.
  • Setup a dedicated monitoring cluster to monitor multiple production Elasticsearch clusters.
  • Set up and configured Beats to send logs and metrics from Linux servers, Windows servers, and Kubernetes pods.
  • Created anomaly-detection ML jobs to replace the existing threshold-based alerts.
  • Created ingestion pipelines to parse custom application logs and ensure they are ECS (Elastic Common Schema) compliant.
  • Set up CCS (Cross-Cluster Search) for the frontend cluster.
  • Created documentation for onboarding new clients onto the Elasticsearch clusters.
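An ECS-mapping ingest pipeline like the ones above can be sketched as follows; the grok pattern and the assumed "level, client IP, text" log shape are illustrative assumptions, not the actual application format:

```python
import json

def ecs_pipeline():
    """Build an ingest-pipeline body that parses a log line into ECS-named fields."""
    return {
        "description": "Parse custom app logs into ECS fields (illustrative)",
        "processors": [
            {
                "grok": {
                    "field": "message",
                    # assumed log shape: "<level> <client ip> <free text>"
                    "patterns": ["%{LOGLEVEL:log.level} %{IP:client.ip} %{GREEDYDATA:event.original}"],
                }
            },
            # rename the custom field onto the canonical ECS name
            {"rename": {"field": "client.ip", "target_field": "source.ip"}},
            {"set": {"field": "event.kind", "value": "event"}},
        ],
    }

print(json.dumps(ecs_pipeline(), indent=2))
```

The body would be PUT to `/_ingest/pipeline/<name>` and referenced from the Beats output or index settings, so every shipped document lands already normalized.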

Environment: Elasticsearch, Logstash, Kibana, Curator, Xpack, Watcher, Zookeeper, Accumulo, Kafka.

ELK Engineer/Admin

Confidential - Waltham, MA

Responsibilities:

  • Used ELK (Elasticsearch, Logstash, and Kibana) for a customer name-search pattern.
  • Developed ongoing test automation using an Ansible- and Python-based framework.
  • Used Ansible to set up and tear down the ELK stack (Elasticsearch, Logstash, Kibana).
  • Created topics with multiple configurations; configured, updated, or reset consumer-group offsets as required.
  • Created several users and assigned ACLs using kafka-acls on multiple resources with their respective operations, e.g., creating a user and allowing it to produce only to a specific topic or topic pattern.
  • Automated the process of creating topics in multiple environments using Jenkins.
  • Monitored Kafka with AppDynamics and Metricbeat (shipping data to the monitoring Elasticsearch cluster).
  • Used Ansible to update configs and perform a rolling restart or a full stop/start of the brokers.
  • Deployed Filebeat and Metricbeat to manage and monitor the ELK components (Elasticsearch, Logstash, Kibana) and system-level metrics.
  • Automated the process of creating Logstash pipelines and onboarding new applications into ELK for the observability (log analytics) use case.
  • Worked with Cross Cluster Search and Replication.
  • Built visualizations and dashboards using Kibana.
  • Migrated all enterprise clusters to open source.
  • Infrastructure design for the ELK Clusters.
  • Maintained multiple Kibana instances for the same Elasticsearch cluster.
  • Integrated Elasticsearch with SAML for the user base.
  • Automated the process of creating users and assigning roles using a shell script for the native realm.
  • Used several Elasticsearch features (from 7.x) such as index lifecycle management and rollover jobs.
  • Automated the process of deploying Kafka clusters in multiple environments.
  • Monitored ZooKeeper, Kafka, Schema Registry, Kafka REST Proxy, Kafka Connect, Confluent Control Center, and ksqlDB.
  • Automated a few admin tasks, such as starting/stopping Confluent components, using Ansible, along with configuring and deploying them.
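The native-realm user automation above boils down to a PUT against the Elasticsearch security API; the user name, password, and role name below are illustrative assumptions:

```python
import json

def create_user_request(username, password, roles):
    """Build the URL path and JSON body for the native-realm create-user API."""
    path = f"/_security/user/{username}"
    body = {
        "password": password,   # in practice sourced from a vault, not hard-coded
        "roles": roles,         # roles must already exist in the cluster
        "full_name": username,
    }
    return path, json.dumps(body)

# Hypothetical read-only user for an application team.
path, body = create_user_request("app_team_ro", "changeme-secret", ["app_logs_read"])
print(path)
```

Wrapping this in a loop over a user list is what turns the one-off curl commands into repeatable onboarding automation.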

Environment: Load Balancer, Elasticsearch, Logstash, Kibana, Kafka, X-pack, Ansible, MySQL.

Linux Administrator

Confidential, Phoenix, AZ

Responsibilities:

  • Monitored systems daily, evaluated the availability of all server resources, and performed routine activities for Linux servers.
  • Installed and maintained all server hardware and software systems, and administered server performance and availability.
  • Responsible for creating and managing user accounts, groups, and security policies.
  • Troubleshooting network issues and system maintenance, resolving software and hardware issues.
  • Provided support of physical servers and virtual servers in a production environment.
  • Used Multi-Factor Authentication to secure the AWS account.
  • Created user accounts; added/removed users, reset passwords, updated user profiles, and set permissions on files and directories.
  • Good experience setting up Linux environments: passwordless SSH, creating file systems, disabling firewalls, swap, SELinux, and installing Java.
  • Created new file systems and managed and checked file-system data consistency.
  • Able to diagnose network problems with an understanding of TCP/IP networking and its security considerations.
  • Used Stackdriver and AWS cloud monitoring extensively to monitor and debug cloud-based AWS EC2 services.
  • Monitored systems and managed logs on RHEL, CentOS, and Ubuntu servers, including processes, crash dumps, and swap management, with password recovery.
  • Added, removed, and updated user account information and reset passwords.
  • Used Java JDBC to load data into MySQL.
  • Working experience managing VMware virtual and container infrastructure (Docker, Kubernetes, etc.).
  • Maintained the MySQL server and granted database access to the required users.
  • Installing and updating packages using YUM.
  • Patches installation and updating on server.
  • Installation and configuration of Linux for new build environment.
  • Performed volume management using LVM, creating physical volumes, volume groups, and logical volumes.
  • Hands-on experience with Linux admin activities on RHEL, CentOS, and Ubuntu.
  • Excellent in communicating with clients, customers, managers, and other teams in the enterprise at all levels.
  • Tested and configured AWS Workspaces (Windows virtual desktop solution) for custom application requirement.
  • Effective problem-solving skills and outstanding interpersonal skills.
  • Ability to work independently as well as within a team environment and driven to meet deadlines.
  • Motivated to produce robust and high-performance output work.
  • Ability to learn and use new technologies quickly.
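The day-to-day resource checks above can be scripted in Python; the 90% threshold and the single "/" mount point are illustrative assumptions:

```python
import shutil

def usage_percent(path):
    """Return the percentage of the filesystem at `path` that is in use."""
    total, used, _free = shutil.disk_usage(path)
    return 100.0 * used / total

def over_threshold(paths, limit=90.0):
    """Return the mount points whose usage exceeds `limit` percent."""
    return [p for p in paths if usage_percent(p) > limit]

# Hypothetical check over the root filesystem; a cron job could email the result.
alerts = over_threshold(["/"], limit=90.0)
print(alerts)
```

In practice the path list would come from the mounted filesystems (e.g., parsed from /proc/mounts) and the script would feed an alerting channel rather than stdout.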

Environment: Oracle Linux, Red Hat Linux 6/7, CentOS, Ubuntu, VMware, LVM, TCP/IP, MySQL
