- AWS (S3, EC2, EKS, CF, CW, IAM, EBS, VPC, R53, ELB, etc.)
- Bash (Advanced)
- Perl (Advanced)
- Linux (Ubuntu, Container Linux, CentOS, RHEL)
- Agile Development
- Golang/Python (Basic)
Confidential, Houston, TX
Senior Systems Engineer
- Helped design and implement a Docker-based orchestration process powered by Kubernetes for use in AWS.
- Worked with many technologies in the Kubernetes ecosystem, including OPA, Kustomize, Helm, Prometheus, Dex, Gangway, cert-manager, Ark (now Velero), and Argo CD.
- Designed and published reports identifying wasted resources in AWS, helping decrease application footprint and cost.
- Automated nearly everything related to Kubernetes using GitLab pipelines and Argo CD. The majority of pipelines used Terraform to build and update clusters.
- Operationally managed 15+ Kubernetes clusters, both in AWS (EKS) and on-prem in CloudStack VMs, including production clusters with hundreds of worker nodes and 10,000+ pods.
- Migrated EC2-based Kubernetes clusters to EKS as a cost-saving measure, which also simplified management compared to running native Kubernetes on EC2.
- Used Spot Instances for Kubernetes worker nodes as a further cost-cutting measure, decreasing cluster costs by approximately 60%. Worked with Spotinst (the company) to get their product working in EKS as well.
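The Terraform-driven GitLab pipelines described above might look roughly like the following sketch (stage names, image tag, and file names are illustrative, not taken from the actual pipelines):

```yaml
stages:
  - plan
  - apply

terraform:plan:
  stage: plan
  image: hashicorp/terraform:light   # illustrative image tag
  script:
    - terraform init -input=false
    - terraform plan -out=cluster.tfplan
  artifacts:
    paths:
      - cluster.tfplan               # hand the reviewed plan to the apply stage

terraform:apply:
  stage: apply
  image: hashicorp/terraform:light
  script:
    - terraform apply -input=false cluster.tfplan
  when: manual                       # gate cluster changes behind manual approval
```

Splitting plan and apply into separate stages lets an engineer review the planned cluster changes before the manual apply runs.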
Senior Systems Engineer
- Wrote a RESTful interface in Python using Flask to serve as an API for creating and managing instances in AWS, with Nginx as the frontend to the app and boto3 as the backend library for making AWS requests.
- Designed a cloud computing interface that lets users request a custom virtual server built to their specifications, and wrote the backend that builds and provisions servers on request, typically within 2 to 3 minutes. Also designed an admin interface for managing virtual servers and a reporting interface showing overall cloud status.
- Developed several automated Perl scanning programs to collect data and automate routine tasks: scanning assets to collect server information via SSH, console, and DRAC; automatically determining memory/disk usage on virtual servers and adjusting it as appropriate; changing passwords across servers, console devices, DRACs, IP cameras, and LDAP accounts; and scanning console ports and DRAC interfaces to locate missing or lost assets even when the servers were powered off.
- Created real-time Excel spreadsheet reports with Perl, collecting data from servers and a database for various purposes: identifying idle servers, tracking external cloud costs, general server reports based on the asset database, status of current spares, and server breakdowns by department ownership.
- Created Perl utilities designed to minimize the time required for routine tasks: netgroup search, DNS search and checking, auto-consoling, root password changes across all hosts, and a threaded app to execute commands on remote hosts, to name a few.
- Designed various web-based Perl programs used by management: time tracking based on time entered in RT, an on-call rotation list with many features, tracking of edit/creation activity on wiki docs, and the ability to track urgent requests.
- Supported various teams throughout the company with technical needs and code deployment: setting up development/QA/production environments, developing code deployment strategies, and troubleshooting issues as they arose.
- Maintained 5,000+ Linux servers on a day-to-day basis. Distributions included CentOS 4.x/5.x, RHEL 4.x/5.x, Red Hat 7.x/9.0, and Ubuntu 12.04.
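The threaded remote-execution utility mentioned above was written in Perl; a minimal Python sketch of the same idea looks like the following. The host names and command are illustrative, and a local `echo` stands in for the real per-host SSH invocation:

```python
import subprocess
from concurrent.futures import ThreadPoolExecutor


def run_on_host(host, command):
    """Run a command for one host and return (host, output).

    Illustrative stand-in: a real tool would invoke
    ["ssh", host] + command instead of echoing locally.
    """
    result = subprocess.run(
        ["echo", f"{host}: {' '.join(command)}"],
        capture_output=True,
        text=True,
    )
    return host, result.stdout.strip()


def run_everywhere(hosts, command, max_workers=10):
    """Execute the same command on many hosts concurrently."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        futures = [pool.submit(run_on_host, h, command) for h in hosts]
        return dict(f.result() for f in futures)


if __name__ == "__main__":
    print(run_everywhere(["web1", "web2"], ["uptime"]))
```

Fanning the command out across a thread pool keeps the total runtime close to that of the slowest host rather than the sum of all hosts.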
Senior UNIX Administrator/Perl Programmer
- Maintained over 300 Linux servers running services including NFS, DNS, Apache, Sendmail, MySQL, Oracle, and monitoring systems. Day-to-day responsibilities ranged from troubleshooting problems as they arose and developing ways to avoid them, to setting up new machines and enhancing existing configurations.
- Set up front-end and back-end video streaming servers for various video types, including RealPlayer (RealMedia), QuickTime (Darwin), Windows Media over HTTP, and Flash Video. Also converted video between formats using ffmpeg and MEncoder.
- One of my larger tasks over the last two years was fixing a system known as Bigmailbox, which handles a volume of 10 million incoming emails per day. When I took over the project, only about one in ten emails was delivered, making the system very unreliable. Today, without any new hardware, it reliably processes all incoming email, and delivery is rarely unsuccessful.
- Handling millions of incoming hits per day on our Apache web servers has also proved critical. My main task here was twofold: handle 2048 concurrent connections on a single Apache web server, and control how much bandwidth was used per connection. I set up a bank of five Intel-based servers behind an F5 load balancer that handles 7,500 concurrent connections while throttling bandwidth to ensure we never exceed 50 Mb/s. Many of our 150+ other web servers are based on configurations similar to the one I developed for this situation.
- Perl was something I used daily to aid in many of these tasks. One of our primary monitoring systems is a program called Big Brother; to make it more suitable for our needs, we determined it required logging to a MySQL database and a more advanced paging system. I integrated MySQL with Big Brother by modifying several of the shell scripts it uses, and wrote a custom paging system in Perl/DBI that better suited our needs.
- With over 300 servers, a central update system is critical. Nearly every one of our servers runs Red Hat, across many different versions. Red Hat's up2date is the preferred solution for this but can be very costly to implement, so I implemented a replacement server that duplicates up2date functionality: one central server providing updates for multiple Red Hat versions (7.x, 8.0, and 9). Although Red Hat has since stopped providing updates for those versions, we continue to use the system as a distribution point for custom updates we develop ourselves.
- Several of our projects require large-scale storage without a large-scale budget. I was solely responsible for designing, purchasing, and implementing a 15 TB redundant RAID solution on a budget of under $20,000. The final solution used RAID 5 with hot-spare drives in a hot-swap case, exceeded the space requirements, and came in under budget.
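The 2048-connection Apache tuning described above amounts to sizing the worker MPM so processes times threads reach the target; a sketch of that kind of configuration (the specific numbers are illustrative, not the production values):

```apache
# Apache 2.x worker MPM sizing for ~2048 concurrent connections.
# Concurrency = ServerLimit x ThreadsPerChild; values here are illustrative.
<IfModule mpm_worker_module>
    ServerLimit            64
    StartServers            4
    ThreadsPerChild        32
    MaxClients           2048    # 64 processes x 32 threads each
    MaxRequestsPerChild     0    # never recycle workers for request count
</IfModule>
# Per-connection bandwidth throttling was handled separately, e.g. via a
# third-party Apache bandwidth module or rate limiting at the F5.
```

ServerLimit must be raised alongside MaxClients, since the default process cap would otherwise silently clamp the connection count.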