
Senior Site Reliability and Elastic Stack Administrator/Engineer Resume


Atlanta, GA

SUMMARY

  • Reliable, self-managed engineer eager to learn and to mentor others. Offers experience in troubleshooting, database management, data parsing, system operations, site reliability, and Python scripting.

TECHNICAL SKILLS

  • Linux/Unix, Solaris
  • BSD, Red Hat, Oracle
  • Networking, TCP/IP, SMTP
  • Python, Bash Scripting
  • Elastic Stack Administrator
  • Elastic Stack Engineer
  • Email Authentication
  • Email Threat Analysis
  • Email Forensics
  • Data Mining
  • Ansible Scripting
  • Azure Cloud Environment
  • SQL, MySQL, PostgreSQL
  • DNS, FTP
  • Grafana Dashboard Design
  • SPF, DKIM, DMARC

PROFESSIONAL EXPERIENCE

Confidential - Atlanta, GA

Senior Site Reliability and Elastic Stack Administrator/Engineer

Responsibilities:

  • Supporting large, enterprise-footprint deployments of Elasticsearch in a fast-paced, mission-critical environment; providing full troubleshooting support for Elasticsearch deployments as needed, at both the system and software levels
  • Configuring, maintaining, and optimizing Elasticsearch environments
  • Planned and documented the upgrade path for Elasticsearch as required
  • Optimizing Elasticsearch capabilities and working with other team members to ensure the highest possible level of data/search parity
  • Creating, reviewing, and maintaining information technology documentation, particularly related to Elasticsearch capabilities and requirements
  • Elastic Stack System Architecture
  • Planned, designed, and implemented the expansion and separation of the cluster.
  • Designed the data ingestion flow from internal and external sources.
  • Created scripts to pull data from vendor software using their APIs (a minimal sketch follows this list), including:
  • Catchpoint
  • Jira Service Desk and Software
  • Opsgenie
  • Google Analytics
  • Localytics
  • Built ETL (Extract, Transform, Load) pipelines using Python and Logstash to ingest the data into Elasticsearch (see the ingestion sketch after this list)
  • Created complex PostgreSQL queries to visualize the data in Grafana (see the query sketch after this list).
  • Used knowledge of Linux/UNIX to troubleshoot server issues and automate tasks using custom scripts.
  • Used knowledge of TCP to troubleshoot connectivity issues. Performed packet captures (tcpdump) and analyzed the information to root-cause network configuration issues.
  • Configured the CloudMark server to send logging information to Splunk.
  • Created Ansible Playbooks to install the OS security updates.
  • Reviewed log files and network data to determine the cause of issues.
  • Joined calls with customers on escalated issues to help resolve them before the SLA was breached.
  • Helped the testing group troubleshoot network issues in the testing environment.
  • Part of a 24/7 on-call rotation group.
  • Created new RHEL 7 servers for use in the Elastic Stack.
  • Updated routing tables.
  • Created rules within the load balancers.
  • Developed new architecture designs for subscriptions.
  • Designed Grafana dashboards using data connections to Elasticsearch and PostgreSQL databases.
  • Worked with product owners to understand their data and create visualization around it.
  • Mentored new hires and users on how to troubleshoot the Elastic Stack using Linux commands, Catchpoint, and Opsgenie.
  • Debugged Python 2.7 and 3.x code for other developers on the team.
  • Implemented computer system requirements by defining and analyzing system problems, then designing and testing standards and solutions.
  • Defined application problems by conferring with clients and evaluating procedures and processes.
  • Developed solutions by preparing and evaluating alternative workflow options.
  • Controlled solutions by establishing specifications and coordinating production with programmers.
  • Validated results by testing programs.
  • Ensured operation by training client personnel and providing support.
  • Provided reference material by writing documentation.
  • Accomplished information systems and organization goals by completing related tasks as needed.
  • Created repositories within Bitbucket to store and manage code, including:
  • Elastic Logstash configurations
  • Elasticsearch index templates and mappings
  • Custom Python, Ansible, and Bash code
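
A minimal sketch of the kind of vendor-API pull script referenced above. The endpoint, token, and response shape are hypothetical placeholders, not any particular vendor's real API.

```python
import json

import requests  # third-party HTTP client

API_URL = "https://api.example-vendor.com/v1/metrics"  # hypothetical endpoint
API_TOKEN = "REPLACE_ME"  # hypothetical token; in practice read from a vault

def pull_vendor_data(path="vendor_data.json"):
    """Pull one page of records from a vendor API and save them to a file."""
    resp = requests.get(
        API_URL,
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()  # fail loudly on HTTP errors
    records = resp.json()
    with open(path, "w") as fh:
        json.dump(records, fh, indent=2)
    return records

if __name__ == "__main__":
    print(f"pulled {len(pull_vendor_data())} records")
```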
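
A minimal sketch of the Python half of the ETL ingestion referenced above, using the official elasticsearch-py client's bulk helper; the cluster address, index name, and document shape are assumptions.

```python
from datetime import datetime, timezone

from elasticsearch import Elasticsearch, helpers  # official Python client

es = Elasticsearch("http://localhost:9200")  # assumed cluster address

def transform(raw_rows):
    """Yield bulk-index actions from raw rows (the T in ETL)."""
    for row in raw_rows:
        yield {
            "_index": "vendor-metrics",        # hypothetical index name
            "_source": {
                "metric": row["name"],
                "value": float(row["value"]),  # normalize string numbers
                "@timestamp": datetime.now(timezone.utc).isoformat(),
            },
        }

raw = [{"name": "latency_ms", "value": "42"}]  # stand-in for extracted data
ok, errors = helpers.bulk(es, transform(raw), raise_on_error=False)
print(f"indexed {ok} docs, {len(errors)} errors")
```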
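
The Grafana panels were driven by SQL; here is a hedged sketch of that kind of time-series query, run from Python with psycopg2 for testing. The table and column names are invented for illustration; Grafana would execute the same SQL directly against PostgreSQL.

```python
import psycopg2  # PostgreSQL driver, used here only to test the SQL

# Connection parameters are placeholders.
conn = psycopg2.connect(host="localhost", dbname="metrics", user="grafana")

# Grafana time-series panels want a time column plus numeric series;
# the table and column names below are hypothetical.
QUERY = """
SELECT date_trunc('hour', created_at) AS "time",
       count(*)                        AS requests,
       avg(latency_ms)                 AS avg_latency
FROM service_events
WHERE created_at > now() - interval '24 hours'
GROUP BY 1
ORDER BY 1;
"""

with conn, conn.cursor() as cur:
    cur.execute(QUERY)
    for bucket, requests, avg_latency in cur.fetchall():
        print(bucket, requests, avg_latency)
conn.close()
```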

Confidential

Application Technology Architecture Associate Manager

Responsibilities:

  • Monitored all incoming and outgoing email for spam activity.
  • Configured CloudMark workflow rules on the CloudMark servers to allow and deny connections based on IP address and email content. Created policies to stop unwanted spam messages based on sender, IP address, RFC 821 and RFC 822 headers, subject, content, and attachments.
  • Created a new process for detecting spam and phishing emails based on the log data.
  • Created Kibana queries to search the logs for specific data.
  • Performed threat modeling using an ETL (Extract, Transform, Load) workflow: used the Elasticsearch API with Python to query the cluster behind Kibana and save the data to a file; the data was then parsed with Python (json and pandas) to transform it into a readable format for the reporting team (a hedged sketch follows this list).
  • Performed threat modeling using ETL by parsing the email server logs to extract the domain name, IP address, and recipient email addresses of all messages over a seven-day window. This information was then reviewed to determine what types of spam or phishing messages were passing through the appliance. Messages determined to be spam were placed into an Excel spreadsheet that presented a graphical report.
  • Used knowledge of Linux/UNIX to troubleshoot mail server issues and automate tasks using custom scripts.
  • Used knowledge of TCP to troubleshoot connectivity issues. Performed packet captures (tcpdump) and analyzed the information to root-cause network configuration issues.
  • Configured the CloudMark server to send logging information to Splunk.
  • Created Ansible Playbooks to install the Splunk Forwarder on the CloudMark servers.
  • Created Splunk indexes to be used by the email support team.
  • Created Splunk Dashboards to show trends based upon the needs of the user.
  • Created a Python script to send email directly to an internal email server for testing; it gave the user the option of sending one email or tens of thousands of emails to any particular mail server within our test environment (a sending sketch follows this list).
  • Created a Python script to test IMAP and POP connectivity to Dovecot servers (a connectivity sketch follows this list).
  • Participated in meetings and provided insight on the current settings of the anti-spam device.
  • Provided insight on what changes needed to be made to those devices to better detect spam.
  • Joined calls with customers on escalated issues to help resolve them before the SLA was breached.
  • Helped the testing group troubleshoot network issues in the testing environment.
  • Part of a 24/7 on-call rotation group.
  • Trained the Tier 3 group on how to troubleshoot the email system, use Linux commands, and search Splunk indexes to retrieve log information.
  • Debugged Python 2.7 and 3.x code for other developers on the team.
  • Showed the Tier 3 group how to create bash scripts to search the logs to find the information they were looking for.
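
A minimal sketch of the extract-and-parse step described above: query Elasticsearch (the data store behind Kibana) with the official Python client and flatten the hits with pandas. The index pattern, field names, and match query are assumptions, written in elasticsearch-py 8.x keyword style.

```python
import pandas as pd  # third-party
from elasticsearch import Elasticsearch  # official Python client

es = Elasticsearch("http://localhost:9200")  # assumed cluster address

# Hypothetical index pattern and field name for the mail logs.
resp = es.search(
    index="mail-logs-*",
    query={"match": {"verdict": "spam"}},
    size=1000,
)

rows = [hit["_source"] for hit in resp["hits"]["hits"]]
df = pd.DataFrame(rows)                    # flatten JSON hits into a table
df.to_csv("spam_report.csv", index=False)  # readable format for reporting
print(f"wrote {len(df)} rows")
```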
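
A minimal, standard-library sketch of the test mail sender described above; the host names and addresses are placeholders for the internal test environment.

```python
import smtplib
from email.message import EmailMessage

SMTP_HOST = "mail.test.internal"  # placeholder internal test server
COUNT = 10                        # 1 for a smoke test, thousands for load

def send_test_mail(count=COUNT):
    """Send count test messages straight to an internal SMTP server."""
    with smtplib.SMTP(SMTP_HOST, 25, timeout=30) as smtp:
        for i in range(count):
            msg = EmailMessage()
            msg["From"] = "loadtest@test.internal"
            msg["To"] = "inbox@test.internal"
            msg["Subject"] = f"test message {i}"
            msg.set_content("generated by the test harness")
            smtp.send_message(msg)

if __name__ == "__main__":
    send_test_mail()
```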
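
A minimal sketch of the IMAP/POP connectivity check described above, using only the standard library. The Dovecot host name is a placeholder, and this only verifies the greeting banner, not authentication.

```python
import imaplib
import poplib

HOST = "dovecot.test.internal"  # placeholder Dovecot server

def check_imap(host=HOST):
    """Return the IMAP greeting if the server answers on port 143."""
    conn = imaplib.IMAP4(host, 143)
    banner = conn.welcome
    conn.logout()
    return banner

def check_pop(host=HOST):
    """Return the POP3 greeting if the server answers on port 110."""
    conn = poplib.POP3(host, 110, timeout=10)
    banner = conn.getwelcome()
    conn.quit()
    return banner

if __name__ == "__main__":
    print("IMAP:", check_imap())
    print("POP3:", check_pop())
```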

Confidential - Alpharetta, GA

Senior Email Anti-Spam Support Engineer

Responsibilities:

  • Provided 24/7 support on a global level in a 100% BSD, Linux, and Red Hat environment for the IronMail and Confidential Email Gateway servers with anti-spam, anti-virus, and email filtering.
  • Provided 24/7 support on a global level for the Email and Web Security product. Walked customers through installing the software on VMware ESX, ESXi, VMware Server, and Intel hardware. Explained to customers the basic concepts, policies, protocols (SMTP, POP3, FTP, HTTP, and ICAP), maintenance, and monitoring. Helped configure policies to allow and prevent access to certain websites. Edited the XML configuration file to work around issues until patches were released.
  • Provided 24/7 support on a global level for the ePolicy Orchestrator server. Helped customers configure the ePolicy Orchestrator to connect to the Confidential Email Gateway and Email and Web Security products. Troubleshot connectivity issues between the devices. Ensured that the correct policies were being pulled from the parent device to the correct child devices.
  • Used knowledge of Linux/UNIX to troubleshoot mail server issues, automate tasks, and back up/restore databases.
  • Identified new virus threats, analyzed trends and updated IP address reputation(s) using Trusted Source (RBL). Created reports to identify our effectiveness against new threats.
  • Used knowledge of TCP to troubleshoot connectivity issues. Performed packet captures (tcpdump) and analyzed the information to root-cause network configuration issues.
  • Configured the IronMail and Confidential Email Gateway servers to communicate with various mail/directory servers, e.g., Exchange, GroupWise, LDAP, and Active Directory.
  • Configured the Confidential Gateway to send logging information to Splunk, Event Format server, Content Security Reporter, and the Confidential Enterprise Security Manager.
  • Walked customers through configuring and applying DKIM and SPF records for the Confidential Email Gateway.
  • To test that DKIM was working properly, I would first query the customer's DNS TXT records to view their DKIM record (a lookup sketch follows this list). Then I would have the customer send a test email to a Yahoo, Gmail, or Hotmail account. Reviewing the original message in text format allowed me to pull the headers and collect the DKIM signature, which I would then validate with one of the many online DKIM validation tools. If the validation failed, a new DKIM record had to be created.
  • Created policies to stop unwanted spam messages based on sender, IP address, RFC 821 and RFC 822 headers, subject, content, and attachments.
  • Changed the public community string within the SNMP file from Confidential to a private community string.
  • Created a Python script to automatically query the SQL and Postgres databases for the domain name, IP address, sender address, recipient address, message subject, and message score. This information was then converted to a CSV file with headers to be placed in an Excel spreadsheet (an export sketch follows this list).
  • Performed threat modeling using ETL (Extract, Transform, Load) by parsing the email server logs to extract the domain name, IP address, and recipient email addresses of all messages over a seven-day window. This information was then reviewed to determine what types of spam or phishing messages were passing through the appliance. Messages determined to be spam were placed into an Excel spreadsheet that presented a graphical report showing the spam effectiveness of the appliance. If the spam effectiveness was below 95%, we would review the appliance and propose policy reconfiguration options based on the types of emails passing through the device.
  • Created a Python script that allowed me to send email directly to an internal email server for testing. The script gave the user the option of sending one email or tens of thousands of emails to any particular mail server within our test environment.
  • Redesigned the team's internal website using the Joomla content manager, PHP, and Python. The PHP script connected to a MySQL server and pulled a list of email servers to display each server's IP address and host name and to link to its administration portal page. The Python script checked whether each device was up and running, then updated the mail server's status in the MySQL database to up or down, letting the team know which servers were running (a status-check sketch follows this list).
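
A minimal sketch of the first step of the DKIM test described above, fetching the selector._domainkey TXT record with the third-party dnspython package (resolve() requires dnspython 2.0+); the selector and domain are placeholders for a customer's values.

```python
import dns.resolver  # third-party: dnspython (resolve() needs 2.0+)

def get_dkim_record(selector, domain):
    """Fetch the DKIM TXT record published at selector._domainkey.domain."""
    name = f"{selector}._domainkey.{domain}"
    answers = dns.resolver.resolve(name, "TXT")
    rdata = next(iter(answers))
    # A long key may be split into several character-strings; rejoin them.
    return b"".join(rdata.strings).decode()

if __name__ == "__main__":
    # "default" and example.com are placeholders for a customer's values.
    print(get_dkim_record("default", "example.com"))
    # Expect something like: v=DKIM1; k=rsa; p=MIIB...
```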
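
A hedged sketch of the Postgres side of the reporting script described above, using psycopg2 and the standard csv module; the connection settings, table, and column names are invented for illustration.

```python
import csv

import psycopg2  # PostgreSQL driver; the Postgres half of the script

# Connection settings, table, and column names are placeholders.
conn = psycopg2.connect(host="localhost", dbname="mailgw", user="report")
COLUMNS = ["domain", "ip_address", "sender", "recipient", "subject", "score"]

with conn, conn.cursor() as cur:
    cur.execute(f"SELECT {', '.join(COLUMNS)} FROM message_log")
    with open("message_report.csv", "w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow(COLUMNS)          # header row for Excel
        writer.writerows(cur.fetchall())  # data rows
conn.close()
```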
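
A minimal sketch of the up/down status checker described above: a TCP probe with the standard socket module plus a status update through the third-party PyMySQL driver (the original could have used any MySQL client); host, schema, and credentials are placeholders.

```python
import socket

import pymysql  # third-party MySQL driver; any MySQL client would do

def is_up(host, port=25, timeout=5):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Connection settings and schema are placeholders.
db = pymysql.connect(host="localhost", user="web", database="inventory")
with db.cursor() as cur:
    cur.execute("SELECT hostname FROM mail_servers")
    for (hostname,) in cur.fetchall():
        status = "up" if is_up(hostname) else "down"
        cur.execute(
            "UPDATE mail_servers SET status = %s WHERE hostname = %s",
            (status, hostname),
        )
db.commit()
db.close()
```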
