Site Reliability Engineer Resume
Santa Clara, CA
SUMMARY:
- DevOps Engineer with a deep understanding of blending development and operations to deliver code to customers quickly. Brings seven years of extensive experience as a software professional spanning development and cloud management across several business domains.
- Experience using the AWS Cloud platform, including EC2, VPC, S3, RDS, CloudFormation, Route53, CloudWatch, SNS, SQS, IAM, CodeDeploy, Kinesis, Lambda, and CloudFront.
- Expertise in multiple Google Cloud services: compute (Compute Engine, App Engine); networking (Cloud Virtual Networks, Cloud Load Balancing, Cloud Interconnect, specifically Partner Interconnect with Equinix, and Shared VPC/XPN); storage and databases (Cloud Storage, Cloud Bigtable, Cloud SQL, Persistent Disks); big data (BigQuery, Cloud Dataflow); identity and security (IAM, CRM, Cloud Security Scanner); and monitoring (Stackdriver Logging, Stackdriver Monitoring, Pub/Sub).
- Experience using AWS Redshift as a data warehouse for storing usage reports of applications.
- Experienced in architecting highly available, fault-tolerant, and scalable applications on the AWS platform using EC2, Auto Scaling Groups, ELB, and AMIs.
- Ability to create scripts using Azure PowerShell for automation and build process.
- Experience in dealing with Azure IaaS: Virtual Networks, Virtual Machines, Resource Groups, ExpressRoute, VPN, Load Balancing, and Auto Scaling.
- Experience in working with Azure Blob Storage for migrating data to and from cloud.
- Familiarity working with Kubernetes to automate deployment, scaling, and management of containerized web applications on Google Cloud Platform.
- Experience developing Docker images to support Development and Testing Teams and their pipelines; distributed Jenkins, Selenium, JMeter and ELK stack images.
- Good knowledge of creating, tagging, and pushing custom Docker images, and of container orchestration with Docker Swarm.
- Experience in creating Terraform templates for launching custom-sized VPCs and subnets.
- Knowledge of working with Jenkins and Maven for continuous integration of applications built on the MuleSoft Anypoint Platform.
- Hands-on experience working with HashiCorp tools such as Vault, Vagrant, and Packer.
- Experience in deploying custom Ansible playbooks to configure machines in different environments with the appropriate packages/services and versions.
- Good knowledge and experience in using CloudWatch, Splunk, Prometheus, and Grafana for logging and monitoring.
- Skills in querying relational database management systems like Oracle, SQL Server, and MySQL.
- Proficiency in NoSQL databases like Amazon DynamoDB.
- Experience in building middle tier Rest APIs and Microservices using Spring Boot.
- Good experience in working with different Bug Tracking Tools like JIRA, ServiceNow.
- Automating routine tasks through apps/processes built using bash scripting.
- Developing Python scripts to automate various system tasks, OS patching for dev environments, and deployment of applications to test/prod environments using Python OS modules.
- Proficient in working with different protocols such as SSH, SFTP, DNS, HTTP and HTTPS.
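The OS-patch automation mentioned above can be sketched as below; this is a minimal illustration, not the actual tooling — the distro map, function names, and dry-run guard are all hypothetical:

```python
import subprocess

# Hypothetical mapping from distro family to its package-update command.
PATCH_COMMANDS = {
    "debian": ["apt-get", "-y", "upgrade"],
    "rhel": ["yum", "-y", "update"],
}

def build_patch_command(distro_family):
    """Return the update command for a distro family, or raise for unknown ones."""
    try:
        return PATCH_COMMANDS[distro_family]
    except KeyError:
        raise ValueError(f"unsupported distro family: {distro_family}")

def apply_patches(distro_family, dry_run=True):
    """Run the patch command; with dry_run=True just report what would run."""
    cmd = build_patch_command(distro_family)
    if dry_run:
        return "would run: " + " ".join(cmd)
    # Real invocation; raises CalledProcessError on a non-zero exit.
    return subprocess.run(cmd, check=True)
```

Keeping the command construction separate from the execution makes the dangerous part trivially testable in dev environments.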
PROFESSIONAL EXPERIENCE:
Confidential, Santa Clara, CA
Site Reliability Engineer
Responsibilities:
- Built Python tooling for pulling secrets from HashiCorp Vault and placing them in Kubernetes Secrets.
- Utilized HashiCorp Vault to generate dynamic IAM users.
- Enabled authentication and authorization to resources on GKE through Istio.
- Set up Elasticsearch as the logging backend for applications running on Kubernetes.
- Utilized Spinnaker for deploying applications on GKE.
- Extracted content of CSV files and added to BigQuery tables.
- Managed user and service accounts using Terraform.
- Involved in Terraform True Up efforts for the Prod environment.
- Working with Compute Engine VMs to modify startup scripts and disk sizes on the fly.
- Debugging pod failures and connectivity issues in GKE.
- Troubleshooting scrape issues of Prometheus.
- Moving tenants across clusters in the on-prem environment.
- Working with MongoDB to modify tenant configurations.
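The Vault-to-Kubernetes step above boils down to rendering key/value pairs into a Secret manifest with base64-encoded values. A minimal sketch, assuming the values have already been fetched from Vault (e.g. via the hvac client, which is omitted here):

```python
import base64

def to_k8s_secret(name, namespace, data):
    """Render a Kubernetes Secret manifest from plain key/value pairs.

    `data` stands in for key/value pairs read from Vault; Kubernetes
    expects each value base64-encoded under the `data` field.
    """
    return {
        "apiVersion": "v1",
        "kind": "Secret",
        "metadata": {"name": name, "namespace": namespace},
        "type": "Opaque",
        "data": {
            key: base64.b64encode(value.encode()).decode()
            for key, value in data.items()
        },
    }
```

The resulting dict can be serialized to YAML/JSON and applied with kubectl or a Kubernetes API client.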
Environment: GCP, Python, Terraform, Docker, Kubernetes, Bash, Linux, Spinnaker, Elasticsearch
Confidential - Fort Lauderdale, FL
Cloud Security Engineer
Responsibilities:
- Developed an automated solution using python for querying AWS and alerting the respective infrastructure owner about usage and compliance of the infrastructure.
- Configured and administered Elastic Load Balancers, Route53, networking, and Auto Scaling Groups for high availability.
- Managed AWS Inventory in DynamoDB with Python Boto3.
- Provisioned infrastructure across AWS workloads by utilizing Terraform Enterprise.
- Provided measures for AWS cloud implementation that resulted in a significant cost reduction for the client.
- Configured Azure services like Network Watcher and Security Center for logging.
- Reviewing the architecture of different product services and developing scripts to interact with cloud services from AWS and Azure.
- Utilized Fortify to scan front end applications for vulnerabilities.
- Hardening the virtual machines utilizing configuration management tools like Ansible.
- Monitoring network data using events in Splunk.
- Conducting security audits for validating the security posture.
- Monitoring networks for vulnerabilities using tools like Qualys.
- Developing Lambda functions for shipping logs generated by services like GuardDuty and CloudTrail.
- Developing, documenting policies and procedures for IT security.
- Handling events and incidents based on established IR framework.
- Implementing Firewalls and Cryptographic solutions for safeguarding sensitive information.
- Performing penetration testing using different tools (including nmap, Metasploit and Hydra).
- Developed Splunk queries and dashboards targeted at understanding application performance and capacity analysis across multiple teams and sectors.
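The log-shipping Lambda functions mentioned above typically receive GuardDuty findings via an EventBridge/CloudWatch Events rule and flatten them for a SIEM. A hedged sketch — the selected fields follow the GuardDuty finding format, but the forwarding step is deliberately stubbed out:

```python
import json

def lambda_handler(event, context=None):
    """Flatten a GuardDuty finding event into one JSON log line.

    The keys under event["detail"] mirror the GuardDuty finding format;
    shipping `line` to Splunk/S3 is left out of this sketch.
    """
    detail = event.get("detail", {})
    record = {
        "source": event.get("source"),
        "finding_type": detail.get("type"),
        "severity": detail.get("severity"),
        "account": detail.get("accountId"),
        "region": detail.get("region"),
    }
    line = json.dumps(record, sort_keys=True)
    # In production this would forward `line` to the log pipeline.
    return line
```

Normalizing findings to one flat line per event keeps downstream Splunk queries simple.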
Environment: AWS, Azure, Visual Studio, C#, Bitbucket, Ansible, Terraform, Kubernetes, CloudFormation, Python, Shell, Artifactory (JFrog), Splunk, Alert Logic, Fortinet, Qualys, Linux, Fortify
Confidential - Kansas, MO
Cloud DevOps Engineer
Responsibilities:
- Designed and developed Google Cloud Architecture for the client’s specific requirement with security as the point of focus.
- Utilized Terraform scripts for GKE infrastructure deployment.
- Involved in the integration of GKE with Prometheus for monitoring purposes.
- Building Dockerfiles and deploying containers using Jenkins.
- Deployed and configured Spinnaker for application deployment on GKE.
- Created and configured jobs along with agents in Bamboo.
- Used GCP's Deployment Manager to launch instances and copy configurations into additional regions to mitigate region outages.
- Configured Istio as a gateway for external services.
- Exported logs to Pub/Sub and Google Cloud Storage through Istio.
- Configured the Kubernetes cluster on GCP to establish communication between pods and a MySQL instance on AWS RDS.
- Involved in writing various custom Ansible playbooks for deployment orchestration and developed Ansible Playbooks to simplify and automate tasks. Protected encrypted data needed for tasks with Ansible Vault.
- Installed and configured Nexus to manage artifacts in different repositories and handled dependency management using the Nexus private repository.
- Developed and executed shell scripts and worked on Python scripting in different projects to automate routine, repetitive tasks.
- Ingested logs from applications like Windows Defender through data inputs.
- Utilized App Scanner for scanning security vulnerabilities on web applications.
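The log export to Pub/Sub described above ultimately packages each record into a Pub/Sub message envelope, whose payload travels base64-encoded. A minimal sketch of that envelope logic — the helper names are hypothetical, and the actual publish call (google-cloud-pubsub) is omitted:

```python
import base64
import json

def make_pubsub_message(log_record, attributes=None):
    """Wrap a log record in a Pub/Sub-style message envelope.

    Pub/Sub carries payloads as base64-encoded bytes in the `data`
    field; `attributes` are optional string metadata.
    """
    payload = json.dumps(log_record, sort_keys=True).encode()
    return {
        "data": base64.b64encode(payload).decode(),
        "attributes": attributes or {},
    }

def decode_pubsub_message(message):
    """Inverse helper, as a subscriber would decode the payload."""
    return json.loads(base64.b64decode(message["data"]))
```

Round-tripping through encode/decode is a quick sanity check before wiring in the real publisher client.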
Environment: Cloud Storage, Cloud Bigtable, Cloud SQL, Compute Engine, Stackdriver, Prometheus, GKE, Ansible, Jira, Terraform, AWS, Jenkins, Docker, Spinnaker, Bamboo, Groovy
Confidential
DevOps Engineer
Responsibilities:
- Setup of Virtual Private Cloud (VPC), Network ACLs, Security Groups and route tables across Amazon Web Services.
- Configuration and administration of Load Balancers, Route53, Network and Auto scaling for high availability.
- Configured security and patch management of Linux based operating systems.
- Integrated GitHub webhooks into Jenkins to automate the code check-out process.
- Migrated various applications and services from on-premise to the AWS cloud using resources like EC2, VPC, S3, and Security Groups.
- Generated Ant and shell scripts for build activities in QA, Staging, and Production environments.
- Worked on the transition project involving migration from Ant to Maven in order to standardize the build across all applications.
- Migrated the source code management tool from SVN to GitHub and ensured identical configuration.
- Managed users and groups in GitHub and was involved in troubleshooting client spec issues and user issues.
- Configured local Maven repositories and multi-component Ant projects with Nexus repositories and scheduled projects in Jenkins for continuous integration.
- Maintained configuration files for each application for build purpose and installed on different environments.
- Directed the Release Management Calls to synchronize with the Developers, Testers and DBA teams for successful Release.
- Presented reports to the Project manager about the progress and issues tracking key project Milestones, plans and resources.
Environment: AWS, Git, PuTTY, Linux, Windows, SVN, Java/J2EE, Ruby, Eclipse, Ant, Jenkins, Maven, Jira, JUnit, Unix/Linux, Tomcat, Groovy
Confidential
Build/DevOps Engineer
Responsibilities:
- As a member of the Release Engineering group, redefined processes and implemented tools for software builds, patch creation, source control, and release tracking and reporting on the Linux platform.
- Installed and configured Jenkins for automating deployments and providing an automation solution.
- Integrated Git into Jenkins to automate the code check-out process.
- Used Jenkins for automating builds and deployments.
- Maintained and tracked inventory using Jenkins and set alerts for when servers are full and need attention.
- Experienced working with Puppet Enterprise and Puppet Open Source; installed, configured, upgraded, and managed Puppet masters, agents, and databases.
- Assigned Puppet roles to instances as they are launched through scripts.
- Integrated Puppet with Apache and developed load-testing and monitoring suites in Python.
- Implemented continuous integration and delivery (CI/CD) using Jenkins and Puppet.
- Developed build and deployment scripts using Ant and Maven as build tools in Jenkins to move artifacts from one environment to another.
- Installed applications on production and test servers for application development and configuration.
- Used Jira as a ticket-tracking and workflow tool.
- Released code to testing regions or staging areas according to the schedule published.
Environment: Puppet, Git, PuTTY, Windows, Java/J2EE, Ruby, Eclipse, Ant, Jenkins, Maven, Jira, JUnit, Linux, Apache Tomcat application server
Confidential
Software Engineer
Responsibilities:
- Developed data access layer for Oracle DB using Hibernate.
- Responsible for coding of DAO classes using Spring with Hibernate.
- Developed middle tier Rest services using Spring Boot framework.
- Making changes to JSON files and moving them to Amazon S3, creating the VCAP-SERVICES configuration.
- Pushing developed code using Git and deploying the application to development, test, and production environments using Jenkins.
- Participated in Unit Testing and code review.
- Documentation of request submission workflow.
Environment: Git, Java/J2EE, Scala, Spring, Hibernate, AngularJS, HTML, JavaScript, AWS SWF, Windows, Maven, Jira, JUnit, Apache Tomcat application server
TECHNICAL SKILLS:
Cloud Computing : AWS, Azure, GCP
SDLC Models : Waterfall Model, Agile Methodologies (SCRUM)
Web technologies : Apache, Tomcat, Nginx, WebSphere
Programming : Java, Python
Operating System : Windows, Linux, Ubuntu, RHEL, CentOS
Database : MySQL, DynamoDB, Oracle DB, Cassandra
Monitoring Tools : Splunk, CloudWatch, Prometheus
Scripting : Bash, PowerShell
Configuration : Puppet, Vagrant, Ansible, Terraform