DevOps Infrastructure Engineer Resume
SUMMARY
- 8+ years of experience with DevOps tools, cloud, configuration management, and build and release management.
- Experience in setting up CI/CD pipelines using Jenkins, Maven, JFrog, GitHub, Ansible, Terraform, and AWS.
- Experience in setting up VSTS and YAML pipelines on Microsoft Azure.
- Wrote Ansible playbooks with Python and SSH as the wrapper to manage configurations of AWS nodes, tested playbooks on AWS instances using Python, and ran Ansible scripts to provision development servers.
- Provided a consistent environment from development through production using Kubernetes for deployment scaling and load balancing, easing the code development and deployment pipeline by implementing Docker containerization.
- Experience installing, configuring, and clustering Kubernetes and managing local deployments in Kubernetes.
- Experience setting up AKS clusters on Microsoft Azure and installing Helm packages and other software such as Dynatrace, the NGINX ingress controller, Istio, and Kiali.
- Set up the Velero backup solution for backing up AKS clusters.
- Experience in configuring Elastic stack for application monitoring and log parsing.
- Experience in setting up Confluent Kafka as an event streaming platform.
- Expertise in Chef automation for Configuration Changes and automating various applications.
- Extensively worked with automation tools like Jenkins to implement end-to-end automation.
- Expertise in using build tools like Maven and Ant to build deployable artifacts such as WAR and EAR files from source code.
- Experience integrating unit tests and code quality analysis tools like JUnit.
- Administered and implemented the CI tools Hudson and Jenkins for automated builds.
- Experience in using Tomcat and httpd application servers for deployments.
- Experience documenting solution designs (solution design documents, SDDs) for automation.
- Managed environments DEV, QA, STAGE and PROD for various releases and designed instance strategies.
- Ability to work closely with teams, in order to ensure high quality and timely delivery of builds and releases.
- Good experience in automation with Chef using the Knife plugin for Azure.
- Good experience with server and user experience monitoring using AppDynamics.
- Experience with multiple deployment models in Azure: classic and Resource Manager.
- Good experience setting up and automating BigInsights clusters, installing value-added services (VAS), and using Ambari Blueprints.
- Experience in setting up continuous integration and continuous deployment (CI/CD).
- Good experience with Microsoft Azure.
- Good experience establishing VNet-to-VNet connections.
- Involved in SCRUM meetings (stand-up, grooming, planning, demo/review and retrospective) with the teams to ensure successful project forecasting and realistic commitments.
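The Ansible-with-Python work above can be illustrated with a minimal dynamic inventory script. The group name, host addresses, and user below are hypothetical placeholders, not real infrastructure:

```python
#!/usr/bin/env python3
"""Minimal Ansible dynamic inventory sketch (hosts are placeholders)."""
import json
import sys

def get_inventory():
    # Hypothetical inventory; in practice this would be built from the
    # cloud provider's API (e.g. EC2 tags or Terraform state outputs)
    # rather than hard-coded.
    return {
        "dev_servers": {
            "hosts": ["10.0.1.10", "10.0.1.11"],
            "vars": {"ansible_user": "ec2-user"},
        },
        "_meta": {"hostvars": {}},
    }

if __name__ == "__main__":
    # Ansible invokes the script with --list and reads JSON from stdout.
    if "--list" in sys.argv:
        print(json.dumps(get_inventory()))
    else:
        # Per-host variables; empty because _meta already covers them.
        print(json.dumps({}))
```

Returning `_meta` up front spares Ansible a separate `--host` call per machine, which matters once the inventory grows.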
TECHNICAL SKILLS
Operating Systems: Linux, RHEL, CentOS, Windows
Version Control Tools: SVN, GIT, TFS
Languages: Python, Ruby, Shell
Databases: Oracle, MySQL
Application/Web Servers: Tomcat, httpd, Nginx
Build Tools: Ant, Maven
Cloud: Microsoft Azure; AWS EC2, VPC, EBS, AMI, SNS, SQS, SWF, RDS, CloudWatch, CloudFormation, AWS CLI, AWS API, AWS Config, S3, CloudTrail, IAM, AWS auth methods; Azure virtual networks, VNet-to-VNet and point-to-site connections, ACLs, key vaults, endpoints.
Search Engines: Elastic Search
CM Tools: Ansible, Chef, Serf
Continuous Integration: Jenkins, Bamboo, Hudson
Log Parser: Logstash
Monitoring Tools: Dynatrace, Nagios, AppDynamics
Web: HTML, JavaScript, jQuery, Kibana
PROFESSIONAL EXPERIENCE
Confidential
DevOps Infrastructure Engineer
Responsibilities:
- Responsible for installing, configuring, and administering Elasticsearch, Logstash, and Kibana with Beats as containerized applications on Docker.
- Responsible for installing and configuring Confluent Kafka on Linux servers using Ansible modules.
- Used Terraform modules to create infrastructure on AWS and create a dynamic inventory for Ansible to install Elastic stack and Confluent Kafka on RHEL nodes.
- Used Ansible as a configuration management tool to create roles and playbooks.
- Configured HashiCorp Vault to retrieve cubbyhole secrets and certificates, and rotated PKI certificates using Ansible Tower.
- Used Molecule as a test framework to test Ansible roles in local infrastructure within Docker containers.
- Automated installation of Kibana dashboards by loading JSON files onto Kibana servers.
- Designed and provisioned virtual networks on AWS using VPCs, subnets, network ACLs, internet gateways, route tables, and NAT gateways.
- Designed, built, and managed the ELK (Elasticsearch, Logstash, Kibana) cluster for centralized logging and search functionality for the application.
- Responsible for designing and deploying Kafka, ZooKeeper, console producers and consumers, and Control Center using Confluent Kafka.
- Installed Kerberos secured Kafka cluster with encryption at rest on Dev, QA and Prod.
- Set up Kafka ACLs on Kafka servers.
- Installed the Jolokia JMX agent to capture JMX metrics from Kafka servers and displayed the metrics on a Kibana dashboard.
- Installed Kafka Manager to monitor consumer lag and Kafka metrics; also used it for adding topics, partitions, etc.
- Configured RBAC to control system access based on roles assigned to users.
- Worked on setting up Hadoop security, data encryption, and authorization using Kerberos and TLS/SSL.
- Configured Kerberos on Confluent Kafka and ELK servers to authenticate users.
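The consumer lag that Kafka Manager surfaces in the bullets above is simply the gap between each partition's log-end offset and the consumer group's committed offset. A small sketch of that calculation, with made-up offsets for a hypothetical 3-partition topic:

```python
def consumer_lag(log_end_offsets, committed_offsets):
    """Per-partition lag = log-end offset minus committed offset.

    Both arguments map partition id -> offset. A partition with no
    committed offset yet is treated as fully lagging from offset 0.
    """
    return {
        p: end - committed_offsets.get(p, 0)
        for p, end in log_end_offsets.items()
    }

# Hypothetical offsets for a 3-partition topic.
end = {0: 120, 1: 95, 2: 200}
committed = {0: 120, 1: 80}          # partition 2 has no commit yet
print(consumer_lag(end, committed))  # {0: 0, 1: 15, 2: 200}
```

In production the offsets would come from the broker (e.g. via Kafka Manager or the consumer-groups admin tooling) rather than literals.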
Environment: AWS, Terraform, Ansible, Bamboo, ELK, Confluent Kafka, Metricbeat, Heartbeat, Filebeat, Packetbeat, Jolokia agent, Aerospike, Molecule, HashiCorp Vault.
Confidential, Lowell, Arkansas
Software Engineer
Responsibilities:
- Responsible for configuring and deploying infrastructure as a code using Terraform and Ansible.
- Designed and configured Azure Virtual Networks (VNets), subnets, Azure network settings, DHCP address blocks, DNS settings, security policies, and routing.
- Worked on building Azure VSTS and YAML pipelines to create Azure AKS infrastructure.
- Installed and configured the AKS environment with Helm, ingress, NGINX, Prometheus, RBAC, Istio, Kiali, and the Dynatrace OneAgent Operator.
- Developed and implemented Kubernetes manifests and Helm charts for deployment of microservices into the Kubernetes cluster using the Helm Tiller account.
- Set up an ingress controller as an internal load balancer using the NGINX ingress controller.
- Established infrastructure and service monitoring using Prometheus and Grafana.
- Configured RBAC principles to regulate access to AKS network resources based on the roles of individual users and groups.
- Configured service mesh availability, observability, and visualization using Istio and Kiali.
- Created a backup solution and scheduled backups of the AKS cluster using Velero, storing AKS cluster backups in Azure Blob Storage.
- Installed Dynatrace OneAgent to collect all monitoring data on the AKS infrastructure.
- Configured Dynatrace dashboard and created alerts on the threshold value.
- Wrote custom resource definitions and RBAC rules for Kubernetes.
- Designed a Jenkins HA setup using an Azure file share for automated backup of the Jenkins home directory.
- Configured and supported SonarQube builds on Jenkins and YAML pipelines to check code quality and security.
- Automated continuous integration builds, nightly builds, deployments, and unit tests across multiple environments (DEV, QA, Training, Production).
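The Kubernetes manifest work above can be sketched as a small generator. The service name, image registry, and port are hypothetical; `kubectl` accepts JSON as well as YAML, so the structure below can be applied directly:

```python
import json

def deployment_manifest(name, image, replicas=2):
    """Build a minimal Kubernetes Deployment object as a dict.

    Labels and the container port are illustrative; real charts would
    also carry resource requests, probes, and so on.
    """
    labels = {"app": name}
    return {
        "apiVersion": "apps/v1",
        "kind": "Deployment",
        "metadata": {"name": name, "labels": labels},
        "spec": {
            "replicas": replicas,
            # The selector must match the pod template's labels.
            "selector": {"matchLabels": labels},
            "template": {
                "metadata": {"labels": labels},
                "spec": {
                    "containers": [
                        {"name": name, "image": image,
                         "ports": [{"containerPort": 8080}]}
                    ]
                },
            },
        },
    }

# Hypothetical microservice image.
manifest = deployment_manifest("orders-api", "registry.example.com/orders-api:1.0")
print(json.dumps(manifest, indent=2))
```

Helm templates the same shapes with values files; generating the dict in code makes the selector/label coupling easy to test.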
Environment: Azure, Kubernetes, Terraform, Ansible, Jenkins, Dynatrace, Helm, Istio, Kiali, Velero, Prometheus, Grafana, Thanos, Python, Git Bash, Tomcat, Active Directory, F5 BIG-IP.
Confidential, McLean, Virginia
Software Engineer
Responsibilities:
- Responsible for deploying apps to OpenShift v3 and containerizing apps using Docker.
- Used Bash and Python (including Boto3) to supplement automation provided by Ansible and Terraform for tasks such as encrypting EBS volumes backing AMIs and scheduling Lambda functions for routine AWS tasks.
- Set up new VPN tunnels from the corporate network to VPCs in AWS and designed the subnets, routing, and IAM policies.
- Designed and developed a CI/CD pipeline integrating GitLab and Ansible across geographically separated hosting zones in AWS.
- Implemented AWS CodePipeline and created CloudFormation JSON templates and Terraform templates for infrastructure as code.
- Worked on AWS services like EC2, S3, RDS, ELB, EBS, VPC, Route 53, Auto Scaling groups, CloudWatch, CloudFront, and IAM, installing, configuring, and troubleshooting various Amazon images for server migration from physical hardware into the cloud.
- Integrated Kubernetes with network, storage, and security to provide comprehensive infrastructure and orchestrated containers across multiple hosts.
- Created Docker images using a Dockerfile.
- Worked on Docker container snapshots, removing images and managing docker volumes and experienced with Docker container service.
- Installed and configured Oracle Weblogic 11g/12c application server.
- Set up role-based provisioning and role-based access in Tivoli Identity Manager.
- Configured SoapUI projects for testing REST services and Apigee proxies.
- Involved in prod release activities, deployments and documentation.
- Handling application deployments into DEV, Test, Performance and Production Environments.
- Experience configuring and administering JDBC resources (data sources, connection pools) and JMS resources (queues, topics, connection factories) in WebLogic.
- Implemented Autosys for scheduling the ETL (Ab Initio, Datastage, Informatica), Java, Weblogic, PL/SQL Jobs.
- Developed and designed a continuous integration pipeline integrated with GitHub, Jenkins, and SonarQube.
- Provided 24/7 Support and on call schedule for Production support.
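The CloudFormation JSON templates mentioned above can be generated as well as hand-written. A minimal sketch of a one-bucket template; the logical ID and bucket name are hypothetical:

```python
import json

def s3_bucket_template(bucket_name, versioned=True):
    """Render a minimal CloudFormation template with one S3 bucket.

    Properties beyond the bucket name and versioning are omitted for
    brevity; real templates would add encryption, tags, policies, etc.
    """
    props = {"BucketName": bucket_name}
    if versioned:
        props["VersioningConfiguration"] = {"Status": "Enabled"}
    return {
        "AWSTemplateFormatVersion": "2010-09-09",
        "Resources": {
            # "AppBucket" is an arbitrary logical ID.
            "AppBucket": {
                "Type": "AWS::S3::Bucket",
                "Properties": props,
            }
        },
    }

print(json.dumps(s3_bucket_template("example-release-artifacts"), indent=2))
```

The output can be fed to `aws cloudformation deploy` or checked into version control alongside the Terraform code.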
Environment: AWS, Docker, OpenShift, Kubernetes, Ansible, Splunk, SiteScope, Jenkins, Shell, Python, Git Bash, WebLogic, JBoss, Autosys, Active Directory, Rundeck, SoapUI, F5 BIG-IP.
Confidential
DevOps Developer
Responsibilities:
- Automated configuration changes with Chef.
- Developed and implemented software release management strategies for various applications according to the agile process.
- Managed deployment automation using Chef, Ruby, and Perl.
- Automated LDAP/AD authentication for various applications.
- Automated the setup of Hadoop clusters using Ambari and installed services like MapReduce, Hive, Pig, Oozie, and Spark.
- Set up and installed value-added services on the Hadoop cluster.
- Injected blueprints and IOP components of clusters on BigInsights clusters.
- Experience using Chef attributes, templates, recipes, and files to manage configurations across various nodes.
- Worked on setting up Chef infrastructure and the chef-repo, and bootstrapping Chef nodes.
- Experience writing complex cookbooks and recipes and configuring them by applying node convergence on various production nodes.
- Worked on setting up lifecycle policies to back up data from Microsoft Azure.
- Established VNet-to-VNet and point-to-site connections.
- Experience with Rundeck, automating deployments by creating jobs that run Chef.
- Migrated applications from the existing Rackspace environment.
- Proficient in deploying .dll files onto IIS servers.
- Wrote Python scripts to monitor a variety of services and Perl scripts using hashes/arrays to insert, delete, and modify content on multiple servers.
- Secured Linux servers by hardening the OS using iptables and SELinux.
- Experience with application monitoring using AppDynamics with a controller, and GeoServer.
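The Python service-monitoring scripts mentioned above reduce to probing each service endpoint. A self-contained sketch that demonstrates the check against a throwaway local listener rather than a real service:

```python
import socket

def port_is_open(host, port, timeout=1.0):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Demo against a throwaway local listener instead of a real service.
listener = socket.socket()
listener.bind(("127.0.0.1", 0))        # port 0: the OS picks a free port
listener.listen(1)
port = listener.getsockname()[1]

print(port_is_open("127.0.0.1", port))   # True: listener is up
listener.close()
print(port_is_open("127.0.0.1", port))   # False once it is gone
```

A real monitor would loop over a host/port list from config and alert (or page) on failures instead of printing.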
Environment: TFS, RedHat 7.2, CentOS 6.7, Chef, AppDynamics.
