
DevOps Engineer Resume


Austin, TX

SUMMARY:

  • 8 years of experience as a System Analyst/Programmer, Linux Administrator, AWS Cloud Engineer, DevOps Engineer, and Build and Release Engineer, covering production servers, installation and configuration, Hadoop and EMR big data workloads, Continuous Integration (CI), Continuous Delivery (CD), configuration management, and application deployment management.
  • 4+ years of experience applying DevOps practices to server provisioning, middleware management, and build and deployment automation using tools such as AWS, Docker, Ansible, Terraform, Jenkins, Chef, and Git.
  • Strong believer in DevOps methodologies, able to bring an organization an end-to-end workflow of Continuous Integration, Continuous Delivery, and Continuous Deployment.
  • Designed and implemented CI/CD pipelines for many Java-based applications, helping project teams develop, test, deploy, and deliver software packages quickly and reliably (a minimal build-and-publish sketch follows this list).
  • Experience maintaining the Jenkins master, including configuration, security hardening, and plugin management.
  • Worked with project development teams following Agile and Waterfall methodologies; able to design source code branching strategies, release management and release life cycles, and CI/CD pipelines based on team pace and project deliverables.
  • Designed and implemented fully automated server build, management, monitoring, and deployment solutions spanning multiple platforms, tools, and technologies, including Jenkins nodes/agents, Chef, Ansible, VMware, and Amazon EC2.
  • Built and managed DevOps automation using Ansible and Python scripts for the ELK and Java services product stack.
  • Experience working with Apache Hadoop, Kafka, Spark, and Logstash.
  • Experience writing shell scripts to automate the creation of release branches for components (see the release-branch sketch after this list).
  • Performed deployments on AWS using EKS, AWS CodePipeline, and AWS CodeDeploy.
  • Wrote Chef cookbooks and recipes to provision several pre-prod environments, covering deployment automation, AWS EC2 instance mirroring, WebLogic domain creation, and several proprietary middleware installations.
  • Experience with multiple integration techniques based on service-oriented architecture (SOA), including web services (REST APIs, SOAP) and communication protocols such as HTTP, HTTPS, and TCP.
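A minimal sketch of the kind of build-and-publish step those Java CI/CD pipelines run. The service name, artifact bucket, and JAR path are placeholders, and the real pipelines wrapped this core with test, scan, and deploy stages:

```bash
#!/usr/bin/env bash
# Minimal CI build-and-publish step for a Maven-based Java service.
# APP_NAME, ARTIFACT_BUCKET, and the JAR path are hypothetical.
set -euo pipefail

APP_NAME="sample-service"                     # hypothetical service name
ARTIFACT_BUCKET="s3://build-artifacts-demo"   # hypothetical S3 bucket
VERSION="${BUILD_NUMBER:-local}"              # Jenkins exposes BUILD_NUMBER

# Compile, run unit tests, and package the application.
mvn -B clean verify

# Publish the versioned artifact so downstream deploy jobs can pick it up.
aws s3 cp "target/${APP_NAME}.jar" \
  "${ARTIFACT_BUCKET}/${APP_NAME}/${VERSION}/${APP_NAME}.jar"
```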
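The release-branch sketch referenced above; a small shell script of this shape can be run from a Jenkins job. The "main" default branch and "release/x.y" naming are assumed conventions, not confirmed details of the original projects:

```bash
#!/usr/bin/env bash
# Cut and publish a release branch from the latest default branch.
set -euo pipefail

RELEASE_VERSION="${1:?usage: cut-release.sh <version>}"
BRANCH="release/${RELEASE_VERSION}"

git fetch origin
git checkout -b "${BRANCH}" origin/main   # branch from the latest main
git push -u origin "${BRANCH}"            # publish the release branch

echo "Created and pushed ${BRANCH}"
```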

TECHNICAL SKILLS:

CI Tools: Jenkins, Bamboo.

Testing Tools: SonarQube, Selenium, JUnit.

Build Systems: Maven, Ant and Gradle.

Networking: TCP/IP, NFS, Telnet, FTP, DNS, DHCP, NAT, Firewall, HTTP.

Bug Tracking Tools: Jira, ServiceNow and Rally.

Version Control: Git, Bitbucket, SVN.

Application & Web Servers: Apache Tomcat, WebLogic, WebSphere, JBoss.

Configuration Management: Chef, Ansible.

Cloud Services: AWS, Azure, OpenStack.

Containerization Tools: Docker, OpenShift and Kubernetes.

Virtualization Platforms: Vagrant, VirtualBox, VMware.

Scripting Languages: Shell, Bash and Python.

Web Technologies: Node.js.

Monitoring Tools: AWS CloudWatch, Datadog, ELK Stack, Nagios, AppDynamics.

Application Monitoring Tool: Datadog.

Messaging Services: RabbitMQ, Amazon SNS and Apache Kafka.

Logging: Logstash, Splunk.

Operating Systems: Linux, Unix, CentOS, Ubuntu, Windows.

Big Data Tools: Hadoop, Kafka, AWS EMR.

Programming Languages: C, JavaScript and Java.

PROFESSIONAL EXPERIENCE:

Confidential, Austin, TX

DevOps Engineer

Responsibilities:

  • Involved in DevOps migration/automation processes for build and deploy systems.
  • Set up various Jenkins jobs for commit-triggered builds as well as nightly builds.
  • Heavily involved in writing Bash scripts for building deployment pipelines.
  • Added multiple nodes to Jenkins and configured SSH for continuous deployments.
  • Implemented the build framework for new projects using Jenkins and Maven as build tools.
  • Created local and cloud Hadoop clusters via Ambari and the command line.
  • Performed WebLogic Server administration tasks such as installation, configuration, monitoring, and performance tuning in a Linux environment.
  • Worked on Ansible playbooks for Kafka, Grafana, Prometheus, and its exporters.
  • Deployed microservices, including provisioning AWS environments, using Ansible playbooks.
  • Installed and managed Grafana to visualize the metrics collected by Prometheus.
  • Used Ansible to set up and tear down the ELK stack (Elasticsearch, Logstash, Kibana); troubleshot ELK build issues and worked toward solutions.
  • Worked with storage gateway stored volumes, cached volumes, and the virtual tape library, and installed the gateway VM image on a host in our VMware-based datacenter.
  • Worked with the team to build out automation templates in AWS CloudFormation in support of the managed services platform (a deployment sketch follows this list).
  • Set up Datadog monitoring across different servers and AWS services.
  • Wrote several Chef cookbooks from scratch with recipes that provision pre-prod environments, covering WebLogic domain creation, deployment automation, instance mirroring, and several proprietary middleware installations.
  • Created custom AMIs and AMI tags and modified AMI permissions for EMR stacks (see the AMI and artifact sketch after this list).
  • Used AWS S3 as a build artifact repository, creating release-based buckets to store artifacts per module and branch.
  • Used Terraform to provision the AWS infrastructure as the organization moved off its previous on-premises footprint.
  • Used Amazon Elastic Kubernetes Service (EKS) to deploy and manage Docker containers (a container build-and-deploy sketch follows this list).
  • Created containers from Dockerfiles and clustered Docker hosts.
  • Used Kubernetes, an open-source platform for automating deployment, scaling, and operation of application containers across clusters of hosts, providing container-centric infrastructure.
  • Continuously created wiki pages and educated the team about the automation and branching strategies to be followed.
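The CloudFormation deployment sketch referenced above. The stack name, template path, and parameter values are illustrative only:

```bash
#!/usr/bin/env bash
# Validate and deploy one of the CloudFormation automation templates.
set -euo pipefail

STACK_NAME="managed-services-network"     # hypothetical stack
TEMPLATE="templates/network.yaml"         # hypothetical template path

# Validate the template before touching the stack.
aws cloudformation validate-template --template-body "file://${TEMPLATE}"

# Create or update the stack idempotently.
aws cloudformation deploy \
  --stack-name "${STACK_NAME}" \
  --template-file "${TEMPLATE}" \
  --parameter-overrides Environment=preprod \
  --capabilities CAPABILITY_NAMED_IAM
```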
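The AMI and artifact sketch referenced above: baking and tagging a custom AMI for an EMR stack, then copying a build output into the release-based S3 layout. The instance ID, tag values, bucket, and module names are placeholders:

```bash
#!/usr/bin/env bash
# Create and tag a custom AMI, then publish a build artifact to S3.
set -euo pipefail

SOURCE_INSTANCE="i-0123456789abcdef0"     # hypothetical prepared instance
AMI_NAME="emr-base-$(date +%Y%m%d)"

# Create the AMI from a prepared instance and capture its ID.
AMI_ID=$(aws ec2 create-image \
  --instance-id "${SOURCE_INSTANCE}" \
  --name "${AMI_NAME}" \
  --query ImageId --output text)

# Tag the AMI so EMR stacks can find the right image.
aws ec2 create-tags --resources "${AMI_ID}" \
  --tags Key=Purpose,Value=emr Key=Release,Value=2024.1

# Release-based artifact bucket layout: one prefix per release and module.
aws s3 cp target/app.jar \
  "s3://release-artifacts-demo/releases/2024.1/app-module/app.jar"
```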
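The container build-and-deploy sketch referenced above. It assumes images are pushed to ECR (not stated in the bullets) and that the account ID, region, repository, cluster, container, and deployment names shown here are placeholders:

```bash
#!/usr/bin/env bash
# Build an image, push it to ECR, and roll it out on an EKS deployment.
set -euo pipefail

REGION="us-east-1"
ACCOUNT_ID="123456789012"                               # hypothetical
REPO="${ACCOUNT_ID}.dkr.ecr.${REGION}.amazonaws.com/sample-service"
TAG="${BUILD_NUMBER:-dev}"

# Authenticate Docker to the registry, then build and push the image.
aws ecr get-login-password --region "${REGION}" \
  | docker login --username AWS --password-stdin "${REPO%%/*}"
docker build -t "${REPO}:${TAG}" .
docker push "${REPO}:${TAG}"

# Point kubectl at the EKS cluster and roll out the new image.
aws eks update-kubeconfig --name sample-cluster --region "${REGION}"
kubectl set image deployment/sample-service \
  sample-service="${REPO}:${TAG}"
kubectl rollout status deployment/sample-service
```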

Environment: Git, Hadoop, Jenkins Enterprise, DynamoDB, SQL, Chef, Ansible, Maven, JFrog, AWS, OpenStack, Ant, Java, Grafana, Prometheus, Tomcat, JBoss, Splunk, HTML, ServiceNow, Jira, Docker, Unix/Linux.

Confidential, San Jose, CA

DevOps Engineer/Cloud Engineer

Responsibilities:

  • Responsible for setting up from scratch and maintaining automated CI/CD pipelines for multiple applications.
  • Configured Jenkins system and security settings, added multiple nodes to Jenkins, and configured SSH for continuous deployments.
  • Developed automation scripts in Shell and Python that create release branches through Jenkins Enterprise.
  • Implemented Ansible to manage all existing servers, automate the build and configuration of new servers, and set up the Continuous Delivery pipeline.
  • Supported the Java project code base and used Maven as the build tool.
  • Migrated Ant build.xml files to Maven pom.xml files.
  • Wrote Chef cookbooks and recipes to provision several pre-prod environments, covering deployment automation, AWS EC2 instance mirroring, WebLogic domain creation, and several proprietary middleware installations.
  • Deployed and configured Elasticsearch, Logstash, Kibana (ELK) and AWS Kinesis for log analytics; monitored servers using Nagios, AppDynamics, Splunk, AWS CloudWatch, Azure Monitor, and ELK.
  • Responsible for build and deployment automation using VMware ESX, Docker containers, and Ansible.
  • Worked on PCF for effective deployment of applications to the cloud using Spring Boot.
  • Worked with several AWS services such as EC2, EBS, S3, Cognito, VPC, EKS, CloudFormation, and CloudWatch, and worked on migrating applications to the AWS cloud.
  • Automated the build and deployment process on OpenStack.
  • Maintained Ubuntu build machines and kept them updated with the latest security fixes.
  • Used CloudFormation templates and launch configurations to automate repeatable provisioning of AWS resources for applications.
  • Worked with Docker Trusted Registry as the repository for our Docker images and worked with Docker container networks.
  • Applied Docker best practices to create base images from scratch and to write clear, readable, and maintainable Dockerfiles.
  • Wrote several Dockerfiles to build Tomcat container images with the required JDK version (a sketch follows this list).
  • Hands-on with container orchestration tools like Kubernetes; experienced with Red Hat OpenShift Enterprise, a Platform-as-a-Service product.
  • Worked with the testing teams to automate test cases as part of post-deployment actions and set up Cucumber for test automation.
  • Responsible for configuring and maintaining all pre-prod environments, including complex middleware setups with several variations of Tomcat installations and multiple instances in each installation.
  • Designed and implemented cloud infrastructure by creating Azure Resource Manager (ARM) templates for the Azure platform, and used Terraform to deploy the infrastructure needed for the development, test, and production environments of a software development project (see the Terraform environment sketch after this list).
  • Expertise with Azure Active Directory, creating roles and tenants and assigning various security policies.
  • Good exposure to APM tools such as Datadog.
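The Tomcat image sketch referenced above, expressed as a small shell script that writes and builds a Dockerfile. The base-image tag and WAR path are illustrative, not the exact versions used on the project:

```bash
#!/usr/bin/env bash
# Generate a Tomcat + pinned-JDK Dockerfile and build the image.
set -euo pipefail

cat > Dockerfile <<'EOF'
# Official Tomcat image whose tag pins both the Tomcat and JDK versions.
FROM tomcat:9.0-jdk11-temurin
# Deploy the packaged application as the ROOT web app.
COPY target/sample-app.war /usr/local/tomcat/webapps/ROOT.war
EXPOSE 8080
CMD ["catalina.sh", "run"]
EOF

docker build -t sample-app:latest .
```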
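The Terraform environment sketch referenced above: one way to stamp out development, test, and production environments with workspaces and per-environment variable files. The directory layout and tfvars naming are assumptions, not the project's actual structure:

```bash
#!/usr/bin/env bash
# Provision one environment (dev/test/prod) from a shared Terraform root.
set -euo pipefail

ENVIRONMENT="${1:?usage: deploy-env.sh <dev|test|prod>}"

cd infrastructure/                         # hypothetical Terraform root

terraform init -input=false
# Select (or create) the workspace that isolates this environment's state.
terraform workspace select "${ENVIRONMENT}" \
  || terraform workspace new "${ENVIRONMENT}"

# Plan and apply with the environment-specific variables.
terraform plan  -var-file="env/${ENVIRONMENT}.tfvars" -out=tfplan
terraform apply -input=false tfplan
```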

Environment: Git, Jenkins Enterprise, Chef, Maven, Gradle, Spring Boot, Nexus, JFrog, Docker, OpenShift, VMware, JUnit, Kubernetes, AWS, Azure, Splunk, OpenStack, WebSphere, Ant, Ansible, Confluence, Atlassian Jira, Rally, Unix/Linux.

Confidential, Sunnyvale, CA

DevOps/Cloud Engineer

Responsibilities:

  • Wrote pre-commit, post-commit, and post-receive hooks in Git (a pre-commit hook sketch follows this list).
  • Built end-to-end CI/CD pipelines in Jenkins to retrieve code and push build artifacts to the Nexus artifact repository.
  • Performed tasks from bootstrapping nodes and executing run lists that mirror new nodes as web or application servers, to running deployments against nodes newly added to the clusters.
  • Created and automated Jenkins pipelines for the applications using Groovy pipeline scripts.
  • Worked on Google Cloud Platform (GCP) services such as Compute Engine, Cloud Load Balancing, Cloud Storage, Cloud SQL, Stackdriver Monitoring, and Cloud Deployment Manager (see the gcloud sketch after this list).
  • Knowledge of Splunk, Kibana, and the ELK stack.
  • Used Ansible and Ansible Tower as configuration management tools to automate repetitive tasks, quickly deploy critical applications, and proactively manage change.
  • Worked with Docker container snapshots, attaching to a running container, managing containers, directory structures, and removing Docker images.
  • Created containers from Dockerfiles and clustered Docker hosts.
  • With Kubernetes, we could quickly and efficiently respond to changes in demand and deploy our applications quickly and predictably.
  • Configured AWS IAM and security groups in public and private subnets in the VPC.
  • Created AWS IAM command-line tooling used as part of an on-prem identity-broker type of application.
  • Created and designed Terraform templates to build custom-sized VPCs and NAT subnets for the deployment of web applications and databases.
  • Worked effectively in configuring and managing monitoring tools such as Nagios and AppDynamics.
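The pre-commit hook sketch referenced above (a script saved as .git/hooks/pre-commit and made executable). The specific checks are generic examples, not the project's actual policy:

```bash
#!/usr/bin/env bash
# Git pre-commit hook: reject obviously bad commits before they land.
set -euo pipefail

# Block commits that still contain merge-conflict markers.
if git diff --cached | grep -E '^\+.*(<<<<<<<|>>>>>>>)' >/dev/null; then
  echo "pre-commit: conflict markers found in staged changes" >&2
  exit 1
fi

# Reject oversized files being added or modified (threshold is arbitrary here).
for f in $(git diff --cached --name-only --diff-filter=AM); do
  if [ -f "$f" ] && [ "$(wc -c < "$f")" -gt 5242880 ]; then
    echo "pre-commit: $f is larger than 5 MB" >&2
    exit 1
  fi
done
```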
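The gcloud sketch referenced above, showing routine Compute Engine and Cloud Storage usage from the command line. The project, zone, instance, and bucket names are placeholders:

```bash
#!/usr/bin/env bash
# Provision a small Compute Engine VM and copy an artifact to Cloud Storage.
set -euo pipefail

PROJECT="sample-project"          # hypothetical GCP project
ZONE="us-central1-a"

gcloud config set project "${PROJECT}"

# Create a small VM, e.g. for a build agent.
gcloud compute instances create build-agent-1 \
  --zone="${ZONE}" --machine-type=e2-medium

# Copy a build artifact into a Cloud Storage bucket.
gsutil cp target/sample-app.war gs://sample-project-artifacts/
```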

Environment: Java/J2EE, Tomcat, Git, Salt, AWS, Google Cloud, Linux/Unix, HTML, XML, Nexus, JFrog Artifactory, Nagios, Gradle, Ant, Maven, Jenkins, Bash, Agile.

Confidential

System Analyst

Responsibilities:

  • Installed and configured Red Hat Linux, CentOS, Fedora, and Ubuntu, and performed tasks such as troubleshooting connectivity, disk space, CPU and memory consumption, and application status.
  • Developed a TMS and connected it with SOAP web services and REST APIs to track load status, get freight quotes, and create shipments.
  • Designed a SQL database for a manpower scheduling web app.
  • Responsible for resolving connectivity issues among various servers and software components.
  • Configured network services such as NFS, DHCP, DNS, Samba, FTP, HTTP, TCP/IP, SSH, and firewalls running on Red Hat Linux, Sun Solaris, and AIX.
  • Software product developer for a Linux kernel IP failover product in a cluster environment (Apache/NFS).
  • OS kernel software developer on UNIX clustering, UnixWare 7.x (Compaq NonStop Cluster).
  • Developed a Linux performance tool, "net star" (C), that monitors TCP/IP IEEE 802.x network activity at a per-process level for system and support analysts.
