
DevOps Engineer Resume


Raleigh

SUMMARY

  • AWS DevOps Engineer with around 9 years of relevant professional experience dedicated to automation and optimization. Understands and manages the space between operations and development to deliver code to customers quickly. Has experience with the cloud as well as DevOps automation development for Linux systems. Seeking a position in AWS/DevOps to contribute my technical knowledge.
  • Expertise in DevOps tools such as Git, Jenkins, Maven, Ansible, Chef, Puppet, Docker, Kubernetes, AWS, Azure, Terraform, Splunk, and Nagios.
  • Experienced in Continuous Integration (CI) and Continuous Delivery (CD) process implementation using Jenkins, TeamCity, and Bamboo.
  • Good knowledge of configuration management tools such as Chef, Ansible, and Puppet; automated Linux production server setup using Chef cookbooks and Ansible playbooks.
  • Experience with Docker containerized environments, hosting web servers in containers, and building Docker images from Dockerfiles.
  • Expertise working with Kubernetes to automate deployment, scaling and management of web Containerized applications.
  • Worked on Kubernetes to orchestrate Docker containers of new and existing applications, as well as deployment and management of complex runtime environments.
  • Expertise in administering and automating operations across multiple platforms (UNIX, Linux, Windows, and macOS).
  • Proficient in the AWS Cloud platform and its features, including EC2, VPC, EBS, CloudWatch, CloudTrail, CloudFormation, CloudFront, IAM, S3, Route 53, SNS, and SQS.
  • Experience in working on Docker and Vagrant for different infrastructure setup and testing of code.
  • Wrote shell/Python scripts and implemented an automated deployment process, reducing deployment time.
  • Experience on Python, Bash/Shell, Ruby, Perl, PowerShell, JSON, YAML and Groovy.
  • Knowledge of Java lambda expressions to retrieve data from collections using functional interfaces.
  • Proficient in using Unix commands and utilities to monitor server-side activities and performance.
  • Experience with Azure Site Recovery and Azure Backup: installed and configured the Azure Backup agent for virtual machine backup, enabled Azure virtual machine backup from the vault, and configured Azure Site Recovery (ASR).
  • Experience in branching, tagging, and maintaining the version across the environments using SCM tools like Subversion (SVN), Git, Bitbucket, and GitHub on UNIX and Windows environment.
  • Expertise in working with diverse database platforms for installing, configuring, and managing NoSQL and RDBMS tools such as MySQL, Oracle, DynamoDB, and MongoDB.
  • Worked on various network protocols like HTTP, UDP, FTP, TCP/IP, SMTP, SSH, SFTP & DNS and technologies like load balancers (ELB), ACL, Firewalls.
  • Experienced in handling production incidents, queries, and problems through the Remedy ticketing system, and non-production issues/tasks/incidents through the JIRA ticketing system.
  • Good understanding of web and application servers like Apache Tomcat, WebSphere, Nginx.
  • Used build tools like Maven, ANT, and Gradle for the building of deployable artifacts from source code.
  • Experience in building and deploying Java & SOA applications and troubleshooting the build and deploy failures.
  • Proficient with Terraform Configuration files to spin up the infrastructure very easily and efficiently.
  • Managed the deployment of virtual storage, sites, and server infrastructure to augment clients’ datacenters using Agile methodology.
  • Experienced with log monitoring tools like Splunk, Nagios, and ELK (Elasticsearch, Logstash, Kibana) to view log information, monitor systems, and receive health and security notifications from nodes.
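As an illustrative sketch of the Kubernetes deployment work described above, the snippet below builds a minimal Deployment manifest programmatically (kubectl accepts JSON as well as YAML). All names, images, and ports here are hypothetical examples, not artifacts from any actual project.

```python
import json

def deployment_manifest(name, image, replicas=3):
    """Build a minimal Kubernetes Deployment manifest as a dict.

    The app name, image, and port are hypothetical placeholders.
    """
    labels = {"app": name}
    return {
        "apiVersion": "apps/v1",
        "kind": "Deployment",
        "metadata": {"name": name, "labels": labels},
        "spec": {
            "replicas": replicas,
            "selector": {"matchLabels": labels},
            "template": {
                "metadata": {"labels": labels},
                "spec": {
                    "containers": [{
                        "name": name,
                        "image": image,
                        "ports": [{"containerPort": 8080}],
                    }]
                },
            },
        },
    }

manifest = deployment_manifest("web-app", "example/web-app:1.0")
print(json.dumps(manifest, indent=2))
```

Generating manifests from code like this keeps replica counts, labels, and selectors consistent across environments instead of hand-editing YAML per environment.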

TECHNICAL SKILLS:

Cloud Technologies: AWS EC2, IAM, AMI, Elastic Load Balancer (ELB), EBS, Auto Scaling, Cloud Front, S3, SNS, Route53, VPC, Security groups, Cloud Watch

Operating Systems: Windows, Linux - RHEL 6/7

Scripting: Bash Shell Scripting

Versioning Tools/ SCM: GIT, GitHub, Bitbucket

Build Tools: Maven

CI Automation Tools: Jenkins, SonarQube, JFrog Artifactory

Web/Application Servers: Apache, Tomcat, WebSphere, WebLogic

Containerization Tools: Docker, Kubernetes

IaC Tools: Terraform

Monitoring Tools: Prometheus, Grafana, ELK Stack, Nagios

PROFESSIONAL EXPERIENCE:

Confidential, Raleigh

DevOps Engineer

Responsibilities:

  • Experience in dealing with Windows Azure IaaS: Virtual Networks, Virtual Machines, Cloud Services, Resource Groups, ExpressRoute, VPN, Load Balancing, Application Gateways, Auto-Scaling, and Traffic Manager.
  • Experience in configuring Azure Web Apps, Azure App Services, Azure Application Insights, Azure Application Gateway, Azure DNS, Azure Traffic Manager, and Azure Network Watcher, and in implementing Azure Site Recovery, Azure Backup, and Azure Automation.
  • Deploying the Virtual Machines with the Microsoft Monitoring Agent / Operational Management Suite (OMS) Extension using the PowerShell Scripts.
  • Created job chains with Jenkins Job Builder, Parameterized Triggers, and target host deployments. Utilized many Jenkins plugins and Jenkins API.
  • Built end to end CI/CD Pipelines in Jenkins to retrieve code, compile applications, perform tests and push build artifacts to Nexus Artifactory.
  • Created Jenkins Workflows for advanced deployment process (DB execution, Environment configuration changes etc.) on both QA and preproduction Environments
  • Worked on several Docker components like Docker Engine, Hub, Machine, Compose and Docker Registry.
  • Created Docker images by analysing various Jenkins metrics and provisioned them on Mesos, a container orchestration platform.
  • Worked on deployment automation of all the microservices to pull images from the private Docker registry and deploy them to the Docker Swarm cluster.
  • Wrote Chef Cookbooks and recipes in Ruby to provision pre-prod environments consisting of Cassandra DB installations, WebLogic domain creations and several proprietary middleware installations.
  • Built and deployed Java/J2EE applications to the web application server in an agile continuous integration environment, and automated labelling activities in TFS once deployment was done.
  • Tested cookbooks with Test Kitchen and Docker containers before uploading to the Chef server; good understanding of Knife, the Chef bootstrap process, etc.
  • Involved in performance optimization of SQL Server stored procedures and Analysis Services.
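The bullet above on building Docker images from Jenkins metrics can be sketched as a small decision step: given build results, rebuild images only for jobs that succeeded and carry new commits. The metrics shape below is a hypothetical simplification, not the real Jenkins API response.

```python
import json

# Hypothetical Jenkins build metrics; field names are illustrative only.
metrics_json = """
[
  {"job": "auth-service", "result": "SUCCESS", "changed": true},
  {"job": "cart-service", "result": "FAILURE", "changed": true},
  {"job": "web-frontend", "result": "SUCCESS", "changed": false}
]
"""

def images_to_build(metrics):
    # Rebuild an image only when the build passed and the source changed.
    return [m["job"] for m in metrics
            if m["result"] == "SUCCESS" and m["changed"]]

builds = images_to_build(json.loads(metrics_json))
print(builds)
```

In a real pipeline this filter would gate `docker build`/`docker push` steps, so failed or unchanged jobs do not produce new images in the registry.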

Environment: Azure, Jenkins, Chef, Nagios, Java/J2EE, .NET, Git, GitHub, Bamboo, WebLogic, Docker, Nexus, Python, Bash, Chef Server, Tomcat, Nginx, CentOS, Unix, JIRA, Sonar.

Confidential, California

AWS/DevOps Engineer

Responsibilities:

  • Implemented required AWS services for high-performance computing workloads, with particular experience leveraging EC2, S3 storage, EFS (Elastic File System), IAM (Identity and Access Management), CloudFormation, CloudWatch, and DynamoDB.
  • Automated AWS backups for applications such as Confluence using supported AWS services (EFS, RDS, etc.) and migrated Confluence data from source to destination using the AWS DataSync agent.
  • Responsible for administration of Git version control, performing branching, tagging, backup, and restore activities.
  • Automated the process of deployment to Apache Tomcat Application Servers by developing Unix/Python Scripts.
  • Worked extensively on a broad range of AWS services, provisioning EC2, VPC, ELB, Security Groups, IAM, EBS, S3, SNS, SQS, Route53, CloudWatch, CloudFront, CloudTrail, and RDS.
  • Worked on container-based deployments using the Docker images and Docker registries, pulled the required images from Docker Hub, Nexus. Used Docker to avoid the environment difficulties between Developers and Production.
  • Involved in setting CI/CD Pipelines using Jenkins, Maven, Github, Docker, and Nexus.
  • Created and developed deployments, namespaces, Pods, Services, Health checks, and persistent volumes etc., for Kubernetes in YAML Language.
  • Designed and implemented the CI/CD architecture and automation solutions using GitHub, Bitbucket, Jenkins, Bamboo, and Ansible Tower; deployed to the production environment in AWS using Terraform.
  • Implemented Fortify and Black Duck scans for security, and deployed binary files such as zip, jar, and dll files to JFrog Artifactory using Jenkins and Bamboo.
  • Automation of Data transfer between on-premises andCloud platformsleveraging HPC scripts.
  • Designed and deployed infrastructure and services for new applications using tools such as SonarQube, Black Duck, and Jenkins.
  • Created a testing framework with AWS services like AWS Gateway, DynamoDB, and EC2 to test different AKANA policies.
  • Created Gatling scripts to test different AKANA QoS policies and performed AKANA end-to-end testing using the Gatling tool.
  • Deployed a centralized, auto-scaling monitoring system for infrastructure using Grafana, Puppet, and PuppetDB, with integration of many custom plugins for real-time monitoring.
  • Automated the SDLC with scripting languages such as Python, Shell, and Groovy; performed system troubleshooting and problem-solving across platform and application domains.
  • Work closely with development team by rapidly deploying instances, monitoring operating efficiencies of the platform, and responding as needs arise.
  • Used Ansible to deploy ELK for automating continuous deployment (CD), configured nodes and deployment-failure reporting, and worked with Site Reliability Engineers to implement Datadog system metrics and analytics.
  • Accountable for the integration/maintenance/development of application and server monitoring tools (for DevOps): Nagios, Prometheus, Kibana, and Datadog.
  • Monitor applications and environment by managing the installation/configuration of Splunk and ELK.
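The automated backup work described above always involves a retention decision: which snapshots are old enough to prune. The sketch below shows that logic in isolation; the snapshot IDs, dates, and 14-day window are hypothetical examples, not values from the actual project.

```python
from datetime import datetime, timedelta

def snapshots_to_delete(snapshots, retention_days, now=None):
    """Return IDs of snapshots older than the retention window.

    `snapshots` maps snapshot ID -> creation datetime. This mirrors the
    pruning step an automated EFS/RDS backup job performs.
    """
    now = now or datetime.utcnow()
    cutoff = now - timedelta(days=retention_days)
    return sorted(sid for sid, created in snapshots.items() if created < cutoff)

# Hypothetical snapshot inventory for illustration.
now = datetime(2020, 6, 30)
snaps = {
    "rds-2020-06-01": datetime(2020, 6, 1),
    "rds-2020-06-20": datetime(2020, 6, 20),
    "rds-2020-06-29": datetime(2020, 6, 29),
}
old = snapshots_to_delete(snaps, retention_days=14, now=now)
print(old)
```

Passing `now` explicitly keeps the function deterministic and easy to unit-test, which matters when the same logic runs unattended on a schedule.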

Environment: Git, Black Duck, Bitbucket, Confluence, AKANA, Jenkins Pipeline, Groovy, Ansible, YAML, Docker, Maven, AWS, EC2, EFS, RDS, Route53, DynamoDB, S3, CloudFormation, IAM, SonarQube, JFrog Artifactory, XML, JSON, Rally, Apache Tomcat, Gatling, Python, Shell.

Confidential, Detroit, MI

AWS/DevOps Engineer

Responsibilities:

  • Involved in designing and deploying a multitude of applications utilizing almost all of the AWS stack (including EC2, Route53, S3, RDS, DynamoDB, SNS, SQS, and IAM), focusing on high availability, fault tolerance, and auto-scaling in AWS CloudFormation.
  • Used MySQL and DynamoDB to perform basic database administration.
  • Configured S3 versioning and lifecycle policies to back up files and archive them in Glacier.
  • Configured and managed source code using GIT and resolved code merging conflicts in collaboration with application developers.
  • Managed GIT in branching, tagging, and maintaining the version across the environments using SCM tools like GIT on Linux and Windows platforms.
  • Performed regular builds and deployment of the packages for testing in different Environments.
  • Created the automated build and deployment process for applications, leading up to building a CI/CD system for all our products from commit to deployment using Jenkins.
  • Automated Continuous Integration/Continuous Deployment with Jenkins, the build-pipeline plugin, Maven, and Git, setting up Jenkins master/slave configurations to distribute builds across slave nodes.
  • Experience running LAMP (Linux, Apache, MySQL, and PHP) systems in an agile, rapidly scaling cloud environment.
  • Perform regular DBA activities including space management and performance monitoring.
  • Involved in various phases of the Software Development Life Cycle of the application including Requirement Analysis, Design, Implementation, Testing, and Maintenance.
  • Performed user and group creation, and monitored and maintained logs for system status/health using Linux commands and the Nagios system monitor.
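The S3 lifecycle bullet above (transition to Glacier, then expire) maps onto a small JSON document. The sketch below builds one in the shape the AWS CLI's lifecycle-configuration commands consume; the prefix and day counts are illustrative assumptions, not the project's real settings.

```python
import json

def lifecycle_policy(prefix, glacier_after_days, expire_after_days):
    """Build an S3 lifecycle configuration that moves objects under
    `prefix` to Glacier after one threshold and expires them after another.
    """
    return {
        "Rules": [{
            "ID": "archive-old-backups",  # hypothetical rule name
            "Filter": {"Prefix": prefix},
            "Status": "Enabled",
            "Transitions": [
                {"Days": glacier_after_days, "StorageClass": "GLACIER"}
            ],
            "Expiration": {"Days": expire_after_days},
        }]
    }

# Example: archive backups to Glacier after 30 days, delete after a year.
policy = lifecycle_policy("backups/", 30, 365)
print(json.dumps(policy, indent=2))
```

A document like this would typically be applied with `aws s3api put-bucket-lifecycle-configuration`; generating it from code keeps the thresholds reviewable alongside the rest of the infrastructure definitions.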

Environment: Git, GitHub, AWS EC2, S3, VPC, IAM, CloudWatch, DynamoDB, Apache Tomcat, Linux, Data Centre Migration, Nagios, Jenkins, Maven.

Confidential

Sr. Quality Analyst

Responsibilities:

  • Worked on Integration Testing of Membership, Providers, Referrals and Claims.
  • Worked on the GA regional Membership system called Common Membership (CM) to test the eligibility and benefit administration.
  • Worked on the Provider data transfer in HL7 format from the Regional Provider System to Third Party Administrator (TPA).
  • Analyzed requirements to determine testing feasibility of Epic applications (Cadence, Ambulatory, Resolute and Tapestry).
  • Achieved the open enrollment deadlines of the Self-Funding Benefit project for the system and integration testing of the Middleware component for membership enrollment.
  • Extensively performed transformation testing of all EDI file formats at the segment and field level.
  • Validated the data transfer in X12 EDI format from TPA to the Regional Systems and vice versa.
  • Validated the Membership data transfer from TPA to the Regional Membership system (CM) and vice versa in the form of X12 EDI 834 transactions.
  • Validated the claims (in X12 EDI 837 format) flow from Resolute to TPA.
  • Validated the Remittances (X12 EDI 835) transactions and loading the remit data from TPA into Resolute.
  • Created test data to satisfy different positive and negative scenarios.
  • Tested the Inbound / Outbound Interfaces to EPIC and populated XML files.
  • Validated the data on Mainframes.
  • Validated the Execution and Scheduling of Batch Jobs in Mainframes (JCL).
  • Managed and documented test cases corresponding to business rules and other operating conditions in Quality Center.
  • Executed the test cases in Quality Center.
  • Reported the defects in Quality Center.
  • Retested bugs as they were fixed and updated their status accordingly in Quality Center.
  • Attended daily status and defect call meetings with the team and vendors.
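Segment- and field-level validation of X12 transactions like the 834s above starts with splitting the raw file on its segment terminator and element separator. The sketch below shows that step on a fabricated, heavily truncated fragment; real 834 files carry many more segments and use delimiters declared in the ISA header.

```python
def parse_x12(raw, seg_term="~", elem_sep="*"):
    """Split raw X12 text into segments, each a list of elements.

    Delimiters default to the common "~" and "*" but vary per trading
    partner, so real code reads them from the interchange header.
    """
    segments = [s for s in raw.strip().split(seg_term) if s]
    return [seg.split(elem_sep) for seg in segments]

# Fabricated, truncated 834 fragment for illustration only.
raw = "ST*834*0001~BGN*00*12456*20200101~INS*Y*18*030~SE*4*0001~"
segs = parse_x12(raw)
for seg in segs:
    print(seg[0], seg[1:])
```

Once segments are lists of elements, field-level checks (e.g. that INS02 carries the expected relationship code) become simple index comparisons.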

Environment: EPIC, ePremis, EMR/EHR, XML, Mainframes (CICS), UNIX, Mercury Quality Center, IBM Tivoli, puTTy, EDI NOTE PAD, Visual CACTUS.

Confidential

Quality Analyst

Responsibilities:

  • Worked as a manual tester to create test scenarios, test plans, and test cases from system design documents.
  • Involved in generating Test cases for existing system to new system for different Levels of Business.
  • Prepared defect metrics and productivity metrics for the team; worked as a back-end tester creating manual test cases.
  • Responsible for reviewing the development standards, testing standards, and processes.
  • Performed positive and negative testing, black box testing, and end-user testing.
  • Used SQL Server Management Studio for queries.
  • Performed Unit testing and System testing.
  • Created Unit Test Cases and developed Unit Test Plans.
  • Unit tested the newly developed and/or modified software.
  • Created Unit Test Summary reports.
  • Performed smoke, usability, functionality, GUI, browser compatibility and regression tests.
  • Actively participated in regular QA Team meetings to discuss testing process and resolve issues.
  • Interacted closely with developers, environment people, client, team manager, team leads and team members for feature issues and discussion.
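Back-end testing with SQL, as mentioned above, often boils down to asserting that a query over loaded data returns the expected counts or rows. The sketch below illustrates the pattern using Python's built-in sqlite3 as a stand-in for SQL Server Management Studio; the table and fixture rows are hypothetical.

```python
import sqlite3

# In-memory database standing in for the system under test.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE members (id INTEGER PRIMARY KEY, status TEXT)")
conn.executemany(
    "INSERT INTO members (id, status) VALUES (?, ?)",
    [(1, "ACTIVE"), (2, "TERMED"), (3, "ACTIVE")],  # fabricated fixtures
)

# Back-end check: only members with ACTIVE status pass the eligibility filter.
active = conn.execute(
    "SELECT COUNT(*) FROM members WHERE status = 'ACTIVE'"
).fetchone()[0]
print(active)
```

The same assertion style scales to joins across membership, provider, and claims tables, which is where most integration-test defects surface.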
