
Devops / Cloud Engineer Resume


Washington, DC

SUMMARY:

  • Over 8 years of IT industry experience with Configuration Management, Build/Deploy and Release Management, AWS, and DevOps methodologies.
  • Extensive experience in Software Development Life Cycle (SDLC) including requirements analysis, design specification, coding and testing of enterprise applications.
  • Full understanding of SDLC methodologies, including RUP, Agile, and Waterfall.
  • Strong experience with Docker management platforms; leveraged custom Docker images to run containerized applications within the Docker Engine as multi-stack applications such as LAMP.
  • Experience with patching Red Hat Linux servers and hardening servers using native and third-party tools.
  • Experience with the tools in Hadoop Ecosystem including Pig, Hive, HDFS, MapReduce, Sqoop, Yarn, Oozie, and Zookeeper.
  • In-depth understanding of the principles and best practices of Software Configuration Management (SCM) processes, which include compiling, packaging, deploying and Application configurations.
  • Experience in project analysis, gathering user requirements, technical design and training customers.
  • Experience in cloud automation using AWS CloudFormation templates and migration to Amazon Web Services (AWS).
  • Created Infrastructure in a Coded manner (Infrastructure as Code) using Terraform.
  • Built Jenkins jobs to create AWS infrastructure from GitHub repos containing Terraform code.
  • Managed different infrastructure resources, like physical machines, VMs and even Docker containers using Terraform.
  • Installed and configured/customized the Jenkins, Nexus and SonarQube tools to enforce the GSA CM policy & procedures.
  • Administered 7 Bamboo servers and Jenkins which includes install, upgrade, backup, adding users, creating plans, installing local/remote agent, adding capabilities, performance tuning, troubleshooting issues and maintenance.
  • Experience in Software Configuration Management process and customizing/using the IBM Build Forge tool and Continuous Integration tools like Bamboo, Jenkins.
  • Skilled in implementing Jenkins/Hudson as Continuous Integration tools.
  • Strong hands on development and configuration experience with software provisioning tools like Chef, Puppet and Vagrant.
  • Experience in real-time analytics with Apache Spark (RDDs, DataFrames, and the Streaming API).
  • Experience in designing and developing applications using Java and Big Data technologies like Apache Spark/ Hadoop.
  • Designed, built and tested completely rebuilt infrastructure to move sleepnumber.com eCommerce website from static hosting in Rackspace to containerized infrastructure-as-code maintained in AWS utilizing automation tools and frameworks like Ansible, Terraform, Nomad, Consul, Kubernetes.
  • Working knowledge of other NoSQL databases.
  • Software engineering in distributed systems development within the cloud using Scala, AWS, REST, and Linux servers.
  • Experienced in DevOps/Agile operations processes and tooling (code review, unit test automation, build and release automation, and environment, service, incident, and change management).
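
The Jenkins-plus-Terraform workflow mentioned above (Jenkins jobs building AWS infrastructure from GitHub repos of Terraform code) can be sketched as a small wrapper. The workspace names and the per-environment var-file layout are illustrative assumptions, not taken from any particular project:

```python
# Sketch of how a Jenkins job might drive Terraform against a GitHub checkout.
# The var-file path convention ("vars/<env>.tfvars") is a hypothetical example.

def terraform_pipeline_cmds(workspace: str) -> list[list[str]]:
    """Return the Terraform commands a CI job would run, in order."""
    var_file = f"vars/{workspace}.tfvars"  # hypothetical per-environment var file
    return [
        ["terraform", "init", "-input=false"],             # fetch providers/modules
        ["terraform", "workspace", "select", workspace],   # pick the environment
        ["terraform", "plan", f"-var-file={var_file}", "-out=tfplan"],
        ["terraform", "apply", "-input=false", "tfplan"],  # apply the saved plan
    ]

if __name__ == "__main__":
    for cmd in terraform_pipeline_cmds("staging"):
        print(" ".join(cmd))
```

Applying a saved plan file (`tfplan`) rather than re-planning keeps the CI apply step non-interactive and guarantees it executes exactly what was reviewed.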

TECHNICAL SKILLS:

Configuration Management: Chef, Puppet, Ansible, Docker, Consul, Kubernetes, Terraform.

Continuous Integration Tools: Jenkins, Rundeck, IBM UrbanCode Deploy, Bamboo, Electric Cloud.

Build Tools: ANT, Maven

Application/Web Servers: WebLogic, Apache Tomcat, JBoss

Scripting Languages: PowerShell, Ruby, Python

Logging/Monitoring Tools: Splunk, Logstash, Atlassian JIRA, CloudWatch, Nagios.

Version Control: Subversion, GIT, GitHub, GitLab

Cloud Services: AWS, Azure.

Operating Systems: Red Hat, CentOS, Ubuntu, Windows

PROFESSIONAL EXPERIENCE:

Confidential, Washington, DC

DevOps / Cloud Engineer

Responsibilities:

  • Provided highly durable and available data using the S3 data store, versioning, and lifecycle policies, and created AMIs of mission-critical production servers for backup.
  • Utilized CloudWatch to monitor resources such as EC2 (CPU, memory), Amazon RDS DB services, DynamoDB tables, and EBS volumes; to set alarms for notification or automated actions; and to monitor logs for a better understanding and operation of the system.
  • Created RDDs by extracting data from Azure Blob Storage (Blobs, Files, Tables, and Queues) and applying transformations and actions.
  • Designed highly scalable, distributed, and digitally modernized New Business and Underwriting systems utilizing EDA, microservices, Kafka, Spring Boot, REST, Docker, CI/CD, AWS, and OpenShift.
  • Built container native MicroServices-based application development platform for Oracle's bare-metal cloud (BMC).
  • Led technical research for Oracle's Java-based MicroServices keynote at JavaOne 2016.
  • Extracted data from Azure Data Lake into an HDInsight cluster and applied Spark transformations and actions before loading into HDFS.
  • Documented system configurations; instance, OS, and AMI build practices; backup procedures; and troubleshooting guides, and kept infrastructure and architecture drawings current with changes.
  • Helped Scrum Masters across the company customize JIRA for their requirements.
  • Provided all-round support for Splunk forwarder logging issues and troubleshot servers that were not forwarding events.
  • Implemented Azure Data Factory pipelines, datasets, copy and transform data in bulk via Data Factory UI and PowerShell, scheduling and exporting data.
  • Designed and developed standalone data-migration applications in Python to retrieve and populate data from Azure Table/Blob storage into HDInsight and Power BI.
  • Exported data to Azure Data Lake Stores and stored them in their native formats using various sources, Relational and Semi-structured data using Azure Data Factory.
  • Good working experience with Azure Logic Apps, Service Bus, DocumentDB, and SQL Database.
  • Integrated Active Directory with JIRA using the JIRA user directory and Atlassian Crowd.
  • Integrated Jenkins with code quality analysis and review tools such as SonarQube.
  • Deployed Nagios to monitor approximately 300 Linux systems. Wrote numerous custom plugins to monitor specific application parameters as requested by the team leads. This solution was a tool for migration and application deployment.
  • The Nagios checks were geared toward confirming configuration items, software dependencies, and client requirements on application server pods.
  • Coordinated with developers to resolve TFS build failures and issues. Developed a continuous deployment pipeline using Jenkins and shell scripts. Installed and configured ElectricFlow on Amazon Web Services.
  • Continuously delivered using Electric Cloud.
  • Developed Spark Streaming application using Scala to read data from Oracle tables and publish to Kafka topic.
  • Configured Jenkins servers, Jenkins nodes and required Perl and Python scripts.
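
A custom Nagios plugin like the ones described above follows a simple contract: print one status line and exit with a Nagios status code (0 = OK, 1 = WARNING, 2 = CRITICAL). A minimal Python sketch checking disk usage; the 80%/90% thresholds are illustrative assumptions, not values from any actual deployment:

```python
import shutil

# Nagios plugin exit codes
OK, WARNING, CRITICAL = 0, 1, 2

def check_disk(used_pct: float, warn: float = 80.0, crit: float = 90.0):
    """Classify disk usage against warning/critical thresholds (illustrative)."""
    if used_pct >= crit:
        return CRITICAL, f"CRITICAL - disk {used_pct:.1f}% used"
    if used_pct >= warn:
        return WARNING, f"WARNING - disk {used_pct:.1f}% used"
    return OK, f"OK - disk {used_pct:.1f}% used"

if __name__ == "__main__":
    usage = shutil.disk_usage("/")
    code, message = check_disk(usage.used / usage.total * 100)
    print(message)  # Nagios parses the first line of stdout
    # a real plugin would finish with: sys.exit(code)
```

The same pattern extends to the application-specific checks mentioned above: gather one metric, compare it to thresholds, print a one-line status, and exit with the matching code.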

Environment: AWS EC2, ELB, EBS, S3, CloudWatch, CodeDeploy, CloudFormation, IAM, SNS, VPC, RDS, Puppet, Terraform, Ansible, Jenkins, ANT, Maven, Bash, Nexus, Linux (Red Hat, CentOS), Solaris, Windows, Redshift, Gradle, OpenShift, Azure, Electric Cloud.

Confidential, Cleveland, OH

DevOps Engineer- Release Engineering

Responsibilities:

  • Created and managed cloud VMs with AWS EC2 command line clients and AWS management console
  • Generated reports on different bugs and tickets using JIRA/ Bug tracking
  • Created and maintained Shell and Perl deployment scripts for WebLogic web application servers.
  • Performed system administration and operations tasks using Chef, Ganglia and Nagios.
  • Used Amazon RDS to manage, create snapshots, and automate backup of database.
  • Worked with Terraform to create AWS components like EC2, IAM, VPC, ELB, Security groups.
  • Focused on architecting NoSQL databases such as MongoDB, Cassandra, and Cache.
  • Deployed OpenStack components on multiple nodes in a high-availability environment.
  • Designed and developed REST-based microservices using Spring Boot.
  • Designed and implemented fully automated server build management, monitoring and deployment solutions spanning multiple platforms, tools and technologies including Jenkins Nodes/Agents, SSH, deployment and testing.
  • Implemented rapid-provisioning and life-cycle management for Red Hat Linux using Kickstart.
  • Performed Data Quality checks like flagging duplicates, null value checks, etc., for the data in Hadoop using Spark Scala API.
  • Deployed the generated build to WEB and APP server using the continuous integration process to all Environments.
  • Used technologies such as jQuery, JavaScript, JUnit, AJAX, SVN, and Git.
  • Strong understanding of multiple programming languages, including Java, PHP, Python, Perl, JavaScript, HTML, CSS, and XML.
  • Debugged build failures and worked with developers and QA personnel to resolve related issues.
  • Integrated Amazon Cloud Watch with Amazon EC2 instances for monitoring the log files, store them and track metrics.
  • Monitored and tracked Splunk performance problems, handled administration, and opened tickets with Splunk when needed.
  • Installed Splunk Enterprise, Splunk forwarders, indexers, and apps on multiple servers (Windows and Linux) with automation.
  • Provided regular support and guidance to Splunk project teams on complex solutions and issue resolution.
  • Used the Splunk deployment server to manage Splunk instances and analyzed security-based events, risks, and reporting.
  • Expertise in Actuate reporting: development, deployment, management, and performance tuning of Actuate reports. Involved in Splunk UI/GUI development and operations roles.
  • Designed and Implemented NoSQL - Cassandra / HBase Database/Schemas.
  • Interacted with client teams to understand client deployment requests.
  • Integrated JIRA with SVN and created automated release Notes using Perl Scripts and used JIRA to track issues.
  • Implemented MicroServices architecture using Spring Boot for making application smaller and independent.
  • Designed and configured Azure Virtual Networks (VNets), subnets, Azure network settings, DHCP address blocks, DNS settings, security policies and routing.
  • Backed up and recovered virtual machines from a Recovery Services vault using Azure PowerShell and the Azure portal.
  • Hands-on experience with Azure VPN (Point-to-Site), virtual networks, Azure custom security, endpoint security, and firewalls.
  • Managed and scheduled batch jobs on a Hadoop cluster using Oozie.
  • Wrote several cookbooks, including recipes to perform installation and configuration tasks.
  • Deployed and monitored scalable infrastructure on Amazon Web Services (AWS) using configuration management.
  • Leveraged AWS S3 service as Build Artifact repository and created release-based buckets to store various modules/branch-based artifact storage.
  • Provided configuration management expertise to all software development projects.
  • Set up Jenkins and Hudson for the Continuous Integration process.
  • Configured various jobs in Jenkins and Hudson for deploying Java-based applications and running test suites.
  • Built scripts using the ANT, Gradle, and Maven build tools in Jenkins and Sonar to move artifacts from one environment to another.
  • Created and maintained Python deployment scripts for the WebSphere web application server.
  • Developed Perl and shell scripts for automation of the build and release process.
  • Implemented Continuous Delivery framework using Jenkins, Maven, and Nexus in Linux environment.
  • Installed, configured and managed Nexus Repository Manager and all the Repositories.
  • Created various branches for each purpose, merged from development to release branch, created tags for releases.
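
The automated release notes mentioned above were built with Perl against SVN and JIRA; the core idea can be sketched in Python: scan commit messages for JIRA issue keys and group the commits per issue. The key pattern and sample messages here are illustrative, not from any actual project:

```python
import re
from collections import OrderedDict

# JIRA issue keys look like PROJECT-123; the project prefix is an example.
JIRA_KEY = re.compile(r"\b[A-Z][A-Z0-9]+-\d+\b")

def release_notes(commit_messages):
    """Map each JIRA issue key to the commit messages that reference it."""
    notes = OrderedDict()
    for msg in commit_messages:
        for key in JIRA_KEY.findall(msg):
            notes.setdefault(key, []).append(msg)
    return notes

if __name__ == "__main__":
    commits = [
        "OPS-101 fix WebLogic deploy script",
        "OPS-102 add RDS snapshot job",
        "OPS-101 follow-up: handle restart",
    ]
    for key, msgs in release_notes(commits).items():
        print(f"{key}: {len(msgs)} commit(s)")
```

In a real pipeline the commit messages would come from `svn log` (or `git log`) for the release branch, and the issue summaries would be fetched from JIRA by key.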

Environment: AWS EC2, Azure, IIS, WebLogic, WebSphere, Tomcat, Chef, Ansible, SVN, Git, Jenkins, MSBuild, JIRA, Docker, Confluence, Microservices, Hadoop.

Confidential, Boca Raton, FL

DevOps Engineer

Responsibilities:

  • Implemented Chef recipes for builds and deployments on internal data center servers; reused and modified the same recipes to deploy directly onto Amazon EC2 instances.
  • Evaluated testing of Chef recipes using Test-Driven Development concepts for Infrastructure as Code.
  • Automated Continuous Build and Deploy Scripts for Hudson/Jenkins Continuous Integration tool
  • Automated cloud deployments using Chef, Python, and AWS CloudFormation templates.
  • Involved in migration of Bamboo server, Artifactory & Git server.
  • Implemented rapid-provisioning and life-cycle management for Ubuntu Linux using Amazon EC2, Chef, and custom Ruby/Bash scripts
  • Integrated Jenkins with various DevOps tools such as Nexus, SonarQube, and Chef.
  • Developed and maintained Java applications with a focus on enterprise technologies.
  • Wrote Chef Cookbooks for various DB configurations to modularize and optimize product configuration.
  • Retrieved lists of issues by component (project, module, file, etc.) using SonarQube.
  • Experienced in implementing Microservices, Service Oriented Architecture (SOA) with XML based Web Services (SOAP/UDDI/WSDL) using Top Down and Bottom Up approach.
  • Integrated various provisioning and monitoring modules into a single platform.
  • Managed central repositories: Implemented Atlassian Stash along with GIT to host GIT central repositories for source code across products, facilitate code reviews and login audits for Security Compliance.
  • Worked with Oozie workflow engine to manage interdependent Hadoop jobs and to automate several types of Hadoop jobs.
  • Managed internal deployments of monitoring and alarm services for the Azure Infrastructure (OMS).
  • Designed and developed solutions using Microsoft Azure PaaS resources such as Service Fabric, IoT Hub, Event Hubs, Stream Analytics, DocumentDB, App Services, Service Bus, and distributed cache.
  • Worked on architecture of DevOps platform and cloud solutions.
  • Used Chef for server provisioning and infrastructure automation in a SaaS environment.
  • Integrated automated builds with the deployment pipeline: installed Chef server and clients to pick up builds from the Jenkins repository and deploy them to target environments (Integration, QA, and Production).
  • Implemented scheduled downtime for non-prod servers for optimizing AWS pricing.
  • Created proper documentation for new server setups and existing servers.
  • Developed installer scripts using Maven and Python for various products to be hosted on application servers.
  • Good knowledge of running Hadoop streaming jobs to process terabytes of XML-format data.
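
The scheduled-downtime cost optimization above reduces to a simple decision: non-production servers run only during business hours, while production always runs. A minimal sketch of that decision logic; the tag values and the 07:00-19:00 window are assumptions for illustration (a real job would apply the decision through the EC2 stop/start API):

```python
from datetime import time

# Assumed office-hours window; tune per team/timezone.
BUSINESS_START = time(7, 0)
BUSINESS_END = time(19, 0)

def should_be_running(env_tag: str, now: time) -> bool:
    """Prod instances always run; non-prod only during business hours."""
    if env_tag.lower() == "prod":
        return True
    return BUSINESS_START <= now < BUSINESS_END

if __name__ == "__main__":
    for env, t in [("prod", time(2, 0)), ("dev", time(2, 0)), ("dev", time(10, 0))]:
        action = "keep running" if should_be_running(env, t) else "stop"
        print(f"{env} at {t}: {action}")
```

Driving this from an instance's `Environment` tag keeps the policy declarative: tagging a server correctly is all that is needed to enroll it in the savings schedule.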

Environment: Azure, Java/J2EE, Git, jQuery, Tomcat, Apache, Elasticsearch, Oracle 11g, Jenkins, Python, Ruby, Chef, JIRA, Maven, Artifactory, Ubuntu, CentOS, Linux, AWS ELB, AWS SQS, AWS S3, AWS CloudFormation templates, AWS RDS, AWS CloudWatch, PowerShell, Microservices.

Confidential

DevOps Engineer

Responsibilities:

  • Built and supported environments consisting of Testing, Development, and Production.
  • Performed automation using Chef configuration management.
  • Involved in Chef infrastructure maintenance, including backups, monitoring, and security fixes.
  • Worked on Chef server backups.
  • Worked with the Knife command-line tool and created cookbooks.
  • Managed deployment automation using Chef recipes and cookbooks written in Ruby.
  • Implemented a Continuous Delivery framework using Jenkins in a Linux environment.
  • Created pipelines for Jenkins jobs.
  • Viewed selected issues through the SonarQube web interface.
  • Created new EC2 instances in AWS, allocated volumes, and granted permissions using IAM.
  • Created images of existing EC2 instances with all the software required for applications.
  • Used AWS CloudFormation templates to simplify provisioning and management on AWS.
  • Monitored the usage, health, and logs of applications with Amazon CloudWatch.
  • Worked with Amazon Redshift to create a simple, scalable data warehouse.
  • Responsible for nightly and weekly builds for different modules.
  • Implemented CI/CD using Jenkins with the Maven, Ant, and Gradle build tools.
  • Studied and gathered requirements for the data design, developed in line with performance, capacity, and forecasting needs for document-based NoSQL.
  • Extensively worked with Version Control Systems GIT and SVN.
  • Involved in exporting processed data from Hadoop to relational databases or external file systems using Sqoop, HDFS get, or copyToLocal.
  • Created, extended, reduced, and administered Logical Volume Management (LVM) volumes in a RHEL environment.
  • Applied patches and packages on Linux servers using rpm and yum tools.
  • Worked with scripting language like Bash and Ruby.
  • Installed third-party tools from packages.
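
The CloudFormation usage above centers on declaring resources in a JSON (or YAML) template. A minimal stdlib-Python sketch that renders such a template for a single EC2 instance; the logical name, AMI ID, and instance type are placeholders, not values from any actual stack:

```python
import json

def minimal_ec2_template(ami_id: str, instance_type: str = "t2.micro") -> str:
    """Render a minimal CloudFormation template declaring one EC2 instance."""
    template = {
        "AWSTemplateFormatVersion": "2010-09-09",
        "Description": "Illustrative single-instance stack",
        "Resources": {
            "AppServer": {                      # hypothetical logical resource name
                "Type": "AWS::EC2::Instance",
                "Properties": {
                    "ImageId": ami_id,          # placeholder AMI ID
                    "InstanceType": instance_type,
                },
            }
        },
    }
    return json.dumps(template, indent=2)

if __name__ == "__main__":
    print(minimal_ec2_template("ami-12345678"))
```

The rendered JSON would be handed to CloudFormation (e.g. via `aws cloudformation create-stack --template-body ...`), which then provisions and tracks the declared resources as a single stack.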

Environment: Red Hat Linux, VMware ESX, VMware vSphere, Windows servers, Git, Jenkins, Chef, C++, SVN, AWS.

Confidential

Build Engineer

Responsibilities:

  • Used Subversion to manage different builds for the system.
  • Wrote Shell scripts for compilation and deployment process.
  • Integrated SonarQube with Jenkins to test the code quality.
  • Wrote ANT scripts to make all the files local to the server.
  • Created and checked files for production support.
  • Worked with the WebSphere application server admin console to deploy applications.
  • Carried out deployments and builds on various environments using continuous integration tools.
  • Used Source Code configuration tools like Subversion and GIT.
  • Developed and implemented the software release management for the release of web applications.
  • Worked closely with Project Managers on the release of all operational projects.
  • Communicated with all levels of engineering, management, development and test teams.
  • Collaborated with developers and managers to resolve issues that arose during deployments to different environments.

Environment: Subversion, Git, Shell scripts, Maven, WebSphere, JDK, UNIX, Linux, Windows XP, Java/J2EE.
