
Sr. AWS Cloud DevOps Resume


Moline, IL

SUMMARY

  • 10.5+ years of substantial IT experience with expertise in Amazon Web Services (AWS), DevOps, Release Engineering, Configuration Management, Continuous Integration, Continuous Deployment, and Cloud Implementations.
  • Experienced in Cloud Migration (Lift & Shift) and Cloud Native systems.
  • Specialized in utilizing AWS as the Cloud Platform, including Cloud Automation, Managed Services, and Serverless.
  • Designed and provisioned Virtual Network at AWS using VPC, Subnets, Network ACLs, Internet Gateway, Route Tables, NAT Gateways.
  • Integrated on-premises ADFS with AWS IAM and STS for Federated Access to AWS Cloud using existing corporate identities.
  • Designed Continuous Integration and Continuous Delivery pipelines using CodePipeline, CodeBuild, and CodeDeploy
  • Experience working with code coverage and analysis tools such as JaCoCo (Java) and the PMD source code analyzer
  • Automated the creation of Application Stacks using CloudFormation.
  • Experienced with and good knowledge of virtualization platforms such as VMware ESXi, Hyper-V, and KVM
  • Good knowledge of tools such as Chef, Kafka, and Apache Spark
  • Migrated legacy Jenkins jobs to Jenkins 2.0 Pipelines using a Jenkinsfile.
  • Provisioned Centralized Logging Infrastructure based on ELK, using the managed Amazon Elasticsearch Service.
  • Implemented Blue-Green deployment model using EC2 Autoscale Groups and Application Load Balancer.
  • Automated Base Image creation and Custom Image Baking process using Ansible Playbooks.
  • Database Backup Restore and Archival processes using Amazon S3 and Glacier.
  • Integrated the deployment process with ServiceNow to automatically create tickets for Production Deployments via CI/CD pipelines.
  • Experience with Distributed Logging solutions based on ELK Stack and Splunk.
  • Version Control with Git, GitHub and SourceTree.
  • Reviewed the OpenShift PaaS product architecture and suggested improvement features after conducting research on competitors' products.
  • Streamlined installation of OpenShift on partner cloud infrastructure such as AWS.
  • Microservices deployment model using Docker for containerization and Kubernetes for orchestration.
  • REST API design using Swagger (OpenAPI) and API management with AWS API Gateway.
  • Experienced in developing custom AWS utilities and frameworks using the boto3 Python API (a brief sketch follows this list).
  • Design of Streaming Data solutions using Amazon Kinesis, S3, Aurora, Lambda, API Gateway.
  • Design of Enterprise Data Lake, Big Data solutions using Amazon S3, EMR, Data Pipelines, Apache Spark, Redshift, ElasticSearch and Glacier.
  • Serverless Architectures using AWS SAM, Serverless Framework, Chalice, Terraform.
  • Highly Optimized Static Content solutions for Web Apps using Amazon Route 53, CloudFront, S3.
  • Design of SSO using Auth0 as Identity Provider, integrated with Amazon Cognito and STS.
  • Data Migration solutions to migrate data from Corporate Data Centers to AWS using Database Migration Service.
  • Worked on monitoring tools like Nagios, Splunk, Zabbix, and Amazon CloudWatch to health-check the various deployed resources and services.
  • Designed Enterprise Cloud charge-back model applicable for all AWS Accounts using a custom-built solution based on Tagging of AWS Resources.
  • Experienced in using APM and alerting tools - ELK, New Relic, AppDynamics, PagerDuty.
  • Experienced in using Kafka for streaming of data.
  • ELK Stack provisioning automation using CloudFormation, Ansible and Bash Shell Scripts.
  • Developed “cloud-creator” Python module based on CloudFormation, boto3 SDK and Bash shell scripts.
  • Experience in installing and configuring Pivotal Cloud Foundry environments (PCF)
  • Developed CI / CD pipelines using Jenkins 2.0 Pipeline as code feature.
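
As referenced in the bullets above on boto3 utilities and the tagging-based charge-back model, here is a minimal sketch of that kind of utility: it lists running EC2 instances missing a required cost-allocation tag. The tag key "CostCenter" and the region are illustrative assumptions, not details from this resume.

```python
"""Minimal boto3 utility sketch: find running EC2 instances missing a tag.

Assumptions (illustrative): the required tag key is "CostCenter" and
credentials come from the default AWS credential chain.
"""
import boto3

REQUIRED_TAG = "CostCenter"  # hypothetical charge-back tag key


def untagged_instances(region: str) -> list:
    """Return IDs of running instances in `region` lacking REQUIRED_TAG."""
    ec2 = boto3.client("ec2", region_name=region)
    missing = []
    paginator = ec2.get_paginator("describe_instances")
    pages = paginator.paginate(
        Filters=[{"Name": "instance-state-name", "Values": ["running"]}]
    )
    for page in pages:
        for reservation in page["Reservations"]:
            for instance in reservation["Instances"]:
                tags = {t["Key"] for t in instance.get("Tags", [])}
                if REQUIRED_TAG not in tags:
                    missing.append(instance["InstanceId"])
    return missing


if __name__ == "__main__":
    for instance_id in untagged_instances("us-east-1"):
        print(instance_id)
```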

TECHNICAL SKILLS

Platforms: AWS, Azure, GCP, Linux, Unix, Windows

Cloud/AWS Compute: EC2, ECS, ELB, Auto Scaling

Serverless: Lambda, Step Functions, AWS SAM, Serverless framework, Chalice

Storage: S3, EBS, EFS, Glacier

Database: DynamoDB, Aurora, RDS, ElastiCache, Redshift

Networking: VPC, Route 53, Direct Connect

Analytics: Kinesis, ElasticSearch, EMR, Data Pipeline

Mobile: API Gateway, SNS

Dev Tools: CodeBuild, CodeDeploy, CodePipeline, AWS CLI

Management Tools: CloudFormation, CloudTrail, Config, Trusted Advisor

Monitoring Tools: CloudWatch, Nagios, Zabbix, New Relic, Splunk, AppDynamics, PagerDuty, ELK

Security: Identity & Access Management (IAM), Cognito

App Services: SQS, SES

DevOps: Git, Jenkins, Travis CI, Ansible, Chef, Terraform, CodeDeploy, Maven, ANT, Gradle, NPM, YARN, JFrog Artifactory, CloudFormation, ELK Stack, Docker, Kubernetes, Gatling, Chaos Monkey

Programming: Python, Bash Shell, PHP, JavaScript, C, C++

Server Side IDE: PyCharm, Atom, Eclipse, Sublime Text

Build / CI: Jenkins, Travis

Testing: Gatling, Selenium

Client Side / Mobile Technologies: HTML5, CSS3, JSON, PhoneGap

Frameworks / Libraries: React, Bootstrap, Flux, jQuery, Less

Databases: DynamoDB, ElasticSearch, Postgres, MySQL, Oracle, SQL Server

Agile: Trello, Rally, Jira, Confluence, Slack, Flowdock

Version Control: Git, GitHub, BitBucket, SourceTree, Mercurial, SVN

ETL Tools: Informatica Power Center

PROFESSIONAL EXPERIENCE

Sr. AWS Cloud DevOps

Confidential - Moline, IL

Responsibilities:

  • Defined a roadmap of the possible Architectural Enhancements utilizing AWS Managed Services
  • Developed a strategy to replace the existing MongoDB with DynamoDB
  • Utilized the Amazon Aurora database for master data
  • Introduced AWS Kinesis as a solution for backpressure between decoupled Microservices
  • Created Ansible playbooks to automatically install packages from a repository, to change the configuration of remotely configured machines and to deploy new builds.
  • Designed an Auto Scale solution to eliminate "pet" servers - utilizing AWS ECS (Docker)
  • Virtualized servers using Docker for test- and dev-environment needs, and automated configuration using Docker containers
  • Implemented a Continuous Delivery pipeline with Docker, Jenkins, GitHub, and AWS.
  • Worked on a microservices project to build Docker containers and deploy them to Dev, iTest, Scale, UAT (SIT), and PROD.
  • Implemented CI/CD for all microservices applications using Jenkins, Maven and Ansible.
  • Designed the future-state ELK Stack - with enhancements such as Redis as middleware and ELK autoscaling
  • Experience in integrating code quality tools such as SonarQube in CI/CD pipelines
  • Spun up EC2 instances from AMIs preconfigured with Oracle WebLogic Server and the JRockit JDK.
  • Using the AMI, set up a fully functioning Oracle WebLogic Server environment ready to host JEE applications, and created golden images from it.
  • Paired Auto Scaling groups with WebLogic 12c Dynamic Clusters to ensure elastic solutions where new instances are spawned to cope with high traffic peaks without any human intervention.
  • Performed database administration, production support, installation, configuration, upgrades, patches, migration, backup and recovery, performance tuning.
  • Designed and developed a Cloud Automation Python module - to automate the management of resources at AWS
  • Designed a Cloud Audit & Compliance module using Cloud Custodian and AWS Config (a sample check is sketched after this list)
  • Used Terraform key features such as Infrastructure as code, Execution plans, Resource Graphs, Change Automation
  • Automated provisioning of Kubernetes clusters with Ansible, writing playbooks.
  • Worked across the Kubernetes architecture: on each node, the kubelet manages pods and their containers, images, volumes, and networking, while kube-proxy is a simple network proxy and load balancer responsible for reflecting services on the nodes.
  • Utilized kube-monkey, a "Chaos Monkey" implementation for Kubernetes, in the development environment to validate the resiliency of the pods in the cluster and build failure-resilient services.
  • Managed Kubernetes charts using Helm: created reproducible builds of the Kubernetes applications, managed Kubernetes manifest files, and managed releases of Helm packages.
  • Worked on workload deployment and resource management, including Replication Controller operations, rolling updates, ConfigMaps, and Horizontal Pod Autoscaling.
  • Migrated the regional applications to PCF.
  • Built, deployed, and configured auto scaling in Pivotal Cloud Foundry (PCF) for our applications.
  • Established and maintained continuous delivery pipelines for deployment of Pivotal Cloud Foundry and related products for various orgs; designed and implemented continuous integration and continuous delivery processes to deliver applications to DEV/TEST/PROD.
  • Worked with application teams that support the PCF infrastructure to resolve PCF migration issues, prioritize the release of PCF features, implement new PCF platform features, and troubleshoot PCF issues for migrated applications.
  • Developed Strategy for Cloud Cost Optimization based on AWS Trusted Advisor
  • Developed SSIS packages for uploads and scheduled them as SQL Server Agent jobs; used SSIS to create ETL packages to validate, extract, transform, and load data into the SQL Server database.
  • Designed warehouse model for Redshift, integrated with Tableau
  • Provisioned REST API (defined with Swagger) stack using AWS Serverless Application Model (SAM)
  • Enabled single sign-on (SSO) solution by integrating AWS IAM and Auth0 Identity Provider (IdP)
  • Used OpenID Connect (OIDC) for API Security, utilized JWT tokens to propagate Scopes and Claims
  • Designed Data Backup and Restore procedures using AWS Batch, Lambda for scheduling and S3 & Glacier as backup store
  • Managed virtual machines using ARM JSON templates and PowerShell for Windows servers.
  • Designed backup Data lifecycle based on S3 Storage Classes
  • Used Maven, npm, ANT and Gradle to build rpms from source code checked out from Subversion repository, with Jenkins being the Continuous Integration Server and Nexus as repository manager
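
A minimal sketch of one check the Cloud Audit & Compliance module mentioned above might run, flagging security groups that allow SSH from anywhere. The port, region, and function name are illustrative assumptions; the actual module was built on Cloud Custodian and AWS Config.

```python
"""Sketch of a compliance check: security groups open to 0.0.0.0/0 on SSH.

Assumptions (illustrative): only port 22 is audited and the default
credential chain is used.
"""
import boto3


def open_ssh_groups(region: str = "us-east-1") -> list:
    """Return IDs of security groups permitting 0.0.0.0/0 on port 22."""
    ec2 = boto3.client("ec2", region_name=region)
    flagged = []
    for page in ec2.get_paginator("describe_security_groups").paginate():
        for group in page["SecurityGroups"]:
            for perm in group["IpPermissions"]:
                # FromPort/ToPort are absent on "all traffic" (-1) rules.
                in_range = perm.get("FromPort", 0) <= 22 <= perm.get("ToPort", 65535)
                world = any(
                    r.get("CidrIp") == "0.0.0.0/0" for r in perm.get("IpRanges", [])
                )
                if in_range and world:
                    flagged.append(group["GroupId"])
                    break  # one hit per group is enough
    return flagged


if __name__ == "__main__":
    print("\n".join(open_ssh_groups()))
```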

Cloud Analyst

Confidential - Omaha, NE

Responsibilities:

  • Designed and Developed the Data Ingestion and Transformation Phases of Data Lake
  • Developed the Index Model for the Data Lake Metadata Store using the Amazon Elasticsearch Service
  • Data Migration from on-prem to AWS using S3 and Data Pipeline
  • Transactional Data Store - Data Modeling for DynamoDB
  • Experience with the Elasticsearch, Logstash, and Kibana (ELK) stack
  • Implemented cloud infrastructure using Chef, implemented auto scaling, and assigned Chef roles to EC2 instances
  • Designed Data Warehouse store based on Redshift
  • Developed Data Analytics Batch Jobs using EMR
  • Good working knowledge of NoSQL databases such as MongoDB and Cassandra.
  • Extracted the data from MySQL, Oracle, SQL Server using Sqoop and loaded data into Cassandra.
  • Excellent knowledge of CQL (Cassandra Query Language) for retrieving the data present in the Cassandra cluster by running queries in CQL.
  • Created automation and deployment templates for relational and NoSQL databases, including MSSQL, MySQL, Cassandra, and MongoDB, in AWS.
  • Created Chef Cookbooks for sudo users and network configurations using Chef Server.
  • Wrote Chef cookbooks for various DB configurations to modularize & optimize product configuration.
  • Experience in creating Docker Containers leveraging existing Linux Containers and AMI's in addition to creating Docker Containers from scratch
  • Utilized HashiCorp Vault for Secrets Management - to control access to tokens, passwords, and API keys
  • Developed a Service Discovery component based on distributed coordination with HashiCorp Consul
  • Designed an Auto Scale Solution to eliminate pets - utilizing Dynamic Scaling feature of EC2 Autoscale Groups
  • Introduced AWS Kinesis as a solution for backpressure between decoupled microservices (see the producer sketch after this list)
  • Utilized “Chaos Monkey” to validate the resiliency of the Data Lake components
  • Scheduled CronJobs using Ansible
  • ELK Stack provisioning automation using CloudFormation and Ansible
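
A minimal sketch of the Kinesis hand-off pattern referenced above: a producer microservice writes events to a stream so slower consumers can drain it at their own pace, absorbing backpressure. The stream name and partition key scheme are illustrative assumptions.

```python
"""Sketch of a Kinesis producer used to decouple microservices.

Assumptions (illustrative): a stream named "datalake-ingest" exists and
each event carries an "id" field usable as a partition key.
"""
import json

import boto3

kinesis = boto3.client("kinesis", region_name="us-east-1")


def publish_event(event: dict, stream: str = "datalake-ingest") -> None:
    """Write one event; the partition key spreads load across shards."""
    kinesis.put_record(
        StreamName=stream,  # hypothetical stream name
        Data=json.dumps(event).encode("utf-8"),
        PartitionKey=str(event.get("id", "default")),
    )


if __name__ == "__main__":
    publish_event({"id": 42, "type": "order-created"})
```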

Sr. AWS/DevOps Engineer

Confidential - York, PA

Responsibilities:

  • Established continuous integration (CI) practices and standards with JIRA and Jenkins, and Continuous Delivery (CD) through Chef
  • Created CloudFormation templates to build end-to-end AWS infrastructure for Dev, Staging, and Prod environments for hosting and testing
  • Converted our Staging and Production environments from a handful of AMIs to a single bare-metal host running Docker
  • Implemented Docker to package the final code, and set up development and testing environments using Docker Hub, Docker Swarm, and Docker container networking
  • Installed and configured Nexus Repository Manager to share the artifacts between the teams
  • Used Maven, npm, ANT and Gradle to build rpms from source code checked out from Subversion repository, with Jenkins being the Continuous Integration Server and Nexus as repository manager
  • Responsible for automated Scheduled Builds/Emergency Builds and Release using Maven scripts for Enterprise application (J2EE)
  • Created an AWS RDS MySQL DB cluster and connected to the database through an Amazon RDS MySQL DB instance using the Amazon RDS console (see the boto3 sketch after this list)
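
A minimal sketch of the same RDS MySQL provisioning done programmatically with boto3 rather than through the console. The identifier, instance class, and credentials are placeholders; a real setup would source the password from a secrets manager.

```python
"""Sketch: create an RDS MySQL instance and wait for its endpoint.

Assumptions (illustrative): default VPC and credential chain; the
hard-coded password is a placeholder only.
"""
import boto3

rds = boto3.client("rds", region_name="us-east-1")

rds.create_db_instance(
    DBInstanceIdentifier="demo-mysql",  # hypothetical identifier
    Engine="mysql",
    DBInstanceClass="db.t3.micro",
    AllocatedStorage=20,  # GiB
    MasterUsername="admin",
    MasterUserPassword="change-me-now",  # placeholder; use a secret store
)

# Block until the instance is available, then print its endpoint.
rds.get_waiter("db_instance_available").wait(DBInstanceIdentifier="demo-mysql")
desc = rds.describe_db_instances(DBInstanceIdentifier="demo-mysql")
print(desc["DBInstances"][0]["Endpoint"]["Address"])
```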

Build and Release Engineer

Confidential - Los Angeles, CA

Responsibilities:

  • Infrastructure Provisioning and AMI Baking process using Ansible
  • Responsible for designing and deploying best SCM processes and procedures
  • Developed Build and Deployment processes for Pre-production environments
  • Wrote Ansible playbooks for various configurations to modularize and optimize product configuration
  • Documented all build and release process related items; provided Level 1 support for all build and deploy issues encountered during the build process
  • Used Nexus for Artifact Management
  • Created Jenkins Job for automation of build and deployment process as part of Continuous Integration strategy
  • Developed Bash shell scripts used as init scripts for EC2 bootstrapping (see the user-data sketch after this list)
  • Built AWS solutions using EC2, S3, RDS, DynamoDB, EBS, ELB, and Auto Scaling groups (ASGs)
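
A minimal sketch of the EC2 bootstrapping pattern from the bullets above: launch an instance whose user data runs an init script on first boot via cloud-init. The AMI ID, instance type, and installed package are illustrative assumptions.

```python
"""Sketch: launch an EC2 instance bootstrapped by a user-data init script.

Assumptions (illustrative): placeholder AMI ID, an Amazon Linux-style
yum package manager, and nginx as an example package.
"""
import boto3

ec2 = boto3.resource("ec2", region_name="us-east-1")

# Shell commands executed by cloud-init on first boot.
USER_DATA = """#!/bin/bash
yum -y update
yum -y install nginx          # hypothetical package
systemctl enable --now nginx
"""

instances = ec2.create_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder AMI ID
    InstanceType="t3.micro",
    MinCount=1,
    MaxCount=1,
    UserData=USER_DATA,
)
print(instances[0].id)
```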

Sys Admin

Confidential

Responsibilities:

  • Performed Linux administration tasks in test and production environments with installing, configuring and troubleshooting the client applications on the Linux servers
  • Designed and implemented automated system installation and configuration based on Linux Kickstart builds for common server configurations
  • Utilized BMC Control-M software for complex Job Scheduling and Management workflows
  • Used Nagios for Linux infrastructure monitoring
  • Created users, managed user permissions; maintained user & file system quotas on Red Hat Linux
  • Designed and implemented a DNS/DHCP solution to replace the current aging system; the solution required 99.9% uptime via Linux clustering. Transferred files between host and client using FTP
  • Used VERITAS Volume manager to create disk groups, volumes, volume groups, and RAID
  • Extended volume groups and logical volumes (LVM) to manage file systems
  • Managed and administered all UNIX servers, including Linux operating systems, by applying relevant patches and packages at regular maintenance periods using Red Hat Satellite server, YUM, and RPM tools
  • Configured yum repository server for installing packages from a centralized server
  • Responsible for building & configuring Red Hat Linux systems over the network, implementing automated tasks through crontab, and resolving tickets on a priority basis
  • Tracked overall project progress and reported status to senior management on a regular basis

Linux Admin

Confidential

Responsibilities:

  • Developed Shell scripts for ad hoc automation tasks
  • Handled day-to-day operations: installing software, applying patches, managing file systems, monitoring performance, and troubleshooting alerts
  • Monitoring of Test servers using Health Checker alerts and taking corrective actions
  • User/group creation, deletion & permissions; quota implementation on Linux
  • Resolved and troubleshot tickets raised by users via the ServiceNow (SNOW) ticketing tool
  • Installed and configured LVM; extended partitions with LVM depending upon storage needs
  • Carried out emergency changes, such as extending the file system size in case of 100% file system utilization
  • Installed and configured vsftpd (FTP) so clients could download and upload files
  • Monitored processes using ps, top, htop, and Nagios, and killed zombie processes
  • Application troubleshooting and support
  • Created swap space
  • Responsible for backing up and restoring data using tar and gzip, and transferring it using scp

Associate Informatica Developer

Confidential

Responsibilities:

  • Involved in an internal project for the development and support of Confidential products
  • As an associate ETL Developer, was mainly involved in the creation of mappings
  • Involved in developing Mappings and reusable transformations using Informatica Designer
  • Used Informatica Power Center 7.1 for extraction, transformation and load (ETL) of data in the data warehouse
  • Worked on Power Center client tools like Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformations Developer
  • Created the unit test cases for mappings developed and verified the data
  • Involved in extracting the data from the Flat Files and Relational databases like Sybase into staging area
  • Developed Workflows using task developer, workflow designer in Workflow manager and monitored the results using workflow monitor
  • Loaded this data on a monthly basis into the DWH
