Lead AWS DevOps Engineer Resume
Herndon, VA
SUMMARY
- Over 9 years of hands-on IT experience as a DevOps Engineer, with expertise in developing, configuring, automating, and deploying instances on AWS, Azure, and GCP.
- Expertise working with a wide range of AWS cloud services, including EC2, ELB, Auto Scaling, VPC, Route53, RDS, S3, IAM, SNS, SQS, DynamoDB, Elasticsearch, and CloudWatch.
- Proficient in Unix/Linux, including FreeBSD, BSDI, Debian, Ubuntu, Red Hat/CentOS, and SCO. Experience with HP-UX and AIX Unix. Proficient in Windows Server.
- Integrated Azure Log Analytics with Azure VMs to monitor log files, store them, and track metrics, and used Terraform to manage different infrastructure resources across cloud, VMware, and Docker containers.
- Deployed microservices in the cloud (Google Cloud Platform and Amazon AWS) using Kubernetes and AWS ECS.
- Experienced in building Golang microservices using channels, goroutines, interfaces, and various frameworks.
- Experience in DevOps with cloud-connected device systems, distributed applications, and databases using Java, Scala, Apache Tomcat, Cassandra, and RDBMS.
- Strong experience building Jenkins CI/CD pipeline jobs using Groovy scripts.
- Hands-on experience with the AWS Cloud platform and its services, including EC2, VPC, S3, EBS, ELB, Auto Scaling, DynamoDB, CloudWatch, CloudTrail, SNS, SES, SQS, Aurora, Lambda, API Gateway, and Redshift.
- Extensively worked on EKS, creating clusters for application teams to deploy their microservices, and performed regular cluster upgrades.
- Created an AWS RDS MySQL DB cluster and connected to the database through an Amazon RDS MySQL DB instance using the Amazon RDS Console.
- Working experience integrating back-end technologies (Node.js) with JavaScript frameworks (AngularJS) using AJAX and JSON.
- Optimized PySpark jobs to run on a Kubernetes cluster for faster data processing.
- Expertise with enterprise cloud solutions such as Platform-as-a-Service (OpenShift by Red Hat), containers, Kubernetes, cloud management (Red Hat CloudForms), and IT automation (Ansible by Red Hat).
- Experience applying design patterns and best practices with Golang (and more), from initial design through to deployable production systems, including monitoring and instrumentation at scale.
- Experienced in writing live real-time processing using Spark Streaming with Kafka on AWS EMR.
- Good understanding of AWS services such as EC2, VPC, IAM, RDS, S3, ELB, ALB, EBS, Route53, CloudFront, CloudFormation, SNS, SQS, AMI, AWS Glue, Athena, ECS, Auto Scaling, Kibana, and Elasticsearch.
- Good experience in shell scripting on Unix and Linux servers and OpenStack, with Python scripting expertise focused on DevOps tools, CI/CD, and AWS cloud architecture.
- Designed and implemented log aggregation and a centralized monitoring solution for the AWS cloud using CloudWatch, CloudTrail, Elasticsearch, Logstash/Kinesis, and Kibana/Grafana.
- Experience with infrastructure-as-code provisioning (Terraform, CloudFormation, ARM) and configuration management (Ansible, Chef).
- Expertise in improving build & deployment tools in DevOps through automation using scripting languages such as JavaScript, Shell, Bash, Perl, Ruby, Groovy, and Python.
- In-depth knowledge of Waterfall and Agile development methodologies, with hands-on experience on an Agile team using the Scrum methodology and pair programming with Test-Driven Development (TDD).
- Created a PySpark data frame to bring data from DB2 to Amazon S3 (sketched after this summary).
- Worked on container technologies such as Docker, Kubernetes, and OpenShift, along with development of a microservices architecture using Spring Boot, including distributed SSO authentication and authorization and distributed session management.
- Developed and implemented software applications and systems using various tools and technologies including Golang, AWS, and Docker.
- Strong experience in designing CI/CD processes, process automation, build and deployment automation, release management, source code repositories, and Amazon AWS infrastructure management.
- Hands-on experience automating AWS services using Lambda, SQS, SNS, SES, Route53, and AWS Glue: cataloging tables with crawlers, manually creating Glue databases/external tables, and building Glue ETL jobs that load Redshift fact and dimension tables for huge volumes of data and unload data into S3 for SageMaker/ML consumption to train predictive models.
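A minimal PySpark sketch of the DB2-to-S3 transfer described in the summary, assuming a JDBC connection to DB2; the host, schema, credentials, and bucket names below are illustrative placeholders, not the original project's values.

```python
# Minimal PySpark sketch: DB2 -> S3 via JDBC. All names are placeholders.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("db2-to-s3")
    # The DB2 JDBC driver must be on the classpath (e.g. --jars db2jcc4.jar)
    .getOrCreate()
)

# Read the source table from DB2 over JDBC
df = (
    spark.read.format("jdbc")
    .option("url", "jdbc:db2://db2-host:50000/SAMPLEDB")   # assumed host/db
    .option("driver", "com.ibm.db2.jcc.DB2Driver")
    .option("dbtable", "SCHEMA.ORDERS")                     # assumed table
    .option("user", "db2user")
    .option("password", "******")
    .load()
)

# Write out to S3 as partitioned Parquet
(
    df.write.mode("overwrite")
    .partitionBy("order_date")                              # assumed column
    .parquet("s3a://example-data-lake/raw/orders/")
)
```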
TECHNICAL SKILLS
DevOps Tools: Nagios, Jenkins, Puppet, Chef, Ansible, Docker, Git (GitHub), Maven, Vagrant, Splunk, Gradle, JIRA, SaltStack, IBM Rational ClearCase, ANT
Cloud Computing: Amazon Web Services, including EC2, VPC, S3, Route53, CloudWatch, IAM, SES, RDS, CloudFront, EC2 CLI, Python boto module
Operating Systems: UNIX, Linux, MS-Windows, Windows XP
Deployment Tools: Puppet, Ansible, Chef.
Databases: MS Access, MySQL, MS SQL Server 7.0/2000, Oracle
Programming Languages: Shell/Perl/Python scripting, C, C++, Java, HTML, XML
Virtualization Tools: OpenShift, Docker, Kubernetes, Terraform, VirtualBox, and VMware.
Scripting: Shell, Bash, Perl, Ruby and Python.
Configuration Tools: Chef, Puppet, Salt, and Ansible.
Automation Tools: Jenkins/Hudson, DevOps CI/CD, uDeploy, Artifactory, ELK, and Build Forge.
SDLC: Agile, Scrum, Kanban, Jira, VersionOne
Networking Virtualization: VirtualBox, Docker, Vagrant, EC2 Container Service (ECS), Microservices.
PROFESSIONAL EXPERIENCE
Lead AWS DevOps Engineer
Confidential, Herndon, VA
Responsibilities:
- Led all DevOps initiatives with a focus on enabling development teams and driving the smooth running of CI/CD systems engineering and technical operations.
- Deployed containerized workloads on Amazon Elastic Container Service (ECS), today using AWS Fargate. Implemented a microservices framework with Spring Boot, Node.js, and the OpenShift Container Platform (OCP).
- As team lead, mentored and advised other technicians in PowerShell, PHP, and Bash scripting, and in Linux/Windows release engineering duties.
- Involved in designing and deploying a multitude of applications utilizing almost the entire AWS stack (including EC2, S3, AMI, Route53, RDS, SNS, SQS, IAM), focusing on high availability, fault tolerance, and auto scaling with AWS CloudFormation.
- Sound knowledge of Docker setup in the Golang repository.
- Worked on multiple DevOps and cloud tools that achieve KPIs. Coordinated with the implementation team to build and engineer services for Linux and Windows on cloud platforms (AWS, Azure & GCP). Provisioned instances, storage, and monitoring services, and a CI/CD pipeline through Jenkins.
- Led the remote Analytics Platform team to build out Carbonite's Holocron, a big data analytics platform utilizing Apache tools/frameworks including Apache Hadoop, Spark, and Hive running on Qubole, and an Airflow cluster, to build out an Amazon S3-backed data lake.
- Developed and supported a big data, cloud-based application using the following technology stack: Apache Spark, Amazon Web Services, EC2, CFT, AWS Lambda, S3, Scala, Ruby, Java 8, Apache Kafka, Linux, GitHub, Jenkins, Chef, Nexus, Tomcat, IntelliJ, RubyMine, Cucumber, and Gherkin.
- Performed data extraction, aggregation, and consolidation of Adobe data within AWS Glue using PySpark (a Glue job skeleton is sketched after this list).
- Developed applications using Waterfall and Agile methodologies, with practical experience on an Agile team using the Scrum methodology and pair programming with Test-Driven Development (TDD).
- Deployed and monitored scalable infrastructure on Amazon Web Services (AWS) and handled configuration management using Puppet.
- Developed and maintained an automated CI/CD pipeline for code deployment using Packer, Ansible, and Bash, Python, and PowerShell scripts for automation, along with CodeCommit, Terraform, and AWS CloudFormation.
- Great exposure to network protocols such as TCP/IP, UDP, DNS, SMTP, FTP, Telnet, and HTTP, and frameworks such as Struts and Spring; used Amazon EMR for MapReduce jobs and tested locally using Jenkins.
- Virtualization & container technologies: AWS, Azure, Docker, OpenShift.
- Implemented AWS solutions using EC2, S3, AWS Lambda, RDS, IAM (Identity and Access Management), Route53, Elasticsearch, CloudFront, EBS, Elastic Load Balancer, and Auto Scaling groups; optimized volumes and EC2 instances, and exposed services using API Gateway.
- Managed metadata alongside the data for visibility into where data came from and its lineage, enabling data for customer projects to be found quickly and efficiently, using an AWS data lake and functions such as AWS Lambda and AWS Glue.
- Integrated Azure Log Analytics with Azure VMs to monitor log files, store them, and track metrics, and used Terraform to manage different infrastructure resources across cloud, VMware, and Docker containers.
- Worked extensively on Jenkins CI/CD pipeline jobs for end-to-end automation to build, test, and deliver artifacts, and troubleshot build issues during the Jenkins build process.
- Golang and Java were used for backend REST services, while AngularJS was used for the user interface.
- Used Airflow and Hive to build a big data ETL pipeline that produces a set of common business entities from an ever-expanding list of internal and external sources; these entities are then consumed to generate complex downstream analytics.
- Created a PySpark data frame to bring data from DB2 to Amazon S3.
- Used SQL Server Integration Services (SSIS) and Extract, Transform, Load (ETL) tools to populate Datadog from various data sources, creating packages for different data-loading operations for the application.
- Designed and implemented log aggregation and a centralized monitoring solution for the AWS cloud using CloudWatch, CloudTrail, Elasticsearch, Logstash/Kinesis, and Kibana/Grafana.
- Implemented GitLab for version control of Puppet modules and process documentation.
- Developed Groovy scripts for Java application deployment using Jenkins.
- Built microservices using Golang, JSON, Docker, and MongoDB.
- Created a Lambda function that aggregates data from incoming events, then stores the result data in Amazon DynamoDB and S3 (sketched after this list).
- Used the AWS Cloud platform and its features, including EBS, AMI, SNS, RDS, CloudWatch, CloudTrail, CloudFormation, Auto Scaling, CloudFront, S3, and Route53.
- Implemented and managed Dynatrace on business-critical desktop and mobile applications.
- Wrote Ansible playbooks with Python SSH as the wrapper to manage configurations of AWS nodes, and tested playbooks on AWS instances using Python. Ran Ansible scripts to provision dev servers.
- Expertise with enterprise cloud solutions such as Platform-as-a-Service (OpenShift by Red Hat), containers, Kubernetes, ECS, cloud management (Red Hat CloudForms), and IT automation (Ansible by Red Hat).
- Enhanced and supported a REST API that enrolls customers into payment plan assistance programs. Tech stack used: Java 8, Spring 4, Spring Boot 2, AWS (EC2, SNS, CloudWatch), Docker.
- Wrote live real-time processing using Spark Streaming with Kafka on AWS EMR, EDL, and AWS Connect.
- Implemented a microservices architecture using Spring Boot to make different applications smaller and independent.
- Worked with Terraform templates to automate Azure IaaS virtual machines using Terraform modules, and deployed virtual machine scale sets in production environments.
- Configured Ansible to manage AWS environments and automate the build process for AMIs used by all application deployments, including Auto Scaling and CloudFormation scripts, and automated the infrastructure with Chef scripts.
- Worked in a DevOps group running Jenkins in a Docker container with EC2 slaves in an Amazon AWS cloud configuration. Also worked extensively on EKS to create clusters for application teams to deploy their microservices, and performed regular cluster upgrades.
- Experience with AWS services (VPC, EC2, S3, RDS, Redshift, Data Pipeline, EMR, DynamoDB, Lambda, SNS, and SQS).
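A skeleton of an AWS Glue PySpark job of the kind used for the Adobe data aggregation above; the catalog database, table, grouping columns, and output path are assumptions for illustration, not the original project's values.

```python
# AWS Glue PySpark job skeleton: read a cataloged table, aggregate, write to S3.
import sys
from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from awsglue.job import Job
from pyspark.context import SparkContext
from pyspark.sql import functions as F

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read from a Glue Catalog table populated by a crawler (names assumed)
dyf = glue_context.create_dynamic_frame.from_catalog(
    database="analytics_db", table_name="adobe_events"
)

# Consolidate: daily event counts per page
daily = (
    dyf.toDF()
    .groupBy("page", "event_date")
    .agg(F.count("*").alias("events"))
)

daily.write.mode("overwrite").parquet("s3://example-curated/adobe_daily/")
job.commit()
```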
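A hedged sketch of the event-aggregation Lambda described above, assuming SQS-style records carrying an integer `value` field; the table and bucket names are hypothetical.

```python
# Hypothetical event-aggregation Lambda: sum a field, persist to DynamoDB + S3.
import json
import boto3

dynamodb = boto3.resource("dynamodb")
s3 = boto3.client("s3")
TABLE = dynamodb.Table("event-aggregates")   # assumed table name
BUCKET = "example-event-archive"             # assumed bucket name

def handler(event, context):
    records = event.get("Records", [])
    # Aggregate an integer field across the incoming batch (assumed schema)
    total = sum(int(json.loads(r["body"]).get("value", 0)) for r in records)

    # Store the aggregate result in DynamoDB, keyed by the request id
    TABLE.put_item(Item={
        "aggregate_id": context.aws_request_id,
        "record_count": len(records),
        "total_value": total,
    })

    # Archive the raw batch to S3 for later analysis
    s3.put_object(
        Bucket=BUCKET,
        Key=f"batches/{context.aws_request_id}.json",
        Body=json.dumps(records).encode("utf-8"),
    )
    return {"count": len(records), "total": total}
```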
SENIOR AWS DEVOPS ENGINEER
Expedia, Seattle, WA
Responsibilities:
- Developed Terraform modules to provision the Kubernetes service (EKS) in AWS.
- Developed side-by-side client applications on the Azure, Google+, and Google Apps cloud systems, creating end-to-end JavaScript, Java, and C# implementations demonstrating how the cloud APIs stack up against each other.
- Supported a REST API that enrolls customers into payment plan assistance programs. Tech stack used: Java 8, Spring 4, Spring Boot 2, AWS (EC2, SNS, CloudWatch), Docker.
- Implemented monitoring tools including Grafana, Prometheus, and the ELK stack to monitor our Kubernetes cluster.
- Maintained an automated CI/CD pipeline in Jenkins (Groovy).
- Developed and implemented software applications and systems using various tools and technologies including Golang, AWS, and Docker.
- Designed and developed ETL processes in AWS Glue to migrate campaign data from external sources such as S3 (ORC/Parquet/text files) into AWS Redshift.
- Integrated Kafka with Flume in a sandbox environment using a Kafka source and Kafka sink.
- Configured Spark Streaming to receive real-time data from Kafka and store the streamed data in HDFS using Scala (the pattern is sketched in PySpark after this list).
- Integrated Datadog with AWS Cloud using IAM roles and Lambda functions to route AWS service-level logs and metrics directly to a Splunk heavy forwarder.
- Implemented Terraform and CloudFormation templates as infrastructure as code for GCP, Azure, and AWS public cloud solutions for various use cases.
- Used AWS Elastic Beanstalk to deploy and scale applications and services developed with PHP, Python, Ruby, and Java; worked with AWS, Azure, SNS, Red Hat OpenShift, and K8s infrastructure design, deployment, and operational support.
- Automated cloud deployments using Chef, Python, and AWS CloudFormation templates.
- Delivered multiple information technology assignments and tasks, such as administering overall product AWS accounts, CI/CD, applications, networking, monitoring, and upstream/downstream dependencies.
- Used Amazon EMR for MapReduce jobs and tested locally using Jenkins.
- Planned the architecture for migrating applications from bare metal and AWS to OpenShift; performed a few upgrades and scaled the OpenShift environment.
- Analyzed SQL scripts and redesigned them using PySpark SQL for faster performance.
- Utilized Ansible, AWS Lambda, ElastiCache, and CloudWatch Logs to automate the creation of a log aggregation pipeline with the Elasticsearch, Logstash, Kibana (ELK) stack.
- Wrote Ansible scripts to manage the creation of EC2 instances, ELBs, Route53 entries, Amazon security groups, customized Tomcat applications, Solr instances, Apache web servers, custom Java and Flex applications, and other miscellaneous items in a Linux environment.
- Created and wrote shell (Bash), Ruby, Python, and PowerShell scripts for automating tasks.
- Managed IaaS, PaaS, and SaaS services across Azure, AWS, and GCP.
- Used AWS Elastic Beanstalk to deploy and scale web applications and services developed with Java, PHP, Node.js, Python, and Ruby on familiar servers such as Apache and IIS.
- Worked with solution architects on DevOps integrations with Puppet and Chef to allow automated, orchestrated deploys to a hybrid cloud model.
- Designed, developed, and deployed web pages using microservices written in Golang, along with HTML and CSS.
- Wrote PySpark and Spark SQL programs.
- Experience with Terraform as a cloud-agnostic tool, allowing a single configuration to be used to manage multiple providers and even handle cross-cloud dependencies.
- Worked on a scalable distributed data system using the Hadoop ecosystem in AWS EMR.
- Designed and optimized queries and wrote complex SQL scripts during ETL development and testing.
- Reduced environment costs by using ECS (Spot) and EKS (Spot) for non-production applications.
- Developed Ansible playbooks to automate the recovery process upon failure of the OpenShift master.
- Worked on infrastructure development and operations involving AWS cloud services: EC2, EBS, VPC, RDS, SES, ELB, Auto Scaling, CloudFront, CloudFormation, ElastiCache, API Gateway, Route53, CloudWatch, and SNS.
- Deployed and optimized two-tier Java and Python web applications on Azure DevOps CI/CD to focus on development, using services such as Repos to commit code, Test Plans for unit testing, and App Service for deployment; Azure Application Insights collected health, performance, and usage data for the process, and artifacts were stored in blob storage.
- Built a full-service catalog system with a complete workflow using Elasticsearch, Logstash, Kibana, Kinesis, and CloudWatch.
- Streamed data in real time using Spark with Kafka.
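The Kafka-to-HDFS streaming pattern above, sketched with Spark Structured Streaming in PySpark for consistency with the other examples (the project itself used Scala); the broker addresses, topic, and paths are placeholders.

```python
# Kafka -> HDFS streaming sketch; all endpoints and paths are placeholders.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("kafka-to-hdfs")
    # Requires the spark-sql-kafka connector package on the classpath
    .getOrCreate()
)

# Subscribe to the Kafka topic
stream = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092,broker2:9092")
    .option("subscribe", "events")
    .option("startingOffsets", "latest")
    .load()
)

# Kafka delivers key/value as binary; cast to strings before persisting
events = stream.selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)")

# Continuously append to HDFS, with checkpointing for fault tolerance
query = (
    events.writeStream
    .format("parquet")
    .option("path", "hdfs:///data/events/")
    .option("checkpointLocation", "hdfs:///checkpoints/events/")
    .start()
)
query.awaitTermination()
```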
AWS DEVOPS ENGINEER
Confidential, Costa Mesa, CA
Responsibilities:
- Used Terraform configurations to create, provision, and bootstrap a demo on cloud providers such as AWS.
- Implemented AWS solutions using EC2, S3, RDS, DynamoDB, EBS, Elastic Load Balancer, Auto Scaling, SNS, CloudTrail, CloudFormation, IAM, Route53, CloudWatch, Lambda, etc.
- Worked on CI/CD for PostgreSQL and MySQL applications.
- Installed and configured DevOps tools (Nagios, Jenkins, Docker, Ansible, Git (GitHub), Splunk, Puppet, Chef).
- Created Airflow DAGs to schedule ingestions, ETL jobs, and various business reports (a minimal DAG is sketched after this list).
- Developed a stream-filtering system using Spark Streaming on top of Apache Kafka.
- Set up GCP firewall rules to allow or deny traffic to and from VM instances based on specified configurations, and used GCP Cloud CDN (content delivery network) to deliver content from GCP cache locations, drastically improving user experience and latency.
- Involved in designing and deploying a multitude of applications utilizing almost all of the AWS stack (including EC2, Route53, S3, RDS, DynamoDB, SNS, SQS, IAM), focusing on high availability, fault tolerance, and auto scaling with AWS CloudFormation.
- Designed, developed, and implemented a Lambda function in Node.js to automate the firmware upload process across different environments.
- Performed Puppet module creation, integration, and testing. Key technologies: MongoDB, Go Continuous Delivery engine, Puppet.
- Created monitoring and analytics solutions via Ansible configurations for Datadog and Nagios.
- Improved the speed, efficiency, and scalability of the continuous integration environment, automating wherever possible using Python, Ruby, Shell, and PowerShell scripts.
- Created new APIs using Golang and Docker.
- Performed data extraction, aggregation, and consolidation of Adobe data within AWS Glue using PySpark.
- Worked with AWS EMR Hive and created external tables and Hive partitions to analyze data imported from AWS S3 and write it back to S3.
- Managed AWS EC2 instances utilizing Auto Scaling, Elastic Load Balancing, and Glacier for our QA and UAT environments, as well as infrastructure servers for Git and Chef.
- Administered and engineered Jenkins to manage the weekly build, test, and deploy chain, with SVN/Git and a Dev/Test/Prod branching model for weekly releases.
- Developed Terraform modules to provision the Kubernetes service (EKS) in AWS.
- Expertise in DevOps tools such as Chef, Puppet, SaltStack, Ansible, Docker, Subversion (SVN), Git, Jenkins, Ant, and Maven.
- Defined and implemented backup and retention policies for RDS, DynamoDB, ECS, and EKS clusters and logs in sandbox, non-prod, and prod environments.
- Standardized, developed, and maintained common development tools and infrastructure, such as CI/CD pipelines, monitoring, cluster management, and config management.
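A minimal Airflow DAG of the shape described above (ingestion, then ETL, then a business report); the task commands, schedule, and file paths are illustrative assumptions.

```python
# Minimal Airflow 2.x DAG: nightly ingest -> ETL -> report. Paths are assumed.
from datetime import datetime, timedelta
from airflow import DAG
from airflow.operators.bash import BashOperator

default_args = {"retries": 1, "retry_delay": timedelta(minutes=5)}

with DAG(
    dag_id="daily_ingest_and_report",
    start_date=datetime(2021, 1, 1),
    schedule_interval="0 2 * * *",   # run nightly at 02:00
    catchup=False,
    default_args=default_args,
) as dag:
    ingest = BashOperator(
        task_id="ingest",
        bash_command="python /opt/jobs/ingest.py",
    )
    etl = BashOperator(
        task_id="etl",
        bash_command="spark-submit /opt/jobs/transform.py",
    )
    report = BashOperator(
        task_id="business_report",
        bash_command="python /opt/jobs/build_report.py",
    )

    ingest >> etl >> report   # linear dependency chain
```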
AWS DEVELOPER
Confidential, San Francisco, CA
Responsibilities:
- Provisioned AWS services such as VPC, EC2, EKS, ECS, S3, ELB, Auto Scaling groups, EBS, RDS, IAM, KMS, ECR, Route53, CloudWatch, and DynamoDB using Terraform.
- Implemented AWS solutions using EC2, S3, RDS, DynamoDB, EBS, Elastic Load Balancer, Auto Scaling, SNS, CloudTrail, CloudFormation, IAM, Route53, CloudWatch, Lambda, etc.
- Built an ETL framework for data migration from on-premises data sources such as Hadoop and Oracle to AWS using Apache Airflow, Apache Sqoop, and Apache Spark (PySpark).
- Installed and set up Oracle 8i on Linux for the development team.
- Developed Terraform plugins in Golang to manage infrastructure, which improved the usability of our storefront service.
- Worked with AWS, Azure, Red Hat OpenShift, and K8s infrastructure design, deployment, and operational support.
- Checked Datadog quality across the end-to-end (E2E) system, including the ETL tool, the predictive analytics layer, and different integrated systems.
- Created Chef cookbooks in Ruby for provisioning servers such as Splunk and Sensu.
- Worked on Google Cloud Platform (GCP) services such as Compute Engine, Cloud Load Balancing, Cloud Storage, Cloud SQL, Stackdriver Monitoring, and Cloud Deployment Manager.
- Worked on the application as a Node.js developer, building REST APIs for modules such as users, products, and checkout.
- Assisted in the migration of numerous Confluence instances from internal servers to AWS EC2 instances, including migrating from Windows to Linux-based instances.
- Maintained user accounts (IAM) and the RDS, Route53, VPC, DynamoDB, SES, SQS, and SNS services in the AWS cloud (a boto3 maintenance sketch follows this list).
- Used the AWS Cloud platform and its features, including EBS, AMI, SNS, RDS, CloudWatch, CloudTrail, CloudFormation, Auto Scaling, CloudFront, S3, and Route53.
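A hedged boto3 sketch of the kind of routine account maintenance described above: it flags RDS instances with automated backups disabled and notifies an SNS topic. The region and topic ARN are placeholders.

```python
# boto3 maintenance sketch: flag RDS instances without backups, alert via SNS.
import boto3

rds = boto3.client("rds", region_name="us-east-1")   # assumed region
sns = boto3.client("sns", region_name="us-east-1")
TOPIC_ARN = "arn:aws:sns:us-east-1:123456789012:ops-alerts"  # assumed ARN

def check_rds_backups():
    flagged = []
    for db in rds.describe_db_instances()["DBInstances"]:
        # A BackupRetentionPeriod of 0 means automated backups are disabled
        if db.get("BackupRetentionPeriod", 0) == 0:
            flagged.append(db["DBInstanceIdentifier"])
    if flagged:
        sns.publish(
            TopicArn=TOPIC_ARN,
            Subject="RDS instances without automated backups",
            Message="\n".join(flagged),
        )
    return flagged

if __name__ == "__main__":
    print(check_rds_backups())
```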
LINUX ADMIN
Confidential, Battle Creek, MI
Responsibilities:
- Installed, configured, and maintained Red Hat Linux (Red Hat Enterprise Linux 5.x, 6.x & 7.x) on SPARC, x86, and blade centers.
- Worked extensively on creating Puppet modules per business requirements.
- Monitoring: Nagios, Splunk, MySQL Dashboard, and Cacti.
- Created users, managed user permissions, and maintained user and file system quotas on RHEL servers (a provisioning sketch follows this list).
- Experience working with Red Hat OpenShift infrastructure design, deployment, and operational support.
- Installation, configuration, tuning, and upgrades of Linux (Red Hat).
- Worked with an ETL tool suite to design and develop workflows, transformation mappings, and UNIX scripting.
- Set up and configured TCP/IP networking on Linux.
- Configured Kickstart (RHEL) and AutoYaST (SLES) servers and booted images using PXE.
- Configured data pipelines (with an EMR cluster) to offload data to Redshift.
- Strong in building object-oriented applications using Java, writing shell scripts, and working in Linux environments.
- Set up and configured Linux (Red Hat & SUSE) and Solaris servers/workstations for clients.
- Developed UNIX shell scripts.
- Upgraded and maintained software packages on servers using RHEL Satellite and repository servers.
- Monitored ESX server and VM alarms and acted on those incidents to resolve them.
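An illustrative Python sketch of the user-and-quota maintenance described above, wrapping the standard RHEL `useradd` and `setquota` commands; the usernames, group, and limits are hypothetical, and the script assumes it runs as root on a quota-enabled /home filesystem.

```python
# Bulk user provisioning with disk quotas on RHEL; all names are placeholders.
import subprocess

USERS = ["alice", "bob"]          # assumed account names
GROUP = "developers"              # assumed primary group
SOFT_BLOCKS, HARD_BLOCKS = "500000", "600000"   # ~500 MB soft / 600 MB hard

def provision(user: str) -> None:
    # Create the account with a home directory and primary group
    subprocess.run(["useradd", "-m", "-g", GROUP, user], check=True)
    # Apply block quotas on /home (inode limits left at 0 = unlimited)
    subprocess.run(
        ["setquota", "-u", user, SOFT_BLOCKS, HARD_BLOCKS, "0", "0", "/home"],
        check=True,
    )

if __name__ == "__main__":
    for u in USERS:
        provision(u)
```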