
DevOps Engineer/AWS Developer/SRE Resume


Dearborn, MI

SUMMARY

  • Around 8 years of professional IT experience as an Azure DevOps/AWS Cloud Engineer, DevOps Automation Engineer, and Build & Release Manager; worked in many technical roles in both Linux and Windows environments on build/release automation in web and cloud/server environments using Java/J2EE/.NET technologies, Azure, AWS, and open-source technologies.
  • IT experience as a DevOps engineer supporting various applications on Red Hat Enterprise Linux and the AWS cloud, with exposure to Windows environments.
  • Good understanding of Software Development Life Cycle (SDLC) methodologies such as Agile and Waterfall.
  • Worked with Amazon Web Services (AWS) including EC2, S3, VPC, ELB, Auto Scaling, IAM, and EBS. Built S3 buckets, managed their policies, and used S3 and Glacier for storage and backup on AWS.
  • Extensively used Terraform templates (IaC) to deploy AWS resources such as EC2 instances, Auto Scaling groups, load balancers, and security groups; knowledge of AWS CloudFormation and Azure ARM templates for infrastructure provisioning.
  • Experienced with DevOps tools such as Git, Bitbucket, Maven, Ant, Gradle, Jenkins, JFrog Artifactory, Ansible, Chef, Puppet, Tomcat, Docker, and Kubernetes.
  • Generated artifacts by integrating Jenkins with Git and Maven, storing artifacts in Nexus, and deploying them to application servers.
  • Wrote fully automated build-and-deploy Jenkins pipeline scripts in Groovy from scratch for new projects and for new modules within existing projects, achieving full automation from code commit to build and deploy in the lower environments, with automatic deployment to production once E2E testing and quality gates pass.
  • Experienced with artifact management tools such as JFrog Artifactory and Nexus.
  • Automated deployments using Ansible, writing numerous playbooks and roles; hands-on experience with Puppet Enterprise and Chef, creating custom manifests and recipes for automation.
  • Automated the build process using Continuous Integration (CI)/Continuous Delivery (CD) tools such as Jenkins; resolved build and release dependencies in collaboration with other departments.
  • Experienced in source code management using Git and Bitbucket, including branching and merging strategy, rebasing, conflict resolution, configuration and administration maintenance, daily merges, tagging, and remote repositories.
  • Configured Jenkins master and dynamic agent environments to support high redundancy, scalability, and agility.
  • Experience with build tools such as Maven and Ant for building deployable artifacts (JARs and WARs) from source code; configured Maven plugins to automatically manage versions and tags and to build Docker containers using Maven profiles.
  • Good experience in upgrading, patching, configuring, and managing all DevOps tools and OS packages; maintained tools at the N-1 version at all times.
  • Experienced working with Apache Spark, Kafka, Hadoop, and Cassandra; installed, configured, and managed Apache Hadoop big data clusters using Apache Ambari.
  • Experienced in installing and configuring Kafka clusters with ZooKeeper; knowledge of partitioning Kafka messages and setting replication factors in a Kafka cluster.
  • Experienced in infrastructure development and operations involving AWS cloud services: EC2, EBS, ECS, EMR, VPC, RDS, SES, ELB, Auto Scaling, CloudFront, CloudFormation, ElastiCache, Elasticsearch, S3, Cognito, SQS, Athena, KMS, API Gateway, Route 53, CloudWatch, and SNS. Used Boto3 with Lambda to manage other AWS services. Hands-on experience with the GCP platform.
  • Experience with HashiCorp Vault as a secret-management store; used Ansible Vault to store secrets and inject them while executing playbooks.
  • Expertise in scripting for automation and monitoring using Shell and Python.
  • Experienced with enterprise monitoring solutions such as AppDynamics for APM; Dynatrace and Splunk for log analytics, alerting, and reporting; OP5 for infrastructure monitoring; and knowledge of New Relic.
  • Installed and configured Nagios, Splunk, and ELK for monitoring network services and host resources.
  • Experience in vulnerability scanning using Twistlock and JFrog Xray; configured intrusion detection systems such as Tripwire and Snort.
  • Strong proficiency in supporting production cloud environments (AWS and VMware) as well as traditional managed hosted environments.
  • Developed, enhanced, and maintained the build, deployment, and configuration setup for continuous integration, and automated regression and acceptance testing.
  • Experience in creating Docker images and handling multiple containers using container orchestration with Kubernetes and OpenShift.
  • Experience using Kubernetes to provide a platform for automating deployment, scaling, and operation of application containers across clusters of hosts; used Helm to package manifests and deploy applications on OCP platforms.
  • Worked with JIRA and ServiceNow, creating projects, assigning permissions to users and groups for the projects, and creating mail handlers and notification schemes.
  • Experience in installation, configuration, deployment, and management of web and enterprise applications on Tomcat, JBoss, IBM WebSphere, and WebLogic application servers.
  • Experience in configuration, backup, monitoring of system performance, and system and network security of Linux services.
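
The S3-to-Glacier backup pattern described above can be sketched with Boto3. This is a minimal, hypothetical illustration; the bucket name, prefix, and retention periods are placeholders, not values from any actual engagement. The rule dictionary is built by a pure helper so it can be inspected before being applied.

```python
def glacier_lifecycle_rule(prefix, transition_days=30, expire_days=365):
    """Build an S3 lifecycle rule that moves objects under `prefix` to
    Glacier after `transition_days` and expires them after `expire_days`."""
    return {
        "ID": "archive-{}".format(prefix.strip("/") or "all"),
        "Filter": {"Prefix": prefix},
        "Status": "Enabled",
        "Transitions": [{"Days": transition_days, "StorageClass": "GLACIER"}],
        "Expiration": {"Days": expire_days},
    }

# Applying the rule needs boto3 and AWS credentials (bucket name is hypothetical):
#
#   import boto3
#   boto3.client("s3").put_bucket_lifecycle_configuration(
#       Bucket="example-backup-bucket",
#       LifecycleConfiguration={"Rules": [glacier_lifecycle_rule("backups/")]},
#   )
```

Keeping the rule builder separate from the `put_bucket_lifecycle_configuration` call makes the retention policy reviewable and unit-testable without touching AWS.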

TECHNICAL SKILLS

Operating Systems: RHEL/CentOS, Ubuntu/Debian/Fedora, Windows Server 2012/2016/2019

Languages and Scripting: YAML, Groovy, Python, Ruby, Shell, PowerShell

Database: MongoDB, OracleDB, MySQL, AWS RDS, DynamoDB

Infrastructure as a Service / IaC: OpenStack, AWS, Azure, VMware, Terraform

Containerization: Docker, Kubernetes, OpenShift, AKS, AWS Fargate

Configuration management: Ansible, Chef, Puppet, RunDeck, Ansible Tower

CI, Test & Build Systems: Jenkins Pipelines, Concourse, Maven, Gradle, Ant

Application/Web Servers: Tomcat, JBoss, Apache, IBM WebSphere, IBM HTTP Server

Logging & Monitoring Tools: Nagios, OP5, Splunk, AppDynamics, New Relic

Version Control Tools: Git, SVN, Bitbucket

Security Tools: Twistlock, Rapid7, Tripwire

PROFESSIONAL EXPERIENCE

DevOps Engineer/AWS Developer/SRE

Confidential

Responsibilities:

  • Worked with architects, developers, QA, and the cloud development team to implement cloud applications and automate processes to reduce toil using DevOps automation tools.
  • Applied Google's SRE (Site Reliability Engineering) practices in maintaining reliable infrastructure, following key elements such as SLIs, SLOs, and SLAs. Performed post-mortems with teams after every rollback or deployment failure, with precise documentation, continuously improving the process based on previous failures. Tracked metrics such as MTTR (mean time to rollback/respond/resolve/recover), mean time to mitigate, and mean time to acknowledge.
  • Worked on converting traditional applications to Docker and automating the build and deploy process for faster deployments, reducing deployment time by 80%.
  • Used Terraform across a wide range of projects, creating more than 200 Terraform repositories with reusable scripts on AWS.
  • Deployed more than 26 services using Terraform with Jenkins/Concourse CI tools on AWS.
  • Worked with AWS cloud services: EC2, EBS, ECS, EMR, VPC, RDS, SES, ELB, launch configurations, Auto Scaling, CloudFront, CloudFormation, AMIs, ElastiCache, Elasticsearch, S3, Cognito, SQS, Athena, KMS, API Gateway, Route 53, CloudWatch, SNS, and Kinesis Data Firehose.
  • Deployed 11 services with a single Terraform script covering IAM roles and policies, S3, RDS, KMS, EMR, Glue, load balancers, Route 53, DynamoDB, and Auto Scaling group attachment.
  • Automated most PaaS services, such as Vault, Jenkins, and SonarQube, on ECS containers using Terraform.
  • Created Terraform scripts that use a single repository as the source, so anyone in the organization can reuse them just by replacing the input values.
  • Implemented many security configurations on cloud services, including S3 bucket policy restrictions, encryption at rest and in transit for EMR, encryption of all EBS volumes, and tag enforcement.
  • Automated most EC2 software deployments using user-data scripts on AWS with Terraform.
  • Automated most EMR software deployments using Step Functions and bootstrap actions on AWS with Terraform.
  • Implemented day-to-day pipelines for converting public AMIs into encrypted private AMIs for Amazon Linux, Red Hat, and Ubuntu images.
  • Integrated Vault for fetching secrets in automated pipelines through Jenkins and Concourse.
  • Implemented EFS storage mounted on multiple EC2 instances for sharing volumes.
  • Created an Elasticsearch service using a VPC-based access policy; implemented a policy on each service to restrict access to VPC endpoints and to certain roles and user accounts.
  • Created many IAM roles and policies for end users to restrict the actions and resources allowed on services.
  • Created many Lambda scripts to encrypt unencrypted EBS volumes and to stop instances that lack proper tags. Created Lambda functions using Boto3 to clean up old EBS snapshots and unused EBS volumes and to remove old AMIs for cost optimization.
  • Implemented many launch configurations and Auto Scaling groups for ECS instances; implemented CloudWatch alarms monitoring EC2, ECS, RDS, EMR, and ELB metrics.
  • Implemented VPC peering between different regions and accounts to enable data transfer.
  • Implemented CloudWatch rules with cron schedules to run Lambda functions on a periodic basis.
  • Automated build workflows, deployments, validations, and E2E tests using Jenkins pipelines.
  • Exposure to PCF and PKS; worked on implementing different tiles on PCF and PKS per business-unit requirements.
  • Implemented a continuous delivery pipeline with Kubernetes, microservices, Jenkins, GitHub, Maven, and Ansible.
  • Worked with Git (GitHub) repositories as a distributed version control system; performed branching, tagging, and release activities with Git (GitHub).
  • Created the automated build and deployment process for applications, re-engineered the setup for a better user experience, and led the effort to build a continuous integration system for all our products.
  • Implemented AWS solutions using EC2, S3, RDS, EBS, Elastic Load Balancers, and Auto Scaling groups; optimized volumes and EC2 instances.
  • Created multi-Availability-Zone VPC instances and used IAM to create new accounts, roles, groups, and policies.
  • Configured S3 versioning and lifecycle policies to back up files and archive them in Glacier; configured Elastic Load Balancers (ELB) with EC2 Auto Scaling groups and created monitors, alarms, and notifications for EC2 hosts using CloudWatch.
  • Experience working with AWS CodePipeline and creating CloudFormation JSON templates, which were converted to Terraform for infrastructure as code.
  • Installed, configured, and administered the Jenkins and Bamboo continuous integration tools; used the Jenkins AWS CodeDeploy plugin to deploy to AWS.
  • Designed architectural diagrams for different applications before migrating them to the Amazon cloud to make them flexible, cost-effective, reliable, scalable, high-performance, and secure.
  • Developed a strategy to migrate Dev/Test/Production from an enterprise VMware infrastructure to the IaaS Amazon Web Services (AWS) cloud environment, including runbook processes and procedures.
  • Managed the private cloud environment using Ansible and enhanced automation for repeatable, consistent configuration management using Ansible-based YAML scripts.
  • Managed Ansible playbooks to automate system operations and AWS cloud management using Ansible automation.
  • Deployed microservices, including provisioning AWS environments, using Ansible playbooks.
  • Ran Ansible on Red Hat Enterprise Linux, CentOS, Fedora, Debian, and Ubuntu.
  • Automated cloud deployments using Ansible, Python, and AWS CloudFormation templates.
  • Implemented a continuous integration and continuous deployment framework using Jenkins, Maven, and Artifactory in a Linux environment.
  • Developed an autonomous continuous integration system using Git, Gerrit, Jenkins, MySQL, and custom tools developed in Python and Bash.
  • Implemented a build framework for new projects using Jenkins and Maven as the build tools.
  • Designed, supported, and maintained the Splunk infrastructure in Windows and Linux environments; installed Splunk Enterprise and apps on multiple servers with automation.
  • Managed a cloud platform based on the Lambda architecture, including Kafka, Spark, and Cassandra.
  • Experience virtualizing servers using Docker for test and dev environment needs; gained knowledge of cluster tools such as Mesosphere and Kubernetes.
  • Experience handling messaging services using Apache Kafka.
  • Experience with DevOps practices for a microservices architecture using Kubernetes for orchestration.
  • Developed a CI/CD system with Jenkins in a Kubernetes container environment, utilizing Kubernetes and Docker as the runtime environment for the CI/CD system to build, test, and deploy.
  • Responsible for automated deployment of applications in Tomcat server using Ansible playbooks; installed and managed the enterprise Nexus as a storage repository for artifacts.
  • Organized and coordinated product releases, working closely with product development, QA, and support across global locations to ensure successful releases.
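
The Boto3 cost-optimization Lambda described above (cleaning up old EBS snapshots) can be sketched as follows. The retention window is an assumed placeholder, and the selection logic is kept as a pure function so it can be tested without AWS access; the actual EC2 calls are shown commented.

```python
from datetime import datetime, timedelta, timezone

def snapshots_to_delete(snapshots, retain_days, now=None):
    """Return IDs of snapshots older than `retain_days`.
    Each snapshot is a dict with 'SnapshotId' and 'StartTime' keys,
    matching the shape returned by EC2 describe_snapshots."""
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(days=retain_days)
    return [s["SnapshotId"] for s in snapshots if s["StartTime"] < cutoff]

# Lambda-style usage requires boto3 and IAM permissions (not run here):
#
#   import boto3
#   ec2 = boto3.client("ec2")
#   snaps = ec2.describe_snapshots(OwnerIds=["self"])["Snapshots"]
#   for snap_id in snapshots_to_delete(snaps, retain_days=30):
#       ec2.delete_snapshot(SnapshotId=snap_id)
```

Separating "which snapshots" from "delete them" keeps the destructive step auditable and lets a dry-run mode simply print the returned IDs.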

Environment: Linux, Java, Git, Jenkins, JFrog, Maven, Python, Kafka, Ansible, AWS EC2, OpenStack, Tomcat, WebSphere, Kubernetes, Shell scripting, Nexus, Splunk, Nginx, Docker, Ubuntu, VMware, Jira, XML, and SQL.

DevOps Engineer/AWS Developer

Confidential, Dearborn, MI

Responsibilities:

  • Responsible for building, designing, and maintaining the platform automation infrastructure using Chef; developed Chef cookbooks to install and configure Apache, Tomcat, Splunk, Jenkins, and Rundeck and to automate deployments.
  • Installed and configured SVN and Git; cloned repositories, created pull requests, and pushed to GitHub, where the source code is used by Jenkins build configurations.
  • Used Jenkins as a continuous integration tool to automate processes; built Jenkins jobs from GitHub repositories, including managing weekly builds.
  • Experience setting up a Chef workstation, bootstrapping various enterprise nodes, and setting up keys.
  • Designed and implemented Subversion and Git metadata, including elements, labels, attributes, triggers, and hyperlinks.
  • Performed backend development using an open-source toolset: PHP, MySQL, Apache, Linux, and others (i.e., the LAMP stack).
  • Implemented build tools such as Ant and Maven to automate and enhance the overall operational environment.
  • Developed installer scripts using Ant, Python, and Unix shell for various products to be hosted on application servers.
  • Developed and maintained the continuous integration and deployment systems using Jenkins, Bamboo, Maven, Nexus, and Ruby.
  • Integrated Jenkins with Git for code pull, push, and tag creation to build and deploy code to middleware environments; set up and administered the DNS system in AWS using Route 53.
  • Excelled at creating AMIs (AWS Machine Images) that utilize ELB (Elastic Load Balancing) and Auto Scaling; Auto Scaling launched additional resources that were bootstrapped with the Chef server by default.
  • Performed an SVN-to-Git migration and implemented and maintained branching and build/release strategies using Git.
  • Installed and configured web application servers such as Apache, Tomcat, and WebLogic.
  • Installed Apache Tomcat 6 and 7 and Apache HTTP Server on EC2 instances using Chef and deployed the artifacts.
  • Assisted in creating and maintaining various DevOps tools for the team, such as provisioning scripts, deployment tools, and development and staging environments on AWS, as well as PaaS and IaaS applications for client acquisition.
  • Performed numerous server migrations on both Linux/Unix and Windows servers, including moving all clients and their data and configuration settings, with testing and verification that everything was correct, with zero downtime.
  • Configured Apache on EC2 instances to make sure the deployed application was up and running, and troubleshot issues to meet the desired application state.
  • Used Kafka for building real-time data pipelines between clusters; integrated Apache Kafka for data ingestion.
  • Created the AWS infrastructure using VPC, EC2, S3, Route 53, EBS, security groups, Auto Scaling, and RDS in CloudFormation.
  • Authored and maintained GNU Makefile support for parallel builds, cross compilers, cross builds, and embedded applications.
  • Proficient in designing to the GNU coding standards for configure and Makefile support.
  • Deployed and administered virtualized Linux infrastructure on Amazon AWS and Rackspace Cloud; built custom scripts, workers, and clients utilizing the AWS SDK to manipulate Amazon EC2 and S3 resources.
  • Used a Python API to upload all the agent logs into Azure Blob Storage; managed internal deployments of monitoring and alarm services for the Azure infrastructure (OMS).
  • Responsible for automated deployment of the application in Tomcat server using Chef cookbooks.
  • Worked on deployment automation of all the microservices, pulling images from the private Docker registry and deploying to a Docker Swarm cluster using Ansible.
  • Designed and managed build and release cycle activities under Agile methodologies and implemented software configuration management standards in line with the organization.
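
A minimal sketch of the kind of helper behind the agent-log upload to Azure Blob Storage mentioned above. The container name and the date-partitioned path convention are hypothetical assumptions; the upload call itself (commented) would use the azure-storage-blob SDK.

```python
from datetime import datetime

def blob_path(host, log_name, ts):
    """Build a date-partitioned blob path for an agent log file,
    e.g. logs/web01/2024/01/31/agent.log (path convention is hypothetical)."""
    return "logs/{}/{:%Y/%m/%d}/{}".format(host, ts, log_name)

# Uploading would use the azure-storage-blob SDK and a real connection
# string (not run here):
#
#   from azure.storage.blob import BlobServiceClient
#   service = BlobServiceClient.from_connection_string(conn_str)
#   blob = service.get_blob_client(
#       container="agent-logs",
#       blob=blob_path("web01", "agent.log", datetime.utcnow()))
#   with open("/var/log/agent.log", "rb") as f:
#       blob.upload_blob(f, overwrite=True)
```

Date-partitioned paths keep per-host logs listable by day, which also maps cleanly onto blob lifecycle rules for aging out old logs.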

Environment: AWS, Git, Ansible, Agile, RHEL 6/7, Windows, Chef, Python, Kafka, Docker, VSTS, Django, JDK 1.7, Apache Ant, Maven, Shell scripting (Bash), Jenkins, Bamboo, JIRA, Nexus, Apache Tomcat, Splunk.

Cloud/ Build and Release Engineer

Confidential, MI

Responsibilities:

  • Responsible for automating build workflows and managing deployments in multiple cloud environments.
  • Used Jenkins to create automated builds, deploy artifacts to Nexus, and deploy to servers using Ansible.
  • Created S3 buckets, and maintained and utilized the policy management of S3 buckets and Glacier for storage and backup on AWS.
  • Implemented and maintained monitoring/alerting of production and corporate servers using AWS CloudWatch.
  • Provided support, troubleshooting, and problem resolution for CloudFormation scripts developed to build on-demand EC2 instances.
  • Managed multiple AWS accounts with multiple VPCs for both production and non-production, where primary objectives included automation, build-out, integration, and cost control.
  • Designed AWS security groups, which acted as virtual firewalls controlling the traffic allowed to reach one or more AWS EC2 instances.
  • Worked closely with ETL developers and data engineers to build platforms that accommodate the DevOps methodology.
  • Used Amazon Route 53 to manage DNS zones and assign public DNS names to Elastic Load Balancer IPs. Used Amazon RDS MySQL for basic database administration. Set up DynamoDB for other teams' NoSQL data on lightweight Docker containers with Elasticsearch and quick indexing.
  • Worked on a variety of Linux platforms (Ubuntu, Red Hat), including installation, configuration, and maintenance of applications in these environments.
  • Worked on migrating code from SVN to Git repositories (Stash/Bitbucket) and cleaning the Git repositories (purging files).
  • Worked with Groovy scripts in Jenkins to execute jobs in a continuous integration pipeline, where the Groovy Jenkins Plugin and Groovy Post-Build Action Plugin are used as a build step and post-build actions.
  • Configured Jenkins as a continuous integration tool, installing and configuring the Jenkins master and hooking up different build slaves; automated Java application builds using Maven.
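
The CloudWatch monitoring/alerting work described above can be sketched with a helper that builds the arguments for `put_metric_alarm`. The alarm name, threshold, and SNS topic ARN are illustrative placeholders, not values from the actual environment.

```python
def cpu_alarm(instance_id, threshold=80.0, sns_topic_arn=None):
    """Build kwargs for CloudWatch put_metric_alarm: alert when average
    CPU over three consecutive 5-minute periods exceeds `threshold`%."""
    alarm = {
        "AlarmName": "high-cpu-{}".format(instance_id),
        "Namespace": "AWS/EC2",
        "MetricName": "CPUUtilization",
        "Dimensions": [{"Name": "InstanceId", "Value": instance_id}],
        "Statistic": "Average",
        "Period": 300,
        "EvaluationPeriods": 3,
        "Threshold": threshold,
        "ComparisonOperator": "GreaterThanThreshold",
    }
    if sns_topic_arn:
        alarm["AlarmActions"] = [sns_topic_arn]
    return alarm

# Creating the alarm requires boto3 and credentials (not run here):
#
#   import boto3
#   boto3.client("cloudwatch").put_metric_alarm(**cpu_alarm("i-0123456789abcdef0"))
```

Using three 5-minute evaluation periods rather than a single datapoint avoids paging on short CPU spikes; the trade-off is up to 15 minutes of detection latency.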

Environment: Java, Python, Maven, Puppet, Jenkins, Docker, Nginx, Nagios, Git, Agile, AWS EC2, Route 53, S3, VPC, Auto Scaling, ELB, ELK, Shell scripts, Unix/Linux environment, Spark, Splunk.

Linux Administrator

Confidential

Responsibilities:

  • Installation, configuration, and administration of Red Hat Linux servers, plus ongoing server support.
  • Planned and performed upgrades to Linux (RHEL 4.x, 5.x, SUSE 10/11, CentOS) operating systems and hardware maintenance such as changing memory modules and replacing disk drives.
  • Provided support for building servers, patching, user administration tasks, deployment, software installation, performance tuning, troubleshooting, and KVM.
  • Installation and configuration of Oracle 7.x/8.x; handled NFS, automount, DNS, and LDAP issues.
  • Monitored CPU, memory, physical disks, hardware and software RAID, multipath, file systems, and networks.
  • Performed failover and integrity tests on new servers before rolling them out to production.
  • Wrote shell scripts to automate daily tasks, document the changes that happen in the environment and on each server, and analyze error logs, user logs, and /var/log/messages.
  • Good understanding of the OSI model and the TCP/IP protocol suite: DNS, IP, ARP, TCP, UDP, SMTP, FTP, and TFTP.
  • Knowledge of routers and switches, subnets, VLANs, TCP/IP, Ethernet, VPNs, the OSI model, and firewalls.
  • Worked on network security, including NAT/PAT, ACLs, AAA, and ASA firewalls.
  • Created local repositories on Linux servers; performed server updates, patching, upgrades, and package installations using RPM and YUM.
  • Installed firmware upgrades and kernel patches; performed systems configuration and performance tuning on Linux systems.
  • Extensive knowledge of server administration, kernel upgrades, and deployment of patches, applying all firewall and security policies with an emphasis on maintaining best practices.
  • Identified, troubleshot, and resolved problems with OS build failures.
  • Installation, configuration, and customization of services such as Sendmail, Apache, and FTP servers to meet user needs and requirements.
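
The daily log-analysis tasks above (scanning error logs and /var/log/messages) were done with shell scripts; a Python sketch of the same triage logic is shown here purely for illustration. The severity keywords and the file path convention are assumptions.

```python
import re
from collections import Counter

SEVERITIES = ("emerg", "alert", "crit", "err", "error", "warn", "warning")

def summarize_log(lines):
    """Count log lines per severity keyword (case-insensitive); a crude
    stand-in for the grep/awk one-liners run against /var/log/messages."""
    pattern = re.compile(r"\b(" + "|".join(SEVERITIES) + r")\b", re.IGNORECASE)
    counts = Counter()
    for line in lines:
        m = pattern.search(line)
        if m:
            counts[m.group(1).lower()] += 1
    return counts

# Usage on a real host (not run here):
#
#   with open("/var/log/messages") as f:
#       for sev, n in summarize_log(f).most_common():
#           print(sev, n)
```

Summarizing counts per severity before reading individual lines makes it easy to spot which daemon or subsystem suddenly started logging errors after a change.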

Environment: Java, Python, Maven, Puppet, Jenkins, Docker, Nginx, Nagios, Git, Agile, AWS EC2, Route 53, S3, VPC, Auto Scaling, ELB, ELK, Shell scripts, Unix/Linux environment, Spark, Splunk.
