
AWS DevOps Engineer Resume

New York, NY

SUMMARY:

  • 7 years of professional experience in DevOps, AWS cloud computing, and build and release engineering, automating the building, deployment, and release of code from one environment to another.
  • Experience working on Continuous Integration and Continuous Delivery platforms as a DevOps Engineer.
  • In-depth knowledge of DevOps management methodologies and production deployment, including compiling, packaging, deploying, and application configuration.
  • Experience with AWS cloud computing services such as EC2, S3, Lambda, API Gateway, DynamoDB, EBS, VPC, ELB, Route 53, CloudWatch, Security Groups, CloudTrail, IAM, CloudFront, Snowball, EMR, RDS, and Glacier.
  • Experience on Confidential deploying with CodeCommit and CodeDeploy to EC2 instances running various flavors such as Confidential, Red Hat Enterprise Linux, SUSE Linux, Ubuntu Server, Microsoft Windows Server 2012, and more.
  • Experience creating user/group accounts and attaching policies to them using the AWS IAM service (a minimal boto3 sketch follows this list).
  • Work experience with multiple AWS instances, creating Elastic Load Balancers and Auto Scaling groups to design cost-effective, fault-tolerant, and highly available systems.
  • Defined AWS Security Groups which acted as virtual firewalls that controlled the traffic allowed to reach one or more AWS EC2 instances.
  • Set up databases in AWS using RDS, storage using S3 buckets, and configured instance backups to S3.
  • Experience creating snapshots of EBS volumes and storing them in Amazon S3.
  • Experience migrating databases to Confidential using the AWS DMS service.
  • Knowledge of network protocols such as UDP, POP, FTP, TCP/IP, SMTP, NIS, NFS, SSH, and SFTP; expertise working with application-layer (Layer 7) protocols such as HTTP, DHCP, and DNS, along with SSL/TLS.
  • Expertise in shell and Python scripting with a focus on DevOps tools, CI/CD, and AWS and Azure cloud architecture, with hands-on engineering.
  • Deployed and maintained Chef role-based application servers, including Apache, Resin, Nginx and Tomcat.
  • Experience in using Tomcat and Apache web servers for deployment and for hosting tools.
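
A minimal boto3 sketch of the IAM user/group provisioning described above. The user, group, and policy names are placeholders for illustration only, not details from any actual account:

    import boto3

    iam = boto3.client("iam")  # assumes AWS credentials are already configured

    # Hypothetical names used only for illustration
    group_name = "devops-engineers"
    user_name = "example.user"
    policy_arn = "arn:aws:iam::aws:policy/AmazonEC2ReadOnlyAccess"  # an AWS managed policy

    # Create a group and attach a managed policy to it
    iam.create_group(GroupName=group_name)
    iam.attach_group_policy(GroupName=group_name, PolicyArn=policy_arn)

    # Create a user and add it to the group so it inherits the group's policies
    iam.create_user(UserName=user_name)
    iam.add_user_to_group(GroupName=group_name, UserName=user_name)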

TECHNICAL SKILLS:

Cloud Technologies: AWS and Azure

Build and Release Automation: Jenkins, Hudson, VSTS/Azure DevOps Services

Build Tools: NuGet, Maven, Ant

Configuration Management: Ansible, Puppet and Chef

Cloud Automation: ARM Templates, CloudFormation and Terraform

Monitoring: CloudWatch, Nagios

Scripting: Bash, Powershell, Python, Ruby

Databases: Azure SQL, Amazon RDS, MySQL, MS SQL Server and Oracle

Operating Systems: RHEL (6.x and 7.x), CentOS, Ubuntu, Windows, Solaris

Container Management: Docker, Kubernetes

Middleware: WebLogic, WebSphere, Tomcat

Web/Proxy Servers: Apache HTTPD, Nginx

Version Control Systems: SVN, Git, GitHub, Bitbucket, TFS

PROFESSIONAL EXPERIENCE:

AWS DevOps Engineer

Confidential - New York, NY

Responsibilities:

  • Followed Agile methodologies across various projects, setting up two-week sprints and daily stand-up meetings.
  • Used Maven and SBT as build tools on Java- and Scala-based projects; responsible for build and deployment using Docker and ECS containers.
  • Developed a generic Jenkins job for continuously building all Maven projects through a webhook, replacing the existing individual Maven jobs.
  • Extended the generic process by attaching the Jenkins webhook to all current Java and Scala projects in GitHub.
  • Any commit or push made to these projects by developers and data engineers triggers the generic build job.
  • Implemented this with two Jenkins jobs: one receives the payload from the webhook and converts it to JSON using a Ruby script, and the other is triggered downstream based on the parameter values resolved by the first job (a simplified Python sketch of the payload handling follows this list).
  • The generic job handles building, deploying artifacts to JFrog, and copying logs to Amazon S3.
  • Built a Docker image from a Dockerfile that provides the environment required by the generic job.
  • To automate the process inside the job, the Docker image is pulled from our registry and containers are run from it; a shell script inside the job handles the deployment and the S3 copy.
  • The job runs on a slave node (with Docker pre-installed) with a set of executors, and the node runs on an EC2 instance launched from Jenkins.
  • Builds initially took considerable time because large sets of Maven dependencies were downloaded from local JFrog repositories; to speed this up, I added Amazon EFS (Elastic File System) to cache the dependencies and store the Docker images.
  • The same process was repeated for the Scala-based projects, with SBT replacing Maven.
  • Our first version (0.1) was released and received good feedback from the production teams; this approach showed the team that projects could be built inside Docker containers.
  • As a major update for the next release, we decided to run the Jenkins jobs on ECS (EC2 Container Service) containers with EFS storage for caching; ECR (EC2 Container Registry) was used for image storage.
  • Held discussions about Qlik Sense for visually conveying business solutions.
  • Worked on Okta to orchestrate Active Directory over LDAP and keep it updated with employee details.
  • Created Docker images that included OpenCV, which was essential for some projects.
  • Worked with the infrastructure engineering team on Terraform templates for provisioning AWS resources such as EMR clusters, S3, and ECS.
  • Restarted services such as HDFS, Oozie, Spark, and MapReduce to apply configuration changes inside the EMR clusters.
  • Used JIRA for assigning tickets to the team and updated ticket status and stories per the sprint setup.
  • Used Confluence to document project and sprint progress.
  • Interacted with Java developers and data engineers to fix their issues and help them use the required infrastructure effectively.
  • At the end of every sprint, closed tickets and performed internal and external demos for various teams.
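
The payload conversion above was done with a Ruby script and the deploy/copy step with a shell script inside the job; the sketch below re-expresses the same idea in Python as a simplified illustration, assuming a GitHub push-event payload. The file paths and bucket name are placeholders, not the actual project values:

    import json
    import boto3

    def parse_push_payload(raw_payload):
        """Extract repo, branch, and commit from a GitHub push-event payload (simplified)."""
        event = json.loads(raw_payload)
        return {
            "repo": event["repository"]["name"],
            "branch": event["ref"].split("/")[-1],  # "refs/heads/main" -> "main"
            "commit": event["after"],
        }

    def copy_build_log_to_s3(log_path, bucket, params):
        """Copy the build log produced by the generic job to S3, keyed by repo/branch/commit."""
        key = "build-logs/{repo}/{branch}/{commit}.log".format(**params)
        boto3.client("s3").upload_file(log_path, bucket, key)

    if __name__ == "__main__":
        params = parse_push_payload(open("payload.json").read())         # payload saved by the webhook job
        copy_build_log_to_s3("build.log", "example-build-logs", params)  # placeholder log path and bucket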

Environment: Agile, Linux, RHEL, Unix, Ubuntu, JIRA, Confluence, Slack, AWS, Jenkins, Git, Xcode, Maven, SBT, Groovy, Java, iOS, Scala, vRealize, Blueprint, Docker, Amazon EMR, Terraform, WAF, Ruby, Shell, OpenCV, JFrog, Datadog, Splunk, Hadoop, Kafka, Spark, Oozie, New Relic.

AWS DevOps Engineer

Confidential - Norwalk, CT

Responsibilities:

  • Created and maintained Subversion/Git repositories, views, and access control strategies.
  • Used Gradle, Xcode, and Maven as build tools on Java, Android, and Swift projects to produce build artifacts from source code.
  • Responsible for build and deployment automation using Docker and Kubernetes containers and Chef.
  • Developed Linux/UNIX Perl and shell scripts for manual deployment of code to various environments.
  • Configured Nagios to monitor EC2 Linux instances with Puppet automation, and deployed SolarWinds for network monitoring and analysis.
  • Managed the software configurations using Chef Enterprise.
  • Managed web application configuration and deployment to AWS cloud servers through Chef.
  • Implemented a Continuous Delivery framework using Jenkins, Puppet, Maven, and Nexus in a Linux environment, and scheduled automated jobs using Rundeck.
  • Performed data analytics with Redshift using business intelligence tools such as QlikView, Looker, and Datameer, and monitored data pipelines with Apache Kafka and Matillion ETL.
  • Worked with a backend team on various issues related to Qlik Sense and was impressed by its innovative features.
  • Worked on Chef cookbooks/recipes to automate infrastructure as code; used Ansible to deploy changes to remote hosts and monitored the process using Ansible Tower.
  • Used Ansible Tower to stream playbook runs in real time and view the status of every running job without further reloads.
  • Integrated Maven/Nexus, Jenkins, Git, Confluence, and JIRA.
  • Worked on Azure Resource Manager for creating and deploying templates, assigning roles, and getting activity logs.
  • Created and managed Azure AD tenants, and implemented Azure Container Service (DC/OS, Docker, Kubernetes) and Azure Functions.
  • Implemented AWS solutions using EC2, S3, Redshift, Lambda, RDS, EBS, Elastic Load Balancers, Auto Scaling groups, SNS, optimized volumes, and CloudFormation templates.
  • Worked on Amazon API Gateway, creating endpoints for various applications.
  • Implemented DNS services (Route 53) to coordinate load balancing, failover, and scaling.
  • Understanding of secure cloud configuration (CloudTrail, AWS Config) and networking services (VPC, Security Groups, VPN), and created IAM roles. Experienced with AWS developer tools such as CodeCommit, CodePipeline, CodeDeploy, and CodeBuild. Configured S3 versioning and lifecycle policies to back up files and archive them in Glacier (see the boto3 sketch after this list).
  • Implemented Python Boto3 scripts to access various AWS services from the command line.
  • Used Netflix Asgard to bridge the gap between web developers and the AWS cloud.
  • Worked on Blueprints to automate capturing content from various environments using the vRCS pack of vRealize.
  • Monitored and created alarms and notifications for EC2 hosts using CloudWatch.
  • Generated workflows through Apache Airflow, and then Apache Oozie, for scheduling the Hadoop jobs that control large data transformations (an Airflow DAG sketch appears below).
  • Implemented Hadoop clusters for processing big data pipelines using Amazon EMR and Cloudera, relying on Apache Spark for fast processing and API integration; in the end, we managed these resources using Apache Mesos.
  • Created and moved data between AWS compute (EC2, Lambda) and storage services (S3, DynamoDB, EMR), as well as on-premises data sources, using AWS Data Pipeline.
  • Experience with Azure virtual machines, configuring them to connect to on-premises environments.
  • Created and managed AD tenants, and configured applications to integrate with Azure AD (Active Directory).
  • Experience working with storage, security, and compute services in the Azure cloud.
  • Implemented AWS solutions using EC2, S3, Redshift, Lambda, RDS, EBS, Elastic Load Balancers, Auto Scaling groups, optimized volumes, and EC2 instances. Worked on RESTful APIs using Node.js and Spring MVC for communication between applications and systems.
  • Used JBoss server to manage and administer various J2EE applications.
  • Maintained Apigee for building applications and providing security against cyber threats, giving better assistance to the developer teams and supporting microservices.
  • To move away from monolithic applications, we launched several microservices, which helped us scale, deploy, and stay resilient.
  • Implemented Apache JMeter and AppDynamics for testing the performance of applications, servers, and protocols.
  • Worked with JIRA and Alfresco for ticket management and documentation for our Sprints.
  • Worked on WebLogic for managing a cluster of WebLogic Server instances.
  • Used OEM 12c to manage the Oracle DB with agents and load balancing through OMS.
  • Created SQL and PL/SQL scripts (DML and DDL) for Oracle and MySQL databases, and versioned them in SVN.
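
A minimal boto3 sketch of the S3 versioning and lifecycle setup mentioned above; the bucket name and prefix are placeholders, and the 30-day and 365-day thresholds are illustrative values rather than the actual policy:

    import boto3

    s3 = boto3.client("s3")
    bucket = "example-backup-bucket"  # placeholder bucket name

    # Enable versioning so overwritten or deleted backups remain recoverable
    s3.put_bucket_versioning(
        Bucket=bucket,
        VersioningConfiguration={"Status": "Enabled"},
    )

    # Transition backups to Glacier after 30 days and expire them after a year
    s3.put_bucket_lifecycle_configuration(
        Bucket=bucket,
        LifecycleConfiguration={
            "Rules": [
                {
                    "ID": "archive-backups",
                    "Filter": {"Prefix": "backups/"},
                    "Status": "Enabled",
                    "Transitions": [{"Days": 30, "StorageClass": "GLACIER"}],
                    "Expiration": {"Days": 365},
                }
            ]
        },
    )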

Environment: Java/J2EE, iOS, Subversion, Ant, Maven, Xcode, Jenkins, Git, SVN, Chef, Puppet, Ansible, RHEL, CloudWatch, AWS, Azure, Node.js, Asgard, vRealize, Blueprint, Spring MVC, Qlik Sense, Microservices, WAF, Hadoop, Spark, Kafka, Mesos, Oozie, Python, Ruby, Alfresco, Flask, Shell Scripting, MPEG, PUTTY, SSL certs, Confluence, HP QTP, Selenium, JMeter, JBoss, Oracle DB, MySQL and SQL.
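
A minimal Airflow 1.x-style DAG sketch illustrating the kind of scheduled Hadoop/Spark workflow referenced above; the DAG id, job path, and S3 bucket are hypothetical examples, not the production definitions:

    from datetime import datetime, timedelta
    from airflow import DAG
    from airflow.operators.bash_operator import BashOperator

    default_args = {
        "owner": "devops",
        "retries": 1,
        "retry_delay": timedelta(minutes=5),
    }

    # Daily workflow: run a Spark transformation, then archive its output to S3
    dag = DAG(
        dag_id="daily_data_transform",
        default_args=default_args,
        start_date=datetime(2018, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    )

    transform = BashOperator(
        task_id="run_spark_transform",
        bash_command="spark-submit /opt/jobs/transform.py",  # placeholder job path
        dag=dag,
    )

    archive = BashOperator(
        task_id="archive_output",
        bash_command="aws s3 sync /data/output s3://example-output-bucket/daily/",  # placeholder bucket
        dag=dag,
    )

    transform >> archive  # archive only runs after the transform succeeds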

DevOps Engineer

Confidential, Williams, OH

Responsibilities :

  • Maintained and Administered GIT Source Code Tool.
  • Managed Build results in Jenkins and Build Forge and deployed using workflows.
  • Maintained and tracked inventory using Jenkins and set alerts when the servers are full and need attention.
  • Modeled the structure for multi-tiered applications by orchestrating the processes to deploy each tier using IBM UrbanCode Deploy. Developed builds using Maven and Gradle in coordination with Spring Boot, where the build packages needed to be integrated with the Tomcat server automatically.
  • In coordination with developer teams, used Spring Boot to create several RESTful applications and deploy J2EE applications to production environments.
  • Used Chef's Knife to bootstrap nodes and manage roles, automating the chef-client run with Ruby recipes. Wrote Ansible playbooks to manage web applications and used Ansible Tower. Coordinated with the Terraform DSL for automation inside the AWS cloud.
  • Orchestrated several CloudFormation-style templates using OpenStack Heat with block storage support from Cinder. Launched Memcached and Redis services using AWS ElastiCache (see the boto3 sketch after this list). Worked on REST APIs to configure changes and maintain index points.
  • Integrated Openshift to run Docker containers and Kubernetes clusters.
  • Experience in writing various data models in coordination with the Data Analysts.
  • In addition to supporting large-scale web applications, indexed database queries on MySQL server by writing SQL queries. Worked on Apache Cassandra and Spark along with Teradata for managing large sets of structured data and performing ETL. Worked with Qlik Sense to build interactive visuals for coordinating with various teams.
  • For data replication over Teradata, we included HVR software.
  • Used Mesos and Kafka for managing real-time data streams in the proper environments, with ZooKeeper for coordination.
  • Launched Apache Tomcat along with Hibernate to handle incoming user requests to web applications and their persistence to the RDBMS. Monitored application requests on the IIS server by creating worker processes and containing them in an application pool.
  • Monitored and analyzed data streams using SFDC Wave Analytics integrated with various BI tools.
  • Assisted the data analysts in improving the business logic behind those applications using MySQL.
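
A minimal boto3 sketch of launching a single-node Redis cluster with AWS ElastiCache, as referenced above; the cluster identifier and node type are placeholders:

    import boto3

    elasticache = boto3.client("elasticache")
    cluster_id = "example-redis-cache"  # placeholder identifier

    # Launch a small single-node Redis cluster
    elasticache.create_cache_cluster(
        CacheClusterId=cluster_id,
        Engine="redis",
        CacheNodeType="cache.t2.micro",
        NumCacheNodes=1,
    )

    # Wait until the cluster is available, then read its endpoint for application config
    elasticache.get_waiter("cache_cluster_available").wait(
        CacheClusterId=cluster_id, ShowCacheNodeInfo=True
    )
    cluster = elasticache.describe_cache_clusters(
        CacheClusterId=cluster_id, ShowCacheNodeInfo=True
    )["CacheClusters"][0]
    endpoint = cluster["CacheNodes"][0]["Endpoint"]
    print(endpoint["Address"], endpoint["Port"])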

Environment: Jenkins, RHEL, AWS, OpenStack, CentOS, Git, Chef, Ansible, Terraform, Maven, Groovy, JIRA, Ruby, Python, Shell/Bash, Java/J2EE, iOS, WebSphere, WebLogic, Microservices, RabbitMQ, SQL Scripts, MPEG, Selenium, vRealize, Blueprinting, SFDC Wave, Alfresco, Nagios.

System Administrator

Confidential, Richmond VA

Responsibilities:

  • Installing and configuring Windows Server 2012 R2
  • Installing, configuring and managing AD, DHCP, DNS Services
  • User account maintenance, including creating, deleting, and modifying user accounts and rights
  • Installing and configuring LAMP Servers in Linux
  • Installing printers, fax, scanners and other peripherals and maintaining proper function
  • Maintaining proper cabling and troubleshooting for connectivity issues between different devices
  • Performing System backups and recovery
  • Installation and configuration of operating systems such as Sun Solaris, Linux, and AIX.
  • General Red Hat Linux System administration, OS upgrades, security patching, troubleshooting and ensuring maximum performance and availability.
  • Used CIS Red Hat Enterprise Linux 6 Benchmark to harden newly installed systems. Setup and administer user and groups accounts, setting permissions, web servers, file servers, firewalls and directory services.
  • Performed both interactive and automated (Kickstart) installations of Red Hat Enterprise Linux. Plan and execute RPM and YUM packages and update installations necessary for optimal system performance.
  • Assisted in LDAP server configuration for user authentication on network. Install and configure Apache and provide support on testing and production servers.
  • Managed file systems using software RAID and Logical Volume Management. Created virtual machines using VMware and KVM, and automated disaster recovery planning and maintenance for the virtual environment.
