AWS DevOps Engineer Resume
Moline, IL
SUMMARY
- 8 years of experience in the Information Technology industry in various roles, with strong experience in Migration, Configuration, Release Engineering, Software Configuration Management, and Build & Release, and diversified exposure to Software Process Engineering and designing & building Web Applications using Java/J2EE, AWS, and open-source technologies.
- Expertise in DevOps tools such as Git, Jenkins, Maven, Ansible, Chef, Puppet, Docker, Kubernetes, AWS, Azure, GCP, Terraform, Splunk, and Nagios.
- Experienced in Continuous Integration (CI) and Continuous Delivery (CD) process implementation using Jenkins, TeamCity, and Bamboo.
- Good knowledge of Configuration Management tools such as Chef, Ansible, and Puppet; automated Linux production server setup using Chef cookbooks and Ansible playbooks.
- Experience with Docker containerized environments, hosting web servers in containers and building Docker images.
- Expertise in working with Kubernetes to automate deployment, scaling, and management of containerized web applications.
- Worked on Kubernetes to orchestrate Docker containers for new and existing applications, as well as deployment and management of complex runtime environments.
- Expertise in administering and automating operations across multiple platforms (UNIX, Linux, Windows, and macOS).
- Proficient in the AWS Cloud platform and its features, including EC2, VPC, EBS, CloudWatch, CloudTrail, CloudFormation, CloudFront, IAM, S3, Route 53, SNS, and SQS.
- Experience in working on Docker and Vagrant for different infrastructure setup and testing of code.
- Wrote shell/Python scripts that implemented an automated deployment process and reduced deployment time.
- Experience on Python, Bash/Shell, Ruby, Perl, PowerShell, JSON, YAML and Groovy.
- Knowledge of Java Lambda Expressions to retrieve data from Collections using Functional Interfaces.
- Proficient in using Unix commands and utilities to monitor server-side activities and performance.
- Worked on Artifactory and Sonatype Nexus to manage build artifacts.
- Good understanding of Pivotal Cloud Foundry (PCF) architecture (Diego architecture), PCF components, and their functionalities. Experienced in using the Pivotal Cloud Foundry (PCF) CLI for deploying applications and other CF management activities.
- Experience with Azure Site Recovery and Azure Backup: installed and configured the Azure Backup agent and virtual machine backup, enabled Azure virtual machine backup from the Vault, and configured Azure Site Recovery (ASR).
- Experience in branching, tagging, and maintaining versions across environments using SCM tools like Subversion (SVN), Git, Bitbucket, and GitHub in UNIX and Windows environments.
- Expertise in working with diverse database platforms, installing, configuring, and managing RDBMS and NoSQL tools like MySQL, Oracle, DynamoDB, and MongoDB.
- Worked on various network protocols like HTTP, UDP, FTP, TCP/IP, SMTP, SSH, SFTP & DNS and technologies like load balancers (ELB), ACL, Firewalls.
- Experienced in handling production incidents, queries, and problems through the Remedy ticketing system, and non-production issues/tasks/incidents through the JIRA ticketing system.
- Good understanding of web and application servers like Apache Tomcat, WebSphere, Nginx.
- Experience in integrating Unit Tests and Code Quality Analysis Tools like JUnit, SonarQube.
- Used build tools like Maven, ANT, and Gradle for the building of deployable artifacts from source code.
- Experience in building and deploying Java & SOA applications and troubleshooting the build and deploy failures.
- Proficient with Terraform configuration files to spin up infrastructure easily and efficiently.
- Managed the deployment of virtual storage, sites, and server infrastructure to augment clients' datacenters using Agile.
- Experienced with log monitoring tools like Splunk, Nagios, and the ELK stack (Elasticsearch, Logstash, Kibana) to view log information, monitor nodes, and receive health & security notifications.
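The auto-deployment scripting mentioned above can be sketched as follows. This is an illustrative example only: the Tomcat service name, artifact name, and webapps path are assumptions, not details from this resume.

```python
#!/usr/bin/env python3
"""Minimal sketch of a shell/Python auto-deployment helper for Tomcat."""
import subprocess


def deploy_commands(artifact, webapps_dir="/opt/tomcat/webapps"):
    """Build the ordered shell commands for a simple Tomcat redeploy.

    Returned as argument lists so they can be passed to subprocess.run
    without shell quoting issues.
    """
    return [
        ["systemctl", "stop", "tomcat"],    # stop the app server
        ["cp", artifact, webapps_dir],      # drop in the new WAR
        ["systemctl", "start", "tomcat"],   # bring the server back up
    ]


# On the target host, a deployment run would look like:
# for cmd in deploy_commands("app-1.2.3.war"):
#     subprocess.run(cmd, check=True)
```

Keeping the command list separate from execution makes the script easy to dry-run and unit test before it touches a production box.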
TECHNICAL SKILLS
Versioning Tools: CVS, Subversion, ClearCase, Git, Bitbucket, GitLab
CI/CD Tools: Hudson, Jenkins, Bamboo, Puppet, Chef, TeamCity, CruiseControl, Ansible, SaltStack
Build Tools: ANT, Maven, Gradle, Makefile, Sonar, Build Forge, Nexus
Bug Tracking Tools: JIRA, Rally, Remedy, IBM ClearQuest, Nagios, Ganglia
Languages: C, C++ and Java/J2EE, Python
Scripting: Shell, Batch, Perl, Ruby, PowerShell
Virtualization: VMware ESX/ESXi 3.5, Fusion, Hypervisor, Docker, Vagrant, KVM
Web Technologies: HTML, JavaScript, XML, Servlets, JDBC, JSP, JSON
Web/App Servers: tc Server, WebLogic, WebSphere, Apache HTTP Server, Nginx, Apache Tomcat
Cloud Computing: AWS (EC2, ELB, S3), OpenStack (Nova, Swift, Glance), Azure, Cloud Foundry, AWS ElastiCache, OpenShift
Database: Oracle, SQL SERVER, MySQL, NoSQL, MongoDB, PostgreSQL, JBoss
Operating Systems: Windows, AIX, UNIX, Linux 3, 4.x, 5, 6 and macOS
Others: MS Outlook, Agile, SCRUM, Load Balancing - HAProxy, Fortify, Black Duck, JUnit
PROFESSIONAL EXPERIENCE
Confidential, Moline, IL
AWS DevOps Engineer
Responsibilities:
- Implemented required AWS services for High Performance Computing; in particular, experience leveraging AWS services including EC2, S3 storage, EFS (Elastic File System), IAM (Identity and Access Management), CloudFormation, CloudWatch, and DynamoDB.
- Automated AWS Backup for applications like Confluence to maintain backups with supported AWS services (EFS, RDS, etc.), and migrated Confluence data from source to destination using the AWS DataSync agent.
- Responsible for administration of Git version control; performed branching, tagging, backup, and restore activities.
- Automated the process of deployment to Apache Tomcat Application Servers by developing Unix/Python Scripts.
- Worked extensively across AWS's broad range of services, provisioning EC2, VPC, ELB, Security Groups, IAM, EBS, S3, SNS, SQS, Route 53, CloudWatch, CloudFront, CloudTrail, and RDS.
- Worked on container-based deployments using Docker images and Docker registries; pulled required images from Docker Hub and Nexus. Used Docker to avoid environment discrepancies between development and production.
- Involved in setting up CI/CD pipelines using Jenkins, Maven, GitHub, Docker, and Nexus.
- Created and developed deployments, namespaces, Pods, Services, health checks, and persistent volumes for Kubernetes in YAML.
- Designed and implemented CI/CD architecture and automation solutions using GitHub, Bitbucket, Jenkins, Bamboo, and Ansible Tower; deployed to the production environment in AWS using Terraform.
- Implemented Fortify and Black Duck scans for security, and deployed binary files (zip, jar, dll) to JFrog Artifactory using Jenkins and Bamboo.
- Automated data transfer between on-premises and cloud platforms leveraging HPC scripts.
- Designed and deployed infrastructure and services for new applications using tools such as SonarQube, Black Duck, and Jenkins.
- Created a testing framework with AWS services like API Gateway, DynamoDB, and EC2 to test different AKANA policies.
- Expertise in working with network protocols (e.g., TLS/SSL, mTLS).
- Created Gatling scripts to test different AKANA QoS policies and performed AKANA end-to-end testing using the Gatling tool.
- Deployed a centralized, auto-scaling monitoring system for infrastructure using Grafana, Puppet, and PuppetDB, integrating many custom plugins for real-time monitoring.
- Automated the SDLC with scripting languages like Python, Shell, and Groovy; performed system troubleshooting and problem-solving across platform and application domains.
- Work closely with development team by rapidly deploying instances, monitoring operating efficiencies of the platform, and responding as needs arise.
- Used Ansible to deploy ELK for automating continuous deployment (CD), configured node and deployment-failure reporting, and worked with Site Reliability Engineers to implement Datadog system metrics and analytics.
- Accountable for the integration/maintenance/development of application and server monitoring tools (for DevOps) - Nagios, Prometheus, Kibana, Datadog.
- Monitored applications and environments by managing the installation/configuration of Splunk and ELK.
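The Kubernetes deployment manifests described above (deployments, health checks, etc.) can be sketched as a generated spec. This is a hypothetical example: the app name, image, and `/healthz` probe path are placeholders, not taken from any actual project.

```python
"""Sketch of an apps/v1 Kubernetes Deployment spec, built as a Python dict."""


def deployment_manifest(name, image, replicas=3, port=8080):
    """Build a Deployment with a liveness health check for one container."""
    labels = {"app": name}
    return {
        "apiVersion": "apps/v1",
        "kind": "Deployment",
        "metadata": {"name": name, "labels": labels},
        "spec": {
            "replicas": replicas,
            "selector": {"matchLabels": labels},
            "template": {
                "metadata": {"labels": labels},
                "spec": {
                    "containers": [{
                        "name": name,
                        "image": image,
                        "ports": [{"containerPort": port}],
                        # liveness probe: restart the pod if /healthz fails
                        "livenessProbe": {
                            "httpGet": {"path": "/healthz", "port": port},
                            "initialDelaySeconds": 10,
                        },
                    }],
                },
            },
        },
    }


# Serialized to YAML, this dict would be applied with:
#   kubectl apply -f deployment.yaml
```

Generating the dict in code rather than hand-writing YAML makes it easy to stamp out consistent manifests per environment.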
Environment: Git, Black Duck, Bitbucket, Confluence, AKANA, Jenkins Pipeline, Groovy, Ansible, YAML, Docker, Maven, AWS, EC2, EFS, RDS, Route 53, DynamoDB, S3, CloudFormation, IAM, SonarQube, JFrog Artifactory, XML, SSL, TLS, JSON, Rally, Apache Tomcat, Gatling, Python, Shell.
Confidential, Plano, TX
DevSecOps Engineer
Responsibilities:
- Enabled SOE (Standard Operating Environment) with the Dev teams by helping them onboard into the CI/CD pipeline and OneHygieia; this involved modification and standardization of current Jenkins jobs.
- Worked on various source code management activities using GIT involving branching, merging strategy, daily merges, and remote repository.
- Conducted day-to-day tasks in RHEL, including upgrading RPMs, the kernel, and HBA drivers, and configuring SAN disks, multipathing, and LVM file systems.
- Created a Jenkins one-pipeline and integrated it with GitLab; onboarded tools such as GitLab, Jenkins, SonarQube, JFrog, Ansible, Fortify, Black Duck, and Hygieia from the DevSecOps toolchain.
- Responsible for Continuous Integration (CI) and Continuous Delivery (CD) process implementation using Jenkins Pipelines, along with Python and shell scripts to automate routine jobs.
- Installed, monitored, and configured applications in Nginx and Apache Tomcat servers; established connectivity to databases and troubleshot issues on the fly.
- Design and build data engineering solutions using Google Cloud Platform (GCP) services: BigQuery, DataFlow, Pub/Sub, BigTable, Data Fusion, DataProc.
- Experience in GCP and in writing BigQuery queries on GCP.
- Experience in professional software testing in build & release engineering.
- Implemented DevOps methodologies to help unify, track, and automate code deployment within a diverse range of system/stack environments using AWS, Docker, Jenkins, and open-source technologies.
- Implemented/developed an AWS rehydration approach (or alternative approaches) for backup, with monitoring through CloudWatch.
- Hands-on experience designing, planning, and implementing migration of existing on-prem applications to Azure Cloud (ARM); configured and deployed Azure Automation scripts.
- Hands-on experience deploying an AKS cluster using the Azure portal.
- Implemented algorithms parsing XML to convert it to JSON, and wrote Ansible YAML scripts that store credentials for various sandboxes and secure them on remote servers.
- Worked on Scaling and auto scaling of the pods on Kubernetes (EKS), troubleshooting any issues on pods.
- Experience in integrating Unit Tests and Code Quality Analysis Tools like JUnit, SonarQube
- Configured Application Life Cycle Management (ALM) tools like JIRA to track the progress of the project.
- Installed and administered Nexus and Artifactory Repository for Maven builds
- Installed tools to support scanning and analysis of code, such as Fortify and SonarQube.
- Enabled SonarLint and Sonar scanners with SCM and set up Sonar dashboards to identify code modifications.
- Used different kinds of build tools like ANT, Maven, Gradle, and MSBuild.
- Develop dashboards to monitor metrics information about the application by using visualization tools like One Hygieia.
- Automate the process of migrating IPM functionality to CI/CD pipeline, writing the scripts for pipeline as a code.
- Created quality gates throughout the pipeline using Groovy and Python scripts.
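The XML-to-JSON parsing mentioned above can be sketched with the standard library alone. The sample document below is invented for illustration; real credential data would of course never be embedded in a script.

```python
"""Sketch of XML-to-JSON conversion using only the Python standard library."""
import json
import xml.etree.ElementTree as ET


def xml_to_dict(element):
    """Recursively convert an ElementTree node into a plain dict.

    Leaf elements become their text; repeated child tags become lists.
    """
    children = list(element)
    if not children:
        return element.text
    out = {}
    for child in children:
        out.setdefault(child.tag, []).append(xml_to_dict(child))
    # Unwrap single-item lists for readability.
    return {k: v[0] if len(v) == 1 else v for k, v in out.items()}


# Made-up sandbox config for demonstration:
sample = "<creds><sandbox><name>dev</name><user>svc</user></sandbox></creds>"
as_json = json.dumps(xml_to_dict(ET.fromstring(sample)))
# as_json == '{"sandbox": {"name": "dev", "user": "svc"}}'
```

This is a deliberately simplified mapping; attributes and mixed content would need extra handling in a production converter.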
Environment: Git, GitLab, RHEL 6 & 7, Jenkins, Ansible, YAML, Docker, Maven, AWS, CloudWatch, EC2, Apache Tomcat, Ant, Python, Shell, GCP, Jira, SonarQube, Kubernetes, JFrog Artifactory, Fortify, Black Duck, XML, JSON, Hygieia, JUnit, Azure, OneConfluence.
Confidential, Irving, TX
DevOps Engineer
Responsibilities:
- Involved in migrating an existing on-premises (Solaris) application to Oracle Cloud (OCI Exadata).
- Launched and configured Oracle Cloud servers and configured the servers for specified applications.
- Automated tasks using Ansible and Python; reviewed various cloud projects for security approval.
- Worked on build & release activities in Jenkins for technologies like Java, .NET, and Windows IIS.
- Experience in a fast-paced, start-up environment with short release cycles.
- Installed Apache Tomcat 8 application servers for another component and worked with WebLogic 12.3 application servers and the Apache web server.
- Previous experience with production deployment tools like AnthillPro and Nolio; worked with Atlassian tools like Bamboo & Jira.
- Created and owned the build and Continuous Integration environment with Ant, Maven, Visual Studio, and Jenkins Pipeline; built Docker images and pushed them to JFrog Artifactory.
- Worked on SysOps administration, implementing a Continuous Delivery framework using Jenkins, Ant, and Maven in a Linux environment.
- Encrypted disks using the cryptsetup tool on Linux servers; part of the core engineering team designing the new platform to host applications through Jenkins.
- Maintained build related scripts developed in ANT. Modified build configuration files including Ant build.xml.
- Experience administering Bitbucket, and with system automation and deployment using tools like Ansible.
- Developed UNIX/Python/Groovy scripts for application building and deployment.
- Created and configured Ansible playbooks to integrate with Jenkins jobs.
- Deployed code using shell scripts configured in the Jenkins job.
- Integrating Jenkins with Git for deploying the code in test and stage environments and creating documentation for deployment process.
- Understanding of security concepts like hardening an OS, SSL, mTLS certificates, and firewalls.
- Cloud experience with either Amazon Web Services, Microsoft Azure or Google Cloud Platform.
- As part of the team, managed the Cloud Security Overlay network for managed PaaS and SaaS services in OCI-Classic.
- Effectively applied Oracle's methodologies and policies while adhering to contractual obligations, thereby minimizing Oracle's risk and exposure; configured SSL certificates in Linux.
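The Ansible-playbook-from-Jenkins integration described above can be sketched as a small command builder. The playbook and inventory paths and the `app_version` variable are hypothetical.

```python
"""Sketch of how a Jenkins job might invoke an Ansible playbook."""


def ansible_command(playbook, inventory, extra_vars=None):
    """Compose the ansible-playbook CLI invocation a Jenkins job would run.

    extra_vars are passed as repeated -e key=value flags.
    """
    cmd = ["ansible-playbook", "-i", inventory, playbook]
    for key, value in (extra_vars or {}).items():
        cmd += ["-e", f"{key}={value}"]
    return cmd


# A Jenkins "Execute shell" build step would effectively run:
#   ansible-playbook -i hosts/stage deploy.yml -e app_version=1.4.2
```

Building the argument list in one place keeps the Jenkins job definition thin and lets the same helper drive test and stage environments.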
Environment: Jenkins, Ansible, Git, Bitbucket, OCI, Azure, Apache Tomcat 8, WebLogic 12.3, Ant, Python, Shell, Jira, JFrog Artifactory, Apache, OS, SSL, mTLS, Windows Server 2012, Groovy.
Confidential, Detroit, MI
Cloud / DevOps Engineer
Responsibilities:
- Created and managed an automation framework for migration of servers and applications to the cloud platform.
- Implemented GCP Cloud Functions in Python and NodeJS for data pipelines to transfer data from GCP to AWS.
- Worked on AWS API Gateway for custom domains and record sets in Amazon Route 53 for applications hosted in the AWS environment; extensive experience focusing on services like VPC, EBS, AMI, IAM, S3, CloudWatch, SNS, SQS, Amazon Glacier, CloudTrail, and CloudFormation.
- Maintained user accounts, IAM Roles, VPC, RDS, DynamoDB, SES, SQS, and SNS services in the AWS cloud.
- Wrote templates for AWS infrastructure as code using Terraform and CloudFormation to build staging and production environments.
- Created Terraform scripts to launch platform common services, including IAM roles, the CI/CD tool Jenkins, the configuration management tool Chef, the secrets management tool Vault, the service discovery tool Consul, and the security appliance Trend Micro on AWS.
- Involved in migration of databases from on-premises to AWS RDS; migrated MySQL and MSSQL database servers using the AWS Database Migration Service.
- Expertise in creating and customizing Splunk applications, searches, and dashboards as desired by IT teams and the business.
- Hands-on experience with Kubernetes to automate the deployment, scaling, and operation of application containers across clusters of hosts.
- Managing Identity Access management of Azure Subscriptions, Azure AD, Azure AD Application Proxy, Azure AD Connect, Azure AD Pass Through.
- Worked on managing infrastructure provisioning, S3, ELB, EC2, RDS, Route 53, and Security Groups (VPC, NAT), and deployment via Scalr and EC2 installation with CentOS, Ubuntu, and Linux.
- Implemented a centralized logging system using Logstash configured as an ELK stack (Elasticsearch, Logstash, and Kibana) to monitor system logs, AWS CloudWatch, VPC Flow Logs, CloudTrail events, changes in S3, etc.
- Deployed and configured JIRA, both hosted and local instances, for issue tracking, workflow collaboration, and tool-chain automation.
- Responsible for migrating existing modules in IBM MQ to Apache Kafka and worked on creating Kafka adapters for decoupling application dependencies.
- Conducted day-to-day tasks in RHEL, including upgrading RPMs, the kernel, and HBA drivers, and configuring SAN disks, multipathing, and LVM file systems.
- Worked on the Docker ecosystem which includes Docker machine, Docker Compose, Docker Swarm and monitored containers using Prometheus tool.
- Worked on SysOps, installing, configuring, and administering the Jenkins Continuous Integration (CI) tool on Linux machines, along with adding/updating plugins such as SVN, Git, Maven, ANT, Chef, and Ansible on AWS platforms.
- Installed and administered Nagios monitoring and maintained it using shell scripting.
- Implemented new projects' build frameworks using Jenkins & Maven as build framework tools; integrated Docker builds as part of the Continuous Integration process and deployed a local Registry server.
- Strong use of Shell/Ruby scripting languages including BASH for Linux and PowerShell for Windows systems.
- Managed batch jobs in Unix for automated import/export of data and system automation programming using Perl, Bash, Shell, and Ruby scripting; automated log backup using the Python Boto3 API.
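The Boto3 log-backup automation mentioned above can be sketched as follows. The bucket name and date-partitioned key layout are assumptions made for illustration.

```python
"""Sketch of a Boto3-driven log backup: date-partitioned keys in S3."""
import os
from datetime import date


def backup_key(log_path, day=None, prefix="log-backups"):
    """Build a date-partitioned S3 key (prefix/YYYY/MM/DD/filename)."""
    day = day or date.today()
    return f"{prefix}/{day:%Y/%m/%d}/{os.path.basename(log_path)}"


# With boto3 installed and credentials configured, the upload itself is:
#   import boto3
#   boto3.client("s3").upload_file(
#       "/var/log/app.log", "my-backup-bucket", backup_key("/var/log/app.log"))
```

Partitioning keys by date keeps S3 listings cheap and makes lifecycle rules (e.g. expire objects after 90 days) straightforward to apply.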
Environment: Git, RHEL 6 & 7, Ubuntu, CentOS, Amazon Linux, Windows, Jenkins, Chef, Docker, Kubernetes, Maven, AWS, EC2, CloudWatch, AWS SMS, Ata Data, AWS DMS, Azure, GCP, OpenShift, Terraform, ESB, Splunk, Nagios, Apache Tomcat, Kafka, Prometheus, Mule, ANT, Python, Shell, Jira, Ruby.
Confidential
Linux System Administrator
Responsibilities:
- Installed, configured, and performed operating system upgrades on Red Hat Linux 3.0, 4.0, 6.0, CentOS 5.11, 6.7, and Sun Solaris 8, 9, 10.
- Implemented rapid-provisioning and life-cycle management for Ubuntu Linux using Amazon EC2, CHEF, and custom Ruby/Bash scripts
- Part of theDevOpsteam responsible for containerization efforts and migration of Java apps toOpenShiftContainer Platform.
- Implemented Bash, Perl, and Python scripting and implemented the automation tools Chef and Puppet.
- Implemented Chef cookbooks for OS component configuration to keep AWS server templates minimal.
- Configured Elastic Load Balancers with EC2 Auto scaling groups.
- Good working knowledge of NoSQL databases such as MongoDB and Cassandra.
- Experience on the technical and functional side of Team Foundation Server components (Source Control, Work Items, TFS Builds, and Reporting).
- Web application development using agile methodology using Ruby on Rails, MongoDB.
- Handled large amounts of data across many commodity servers using Cassandra (a NoSQL database).
- Automated the cloud deployments using CHEF, Python and AWS Cloud Formation Templates.
- Implemented and managed continuous delivery systems and methodologies on AWS; used Subversion as the source code repository. Developed Shell/Perl scripts for automation purposes.
- Used Maven for building the Web projects including the Web Services and created automated reports for the Builds and Test results which QA can access to accelerate the testing process.
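The rapid EC2 provisioning described in the bullets above can be sketched as a parameter builder for Boto3's `run_instances` call. The AMI ID, instance type, and tag values are placeholders.

```python
"""Sketch of rapid EC2 provisioning via Boto3."""


def run_instances_params(ami, count=1, instance_type="t2.micro", role="web"):
    """Build keyword arguments for boto3's ec2.run_instances call.

    Tags the instances with a Role key so Chef/automation can target them.
    """
    return {
        "ImageId": ami,
        "MinCount": count,
        "MaxCount": count,
        "InstanceType": instance_type,
        "TagSpecifications": [{
            "ResourceType": "instance",
            "Tags": [{"Key": "Role", "Value": role}],
        }],
    }


# With boto3 installed and credentials configured, provisioning would be:
#   import boto3
#   boto3.client("ec2").run_instances(**run_instances_params("ami-0abcdef0"))
```

Separating parameter construction from the API call makes the launch spec easy to review and test without touching a live AWS account.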
Environment: TFS 2010 (Team Foundation Server), Git, Jenkins, Chef, Puppet, AWS, OpenShift, Red Hat Linux 3, 4.x, 5, 6, Azure, VMware ESX 3.5, Veritas Volume Manager.
