DevOps Engineer Resume
Phoenix, AZ
PROFESSIONAL SUMMARY:
- 8+ years of experience in IT Industry with ability to accomplish all aspects of the software configuration management (SCM) process, AWS, DevOps and Build/Release management.
- Used Stackdriver and AWS cloud monitoring extensively to monitor and debug the cloud-based AWS EC2 services.
- Installed, configured multiple operating systems onsite and provisioned similar instances on AWS cloud.
- Handled operations and maintenance support for AWS cloud resources which includes launching, maintaining and troubleshooting EC2 instances, S3 buckets, Virtual Private Clouds (VPC), Elastic Load Balancers (ELB) and Relational Database Services (RDS).
- Hands-on experience with the build & deployment phase and the use of continuous integration (CI/CD) tools: build configuration, change history for releases, maintenance of the build system, automation & smoke-test processes, and managing, configuring and maintaining source control management systems.
- Implemented multiple CI/CD pipelines as part of the DevOps role for our on-premises and cloud-based software using Jenkins, Chef and AWS/Docker.
- Extensively worked on Hudson, Jenkins and TeamCity for continuous integration and end-to-end automation of all builds and deployments.
- Created and managed a Docker deployment pipeline for custom application images in the cloud using Jenkins.
- Worked on creation of Docker containers and Docker consoles for managing the application life cycle.
- Used Docker to simplify defining and creating applications or services by encapsulating them in containers.
- Used Docker containers to eliminate sources of friction between development and operations.
- Automated application deployment in the cloud using Docker with the Elastic Container Service (ECS) scheduler.
- Installed a Docker Registry for local upload and download of Docker images, in addition to pulling from Docker Hub.
- Performed source code management using Git from a master repository; knowledge of container management and image creation using Docker.
- Created and maintained various DevOps related tools for the team such as provisioning scripts, deployment tools and staged virtual environments using Docker and Vagrant.
- Used Debian-based Linux servers to install, monitor and debug Docker-based services.
- In-depth understanding of the principles and best practices of Software Configuration Management (SCM).
- Designed and implemented fully automated server build management, monitoring and deployment by using Ansible playbooks and Modules.
- Implemented infrastructure automation through Ansible for auto-provisioning, code deployments, software installation and configuration updates.
- Worked on provisioning different environments using Chef, Puppet and other Configuration management tools.
- Developed Chef cookbooks, recipes, roles and data bags to automate deployment-related services.
- Installed and configured an automated tool Puppet that included the installation and configuration of the Puppet master, agent nodes and an admin control workstation.
- Experience in using version control tools like Subversion (SVN), Git, IBM ClearCase UCM and PVCS.
- Worked with development engineers to ensure automated test efforts were tightly integrated with the build system, and fixed errors during builds and deployments.
- Exposed to all aspects of the software development life cycle (SDLC): analysis, planning, development, testing, implementation and post-production analysis.
- Experience in using Nexus and Artifactory Repository Managers for Maven builds.
- Excellent experience in documenting and automating the build and release process.
- Experience in using bug tracking systems like JIRA, Remedy and HP Quality Center.
- Proficient in tracing complex build problems, release issues and environment issues in a multi-component environment.
- Extensively used build utilities like Maven and Ant for building JAR, WAR, BAR and EAR files.
- Monitored the servers & applications using Nagios, and Splunk.
- Implemented detailed systems and services monitoring using Nagios and Zabbix for AWS cloud resources.
- Automated processes with custom-built Python & shell scripts.
- Delivered projects using Agile methodology.
- Experienced with RESTful API of Elasticsearch to analyze, search and visualize data.
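The Elasticsearch analysis work above can be sketched as a small helper that builds a query-DSL body for the REST API. This is a minimal illustration, not production code; the index and field names are hypothetical.

```python
import json

def build_search_query(field, term, size=10):
    """Build an Elasticsearch query-DSL body matching `term` in `field`,
    newest documents first (the @timestamp field is an assumption)."""
    return {
        "size": size,
        "query": {"match": {field: term}},
        "sort": [{"@timestamp": {"order": "desc"}}],
    }

# The body would be POSTed to http://<host>:9200/<index>/_search, e.g.
# requests.post("http://localhost:9200/app-logs/_search",
#               json=build_search_query("message", "timeout"))
print(json.dumps(build_search_query("message", "timeout", size=5)))
```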
TECHNICAL SKILLS
Operating Systems: Windows 98/XP/2000/2003/7/8/10, Linux RHEL 5.x/6.x, CentOS 5.x/6.x, Ubuntu, Debian, Mac OS, Solaris
Database: Oracle 10g/11g/12c, MS SQL Server 2008/2012/2014, DB2, MySQL, PostgreSQL
Programming languages: SQL and Oracle PL/SQL.
Source Control Tools: Subversion, GIT
Continuous Integration Tools: Jenkins, Bamboo
Configuration Tools: Chef, Puppet, Ansible, Vagrant
Containerization Tools: Docker, Docker Swarm
Build Tools: Ant, Maven
Monitoring Tools: Nagios, REST API, ServiceNow, Splunk
Bug Tracking Tools: JIRA, Rally, Remedy and IBM ClearQuest
Web/App servers: WebLogic, Apache Tomcat, JBoss
Script Languages: Shell scripting, Perl scripting, Ruby, Python
Others: Oracle Data Integrator (ETL tool), Agile methodology, IaaS, SaaS, PaaS
PROFESSIONAL EXPERIENCE
DevOps Engineer
Confidential, Phoenix, AZ
Responsibilities:
- Strong knowledge and experience on Amazon Web Services (AWS) Cloud services like EC2, S3, EBS, RDS, VPC, and IAM.
- Designed and managed public/private cloud infrastructures using Amazon Web Services (AWS) which include EC2, S3, Cloud Front, Elastic File System, RDS, VPC, Direct Connect, Route53, CloudWatch, Cloud Trail, Cloud Formation, and IAM which allowed automated operations.
- Launched Amazon EC2 cloud instances using Amazon Web Services (Linux/Ubuntu) and configured instances for specific applications.
- Created S3 buckets, managed S3 bucket policies, and utilized S3 buckets and Glacier for storage and backup on AWS.
- Created CloudWatch alerts for instances and used them in Auto Scaling launch configurations.
- Implemented and maintained the monitoring and alerting of production and corporate servers/storage using AWS CloudWatch.
- Created customized AMIs from existing AWS EC2 instances using the create-image functionality, and used these snapshots for disaster recovery.
- Worked on AWS CloudWatch for monitoring the application infrastructure, used AWS email services for notifications, and configured S3 versioning and lifecycle policies to back up files and archive them in Glacier.
- Dockerized CI/CD tools (Jenkins and GitLab).
- Implemented a Continuous Delivery pipeline with Docker and AWS.
- Experience with container-based deployments using Docker, working with Docker images, Docker Hub and Docker registries.
- Worked on creation of Docker containers and Docker consoles for managing the application life cycle.
- Used Docker to simplify defining and creating applications or services by encapsulating them in containers.
- Used Docker containers to eliminate sources of friction between development and operations.
- Used Docker Machine to provision Docker hosts across virtualized systems.
- Automated application deployment in the cloud using Docker with the Elastic Container Service (ECS) scheduler.
- Worked on Kubernetes to provide platform-as-a-service on private and public clouds, including VMware Cloud.
- Integrated Kubernetes with network, storage, and security to provide comprehensive infrastructure and orchestrated container across multiple hosts.
- Worked on Deployment Automation of all microservices to pull image from private Docker registry and deploy to Kubernetes Cluster.
- Extensively worked on Ansible playbooks with Ansible roles. Created inventories in Ansible for automating continuous deployment. Configured servers, deployed software, and orchestrated continuous deployments or zero-downtime rolling updates.
- Experienced with Ansible Tower for managing complex network deployments by adding control, knowledge and delegation to Ansible powered environments.
- Enhanced automation for repeatable, consistent configuration management using Ansible-based YAML scripts.
- Designed and implemented fully automated server build management, monitoring and deployment by using Ansible playbooks and Modules.
- Wrote Ansible playbooks with Python SSH wrappers to manage configurations of AWS nodes, and tested playbooks on AWS instances using Python scripts.
- Designed and documented CI/CD tools configuration management.
- Responsible for orchestrating CI/CD processes by responding to Git triggers, human input, and dependency chains and environment setup.
- Implemented Jenkins CI/CD Pipeline flow for different projects by creating multiple stages like build, integration, test, stage and production.
- Generated the Jenkins pipeline framework and wrote Jenkinsfiles to create build, test and deployment pipelines across different application environments.
- Established and maintained the setup to build and deploy applications to the AWS cloud.
- Worked on a project generator that creates Jenkinsfiles and seed jobs in Jenkins with job definitions; also configured webhooks so that commits in Bitbucket trigger jobs in Jenkins.
- Implemented CI and CD for databases using Jenkins and uDeploy.
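As a sketch of the remote-trigger side of such Jenkins pipelines, a parameterized build can be queued with a POST to the job's `buildWithParameters` endpoint. The server URL, job name, parameter and token below are hypothetical, and a real server also requires credentials and a CSRF crumb.

```python
from urllib.parse import urlencode

def build_trigger_url(base, job, params=None, token=None):
    """Build the Jenkins remote-build URL for `job`; a POST to this URL
    (with whatever auth/crumb the server requires) queues a build."""
    url = f"{base.rstrip('/')}/job/{job}/"
    url += "buildWithParameters" if params else "build"
    query = dict(params or {})
    if token:
        query["token"] = token
    return url + ("?" + urlencode(query) if query else "")

# Hypothetical server, job and trigger token:
url = build_trigger_url("https://jenkins.example.com", "deploy-app",
                        params={"ENV": "stage"}, token="s3cret")
print(url)
```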
Environment: Ansible, Docker, ECS, Kubernetes, Apache, VPC, NAT, LAMP, AWS (EC2, S3, ELB, RDS, DynamoDB, SES, SQS, SNS, Route53, VPC, Auto Scaling, CloudFormation), IaaS, SaaS, PaaS, CI/CD, SVN, Ruby, Python, GitHub, YAML, JIRA, Maven, Jenkins and Agile.
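The microservice deployments from a private Docker registry onto Kubernetes described in this role might use a Deployment object along these lines. This is a minimal sketch; the registry host and pull-secret name are assumptions.

```python
import json

def deployment_manifest(name, image, replicas=2, registry="registry.example.com"):
    """Build a Kubernetes apps/v1 Deployment that pulls `image` from a
    private registry, authenticating with a pre-created pull secret."""
    return {
        "apiVersion": "apps/v1",
        "kind": "Deployment",
        "metadata": {"name": name},
        "spec": {
            "replicas": replicas,
            "selector": {"matchLabels": {"app": name}},
            "template": {
                "metadata": {"labels": {"app": name}},
                "spec": {
                    "containers": [{"name": name, "image": f"{registry}/{image}"}],
                    # hypothetical docker-registry secret created beforehand
                    "imagePullSecrets": [{"name": "registry-cred"}],
                },
            },
        },
    }

print(json.dumps(deployment_manifest("orders", "orders:1.4.2")))
```

Dumped to YAML or JSON, the object can be applied to the cluster with `kubectl apply -f`.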
DevOps Engineer
Confidential, Chevy Chase, MD
Responsibilities:
- Implemented nightly builds on Jenkins and automated various scopes of testing on Jenkins.
- Implemented rapid-provisioning and lifecycle management for Linux using custom Bash scripts.
- Installed, Configured and Administered Hudson/Jenkins Continuous Integration Tool.
- Developed automation framework for Application Deployments to the Hadoop environments.
- Performed Branching, Tagging, Release Activities on Version Control Tools: GIT and GITLAB.
- Developed shell scripts to automate the build and release process, and custom scripts to monitor repositories and server storage.
- Used Maven as build tool on Java projects for the development of build artifacts on the source code.
- Performed and deployed Builds for various Environments like Dev, Test, QA, Onboarding and Productions Environments.
- Responsible for defining branching & merging strategy, check-in policies, improving code quality, automated gated check-ins, and defining backup and archival plans.
- Enabled continuous delivery through deployment into several environments (development, test and production) using Maven and SonarQube.
- Automated Hadoop Sqoop deployments from production to the development environments.
- Created Python scripts to automate AWS services, including web servers, ELB, CloudFront distributions, databases, EC2 and database security groups, S3 buckets and application configuration; these scripts create stacks, single servers, or join web servers to stacks.
- Automated the cloud deployments using Puppet, Python and AWS CloudFormation templates.
- Created scripts in Python which Integrated with Amazon API to control instance operations.
- Maintained Build Related scripts developed in ANT, Python and Shell. Modified build Configuration files including Ant's build.xml.
- Converted production support scripts to Chef recipes.
- Implemented Packer based scripts for continuous integration with the Jenkins server and deployed packer based scripts on to the Amazon EC2 instances.
- Implemented AWS solutions using EC2, S3, RDS, EBS, Elastic Load Balancer, Auto scaling groups, Optimized volumes and EC2 instances.
- Involved in automated EBS- and chroot-based deployments onto AWS EC2 instance servers.
- Configured Nagios to monitor EC2 Linux instances with puppet automation.
- Configured Apache webserver in the Linux AWS Cloud environment using Puppet automation.
- Extensive knowledge on writing and deploying modules in puppet.
- Hands on Experience in AWS Cloud in various AWS Services such as Redshift Cluster, Route 53 Domain configuration.
- Configured Elastic Load Balancers with EC2 Auto scaling groups.
- Worked on creation of various subscriptions and topics using SNS and SQS based services and automated the complete deployment environment on AWS.
- Conceived, Designed, Installed and Implemented Chef configuration management system.
- Created and updated Chef manifests and modules, files, and packages.
- Automated the cloud deployments using Chef and AWS cloud formation templates.
- Implemented rapid-provisioning and lifecycle management for Ubuntu Linux using Amazon EC2, Chef, and custom Bash scripts.
- Developed automation scripting in Python (core) using Chef to deploy and manage Java applications across Linux servers.
- Worked on version control setups like Git and integration tools like Jenkins.
- Developed automation framework for Application Deployments to the cloud environments.
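The Python scripts integrating with the Amazon API to control instance operations could be sketched as below. Only the filter helper is exercised here; the `start_env` call needs real AWS credentials, and the `Environment` tag key is an assumption.

```python
def tagged_instance_filters(env, state="stopped"):
    """Build boto3 describe_instances filters selecting instances
    tagged Environment=<env> in the given state (tag key is hypothetical)."""
    return [
        {"Name": "tag:Environment", "Values": [env]},
        {"Name": "instance-state-name", "Values": [state]},
    ]

def start_env(env, dry_run=True):
    """Start every stopped instance in `env`; requires AWS credentials."""
    import boto3  # imported here so the filter helper works without boto3
    ec2 = boto3.client("ec2")
    resp = ec2.describe_instances(Filters=tagged_instance_filters(env))
    ids = [i["InstanceId"]
           for r in resp["Reservations"] for i in r["Instances"]]
    if ids:
        ec2.start_instances(InstanceIds=ids, DryRun=dry_run)
    return ids

print(tagged_instance_filters("dev"))
```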
Environment: Ubuntu, Hadoop, Jenkins, Maven, Docker, Python, Shell, VMware ESXi, Java, Ant, Hudson, Git, Windows, JIRA, YAML, Ansible, Packer, Chef, AWS, SonarQube, IaaS, SaaS, PaaS, Nagios, Apache webserver, JBoss, Apache JMeter, Ruby and Agile.
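The SNS topics and SQS subscriptions automated in this role need a queue policy that lets the topic deliver messages. A sketch of building that policy follows; the ARNs are hypothetical, and attaching it would use `sqs.set_queue_attributes` plus `sns.subscribe` with boto3.

```python
import json

def sqs_allow_sns_policy(queue_arn, topic_arn):
    """Build an SQS access policy allowing one SNS topic to SendMessage."""
    return {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Principal": {"Service": "sns.amazonaws.com"},
            "Action": "sqs:SendMessage",
            "Resource": queue_arn,
            "Condition": {"ArnEquals": {"aws:SourceArn": topic_arn}},
        }],
    }

# With boto3: sqs.set_queue_attributes(QueueUrl=..., Attributes={"Policy":
# json.dumps(policy)}) and sns.subscribe(TopicArn=topic_arn, Protocol="sqs",
# Endpoint=queue_arn). ARNs below are hypothetical.
policy = sqs_allow_sns_policy("arn:aws:sqs:us-east-1:123456789012:deploy-q",
                              "arn:aws:sns:us-east-1:123456789012:deploy-t")
print(json.dumps(policy))
```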
DevOps Engineer
Confidential, Tampa, FL
Responsibilities:
- Administered large-scale server environments consisting of over 800 RHEL 5/6 VMware VMs running multiple technologies including Apache, JBoss, Memcached, MySQL, Postfix, ActiveMQ and Python.
- Implemented a continuous delivery framework using Puppet, Bamboo and OpenStack in Linux environments.
- Managed Roles and profiles for various technology stacks in Puppet.
- Maintained and enhanced existing Puppet modules to be deployed across various providers and deployment architectures.
- Troubleshooting, event inspection and reporting of various Puppet issues and starting/restarting of Puppet enterprise services.
- Deployed the Java applications into web application servers like JBoss.
- Performed and deployed Builds for various Environments like QA, Integration, UAT and Productions Environments.
- Worked on the installation and configuration of the monitoring tool Nagios.
- Implemented Nagios core/XI for monitoring Infrastructure resources.
- Responsible for User Management and End-to-End automation of Build and Deployment process using Jenkins and uDeploy.
- Strong knowledge / experience in creating Jenkins CI pipelines, and troubleshooting issues along the CI/CD Pipelines.
- Expert in performance monitoring tools like AWS CloudWatch and Stackdriver.
- Provided installation & maintenance of Puppet infrastructure and developed Puppet Modules & Manifests for configuration management.
- Used Puppet to deploy and manage Java applications across Linux servers.
- Worked on the creation of Puppet manifest files to install tomcat instances and to manage Configuration files for multiple applications.
- Developed Chef Cookbooks to install and configure Apache, Tomcat, Splunk, Jenkins, WebLogic and Deployment automation.
- Supporting Development team to implement the process of automation of builds and deployment using chef.
- Created various cookbooks for automating network configurations in chef-server.
- Used Ruby, the language Chef is built on, to customize Chef when needed.
- Wrote recipes in Ruby to customize Chef for our environment.
- Implemented Chef to deploy the builds for dev, QA and production.
- Used Chef to enable web IT, providing support for managing cloud infrastructure.
- Used Chef to speed up application deployment, including creating a continuous deployment pipeline.
- Set up customized monitoring with Nagios, & PNP4Nagios Graphs for the legacy and new environments.
- Automated Nagios services for database server, web-server, application-server, networks, file sizes, RAM utilization, Disk performances using Python script in Chef.
- To achieve Continuous Delivery goal on high scalable environment, used Docker coupled with load-balancing tool Nginx.
- Bootstrapped automation scripting for virtual servers using VMware clusters.
- Worked on Apache and Firewalls in both development and production.
- Created Python scripts to fully automate AWS services, including web servers, databases, EC2 and database security groups.
- Planned, designed, reviewed and maintained Python-based applications and solutions.
- Using Puppet, deployed and configured Elasticsearch, Logstash and Kibana (ELK) for log analytics, full-text search and application monitoring, in integration with AWS Lambda and CloudWatch.
- Worked on the migration from physical servers to the cloud (AWS), and used Puppet to automate the infrastructure in AWS by creating EC2, S3, RDS, VPC, uDeploy, Route 53 and PostgreSQL resources.
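The Nagios service checks automated via Python in this role can be sketched as a small generator of `define service` blocks. The NRPE command and contact group below are assumptions for illustration.

```python
def nagios_service(host, description, check_command, contacts="admins"):
    """Render a Nagios 'define service' object for one host check."""
    return (
        "define service {\n"
        f"    host_name            {host}\n"
        f"    service_description  {description}\n"
        f"    check_command        {check_command}\n"
        f"    contact_groups       {contacts}\n"
        "    use                  generic-service\n"
        "}\n"
    )

# e.g. a disk check via NRPE on a database server (hypothetical host):
print(nagios_service("db01", "Disk Usage", "check_nrpe!check_disk"))
```

Writing many such blocks into a `.cfg` file under the Nagios objects directory and reloading Nagios picks up the new checks.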
Environment: Agile method, AWS, Git, Puppet, Nexus, Tomcat, SQL, PostgreSQL, uDeploy, Gerrit, Stash, Bash, Bitbucket, SSH, Deployment Automation, REST API, Maven, Docker, Jira, Python, Ruby, Shell scripts, Jenkins.
Oracle DBA/DevOps Engineer
Confidential, Jacksonville, FL
Responsibilities:
- Worked on Managing the Private Cloud Environment using Groovy Scripting.
- Puppet automation: installed and configured Puppet 3.x server and agent setup; developed automation for IHS, WebSphere MQ 7.0, WebSphere Application Server, Apache 4.x/5.x, Jenkins and Foreman.
- Wrote manifest YAML for installing and managing Java version files.
- Monitored and maintained a log for system status and health using Linux commands and Nexus system.
- Worked with a complex environment on Red Hat Linux and Windows Servers while ensuring that these systems adhere to organizational standards and policies.
- Configuration and maintenance of a two-node RAC on Oracle 10g.
- Configuration of ASM with normal and external redundancy.
- Restoration of databases from Critical failures.
- Responsible for scheduling backups.
- Taking Incremental backup using RMAN.
- Responsible for refreshing and cloning of databases per customer requirements using RMAN, Data Pump, and hot backup scripts
- Creating and maintenance of databases using DBCA and manual scripts.
- Configured and implemented Oracle Streams (Advanced Replication) for high availability.
- Performed Schema and Table level Replication using Oracle Streams.
- Tuning performance using Advisors, AWR, ADDM, SQL tuning etc.
- Setup oracle streams for replication of selected tables
- Upgrading databases from 10.0.1.0 to 10.0.4.0 using Patch sets.
- Alert monitoring and Ticketing.
- Administered DB2 z/OS v9.x/8.x/7.x/6.x on IBM mainframe and AIX DB2 environments.
- Supported and administered DB2 Q replication between DB2 z/OS and distributed environments.
- Implemented backup procedures and jobs using crontab.
- Performed activities such as Query Optimization, Database Consistency Monitoring.
- Creating roles and granting privileges to the users.
- Expertise in building and running a standalone TEST database with a copy of SWISSLOG from the WMS site.
- Responsible for planning, testing and implementing different backup and recovery strategies.
- Scheduling Oracle Jobs in the database.
- Managing table spaces, user profiles, indexes, storage parameters.
- Involved in developing and implementing PL/SQL stored procedures and triggers.
- Responsible to provide 24 x 7 database administrative activities over different database servers.
Environment: Oracle 10.2.0.3/11.1.0.6/.7, RHEL 4.x, Solaris 10, Windows, Oracle Enterprise Manager Database Control, SVN, Nexus, Tomcat, Apache, Linux, Git, Jenkins, Maven, JIRA, Python, OpenStack, Artifactory, Shell scripts, Bash, ASM, Data Guard and Agile.
Oracle DBA
Confidential
Responsibilities:
- Oracle Database software installation, Instance creations, schema creations.
- Supported different Databases like Production, Development, Testing, and Data warehouse.
- Upgraded Databases from Oracle 9i to Oracle 10g on Solaris.
- Installed and configured Oracle OEM (Oracle Enterprise Manager) Grid Control and OEM Database Control to monitor databases.
- Involved in maintaining critical interfaces with MMIS NCTRACKS using the Curam batch framework (Curam V6).
- Thorough understanding of how the Curam batch framework works for the Recipient File and the Provider Mass Change process.
- Created and Managed users, user profiles, table spaces, data files, archive log.
- Created database links that are required for databases.
- Experience in Data conversion using RMAN in the migration process of Oracle databases in Cross platforms.
- Worked with a team for installation and maintenance of ORACLE 10g RAC, ASM database for high availability.
- Well versed in using the CRSCTL and SRVCTL tools for maintaining the database, ASM and node applications.
- Created physical standby database and worked with data guard to provide Disaster Recovery.
- Performed troubleshooting of database using AWR and ADDM reports.
- Extensively used SQL*Loader as a tool for loading data.
- Performed replications using Oracle Streams.
Environment: OEM Grid Control, Oracle 9i, 10g RAC, Solaris 7, RMAN, SQL*Loader, Import & Export, Data Pump, Data Guard, AWR, ADDM, Statspack, ASM, TOAD 7.4, Korn shell scripting, Data warehouse, AIX 4.3 etc.