DevOps Engineer Resume
Malvern, PA
SUMMARY
- DevOps engineer with experience in building and administering cloud infrastructure. Experienced in managing large-scale Linux servers and applications. Proficient with all major cloud platforms and container technologies.
- Technical expertise in CI/CD pipelines and DevOps automation tools. Well versed in the installation, configuration, and maintenance of applications on Linux in physical and virtual environments.
- IT Professional with 7 years of experience as a System Administrator, Build and Release/DevOps Engineer in automating, building, deploying, and releasing of code in various environments.
- Over 4 years of experience as a DevOps Engineer with configuration management tools such as Chef, Puppet, Ansible, and Docker; continuous integration using Jenkins with ANT and Maven build tools; version control using Git and SVN; deployments to cloud infrastructure on AWS and on-premises virtualization using VMware; and extensive programming in Python, Ruby, Perl, and shell scripting.
- Experience in planning, installation, configuration and migration of systems; along with maintenance and deployment of Linux RHEL 5.x/6.x, Windows 2008 and 2012.
- Build and troubleshooting experience of Red Hat Linux, Windows Server 2008 & 2012 hosted on VMware ESXi Servers and AWS.
- Experience in installation, configuration, and administration of VMware ESXi, vCenter, RDS, HA clustering, Sun VirtualBox, and Microsoft Virtual PC.
- Extensively worked on Jenkins/Hudson, installing, configuring, and maintaining it for continuous integration (CI) and end-to-end automation of all builds and deployments.
- Hands-on experience in AWS cloud administration, including AMI, EC2, ELB, S3, and Route 53 domain configuration.
- Experience in creating the company's DevOps strategy in a mixed environment of Linux (RHEL, Ubuntu) servers, along with implementing a cloud strategy based on AWS.
- Expertise in managing EC2 instances, EBS, and RDS on the Amazon Web Services (AWS) platform using the Chef configuration management tool.
- Experience with Chef Enterprise as well as on-premises Chef: installed workstations, bootstrapped nodes, wrote recipes and cookbooks and uploaded them to the Chef server, and managed on-site OS/applications/services/packages as well as AWS EC2/S3/Route 53/ELB using Chef cookbooks.
- Wrote Chef cookbooks and recipes to provision several pre-prod environments, covering Cassandra DB installation, WebSphere installation, and profile creation.
- Developed OpenStack infrastructure with automation and configuration management tools such as Ansible and Puppet, or custom-built cloud-hosted applications, and used Ansible for continuous integration.
- Experience in deploying and maintaining private cloud infrastructure of OpenStack. Proficient in tracing complex build problems, release issues and environment issues in a multi-component environment like OpenStack.
- Experienced in creating puppet manifests and modules to automate system operations.
- Highly efficient in installing, configuring, and implementing RAID technologies using tools such as VxVM and SVM.
- Experience with container-based deployments using Docker, working with Docker images, Docker Hub, and Docker registries.
- Containerized servers on AWS using Docker; created Dockerfiles and kept them under version control.
- Experienced in implementing and maintaining branching and build/release strategies using Git and Subversion (SVN); involved in periodic archiving and storage of source code for disaster recovery.
- Experienced in Administration and managing the source code control of multiple development efforts using Clear Case, Subversion, TFS, Git and SVN version control tools.
- Experience with Visual Studio Team Services (VSTS), a.k.a. Azure DevOps, workflows and pipelines (or equivalent CI/CD orchestration toolsets).
- Worked with the development team and key stakeholders to create a plan for monitoring Azure resources.
- Developed and managed storage infrastructure for Azure.
- Good understanding of Azure storage infrastructure, including securing storage accounts.
- Configuration and administration of Fibre Channel adapters and handling the Linux side of the SAN (Hitachi SAN arrays).
- Performed regular software release build and deployment based on defined process and procedure, including J2EE, UNIX Scripts, Oracle PL/SQL build and deployment. Managed, maintained and deployed to test, acceptance and PROD environments.
- Good experience setting up and configuring continuous build processes using Build Forge, CruiseControl, Hudson, Jenkins, Maven, Ant, NAnt, MSBuild, Subversion, ClearCase, and Perl.
- Extensive experience using MAVEN and ANT as build tools for the building of deployable artifacts (jar, war & ear) from source code.
- Expertise in detecting network outages and protocol failures using the Nagios monitoring system; also experienced in configuring other monitoring tools such as Splunk, SiteScope, and CloudWatch.
- Ability to resolve issues quickly, able to identify tasks which should be automated and then write scripts to automate them.
- Good experience with Red Hat, EMC storage devices, disaster recovery (including data recovery), Hitachi storage devices, and VERITAS NetBackup 6.
- Participated in entire Software Development Life Cycle (SDLC) including Requirement Analysis, Design, Development, Testing, Implementation, Documentation and Support of software applications.
- Experience in Agile Methodology, deploying applications, Load Balancing and Fail over functionality in a clustered environment.
- Experienced in writing Python scripts to support the WebLogic Scripting Tool (WLST).
- Have extensive experience in building and deploying applications on Web/Application Servers like JBoss, WebLogic, IBM WebSphere, GlassFish and Tomcat.
- Managed deployments to internally and externally accessible QA, SIT, demo, and production systems.
- Extensive experience in managing and configuring secured environments using SSL, Mutual Authentication and Digital Certificates.
- Strong Knowledge in networking (Switching, routing, Firewall, DNS, TCP/IP, HTTP, SSL).
- In-depth understanding of the principles and best practices of Software Configuration Management (SCM) processes, including compiling, packaging, deploying, and application configuration.
- Hands-on Experience of the J2EE Framework and its components as related to Java build, test, deployment and release management initiatives.
- Deep understanding of Layer 7 protocols such as HTTP, DHCP, DNS, and SSL/TLS.
- Experienced in querying RDBMSs such as Oracle, MySQL, and SQL Server using SQL for data integrity.
- Experienced in First tier escalation for critical system issues and user support via 24x7 NOC & on-call rotation.
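As an illustration of the automation scripting described above, the following is a minimal Python sketch of a health-check-and-restart script. The use of systemd, the service name, and the injectable `runner` parameter (so the logic can be unit-tested without touching systemctl) are all assumptions for the example, not a specific tool from this resume:

```python
import subprocess

def service_is_active(name, runner=subprocess.run):
    """Ask systemd whether the unit is active (exit code 0 means active)."""
    result = runner(["systemctl", "is-active", "--quiet", name])
    return result.returncode == 0

def ensure_running(name, runner=subprocess.run):
    """Restart the service if it is not active; return the action taken."""
    if service_is_active(name, runner):
        return "ok"
    runner(["systemctl", "restart", name])
    return "restarted"
```

A script like this would typically run from cron and log the action it took.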
TECHNICAL SKILLS
Operating Systems: Red Hat Linux ES & CentOS 4.x/5.x/6.x/7.x, Ubuntu 10.x, Solaris 9/10/11, Windows NT/2000/XP/2003/2008/2012, AIX 7, HP-UX 11.23, Mac.
OS Administration: Red Hat 5.x/6.x/7.x Linux administration, Solaris 9/10 administration
Application servers: WAS 7.x/8.x, JBoss AS 5.x/6.x/7.x, and JBoss EAP 5.x/6.x
E-Mail servers: Sendmail, Postfix, Zimbra
Web Servers: Apache HTTP Server (httpd), Apache Tomcat
Networking: DNS, DHCP, TCP/IP, SMTP, LDAP
Monitoring: Nagios, Splunk, Grafana
Scripting Tools: Bash, Perl, Python, Ruby, PowerShell
Scheduling Tools: Autosys, crontab
Virtualization tools: VMware vSphere, ESX 5.x/6.0
Third Party Tools: Puppet, Chef, Jenkins, various DevOps tools, Git, GitHub
Cloud: AWS
PROFESSIONAL EXPERIENCE
Confidential, Malvern, PA
DevOps Engineer
Responsibilities:
- Experience in Software Integration, Configuration, building, automating, managing and releasing code from one environment to another environment and deploying to servers.
- Responsible for build and deployment automation using Bitbucket, Jenkins, Nexus, Ansible, and Chef.
- Developed Perl and shell scripts for manual deployment of code to various environments on Linux and UNIX.
- Manage configuration of Web application and deploy to AWS cloud server through Ansible.
- Used Ansible for deploying the necessary changes on remote hosts and monitored the process using Ansible Tower.
- Used Ansible Tower to stream playbook runs in real time and view the status of every running job without further page reloads.
- Integration of Maven/Nexus, Jenkins, GIT, Confluence and JIRA.
- Monitored, created alarms and notifications for Linux servers using Splunk.
- Generated workflows through Control-M, scheduling Linux jobs that control file transfer protocols and restarts of on-prem servers.
- Prepared, arranged, and tested Splunk search strings and operational strings.
- Involved in standardizing Splunk forwarder deployment, configuration, and maintenance across UNIX and Windows platforms.
- Decomposed monolithic applications into several microservices, which helped us maintain scalability, deployability, and resilience.
- Responsible for design and maintenance of the Subversion/GIT Repositories, views, and the access control strategies.
- Used Gradle, Xcode, and Maven as build tools on Java, Android, and Swift-based projects to produce build artifacts from source code.
- Good knowledge of AWS EC2, EFS, and S3, as well as CodeCommit, CodePipeline, CodeDeploy, and CodeBuild.
- Worked in ServiceNow to implement changes, raise incidents and tickets, and submit requests.
- Worked with JIRA for ticket management and documentation for our Sprints.
- Worked on WebSphere to manage clustered application server instances.
- Used OEM 12c to effectively manage the Oracle DB via agents and load balancing with OMS.
- Created SQL and PL/SQL scripts (DML and DDL) in Oracle and MySQL databases and versioned them in SVN.
- Worked on creating reports and publishing on Tableau.
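The Ansible-driven deployments above can be sketched as a small Python wrapper around the ansible-playbook CLI. The playbook and inventory file names below are hypothetical; only the standard `-i`, `--limit`, and `--check` flags of ansible-playbook are used:

```python
import subprocess

def playbook_command(playbook, inventory, limit=None, check=False):
    """Assemble the ansible-playbook argument list."""
    cmd = ["ansible-playbook", "-i", inventory, playbook]
    if limit:
        cmd += ["--limit", limit]   # restrict the run to a host pattern
    if check:
        cmd.append("--check")       # dry run: report changes without applying them
    return cmd

def run_playbook(playbook, inventory, **opts):
    """Run the playbook and return the ansible-playbook exit code."""
    return subprocess.call(playbook_command(playbook, inventory, **opts))
```

For example, `run_playbook("deploy.yml", "hosts.ini", limit="web", check=True)` would dry-run the deploy against only the web hosts.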
Environment: Java/J2EE, iOS, Subversion, Ant, Maven, Xcode, Jenkins, uDeploy, Git, Ansible, Splunk, AWS, Python, Ruby, Bitbucket, shell scripting, PuTTY, SSL certs, Confluence, HP QTP, Tableau, Oracle DB, MySQL, and SQL Developer.
Confidential, Raleigh, NC
AWS DevOps Engineer
Responsibilities:
- Responsible for design and maintenance of the Subversion/GIT Repositories, views, and the access control strategies.
- Used Gradle, Xcode, and Maven as build tools on Java, Android, and Swift-based projects to produce build artifacts from source code.
- Responsible for build and deployment automation using Docker and Kubernetes containers and Chef.
- Developed Perl and shell scripts for manual deployment of code to various environments on Linux and UNIX.
- Managed configuration of web applications and deployed them to AWS cloud servers through Chef.
- Worked on Chef cookbooks/recipes to automate infrastructure as code.
- Used Ansible for deploying the necessary changes on remote hosts and monitored the process using Ansible Tower.
- Monitored, created alarms and notifications for Linux servers using Splunk.
- Used Ansible Tower to stream playbook runs in real time and view the status of every running job without further page reloads.
- Integration of Maven/Nexus, Jenkins, GIT, Confluence and JIRA.
- Implemented AWS solutions using EC2, S3, Redshift, Lambda, RDS, EBS, Elastic Load Balancing, Auto Scaling groups, SNS, optimized volumes, and CloudFormation templates.
- Worked on Amazon API gateway in creating endpoints for various applications.
- Implemented DNS service (Route 53) in effectively coordinating the load balancing, fail-over and scaling functions.
- Understanding of secure cloud configuration (CloudTrail, AWS Config) and networking services (VPC, security groups, VPN, etc.); created IAM roles.
- Experienced with AWS developer tools such as CodeCommit, CodePipeline, CodeDeploy, and CodeBuild.
- Configured S3 versioning and lifecycle policies to back up files and archive them to Glacier.
- Monitored and created alarms and notifications for EC2 hosts using CloudWatch.
- Generated workflows through Apache Airflow and Apache Oozie to schedule Hadoop jobs that control large data transformations.
- Implemented Hadoop clusters for processing big data pipelines using Amazon EMR and Cloudera, relying on Apache Spark for fast processing and API integration, and managed these resources with Apache Mesos.
- Created and moved the data between different AWS compute (EC2, Lambda) and storage services (S3, DynamoDB, EMR), as well as on-premise data sources using Data Pipeline.
- Implemented AWS solutions using EC2, S3, Redshift, Lambda, RDS, EBS, Elastic Load Balancing, Auto Scaling groups, optimized volumes, and EC2 instances.
- Worked on RESTful APIs using Node.js and Spring MVC for communicating between applications or systems.
- Used the JBoss server to manage and administer various J2EE applications.
- Decomposed monolithic applications into several microservices, which helped us maintain scalability, deployability, and resilience.
- Implemented AppDynamics for testing the performance of applications/server/protocols.
- Worked with JIRA and Alfresco for ticket management and documentation for our Sprints.
- Worked on WebLogic to manage clustered server instances.
- Used OEM 12c to effectively manage the Oracle DB via agents and load balancing with OMS.
- Created SQL and PL/SQL scripts (DML and DDL) in Oracle and MySQL databases and versioned them in SVN.
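The S3 versioning and lifecycle work above boils down to rules like the following. This Python sketch builds one rule in the JSON shape accepted by S3's lifecycle-configuration API; the prefix and day counts are illustrative:

```python
import json

def lifecycle_rule(prefix, glacier_after_days, expire_after_days):
    """Build one S3 lifecycle rule: transition matching objects to Glacier
    after N days, then expire them. The dict shape follows S3's
    lifecycle-configuration API; the values here are example choices."""
    return {
        "ID": "archive-" + prefix.strip("/"),
        "Filter": {"Prefix": prefix},
        "Status": "Enabled",
        "Transitions": [
            {"Days": glacier_after_days, "StorageClass": "GLACIER"}
        ],
        "Expiration": {"Days": expire_after_days},
    }

# Example: move objects under backups/ to Glacier after 30 days,
# delete them after a year.
config = {"Rules": [lifecycle_rule("backups/", 30, 365)]}
print(json.dumps(config, indent=2))
```

A dict like `config` is what would be passed to the put-bucket-lifecycle-configuration call (e.g. via boto3 or the AWS CLI).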
Environment: Java/J2EE, iOS, Subversion, Ant, Maven, Xcode, Jenkins, uDeploy, Git, SVN, Chef, Puppet, Ansible, CloudWatch, AWS, Python, Ruby, Alfresco, shell scripting, PuTTY, SSL certs, Confluence, HP QTP, Selenium, Oracle DB, MySQL, and SQL.
Confidential, New York, NY
DevOps Engineer
Responsibilities:
- Implemented CI/CD pipeline using tools Jenkins, Ansible, Chef, Docker, Nagios etc.
- Created an automated tool for the Facebook data center simulation to run on the AWS cloud.
- Created an automated tool to launch instances and deploy jobs to them using Terraform.
- Monitored instance details using Grafana and the Prometheus database.
- Used SonarQube for static code analysis.
- Built an ELK stack (Elasticsearch, Logstash, Kibana) for monitoring log information.
- Extensively used EC2 instances to test large network simulation jobs.
- Good knowledge of AWS EC2, EFS, and S3.
- Deployed applications on AWS and used MPI for parallel execution in Python.
- Implemented scripts in Python and Bash to automate execution of the model.
- Created large network models for application development.
- Used Python Unit test framework for developing and implementing the unit tests using test driven approach.
- Hands-on experience with Docker to deploy jobs for various customers.
- Used AWS Lambda and S3 in live production.
- Experience with Visual Studio Team Services (VSTS), a.k.a. Azure DevOps, workflows and pipelines (or equivalent CI/CD orchestration toolsets).
- Worked with the development team and key stakeholders to create a plan for monitoring Azure resources.
- Developed and managed storage infrastructure for Azure.
- Good understanding of Azure storage infrastructure, including securing storage accounts.
- Worked on both GitHub and Bitbucket.
- Used Nginx as the load balancer.
- Followed Agile and continuous delivery methodologies.
- Used Git and Bitbucket as version control tools.
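The test-driven development mentioned above (Python's unit test framework) can be illustrated with a small, self-contained example; the `chunk` helper is a hypothetical stand-in for code under test:

```python
import unittest

def chunk(items, size):
    """Split a list into consecutive chunks of at most `size` items
    (a hypothetical helper standing in for code under test)."""
    if size < 1:
        raise ValueError("size must be >= 1")
    return [items[i:i + size] for i in range(0, len(items), size)]

class ChunkTests(unittest.TestCase):
    def test_even_split(self):
        self.assertEqual(chunk([1, 2, 3, 4], 2), [[1, 2], [3, 4]])

    def test_remainder_in_last_chunk(self):
        self.assertEqual(chunk([1, 2, 3], 2), [[1, 2], [3]])

    def test_rejects_non_positive_size(self):
        with self.assertRaises(ValueError):
            chunk([1], 0)
```

In the TDD workflow the three tests are written first, fail, and then drive the implementation; they run with `python -m unittest`.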
Environment: AWS, Terraform, Grafana, Nexus, ELK stack, SonarQube, Ubuntu, Bash, Python, YAML, JSON, Ansible, Jenkins, Docker, Chef, Nagios, GitHub, Bitbucket, S3, AWS Lambda, CloudFormation, CloudWatch, Nginx.
Confidential, Boise, ID
DevOps Engineer
Responsibilities:
- Experience in Software Integration, Configuration, building, automating, managing and releasing code from one environment to another environment and deploying to servers.
- Extensive experience configuring Amazon EC2, Amazon S3, Amazon Elastic Load Balancing, IAM, and security groups in public and private subnets.
- Configured AWS Route 53 to route traffic between different regions, and created alarms and notifications for EC2 instances using CloudWatch.
- Used CloudFront to deliver content from AWS edge locations to users, allowing further reduction of load on front-end servers.
- Extensively worked on Jenkins CI/CD pipeline jobs for end-to-end automation to build, test and deliver artifacts and troubleshoot the build issue during the Jenkins build process.
- Integrated Jenkins with GitHub repository and Maven build tool and created different environments like Dev, QA, Stage and Prod on Jenkins.
- Implemented the Jenkins CodeDeploy plugin to deploy to AWS, and used it to automate the build process and deploy the application to the Tomcat server.
- Used Ansible for configuration management of hosted instances within AWS; configured networking for the Virtual Private Cloud (VPC).
- Worked on requests for ad hoc deployment to a particular environment using the ad hoc deploy plan in Jenkins.
- Experience in writing several Cloud Formation templates to describe the AWS resources and compositions that compose our stack.
- Configured/integrated Jenkins with Bitbucket to pull code, and ANT to generate builds and push artifacts to AWS S3.
- Automated the continuous integration and deployments using Jenkins, Docker, Ansible and AWS Cloud Templates.
- Hands-on experience working with several Docker components like Docker Engine, Hub, Compose, and Docker Registry for storing Docker images and files, running multiple containers in staging and production environments.
- Experience in managing and monitoring all pre-production and production environments for Elasticsearch.
- Have experience in diagnosing root cause and resolving ELK and platform issues.
- Worked to setup Jenkins as a service inside the Docker Swarm cluster to reduce the failover downtime to minutes and to automate the Docker containers deployment without using configuration management tool.
- Have experience in using Confluence, a team collaboration software in the workspace.
- Used AppDynamics for administrative activities such as user management, application management, and monitoring controller performance. Splunk was used for monitoring system logs, essential for finding problems and halting deployment pipelines. Implemented a POC for AppDynamics monitoring along with Splunk to enhance application performance.
- Provided regular support and guidance to Splunk project teams on complex solutions and issue resolution.
- Developed PowerShell 2.0 scripts against the TFS Object Model to enable more repeatable, automated processes and tasks.
- Have worked on deployment automation of all micro-services to pull image from the private Docker registry and deploy to Docker Swarm cluster using Ansible.
- Used RabbitMQ, a message-oriented middleware which uses advanced message queuing protocol for exchanging data between processes, applications and servers.
- Worked with Dynatrace for code-level monitoring of the applications.
- Continuous integration, automated deployment, and management using TeamCity, Puppet, Gradle, JIRA, testing frameworks, code quality tools such as SonarQube, and many other comparable tools based on requirements.
- Deployed .NET applications to application servers in an agile continuous integration environment and also automated the whole process.
- Extensive experience in network management (DNS, NIS, NFS, LDAP, TFTP) and system troubleshooting. Experience automating deployments on servers using JBoss, Tomcat, WebSphere, and the Apache web server.
- Worked on creating a POC for migrating data between Oracle and MariaDB, gaining exposure to Apache NiFi.
- Configured Apache NiFi flows for loading data from non-relational data sources into the raw access layer of HDFS.
- Closely worked with development, QA and other teams to ensure automated test efforts and integrated with the build system and in fixing the error while doing the deployment and building.
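The log-driven pipeline gating described above (Splunk monitoring system logs to halt deployments) can be sketched in a few lines of Python; the ERROR/FATAL pattern and the threshold are illustrative assumptions, not the actual Splunk queries used:

```python
import re

# Illustrative gate: halt the deployment when error-level log lines
# reach a threshold. Pattern and threshold are example choices.
ERROR_RE = re.compile(r"\b(ERROR|FATAL)\b")

def should_halt_pipeline(log_lines, threshold=1):
    """Return True when the count of error-level lines reaches the threshold."""
    hits = sum(1 for line in log_lines if ERROR_RE.search(line))
    return hits >= threshold
```

In practice a check like this runs against the tail of the application log after each deploy stage and fails the Jenkins job when it returns True.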
Environment: .NET, Windows, Ant, Maven, Nagios, Subversion, ELK Stack, Chef, Puppet, PowerShell, ORM, Open Stack, Shell/Perl, Python, SCM, GIT, CVS, TFS, MS Build, Tomcat, Jenkins, Jira, Oracle.
Confidential
Linux/Unix Administrator
Responsibilities:
- Administered and maintained Red Hat 3.0/4.0/5.0/6.0 AS/ES; troubleshot hardware, operating system, application, and network problems and performance issues; deployed the latest patches for Linux and application servers; performed Red Hat Linux kernel tuning.
- Experience in implementing and configuring network services such as HTTP, DHCP, and TFTP.
- Install and configure DHCP, DNS (BIND, MS), web (Apache, IIS), mail (SMTP, IMAP, POP3), and file servers on Linux servers.
- Administered Linux servers for several functions including managing Apache/Tomcat server, mail server, and MySQL databases in both development and production.
- Manage Security, Backup, Disaster Recovery, Performance Monitoring and Fine-tuning on Linux (RHEL) systems.
- Monitoring System Performance of Virtual memory, Managing Swap Space, Disk utilization and CPU utilization.
- Managed system processes and scheduled processes with the cron utility.
- Performing tape backups, archiving and checking data integrity through Shell Scripts and job automation.
- Configure client networks to integrate Windows systems with Linux systems including SAMBA sharing, Print servers, and Router configurations.
- Initiated crisis management calls and provided troubleshooting during outage situations.
- Installed, configured, and maintained DHCP, DNS, NFS, NIS, Sendmail, and LDAP servers.
- User Account Management, Group Account Management, configuring dumb terminals, adding modems, formatting and partitioning disks, manipulating swap, local and remote printer management, restoring backup, scheduling jobs.
- Performance tuning and preventive maintenance, performed daily backup. Diagnosed hardware and software problems and provided solution to them.
- Resolved TCP/IP network access problems for clients. Implemented remote system monitoring with Sun Microsystems tools. Developed, maintained, and updated various Unix shell scripts for services (start, stop, restart, recycle, cron jobs).
- Created Bash, PowerShell, and Perl scripts to monitor system resources and perform system maintenance, along with administrative tasks such as system startup/shutdown and backup strategy.
- Installed and set up Oracle9i on Linux for the development team; worked with the Linux kernel, memory upgrades, and swap areas. Modified the Linux kernel to add tracing probes for a configuration access tool.
- Created users, managed user permissions, and maintained user and file system quotas on Red Hat Linux.
- Responsible for reviewing all open tickets, resolve and close any existing tickets. Monitored trouble ticket queue to attend user and system calls.
- Attended team meetings, change control meetings to update installation progress, and for upcoming changes in environment.
- Updated data in inventory management package for Software and Hardware products.
- Capacity Planning, Infrastructure design and ordering systems. Worked with DBAs on installation of RDBMS database, restoration and log generation.
- Provided 24*7 production support and performed weekend changes during maintenance window.
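The disk-utilization monitoring above can be sketched with the Python standard library; the 90% alerting threshold is an illustrative assumption:

```python
import shutil

def disk_usage_percent(path="/"):
    """Percentage of the filesystem at `path` currently in use."""
    usage = shutil.disk_usage(path)
    return 100.0 * usage.used / usage.total

def needs_alert(percent_used, threshold=90.0):
    """Flag a filesystem whose usage crosses the alerting threshold
    (threshold is an example choice, e.g. for a cron-driven check)."""
    return percent_used >= threshold
```

A cron job would call `disk_usage_percent` for each mount point and page the on-call rotation when `needs_alert` returns True.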
Environment: Red Hat Linux 3.0/4.0/5.0 AS/ES, HP DL585, Oracle 9i/10g, Samba, VMware, Tomcat 3.x/4.x/5.x, Apache Server 1.x/2.x, Bash.
