Sr. DevOps Engineer Resume
Green Bay, WI
SUMMARY
- 7+ years of experience as an AWS DevOps Engineer in the installation, configuration, and management of Linux operating systems (RHEL, CentOS, Ubuntu) and Amazon Web Services.
- Extensively worked with continuous integration tools such as Hudson, TeamCity, Jenkins, CA Harvest, and Bamboo for end-to-end automation of various builds and deployments.
- Experience with code compilation, packaging, deployment/release methodology, Linux systems, network troubleshooting, and database development & administration.
- Experience developing scripts and automation tools used for building, integrating, and deploying software releases to multiple environments. Extensive experience with build tools such as Ant and Maven.
- Core development experience building RESTful web services with Groovy and Grails.
- Responsible for writing the design specifications for generic and application-specific web services in Groovy/Grails.
- Experience building and maintaining Docker container clusters managed by Kubernetes on Google Cloud Platform, using Linux, Bash, Git, and Docker.
- Working experience with Azure Cloud Services, Azure Storage, and SQL Azure, and with different PaaS solutions including Web and Worker Roles and Azure Web Apps.
- Created, updated, designed, developed, and deployed Azure Resource Manager (ARM) templates, including PaaS Web Apps and PaaS SQL Server.
- Proficient with Continuous Integration (CI) tools such as Jenkins and Hudson.
- Strong experience with configuration management tools such as Chef and Puppet, performing application builds/packaging, defect management, troubleshooting, version control, and environment supervision.
- Extensive experience with continuous delivery using Chef: developed cookbooks and coded recipes for configuring infrastructure, automating deployments, and administering the infrastructure of the nodes.
- Web application development using Groovy, Grails, jQuery, and AJAX.
- JIRA development with Java and Groovy scripting.
- Worked on the Kafka backup index, minimized logs via a Log4j appender, and pointed Ambari server logs to NAS storage.
- Created clusters in Google Cloud and managed them using Kubernetes and OpenShift; used Jenkins to deploy code to Google Cloud, create new namespaces, and build Docker images and push them to Google Cloud's container registry.
- Deployed Data lake cluster with Hortonworks Ambari on AWS using EC2 and S3.
- Installed Apache Kafka clusters and open-source Confluent Kafka in different environments, on both Windows and Linux/Unix systems.
- Implemented a real-time log analytics pipeline using Confluent Kafka, Storm, Elasticsearch, Logstash, Kibana, and Greenplum.
- Good understanding of Pivotal Cloud Foundry (PCF) architecture (Diego architecture), PCF components, and their functionalities. Experienced in using the PCF CLI for deploying applications and other CF management activities.
- In-depth knowledge of the Hadoop ecosystem: HDFS, YARN, MapReduce, Hive, Hue, Sqoop, Flume, Kafka, Spark, Oozie, NiFi, and Cassandra.
- Strong experience with version control systems such as Subversion and Git, and with source code management client tools such as GitHub, Git GUI, CVS, and other command-line applications.
- Expertise in installation and configuration of web servers such as Apache, Tomcat, and WebLogic.
- Experience working with and maintaining Atlassian products such as JIRA.
- Profound understanding of version control tools (CVS, Git, SVN, ClearCase) for tracking and updating code written by different people.
- Hands-on experience with Azure VPN (Point-to-Site), virtual networks, Azure custom security, endpoint security, and firewalls. Used Azure ExpressRoute to set up private connections to Microsoft cloud services such as Microsoft Azure, Office 365, and Dynamics 365.
- Experience deploying Puppet, Puppet Dashboard, and PuppetDB for configuration management of existing infrastructure and monitoring at scale on Amazon Web Services (AWS); also involved in container-based deployments (Docker) with the Chef configuration management tool.
- Good understanding of the principles and best practices of Software Configuration Management (SCM) in Agile, scrum, and Waterfall methodologies.
- Good understanding of AWS and related services like EBS, RDS, ELB, Route53, S3, EC2, AMI, IAM through AWS console.
- Developed a long-term data warehouse roadmap and architecture; designed and built the data warehouse framework per the roadmap.
- Involved in the architecture and implementation of DevOps platforms and cloud solutions.
- Implemented and supported monitoring tools such as AppDynamics, Nagios, and Splunk on QA and production servers for resource, network, and log-trace monitoring. Proficient in working with network protocols such as TCP/IP and DNS.
- Gained sound knowledge of product deployment on servers, mail servers, monitoring tools & shell scripts, networking, and SQL/MySQL.
- Hands-on experience with performance monitoring tools such as CloudWatch and related AWS services.
- Gained good knowledge in Linux command line & bash shell scripting.
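Build-and-release automation of the kind summarized above usually lives in small scripts. As an illustrative sketch only (the helper name and tag format are hypothetical, not taken from any project named here), a Python function that computes the next semantic-version tag for a release build:

```python
import re

def next_version(tags, part="patch"):
    """Given existing git-style version tags (e.g. 'v1.4.2'), return
    the next tag with the requested part (major/minor/patch) bumped."""
    versions = []
    for tag in tags:
        m = re.fullmatch(r"v(\d+)\.(\d+)\.(\d+)", tag)
        if m:
            versions.append(tuple(int(x) for x in m.groups()))
    # Tuple comparison gives correct numeric ordering (v1.4.10 > v1.4.2).
    major, minor, patch = max(versions) if versions else (0, 0, 0)
    if part == "major":
        return f"v{major + 1}.0.0"
    if part == "minor":
        return f"v{major}.{minor + 1}.0"
    return f"v{major}.{minor}.{patch + 1}"
```

A CI job would typically feed this the output of `git tag` and push the result as the new release tag.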
TECHNICAL SKILLS
Version Control Tools: Git, SVN, ClearCase
Build Tools: Ant, Maven, Artifactory, Docker, Kubernetes, OpenShift
Configuration/Integration/Provisioning Management: Jenkins, Chef, Puppet, Terraform, Kafka
Cloud: AWS, EC2 command-line tools, Azure, Pivotal Cloud Foundry, data warehouse
Bug Tracking & Testing: JIRA, Bugzilla, JUnit
Infrastructure Tools: VMware, KVM, Chef, Puppet Enterprise, Foreman, Apache Libcloud, Google Cloud, AWS
Monitoring Tools: AppDynamics
SDLC: Agile, Scrum
Web/App Servers: Apache Tomcat, IBM WebSphere, IBM AIX, JBoss
Web Technologies: Servlets, JDBC, JSP, HTML, JavaScript, XML
Scripts & Languages: Shell script (Bash), Ruby, Perl, C, C++, Python, Java, Groovy
Database Systems: MySQL, MongoDB, F5 BIG-IP
Operating Systems: Red Hat Enterprise Linux, Solaris, Windows
PROFESSIONAL EXPERIENCE
Confidential, Green Bay, WI
Sr. DevOps Engineer
Responsibilities:
- Working with continuous integration builds on Chef automation across various environments such as Ubuntu, Red Hat, and Windows.
- Creating the automated build and deployment process for applications, re-engineering the setup for a better user experience, and leading up to building a continuous integration system for all our products.
- Implementing build frameworks for new projects using Jenkins and Maven as the build framework tools.
- Implementing a continuous delivery framework using Jenkins, Chef, Maven, and Nexus in a Linux environment.
- Managed SVN repositories for branching, merging, and tagging, and developed shell/Groovy scripts for automation purposes.
- Point person on OpenShift for creating new projects and services for load balancing, adding them to routes to make them accessible from outside, troubleshooting pods through SSH and logs, and modifying BuildConfigs, templates, ImageStreams, etc.
- Knowledge of the SaaS, PaaS, and IaaS concepts of cloud computing architecture, and implementation using AWS, OpenStack, OpenShift, Pivotal Cloud Foundry (PCF), and Azure.
- Unblocked development efforts with additional or upgraded Chef capabilities. Wrote new Chef cookbooks and utilized LWRPs from community cookbooks and recipes to build new OpenResty (nginx) application server and MongoDB server roles.
- Used an ETL/ELT process with Azure Data Warehouse to keep data in Blob Storage with almost no limitation on data volume.
- Creating scripts in the Groovy DSL that integrate with Jenkins to automate the creation of seed jobs.
- Migrated 9 microservices to Google Cloud Platform from Skava, with one more big release planned for 4 more microservices.
- Working on the migration of a mobile application from Skava to the cloud (Google Cloud) by breaking chunks of code into microservices.
- Setting up, configuring, and monitoring a Kafka environment on Windows from scratch.
- Created a data pipeline through Kafka connecting two different client applications, namely SEQUENTRA and LEASE ACCELERATOR.
- Worked closely with our BI analyst and designed the Data warehouse star schema.
- Repaired broken Chef Recipes and corrected configuration problems with other chef objects.
- Developed scripts and automation approaches for system deployment to scale infrastructure.
- Deployment and implementation of Chef for infrastructure as code initiative.
- Writing different Chef Cookbooks for installing, configuration, and upgrading different applications on the Servers.
- Developed real-time data synchronization systems with reactive tools such as Akka and Kafka.
- Developed DevOps scripts in Groovy to automate the collection and analysis of Cassandra metrics.
- Wrote Groovy scripts to set up the LDAP configuration for Jenkins using the security matrix.
- Good understanding of Data Mining and Machine Learning techniques
- Managing the OpenShift cluster that includes scaling up and down the AWS app nodes.
- Very strong exposure to Ansible automation in replacing the different components of OpenShift such as etcd, master, app, infra, and Gluster nodes.
- Extensively used Docker for virtualization, and to ship, run, and deploy applications securely, speeding up build/release engineering.
- Manage deployment automation by creating Chef Roles.
- AWS Cloud management and Chef automation
- Imported and managed multiple corporate applications using GIT.
- Responsible for Design of different Release Environments for new projects.
- Using the Jenkins AWS CodeDeploy plug-in to deploy into AWS.
- Defining Release Process & Policy for projects early in SDLC.
- Responsible for Database build, release and configuration
- Experience working with log monitoring with ELK Stack (Elasticsearch, Logstash, Kibana).
- Perform Deployment of Release to various QA & UAT in Linux environments.
- Configured Elastic Load Balancers with EC2 Auto scaling groups
- Created multi AZ VPC instances.
- Successfully secured the Kafka cluster with Kerberos; implemented Kafka security features using SSL both with and without Kerberos. For more fine-grained security, set up Kerberos users and groups to enable more advanced security features.
- Integrated Apache Kafka for data ingestion
- Successfully generated consumer group lags from Kafka using its API; used Kafka for building real-time data pipelines between clusters.
- Familiarity with Kubernetes, Mesos and Docker Swarm.
- Created documents on build and release process and flow, release processes, order of activities for all releases, user guide for developers for local builds.
- Implemented AWS solutions using EC2, S3, RDS, EBS, Elastic Load Balancer, and Auto Scaling groups.
- Optimized volumes and EC2 instances
- Used IAM to create new accounts, roles and groups.
- Created monitors, alarms, and notifications for EC2 hosts using CloudWatch.
- Migrated applications to the AWS cloud Environment.
- Scripting in multiple languages on UNIX, LINUX and Windows - Perl, Ruby, Shell, etc.
- Work with different team members for automation of Release components.
- Resolved system issues and inconsistencies in coordination with quality assurance and engineering teams.
- Troubleshoot the build issue during the Jenkins build process.
- Participated in the full release project life cycle which involves deployments in various environments like QA/UAT/TRAIN/STG/PROD.
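Consumer-group lag, mentioned above, is simply the gap between a partition's log-end offset and the group's last committed offset. A minimal sketch of that arithmetic in plain Python (the dicts stand in for values a Kafka admin API would return; this is not the actual pipeline code):

```python
def consumer_group_lag(end_offsets, committed_offsets):
    """Per-partition lag = log-end offset minus the group's committed
    offset (clamped at zero); also returns the total across partitions."""
    lag = {}
    for partition, end in end_offsets.items():
        # A partition with no committed offset is treated as fully behind.
        committed = committed_offsets.get(partition, 0)
        lag[partition] = max(end - committed, 0)
    return lag, sum(lag.values())
```

Alerting on the total (or the worst single partition) is a common way to detect a stalled consumer before downstream data goes stale.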
Environment: Solaris, Linux, Eclipse, Java, SQL, AWS EC2, Python, Subversion, Bash, Hudson, NT Command Shell, Java/J2EE, Maven, Gradle, Chef, AWS, JIRA, XML, Vagrant, Linux (RHEL, CentOS), Docker, Jenkins
Confidential, New York
AWS/DevOps Engineer
Responsibilities:
- Participate in product design reviews to provide input on functional requirements, product designs, schedules, or potential problems.
- Taking backups to a cloud storage account using CloudBerry cloud storage tools. Configured site-to-site VPN connectivity.
- Worked with developers to align processes and tools, such as branching/source-control structure, dependency management, Linux/Windows hybrid build infrastructure, and code review & check-in policies, that are developed and instrumented by DevOps teams.
- Written Cloud Formation Templates (CFT) in JSON and YAML format to build the AWS services with the paradigm of Infrastructure as a Code.
- Experienced in automating, configuring, and deploying instances on Azure, AWS, and Rackspace cloud environments and data centers.
- Used Java Message Service (JMS) for reliable and asynchronous exchange of important information between the clients and the customer.
- Writing Ansible playbooks from scratch using YAML and deploying them in AWS using Terraform.
- Used Ansible for setup/teardown of the ELK stack (Elasticsearch, Logstash, Kibana), troubleshooting build issues with ELK and working toward solutions.
- Wrote Maven and Ant build scripts for application-layer modules on AWS EC2 instances.
- Building/maintaining Docker container clusters managed by Kubernetes, using Linux, Bash, Git, and Docker on GCP. Utilized Kubernetes and Docker for the runtime environment of the CI/CD system to build, test, and deploy.
- Using Kubernetes, controlled and automated application deployments and updates and orchestrated deployment.
- Created and maintained the Python and shell deployment scripts for tc Server/Tomcat web application servers.
- Troubleshooting all build problems, using ClearQuest as the bug-tracking system and posting all known build issues on JIRA to support the developer teams.
- Worked on deployment automation of all the Microservices to pull image from the private Docker registry and deploy to Kubernetes using Ansible.
- Created scripts for system administration and AWS using languages such as Bash and Python.
- Developed an SQL and jQuery-based front-end app to track all the transmission related data and the transmission, so that the users do not have to go through various Sterling configurations to find details about a transmission.
- Implemented enterprise management software using the Simple Network Management Protocol (SNMP).
- Installed WebSphere, upgraded it with service pack updates, installed IBM patches, configured and created new admin and managed servers, and started and stopped WebSphere servers using Puppet.
- Experience with configuring Ansible on servers and using Ansible for deploying packages to multiple nodes.
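A CloudFormation template of the kind described above (JSON, infrastructure-as-code) can be as small as a single resource. Below is a hedged sketch that generates such a template using only the Python standard library; the logical ID and bucket name are placeholders, not values from any real stack:

```python
import json

def s3_bucket_template(bucket_logical_id, bucket_name):
    """Emit a minimal CloudFormation template (JSON) declaring one
    S3 bucket -- infrastructure-as-code in its simplest form."""
    template = {
        "AWSTemplateFormatVersion": "2010-09-09",
        "Resources": {
            bucket_logical_id: {
                "Type": "AWS::S3::Bucket",
                "Properties": {"BucketName": bucket_name},
            }
        },
    }
    return json.dumps(template, indent=2)

# Example: render a template for a hypothetical artifact bucket.
doc = s3_bucket_template("ArtifactBucket", "example-artifact-bucket")
```

In practice the rendered JSON would be handed to `aws cloudformation deploy` (or an equivalent pipeline step) rather than created by hand.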
Environment: RHEL 6/7, Ansible, Ansible Tower, Java, VMware ESXi, Splunk, JIRA, Git, Active Directory, WebSphere, Apache, Terraform, JBoss, F5, Python, shell scripting, HAProxy
Confidential, Atlanta, Georgia
DevOps Engineer
Responsibilities:
- Performed software configuration/release management activities for three different Java applications.
- Designed and implemented Continuous Integration process using tools like Jenkins with approval from development and other affected teams. Defined processes to build and deliver software baselines for internal as well as external customers.
- Converted old Make-based builds to Ant with XML build files for Java builds.
- Expertise in using build tools like Maven and Ant to frame deployable artifacts such as JARs and WARs from source code, as well as migrating the build tool from Ant to Maven.
- Created and maintained automation scripts using PYTHON.
- Implemented AWS client API to interact with different services as Console configuration for AWS EC2.
- Used AWS Lambda to manage the servers and run the code in the AWS.
- Using Docker container clusters to clone the production servers, and implementing Kubernetes orchestration for the cloned production servers.
- Scripting infrastructure and (Linux) machine provisioning from scratch using tools such as bash and the Ruby AWS-SDK.
- Responsible for authoring automated test suites in Java for web-UI and web-service testing
- Build Java code and JavaScript code on to different Jenkins servers as per the schedule.
- Created users, managed user permissions, and maintained user & file system quotas on Red Hat Linux and AIX.
- Configured AWS CloudWatch to monitor AWS resources, including creating customized scripts to monitor various application, system, and instance metrics.
- Planned and executed the migration from Bugzilla-based bug tracking to JIRA, integrating it with the Jenkins CI tool.
- Integrated Puppet with SVN to manage and deploy project-related tags.
- Automated Compute Engine and Docker Image Builds with Jenkins and Kubernetes.
- Maintained and executed build scripts by coordinating with development and QA teams.
- Deployed the EAR and WAR archives into WebLogic and Apache Servers.
- Created and Maintained Subversion repositories, branches and tags.
- Enforced development policies using Subversion hooks and other metadata.
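Subversion hooks of the kind mentioned above commonly enforce commit-message policy. As an illustrative sketch (the project keys and message format are hypothetical), a Python check that a pre-commit hook might call to require a JIRA issue key in every commit message:

```python
import re

def check_commit_message(message, project_keys=("OPS", "REL")):
    """Policy check for a pre-commit hook: the commit message must
    reference a JIRA issue key (e.g. OPS-123) from an allowed project."""
    # Build a pattern like \b(?:OPS|REL)-\d+\b from the allowed keys.
    pattern = r"\b(?:%s)-\d+\b" % "|".join(map(re.escape, project_keys))
    return re.search(pattern, message) is not None
```

A real `pre-commit` hook would pull the message with `svnlook log -t TXN REPO` and reject the transaction (nonzero exit) when this returns False.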
Environment: RHEL, Ubuntu, JIRA, SVN, Java, Terraform, WebLogic, Jenkins, Ant, cron jobs, PHP, Puppet
Confidential
Build and Release Engineer
Responsibilities:
- Created Continuous Build Process using Jenkins as Continuous integration tool.
- Implemented a Continuous Delivery framework using Jenkins, Puppet in Linux environment.
- Managed tools like Subversion, Jenkins, JIRA and Performed maintenance and troubleshooting of build / deployment systems.
- Build scripts using ANT and MAVEN build tools in Jenkins to move from one environment to other environments.
- Puppet automation: installing and configuring Puppet 3.x server and agent setups; automating IHS, WebSphere MQ 7.0, and WebSphere Application Server; working with Apache Solr 4.x/5.x, Jenkins, and Foreman.
- Expertise in using Jenkins for Adding scripts, building the Suites and analyzing the results.
- Oversaw production deployments of new code releases from staging to production via Git/Jenkins.
- Involved in installing and managing different automation and monitoring tools on Red Hat Linux, such as Nagios and Puppet.
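Nagios monitoring, referenced above, is built around plugins that map a measured value to conventional exit codes (0 = OK, 1 = WARNING, 2 = CRITICAL). A minimal sketch of that convention in Python (the thresholds are illustrative, not from any real check):

```python
# Conventional Nagios plugin exit codes.
OK, WARNING, CRITICAL = 0, 1, 2

def check_threshold(value, warn, crit):
    """Classic Nagios-style check: compare a measured value (e.g. disk
    usage percent) against warning/critical thresholds and return the
    conventional plugin exit code. Assumes higher values are worse."""
    if value >= crit:
        return CRITICAL
    if value >= warn:
        return WARNING
    return OK
```

A real plugin would also print a status line like `DISK OK - 50% used` and use the return value as the process exit code so Nagios can schedule notifications.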
Environment: Red Hat Enterprise Linux, HP ProLiant DL 585, BL 465/485, ML Series, SAN (NetApp), BladeLogic, Veritas Cluster Server 5.0, Windows 2003 Server, shell scripting, JBoss 4.2, VMware Virtual Client 3.5, VMware Infrastructure 3.5.