
Sr. Cloud/DevOps Engineer Resume

Hartford, CT

SUMMARY

  • 7+ years of experience across the retail, investment management, financial services, and information technology domains, with strong knowledge of implementing DevOps strategies in Linux/Windows environments and cloud technologies such as AWS and Azure, and of automating environment, build, and release management with Git, Maven, Jenkins, Ansible, Puppet, Terraform, Docker, and Kubernetes.
  • Extended working knowledge of Azure cloud services, IaaS, worker roles, Service Bus, queues, Azure Blob and Table storage, and API Management. Configured NSGs for two-tier and three-tier applications.
  • Worked on Azure web application, App services, Azure storage, Azure SQL Database, Virtual machines, Fabric controller, Azure AD, Azure search and notification hub.
  • Designed, configured, and managed public/private cloud infrastructure on Amazon Web Services (AWS), including EC2, Auto Scaling for launching EC2 instances, Elastic Load Balancer, Elastic Beanstalk, S3, Glacier, CloudFront, RDS, VPC, Direct Connect, Route 53, CloudWatch, CloudFormation, IAM, and SNS.
  • Working knowledge of other Amazon Web Services such as Relational Database Service (RDS), Redshift, EMR, Kinesis, DynamoDB, Elastic Beanstalk, Lambda, Glacier, Storage Gateway, and Data Pipeline.
  • Extensively used security groups, network ACLs, internet gateways, and route tables to ensure a secure zone for the organization in the AWS public cloud.
  • In-depth experience with the SaaS, PaaS, and IaaS concepts of cloud computing architecture and their implementation using Azure, AWS, Google Cloud Platform, OpenStack, Pivotal Cloud Foundry (PCF), Slack, and Salesforce.
  • Migrated data to AWS Redshift using CloverETL and managed Amazon Redshift clusters, including launching clusters with specified nodes and performing data analysis queries.
  • Hands-on experience with data mapping using ETL tools; strong ability to write procedures to ETL data into a data warehouse from a variety of data sources, including flat files and database links (MySQL and Oracle).
  • Excellent understanding of the SDLC and of Traditional, Agile, RUP, and other methodologies. Expertise in configuration/release/build management in both UNIX and Windows environments using Team Foundation Server (TFS) 2015/2013/2012/2010/2008, Rational ClearCase, Subversion (SVN), and TeamSite.
  • Implemented Chef recipes for deployment of builds on internal data center servers, and reused and modified the same recipes to deploy directly into Amazon EC2 instances.
  • Created Chef knife commands, recipes, and cookbooks to maintain Chef servers, their roles, and cloud resources. Used Chef for server provisioning and infrastructure automation in a SaaS environment.
  • Installed, configured, and managed Puppet master/agent. Wrote custom modules and manifests, downloaded pre-written modules from Puppet Forge, and performed upgrades and migrations of Puppet Community and Enterprise.
  • Implemented the Ansible client-server architecture from scratch; hands-on experience creating projects and inventories and running job templates in Ansible Tower.
  • Experience managing uDeploy configuration, administration, upgrades, security, and maintenance of systems and platforms such as web and application servers.
  • Experience in design and automation of uDeploy application processes, component processes, environment resource models, plugins, security models (roles, teams), notification schemes, environment gates, and approval processes.
  • Experience in automating uDeploy agent installation and configuration process.
  • Experience in cloud computing in the Windows Azure environment: creating new VMs, Azure subscriptions, and storage accounts, managing SSL certificates for IIS websites, and administering Azure assets using PowerShell.
  • Expertise in Google Cloud Platform (GCP) services such as Compute Engine, Cloud Functions, Cloud DNS, Cloud Storage, and Cloud Deployment Manager, and in the SaaS, PaaS, and IaaS concepts of cloud computing architecture and their implementation on GCP.
  • Worked hands-on to create automated, containerized PCF cloud application platforms (PaaS), and to design and implement DevOps processes that use those platforms.
  • Expert automation skills with Chef or similar tools. Expert knowledge of open cloud APIs and SDKs. Broad knowledge of the following, with deep knowledge of several: PHP, Python, Ruby, JavaScript, LAMP, nginx, Node.js, NoSQL, Git.
  • Working knowledge of software development practices including code profiling, regression testing, and continuous integration.
  • Experience in creating Docker containers leveraging existing Linux containers and AMIs, in addition to creating Docker containers from scratch; worked on Docker service rolling updates and blue-green deployment to implement zero-downtime production deployments.
  • Created several pods using the Kubernetes master/minion architecture and developed microservice onboarding tools in Python, allowing easy creation and maintenance of build jobs and Kubernetes deployments and services.
  • Experienced in branching, merging, tagging, and maintaining versions across environments using SCM tools such as Git and Subversion (SVN) on Linux platforms.
  • Proficient in Python, shell scripting, SQL, and build utilities such as OpenMake, ANT, and CruiseControl.
  • Strong experience building distributed systems that use technologies such as Ruby, Apigee, Elasticsearch, and Node.js.
  • Imported and managed various corporate applications in GitHub code repositories; managed Git, GitHub, Bitbucket, and SVN as source control systems and managed SVN repositories for branching, merging, and tagging.
  • Implemented shell scripts for creating Jenkins jobs from both the command-line interface and the UI, and for automating the deployment process for various applications through Serena Deployment Automation (SDA).
  • Configured and managed Nagios for monitoring the existing AWS cloud platform; built Nagios monitors for newly deployed services.
  • Worked on building deployment and delivery CI/CD pipelines in Jenkins and integrating them with tools such as Splunk and Hygieia for continuous deployment.
  • Experience implementing New Relic (Server, APM, and Synthetics) with automation, integrating it with a ticketing tool (ServiceNow), and automating management reporting.
  • Experience developing Korn, Bash, Perl, and Python scripts to automate cron jobs and system maintenance; scheduled cron jobs for job automation.
  • Implemented the docker-maven-plugin in the Maven POM to build Docker images for all microservices, and later used a Dockerfile to build the Docker images from the Java JAR files.
  • Proficient administration knowledge of databases (SQL Server, Oracle, DB2, MongoDB, MySQL, MS SQL/T-SQL, Sybase) and messaging (RabbitMQ) for maintaining and performing required DB tasks.
  • Extensive experience with ticketing and tracking tools such as JIRA, Remedy, ClearQuest, Redmine, and Bugzilla for production hotfixes and bugfixes.
  • Worked on installing, integrating, tuning, and troubleshooting the Apache 1.3.x/2.x web server and the JBoss 4.x, Tomcat, WebLogic 8.x/9.x, and WebSphere 7.x/8.x application servers.
  • Knowledge of networking protocols and services (e.g., HTTP, TCP/IP, SSH, FTP, SMTP, DNS, DHCP, NFS, RPM, YUM, LDAP, AutoFS, LAN/WAN, iptables), load balancers, firewalls, and storage.
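As a concrete sketch of the Docker image build described above (docker-maven-plugin producing images from Java JAR files), a minimal Dockerfile might look like the following; the base image, JAR path, and port are illustrative assumptions, not details from the projects:

```dockerfile
# Minimal image for a Java microservice packaged as a fat JAR.
# Base image and JAR name are placeholders, not taken from the resume.
FROM eclipse-temurin:11-jre
WORKDIR /app
# Copy the artifact produced by `mvn package` (name is hypothetical)
COPY target/service.jar /app/service.jar
EXPOSE 8080
ENTRYPOINT ["java", "-jar", "/app/service.jar"]
```

Building with `docker build -t registry.example.com/service:1.0 .` and then `docker push` matches the tag-and-push workflow mentioned in the bullets above (registry host is a placeholder).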

TECHNICAL SKILLS

Configuration Management: Ansible, Chef and Puppet

Continuous Integration: Jenkins, TeamCity, Bamboo, CloudBees Jenkins, UrbanCode Deploy (uDeploy), UrbanCode Release/Build

Version Control: Git, GitHub, Bitbucket and Subversion (SVN)

Build Tools: Maven, Gradle and ANT

Cloud Platforms: AWS, Azure, Google Cloud, PCF, VMware, Vagrant

Package Management: Nexus, Artifactory

Issue Tracking: JIRA, ServiceNow.

Containerization: Docker, Kubernetes

Operating Systems: Linux (Red Hat 5/6), Ubuntu, CentOS, Windows and Unix

Databases: MySQL, MongoDB, Oracle DB

Programming Languages: JavaScript, XML, HTML, Groovy, Shell script, Ruby and Python.

Infrastructure spin-up tools: Terraform, CloudFormation templates and Azure Resource Manager templates.

Web & Application servers: WebLogic, WebSphere, Apache Tomcat and JBoss.

Logging & Monitoring Tools: Nagios, Sumo Logic, CloudWatch, Splunk, ELK

PROFESSIONAL EXPERIENCE

Confidential, Hartford, CT

Sr. Cloud/DevOps Engineer

Responsibilities:

  • Implemented Microsoft Azure (public) cloud to provide IaaS support to the client; created virtual machines through PowerShell scripts and the Azure Portal.
  • Created storage pools and striping of disks for Azure virtual machines; backed up, configured, and restored Azure virtual machines using Azure Backup.
  • Designed, configured, and deployed Microsoft Azure for a multitude of applications utilizing the Azure stack (including Compute, Web & Mobile, Blobs, Resource Groups, Azure SQL, Cloud Services, and ARM), focusing on high availability, fault tolerance, and auto-scaling.
  • Worked with Azure Functions, including auto-scaling and serverless methods, running code as individual functions to isolate and solve problems.
  • Architected and improved growing application deployment on the Azure cloud platform (PaaS & SaaS).
  • Deployed Azure IaaS virtual machines (VMs) and cloud services (PaaS role instances) into secure VNets and subnets; automated VSTS (Visual Studio Team System) build and deployment to IaaS and PaaS environments in Azure and developed build support utilities in PowerShell.
  • Experience with Azure cloud services (PaaS & IaaS), Storage, Web Apps, Active Directory, Application Insights, Logic Apps, Data Factory, Service Bus, Traffic Manager, Azure Monitoring, Azure OMS, Key Vault, Cognitive Services (LUIS), SQL Azure, Cloud Services, Resource Groups, ExpressRoute, Load Balancing, and Application Gateways.
  • Configured continuous integration from Source control, setting up build definition within Visual Studio Team Services (VSTS) and configure continuous delivery to automate the deployment of ASP.NET MVC applications to Azure web apps.
  • Extensive experience in web and application development using Visual Studio .NET technologies such as C#, ASP.NET MVC 5, ASP.NET, ADO.NET, XML, Web Services, WCF, and WPF.
  • Involved in design, implementation and modifying the Python code.
  • Developed Python and shell scripts for automation of the build and release process.
  • Installed TFS 2015/2013 and set up different TFS user groups for the project team.
  • Extensive experience setting up the application tier, build controllers, and build agents in Team Foundation Server.
  • Set up different kinds of build triggers, including gated check-in, continuous integration (CI), and rolling builds, in Team Foundation Server TFS 2013 & 2015.
  • Excellent experience in installation, configuration, and testing of Team Foundation Server.
  • Experience managing software lifecycle from build, continuous test, deployment and release
  • Strong experience with large scale, distributed, enterprise system delivery and operations
  • Integrated TFS with Visual Studio (VSTS) 2013 & 2015, and integrated TFS with SSRS & SSIS for custom reporting.
  • Gathered client requirements, converted them into technical specifications, and developed web forms using C#.NET.
  • Developed web applications in C# and ASP.NET with MVC architecture.
  • Managed keys by creating them and attaching them to library and variable groups with the help of Key Vault.
  • Maintained stored certificates and secrets for Azure APIM and Azure Application Gateway.
  • Wrote Ansible playbooks, the entry point for Ansible provisioning, in which automation is defined through tasks in YAML format; set up a continuous delivery pipeline and ran Ansible scripts to provision Dev servers.
  • Wrote Ansible playbooks to automate tasks and used Ansible to configure web apps and deploy them to servers.
  • Used Docker to run different programs on a single VM; built Docker images (including setting entry points and volumes), ran Docker containers, installed Docker, and created, tagged, and pushed Docker container images.
  • Worked with Docker Registry, Machine, and Hub; created, attached, and networked Docker containers, with container orchestration via Kubernetes for clustering, load balancing, scaling, and service discovery using selectors, nodes, and pods.
  • Used Kubernetes for container operations in Azure, including cluster networking and load balancing; Kubernetes also proved well suited to running web applications in a clustered fashion, and reduced footprint by sharing images across multiple services.
  • Maintained Jenkins in multiple environments by installing packages on the Jenkins master and slaves and performing regular security updates for Jenkins.
  • Involved in setting up a Jenkins master and multiple slaves for the entire team as a CI tool as part of the continuous development and deployment process.
  • Setup full Jenkins CI/CD pipelines so that each commit a developer makes will go through standard process of software lifecycle and gets tested well enough before it can make it to the production.
  • Helped individual teams to set up their repositories in Git and maintain their code and help them setting up jobs which can make use of CI/CD environment.
  • Extensively worked on Jenkins to implement continuous integration (CI) and Continuous deployment (CD) processes.
  • Gathered requirements for every release, made uDeploy design changes, and developed, maintained, and distributed release notes for each scheduled release.
  • Created, configured, and managed Jenkins build automation and collaborated with development support teams to set up a continuous delivery environment.
  • Responsible for managing cron jobs on production servers; implemented integration of Jenkins, uDeploy, JIRA, and Crucible for DevOps automation.
  • Developed a fully automated continuous integration system using GIT, Jenkins and custom tools developed in Python and Bash.
  • Configured build tool Maven for building deployable artifacts such as jar, war, and ear from source code and Artifactory Repository like Nexus for Maven and ANT builds to upload artifacts using Jenkins.
  • Maintain build profiles in Team Foundation Server and Jenkins for CI/CD pipeline.
  • Managed and monitored the server and network infrastructure using Splunk; applied blackouts for any outages and pulled reports to provide to the client.
  • Supported multiple teams and multiple applications, including .NET and Java/J2EE.
  • Created New Relic dashboards and queries for all the services.
  • Extensively involved in infrastructure as code, execution plans, resource graph and change automation using Terraform.
  • Provided administration and monitoring for Cassandra clusters on VMs and implemented a distributed messaging queue integrated with Cassandra using Kafka and ZooKeeper.
  • Developed a stream filtering system using Spark streaming on top of Apache Kafka.
  • Designed a system using Kafka to auto-scale the backend servers based on event throughput.
  • Also worked on Apache Hadoop, using Kafka as the messaging system and Spark for processing large data sets.
  • Built and released cloud-based products containing Linux (CentOS, RHEL, Ubuntu) and Windows environments, using PowerShell, Python, and shell.
  • Strong hands-on background in database technologies (Oracle, MySQL, MS SQL, RDS, DynamoDB).
  • Configured a Google Cloud Virtual Private Cloud (VPC) and subnet group for isolation of resources; architected the infrastructure on Google Cloud Platform using GCP services and automated GCP infrastructure using GCP Cloud Deployment Manager.
  • Secured the GCP infrastructure with private and public subnets as well as security groups, and leveraged Google Cloud services such as Compute, auto-scaling, and VPC to build secure, scalable systems that handle unexpected workloads.
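The Ansible provisioning described above (playbooks whose automation is defined through YAML tasks, run against Dev servers) can be sketched as a minimal playbook; the host group, package names, template, and service name are illustrative assumptions, not details from the project:

```yaml
---
# Hypothetical playbook: provision a Dev application server.
# Group "devservers", packages, and paths are placeholders.
- name: Provision Dev servers
  hosts: devservers
  become: true
  tasks:
    - name: Install base packages
      ansible.builtin.yum:
        name:
          - java-11-openjdk
          - git
        state: present

    - name: Deploy application configuration from a template
      ansible.builtin.template:
        src: app.conf.j2        # template shipped alongside the playbook (hypothetical)
        dest: /etc/app/app.conf
      notify: Restart app

  handlers:
    - name: Restart app
      ansible.builtin.service:
        name: app
        state: restarted
```

Run ad hoc with `ansible-playbook -i inventory dev.yml`; in Ansible Tower the same playbook would be attached to a job template and inventory, as described in the bullets above.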

Environment: Azure, Terraform, Chef, Ansible, Docker, Kubernetes, Jenkins, GIT, Maven, Splunk, GCP, CI/CD, Kafka.

Confidential - Columbus, Ohio

AWS/DevOps Engineer

Responsibilities:

  • Worked on AWS EC2 instance creation, setting up VPCs and launching EC2 instances in different private and public subnets based on the requirements of each application; used IAM to assign roles and to create and manage AWS users, groups, and the permissions required to use AWS resources.
  • Designed the infrastructure using AWS cloud services such as EC2, S3, ELB, EBS, VPC, Route 53, Auto Scaling groups, CloudWatch, IAM, Lambda, API Gateway, and SNS.
  • Migrated on-premises workloads to Amazon Web Services using Snowball.
  • Experience with different migration services, such as AWS Server Migration Service (SMS) to migrate on-premises workloads to AWS more easily and quickly using the rehost ("lift and shift") methodology, AWS Database Migration Service (DMS), AWS Snowball to transfer large amounts of data, and Amazon S3 Transfer Acceleration.
  • Set up databases in AWS RDS, including MSSQL, MySQL, and MongoDB; configured storage using S3 buckets and configured instance backups to an S3 bucket.
  • Implemented a serverless architecture using API Gateway, Lambda, and DynamoDB; deployed AWS Lambda code from Amazon S3 buckets, created a Lambda deployment function, and configured it to receive events from an S3 bucket.
  • Designed the data models to be used in data intensive AWS Lambda applications which are aimed to do complex analysis creating analytical reports for end-to-end traceability, lineage, definition of Key Business elements from Aurora.
  • DevOps role converting existing AWS infrastructure to a serverless architecture (AWS Lambda, Kinesis) deployed via CloudFormation.
  • Created S3 buckets, managed policies, and utilized S3 buckets and Glacier for storage and backup on AWS.
  • Experience with messaging systems such as RabbitMQ.
  • Proven capability with end-to-end management of projects from concept and requirements gathering to development, testing and deployment
  • Created alarms in the CloudWatch service for monitoring server performance, CPU utilization, disk usage, etc.
  • Implemented AWS high availability using AWS Elastic Load Balancing (ELB), which balanced traffic across instances in multiple Availability Zones.
  • Experienced in extract, transform, and load (ETL) processing of large datasets of different forms, including structured, semi-structured, and unstructured data.
  • Involved in file manipulations and file uploads using Node.js.
  • Linux knowledge and an understanding of deploying and operating large-scale web services on Linux.
  • Designed and implemented continuous build/release/deploy solutions using UrbanCode as well as Git, Maven, Jenkins, etc.
  • Integration of user-facing elements developed by front-end developers with server-side logic using Node.js.
  • Implementing performance-tuning techniques along various stages of the ETL process.
  • Designed and developed ETL processes to handle data migration from multiple business units and sources including Oracle, MSSQL and Access
  • Wrote Ansible playbooks with Python SSH as the wrapper to manage configuration of AWS nodes; tested playbooks on AWS instances using Python and ran Ansible scripts to provision Dev servers.
  • Created Terraform modules to create instances in AWS & automated process of creation of resources in AWS using Terraform.
  • Infrastructure buildout, maintenance & automation, collaborated with infrastructure team to maintain servers using Terraform for provisioning, Servers were spread across various regions and availability zones on AWS.
  • Developed Ansible Playbooks using YAML scripts for launching different EC2 virtual servers in the cloud using Auto-Scaling and Amazon Machine Images (AMI). Used Ansible server to manage and configure nodes.
  • Created an AWS VPC network for the installed instances and configured the security groups and Elastic IPs; used EC2 Container Service to support Docker containers running applications on a cluster of EC2 instances.
  • Created 99.9999% up-time scalable REST API platform with AWS Lambda, Spring Boot, JPA.
  • Built and maintained Docker container clusters managed by Kubernetes on AWS, using Linux, Bash, Git, and Docker; utilized Kubernetes and Docker for the runtime environment of the CI/CD system to build, test, and deploy.
  • Implemented Git for branching, merging, tagging, and maintaining versions across environments on Linux platforms.
  • Designed and implemented Git and Perforce metadata, including elements, labels, attributes, triggers, and hyperlinks.
  • Establish/follow standards, guidelines, and best practices around reference data, master data, metadata and operational data management including modeling
  • Improved the speed, efficiency, and scalability of the continuous integration environment, automating wherever possible using Python, Ruby, shell, and PowerShell scripts.
  • Hands-on experience migrating Oracle, SQL Server instances to AWS RDS.
  • Developed shell and SQL scripts and stored procedures to automate database tasks.
  • Configured SQL Server database mirroring and SQL Server log shipping between sites for maximum availability of mission-critical production databases.
  • Provided operational and application support for Oracle and SQL Server databases.
  • Created multiple Python, Bash, shell, and Ruby scripts for various application-level tasks.
  • Automated setting up server infrastructure for the DevOps services using Ansible, shell, and Python scripts.
  • Created state database replicas (metadb), soft partitions, and RAID levels using Sun Solaris Volume Manager.
  • Designed and developed API proxies using Apigee Edge.
  • Experience in creating custom policies using gateway scripts or DP processing rules and configuring message Assemblies.
  • Worked on building the data centers using Amazon Web Service, installed images on Amazon web services using Jenkins, GIT.
  • Designed and implemented a CI (continuous integration) system, configuring Jenkins servers and nodes by writing the required scripts (Bash & Python) and creating and configuring VMs.
  • Create and maintain fully automated CI/CD pipelines for code deployment using Groovy Script.
  • Integrated Kafka with Flume in a sandbox environment using Kafka source and Kafka sink.
  • Worked with application teams to install operating system and Hadoop updates, patches, and version upgrades as required; integrated Kafka with Spark in a sandbox environment.
  • Responsible for Installing, setup and Configuring Apache Kafka and Apache Zookeeper.
  • Used Kafka to collect Website activity and Stream processing.
  • Responsible for managing and supporting Continuous Integration (CI) and Continuous Deployment (CD) using Jenkins.
  • Worked on integrating Git into the continuous integration (CI) environment along with AnthillPro, Jenkins, and Subversion.
  • Used Git version control to manage source code, integrating with Jenkins to support build automation and with JIRA to monitor commits; worked with Ansible Tower to schedule playbooks, stored the playbooks in a Git repository, and implemented a continuous deployment pipeline with Jenkins.
  • Installed and administered a Nexus repository to deploy the artifacts generated by Maven and to store the dependent JARs used during the build.
  • Utilized Splunk for monitoring logging, software, operating system, and hardware resources, and used these monitoring tools to verify instances on the AWS platform.
  • Configured Splunk to send and manage log components; deployed code on Apache Tomcat and JBoss application servers for UAT and development environments, and used JIRA as a lifecycle management tool to handle project activities in sprint fashion.
  • Created Jenkins jobs, triggered via webhooks, for commits made by developers and data engineers on current projects.
  • Created Maven POM files for Java projects and installed the applications on AWS EC2 AMIs (Linux), RedHat, and Ubuntu.
  • Resolved system issues and inconsistencies in coordination with quality assurance and engineering teams.
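The Terraform module pattern described above (modules that create EC2 instances, invoked from a root configuration per region) can be sketched as follows; the module path, AMI ID, instance type, and tags are placeholder assumptions:

```hcl
# Hypothetical module: modules/ec2-instance/main.tf
variable "ami_id" {
  type = string
}

variable "instance_type" {
  type    = string
  default = "t3.micro" # placeholder size
}

resource "aws_instance" "app" {
  ami           = var.ami_id
  instance_type = var.instance_type

  tags = {
    Name = "app-server" # placeholder tag
  }
}

# Root-module usage (AMI ID is a placeholder):
# module "app" {
#   source = "./modules/ec2-instance"
#   ami_id = "ami-0123456789abcdef0"
# }
```

`terraform plan` then shows the execution plan and resource graph mentioned above before `terraform apply` creates the instances.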

Environment: AWS, OpenStack, Terraform, Ansible, Docker, Kubernetes, Jenkins, GIT, Maven, New Relic, Nexus, JIRA, uDeploy, CI/CD, Lambda, Kafka.

Confidential

DevOps Engineer

Responsibilities:

  • Involved in designing and deploying multiple applications utilizing the AWS stack; implemented AWS solutions such as EC2, S3, IAM, EBS, Elastic Load Balancing (ELB), security groups, and Auto Scaling.
  • Automated and implemented the Cloud Formation Stacks for creating AWS resources like VPC, Subnets, Gateways, Auto-Scaling, Elastic-Load-Balancers (ELB), creating DB Instances and many others across different Availability Zones.
  • Design and Develop ETL Processes in AWS Glue to migrate Campaign data from external sources like S3 and text files into AWS.
  • Design and develop extract, transform, and load (ETL) mappings, procedures, and schedules, following the standard development lifecycle
  • Develop and maintain operating procedures and support documentation for ETL packages and the SSIS infrastructure
  • Experience managing data in relational databases and developing ETL pipelines
  • Automated backups with shell scripts on Linux to transfer data to S3 buckets.
  • Provided highly durable and available data by using the S3 data store, versioning, and lifecycle policies, and created AMIs of critical production servers for backup.
  • Maintained Apigee for building applications and protecting them against cyber threats, giving better assistance to developer teams and supporting microservices.
  • Experience automating deployments on servers using JBoss, Tomcat, and WebSphere; knowledge of the Apigee platform.
  • User management at the IAM (AWS console) level, creating roles to allow multiple users to switch roles and editing trust relationships to allow switching from the main account to other accounts, as well as at the AWS instance level.
  • Designed an up-time-scalable REST API platform using Spring Boot and AWS Lambda.
  • Created functions and assigned roles in AWS Lambda to run Python scripts, and wrote AWS Lambda functions in Java to perform event-driven processing.
  • Created Lambda functions to automate snapshot backups on AWS and set up scheduled backups.
  • Managed a cloud platform based on the Lambda architecture, including Kafka, Spark, and Cassandra.
  • Implemented Kafka/Storm topologies capable of handling and channeling high-volume data streams, integrating the Storm topologies with Esper to filter and process that data across multiple clusters for complex event processing.
  • Worked with application teams to install operating system, Hadoop updates, patches, Kafka version upgrades as required.
  • Worked on the Analytics Infrastructure team to develop a stream filtering system on top of Apache Kafka.
  • Used AWS Lambda to manage the servers and run the code in the AWS.
  • Used AWS Elastic Beanstalk for deploying and scaling web applications and services developed with Java, PHP, Node.js, Python, Ruby, and Docker on familiar servers such as Apache and IIS.
  • Implemented highly interactive features and redesigned some parts of products by writing plain JavaScript where jQuery had compatibility issues.
  • Developed applications using various Java/J2EE design patterns to improve usability and flexibility.
  • Designed User Interface using Java Server Page (JSP), HTML, Cascading Style Sheets (CSS), and XML.
  • Used TeamCity Enterprise CI with distributed builds supporting all environments to run builds and deployments; developed shell scripts to automate the build and release process, and wrote custom scripts to monitor repositories and collect reports for daily tasks.
  • Performed SVN to Bitbucket migration and managed branching strategies using Bitbucket workflow. Managed User access control, Triggers, workflows, hooks, security, repository control in Bitbucket.
  • Created snippets allowing developers to share code segments, and generated pull requests for code review and comments using Bitbucket.
  • Implemented Chef recipes for deployment of builds on internal data center servers, and reused and modified the same recipes to deploy directly into Amazon EC2 instances.
  • Worked with Chef servers and a management application that can use ServiceNow data to bring computers into a desired state by managing files; used Chef attributes, templates, and recipes to manage configurations across various nodes.
  • Managed Docker orchestration using Docker Swarm: created Docker clusters with Docker Swarm and ran multiple Tomcat application clusters using Docker Compose.
  • Configured Dockerfiles with different artifacts to build images, and deployed those Docker images to different servers via Chef cookbooks to run the applications in containers.
  • Migrated Docker Swarm to Mesos/Marathon for the microservices project.
  • Worked as an SCM engineer automating the build and deploy process through AnthillPro and BuildForge, deploying applications on WebLogic.
  • Installed and Configured Jenkins for Automating Deployments and providing an automation solution.
  • Implemented CI/CD process using Ansible for global development team, allowing for dozens of code updates per hour with zero downtime
  • Installed, configured, and managed monitoring tools such as Nagios, used to identify and resolve infrastructure problems before they affect critical processes; worked on Nagios event handlers for automatic restart of failed applications and services.
  • Created views and appropriate meta-data, performed merges, and executed builds on a pool of dedicated build machines.
  • Worked on integrating Nagios with CloudWatch as a monitoring solution, implementing Nagios to analyze and monitor network loads on individual machines.
  • Automated the build and deploy of all internal Java & SC environments using various continuous integration tools and Python scripting.
  • Created Python scripts to fully automate AWS services, including web servers, ELB, CloudFront distributions, EC2 and security groups, S3 buckets, and application configuration.
  • Installed Solaris 8/9-based servers with Jumpstart and RedHat Linux EL 4.x-based servers with Kickstart for development, test, and production environments.
  • Performed WebLogic Server administration tasks such as installing, configuring, monitoring, and performance tuning in a Linux environment; maintained security by installing and configuring SSH encryption for access on Ubuntu and RHEL Linux.

Environment: AWS, Chef, Bitbucket, Nagios, Docker, Docker Swarm, Linux, Jenkins, Python, uDeploy, CI/CD, Lambda, Kafka.

Confidential

Build and Release Engineer

Responsibilities:

  • Installation, configuration, management, and troubleshooting of VMware ESXi 6.5-4.0, vSphere 6.0, 5.5, and older, and different versions of vCenter Server.
  • Wrote custom scripts in Puppet for package management (rpm, yum) on RHEL Linux and worked closely with the development and operations organizations to implement the tools and processes needed to support automation of builds and deployments.
  • Worked on building Puppet Enterprise modules using the Puppet DSL to automate infrastructure provisioning and configuration across environments; performed node classification with external node classifiers and parameterization for Puppet modules.
  • Provided operational support for large-scale Puppet infrastructure; developed, tested, and maintained various Puppet modules and Ansible playbooks.
  • Worked with the Puppet administrator on adding new Puppet Enterprise nodes to the master, deactivating nodes, troubleshooting connection issues and various event, application, and reporting issues, and starting or restarting the Puppet Enterprise services.
  • Configured Jenkins to build and deploy by setting up SonarQube, Maven, and Nexus in a CI/CD pipeline that triggers automatic builds, runs code analysis, and deploys artifacts to Nexus for various projects.
  • Maintained and tracked inventory using Jenkins, set alerts when servers are full and need attention, and integrated Git with Jenkins to automate the code checkout process with the help of the Jenkins DSL plugin.
  • Configured webhooks to push commits from Git to Jenkins; wrote Groovy scripts to automate Jenkins pipelines, set up automated builds on a periodic schedule, and set alerts to notify after each build.
  • Built end-to-end CI/CD pipelines in Jenkins and wrote code deployment guides for developers, testers, and production management.
  • Created metadata types (branch, label, trigger, and hyperlink) and supported developers in creating config specs.
  • Worked on SonarQube for continuous inspection of code quality and automatic code reviews to detect bugs; automated Nagios alerts and email notifications using Python scripts.
  • Created EAR, WAR, and JAR files using ANT scripts; responsible for builds and for managing the testing and pre-production environments; set up ANT-script-based jobs in Jenkins and worked with Jenkins pipelines.
  • Involved in developing custom scripts using Python, JSON, PowerShell, Perl, and shell to automate jobs, and wrote Python scripts for automated backups and cron jobs.
  • Developed UNIX and Perl scripts for manual deployment of code to different environments, e-mailing the team when the build completed.
  • Wrote Puppet modules for installing and managing Java Versions and Build and Deployment of the Java application onto different environments Dev, QA.
  • Configured Jenkins for .NET applications using MSBuild and PowerShell scripting, and used Maven as a build tool on Java projects to develop build artifacts from source code.
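The Jenkins pipeline work above (Git webhook trigger, Maven build, SonarQube analysis, deployment of artifacts to Nexus) can be sketched as a declarative Jenkinsfile; the tool name, SonarQube server name, and stage layout are illustrative assumptions, and `withSonarQubeEnv` assumes the SonarQube Scanner plugin is installed:

```groovy
// Hypothetical declarative pipeline: checkout, build, analyze, publish.
pipeline {
    agent any
    tools { maven 'maven-3' }           // Maven tool name as configured in Jenkins (assumed)
    stages {
        stage('Checkout') {
            steps { checkout scm }       // triggered by a Git webhook
        }
        stage('Build') {
            steps { sh 'mvn -B clean package' }
        }
        stage('SonarQube analysis') {
            steps {
                withSonarQubeEnv('sonar') {     // server name is an assumption
                    sh 'mvn -B sonar:sonar'
                }
            }
        }
        stage('Publish to Nexus') {
            // Assumes the POM's distributionManagement points at Nexus
            steps { sh 'mvn -B deploy -DskipTests' }
        }
    }
    post {
        always {
            archiveArtifacts artifacts: 'target/*.jar', allowEmptyArchive: true
        }
    }
}
```

A periodic trigger (`cron`) or webhook on the job then gives the scheduled and commit-driven builds described in the bullets above.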

Environment: Puppet, Jenkins, ANT, GIT, Maven, Linux, Windows, SonarQube, Nagios, CI/CD.
