Cloud Infrastructure/Site Reliability Engineer Resume
SUMMARY
- AWS Certified Developer - Associate and Cisco CCNA certified, with 9 years of experience in Linux administration, build and release management, and cloud implementations on AWS, Azure, GCP, and Windows Server, including extensive work on code compilation, packaging, building, debugging, automation, tuning, and deploying code across multiple environments, with the flexibility to multitask when necessary.
- Expertise in administering AWS Cloud services, including EC2, Auto Scaling, S3, EBS, ELB, Elastic IP, RDS, SNS, SQS, SES, Glacier, IAM, VPC, Direct Connect, CloudFront, CloudFormation, Route 53, Redshift, Kinesis, Lambda, Systems Manager, CloudWatch, CloudTrail, and security groups.
- Expertise in creating and maintaining Jenkins pipelines that push all microservice builds to the Docker registry and deploy them to Kubernetes; created and managed Pods using Kubernetes.
- Designed solutions on Amazon EC2 Container Service (ECS), a highly scalable, fast container management service that makes it easy to run, manage, and stop Docker containers on an AWS cluster.
- Hands-on experience in cloud infrastructure: building, provisioning, and migrating Platform as a Service (PaaS) and Database as a Service (DaaS) solutions on cloud platforms using Oracle on Linux (OEL) VM environments.
- Orchestrated and migrated CI/CD processes using CloudFormation and Terraform templates and containerized the infrastructure using Docker, set up in Vagrant, AWS, and VPCs.
- Extensively worked on continuous integration/continuous deployment tools such as Jenkins, AWS CodeCommit, CodePipeline, CodeDeploy, and Step Functions.
- Created AWS Lambda functions and assigned IAM roles to run Python scripts, and built Lambda functions in Java to perform event-driven processing (an illustrative Python handler sketch follows this summary).
- Experience configuring Azure App Service, Azure Application Insights, Azure Application Gateway, Azure DNS, and Azure Traffic Manager; analyzing Azure networks with Azure Network Watcher; and implementing Azure Site Recovery, Azure Stack, Azure Backup, and Azure Automation.
- Used Apache Spark for rapid processing of large data volumes and for enhancing the output.
- Used MLOps and DevOps practices for building and deploying solutions.
- Worked with container-based deployments using Docker, Docker images, Dockerfiles, Docker Hub, Docker Compose, and Docker registries, with task and service definitions to deploy tasks on AWS ECS clusters running on EC2 instances.
- Experience designing and configuring secure Virtual Private Clouds (VPCs) with private and public networks in AWS by creating subnets, route tables, network ACLs, and NAT gateways (see the boto3 VPC sketch following this summary).
- Strong experience bootstrapping and maintaining AWS nodes using Chef across complex hybrid IT infrastructure, reached through VPN and jump/bastion servers.
- Implemented a CI/CD pipeline involving GitLab, Jenkins, Chef, Docker, and Selenium for complete automation from commit to deployment.
- Converted existing AWS infrastructure to a serverless architecture with AWS Lambda (Python and Node.js) and deployed it via Terraform and AWS CloudFormation.
- Knowledge of Semaphore for fast continuous integration and continuous deployment across cloud platforms.
- Designed groups, users, roles, and policies using AWS IAM and automated configurations using Chef and AWS OpsWorks.
- Good knowledge of GCP services, including Google Compute Engine, Google Cloud Functions, Autoscaler, Cloud Storage, Google Kubernetes Engine (GKE), and Cloud Bigtable.
- Implemented Hadoop clusters for processing big data pipelines using Amazon EMR and Cloudera, relying on Apache Spark for fast processing and API integration; managed these resources using Apache Mesos.
- Experience with NoSQL databases: MongoDB and Redis. Proficient in writing stored procedures and functions in SQL Server and Oracle. RDBMS knowledge and experience includes SQL Server database programming skills such as creating stored procedures, views, and triggers, and data connectivity using ADO.NET.
- Deployed, configured, and maintained systems management tools: IBM Tivoli Netcool/OMNIbus 7.1, HP SiteScope, Heroix Longitude, Computer Associates AutoSys, and Opalis Robot.
- Experience in designing, architecting, and implementing scalable cloud-based web applications using AWS and GCP.
- Used AWS ECS as a highly scalable, high-performance container orchestration service to run and scale Docker-based containerized applications on AWS.
- Created application domains and objects such as Web Service Proxies and Multi-Protocol Gateways per requirements.
- Experience working in DevOps/Agile operations processes, with expertise in areas such as unit test automation, build and release automation, environment management, service management, incident management, and change management.
- Set up GCP firewall rules to allow or deny traffic to and from VM instances based on specified configurations, and used GCP Cloud CDN to deliver content from GCP cache locations, drastically improving user experience and latency.
- Managed a cloud platform based on the Lambda architecture, including Kafka, Spark, and Cassandra.
- Experience migrating on-premises infrastructure to cloud platforms such as AWS, Azure, Rackspace, and Pivotal Cloud Foundry (PCF); involved in virtualization (VMware, VMware ESX/ESXi, Xen) and infrastructure orchestration using containerization technologies such as Docker and Kubernetes.
- Experienced in branching, tagging, and maintaining versions across environments using SCM tools such as Git, Bitbucket, Subversion (SVN), and TFS on Linux and Windows platforms.
- Understanding of Spinnaker for high-velocity continuous deployments across multiple cloud environments.
- Experience with continuous integration technologies Bamboo and Jenkins. Designed and created multiple deployment strategies using continuous integration and continuous delivery pipelines and configuration management tools with remote execution to ensure zero downtime and shorten deployment cycles via automated deployments.
- Experience in load balancing and in monitoring system/application logs using Datadog, Splunk, Nginx, Nagios, Kibana, ELK, and Logstash to detect production issues.
- Accomplished patented work with data analytics, natural language processing, and streaming data platforms, along with NoSQL schema design, infrastructure setup, development, and architecture patterns to provide DaaS solutions.
- Wrote Bash shell, Ruby, Python, and PowerShell scripts to automate tasks.
- Knowledge of NGINX as a reverse proxy and load balancer to manage incoming traffic and distribute it to slower upstream servers, from legacy database servers to microservices.
- Experienced in designing, configuring, testing, and deploying applications on Apache web server, Nginx, and application servers such as Tomcat and JBoss.
- Responsible for managing all aspects of the software configuration management process, including code compilation, packaging, deployment, release methodology, and application configurations.
- Configured and monitored distributed, multi-platform servers using Chef. Skilled at defining the Chef server and workstation to manage and configure nodes. Developed Chef cookbooks and manifests to manage system configuration.
- Around 7 years of experience with Python scripting across various projects to automate tasks.
- Knowledge of load balancing and web content acceleration using Akamai and F5.
- Administered tasks such as taking backups, expanding file system disk space, and creating NFS mounts.
- Established capabilities in application design, implementation, troubleshooting, monitoring, continuous improvement, and change control. Enhanced and automated internal processes to improve efficiency.
- Administered production, development, and test environments running Windows, Ubuntu, Red Hat Linux, SUSE Linux, CentOS, and Solaris servers.
- Coordinated teams across the globe to deploy builds to different environments during parallel development for multiple projects.
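Illustrative example for the Lambda bullet above: a minimal sketch of an event-driven Python handler, assuming an S3 upload trigger. The bucket and key come from the standard S3 event payload; everything else is a generic placeholder, not project code.

```python
import json
import logging

logger = logging.getLogger()
logger.setLevel(logging.INFO)

def lambda_handler(event, context):
    """Log each S3 object that triggered the function and return a summary."""
    processed = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        logger.info("New object uploaded: s3://%s/%s", bucket, key)
        processed.append({"bucket": bucket, "key": key})
    return {"statusCode": 200, "body": json.dumps(processed)}
```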
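Illustrative example for the VPC bullet above: a minimal boto3 sketch that creates a VPC with one public and one private subnet and a default route through an internet gateway. The region and CIDR ranges are assumptions, not the actual network design.

```python
import boto3

# Hypothetical region and CIDR ranges; adjust to the organization's IP plan.
ec2 = boto3.client("ec2", region_name="us-east-1")

# VPC with one public and one private subnet.
vpc_id = ec2.create_vpc(CidrBlock="10.0.0.0/16")["Vpc"]["VpcId"]
public_subnet = ec2.create_subnet(VpcId=vpc_id, CidrBlock="10.0.1.0/24")["Subnet"]["SubnetId"]
private_subnet = ec2.create_subnet(VpcId=vpc_id, CidrBlock="10.0.2.0/24")["Subnet"]["SubnetId"]

# Internet gateway plus a default route, associated only with the public subnet.
igw_id = ec2.create_internet_gateway()["InternetGateway"]["InternetGatewayId"]
ec2.attach_internet_gateway(InternetGatewayId=igw_id, VpcId=vpc_id)
route_table_id = ec2.create_route_table(VpcId=vpc_id)["RouteTable"]["RouteTableId"]
ec2.create_route(RouteTableId=route_table_id, DestinationCidrBlock="0.0.0.0/0", GatewayId=igw_id)
ec2.associate_route_table(RouteTableId=route_table_id, SubnetId=public_subnet)

print(f"Created VPC {vpc_id} with public subnet {public_subnet} and private subnet {private_subnet}")
```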
TECHNICAL SKILLS
Cloud Technologies: OpenStack, AWS (VPC, EC2, EMR, S3, EBS, ELB, RDS, SNS, Config, SQS, Glacier, IAM, CloudFormation, Route 53, Redshift, Kinesis, Lambda, Systems Manager, CloudWatch, CloudTrail), Microsoft Azure, Spinnaker, Rackspace, Pivotal Cloud Foundry (PCF), AppStream
Operating Systems: UNIX, Red Hat Linux (RHEL), Ubuntu, Windows 98/NT/XP/Vista/7/8
SCM Tools: Subversion, Git, Bitbucket, TFS
Build Tools: Ant, Maven
CI/CD Tools: Jenkins, Hudson, Bamboo, Octopus, uDeploy, CircleCI, Semaphore
Containerization: Docker, Kubernetes, Mesos, OpenShift
Configuration Management: Chef, Ansible
CDN: Amazon CloudFront, Akamai Site Accelerator, SoftLayer CDN, CDN.net, KeyCDN.com, CDN77.com
Bug Tracking & Testing: JIRA, Bugzilla, JUnit, TestFlight, TestRail
Repositories: Nexus, Artifactory
Web Service Tools: JBoss, Apache Tomcat, IntelliJ IDEA, Oracle WebLogic, IBM WebSphere, IIS Server, Nginx, VMware, VMware ESXi
Languages/Utilities: Shell script, Ant script, batch script, Ruby, Perl, Node.js, C, C++, Objective-C, Python, Java, J2EE
Networking: TCP/IP, NIS, NFS, DNS, DHCP, Cisco Routers/Switches, WAN, SMTP, LAN, FTP/TFTP.
Databases: MS SQL Server, MySQL, Oracle, DB2, Teradata
Monitoring and Profiling Tools: Datadog, Splunk, Dynatrace, JProfiler, Kibana, Logstash
PROFESSIONAL EXPERIENCE
Confidential
Cloud Infrastructure/Site Reliability Engineer
Responsibilities:
- Provisioned infrastructure on AWS (EC2, VPC, S3, SQS, SNS) with automation through CloudFormation, Terraform, Python Boto, and Bash scripts.
- Created automation and deployment templates for relational and non-relational databases, including MS SQL, MySQL, Cassandra, and MongoDB, for different microservices.
- Created AWS CloudFormation templates to build the organization's AWS network from scratch with custom-sized VPCs, subnets, NAT gateways, internet gateways, route tables, ACLs, EC2 instances, ELBs, and security groups.
- Automated the infrastructure using Terraform and made it auditable by storing all infrastructure changes in a version control system (Git).
- Set up and built AWS infrastructure across various services by writing CloudFormation templates in JSON.
- Applied tagging standards for proper identification and ownership of EC2 instances and other AWS services such as CloudFront, CloudWatch, Elastic Beanstalk, RDS, Redshift, S3, Route 53, SNS, SQS, and CloudTrail.
- Managed a team of DevOps engineers for infrastructure support on the AWS cloud. Created CloudFormation scripts for hosting software on AWS and automated software installation through PowerShell scripts and CM tools.
- Configured IAM roles for EC2 instances and assigned them policies granting specific levels of access to S3 buckets. Using CloudWatch, created alarms for monitoring EC2 server performance metrics such as CPU utilization and disk usage (an illustrative boto3 alarm sketch appears at the end of this list).
- Automated infrastructure tasks for the NOC using Python, Bash, and Jenkins.
- Utilized AWS ECS to run microservice applications with native integration to AWS services, enabling CI/CD pipelines.
- Managed DNS, LDAP, FTP, JBoss, Tomcat, and Apache web servers on Linux.
- Developed automation scripts for various configuration management tools, including AWS Lambda functions, to reduce day-to-day repetitive work using Python and shell.
- Experience with Windows Azure IaaS: Virtual Networks, Virtual Machines, Cloud Services, Resource Groups, ExpressRoute, Traffic Manager, VPN, Load Balancing, Application Gateways, and Auto-Scaling.
- Developed tools to automate the deployment, administration, and monitoring of a large-scale AWS Linux environment.
- Developed a Web API to extract data from SQL Server.
- Proficient in using Docker in swarm mode and Kubernetes for container orchestration, writing Dockerfiles and setting up automated builds on Docker Hub.
- Deployed MongoDB using a Docker image and set up a MongoDB cluster using MongoDB Atlas.
- Experience setting up the MongoDB client and writing queries to validate data against Mongo collections and MySQL.
- Installed, configured, and managed MongoDB servers and performed performance tuning of Mongo databases.
- Worked with AWS application services: SES, CloudSearch, SNS, AppStream, SQS, and Elastic Transcoder.
- Utilized Kubernetes and Docker as the runtime environment of the CI/CD system to build, test, and deploy.
- Built and maintained Docker container clusters managed by Kubernetes on AWS using Linux, Bash, Git, and Docker.
- Implemented production-ready, load-balanced, highly available, fault-tolerant AWS infrastructure and microservice container orchestration.
- Set up databases in GCP using RDS, storage using S3 buckets, and configured instance backups to S3. Prototyped a CI/CD system with GitLab on GKE, utilizing Kubernetes and Docker as the runtime environment to build, test, and deploy.
- Created Puppet manifests and modules to automate system operations. Created monitors, alarms, and notifications for EC2 hosts using CloudWatch.
- In addition to supporting large-scale web applications, indexed database queries on MySQL Server by writing SQL queries. Worked on Apache Cassandra and Spark along with Teradata to manage large structured data sets and perform ETL.
- Built scalable Docker infrastructure for microservices on AWS Elastic Container Service (ECS) by creating task definition JSON files.
- Implemented a serverless architecture using API Gateway, Lambda, and DynamoDB, and deployed AWS Lambda code from Amazon S3 buckets. Created a Lambda deployment function and configured it to receive events from an S3 bucket.
- Included security groups, network ACLs, internet gateways, and Elastic IPs to ensure a secure environment for the organization in the AWS public cloud.
- Developed a stream filtering system using Spark Streaming on top of Apache Kafka.
- Installed and configured Netcool/OMNIbus 7.4 (primary/backup) on the collection, aggregation (virtual pair), and display layers, along with the bi-directional and uni-directional gateways; migrated the ObjectServer DB and set up Netcool/OMNIbus email notification.
- Created detailed AWS security groups that behaved as virtual firewalls controlling the traffic allowed to reach one or more EC2 instances. Handled operations and maintenance support for AWS cloud resources, including launching, maintaining, and troubleshooting EC2 instances, S3 buckets, Virtual Private Clouds (VPCs), Elastic Load Balancers (ELBs), and Relational Database Service (RDS) instances.
- Experience in code deployment, orchestration, and scheduling using tools such as Kubernetes, Docker Swarm, Apache Mesos, CoreOS Fleet, Cloud Foundry's Diego, and CloudFormation, and in automation validation using Test Kitchen, Vagrant, Ansible, and Terraform.
- Used Ansible playbooks to set up the continuous delivery pipeline. Deployed microservices, including provisioning AWS environments, using Ansible playbooks.
- Migrated data from MongoDB to Amazon DynamoDB using AWS Database Migration Service.
- Experience using Kubernetes to deploy, scale, load balance, and manage Docker containers with multiple namespaced versions, and a good understanding of managing Docker containers and Kubernetes clusters on AWS.
- Hands-on experience with Ansible and OpsWorks in the AWS cloud environment; implemented Ansible on AWS.
- Strong knowledge of the Selenium automation framework for regression testing using TestNG and Jenkins.
- Worked on Google Cloud Platform (GCP) services such as Compute Engine, Cloud Load Balancing, Cloud Storage, Cloud SQL, Stackdriver Monitoring, and Cloud Deployment Manager.
- Developed build scripts using Gulp to compress and minify all JavaScript files.
- Worked on Docker container snapshots, attaching to running containers, removing images, managing directory structures, and managing containers in AWS ECS.
- Created PowerShell scripts to continuously monitor the health of the Exchange messaging infrastructure and notify the team in the event of a problem.
- Performed secure desktop application streaming from AWS to a web browser using Amazon AppStream 2.0.
- Implemented a continuous delivery pipeline with Docker, Jenkins, GitHub, and AWS AMIs: whenever a new GitHub branch is started, the Jenkins server automatically attempts to build a new Docker container from it; the Docker container leverages Linux containers from the baked AMI.
- Centrally managed an automated research-oriented Linux environment through automation/configuration tools like Ansible.
- Designed, built, configured, tested, installed, managed, and supported all aspects and components (via Chef) of the application development environments in AWS.
- Used AWS SageMaker to quickly build, train, and deploy machine learning models.
- Developed various helper classes using core Java multithreaded programming and collection classes.
- Experience in performance tuning and troubleshooting Java and JavaScript applications by performing thread and heap dump analysis and using profiling and monitoring tools such as Wily, Dynatrace, and Google DevTools profiler and network tools, along with log monitoring solutions such as Splunk and the ELK stack.
- Implemented Ansible playbooks for build deployments on internal data center servers; reused and modified the same playbooks to deploy directly to Amazon EC2 instances.
- Used Ansible Tower, which provides an easy-to-use dashboard and role-based access control, making it easier to give individual teams access to Ansible for their deployments.
- Wrote Terraform templates for AWS infrastructure as code to build staging and production environments, and set up build automation with Jenkins.
- Set up GCP firewall rules to allow or deny traffic to and from VM instances based on specified configurations, and used GCP Cloud CDN to deliver content from GCP cache locations, drastically improving user experience and latency.
- Monitored data pipelines using Apache Kafka and ETL processes.
- Used the Spring Kafka API to process messages smoothly on the Kafka cluster setup.
- Spun up core services on demand with custom Python and Ruby scripts and Ansible playbooks.
- Automated the installation of the ELK agent (Filebeat) with an Ansible playbook.
- Deployed and configured Elasticsearch, Logstash, and Kibana (ELK) for log analytics, full-text search, and application monitoring, integrated with AWS Lambda and CloudWatch. Established a DevOps culture based on Docker and Kubernetes tooling.
- Worked with different data feed formats such as JSON, CSV, XML, and DAT, and implemented a data lake on the AWS platform.
- Also worked on Apache Hadoop, using Kafka as the messaging system and Spark for processing large data sets.
- Wrote AWS Lambda code in Python for nested JSON files: converting, comparing, sorting, etc.
- Experience with AWS Lambda, which runs code in response to events.
- Used Kafka as a messaging tool between APIs and implemented the workflow in API development using Activiti.
- Used Kibana and Elasticsearch to identify Kafka message failure scenarios.
- Worked on the Nagios monitoring tool, its configuration with Puppet, and AWS cloud management with Puppet automation.
- Utilized Spark to improve the performance and optimization of existing algorithms in Hadoop using SparkContext, Spark SQL, DataFrames, pair RDDs, and Spark on YARN.
- Developed a Netcool event query tool to provide event statistics for various teams. Worked with a Netcool support member on various monitoring tests.
- Migrated on-premises data (SQL Server) to an AWS data lake using AWS Data Factory and AWS Data Exchange.
- Designed and constructed AWS data pipelines using various AWS resources, including API Gateway to receive responses from AWS Lambda, and Lambda functions to retrieve data from Snowflake and convert the responses into JSON format, using Snowflake, DynamoDB, AWS Lambda, and S3.
- Managed deployment automation using Puppet, MCollective, Hiera, and custom Puppet modules.
- Automated application and MySQL container deployments in Docker using Python and monitored them using Nagios (see the Docker SDK sketch at the end of this list).
- Configured Nginx to proxy RESTful API calls to microservices in Docker containers.
- Hands-on experience in PowerShell and Bash scripting for Nagios.
- Restarted services such as HDFS, Oozie, Spark, and MapReduce to apply necessary configuration changes inside the EMR clusters.
- Designed and implemented Kafka topics, configuring them in the new Kafka cluster across all environments.
- Worked closely with the Kafka admin team to set up Kafka clusters in the QA and production environments.
- Designed and implemented a static webpage architecture using services such as IAM, KMS, Cognito, API Gateway, Route 53, and S3.
- Used Terraform to reliably version and create infrastructure on Azure.
- Created resources using Azure Terraform modules and automated infrastructure management.
- Deployed similar infrastructure to Azure, additional cloud providers, and on-premises data centers using Terraform, and managed infrastructure across multiple cloud providers.
- Created and maintained highly scalable, fault-tolerant, multi-tier AWS and Azure environments spanning multiple availability zones using Terraform and CloudFormation.
- Implemented Kafka consumers, producers, and topics to reliably capture user actions, log API calls, and audit events within the system for security and compliance purposes.
- Created Python scripts to fully automate AWS services, including web servers, ELB, CloudFront distributions, databases, EC2 and database security groups, S3 buckets, and application configuration; the scripts create stacks or single servers, or join web servers to existing stacks.
- Converted existing Terraform modules that had version conflicts to use CloudFormation during Terraform deployments, enabling more control and missing capabilities.
- Expertise in data encryption (client-side and server-side) and in securing data at rest and in transit for S3, ELB, EBS, RDS, EMR, and Redshift using Key Management Service (KMS).
- Created and documented a POC to help migrate the current application to a microservice architecture, with Docker as the container technology, Kubernetes for orchestration, and REST APIs.
- Generated monthly and weekly performance reports for upper management (director and above) covering different initiatives within the engineering and operations teams. Defined the future roadmap for IAM-related services and functions.
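Illustrative example for the CloudWatch alarm bullet in this list: a minimal boto3 sketch that creates a CPU-utilization alarm for one EC2 instance. The instance ID, SNS topic ARN, region, and threshold are assumptions.

```python
import boto3

# Hypothetical instance ID and SNS topic ARN; substitute real values.
INSTANCE_ID = "i-0123456789abcdef0"
SNS_TOPIC_ARN = "arn:aws:sns:us-east-1:123456789012:ops-alerts"

cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

# Alarm when average CPU stays above 80% for two consecutive 5-minute periods.
cloudwatch.put_metric_alarm(
    AlarmName=f"high-cpu-{INSTANCE_ID}",
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "InstanceId", "Value": INSTANCE_ID}],
    Statistic="Average",
    Period=300,
    EvaluationPeriods=2,
    Threshold=80.0,
    ComparisonOperator="GreaterThanThreshold",
    AlarmActions=[SNS_TOPIC_ARN],
)
```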
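Illustrative example for the MySQL-in-Docker bullet in this list: a minimal sketch using the Docker SDK for Python. The image tag, container name, and credentials are placeholders, and the status check merely stands in for the Nagios integration described above.

```python
import docker

# Image tag, container name, and credentials are placeholders.
client = docker.from_env()

container = client.containers.run(
    "mysql:8.0",
    name="app-mysql",
    environment={"MYSQL_ROOT_PASSWORD": "change-me", "MYSQL_DATABASE": "appdb"},
    ports={"3306/tcp": 3306},
    detach=True,
)

# Simple status check that an external monitor (e.g. a Nagios plugin) could reuse.
container.reload()
print(f"{container.name}: {container.status}")
```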
Environment: AWS, NoSQL, MS SQL, MySQL, Git, Jenkins, Ansible, VMware, Docker, Kubernetes, Pivotal Cloud Foundry (PCF), Terraform, Python, shell script, Maven, Ant, Cassandra, MongoDB, CloudFront, CloudWatch, CloudTrail, CloudFormation, Java, PHP, Node.js, ELK, Nagios, Red Hat Linux, mpstat, Packer, Nexus, JIRA, Vagrant, PowerShell.
Confidential, Foster City, CA
DevOps Engineer
Responsibilities:
- Worked closely with the Development Team in the design phase and developed Use case diagrams using Rational Rose.
- Understanding of Node.js for maintaining concurrent connections.
- Knowledge of Node.js's event-driven, non-blocking I/O model, which makes it lightweight and efficient.
- Implemented and maintained branching and build/release strategies using Subversion and Git.
- Experience working with ASP.NET 4.0, MVC 3, WCF, Visual Studio, and the .NET Framework.
- Automated the infrastructure using Terraform and made it auditable by storing all infrastructure changes in a version control system (Git).
- Wrote Ansible playbooks for installing software packages and web applications on virtual machines and AWS EC2 instances.
- Worked with an application based on JSP, JavaScript, Struts 2.0, JSF 2.0, Cloudant, and Hibernate 3.0, applying service-oriented architecture, system analysis and design methodology, and object-oriented design.
- Set up and configured the CI/CD process using Bitbucket Data Center, Jenkins, Docker, and RHEL Satellite 7.0, with OpenSCAP as the image vulnerability scanning tool.
- Created Docker images from scratch, customized and modified base images from existing environment configurations, and maintained the image repository for development teams.
- Understanding of Node.js as an asynchronous, event-driven JavaScript runtime for building scalable network applications.
- Developed a migration approach to move workloads from on-premises to Windows Azure for Windows machines and to AWS for Linux/Solaris machines. Administered RHEL, CentOS, Ubuntu, UNIX, and Windows servers.
- Developed scripts to deploy customer environments into AWS using Bash and Python, and created scripts integrated with the Amazon API to control instance operations (an illustrative boto3 sketch appears at the end of this list).
- Worked with various Docker components such as Docker Engine, Hub, Machine, Compose, and Registry.
- Ran Python scripts to fully automate AWS services, including web servers, ELB, CloudFront distributions, databases, EC2 and database security groups, S3 buckets, and application configuration; the scripts create stacks or single servers, or join web servers to existing stacks.
- Implemented a new CI/CD pipeline with containerized application deployment using orchestration tools such as Docker and Kubernetes.
- Configured PHP, MySQL, and WAMP/MAMP servers.
- Experience with MySQL database development and writing SQL queries. Report-writing experience with Excel, PHP, MySQL, and JavaScript charting libraries such as Highcharts and D3.
- Developed PHP code and worked on the CAS module for single sign-on authentication.
- Proficient with Helm charts to manage and release Helm packages.
- Strong expertise in DevOps concepts such as continuous integration (CI), continuous delivery (CD), infrastructure as code (IaC), and cloud computing.
- Designed and deployed applications using the full AWS stack (including EC2, Route 53, S3, ELB, EBS, VPC, RDS, DynamoDB, SNS, SQS, IAM, KMS, Lambda, and Kinesis), focusing on high availability, fault tolerance, and auto-scaling with AWS CloudFormation, deployment services (OpsWorks and CloudFormation), and security practices (IAM, CloudWatch, CloudTrail).
- Configured AWS IAM and security groups in public and private subnets in the VPC.
- Designed AWS CloudFormation templates to create custom-sized VPCs, subnets, and NAT gateways to ensure successful deployment of web application and database templates.
- Configured AWS Route 53 to route traffic between different regions.
- Performed SVN to Git/Bitbucket migration and managed branching strategies using the Gitflow workflow. Managed user access control, triggers, workflows, hooks, security, and repository control in Bitbucket.
- Implemented multiple CI/CD pipelines as part of the DevOps role for on-premises and cloud-based software using Jenkins, Chef, OpenShift, AWS, and Docker.
- Experience in configuration management, cloud infrastructure, and automation with Amazon Web Services (AWS), Ant, Maven, Jenkins, Chef, SVN, GitHub, ClearCase, Tomcat, and Linux.
- Performed data profiling, mapping, and integration from multiple sources to AWS S3, RDS, and Redshift.
- Automated ETL loads into the Redshift database using Windows batch scripts.
- Performed Murex CVA desk tasks.
- Installed and configured virtual machines, storage accounts, virtual networks, and Azure Load Balancer in the Azure cloud.
- Responsible for implementing, designing, and architecting solutions for Azure cloud and network infrastructure, and for data center migration to public, private, and hybrid cloud.
- Performed assessments of the existing environment (applications, servers, databases) using tools such as the MAP Toolkit, the Azure Website Migration Assistant, and manual assessment.
- Stored secured data in MySQL. Used HashiCorp Vault to secure, store, and tightly control the access tokens and passwords used by the overall platform; it started in the AWS cloud and currently integrates with services such as AWS IAM, Amazon DynamoDB, Amazon SNS, and Amazon RDS.
- Deployed Microsoft Azure Cloud Services (PaaS, IaaS) and Web Apps.
- Used SQL Azure extensively for database needs in Customer Lookup.
- Coordinated with QA Testing Team to test various test scenarios involving Test Plans, Test Cases and Test Scripts.
- Migrated the Azure CXP tools to HTTPS-based authentication using SSL encryption.
- Worked with Nagios for Azure Active Directory and LDAP, and consolidated data for LDAP users. Monitored system performance using Nagios, maintained Nagios servers, and added new services and servers.
- Delivered alarm stream from Netcool to Auspice TLX.
- Experience configuring and managing the Puppet master server, as well as creating and updating modules and pushing them to Puppet clients.
- Experience writing infrastructure as code (IaC) with Terraform, Azure Resource Manager, and AWS CloudFormation. Created reusable Terraform modules in both Azure and AWS cloud environments.
- Implemented a federation solution using SAML 2.0 and PingFederate 6.
- Installed Bitbucket Data Center on premises to merge the existing Stash/Bitbucket servers of individual companies in the organization into one platform for code repositories; purged multiple Akamai URLs and reported everything verbosely to the Jenkins console output log.
- Fine-tuned the GitLab pipeline by removing unnecessary code in the .gitlab-ci.yml file and streamlining the stages triggered from specific branches.
- Experience using Tekton in both the IBM Cloud DevOps service and IBM Cloud Pak for Applications on OpenShift 4.2.
- Managed Red Hat Linux user accounts, groups, directories, and file permissions.
- Supported the development team in provisioning servers and integrating them with the system and CI/CD setup. Experience using Terraform along with Packer to create custom machine images, and automation tools such as Chef to install software after the infrastructure is provisioned.
- Designed and developed the Dynamics-AAA (Access, Authorize & Audit) portal, which provides secure access to Azure resources and assigns custom roles. This portal became a standard for granting access in compliance with MSIT standards.
- Deployed Java applications to web application servers such as WebLogic.
- Involved in periodic archiving and storage of source code for disaster recovery.
- Worked as a system administrator for the build and deployment process on the enterprise server.
- Executed user administration and maintenance tasks, including creating users and groups, reports, and queries.
- Responsible for the design and maintenance of the Subversion/Git repositories, views, and access control strategies.
- Worked closely with developers to pinpoint and provide early warnings of common build failures.
- Used Ant and Maven as build tools on Java projects to produce build artifacts from the source code.
- Automated the build and release management process, including monitoring changes between releases.
- Kept track of all releases and developer requests through an infrastructure management tool.
- Used tools such as JIRA for bug tracking: created tickets and generated reports on different bugs and tickets.
- Involved in video management, video analytics, manipulation, and distribution applications.
- Experience implementing a continuous delivery framework using Jenkins, Maven, and Nexus in a Linux environment.
- Analyzed data from different sources using the Hadoop big data solution by implementing Azure Data Factory, Azure Data Lake, Azure Data Lake Analytics, HDInsight, Hive, and Sqoop.
- Worked with Docker container snapshots, attaching to running containers, managing containers and directory structures, and removing Docker images.
- Experienced in using an advanced, next-generation DevOps tool stack including Jenkins, AnthillPro, uBuild, uDeploy, TFS, Docker, and Mesos.
- Administered a DevOps tool suite including Puppet Enterprise, AWS, TeamCity, GitHub, JIRA, Confluence, Rundeck, Octopus Deploy, Splunk, and the ELK stack.
- Created views and appropriate metadata, performed merges, and executed builds on a pool of dedicated build machines.
- Documented the project's software release management procedures and input decisions.
- Developed, maintained, and distributed release notes for each scheduled release.
- Provided periodic feedback on status and scheduling issues to management.
- Used the continuous integration tool AnthillPro to automate daily processes.
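Illustrative example for the instance-operations bullet in this list: a minimal boto3 sketch that looks up EC2 instances by an assumed Environment tag and stops them. The tag key, region, and function names are assumptions, not part of the original tooling.

```python
import boto3

# The "Environment" tag key and region are assumptions.
ec2 = boto3.client("ec2", region_name="us-east-1")

def instances_by_environment(env_name):
    """Return the IDs of instances tagged Environment=<env_name>."""
    response = ec2.describe_instances(
        Filters=[{"Name": "tag:Environment", "Values": [env_name]}]
    )
    return [
        instance["InstanceId"]
        for reservation in response["Reservations"]
        for instance in reservation["Instances"]
    ]

def stop_environment(env_name):
    """Stop every instance in an environment, e.g. outside business hours."""
    instance_ids = instances_by_environment(env_name)
    if instance_ids:
        ec2.stop_instances(InstanceIds=instance_ids)
    return instance_ids

if __name__ == "__main__":
    print("Stopped:", stop_environment("qa"))
```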
Environment: Bitbucket, Git, AWS, Jenkins, Terraform, Docker, Kubernetes, Java, Ant, Maven, JIRA, Python, Ruby, Linux, XML, Windows XP, Windows Server 2003, WebLogic, MySQL, Node.js, Perl scripts, shell scripts.
Confidential, Cary, NC
DevOps / Build and Release Engineer
Responsibilities:
- Designed and Developed Jenkins Build deployments.
- Managed Nexus Maven repositories to download the artifacts during the build.
- Used Maven Nexus Repository to upload the build artifacts after a successful build.
- Designed and developed Korn shell and Perl build scripts.
- Installed and administered tools like GitLab/Jenkins, Jira, Confluence and Fisheye.
- Wrote Perl and shell scripts for deployments to servers.
- Created PostgreSQL and Oracle databases on AWS and modified their settings (see the boto3 RDS sketch at the end of this list).
- Experience writing Maven pom.xml and Ant build.xml build scripts.
- Branched and merged code lines in Git and resolved all conflicts raised during merges.
- Provided reports using BeanShell scripting in AnthillPro.
- Handled merging and automation processes across environments using SCM tools such as Git, Octopus, Stash, and TFS on Linux and Windows platforms.
- Edited and maintained Netcool rules files and SCOM management packs.
- Used Behat for user acceptance testing of the website.
- Developed Netcool probe rules files from SNMP MIBs to correlate events (alarming and clearing), establish severities, and develop Impact policies.
- Installed and configured Netcool/Precision to discover ATM devices.
- Installed and configured the Netcool high-performance Oracle gateway, providing capacity lacking in the reporter gateway.
- Experienced in installing databases such as MySQL and Oracle on Linux machines.
- Deployed build artifacts into environments such as QA, UAT, and production according to the build life cycle. Developed automated tools that produce repeatable, auditable software builds and deployments across all environments and a variety of platforms.
- Used Git as the version control system for two applications. Managed development and integration streams.
- Trained teams on using Confluence for documentation and collaboration activities.
- Used JMeter and Selenium for load testing and Front-End performance testing.
- Ran SQL scripts under MySQL and indexed the databases for analysis in the Mesos environment.
- Wrote WLST scripts and integrated them with AnthillPro to automate deployment activities to various environments.
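Illustrative example for the PostgreSQL-on-AWS bullet in this list: a minimal boto3 sketch that provisions an RDS PostgreSQL instance and waits for it to become available. The identifier, instance class, credentials, and region are assumptions.

```python
import boto3

# Identifier, instance class, credentials, and region are placeholders.
rds = boto3.client("rds", region_name="us-east-1")

rds.create_db_instance(
    DBInstanceIdentifier="qa-postgres",
    Engine="postgres",
    DBInstanceClass="db.t3.micro",
    MasterUsername="dbadmin",
    MasterUserPassword="change-me",
    AllocatedStorage=20,
    MultiAZ=False,
)

# Block until the instance is reachable before running any schema scripts.
waiter = rds.get_waiter("db_instance_available")
waiter.wait(DBInstanceIdentifier="qa-postgres")
print("qa-postgres is available")
```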
Environment: Git, Jenkins, Java/J2EE, Ant, Maven, JIRA, Ruby, Linux, XML, Windows XP, Windows Server 2003/2008, MySQL, Perl scripts, shell scripts.