
Site Reliability / DevOps Engineer Resume

Dallas Fort Worth Area, TX

SUMMARY

  • Over 8 years of experience in the software development life cycle (SDLC), including analysis, planning, development, and testing of cloud infrastructure, and in supporting applications through migrations, troubleshooting, build, and deployment using DevOps principles on Agile Scrum projects.
  • Experienced in installing and configuring different versions of Linux distributions (CentOS, RHEL, and Ubuntu) and Windows Server 2012 R2 and 2016.
  • Experienced in creating a centralized OS repository in an AWS S3 bucket so that AWS instances can receive YUM updates.
  • Worked on automation processes using Ansible for server deployment into an Amazon Web Services (AWS) Auto Scaling group through AWS CloudFormation templates.
  • Experienced in creating custom Linux Amazon Machine Images (AMIs) of CentOS and Red Hat through AWS CloudFormation templates.
  • Experienced in writing AWS CloudFormation templates for infrastructure provisioning. Experience supporting application deployments to PCF from an on-prem VMware cloud.
  • Experienced in integrating and using New Relic, AppDynamics, Splunk, and Grafana monitoring tools.
  • Experience administering, maintaining, and provisioning AWS EKS and Kubernetes (k8s) environments from staging to production, and implementing CI/CD pipelines using Jenkins.
  • Experience working on Platform as a Service (PaaS) infrastructure for a Pivotal Cloud Foundry (PCF) implementation project, including installation, configuration, and upgrades of different versions of PCF on VMware cloud for Production, Pre-Production, and Sandbox environments.
  • Extensively worked on installation and configuration of various PCF product tiles on Pivotal Cloud Foundry. Performed performance tuning and troubleshooting of applications and resolved issues raised through the Remedy ticketing system.
  • Expertise in implementation and maintenance of Apache HTTPD, SMTP, DHCP, TCP/IP, NFS, NIS, NIS+, LDAP, DNS, Samba, Squid, Postfix, Sendmail, FTP, remote access, and security management, with security troubleshooting skills. Able to develop and execute XML, Shell, and Perl scripts.
  • Strong knowledge of writing Bash scripts, Perl scripts (hashes and arrays), PowerShell, and Python programs for deployment of Java applications on bare servers or middleware tools.
  • Good interpersonal skills and a team-working attitude; takes initiative and is proactive in solving problems and providing the best solutions.
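The CloudFormation work described above generally takes the shape of templates like the following. This is a minimal illustrative sketch, not a template from any of these projects; the parameter name, instance type, and tag value are all assumptions.

```yaml
# Illustrative CloudFormation sketch: launch an EC2 instance from a custom AMI.
AWSTemplateFormatVersion: '2010-09-09'
Description: Provision an EC2 instance from a hardened CentOS/RHEL AMI (example)
Parameters:
  BaseAmiId:
    Type: AWS::EC2::Image::Id      # ID of the custom AMI (assumed parameter)
Resources:
  AppInstance:
    Type: AWS::EC2::Instance
    Properties:
      ImageId: !Ref BaseAmiId
      InstanceType: t3.micro       # assumed size for illustration
      Tags:
        - Key: Name
          Value: centos-custom
```

A stack built from a template like this would typically be deployed with `aws cloudformation deploy` and parameterized per environment.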

TECHNICAL SKILLS

Operating Systems: RHEL/CentOS 5.x/6.x/7.x, Ubuntu, Debian, Windows Server 2012

Build/Automation Tools: Ansible, Puppet, Gradle, Maven, Jenkins

Languages: Shell, Bash, Python, Perl

Databases: AWS Aurora, DynamoDB, MySQL, MongoDB, Cassandra, PostgreSQL, SQL Server

Web/App Server: Apache, IIS, HIS, Tomcat, WebSphere Application Server, JBOSS, NodeJS

Bug Tracking Tools: Remedy, JIRA, Confluence

Version Control Tools: Bitbucket, Subversion, Git, TortoiseSVN, VisualSVN

Web Technologies/Programming Languages: Servlets, JDBC, JSP, XML, HTML, JavaScript, C, C++, Perl scripting, Python, Shell scripting.

PROFESSIONAL EXPERIENCE

Confidential, Dallas Fort Worth Area, TX

Site Reliability /DevOps Engineer

Responsibilities:

  • Support the e-commerce infrastructure and its various microservice applications on a daily basis, handling, analyzing, and resolving Remedy incident tickets in a timely manner for problems that customers and their customer service agents encounter with Confidential orders.
  • Provide 24/7 production support with monitoring and incident ticket handling, and troubleshoot issues, including Jenkins deployment issues across different applications.
  • Monitor production microservice apps via Splunk alerts, TIBCO queues, and Kibana, Grafana, and OpsCenter dashboards across metrics such as CPU/memory usage and sales/returns/adjustments data flows; analyze them, fix root causes, and involve the responsible teams to triage issues further when required. Update inventory item stocks for different Confidential stores.
  • Set up Splunk dashboard alerts with minimum and maximum thresholds in production environments.
  • Support various microservice application deployments within Jenkins pipelines.
  • Support the team in handling various issues and analyzing them in a timely manner to resolution.
  • Support teams in creating the CRQs required for each business change and application release/upgrade deployment, and ensure they are executed and completed on the correct start/end dates.
  • Install, configure, and set up Magento 2.4.0 through 2.4.3 environments in the AWS cloud with services such as PHP-FPM 7.4, Elasticsearch 6.8.x, Nginx 1.16, Varnish 6.2.x, RabbitMQ 3.8.x, Redis 5.0, and MariaDB 10.3 via Docker Compose containers and Ansible playbooks.
  • Create Bitbucket pipelines with end-to-end testing and create GitHub Actions runners in the AWS cloud.
  • Well experienced in using various AWS services for infrastructure provisioning, application hosting, and migrations. Perform cache flush requests in Akamai.
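The Magento stack described above maps naturally onto a Docker Compose file. The sketch below is illustrative only: the service names and image tags mirror the versions listed in the bullet, while ports, dependencies, and the placeholder password are assumptions.

```yaml
# Illustrative Docker Compose sketch of the Magento 2.4.x stack described above.
version: "3.8"
services:
  nginx:
    image: nginx:1.16
    ports: ["80:80"]
    depends_on: [php-fpm]
  php-fpm:
    image: php:7.4-fpm
  elasticsearch:
    image: elasticsearch:6.8.23
    environment:
      - discovery.type=single-node   # single-node mode for a dev environment
  varnish:
    image: varnish:6.2
  rabbitmq:
    image: rabbitmq:3.8
  redis:
    image: redis:5.0
  mariadb:
    image: mariadb:10.3
    environment:
      - MYSQL_ROOT_PASSWORD=changeme # placeholder, not a real credential
```

In practice Magento itself would be installed into the php-fpm container and the same topology could be reproduced by Ansible playbooks, as the bullet notes.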

Confidential, Dallas Fort Worth Area, TX

Cloud DevOps Engineer

Responsibilities:

  • Extensively worked on various AWS services: EC2, security groups, Auto Scaling groups, load balancers, VPCs, subnets, AWS Direct Connect, AMIs, IAM, S3, ECS, EKS, EBS volumes, snapshots, CloudWatch, CloudFront, CloudFormation, Lambda, Elasticsearch, ElastiCache, KMS, AWS SDK, Fargate, DynamoDB, Route 53, and RDS.
  • Create CI/CD pipelines using Jenkins for microservice applications, deploy to AWS Kubernetes (k8s) environments (Dev to QA to UAT to Prod), and administer AWS cloud environments.
  • Responsible for build, deployment, and troubleshooting of microservice applications on a daily basis to move services from staging to production; used Maven and Gradle as build tools.
  • Developed Bash/Shell scripts to automate the infrastructure dependencies.
  • Create Jenkins pipelines for Flyway DB migrations, to create upstream Jenkins build projects, to deploy AWS CloudFormation templates that provision AWS resources such as EC2 instances, security groups, AMIs, RDS instances, and IAM policies, and to delete CloudFormation stacks.
  • Experience supporting Java, .NET, React, and NodeJS microservice applications; worked on Python scripts to fetch hundreds of DB transaction records.
  • Support NetApp storage systems & multiprotocol CIFS/NFS in SAN/NAS NetApp environments.
  • Manage and support data migrations in NetApp storage systems.
  • Create NetApp storage naming conventions and volumes on NetApp filers.
  • Experience writing Terraform and CloudFormation templates to provision AWS resources such as EC2 instances, security groups, VPCs, S3 buckets, IAM, AMIs, Auto Scaling groups, load balancers, and subnets as infrastructure as code.
  • Experience writing Python, Groovy, and shell scripts to automate the build and deployment process and administration jobs. Experience with Blue-Green and Canary deployment strategies.
  • Experience integrating application security tools like SonarQube, Fortify, Black Duck, and Twistlock for security vulnerabilities, and using GitHub, HAProxy, NPM, Nexus, uDeploy, and CloudBees.
  • Experience writing Ansible playbooks and Ansible roles for various cloud IaaS tasks.
  • Create Jenkins slave machines (CentOS and Red Hat) with customized AWS AMIs.
  • Experience creating Kubernetes (k8s) namespaces, Helm charts, and container pods; deleting pods; executing commands within a pod shell; monitoring pod health; and troubleshooting application deployments.
  • Experience writing and building Docker images to install application dependency software and orchestrating them in the AWS Kubernetes environment for microservice applications.
  • Involved in Single Sign On (SSO) implementation in AWS Kubernetes environment.
  • Create load balancers for applications in multiple regions using Confidential's internal CASINO tool.
  • Experience provisioning, monitoring, and troubleshooting PostgreSQL database systems in AWS.
  • Experience using New Relic, Grafana, and Splunk for application monitoring; JFrog Artifactory; Git for version control; and Jira and ServiceNow for project management.
  • Followed Scrum Agile methodology; responsible for sprint releases for QA, UAT, and production code, creating release notes, and providing 24/7 production support for on-prem VMware and cloud applications.
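A Jenkins CI/CD pipeline of the kind described in these bullets is usually expressed as a declarative Jenkinsfile along these lines. This is an illustrative sketch under stated assumptions: the stage names, registry path, and manifest location are invented for the example, and a Maven project would swap the Gradle step for `mvn`.

```groovy
// Illustrative Jenkinsfile: build, containerize, and deploy a microservice to EKS.
pipeline {
    agent any
    stages {
        stage('Build') {
            steps { sh './gradlew clean build' }            // Gradle build + tests
        }
        stage('Image') {
            // example-registry/app is an assumed image name, not a real repo
            steps { sh 'docker build -t example-registry/app:${BUILD_NUMBER} .' }
        }
        stage('Deploy') {
            // applies an assumed manifest path against the configured cluster
            steps { sh 'kubectl apply -f k8s/deployment.yaml' }
        }
    }
}
```

Promotion from Dev through QA, UAT, and Prod would typically be modeled as additional parameterized stages or separate jobs per environment.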

Confidential, Dallas Fort Worth Area, TX

DevOps Cloud Engineer

Responsibilities:

  • Plan, design, implement, and develop AWS cloud infrastructure using EC2, AMI, Lambda, S3, Glacier, VPC, Route 53, Direct Connect, CloudFront, CloudWatch, Auto Scaling, security groups, load balancers (ELBs), CloudFormation, KMS, EKS, IAM, CloudTrail, Config, DynamoDB, volumes, snapshots, SNS, SQS, and Kinesis.
  • Set up AWS Config to provide a detailed view of the configuration of AWS resources across multiple AWS accounts and regions in dev, cert, and production environments.
  • Create Ansible Vault config files to configure AWS EC2 instances with custom configurations.
  • Implement the process of obtaining and providing Linux OS repositories to cloud instances.
  • Implemented, automated, and created Terraform and AWS CloudFormation templates in JSON to pull the required operating system packages for CentOS 6.x/7.x and RedHat 6.x/7.x from the CentOS and Red Hat communities and sync the repositories to an S3 bucket so EC2 instances receive YUM updates.
  • Created a centralized Linux OS repository so that AWS instances can receive software updates.
  • Automated OS repository server deployment using AWS Cloud Formation templates and OS repository synchronization (dev, cert and prod).
  • Automated Ansible server deployment in an Auto Scaling group using AWS CloudFormation.
  • Monitor application health checks and system-wide performance using AWS CloudWatch Logs.
  • Worked with Docker container management systems like Kubernetes and OpenShift, deploying Docker Engine on virtualized platforms to containerize multiple apps.
  • Performed OS patching, created RedHat OpenShift repository and automated the synchronization of packages to AWS S3 bucket.
  • Worked on deploying Docker clusters to Kubernetes and implemented a continuous delivery pipeline with Docker and AWS. Good understanding of Kibana, Logstash, and Elasticsearch.
  • Developed custom Linux AMIs using AWS CloudFormation templates in JSON to launch RedHat 6.x/7.x and CentOS 6.x/7.x instances and configure them to meet Confidential's OS standards.
  • Created AWS CloudFormation templates to create RedHat 6.x/7.x and CentOS 6.x/7.x instances for testing custom AMI configurations, a McAfee ePO instance for McAfee agents, and S3 buckets.
  • Automated the McAfee anti-virus installation process by baking McAfee into custom Linux AMIs through AWS CloudFormation templates and Bash shell scripts.
  • Create and deploy JSON-format CloudFormation and Terraform templates for the Qualys Virtual Scanner Appliance to scan EC2 instances for vulnerabilities across multiple AWS accounts and regions.
  • Created Virtual Private Clouds (VPCs) and VPC Endpoints (VPCe) in AWS, created Direct Connect gateways and virtual interfaces in AWS global network accounts, and created Elastic Block Store (EBS) volumes and AMI snapshots. Configured network subnets, NAT gateways, and the association of network ACLs to subnets.
  • Used Ansible Vault to encrypt confidential data in AWS and created YAML configuration files for AWS Lambda instance bootstrap creation.
  • Monitor AWS services and resources using Amazon CloudWatch Logs and create tables in AWS DynamoDB with AMI lists to aid application EC2 instances.
  • Deploy CloudFormation stacks to create AWS VPCs, Python Lambda and Step Functions, VPC peering, security groups, and AWS Config in multiple accounts and AWS regions for development, certification, and production environments.
  • Used configuration management tools like Ansible to automate Ansible role creation and to install, configure, and troubleshoot various infrastructure services.
  • Created an Ansible role to automate RedHat IDM client installation and configuration, updating the krb5.conf and sssd.conf files with the server details and domain names for IDM client communication. Experience using Rally for Scrum Agile project management and ServiceNow (SNOW) as the ticketing system for technology service management tasks.
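The Ansible Vault and repository bullets above typically combine into a playbook of roughly this shape. This is a hedged sketch: the host group, vault file name, variable name, and repo name are all assumptions, not values from the project.

```yaml
# Illustrative playbook: configure EC2 instances against the centralized
# S3-backed OS repository, with the repo URL kept in an encrypted vault file.
- name: Configure EC2 instances with vaulted settings
  hosts: app_servers              # assumed inventory group
  become: true
  vars_files:
    - vault.yml                   # encrypted via `ansible-vault encrypt vault.yml`
  tasks:
    - name: Point yum at the centralized S3-backed repository
      ansible.builtin.yum_repository:
        name: internal-os-repo
        description: Centralized OS repo synced to S3
        baseurl: "{{ vaulted_repo_url }}"   # secret resolved from vault.yml
        gpgcheck: false
    - name: Apply available updates
      ansible.builtin.yum:
        name: "*"
        state: latest
```

Running it with `ansible-playbook --ask-vault-pass site.yml` decrypts the vaulted variables at execution time, so the repo URL never sits in plaintext in version control.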

Confidential, Round Rock, TX

Cloud DevOps Engineer

Responsibilities:

  • Implemented Platform as a Service (PaaS) infrastructure for the Pivotal Cloud Foundry (PCF) project.
  • Worked on Installation and Configuration of different versions of PCF on VMware Cloud for Production, Non-Prod and Sandbox environments.
  • Worked on Upgrades and Configurations of different versions of Pivotal Cloud Foundry on VMware cloud for Production, Non-Production and Sandbox environments.
  • Worked on installation and configuration of different versions of PCF product tiles such as RabbitMQ, Redis, JMX Bridge, MySQL, PCF Spring Cloud Services, Single Sign-On, Log Search, and PCF Metrics, and on binding them to applications hosted on Pivotal Cloud Foundry.
  • Worked on Installation and Configuration of different versions of Windows Diego Cells for hosting .NET applications on Pivotal Cloud Foundry.
  • Worked on integration of Splunk with Pivotal Cloud Foundry.
  • Responsible for monitoring application logs/data in Splunk and troubleshooting application log issues.
  • Worked on installation and configuration of JConsole to monitor Java Virtual Machine (JVM) applications and their logs. Worked on onboarding .NET applications to Pivotal Cloud Foundry.
  • Worked on deploying applications on Pivotal Cloud Foundry using Cloud Foundry CLI and troubleshooting deployment issues with development teams.
  • Good understanding of the OpenShift platform for managing Docker containers with Kubernetes.
  • Created Change Requests (CRQs)/Service Requests and Remedy tickets for the L4 networking/firewall team to implement changes in the Pivotal Cloud Foundry production environment, and resolved the technical challenges. Good understanding of Kibana, Logstash, and Elasticsearch.
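Deployments with the Cloud Foundry CLI, as described above, are normally driven by an application manifest. The sketch below is illustrative only: the app name, memory size, buildpack, artifact path, and service-instance names are assumptions chosen to echo the product tiles listed earlier.

```yaml
# Illustrative Cloud Foundry manifest consumed by `cf push`.
applications:
  - name: sample-api              # assumed application name
    memory: 1G
    instances: 2
    buildpack: java_buildpack
    path: build/libs/sample-api.jar
    services:
      - rabbitmq-service          # bound service instances from product tiles
      - redis-service
```

With this file in place, `cf push` stages the artifact with the named buildpack and binds the listed service instances, so troubleshooting a failed deployment usually starts with `cf logs sample-api --recent`.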

Confidential, Rockville, MD

Systems/Cloud DevOps Engineer

Responsibilities:

  • Installed, configured, and administered all UNIX/Linux servers on Amazon Web Services (AWS), including the design and selection of relevant hardware to support installations/upgrades of Red Hat 5/6 and CentOS 5/6 operating systems. Worked on installing and configuring EC2 CLI tools and AWS CLI tools on Linux machines.
  • Provisioned storage for cluster disks and increased/decreased filesystems in RHEL.
  • Worked with the backup team on Legato backup and restore and Legato client installation on Red Hat Linux servers.
  • Automated deployments to Apache Tomcat application servers by developing Python scripts driven by Puppet. Implemented a continuous delivery framework using Jenkins, Maven, Nexus, and Puppet.
  • Configured various jobs in Jenkins and Hudson for deployment of Java-based applications and running test suites.
  • Installed and configured AWS EC2 resources such as instances, EBS volumes, snapshots, Elastic Load Balancers, AMIs, security groups, Elastic IPs, key pairs, and Amazon CloudWatch across zones in development, testing, and production environments.
  • Able to use Amazon Web Services such as EC2, S3 buckets, Route 53, RDS, EBS, ELB, Lambda, Auto Scaling, AMI, and IAM through the AWS Console and via API integration with Puppet code.
  • Designed roles and groups for users and resources using AWS Identity and Access Management (IAM) and managed network security using security groups and IAM.
  • Set up and built AWS infrastructure resources (VPC, EC2, S3, IAM, EBS, security groups, Auto Scaling, and RDS) in CloudFormation JSON templates.
  • Responsible for maintaining and configuring NFS, DNS, and DHCP to assign IP addresses to production servers.
  • Installed, configured, and troubleshot Jenkins on Linux environments and configured Jenkins security and system settings. Added multiple nodes to Jenkins and configured SSH for continuous deployments. Implemented a continuous delivery (CD) pipeline with Docker, GitHub, and AWS, and performed branching, tagging, and release activities in version control (SVN, Git).
  • Developed Bash and Ansible playbook scripts to automate configuration updates on CI/CD-deployed servers like JBoss and Tomcat. Worked with JIRA and Bugzilla for bug and issue tracking.
  • Worked on installation and configuration of PostgreSQL databases on Red Hat/Debian servers.
  • Managed highly available cluster nodes (HACMP): starting and stopping clusters, moving resources from one node to another, troubleshooting, and checking logs.
  • Performed Kernel and memory upgrades on Linux servers in Virtual cloud environment and used YUM and RPM to install packages.
  • Wrote Bash and Perl scripts to migrate consumer data from one production server to another.
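The Puppet-driven Tomcat deployments mentioned above usually reduce to a manifest of roughly this form. This is an illustrative sketch, not the project's actual module: the package and service names and the WAR path are assumptions that vary by distribution.

```puppet
# Illustrative Puppet sketch: keep Tomcat installed, running, and serving
# the latest application WAR file.
package { 'tomcat':
  ensure => installed,
}
service { 'tomcat':
  ensure  => running,
  enable  => true,
  require => Package['tomcat'],
}
file { '/var/lib/tomcat/webapps/app.war':
  ensure => file,
  source => 'puppet:///modules/app/app.war',  # WAR served from the Puppet master
  notify => Service['tomcat'],                # restart Tomcat when the WAR changes
}
```

The `notify` relationship is what makes this continuous-delivery friendly: pushing a new WAR into the module triggers a Tomcat restart on the next agent run.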

Confidential, Minneapolis, MN

Systems Engineer

Responsibilities:

  • Installation, configuration, and integration of Linux (Red Hat and SUSE), Windows, and IBM AIX servers. Installed and deployed Red Hat Enterprise Linux servers matching the existing 5.x and 6.x versions. Designed and implemented large-scale infrastructure projects, including Red Hat Linux 5-to-6 upgrades. Built Red Hat Linux servers in VMware ESX through the vSphere client.
  • Built and managed Unix/Linux servers running Red Hat Linux and Oracle Solaris 9/10. Mounted and unmounted NetApp storage LUNs on Red Hat Linux servers and troubleshot the issues encountered. Installed and configured Red Hat Linux Cluster 5.x and configured cluster resources. Provisioned storage for cluster disks and increased/decreased filesystems in RHEL.
  • Worked with the backup team on Legato backup and restore and Legato client installation on Red Hat Linux servers. Monitored system performance and tuned the kernel to enhance it.
  • Experience working with boot loaders like GRUB and LILO and upgrading kernels on Red Hat Linux servers.
  • Created server profiles and performed network and SAN virtual configuration using Virtual Connect in a BladeSystem c7000 enclosure. Installed VMware ESXi 5.5 on Confidential and HP hardware.
  • Installed and configured ESXi 3.5 and 4.x servers and applied security/vulnerability patches to them.
  • Installed and configured services such as DNS, DHCP, NFS, Sendmail, Apache Web Server, Samba, SSH, and HTTP, along with RPM package management. Configured and managed external storage on EMC CLARiiON CX360/380/480/VNX and Nimble storage arrays via SAN fabric and a dedicated iSCSI network.
  • Installed and configured web servers like Apache and IIS, integrated them with WebLogic, and managed Active Directory forests covering 150+ users and 500 servers.
  • Provided performance improvement recommendations, including optimal RAID configurations and server, 16 Gb HBA, and SAN switch design, for a media exploitation application retrieving text, photo, audio, and terabytes of video from storage; the system was based on the StorNext file system manager and an EMC Fibre Channel SAN.
  • Implemented and configured VM shells and used JumpStart and Kickstart to install operating systems.
  • Created fence devices and failover domains in the cluster and performed flip-over/failover tests between cluster nodes. Performed VNX block storage provisioning and management, including RAID groups and traditional LUNs, pools and thin LUNs, LUN migration, FAST VP, FAST Cache, storage groups, and host initiator registration. Implemented Vup 5.0, Sybase 11.5.1, Perl scripting, NIS, NFS, LDAP server, WebLogic, Apache Tomcat, and EMC SAN storage.
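The Kickstart-based unattended installs mentioned above are driven by a configuration file like the fragment below. It is purely illustrative: the partitioning scheme, package set, and placeholder password hash are assumptions, not values from any of these environments.

```text
# Illustrative Kickstart fragment for an unattended Red Hat/CentOS install.
install
lang en_US.UTF-8
rootpw --iscrypted $6$examplehash       # placeholder hash, not a real credential
autopart --type=lvm                     # LVM layout eases later filesystem resizing
%packages
@core
openssh-server
%end
```

Using LVM in the partitioning step is what makes the "increasing/decreasing the filesystem in RHEL" work described earlier straightforward, via lvextend/lvreduce on the resulting logical volumes.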
