
DevOps Engineer Resume


Manhattan Beach, CA

SUMMARY:

  • Over 9 years of experience in the IT industry, with a major focus on Configuration Management, SCM, and Build/Release Management, as well as AWS DevOps operations across production and cross-platform environments.
  • Experience in Software Development Life Cycle (SDLC) methodologies, including Agile.
  • Experience in using Data Pipeline to move data between AWS services (storage, compute) as well as on-premises data sources.
  • Experience in creating secure cloud infrastructure using Virtual Private Cloud (VPC) for staging and development environments on AWS.
  • Proficient in using Confidential services such as EC2, ECS, EFS, IAM, S3, ELB, API Gateway, RDS, Route 53, CloudWatch, CloudFormation, Redshift, etc.
  • Proficient in using OpenStack core services such as Nova, Neutron, Glance, Cinder, Swift, and Keystone.
  • Proficient in setting up security and identity across Azure through Active Directory (AD), including Key Vault, AD B2C, and Security Center.
  • Worked on Azure storage services (Storage, Data Lake Store, Backup), databases (DocumentDB, SQL Data Warehouse), and networking services (VNet, Load Balancer, DNS, and CDN).
  • Expertise in writing ARM templates; well versed in Azure compute services.
  • Extensive experience writing PowerShell scripts to automate services within Azure.
  • Expertise in working with Azure Active Directory: creating roles and tenants and assigning security policies.
  • Extensive experience in Linux/Unix system administration: system and server builds, installations, upgrades, tuning, migration, and troubleshooting.
  • Experience in administration and maintenance of source control management systems such as Git, Subversion (SVN), and Bitbucket, with knowledge of IBM Rational ClearCase.
  • Experience in using configuration management tools like Chef, Puppet, and Ansible.
  • Expertise in writing Ruby, Python, Shell, and PowerShell scripts.
  • Used Apigee and microservices to deploy, scale, and implement components independently.
  • Experience working with Apache Spark, Kafka, Hadoop, and Cassandra on Apache Mesos; also used Apache Oozie and Airflow.
  • Experience with database software such as Oracle RDBMS, IBM DB2, MySQL, and Microsoft SQL Server.
  • Experience with container management tools Docker, Mesos, Marathon, and Kubernetes; managed clusters of nodes using Docker Swarm, Docker Compose, DC/OS, and Kubernetes.
  • Extensive experience using build automation tools like Maven, Ant, Gradle, and SBT, and frameworks like Spring Boot and Spring MVC, with Jenkins for build and deployment.
  • Knowledge of SOA, SSL certificates, and testing protocols.
  • Extensive experience with continuous integration tools like Bamboo, Jenkins, and Build Forge on Java and J2EE projects.
  • Experience in deploying code through web/application servers like WebSphere, WebLogic, Apache Tomcat, and JBoss.
  • Extensive experience with the release and deployment of large-scale .NET, Java/J2EE, Android, and iOS applications.
  • Knowledge of networking concepts and protocols such as TCP/IP, UDP, ICMP, MAC addresses, IP packets, DNS, OSI layers, and load balancing.
  • Strong interest in improving business applications through integration with SFDC Wave.
  • Hands-on with blueprinting and automated orchestration of workflows in vRealize.
  • Experience using the StarTeam and Polarion Application Lifecycle Management (ALM) tools for collaboration, traceability, and workflow across projects.
  • Experience working in the healthcare domain; well versed in HIPAA-regulated environments and the benefits of their integration with AWS.
  • Team player with excellent interpersonal skills; self-motivated, dedicated, familiar with the demands of 24/7 system maintenance, and experienced in customer support.

TECHNICAL SKILLS:

Operating Systems: RHEL, CentOS, Ubuntu, Apache Mesos, Unix, Windows

Version Control Tools: Git, Bitbucket, SVN and IBM Rational ClearCase

Web/Application Servers: WebLogic, Apache Tomcat, WebSphere, IIS and JBoss

Database Servers: Oracle RDBMS, IBM DB2, Microsoft SQL Server, PostgreSQL and MySQL

Hardware Servers: Mac, blade servers, Dell servers

Web Technologies: SOAP, REST

Database Analytics: Amazon Redshift, MemSQL, Greenplum, Teradata, Datameer and IBM Netezza

Data Streaming: Apache Kafka, Netflix Asgard, Kinesis, MPEG

Automation Tools: Jenkins/Hudson, Build Forge, Capistrano, IBM uDeploy and Bamboo

Database Frameworks: Hadoop, Apache Spark, InfluxDB, Hibernate, Cassandra, MongoDB and Hive

Content Delivery: CloudFront, Akamai

Workflow and Job Scheduling: Apache Oozie, Apache Airflow, Kibana, RUNDECK, Logstash

Frameworks: Flask, Django, Angular.js, Node.js, Boto3, MCollective

API Management: Apigee, Kong

Testing: HP QTP, JMeter and Selenium

BI Tools: Qlik Sense, QlikView, Looker

Build Tools: Maven, Ant, Gradle, SBT, Java Spring Boot and MS Build

Repository Managers: JFrog, Nexus

Configuration Tools: Chef, Puppet, Ansible, Salt, Foreman, UCS Director and Terraform

Messaging Services: RabbitMQ, Apache Kafka and Amazon SNS

Bug Tracking Tools: JIRA, Confluence, Alfresco and IBM ClearQuest

Scripting: Shell, Ruby, PowerShell, Perl and Python

Virtualization Tools: Oracle VM Virtual Box, Vagrant, Hyper-V and VMware

Container Platforms: Docker, Kubernetes, ECS, Packer, Mesos, Marathon and CoreOS

Monitoring Tools: Nagios, AppDynamics, CloudWatch, Datadog, Elasticsearch, Sensu, SolarWinds and Splunk

ALM: Polarion, StarTeam, DOORS

Cloud Platform: AWS, Microsoft Azure, Google Cloud, OpenStack

Cloud Platform Management: vRealize

PaaS: OpenShift, Cloud Foundry, Heroku

Languages: C/C++, Java, C#, .NET, Android, Swift, HTML, CSS, JavaScript, PHP, Go, Scala

PROFESSIONAL EXPERIENCE:

DevOps Engineer

Confidential, Manhattan Beach, CA

Responsibilities:

  • Worked on blueprints for setting up a new AWS environment with specific targets per the requirements.
  • Used Terraform to provision the AWS infrastructure, which had previously been hosted on-premises.
  • Administered Bitbucket and deployed the web applications to AWS through CodeDeploy (a deployment-trigger sketch follows this list).
  • Relied on EC2, S3, Elastic Beanstalk, CodeDeploy, VPC, IAM, ELB, RDS, and Route 53 to run the web applications.
  • Installed Node.js and PHP on the instances to provide the runtime environment the applications needed.
  • Backed the web applications with RDS MySQL instances for database support.
  • Monitored project status through daily stand-ups and weekly meetings.
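
A minimal sketch of the kind of deployment trigger described above, using boto3 to start a CodeDeploy deployment from a revision bundle in S3. The application, deployment-group, bucket, and key names are hypothetical placeholders, not the actual project's:

    # Hedged sketch: kick off an AWS CodeDeploy deployment for a web app
    # whose bundle has been pushed to S3 (all names below are hypothetical).
    import boto3

    codedeploy = boto3.client("codedeploy", region_name="us-west-2")

    response = codedeploy.create_deployment(
        applicationName="webapp",               # hypothetical application
        deploymentGroupName="webapp-staging",   # hypothetical deployment group
        revision={
            "revisionType": "S3",
            "s3Location": {
                "bucket": "webapp-releases",    # hypothetical bucket
                "key": "webapp-1.0.0.zip",
                "bundleType": "zip",
            },
        },
    )
    print("Deployment started:", response["deploymentId"])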

Environment: Linux, Node.js, PHP, AWS, Bitbucket, Terraform, Shell, MySQL.

DevOps Engineer

Confidential, Santa Monica, CA

Responsibilities:

  • Followed Agile methodologies across projects, with two-week sprints and daily stand-up meetings.
  • Used Maven and SBT as build tools on Java- and Scala-based projects.
  • Responsible for build and deployment using Docker and ECS containers.
  • Developed a generic Jenkins job that continuously builds any kind of Maven project via webhook, replacing the existing per-project Maven jobs.
  • Extended the generic process by attaching the Jenkins webhook to all current Java and Scala-based projects in GitHub.
  • Any commit or push to these projects by developers or data engineers triggers the generic job to build the project.
  • Implemented this as two Jenkins jobs: one receives the payload from the webhook and converts it to JSON using a Ruby script; the other is triggered downstream with the parameter values resolved by the first (see the payload sketch after this list).
  • The generic job builds the project, deploys artifacts to JFrog, and copies logs to Amazon S3.
  • Built a Docker image from a Dockerfile that provides the environment the generic job requires.
  • To automate the process inside the job, pulled the Docker image from our registry and ran containers from it; wrote a shell script inside the job for the deploy and S3 copy steps (a container-run sketch also follows this list).
  • The job runs on a slave node (with Docker pre-installed) and a set of executors, hosted on an EC2 instance launched from Jenkins.
  • Builds were slow because of downloading large sets of Maven dependencies from local JFrog repositories; to speed this up, added Amazon EFS (Elastic File System) to cache the dependencies and store the Docker images.
  • Repeated the same process for Scala-based projects, with SBT in place of Maven.
  • Released version 0.1, which received good feedback from the production teams and showed the team that projects could be built inside Docker containers.
  • As the major update for the next release, moved the Jenkins jobs onto ECS (EC2 Container Service) containers with EFS storage as cache, and used ECR (EC2 Container Registry) for image storage.
  • Held discussions about Qlik Sense for visually conveying business solutions.
  • Worked on Okta to integrate Active Directory over LDAP and keep employee details up to date.
  • Created Docker images that included OpenCV, which several projects required.
  • Discussed Terraform templates with the infrastructure engineering team for provisioning AWS resources such as EMR clusters, S3, and ECS.
  • Restarted services such as HDFS, Oozie, Spark, and MapReduce to apply configuration changes inside the EMR clusters.
  • Used JIRA to assign tickets to the team and to update ticket status and stories per the sprint setup.
  • Documented project and sprint progress in Confluence.
  • Interacted with Java developers and data engineers to fix their issues and help them use the required infrastructure effectively.
  • At the end of every sprint, closed the tickets and gave both internal and external demos to various teams.
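
A minimal Python sketch of the webhook job's payload handling described above (the original used a Ruby script; this is an equivalent in Python). The payload fields follow GitHub's push-event schema; the properties file name is hypothetical:

    # Hedged sketch of the webhook job: parse a GitHub push-event payload
    # and resolve the parameters handed to the downstream generic build job.
    import json
    import sys

    def resolve_parameters(payload_text):
        payload = json.loads(payload_text)
        return {
            "REPO_URL": payload["repository"]["clone_url"],  # e.g. https://github.com/org/repo.git
            "BRANCH": payload["ref"].split("/")[-1],         # refs/heads/master -> master
            "COMMIT": payload["after"],                      # SHA that triggered the build
        }

    if __name__ == "__main__":
        params = resolve_parameters(sys.stdin.read())
        # Write a properties file the parameterized downstream job can read
        # (the file name is a hypothetical convention).
        with open("build.properties", "w") as f:
            f.writelines(f"{k}={v}\n" for k, v in params.items())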
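
And a sketch of the container step inside the generic job, using the Docker SDK for Python in place of the original shell script. The registry, image, and mount paths are hypothetical:

    # Hedged sketch of the generic job's container step: pull the build-
    # environment image from the registry and run the Maven build inside it,
    # mounting the workspace and the EFS-backed dependency cache.
    import docker

    client = docker.from_env()
    image = client.images.pull("registry.example.com/build-env", tag="latest")  # hypothetical
    logs = client.containers.run(
        image,
        command="mvn -B clean package",
        volumes={
            "/home/jenkins/workspace/job": {"bind": "/workspace", "mode": "rw"},  # hypothetical
            "/mnt/efs/m2": {"bind": "/root/.m2", "mode": "rw"},  # EFS Maven cache
        },
        working_dir="/workspace",
        remove=True,  # clean up the container after the build
    )
    print(logs.decode())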

Environment: Agile, Linux, RHEL, Unix, Ubuntu, JIRA, Confluence, Slack, AWS, Jenkins, Git, Xcode, Maven, SBT, Groovy, Java, iOS, Scala, vRealize, Blueprint, Docker, Amazon EMR, Terraform, WAF, Ruby, Shell, OpenCV, JFrog, Datadog, Splunk, Hadoop, Kafka, Spark, Oozie, New Relic.

AWS DevOps Engineer

Confidential, Palo Alto, CA

Responsibilities:
  • Responsible for design and maintenance of the Subversion/Git repositories, views, and access control strategies.
  • Used Gradle, Xcode, and Maven as build tools on Java, Android, and Swift projects to produce build artifacts from the source code.
  • Responsible for build and deployment automation using Docker and Kubernetes containers and Chef.
  • Developed Perl and shell scripts on Linux/UNIX for manual deployment of code to various environments.
  • Configured Nagios to monitor EC2 Linux instances with Puppet automation, and deployed SolarWinds for network monitoring and analysis.
  • Managed software configurations using Chef Enterprise.
  • Managed configuration of the web application and deployed it to the AWS cloud through Chef.
  • Implemented a continuous delivery framework using Jenkins, uDeploy, Puppet, Maven, and Nexus in a Linux environment, and scheduled automated jobs using RUNDECK.
  • Performed data analytics on Redshift with business intelligence tools such as QlikView, Looker, and Datameer, and monitored the data pipelines using Apache Kafka and Matillion ETL.
  • Worked with a backend team on various issues related to Qlik Sense and its features.
  • Worked on Chef cookbooks/recipes to automate infrastructure as code.
  • Used Ansible to deploy the necessary changes to remote hosts and monitored the process using Ansible Tower.
  • Used Ansible Tower to stream playbook runs in real time, showing the status of every running job without further reloads.
  • Integrated Maven/Nexus, Jenkins, Git, Confluence, and JIRA.
  • Worked on Azure Resource Manager to create and deploy templates, assign roles, and retrieve activity logs.
  • Created and managed AD tenants, and implemented Azure Container Service (DC/OS, Docker, Kubernetes) and Azure Functions.
  • Implemented AWS solutions using EC2, S3, Redshift, Lambda, RDS, EBS, Elastic Load Balancer, Auto Scaling groups, SNS, optimized volumes, and CloudFormation templates.
  • Worked on Amazon API Gateway, creating endpoints for various applications.
  • Implemented DNS service (Route 53) to coordinate load balancing, failover, and scaling.
  • Configured secure-cloud services (CloudTrail, AWS Config) and networking services (VPC, security groups, VPN, etc.), and created IAM roles.
  • Used AWS developer tools such as CodeCommit, CodePipeline, CodeDeploy, and CodeBuild.
  • Configured S3 versioning and lifecycle policies to back up files and archive them in Glacier (see the boto3 sketch after this list).
  • Implemented Python with Boto3 to access various AWS services from the CLI.
  • Used Netflix Asgard to bridge the gap between web developers and the AWS cloud.
  • Worked on blueprints to automate the capture of content from various environments using the vRCS pack for vRealize.
  • Created CloudWatch alarms and notifications to monitor EC2 hosts.
  • Generated workflows through Apache Airflow and Apache Oozie to schedule the Hadoop jobs that control large data transformations.
  • Implemented Hadoop clusters to process big data pipelines using Amazon EMR and Cloudera, depending on Apache Spark for fast processing and API integration; managed these resources with Apache Mesos.
  • Created and moved data between AWS compute (EC2, Lambda) and storage services (S3, DynamoDB, EMR), as well as on-premises data sources, using Data Pipeline.
  • Configured Azure virtual machines so they can connect to on-premises environments.
  • Created and managed AD tenants and configured applications to integrate with Azure AD (Active Directory).
  • Worked with storage, security, and compute services inside the Azure cloud.
  • Worked on RESTful APIs using Node.js and Spring MVC for communication between applications and systems.
  • Used the JBoss server to manage and administer the various J2EE applications.
  • Maintained Apigee to build applications and protect them against cyber threats, giving better assistance to the developer teams and supporting the microservices architecture.
  • Replaced monolithic applications with several microservices, which helped us maintain scalability, deployability, and resilience.
  • Implemented Apache JMeter and AppDynamics to test the performance of applications, servers, and protocols.
  • Worked with JIRA and Alfresco for ticket management and documentation of our sprints.
  • Worked on WebLogic, managing clusters of WebLogic server instances.
  • Used OEM 12 to manage the Oracle DB effectively with agents and a load balancer in front of OMS.
  • Created SQL and PL/SQL scripts (DML and DDL) in the Oracle and MySQL databases and versioned them in SVN.
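
A minimal boto3 sketch of the S3 versioning and Glacier lifecycle setup mentioned in the list above; the bucket name, rule ID, prefix, and transition age are hypothetical:

    # Hedged sketch: enable S3 versioning and add a lifecycle rule that
    # archives objects to Glacier (bucket name and prefix are hypothetical).
    import boto3

    s3 = boto3.client("s3")
    bucket = "backup-bucket-example"  # hypothetical bucket

    s3.put_bucket_versioning(
        Bucket=bucket,
        VersioningConfiguration={"Status": "Enabled"},
    )

    s3.put_bucket_lifecycle_configuration(
        Bucket=bucket,
        LifecycleConfiguration={
            "Rules": [{
                "ID": "archive-backups",                 # hypothetical rule id
                "Filter": {"Prefix": "backups/"},        # hypothetical prefix
                "Status": "Enabled",
                "Transitions": [{"Days": 30, "StorageClass": "GLACIER"}],
            }],
        },
    )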

Environment: Java/J2EE, iOS, Subversion, Ant, Maven, Xcode, Jenkins, uDeploy, Git, SVN, Chef, Puppet, Ansible, RHEL, CloudWatch, AWS, Azure, Node.js, Asgard, vRealize, Blueprint, Spring MVC, Qlik Sense, Microservices, WAF, Hadoop, Spark, Kafka, Mesos, Oozie, Python, Ruby, Alfresco, Flask, Shell Scripting, MPEG, PuTTY, SSL certs, Confluence, HP QTP, Selenium, JMeter, JBoss, Oracle DB, MySQL and SQL.

DevOps Engineer

Confidential, Mahwah, NJ

Responsibilities:
  • Maintained and administered the Git source control tool.
  • Managed build results in Jenkins and Build Forge and deployed using workflows.
  • Maintained and tracked inventory using Jenkins, setting alerts for when servers fill up and need attention.
  • Modeled the structure for multi-tiered applications by orchestrating the processes to deploy each tier using IBM UrbanCode Deploy.
  • Developed builds using Maven and Gradle in coordination with Spring Boot, with build packages integrated into the Tomcat server automatically.
  • Coordinated with developer teams using Spring Boot to create several RESTful applications and to deploy J2EE applications to production environments.
  • Used Chef's Knife to bootstrap the nodes and managed roles to automate chef-client runs with Ruby recipes.
  • Wrote Ansible playbooks to manage web applications and also used Ansible Tower; coordinated with the Terraform DSL to automate inside the AWS cloud.
  • Orchestrated several CloudFormation-style templates using OpenStack Heat, with block storage support from Cinder.
  • Launched Memcached- and Redis-style services using AWS ElastiCache (see the sketch after this list).
  • Worked on REST APIs to configure changes and maintain index points.
  • Integrated OpenShift to run Docker containers and Kubernetes clusters.
  • Experience writing various data models in coordination with the data analysts.
  • In addition to supporting large-scale web applications, indexed databases on the MySQL server by writing SQL queries; worked with Apache Cassandra and Spark along with Teradata to manage large structured datasets and perform ETL.
  • Worked with Qlik Sense to make interactive visuals for coordinating with various teams.
  • Included the HVR software for data replication over Teradata.
  • Included Mesos and Kafka to manage real-time data streams in the proper environments, depending on ZooKeeper for coordination.
  • Launched Apache Tomcat along with Hibernate to handle incoming user requests to the web applications and their persistence with the RDBMS.
  • Monitored application requests across the IIS server by creating worker processes and isolating them through application pools.
  • Monitored and analyzed data streams using SFDC Wave Analytics integrated with various BI tools.
  • Assisted the data analysts in improving the business logic behind those applications using MemSQL.
  • Worked on creating automated pipelines for code streaming using vRealize.
  • Maintained WebSphere, creating jobs and deploying them to various nodes through the Job Manager, which provides better security than its contemporaries.
  • Implemented RabbitMQ to drive better user interactions with our applications as well as between the microservices.
  • Worked on HTTP APIs and service discovery for various microservices.
  • Maintained Polarion ALM to track requirements, coding, testing, and releases for the applications so that teams could work in a timely manner.
  • Used Alfresco for creating demo pages and documented the projects in JIRA.
  • Assisted the testing environment with rigorous testing using Selenium.
  • Very strong project management experience performing ITIL RM/SCM activities.
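
A hedged boto3 sketch of launching a Redis cache of the kind described in the ElastiCache bullet above; the cluster ID, node type, and security group are hypothetical:

    # Hedged sketch: create a small Redis cluster via AWS ElastiCache
    # (cluster id, node sizing, and security group are hypothetical).
    import boto3

    elasticache = boto3.client("elasticache", region_name="us-east-1")

    response = elasticache.create_cache_cluster(
        CacheClusterId="webapp-sessions",            # hypothetical cluster id
        Engine="redis",
        CacheNodeType="cache.t2.micro",              # hypothetical node size
        NumCacheNodes=1,
        SecurityGroupIds=["sg-0123456789abcdef0"],   # hypothetical security group
    )
    print(response["CacheCluster"]["CacheClusterStatus"])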

Environment: Jenkins, RHEL, AWS, OpenStack, CentOS, Git, Chef, Ansible, Terraform, Maven, Groovy, JIRA, Ruby, Python, Shell/Bash, Java/J2EE, iOS, WebSphere, WebLogic, Microservices, RabbitMQ, SQL scripts, MPEG, Selenium, vRealize, Blueprinting, SFDC Wave, Alfresco, Nagios, Sensu, Apache Cassandra, Qlik Sense, Apache Mesos, HVR, Apache Kafka, MemSQL, Teradata, MySQL, StarTeam, Polarion, DOORS, PostgreSQL, IIS and Apache Tomcat.

Senior Build & Release Engineer

Confidential, Pella, IA

Responsibilities:

  • Managed Confidential infrastructure with automation and configuration management tools such as Puppet and custom-built tooling.
  • Designed cloud-hosted solutions, with specific AWS product-suite experience.
  • Analyzed and resolved conflicts related to merging source code in Git.
  • Installed Jenkins plugins for the Git repository and set up SCM polling for immediate builds with Maven; used Bamboo and uDeploy for continuous integration and deployment.
  • Implemented Spring Boot to build Spring Java applications with less code (dependencies included), helping to provide metrics, health checks, packaging, and configuration.
  • Installed, configured, and managed Puppet master/agent; wrote custom modules and manifests and downloaded pre-written modules from Puppet Forge.
  • Provided execution plans and helped provision resources alongside other tools (Chef, Puppet) using Terraform.
  • Integrated MCollective and Hiera with Puppet to provide comprehensive automation and management of mission-critical enterprise infrastructure.
  • Managed repositories (Nexus, Artifactory) and deployed apps using custom Ruby modules through Puppet.
  • Developed and supported the Red Hat Enterprise Linux based infrastructure in the cloud environment.
  • Worked in a HIPAA-regulated environment and engaged with the domain's future challenges.
  • Ran a research project in automating developer-environment configuration using container-based technologies like Docker, Packer, Vagrant, and AMIs (EC2 images).
  • Launched log analytics through AWS Elasticsearch and visualized the data pipelines using Kibana, Logstash, Lambda, and CloudWatch, going from raw data to actionable insights quickly (see the Lambda sketch after this list).
  • Designed Docker for continuous integration and automatic deployment; sometimes deployed those images using Mesos.
  • Deployed Netflix Eureka to locate fail-over services inside the AWS cloud.
  • Used Go where its compile-time type checking catches a whole class of potential bugs.
  • Implemented RESTful APIs and developed various browser-based applications using Google Web Toolkit (GWT).
  • Deployed and supported the building of Oracle SOA based Java applications using WebLogic.
  • Coordinated with both the IBM DB2 server and IBM Netezza to manage various complex applications and their analytics.
  • Developed Python, shell, and PowerShell scripts for automation purposes.
  • Integrated the MSBuild tool to deploy .NET-based web applications to the IIS server.
  • Implemented multi-tier application provisioning in Amazon cloud services, integrating it with Puppet.
  • Installed and configured the Splunk monitoring tool and used it to monitor network services and host resources.
  • Released code to testing regions or staging areas according to the published schedule using IBM UrbanCode Deploy.
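
A minimal sketch of the Lambda side of a log pipeline like the one above: CloudWatch Logs delivers subscription events as base64-encoded, gzip-compressed JSON, which the function must decode before forwarding. Forwarding to the Elasticsearch endpoint is left out here, since it is endpoint-specific:

    # Hedged sketch of the log-analytics Lambda: decode the standard
    # CloudWatch Logs subscription payload found in event["awslogs"]["data"].
    import base64
    import gzip
    import json

    def handler(event, context):
        payload = base64.b64decode(event["awslogs"]["data"])
        data = json.loads(gzip.decompress(payload))
        for log_event in data["logEvents"]:
            # Each record carries an id, a millisecond timestamp, and the raw message;
            # a real pipeline would post these to the Elasticsearch endpoint.
            print(data["logGroup"], log_event["timestamp"], log_event["message"])
        return {"processed": len(data["logEvents"])}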

Environment: Maven, MSBuild, Git, CVS, Puppet, Chef, Ansible, Terraform, Foreman, Linux/Unix, Java, uDeploy, Spring Boot, AWS, Shell/Bash and PowerShell scripts, IIS, HIPAA, GWT, Splunk, Kibana, Logstash, Python, WebLogic, MySQL, IBM DB2, Netezza, MongoDB, Docker, Packer, SCM, Apache Tomcat, Apache Mesos, Jira, Hudson, Bamboo.

Build and Release Engineer

Confidential, San Diego, CA

Responsibilities:

  • Installed, administered, and monitored Windows and Linux (CentOS, Ubuntu, and Red Hat) servers.
  • Resolved merge issues during build and release by conducting meetings with developers and managers.
  • Used the Actor Model from Akka to raise the abstraction level and provide a better platform for building scalable, resilient, and responsive applications.
  • Configured and maintained Jenkins and Terraform automation jobs.
  • Accessed Azure cloud services through PowerShell scripts to manage various services from the CLI.
  • Monitored the applications and logs using Nagios and Sensu.
  • Coordinated branching, tagging, and release activities with the help of Git and IBM ClearCase.
  • Managed and automated the pools of compute resources using Nova, with storage support from Swift under secure access provided by Keystone.
  • Launched the Trove service to provision relational database engines and performed data processing using Sahara.
  • Used Maven as the build tool on Java projects to produce build artifacts from the source code.
  • Included Go as an alternative for some Java-based applications for faster execution and higher performance.
  • Created and maintained Ruby scripts for building applications.
  • Checked code into the GitHub repository and updated its status.
  • Maintained and tracked inventory using Jenkins, setting alerts for when servers fill up and need attention (see the sketch after this list).
  • Supported large data sets using Redshift and Greenplum, which additionally supported business logic and analytics.
  • Developed deployment scripts using Ant and Maven as build tools in Jenkins to move artifacts from one environment to another.
  • Ran SQL scripts under MemSQL and indexed the databases for analysis in a Mesos environment.
  • Deployed the archives to Tomcat and WebLogic servers.
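
A hedged sketch of the Jenkins inventory/alert idea in the list above, polling the Jenkins REST API for node disk space; the Jenkins URL, credentials, and threshold are hypothetical:

    # Hedged sketch: poll Jenkins' node API and flag agents whose disk-space
    # monitor falls below a threshold (URL, credentials, threshold hypothetical).
    import requests

    JENKINS = "https://jenkins.example.com"   # hypothetical Jenkins URL
    AUTH = ("admin", "api-token")             # hypothetical credentials
    THRESHOLD = 5 * 1024**3                   # 5 GiB, hypothetical threshold

    info = requests.get(f"{JENKINS}/computer/api/json", auth=AUTH).json()
    for node in info["computer"]:
        monitor = (node.get("monitorData") or {}).get(
            "hudson.node_monitors.DiskSpaceMonitor")
        if monitor and monitor["size"] < THRESHOLD:
            # A real setup would page or email instead of printing.
            print(f"ALERT: {node['displayName']} low on disk: {monitor['size']} bytes")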

Environment: Windows, Linux, UNIX, OpenStack, Azure, AWS, Hyper-V, Terraform, Android, WebLogic, C++, Java, Go, GitHub, IBM ClearCase, Maven, Nagios, Sensu, Akka, Jenkins, Greenplum, MemSQL, Capistrano, MySQL, Tomcat, Jira, Shell, PowerShell and Ruby.

Linux Administrator

Confidential

Responsibilities:

  • Installed Red Hat Linux 5/6/7 using Kickstart servers and interactive installation.
  • Supported an infrastructure environment comprising RHEL and Solaris.
  • Worked as a JIRA admin, creating Jira stories and assigning them to the developer and QA teams.
  • Worked with tools such as Logstash and Logcheck to analyze and monitor the logs.
  • Used Python scripts to update content in the database and manipulate files.
  • Administered repetitive jobs using cron scheduling and bash scripts to accomplish tasks on multiple servers (see the sketch after this list).
  • Installed LDAP for user management and configured Crowd for SSO.
  • Configured monitoring tools on Linux and Solaris servers.
  • Used SharePoint integrated with Microsoft Office to manage file systems.
  • Troubleshot and resolved user issues in JIRA and Confluence.
  • Integrated Bamboo with JIRA and created plans; failed builds create a ticket.
  • Created AWS launch configurations based on customized AMIs and used them to configure Auto Scaling groups.
  • Managed the private cloud environment using Puppet modules (Ruby-based).
  • Worked with different Active Directory databases such as Microsoft AD and Tivoli Directory Server with LDAP.
  • Made DNS entries to establish connections from servers to the DB2 database.
  • Performed patching and backups on multiple environments of Solaris, Linux, and VMware.
  • Installed and configured the Apache/Tomcat web server and the WebLogic application server.
  • Developed entire front-end and back-end modules using Python on the Django web framework.
  • Created and modified users and groups with sudo permissions.
  • Used JIRA for ticket tracking, change management, and as an Agile/Scrum tool.
  • Managed TCP/IP packets and DHCP servers.
  • Installed third-party tools using packages.
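
A small sketch of the cron-driven multi-server routine described above, as it might be invoked from a scheduler entry such as "0 2 * * * /usr/local/bin/nightly.py"; the host names and the remote command are hypothetical:

    # Hedged sketch: a cron-scheduled script that runs the same maintenance
    # task over SSH on multiple servers (inventory and command hypothetical).
    import subprocess

    HOSTS = ["web01", "web02", "db01"]                  # hypothetical inventory
    COMMAND = "sudo logrotate -f /etc/logrotate.conf"   # hypothetical task

    for host in HOSTS:
        result = subprocess.run(
            ["ssh", "-o", "BatchMode=yes", host, COMMAND],
            capture_output=True, text=True,
        )
        status = "ok" if result.returncode == 0 else f"failed: {result.stderr.strip()}"
        print(f"{host}: {status}")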

Environment: RHEL, JIRA, DHCP, LVM, LDAP, Vagrant, AD, DNS, Networking, Ubuntu, CentOS, SCM, Bash/Shell, Ruby, Python
