
DevOps Engineer Resume

Alpharetta, GA


  • 9 years of professional experience in DevOps, AWS cloud computing, and build and release engineering, automating the building, deploying, and releasing of code from one environment to another.
  • Experience working on continuous integration and delivery platforms as a DevOps Engineer.
  • In-depth knowledge of DevOps management methodologies and production deployment, including compiling, packaging, deploying, and application configuration.
  • Experience in AWS cloud computing services such as EC2, S3, Lambda, API Gateway, DynamoDB, EBS, VPC, ELB, Route 53, CloudWatch, security groups, CloudTrail, IAM, CloudFront, Snowball, EMR, RDS, and Glacier.
  • Experience on Amazon Web Services deploying with CodeCommit and CodeDeploy to EC2 instances of various flavors, such as Amazon Linux AMI, Red Hat Enterprise Linux, SUSE Linux, Ubuntu Server, Microsoft Windows Server 2012, and many more.
  • Experience creating user/group accounts and attaching policies to them using the AWS IAM service.
  • Work experience on multiple AWS instances, creating Elastic Load Balancers and Auto Scaling groups to design cost-effective, fault-tolerant, and highly available systems.
  • Defined AWS security groups, which acted as virtual firewalls controlling the traffic allowed to reach one or more AWS EC2 instances.
  • Set up databases in AWS using RDS, storage using S3 buckets, and configured instance backups to S3.
  • Experience creating snapshots of EBS volumes and storing them in Amazon S3.
  • Experience in migrating databases to Amazon Web Services (AWS) using AWS DMS service.
  • Knowledge of network protocols such as TCP/IP, UDP, FTP, SMTP, POP, NIS, NFS, SSH, and SFTP. Expertise in working with application-layer protocols such as HTTP, DHCP, and DNS, as well as SSL/TLS.
  • Expertise in shell and Python scripting with a focus on DevOps tools, CI/CD, and AWS and Azure cloud architecture, with hands-on engineering.
  • Deployed and maintained Chef role-based application servers, including Apache, Resin, Nginx and Tomcat.
  • Experience in using Tomcat and Apache web servers for deployment and for hosting tools.
  • Experience with the build management tools Ant and Maven, writing build.xml and pom.xml files.
  • Worked on Build & Release activities for technologies like Java, .Net, Oracle & ETL.
  • Installed/Configured/Managed Puppet Master/Agent. Wrote custom Modules and Manifests, downloaded pre-written modules from puppet-forge.
  • Used Puppet Enterprise to manage application configurations, utilizing Bash scripts and RightScale for initial server provisioning and Puppet to deploy and update applications, including Apache, Tomcat, MySQL, and other proprietary applications.
  • Installed, configured, and administered CI tools such as Hudson, Jenkins, Bamboo, and TFS for automated builds. Experience creating and managing user accounts, security, rights, disk space, and process monitoring in Solaris, Ubuntu, CentOS, and Red Hat Linux.
  • Worked on Jenkins and Bamboo to deploy code to staging and production environments and managed artifacts generated by Jenkins in the Nexus repository.
  • Ability to build deployments, build scripts, and automated solutions using Ruby, Python, and Shell.
  • Solid understanding of operating systems such as Linux, UNIX, and Windows.
  • Experienced in deploying through web application servers such as WebSphere, WebLogic, JBoss, and Apache Tomcat.
  • Ability to write scripts in Bash/Shell, Perl, Ruby, and Python to automate cron jobs and system maintenance. Deployed code and data to various Demandware sandbox instances as a daily process.
  • Experience with source code management tools such as Git, SVN, and TFS.
  • Experience in system monitoring with Nagios. Good knowledge of Docker and OpenStack.
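The IAM policy work described above can be illustrated with a minimal sketch of an identity-based policy granting read-only access to a single S3 bucket; the bucket name here is hypothetical, not taken from any actual project:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "ReadOnlyExampleBucket",
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::example-backup-bucket",
        "arn:aws:s3:::example-backup-bucket/*"
      ]
    }
  ]
}
```

A policy document like this would typically be attached to an IAM group so that every user in the group inherits the same permissions.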


Cloud Platforms: AWS, Azure, IaaS

Configuration Management: Chef, Puppet, Ansible, Docker.

Database: Oracle, MS SQL

Build Tools: Ant, Maven, Jenkins, MSBuild, NAnt

Version Control Tools: Subversion (SVN), Git, CVS, Perforce, GitHub, CodeCommit.

Web Servers: Apache, Tomcat, JBoss, WebSphere, WebLogic.

Languages/Scripts: Java, C#, Shell Script, Ruby, Python, UNIX/Linux, PowerShell.

Continuous Integration Tools: Jenkins, Hudson, Bamboo, TFS

Web Technologies: HTML, CSS, JavaScript, Bootstrap, XML, JSON.

Operating Systems: Red Hat Linux, UNIX, Ubuntu, CentOS, Solaris, SUSE, Windows 98/XP/NT/2000/2003/2008

Network Protocols: TCP/IP, DHCP, VPN, FTP, SSH, SFTP.


Confidential, Alpharetta,GA

DevOps Engineer


  • Followed Agile methodologies and implemented them on various projects by setting up Sprint for every two weeks and daily stand-up meetings.
  • Used Maven and SBT as build tools on Java- and Scala-based projects for the development of build artifacts.
  • Responsible for build and deployment using Docker and ECS containers.
  • Developed a generic Jenkins job for continuously building all kinds of Maven projects through a webhook, replacing the existing individual Maven jobs.
  • Extended the generic process by attaching the Jenkins job webhook to all the current Java and Scala-based projects in GitHub.
  • Any commit or push made to the current projects by developers and data engineers triggers the generic build job.
  • To make this work smoothly, I created two Jenkins jobs: one receives the payload from the webhook and converts it to JSON using a Ruby script; the other runs downstream, based on the parameter values resolved by the first job.
  • Inside the generic job, I included building, deploying artifacts to JFrog, and copying logs to Amazon S3.
  • Built a Docker image by writing a Dockerfile that provides the environment required by the generic job.
  • To automate the process, the job pulls the Docker image from our Docker registry and runs containers on that image. For the deploy and S3 copy steps, I wrote a shell script inside the job.
  • The job runs on a slave node (with Docker pre-installed) with a set of executors underneath; this node runs on an EC2 instance launched from Jenkins.
  • The process took considerable time because large sets of Maven dependencies were downloaded from local JFrog repositories. To speed it up, I added Amazon EFS (Elastic File System) for caching the dependencies and storing the Docker images.
  • The same process was repeated for Scala-based projects, with Maven replaced by the SBT tool.
  • After this, our first version, 0.1, was released and received good feedback from the production teams. This approach showed our team that the projects could be built inside Docker containers.
  • As a major update for the next release, I decided to run the Jenkins jobs on ECS (EC2 Container Service) containers with EFS storage (cache support). Finally, ECR (EC2 Container Registry) was used for image storage.
  • Had discussions about Qlik Sense for visually conveying business solutions.
  • Worked on Okta to orchestrate Active Directory via LDAP, keeping employee details up to date.
  • Worked on creating the docker images that included the OpenCV, which was quite essential for some projects.
  • Discussed with the team of Infrastructure Engineers, regarding Terraform templates in provisioning the AWS resources such as EMR clusters, S3, ECS etc.
  • Restarted various services such as HDFS, Oozie, Spark, and MapReduce to apply the necessary configuration changes inside the EMR clusters.
  • Used JIRA to assign tickets to our team and updated ticket status and stories per the sprint setup.
  • Used Confluence to document the progress of projects and sprints.
  • Interacted with Java developers and data engineers to fix their issues and helped them use the required infrastructure effectively.
  • At the end of every sprint, I closed the tickets and then performed both internal and external demos for various teams.
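The webhook-to-parameters step described above can be sketched as follows. This is a minimal illustration assuming a GitHub-style push payload (the field names follow GitHub's push-event format; the repository name and values are hypothetical), written in Python rather than the Ruby used on the project:

```python
import json

def extract_build_params(payload: str) -> dict:
    """Parse a GitHub-style push webhook payload and resolve the
    parameters the downstream generic build job needs."""
    event = json.loads(payload)
    return {
        "repo": event["repository"]["name"],
        "clone_url": event["repository"]["clone_url"],
        # "refs/heads/main" -> "main"
        "branch": event["ref"].rsplit("/", 1)[-1],
        "commit": event["after"],
    }

# Example payload, trimmed to the fields used above (values are illustrative)
payload = json.dumps({
    "ref": "refs/heads/main",
    "after": "abc123",
    "repository": {"name": "demo-service",
                   "clone_url": "https://github.com/org/demo-service.git"},
})
params = extract_build_params(payload)
```

The first Jenkins job would run something like this and pass `params` to the downstream generic job as build parameters.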

Environment: Agile, Linux, RHEL, Unix, Ubuntu, JIRA, Confluence, Slack, AWS, Jenkins, Git, Xcode, Maven, SBT, Groovy, Java, iOS, Scala, vRealize, Blueprint, Docker, Amazon EMR, Terraform, WAF, Ruby, Shell, OpenCV, JFrog, Datadog, Splunk, Hadoop, Kafka, Spark, Oozie, New Relic.

Confidential, Miami, FL

AWS DevOps Engineer


  • Responsible for design and maintenance of the Subversion/GIT Repositories, views, and the access control strategies.
  • Used Gradle, Xcode, and Maven as build tools on Java-, Android-, and Swift-based projects for the development of build artifacts from the source code.
  • Responsible for build and deployment automation using Docker and Kubernetes containers and Chef.
  • Developed Linux, UNIX, Perl and Shell Scripts for manual deployment of the code to various environments.
  • Configured Nagios to monitor EC2 Linux instances with Puppet automation, and deployed SolarWinds for network monitoring and analysis.
  • Managed the software configurations using Chef Enterprise.
  • Manage configuration of Web application and Deploy to AWS cloud server through Chef.
  • Implemented a Continuous Delivery framework using Jenkins, Puppet, Maven, and Nexus in a Linux environment, and scheduled various automated jobs using Rundeck.
  • Performed data analytics using Redshift with various business intelligence tools such as QlikView, Looker, and Datameer. We monitored the data pipelines using Apache Kafka and Matillion ETL.
  • Worked with a backend team on various issues related to Qlik Sense and its innovative features.
  • Worked on Chef cookbooks/recipes to automate infrastructure as code.
  • Used Ansible to deploy the necessary changes to remote hosts and monitored the process using Ansible Tower.
  • Used Ansible Tower to stream running playbooks in real time, showing the status of every running job without further reloads.
  • Integrated Maven/Nexus, Jenkins, Git, Confluence, and JIRA.
  • Worked on Azure Resource Manager for creating and deploying templates, assigning roles, and getting activity logs.
  • Created and managed AD tenants, and effectively implemented Azure Container Service (DC/OS, Docker, Kubernetes) and Azure Functions.
  • Implemented AWS solutions using EC2, S3, Redshift, Lambda, RDS, EBS, Elastic Load Balancer, Auto scaling groups, SNS, Optimized volumes and Cloud Formation templates.
  • Worked on Amazon API gateway in creating endpoints for various applications.
  • Implemented DNS service (Route 53) to effectively coordinate load balancing, fail-over, and scaling.
  • Understanding of secure cloud configuration (CloudTrail, AWS Config) and networking services (VPC, security groups, VPN, etc.), and created IAM roles.
  • Experienced with AWS developer tools such as CodeCommit, CodePipeline, CodeDeploy, and CodeBuild.
  • Configured S3 versioning and lifecycle policies to back up files and archive them in Glacier.
  • Implemented Python Boto3 to access various AWS services from the CLI.
  • Used Netflix Asgard to bridge the gap between web developers and the AWS cloud.
  • Worked on Blueprints for automating the capture of content from various environments using vRCS pack of vRealize.
  • Monitored and created alarms and notifications for EC2 hosts using Cloud Watch.
  • Generated workflows through Apache Airflow and Apache Oozie for scheduling the Hadoop jobs that control large data transformations.
  • Implemented Hadoop clusters for processing big data pipelines using Amazon EMR and Cloudera, relying on Apache Spark for fast processing and API integration. In the end, we managed these resources using Apache Mesos.
  • Created and moved data between AWS compute (EC2, Lambda) and storage services (S3, DynamoDB, EMR), as well as on-premises data sources, using Data Pipeline.
  • Experience with Azure virtual machines, enabling them to connect to on-premises environments.
  • Created and managed AD tenants, and configured applications to integrate with Azure AD (Active Directory).
  • Experience working with storage, security, and compute services inside the Azure cloud.
  • Worked on RESTful APIs using Node.js and Spring MVC for communicating between applications or systems.
  • Used the JBoss server for managing and administering various J2EE applications.
  • Maintained Apigee for building applications and providing security against cyber threats, giving better assistance to the developer teams and supporting microservices.
  • To overcome the limitations of monolithic applications, we launched several microservices, which helped us with scaling, deployment, and resilience.
  • Implemented Apache JMeter and AppDynamics for testing the performance of applications, servers, and protocols.
  • Worked with JIRA and Alfresco for ticket management and documentation for our sprints.
  • Worked on WebLogic for managing a cluster of WebLogic Server instances.
  • Used OEM 12c to effectively manage the Oracle DB using agents and load balancing with OMS.
  • Created SQL and PL/SQL scripts (DML and DDL) in Oracle and MySQL databases and versioned them in SVN.
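The S3 lifecycle policies mentioned above can be sketched as a minimal lifecycle configuration (the rule ID and the `backups/` prefix are illustrative) that transitions objects to Glacier after 30 days and expires them after a year:

```json
{
  "Rules": [
    {
      "ID": "archive-backups",
      "Filter": { "Prefix": "backups/" },
      "Status": "Enabled",
      "Transitions": [
        { "Days": 30, "StorageClass": "GLACIER" }
      ],
      "Expiration": { "Days": 365 }
    }
  ]
}
```

A document in this shape can be applied with `aws s3api put-bucket-lifecycle-configuration`, pairing naturally with the S3 versioning described above.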

Environment: Java/J2EE, iOS, Subversion, Ant, Maven, Xcode, Jenkins, Git, SVN, Chef, Puppet, Ansible, RHEL, CloudWatch, AWS, Azure, Node.js, Asgard, vRealize, Blueprint, Spring MVC, Qlik Sense, Microservices, WAF, Hadoop, Spark, Kafka, Mesos, Oozie, Python, Ruby, Alfresco, Flask, Shell Scripting, MPEG, PuTTY, SSL certs, Confluence, HP QTP, Selenium, JMeter, JBoss, Oracle DB, MySQL, and SQL.

Confidential, Denver, Colorado

AWS DevOps Engineer


  • Maintained and Administered GIT Source Code Tool.
  • Managed Build results in Jenkins and Build Forge and deployed using workflows.
  • Maintained and tracked inventory using Jenkins and set alerts for when servers are full and need attention.
  • Modeled the structure for Multi-tiered applications by orchestrating the processes to deploy each tier using IBM UrbanCode Deploy.
  • Developed builds using Maven and Gradle in coordination with Spring Boot, with the build packages integrated with the Tomcat server automatically.
  • While coordinating with developer teams, Spring Boot helped us create several RESTful applications and deploy J2EE applications to production environments.
  • Used Knife from Chef to bootstrap the nodes and manage roles, automating chef-client runs with Ruby recipes.
  • Wrote Ansible playbooks to manage web applications and also used Ansible Tower. We coordinated with the Terraform DSL for automation inside the AWS cloud.
  • Orchestrated several CloudFormation-style templates using OpenStack Heat and got block storage support from Cinder.
  • Launched Memcached and Redis services using AWS ElastiCache.
  • Worked on REST APIs to configure changes and maintain index points.
  • Integrated Openshift to run Docker containers and Kubernetes clusters.
  • Experience in writing various data models in coordination with the Data Analysts.
  • In addition to supporting large-scale web applications, we indexed database queries on MySQL Server by writing SQL queries. We worked on Apache Cassandra and Spark along with Teradata for managing large sets of structured data, which also performed ETL.
  • Worked with Qlik Sense to make interactive visuals for coordinating with various teams.
  • For data replication over Teradata, we included the HVR software.
  • Included Mesos and Kafka for managing real-time data streams in proper environments, depending on ZooKeeper for coordination.
  • Launched Apache Tomcat along with Hibernate to control incoming user requests to web applications and their persistence with the RDBMS.
  • Monitored application requests across the IIS server by creating worker processes and containerizing them through an application pool.
  • Monitored and analyzed the data streams using SFDC Wave Analytics integrated with Various BI Tools.
  • Assisted the Data Analysts in improving the Business logic using MemSQL behind those applications.
  • Worked on creating automated pipelines for code streaming using vRealize.
  • Maintained WebSphere for creating jobs and deploying them to various nodes through the Job Manager; it provides better security than its contemporaries.
  • Implemented RabbitMQ to drive better interactions between users and our applications, as well as between the microservices.
  • Worked on HTTP APIs and Service Discovery relevant to various Microservices.
  • Maintained Polarion ALM for tracking requirements, coding, testing, and releases so that the teams could work in a timely manner.
  • Used Alfresco for creating demo pages and documenting the projects in JIRA.
  • Provided assistance to the Testing environment for rigorous testing using Selenium.
  • Very strong project management experience performing ITIL RM/SCM activities.
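The Ansible playbooks mentioned above can be sketched minimally as follows; the host group, package, and file names are illustrative, not taken from any actual project:

```yaml
---
- name: Deploy and restart the web tier
  hosts: webservers
  become: true
  tasks:
    - name: Ensure Apache is installed
      yum:
        name: httpd
        state: present

    - name: Push the application config
      template:
        src: app.conf.j2
        dest: /etc/httpd/conf.d/app.conf
      notify: restart apache

  handlers:
    - name: restart apache
      service:
        name: httpd
        state: restarted
```

Running the same playbook from Ansible Tower adds the real-time job status view described above.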

Environment: Jenkins, RHEL, AWS, OpenStack, CentOS, Git, Chef, Ansible, Terraform, Maven, Groovy, JIRA, Ruby, Python, Shell/Bash, Java/J2EE, iOS, WebSphere, WebLogic, Microservices, RabbitMQ, SQL scripts, MPEG, Selenium, vRealize, Blueprinting, SFDC Wave, Alfresco, Nagios, Sensu, Apache Cassandra, Qlik Sense, Apache Mesos, HVR, Apache Kafka, MemSQL, Teradata, MySQL, StarTeam, Polarion, DOORS, PostgreSQL, IIS, and Apache Tomcat.

Confidential, Madison, WI

Build/Release Engineer


  • Monitored software, hardware, and middleware updates utilizing technologies such as Jenkins/Hudson, Ant, MSBuild, TFS Team Explorer, and Subversion (SVN).
  • Integrated the Eclipse IDE with different versioning tools such as ClearCase, Subversion (SVN), CVS, and Git. Managed the SCM tools Subversion (SVN) and Git, including installation, configuration, and maintenance.
  • Created TFS work items for bugs and tasks for the test cases, and pulled reports and sent them to the project management and QA teams.
  • Managed all environment- and application-level configurations using Puppet, Git, and Hiera.
  • Performed installation, configuration, and administration of ClearCase and Subversion (SVN), and afterwards migrated source, config, and website code over to Git in Windows and Linux environments.
  • Evaluated build automation tools (OpenMake and AnthillPro) and recommended AnthillPro.
  • Built Java applications using Ant and Maven and deployed Java/J2EE applications through Tomcat application servers.
  • Installed Jenkins in standalone mode as a Windows service and integrated it with different tools for a smooth continuous development and release process.
  • Worked on Jenkins and AnthillPro, creating and scheduling jobs, builds, and deployments.
  • Specialized in automating tasks and processes through scripts using Ant/Maven/Make and Shell/Perl.
  • Provided configuration services on multiple platforms in the test environment running on one or more IT platforms: Maven, client/server, Jenkins, MSBuild, Microsoft Windows NT, OS/390, and UNIX. Wrote automation scripts in Perl to generate HTML files for different clients, saving several hundred man-hours in administration-related tasks.
  • Created user mailboxes, managed password policy, and provisioned users using Windows PowerShell.
  • Assisted with maintenance of SharePoint environments through PowerShell, including writing PowerShell scripts for SharePoint solution deployment and pre- and post-deployment configuration.
  • Documented and published complete migration process of Subversion (SVN admin dumps) to UCM Clear Case (VOBS).
  • Converted and automated builds using Ant and Maven. Scheduled automated nightly builds using Hudson and maintained the continuous integration effort with Hudson alongside the scheduled builds.
  • Integrated Build dependencies and dependency blocking strategy in Bamboo.
  • Troubleshot problems related to authentication, authorization, logins, end users, the web server, and the WebSphere server.
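The Ant build scripts mentioned above can be sketched minimally as follows; the project name and directory layout are illustrative:

```xml
<project name="demo-app" default="package" basedir=".">
  <property name="src.dir" value="src"/>
  <property name="build.dir" value="build"/>

  <!-- Compile all Java sources into build/classes -->
  <target name="compile">
    <mkdir dir="${build.dir}/classes"/>
    <javac srcdir="${src.dir}" destdir="${build.dir}/classes"
           includeantruntime="false"/>
  </target>

  <!-- Package the compiled classes into a JAR -->
  <target name="package" depends="compile">
    <jar destfile="${build.dir}/demo-app.jar" basedir="${build.dir}/classes"/>
  </target>
</project>
```

A build.xml in this shape is what Hudson or AnthillPro would invoke for the scheduled nightly builds described above.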

Environment: SVN (Subversion), AnthillPro, Ant, NAnt, Maven, TFS, Perl, MSBuild, Perforce, Unix, Linux, Bash, Python, PHP, Bamboo, Hudson, Git, JIRA, Shell Script, WebSphere server, WebLogic, Tomcat, Jenkins, SharePoint.

Confidential, Maynard, MA

Build /Release Engineer


  • Developed and supported the software Release Management and procedures.
  • Responsible for design and maintenance of the GIT Repositories and the access control strategies.
  • Performed all necessary day-to-day GIT support for different projects.
  • Implemented & Maintained the Branching and Build/Release strategies utilizing GIT source code management.
  • Worked on Administration, maintenance and support of Red Hat Enterprise Linux (RHEL) Servers.
  • Used Ant and Maven as build tools on Java projects for the development of build artifacts from the source code.
  • Managed deployment automation using Puppet, MCollective, custom Puppet modules, and Ruby. Automated the build and release management process, including monitoring changes between releases.
  • Delivered architecture designs and solutions for public, private and hybrid clouds covering the cloud architecture tiers and portfolios of cloud services.
  • Worked with Custom AMI's, created AMI tags and modified AMI permissions.
  • DNS and load balancing experience on Route53.
  • Configured Elastic Load Balancers with EC2 auto scaling groups.
  • Integrated the automated build with the deployment pipeline. Installed the Chef server and clients to pick up builds from the Jenkins repository and deploy them to target environments (integration, QA, and production).
  • Implementing a Continuous Delivery Framework using Jenkins, Puppet, Maven & Nexus in Linux Environment.
  • Configured Docker containers and created Dockerfiles for different environments.
  • Migrated different projects from TFS to GIT and SVN to GIT.
  • Led and assisted with the scoping, sequencing, planning, and creation of Git environments.
  • Developed procedures to unify, streamline and automate application development and deployment procedures with Linux Container technology using Docker.
  • Involved in Implementing Workflows, components, screens and Notification schemes in Jira projects setup.
  • Deployed the java applications into Apache Tomcat Application Servers.
  • Experience in writing Maven pom.xml and Ant build.xml for build scripts.
  • Executed user administration and maintenance tasks including creating users and groups.
  • Utilized WAR and JAR files for deployment of enterprise apps.
  • Provided assistance for management of AWS storage infrastructure systems.
  • Managed Nexus Maven repositories to download the artifacts during the build.
  • Created and maintained the shell deployment scripts for WebLogic web application servers.
  • Built Python apps that allowed developers to build proprietary solutions without requiring standard components.
  • Worked as a system administrator for the build and deployments process on the enterprise servers.
  • Developed, Maintained, and Distributed release notes for each scheduled release.
  • Skilled in writing, debugging, and maintaining shell scripts.
  • Build artifacts were deployed to Tomcat instances, which were integrated using shell scripts.
  • Involved in periodic archiving and storage of the source code for disaster recovery.
  • Prepared JUnit test cases and executed server configurations.
  • Supported and developed tools for integration, Automated Testing and Release Management.
  • Responsible for User Management, Administration, Group Management, Slave Management, new job setup in Jenkins.
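The per-environment Dockerfiles mentioned above can be sketched minimally as follows; the base image, build argument, and WAR name are illustrative:

```dockerfile
FROM tomcat:8-jre8
# Environment-specific settings are injected at build time
ARG ENVIRONMENT=qa
ENV APP_ENV=${ENVIRONMENT}

# Drop the WAR built by Maven into Tomcat's webapps directory
COPY target/demo-app.war /usr/local/tomcat/webapps/ROOT.war

EXPOSE 8080
CMD ["catalina.sh", "run"]
```

Building with `docker build --build-arg ENVIRONMENT=prod .` yields the per-environment images the CI jobs would deploy.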

Environment: Red Hat Enterprise Linux 5.4, GIT, ANT, Jenkins, Maven, Apache Tomcat, Shell, Puppet, Nexus, AWS, EC2, Jira, Python, Docker.


Linux Admin


  • Managed Amazon Web Services (AWS) infrastructure with automation and configuration management tools such as Puppet or custom-built tooling.
  • Designing cloud-hosted solutions, specific AWS product suite experience.
  • Analyze and resolve conflicts related to merging of source code for GIT.
  • Installed Jenkins plugins for the Git repository and set up SCM polling for immediate builds with Maven. We used Bamboo for continuous integration and deployment.
  • Implemented Spring Boot to build Spring Java applications with less code (dependencies included), which helps in providing metrics, health checks, packaging, and configuration.
  • Installed/Configured/Managed Puppet Master/Agent. Wrote custom Modules and Manifests, downloaded pre-written modules from puppet-forge.
  • Provided execution plans and helped provision resources driven by other tools (Chef, Puppet) using Terraform.
  • Integrated MCollective, Hiera with Puppet to provide comprehensive automation and management of mission-critical enterprise infrastructures.
  • Managed artifact repositories (Nexus, Artifactory) and deployed apps using custom Ruby modules through Puppet.
  • Developed and supported the Red Hat Enterprise Linux based infrastructure in the cloud environment.
  • Worked in a HIPAA-regulated environment and became interested in the domain's future challenges.
  • Research project in automating developer environment configuration using container-based technologies like Docker, Packer, Vagrant, AMI (EC2 images) etc.
  • Launched log analytics through AWS Elasticsearch and visualized the data pipelines using Kibana, Logstash, Lambda, and CloudWatch, going from raw data to actionable insights quickly.
  • Designed Docker images for continuous integration and automatic deployment; we sometimes deployed those images using Mesos.
  • Deployed Netflix Eureka for locating fail-over services inside the AWS cloud.
  • Used Go for its compile-time type checking, which catches a whole class of potential bugs.
  • Implemented RESTful APIs and developed various browser based applications using Google Web Toolkit(GWT)
  • Deployed and supported the building of Oracle SOA based Java Applications using WebLogic.
  • Coordinated with both the IBM DB2 server and IBM Netezza for managing various complex applications and their analytics.
  • Developed Python, shell, and PowerShell scripts for automation purposes.
  • Integrated the MSBuild tool for deploying .NET-based web applications to the IIS server.
  • Implemented multi-tier application provisioning in Amazon cloud services, integrating it with Puppet.
  • Installed and configured Splunk monitoring tool, while using it for monitoring network services and host resources.
  • Released code to testing regions or staging areas according to the published schedule using IBM UrbanCode Deploy.
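The Terraform provisioning described above can be sketched minimally as follows; the region, AMI ID, instance type, tags, and bootstrap script name are all illustrative:

```hcl
provider "aws" {
  region = "us-east-1"
}

resource "aws_instance" "web" {
  ami           = "ami-0123456789abcdef0" # illustrative AMI ID
  instance_type = "t2.micro"

  tags = {
    Name = "puppet-managed-web"
  }

  # Hand the node off to Puppet after boot (hypothetical bootstrap script)
  user_data = file("bootstrap-puppet-agent.sh")
}
```

Running `terraform plan` produces the execution plan, and `terraform apply` provisions the instance, after which Puppet takes over configuration management.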

Environment: Maven, MSBuild, Git, CVS, Puppet, Chef, Ansible, Terraform, Foreman, Linux/Unix, Java, Spring Boot, AWS, Shell/Bash and PowerShell scripts, IIS, HIPAA, GWT, Splunk, Kibana, Logstash, Python, WebLogic, MySQL, IBM DB2, Netezza, MongoDB, Docker, Packer, SCM, Apache Tomcat, Apache Mesos, Jira, Hudson, Bamboo.
