
DevOps Engineer Resume


Cleveland, OH

PROFESSIONAL SUMMARY:

  • 8 years of IT industry experience with a focus on Cloud & DevOps tools and technologies: Continuous Integration, Continuous Delivery, and Continuous Deployment (CI/CD pipelines), Configuration Management, Version Control, monitoring tools, Build and Release Management, and Linux/Windows system administration and automation.
  • Responsible for configuring shared access signature (SAS) tokens and storage access policies, and for implementing Azure cloud infrastructure.
  • Worked on Azure development services such as Azure Web Apps, Azure Storage, Azure SQL DB, App Services, Azure Virtual Machines, Azure DNS, Azure VPN Gateway, Azure AD, Azure Search, and Notification Hubs.
  • Experience in Azure platform development, deployment concepts, hosted cloud services, and platform services, with close interfacing with Windows Azure Multi-Factor Authentication and a continually changing architecture to shift software offerings to a distributed, service-based environment.
  • Extensive working experience with AWS cloud services such as EC2, ELB, Auto Scaling, IAM, SNS, SQS, DynamoDB, VPC, Route 53, RDS, S3, Elasticsearch, Elastic File System (EFS), CloudWatch, CloudTrail, Cloud Security, Lambda, Service Catalog, Kinesis, and Redshift.
  • Experience using AWS Lambda and Elastic Beanstalk for automated application deployment (see the sketch after this list).
  • Experience in GCP services such as Compute Engine, Cloud Load Balancing, Cloud Storage, Cloud SQL, Cloud Monitoring, Cloud Deployment Manager, and Google Kubernetes Engine.
  • Experience in managing IAM policies with active directory integration to manage security in GCP.
  • Used Terraform, an infrastructure-as-code tool, to configure AWS virtual private clouds and resources.
  • Experience designing Terraform configurations and deploying them through Google Cloud Platform's Cloud Deployment Manager to spin up resources such as cloud virtual networks, Compute Engine instances in public and private subnets, and autoscalers.
  • Experience with air-gapped installation of Terraform along with configuration of Forseti and Cloud Armor.
  • Good understanding of the Pivotal Cloud Foundry (Diego) architecture, PCF components and their functions, and the PCF CLI for application deployment and other CF management tasks.
  • Expertise in using OpenStack REST APIs in scripted playbooks to create networks, routers, and VMs; deployed OpenStack services in Linux containers.
  • Configured cloud compute systems using OpenStack on Ubuntu, including orchestration with Keystone and Kubernetes within OpenStack.
  • Production experience in large environments using configuration management tools Chef, Ansible and Puppet.
  • Experience with Ansible playbooks, vault, and Tower as a configuration management tool to automate repetitive activities, deploy apps, manage changes, and automate software updates and functionality verification.
  • Experience with on-premises Chef Enterprise server and workstation setup; bootstrapped nodes using Knife and automated testing of Chef recipes/cookbooks with Test Kitchen.
  • Worked with the Puppet configuration management tool, including installing the Puppet master and agents, writing manifests from scratch, and pushing them to agents for CI/CD.
  • Experience integrating Jenkins with tools such as Maven (build), Git (repository), SonarQube (code verification), and Nexus (artifact repository); implemented CI/CD automation, created Jenkins pipelines programmatically, and architected Jenkins clusters.
  • Responsible for user management, plug-in management, end-to-end automation of build and deploy, and troubleshooting build issues during the build process using Jenkins.
  • Build management, setting up CI/CD pipelines, and deploying using Azure DevOps (both classic and YAML pipelines).
  • Migrated builds from TeamCity and Octopus Deploy into Azure DevOps by creating new pipelines in conjunction with Microsoft Azure.
  • Experienced in configuring and managing Docker containers and images for web and application servers such as Apache and Tomcat, integrated with Amazon RDS databases.
  • Worked on installing and implementing VMware ESX Server, VMware vSphere, VMware Virtual Center and other products for Virtualization.
  • Experience in scheduling, deploying, and managing container replicas onto a node cluster using Kubernetes; worked on building the Kubernetes runtime environment of the CI/CD system to build, test, and deploy on an open-source platform.
  • Expertise in MAVEN, ANT and Gradle code build tools.
  • Experience with Splunk architecture and various components including Search Heads, Indexers, Deployment server, Deployer, License Master, Heavy/Universal Forwarders.
  • Created and maintained ELK (Elasticsearch, Logstash, and Kibana) clusters for enterprise logging; evaluated system logs with the ELK stack to assess infrastructure needs for each application and deploy it on the Azure platform.
  • Automated the process using Shell, Python, Ruby, PowerShell, JSON, YAML, and Groovy scripting languages.
  • Enhanced and deployed Microservices based applications using Spring Boot and Spring Cloud and created dynamic documentation for RESTFUL web service using Swagger.
  • Experience in creating and using schemas for SQL and NoSQL databases: MySQL, MongoDB, and DynamoDB in AWS.
  • Designed Network Security Groups (NSGs) to control inbound and outbound traffic access to network interfaces (NICs), VMs, and subnets.
  • Experience with security and scanning tools such as HP Fortify, SonarQube, and Black Duck.
  • Proficient in managing source code control for multiple development efforts using Subversion (SVN), TFS (for Windows environments), GitLab, Bitbucket, and Perforce.
  • Experienced in branching, merging, tagging, and maintaining versions across environments using SCM tools such as Git and Subversion (SVN) on Linux platforms.
  • Hands-on experience with Configuration Management (CM) policies and approaches with regard to the software development life cycle (SDLC), along with automated scripting using Bash, PowerShell, Perl, and Python.
  • Exposure to all aspects of Software Development Life Cycle (SDLC) such as Analysis, Planning, Development, Testing, Implementation, Post-production analysis of the projects.
  • Experienced in development methodologies including Waterfall, Scrum, Kanban, Agile, and hybrid.
  • Knowledge of protocols such as FTP, SSH, HTTP, HTTPS, and AWS Direct Connect, and experience with Kickstart installations, support, configuration, and maintenance of Red Hat Enterprise Linux, CentOS, and Ubuntu.
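A minimal sketch of the automated deployment pattern referenced in the Lambda/Elastic Beanstalk bullet above, using Boto3: it registers a build artifact already staged in S3 as a new Elastic Beanstalk application version and rolls an environment onto it. The application, environment, bucket, and key names are placeholders, not values from any specific project.

    # Sketch: promote an S3-staged build to an Elastic Beanstalk environment (names are placeholders).
    import time
    import boto3

    eb = boto3.client("elasticbeanstalk", region_name="us-east-1")

    APP = "sample-app"                     # hypothetical application name
    ENV = "sample-app-qa"                  # hypothetical environment name
    BUCKET, KEY = "deploy-artifacts", "sample-app/build-123.zip"   # artifact already uploaded to S3
    VERSION = "build-123"

    # Register the S3 artifact as a new application version.
    eb.create_application_version(
        ApplicationName=APP,
        VersionLabel=VERSION,
        SourceBundle={"S3Bucket": BUCKET, "S3Key": KEY},
        Process=True,
    )

    # Point the environment at the new version; Elastic Beanstalk performs the rollout.
    eb.update_environment(EnvironmentName=ENV, VersionLabel=VERSION)

    # Poll until the environment finishes updating.
    while True:
        env = eb.describe_environments(EnvironmentNames=[ENV])["Environments"][0]
        if env["Status"] == "Ready":
            print(f"{ENV} is now running {env['VersionLabel']}")
            break
        time.sleep(15)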

TECHNICAL SKILLS:

Operating Systems: Windows, Linux (Red Hat 6/7/8, CentOS), Ubuntu

Programming & Scripting Languages: Java, C, Python, .NET, C++, XML, HTML, CSS, Bash/Shell, PowerShell, Perl, Ruby, PHP, JSON, Groovy, YAML

Application Servers: Apache Tomcat, WebLogic Application Server, WebSphere

Version Control Tools: Git, GitLab, Bitbucket, TFS, Subversion (SVN)

CI/CD & Build Tools: Jenkins, Bamboo, Maven, Gradle, ANT, Concourse, uDeploy, Octopus, Azure DevOps

Automation Tools: Ansible, Terraform, Chef, Puppet, Kickstart

Containerization Tools: Docker, Kubernetes, ECR, ACR, EKS, AKS, OpenShift

Artifact Management Tools: Nexus, JFrog

Web Servers: Tomcat, Nginx, Azure, Apache, WebLogic, WebSphere, IIS

Networking/Protocols: TCP/IP, DNS, FTP/SFTP, HTTP/HTTPS, LAN, WAN, NIS, NFS, NDS, SMTP

Virtualization Technologies: VMware, Windows Hyper-V, VirtualBox

Cloud Environments: AWS, Azure, GCP, Pivotal Cloud Foundry (PCF), OpenStack

Infrastructure Tools: Terraform, Terragrunt, CloudFormation, Azure Resource Manager, HashiCorp Packer

Databases: Oracle, MySQL, MongoDB, AWS RDS, DynamoDB, PostgreSQL

Code Security Tools: SonarQube, Forseti Security, Cloud Armor

Vaults: HashiCorp Vault, Azure Key Vault, Azure Recovery Services Vault

Monitoring Tools: Splunk, Nagios, CloudWatch, ELK, Prometheus, Grafana

Reporting & Ticketing Tools: ServiceNow, JIRA, Azure Boards, Confluence

PROFESSIONAL EXPERIENCE:

Confidential, Cleveland, OH

DevOps Engineer

Responsibilities:

  • Designed, configured, and deployed Microsoft Azure for a multitude of applications utilizing the Azure stack (including Compute, Web & Mobile, Blobs, ADF, Resource Groups, Azure SQL DW, Cloud Services, and ARM), focusing on high availability, redundancy, fault tolerance, and auto-scaling.
  • Configured Azure web apps, Azure App services, Azure Application insights, Azure Application gateway, Azure DNS, Azure Traffic manager, Azure Network Watcher.
  • Managed VM backup and recovery from a Recovery Services vault using Azure PowerShell and the Azure Portal.
  • Designed, planned, and created Azure virtual machines, and connected them to on-premises environments by managing a cross-premises Azure virtual network.
  • Involved in creating, validating, and reviewing solutions for data center migration to the Azure cloud environment.
  • Set up Azure Virtual Machines (VMs) to meet security requirements with software-based functions (firewall, WAN optimization, and intrusion detection).
  • Involved in migrating infrastructure and applications from on-premises to Microsoft Azure.
  • Created preliminary PowerShell code for moving Azure Classic workloads to the Azure Resource Manager model.
  • Worked on AWS cloud services such as EC2, ELB, Auto Scaling, IAM, SNS, SQS, DynamoDB, VPC, Route 53, RDS, S3, Elasticsearch, Elastic File System (EFS), CloudWatch, CloudTrail, Cloud Security, Lambda, Service Catalog, Kinesis, and Redshift.
  • Responsible for managing Amazon instances by creating AMIs and performing administration and monitoring of instances using Amazon CloudWatch (see the sketch after this list).
  • Expertise in Auto Scaling using AWS command line tools and AWS cloud environment for Dev/QA environments.
  • Managed AWS EC2 instances utilizing Auto Scaling, Elastic Load Balancing, and Glacier for our QA and UAT environments as well as infrastructure servers for Git and Chef; experienced in automating the CI/CD pipeline with AWS CodePipeline, Jenkins, and AWS CodeDeploy.
  • Worked with Amazon Macie to manage data security and data privacy of our clients.
  • Coordinated with and assisted developers in establishing and applying appropriate branching and labeling/naming conventions using GitLab, and analyzed and resolved conflicts related to merging source code in Git.
  • Created development and test environments for different micro services by provisioning Kubernetes clusters on Azure using Docker, Ansible, and Terraform.
  • Worked on creating and deploying Azure infrastructure as code using Terraform modules and ARM templates.
  • Used Terragrunt to keep Terraform code and backend configuration DRY, simplifying deployments with a single immutable remote Terraform codebase.
  • Worked with Terraform extensively, wrapped it in Terragrunt, and maintained the backend state in an S3 bucket.
  • Wrote Ansible playbooks from scratch in YAML; installed, set up, and troubleshot Ansible.
  • Updated the existing scripts to Ansible playbooks to install configurations on multiple servers in Azure.
  • Proficient in using Ansible Tower, which provides an easy-to-use dashboard and role-based access control, making it easier to grant individual teams access to Ansible for their deployments.
  • Developed Ansible scripts for an automated server provisioning and Docker images for isolation, reducing the time between provisioning and deployment from over 2 hours to less than 10 minutes.
  • Used GitHub to store code and integrated it with Ansible to deploy playbooks, managing servers and Docker containers along with their OS, applications, services, and packages and keeping them consistent.
  • Used Ansible Control server to deploy playbooks to the machines and systems in the inventory.
  • Integrated Jenkins with DevOps tools such as Nexus, SonarQube, and Ansible; ran the Jenkins CI/CD system on a Kubernetes container environment, utilizing Kubernetes and Docker as the runtime environment to build, test, and deploy.
  • Used Kubernetes to deploy, scale, load balance, and manage Docker containers across multiple namespaced versions.
  • Prototyped a CI/CD system with GitHub, utilizing Kubernetes and Docker as the runtime environment for the CI/CD system to build, test, and deploy.
  • Built Docker images using Azure Pipelines, pushed them to Artifactory, and deployed them to OpenShift containers using Kubernetes for the microservices.
  • Created various build and deployment scenarios, such as jobs to build from various branches and deploy tasks to development, QA, or staging/production servers using Azure DevOps.
  • Enabled Continuous Delivery through Deployment into several environments of Development, Test and Production using Maven and SonarQube.
  • Installed, configured, and managed the monitoring tools Splunk and Nagios for resource, network, and log-trace monitoring, and used JIRA for change control and ticketing.
  • Worked with CloudWatch and ELK to monitor OS metrics, server health checks, and file system usage, and for end-to-end logging and monitoring of our systems using Beats.
  • Worked in designing and implementing continuous integration system using Azure DevOps by creating pipelines using Python and Shell scripts.
  • Involved in writing Bash/Shell scripts for managing day-to-day transactions and automation of routine tasks.
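A minimal Boto3 sketch of the AMI backup and CloudWatch monitoring bullet above: it images an instance without rebooting it and attaches a CPU alarm that notifies an SNS topic. The instance ID, topic ARN, and threshold are placeholders, not values from the actual project.

    # Sketch: back up an EC2 instance as an AMI and add a CloudWatch CPU alarm (IDs are placeholders).
    from datetime import datetime, timezone
    import boto3

    INSTANCE_ID = "i-0123456789abcdef0"                                  # hypothetical instance
    SNS_TOPIC = "arn:aws:sns:us-east-1:111122223333:ops-alerts"          # hypothetical alert topic

    ec2 = boto3.client("ec2")
    cw = boto3.client("cloudwatch")

    # Take an AMI of the instance without rebooting it.
    stamp = datetime.now(timezone.utc).strftime("%Y%m%d-%H%M")
    image = ec2.create_image(
        InstanceId=INSTANCE_ID,
        Name=f"backup-{INSTANCE_ID}-{stamp}",
        NoReboot=True,
    )
    print("Created AMI:", image["ImageId"])

    # Alarm if average CPU stays above 80% for two consecutive 5-minute periods.
    cw.put_metric_alarm(
        AlarmName=f"high-cpu-{INSTANCE_ID}",
        Namespace="AWS/EC2",
        MetricName="CPUUtilization",
        Dimensions=[{"Name": "InstanceId", "Value": INSTANCE_ID}],
        Statistic="Average",
        Period=300,
        EvaluationPeriods=2,
        Threshold=80.0,
        ComparisonOperator="GreaterThanThreshold",
        AlarmActions=[SNS_TOPIC],
    )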

Environment & Tools: Azure, AWS (EC2, AMIs, VPC, S3, IAM, EBS, CloudTrail, CloudWatch, EMR, CloudFormation, SQS, SNS, Snowball, Lambda, Kinesis, Redshift, Route 53, RDS, MySQL), AWS Direct Connect, OpenStack, Node.js, Ansible, Terraform, Docker, Docker Swarm, Kubernetes, Jenkins, SonarQube, Git, GitLab, Python, Nagios, Splunk, TFS, JIRA, ELK Stack, Kafka, ZooKeeper, PowerShell, OpenShift.

Confidential, Houston, TX

DevOps Engineer / SRE

Responsibilities:

  • Monitored resources and applications using AWS CloudWatch, including creating alarms to monitor metrics for EBS, EC2, ELB, RDS, and S3, and configured notifications for alarms generated based on defined events.
  • Created Users, Groups, Roles, Policies, and Identity providers in AWS Identity Access Management (IAM) for improved login authentication.
  • Involved in creating AWS AMIs; used HashiCorp Packer to create and manage the AMIs.
  • Developed AWS CloudFormation templates to create custom-sized VPCs, EMR, DynamoDB, subnets, EC2 instances, ELBs, and security groups.
  • Designed Network Security Groups (NSGs) to control inbound and outbound traffic access to network interfaces (NICs), VMs, and subnets.
  • Worked on Google Cloud Platform (GCP) services including Cloud Storage, Cloud SQL, Compute Engine, Cloud Load Balancing, Stackdriver Monitoring, Cloud Armor, and Cloud Deployment Manager.
  • Worked on Kinesis Data Streams and Kinesis Firehose, integrated with AWS Lambda for serverless data collection.
  • Deployed AWS Lambda code from Amazon S3 buckets; created a Lambda deployment function and configured it to receive events from an S3 bucket (see the sketch after this list).
  • Created Snowflake schemas by normalizing dimension tables as appropriate and creating a sub-dimension named Demographic as a subset of the Customer dimension.
  • Performed bulk loads of JSON data from S3 buckets into Snowflake.
  • Used Snowflake functions to parse semi-structured data entirely with SQL statements.
  • Configured Ansible to manage AWS workflow environments and automate the build process for core AMIs used by all application deployments including Auto scaling, and Cloud formation scripts.
  • Created and managed Auto Scaling groups on top of AWS instances, and managed configurations and policies for CloudWatch alarms and SNS messages using Forseti.
  • Used Maven as a build tool on java projects for the development of build artifacts on the source code by including necessary dependencies.
  • Maintained Artifacts in binary repositories using JFrog Artifactory and pushed new Artifacts by configuring the Jenkins project that uses Jenkins Artifactory plugin.
  • Responsible for installation and configuration of AWS CodeBuild to support various Java builds, and of Jenkins plugins to automate continuous builds and publish Docker images to the JFrog repository.
  • Involved in building a pipeline to push all microservice builds to a Docker registry and deploy to Kubernetes using AWS CodeBuild, CodeDeploy, and CodePipeline.
  • Managed the source code for various applications in Bitbucket and configured it with Jenkins to trigger automated builds.
  • Extensively worked with scheduling, deploying, and managing container replicas onto nodes using Kubernetes; created Kubernetes clusters working with Helm charts running on the same cluster resources and managed releases of Helm packages.
  • Hands-on experience with OpenShift: creating new projects and services for load balancing, adding them to routes to be accessible from outside, troubleshooting pods through SSH and logs, and modifying BuildConfigs and templates.
  • Deployed multiple java applications in Kubernetes using EKS and OpenShift to maintain high availability.
  • Configured pipelines on Jenkins and Concourse CI with HashiCorp Vault integration for secrets and dynamic injection of environment-specific keys, along with blue/green and canary-style traffic shaping.
  • Designed and distributed data across nodes and clusters in different availability zones in AWS Redshift and PostgreSQL; experienced in automating the infrastructure using Terraform in AWS.
  • Integrated SonarQube with AWS CodePipeline for continuous inspection of code quality and analysis using the SonarQube scanner for Maven.
  • Migrated application services from the Heroku platform and deployed them onto the AWS cloud platform.
  • Involved in automation using Python, Shell, and Bash scripting, and developed YAML scripts from scratch.
  • Created alerts and monitoring dashboards using Prometheus and Grafana for microservices deployed in AWS.
  • Tracked and prioritized issues and new features for later software releases using JIRA and Confluence.
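A minimal sketch of the S3-triggered Lambda pattern referenced above: the handler receives S3 ObjectCreated events and reads each newly uploaded object. The bucket notification wiring lives outside the code, and the JSON-parsing step is only an illustrative assumption.

    # Sketch: Lambda handler invoked by S3 ObjectCreated events (processing step is illustrative).
    import json
    import urllib.parse
    import boto3

    s3 = boto3.client("s3")

    def lambda_handler(event, context):
        for record in event.get("Records", []):
            bucket = record["s3"]["bucket"]["name"]
            key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])

            # Fetch the newly uploaded object and inspect its JSON payload.
            body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
            payload = json.loads(body)
            print(f"Processed s3://{bucket}/{key} ({len(body)} bytes, {len(payload)} top-level keys)")

        return {"statusCode": 200}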

Environment & Tools: AWS, AWS CloudWatch, Git, Bitbucket, Kinesis, Snowflake, S3, AWS Lambda, DynamoDB, GCP, Cloud Armor, Heroku, Bamboo, Ansible, Terraform, Maven, Jenkins, Docker, JFrog Artifactory, Kubernetes, EKS, Helm charts, OpenShift, Concourse CI, HashiCorp Packer, HashiCorp Vault, AWS Redshift, PostgreSQL, SonarQube, Python, Shell scripting, YAML, Prometheus, Grafana, JIRA, Confluence.

Confidential, Atlanta, GA

Cloud Engineer

Responsibilities:

  • Worked on AWS EC2, VPC, CloudWatch, IAM, and Elastic Beanstalk for provisioning and managing infrastructure through automation.
  • Designed and developed AWS CloudFormation templates to create custom VPCs, subnets, and NAT to ensure deployment of web applications.
  • Worked on multiple AWS EC2 instances; set up security groups, Elastic Load Balancing, AMIs, and Auto Scaling to design cost-effective, fault-tolerant, and highly available systems.
  • Implemented EFS storage mounted on different EC2 instances for sharing volumes.
  • Developed PowerShell scripts to automate the project creation, setting permissions for users, groups in TFS.
  • Implemented a serverless architecture using API Gateway, Lambda, and DynamoDB, and deployed AWS Lambda code from Amazon S3 buckets.
  • Created Lambda scripts to encrypt unencrypted EBS volumes and to stop instances that lack proper tags. Created Lambda functions using Boto3 to clean up old EBS snapshots and unused EBS volumes and to remove old AMIs for cost optimization (see the sketch after this list).
  • Deployed and monitored microservices using Pivotal Cloud Foundry (PCF), managed domains and routes with Cloud Foundry, and worked on PCF with Docker Swarm to deploy Spring Boot applications.
  • Created additional Docker slave nodes for Jenkins using custom Docker images and pushed them to ECR. Worked on all major Docker components: the Docker daemon, Hub, images, registry, Swarm, etc.
  • Worked with the Red Hat OpenShift Container Platform for Docker and Kubernetes; used Kubernetes to manage containerized applications via nodes, ConfigMaps, selectors, and Services, and deployed application containers as Pods.
  • Working experience deploying Java applications through WebLogic/WebSphere application servers.
  • Worked on CI/CD tools: Jenkins, Git, and JIRA for bug tracking, Maven and Nexus for artifact and configuration management, automation using Puppet, and shell scripting in a Linux environment; integrated Nexus, Jenkins, Git, and JIRA.
  • Deployed and configured Git repositories with branching, forks, tagging, and notifications. Experienced and proficient in version controlling and administering GitHub.
  • Extensive experience in creating VMs, setting up VM priorities, and creating templates and snapshots.
  • Developed automation scripting in Python to deploy and manage Java applications across Linux servers.
  • Implemented Shell, Perl and Python scripts for release and build automation.
  • Involved in ensuring efficient functioning of data storage and processing functions in accordance with company security policies and best practices in cloud security.
  • Continuously monitored the performance of the applications on the production environment using Nagios.
  • Worked with the development team to migrate Ant scripts to Maven; authored pom.xml files, performed releases with the Maven release plugin, and managed Maven repositories.
  • Built Kinesis dashboards and applications that reacted to incoming data using AWS provided SDKs, and exported data from Kinesis to other AWS services including EMR for analytics, S3 for storage, Redshift for big data and Lambda for event driven actions.

Environment & Tools: Maven, Ant, Jenkins, AWS Cloud, CloudWatch, PCF, Puppet, Shell, Python, JIRA, WebLogic Server, VMware, Apache Tomcat, load balancers, VPC, Elastic Beanstalk, Docker, Kubernetes, OpenShift, TFS, Nagios, MySQL and DynamoDB databases, Redshift, Git.

Confidential

Build and Release Engineer

Responsibilities:

  • Designed and maintained Subversion Repositories with branching, forks, merging, tagging, notifications, and the access control strategies.
  • Worked closely with Developers, QA teams, Production, and other stakeholders to deliver software through the build and deployment system.
  • Created and maintained Continuous Build and Continuous Integration environments in scrum and agile projects.
  • Experienced in writing Gradle, Maven and ANT scripts to perform continuous build and integration of Java applications using Jenkins with the use of necessary plugins and deploy using uDeploy.
  • Configured the Nexus artifact repository to store artifacts and created build configuration files including pom.xml.
  • Managed dependencies in Maven projects by creating parent-child relationships between required projects, and supported the deployment process with shell scripting.
  • Experience in Design and Automation of uDeploy Application process, component process, Environment resources model, Plugins in uDeploy, Notification Schemes, Environment Gates, and Approval Process.
  • Involved in automating uDeploy agent installation, configuration process and integration of Subversion into uDeploy to automate the code check-out process.
  • Used Bamboo for automating Builds and Automating Deployments.
  • Worked with ANT to perform builds and integrated it with Bamboo as part of the continuous integration process.
  • Involved in repository management in ANT, sharing snapshots and releases of internal projects using the Nexus tool.
  • Managed separate VMware clusters like QA, Development and Production.
  • Extensively worked on VMware Update Manager for host upgrades and patches.
  • Involved in troubleshooting network, memory, CPU, swap, and file system issues, and TCP/IP, NFS, DNS, and SMTP on Linux servers.
  • Administered configurations, jobs, kernel, boot, and hardware for Red Hat servers; installed, upgraded, and patched Red Hat servers using YUM.
  • Managed System Administration tasks during high deliverables for Linux servers.
  • Created and managed SLOs according to requirements from the developers.
  • Researched and developed an Apache SSL proxy front-end proof of concept providing SSL-encrypted communications between our Tomcat 5.5 and WebSphere servers and our Apache front ends.
  • Involved in managing project applications in Git and working with end users for effective utilization, verifying the methods used to create reliable and repeatable software builds.
  • Integrated Selenium test automation execution with Jenkins across different environments as part of the CI process (see the sketch after this list).
  • Involved in setting up JIRA as defect tracking system and configured various workflows, customizations, and plugins.
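A minimal sketch of the Selenium-under-Jenkins integration referenced above: a headless smoke test whose target URL is supplied by the CI job through an environment variable. The variable name and the title check are illustrative assumptions, not details from the actual test suite.

    # Sketch: headless Selenium smoke test driven by a Jenkins-provided environment variable.
    import os
    from selenium import webdriver
    from selenium.webdriver.chrome.options import Options

    BASE_URL = os.environ.get("TARGET_BASE_URL", "http://localhost:8080")  # set by the Jenkins job

    options = Options()
    options.add_argument("--headless")        # no display on the CI agent

    driver = webdriver.Chrome(options=options)
    try:
        driver.get(BASE_URL)
        # A non-empty title is the pass/fail criterion for this smoke check.
        assert driver.title, f"No page title returned from {BASE_URL}"
        print("Smoke test passed:", driver.title)
    finally:
        driver.quit()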

Environment & Tools: Apache Web Server, Apache Tomcat, Jenkins, Subversion (SVN), Git, ANT, Maven, Agile, JIRA, Bash/Shell scripting, Python, Puppet, Bamboo, Nexus, TCP/IP, NFS, Red Hat, CentOS, VMware, uDeploy.

Confidential

Linux System Administrator

Responsibilities:

  • Installed, maintained, and upgraded Red Hat Linux and Solaris servers using Kickstart-based network installation.
  • Perl development to interact with LDAP (AD) servers to query & modify person, host & net group data.
  • Installation, setup, configuration, security administration and maintenance for flavours of servers.
  • Coded various Perl/Shell scripts to automate backup and recovery.
  • Worked closely with SAN team, allocated storage to the server and shared with its cluster nodes.
  • Configured and administered multiple production Red Hat servers across multiple platforms.
  • Worked on various Unix/Linux clusters such as Oracle RAC, SAP Serviceguard S2S and local, Serviceguard CFS S2S and local, and Linux VCS HA cluster S2S.
  • Preparation and execution of server patching and upgrade on more than 500 servers including HPUX, AIX, Solaris and Red Hat Linux servers.
  • Most of these tasks were performed using native Korn shell and Perl scripts, including moving system resources from one partition to another.
  • Worked on volume management, disk management, and software RAID solutions using Veritas Volume Manager, and on file system tuning and growth using Veritas File System (VxFS).
  • Configured and maintained HA servers running Veritas Cluster Server and Red Hat Cluster Server, applying changes and managing failover schedules.
  • Configured the yum repositories for installing and updating the packages on the Red Hat Linux.
  • Worked on networking with LAN, WAN, Routers, Gateways, and configuring, maintaining System Securities using IPTABLES.
  • Expert in setting up SSH, SCP, and SFTP connectivity between Linux hosts; experienced in DNS, NIS, NFS, CIFS, FTP, Samba, LDAP, remote access, security management, and system troubleshooting.
  • Installed application connectivity software to enable IBM print services. Configured and administered firewall rules, using Snort and Nmap to effectively monitor system files, port security, and network traffic coming through the firewall.
  • Involved in UNIX Shell/Perl scripting for automation job and creating zones for application.
  • Created ZFS file system in Solaris 10 using Zpool and ZFS and created Veritas file systems, RAID 0, 1, 5 volumes.
  • Implemented Oracle RAC high availability application cluster on RHEL 4.5.

Environment & Tools: Linux instances, AIX servers, HP DL380 G6/G8, Dell R710/720/920, p8/p7 series, IBM BladeCenter (chassis blades HS23, HS22 & racks), HMC V7R7.7.0, iLO, KVM, RHEL 6/5, VMware vSphere.

Confidential

 Java/J2EE Developer

Responsibilities:

  • Involved in preparing functional definition documents and in discussions with business users and the testing team to finalize the technical design documents.
  • Developed the application using JSP, Servlets, AJAX, JavaBeans, and XML.
  • Designed components for the project using Model-View-Controller (MVC), Data Access Object (DAO).
  • Developed a Java Servlet that acts as a controller, maintains session state, and handles user requests in the middle tier.
  • Implemented Business Delegate, Session Facade, DAO, Singleton, Factory and DTO Design Patterns.
  • Implemented Swing and JavaFX frames in developing user interfaces.
  • Utilized Servlets to handle various requests from the client browser and send responses.
  • Used HTTP Servlet to track sessions of the users visiting the web pages.
  • Deployed this web application via WebSphere server.
  • Involved in batch processing using JDBC Batch to extract data from database and load into corresponding Application Tables.
  • Experienced in building dynamic single-page applications (SPAs) using the MEAN (MongoDB, Express, Angular, Node) full-stack JavaScript framework.
  • Worked with senior developers to write JVM memory management code using different object and garbage collector methods.
  • Used Log4j and commons-logging frameworks for logging the application flow.
  • Used SVN for source code and project documents version control.

Environment & Tools: J2EE, JSP, Servlets, JDBC, AJAX, XML, WebSphere, Eclipse, SDLC, Oracle, Windows, Log4j, SVN.
