
DevOps Engineer / Lead AWS Engineer Resume


SUMMARY

  • 9+ years of IT industry experience in Linux administration, software configuration management, change management, build automation, release management, and DevOps in large and small software development organizations.
  • Experience with build automation and continuous integration tools such as Ant, Jenkins, and Maven.
  • Experience with configuration management tools such as Puppet, Chef, and Ansible.
  • Developed Puppet modules to automate application installation and configuration management.
  • Expertise in all aspects of Chef: server, workstations, nodes, Chef clients, and components such as Ohai, push jobs, and Supermarket.
  • Extensively worked on Vagrant and Docker based container deployments to create environments for dev teams and to containerize environment delivery for releases.
  • Experience working with Docker Hub, creating Docker images, and handling multiple images, primarily for middleware installations and domain configuration.
  • Knowledge of Docker components such as Docker Hub, Machine, Compose, and Docker Registry.
  • Maintained Jenkins masters with 80+ jobs for 10+ different applications, supporting several quarterly and project releases in parallel.
  • Implemented an OpenShift Container Storage (Gluster) cluster for automation and high availability of applications.
  • Deployed Hawkular Metrics and Prometheus with a Grafana dashboard to monitor the OpenShift cluster.
  • Benchmarked Elasticsearch 5.6.4 for the required scenarios.
  • Worked on the AWS Cloud platform and its features, including EC2, VPC, EBS, AMI, SNS, RDS, CloudWatch, CloudTrail, CloudFormation, Auto Scaling, CloudFront, and S3.
  • Installed and configured a LAMP (Linux, Apache, MySQL, and PHP) stack environment to host WordPress websites.
  • Infrastructure design for ELK clusters; responsible for designing and deploying new ELK clusters (Elasticsearch, Logstash, Kibana, Beats, Kafka, ZooKeeper, etc.).
  • Worked with Kubernetes: cluster maintenance, logging, monitoring, security, and troubleshooting.
  • Used Elasticsearch for name pattern matching, customized to the requirements.
  • Installed and ran logstash-forwarder to push data.
  • Used Kibana plugins to visualize Elasticsearch data.
  • Set up a local Kubernetes cluster and got it up and running with Docker, Minikube, and kubectl.
  • Monitored applications in the OpenShift environment using Dynatrace application performance management (APM) tools. Developed APIs for integration with various data sources.
  • POC implementing AWS Kinesis based consumers in Lambda, piping data to a data lake while allowing real-time analytics using DynamoDB, with ordering based on mobile users' locations and ordering trends.
  • Experienced with GitLab CI and Jenkins for CI and for end-to-end automation of all builds and CD.
  • Expertise in using Nexus and Artifactory repository servers for Maven and Gradle builds.
  • Ability to build deployment and build scripts and automated solutions using shell scripting.
  • Experience in using monitoring tools like Icinga, Nagios.
  • Elasticsearch and Logstash performance and configuration tuning.
  • Developed ongoing test automation using an Ansible and Python based framework.
  • Used Ansible for setup/teardown of the ELK stack (Elasticsearch, Logstash, Kibana).
  • Identify and remedy any indexing issues, crawl errors, SEO penalties, etc.
  • Install, configure, and maintain ELK stack systems.
  • Work with engineering teams to optimize Elasticsearch data ingest and search.
  • Collaborate with business intelligence teams to optimize visualizations.
  • Architect horizontally scalable solutions (terabyte scale or larger).
  • Active team member; developed utilities using C, Sybase, Oracle PL/SQL, Perl, and UNIX shell scripting.
  • Involved in supporting applications running in the production environment and providing timely help to application users.
  • Operating systems: HP-UX 11i, Solaris 5.8, AIX, Windows.
  • Strong knowledge of Sybase ASE database programming using T-SQL stored procedures, built-in functions, triggers, cursors, etc.
  • Respond to and resolve access and performance issues.
  • Experienced in branching, tagging, and maintaining versions across environments using software configuration management tools such as Git/GitHub, Subversion (SVN), and Team Foundation Server (TFS) on Linux and Windows platforms.
  • 2+ years of experience compiling and deploying applications (JBoss, Apache, Tomcat, Windows, etc.) on Linux and Windows servers through a DevOps pipeline including Chef, Jenkins, Artifactory, and Red Hat OpenStack.
  • Expertise in deploying JBoss, Tomcat, and Apache servers through a DevOps pipeline including GitHub, Jenkins, Artifactory, and CA Release Automation.
  • Involved in development of front-end components using Spring MVC, JSP, JavaScript, JAVA, and JSON.
  • Developed JUnit test cases for all the developed modules.
  • Intermediate experience with advanced JavaScript, including prototype-based inheritance, AJAX, and JSON; familiar with JavaScript frameworks such as jQuery and jQuery UI.
  • Experienced in migrating SVN repositories to Git.
  • Used Pagination component of jQuery for navigation and used DOM and AJAX to display page contents.
  • Designed user interface for the product of gift card using Angular JS, jQuery, CSS3, HTML5 and JavaScript.
  • Attended daily Scrum meetings, kept up-to-date on best practices for JavaScript frameworks and techniques.
  • Ensured successful architecture and deployment of enterprise-grade PaaS solutions using Pivotal Cloud Foundry (PCF), as well as proper operation during initial application migration and new development.
  • Worked on Git implementations containing various remote repositories for a single application.
  • Experienced in handling the AWS and OpenStack cloud environments.
  • Well experience in setting up VPC peering between two VPCs and remote VPN.
  • Worked in all areas of Jenkins setting up CI for new branches, build automation, plugin management and securing Jenkins and setting up master/slave configurations.
  • Analyze and evaluate existing architecture at Customer on Premise Datacenters and Design, Configure and Migrate complex network architectures to AWS Public Cloud.
  • Frameworks: Spring, Spring AOP, Spring Boot, DAO in Spring Frameworks, Angular, Hibernate (ORM) 3.0/4.0
  • Proficient in AWS services: EC2, IAM, S3, Elastic Beanstalk, VPC, ELB, RDS, EBS, Route 53.
  • Provisioning EC2 instances and have knowledge on all resources areas of EC2 like instances, Dedicated hosts, volumes, Keypairs, Elastic IP’s, Snapshots, Load Balancers and Security Groups.
  • Worked in managing VMs in Amazon using AWS and EC2 .
  • Developed and tested many features for dashboard using Flask, CSS and JavaScript.
  • Developed server side application to interact with database using Spring Boot and Hibernate.
  • Developed POJOs for Data Model and made individual HBM records to delineate Java Objects with Relational database tables.
  • Developed backend of the application using the flask framework.
  • Utilize PyUnit, the Python unit test framework, for all Python applications.
  • Proficient in persistence framework like Hibernate and JPA.
  • Integrated Spring Hibernate and JPA frameworks.
  • Extensively worked on Python scripting and development. Used CSS to style web pages along with XHTML and XML markup.
  • Interacted with users at all stages of development to ensure development matched user specifications. Designed and implemented Python in Eclipse PyDev.
  • Strong programming skills in the design and implementation of multi-tier applications using web-based technologies such as Spring MVC and Spring Boot.
  • Used JPA and Hibernate with entity beans to interact with the persistence layer for CRUD operations.
  • Hands on Experience in AWS provisioning and good knowledge of AWS services like EC2 , S3, Glacier , ELB , RDS .
  • Good Knowledge in Bash, Ruby, Python and Perl scripting.
  • Staying up-to-date with current Web application and development technologies and services.
  • Working experience with various Python packages such as NumPy, SQLAlchemy, BeautifulSoup, pickle, PySide, PyMongo, SciPy, and PyTables.
  • Responsible for delivery of new environments with various middleware configurations for newly assigned projects, and performed backfill activities on all environments to bring them up to current release cycles.
  • Created AWS EBS volumes for storing application files for use with AWS EC2 instances whenever they are mounted to them and installed Pivotal Cloud Foundry (PCF) on EC2 to manage containers created by PCF.
  • Designed and maintained databases using Python and developed Python based API (RESTful Web Service) using Flask, SQLAlchemy and PostgreSQL.
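As a small illustration of the PyUnit (unittest) usage mentioned above, here is a hypothetical sketch; the helper function and test names are invented for illustration and are not from any specific project:

```python
import unittest

def normalize_hostname(name: str) -> str:
    """Hypothetical helper: lower-case a hostname and strip the domain suffix."""
    return name.strip().lower().split(".")[0]

class NormalizeHostnameTest(unittest.TestCase):
    def test_strips_domain(self):
        self.assertEqual(normalize_hostname("Web01.example.com"), "web01")

    def test_handles_whitespace(self):
        self.assertEqual(normalize_hostname("  DB02  "), "db02")

if __name__ == "__main__":
    # exit=False lets the script continue after the test run
    unittest.main(exit=False)
```

Such test modules are typically discovered and run in CI with `python -m unittest discover`.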

TECHNICAL SKILLS

DevOps Tools: Nexus Repository, SonarQube, Jenkins, Puppet, Chef, Ansible, Docker, Kinesis, Nagios, Icinga, Git.

Infrastructure as a Service: AWS, OpenStack (basic understanding).

Virtualization Platforms: Virtual Box, VMware, Vagrant.

Operating Systems: UNIX, Linux, Windows, FreeBSD.

Automation Tools: Jenkins, Cruise Control

Scripting Languages: Bash, Perl, Python, Ruby.

Version Control Software: Subversion, GIT, Perforce.

CD Tools: Cruise, UrbanCode uDeploy, UrbanCode Release/Build

Logging: Sumo Logic, Splunk, Salesforce, OpenShift.

Monitoring (24/7): Nagios, PagerDuty.

PROFESSIONAL EXPERIENCE

Confidential

DevOps Engineer / Lead AWS Engineer

Responsibilities:

  • Leveraged various AWS solutions such as EC2, S3, IAM, EBS, Elastic Load Balancer (ELB), Security Groups, Auto Scaling, and RDS in CloudFormation JSON templates.
  • Defined AWS Lambda functions for making changes to Amazon S3 buckets and updating an Amazon DynamoDB table.
  • Created snapshots and Amazon Machine Images (AMIs) of instances for backup, and created Identity and Access Management (IAM) policies for delegated administration within AWS.
  • Created Python scripts to fully automate AWS services including ELB, CloudFront distributions, EC2, Security Groups, and S3; the scripts create stacks and single servers and join web servers to stacks.
  • Wrote Python scripts to manage AWS resources via API calls using the Boto SDK, and also worked with the AWS CLI.
  • Used AWS Route 53 to route traffic between different availability zones. Deployed and supported Memcached/AWS ElastiCache, then configured Elastic Load Balancing (ELB) for routing traffic between zones.
  • Used IAM to create new accounts, roles, groups, and policies, and developed critical modules such as generating Amazon resource numbers and integration points with DynamoDB and RDS.
  • Wrote Chef Cookbooks to install and configure IIS7, Nginx and maintained legacy bash scripts to configure environments and then converted them to Ruby Scripts.
  • In-depth knowledge of Data Sharing in Snowflake.
  • Deployed applications on AWS by using Elastic Beanstalk.
  • Worked on Auto Scaling, Cloud watch (monitoring), AWS Elastic Beanstalk (app deployment), AWS EBS (persistent disk storage).
  • Extensively used Cloud Formation templates for deploying the infrastructures.
  • Written the Cloud Formation scripts for data lake components that use various AWS services such as Data pipeline, Lambda, Elastic Beanstalk, SQS,SNS and RDS database.
  • Worked as Cloud Administrator on Microsoft Azure, involved in configuring virtual machines, storage accounts, resource groups.
  • Developed the performance of Automation, Installation and overall Configuration Management of servers using Puppet and Chef.
  • Created AWS Multi-Factor Authentication (MFA) for instance RDP/SSH logon, worked with teams to lock down security groups.
  • Used ETL tools such as SnapLogic, using S3 and Data Pipeline to move data to AWS Redshift.
  • Developed API for using AWS Lambda to manage the servers and run the code in AWS.
  • Working with GITHUB to store the code and integrated it to Ansible Tower to deploy the Playbooks.
  • Created internal and external stages and transformed data during load.
  • Used the FLATTEN table function to produce a lateral view of VARIANT, OBJECT, and ARRAY columns.
  • Worked with both Maximized and Auto-scale functionality.
  • Used temporary and transient tables on different datasets.
  • Cloned Production data for code modifications and testing.
  • Shared sample data using grant access to customer for UAT.
  • Working with analysts and Department of Community Health personnel to gather business requirements.
  • Developed Apache Maven Project management tool POM file to automate the build process for the entire application such as manage project libraries, compiling, preparing war file and deploying in JBOSS EAP 6.2.
  • Configured and developed applicationContext of Spring framework to manage DAO, Service, Security and view by annotations.
  • Developed and configured ServletFilters, Listeners.
  • Used Time Travel (up to 56 days) to recover missed data.
  • Automated various infrastructure activities like Continuous Deployment, Application Server setup, Stack monitoring using Ansible Playbooks and has integrated Ansible with Jenkins.
  • Monitoring the jobs to analyze performance statistics.
  • Managing and scheduling batch Jobs on a Hadoop Cluster using Oozie.
  • Trained the team members regarding different data ingestion patterns.
  • Used Kibana for data analysis and product metric visualizations.
  • Wrote CI/CD pipeline in Groovy scripts to enable end to end setup of build & deployment using Jenkins.
  • Wrote Ansible Playbooks using Python SSH as Wrapper for Managing Configurations of my servers, Nodes, Test Playbooks on AWS instances using Python.
  • Developed various Unix Shell Scripts to manipulate data files, set-up environment variables, custom FTP utility, file archiving.
  • Good working experience with Embedded SQL, C, Sybase Open Server, and remote procedures.
  • Solid knowledge of object-oriented programming concepts.
  • Proficient in translating business requirements into logical data models and creating modularized enterprise applications.
  • Wrote scripts that perform optimizations and automation of daily batch jobs.
  • Domain and functional summary:
  • Out of my total 10 years of experience, I have worked around 8 years in equity/derivative trading, clearance, and settlement.
  • Worked on a real-time, low-latency, high-frequency trading engine for 2 years.
  • Very good exposure to trading, clearance, and settlement.
  • Worked with OpenShift platform in managing Docker containers and Kubernetes Clusters.
  • Created and maintained continuous Integration (CI) using tools Jenkins/Bamboo over different environments to facilitate an agile development process which is automated enabling teams to safely deploy code repeatedly.
  • Experience in all facets of the full CM process with tools such as SVN, Git, PVCS, ClearCase, ClearQuest, Perforce, Cruise Control, Jenkins, Bamboo, Chef, and Puppet.
  • Created SonarQube reporting dashboard to run analysis for every project.
  • Written GRADLE, MAVEN scripts to automate build processes and managed Maven repository using Nexus Tool and used the same to share snapshots and releases.
  • Managed Maven project dependencies by creating Parent-child relationships between all projects.
  • Used Ansible for setup/teardown of Confidential (Elasticsearch, Logstash, Kibana), troubleshot build issues with ELK, and worked toward solutions.
  • Wrote Chef cookbooks and recipes to automate installation of middleware infrastructure such as Apache Tomcat and the JDK, plus configuration tasks for new environments. Experience using Cruise Control and Bamboo as CI tools.
  • Working on Deployment and Configuration of Confidential for log analytics, full text search, application monitoring.
  • Configured network and server monitoring using Grafana, Confidential with Logspout and Nagios for notifications.
  • Developed Splunk queries and Splunk dashboards targeted at understanding applications performance and capacity analysis.
  • Subject matter Expert and administrator for the Tidal Enterprise Scheduler. Manage the application and patching along with creating guidelines and best practices for building jobs.
  • Derived the detailed Business, Technical, Security, Data, and Metadata Architecture design for the EDW (up to a Terabyte) with multiple data marts ranging in size from 200-400 GB's each.
  • Scheduled jobs of SSIS packages to ETL daily transactions from company's ERP system OLTP database to OLAP Data Warehouse.
  • Created Triggers to maintain log history of all tables and major changes to the existing production databases.
  • Customized the newly implemented Netezza framework (NZDIF) for project needs.
  • Developed ELT scripts to load the data warehouse on Netezza 6.0.
  • Worked extensively on the Netezza framework on the Linux platform.
  • Developed a data warehouse model in Snowflake for over 100 datasets using WhereScape.
  • Heavily involved in testing Snowflake to understand best possible way to use the cloud resources.
  • Developed ELT workflows using NiFI to load data into Hive and Teradata.
  • Worked on Migrating jobs from NiFi development to Pre-PROD and Production cluster.
  • Scheduled different Snowflake jobs using NiFi.
  • Used NiFi to ping snowflake to keep Client Session alive.
  • Installed, configured, modified, tested & deployed applications on Apache Webserver, Nginx & Tomcat, JBoss app servers.
  • Maintained JIRA for tracking and updating project defects and tasks ensuring successful completion of tasks in a sprint.
  • Planned release schedules with agile methodology & coordinated releases with engineering & SQA for timely delivery.
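The CloudFormation JSON templates described above can be sketched in Python with the standard library; this is a minimal, hypothetical template (resource names, properties, and the AMI id are placeholders, not from any real stack):

```python
import json

# Hypothetical minimal CloudFormation template: one EC2 instance plus a security group.
template = {
    "AWSTemplateFormatVersion": "2010-09-09",
    "Description": "Illustrative web-tier stack (not a production template)",
    "Resources": {
        "WebSecurityGroup": {
            "Type": "AWS::EC2::SecurityGroup",
            "Properties": {
                "GroupDescription": "Allow HTTP",
                "SecurityGroupIngress": [
                    {"IpProtocol": "tcp", "FromPort": 80, "ToPort": 80, "CidrIp": "0.0.0.0/0"}
                ],
            },
        },
        "WebServer": {
            "Type": "AWS::EC2::Instance",
            "Properties": {
                "InstanceType": "t3.micro",
                "ImageId": "ami-00000000",  # placeholder AMI id
                "SecurityGroups": [{"Ref": "WebSecurityGroup"}],
            },
        },
    },
}

# Serialize to the JSON document CloudFormation consumes.
template_json = json.dumps(template, indent=2)
```

A template like this would then be deployed with the AWS CLI (`aws cloudformation deploy`) or boto3's `create_stack`.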

Environment: AWS, S3, EC2, ELB, IAM, RDS, VPC, SES, SNS, EBS, Cloud Trail, Auto Scaling, Chef, Jenkins, Maven, JIRA, Linux, Java, Kubernetes, Terraform, Docker, AppDynamics, Nagios, ELK, SonarQube, Nexus, JaCoCo, JBOSS, Nginx, PowerShell, Bash, Ruby and Python.

Confidential

DevOps Architect / AWS Engineer

Responsibilities:

  • Created new EC2 instances with the desired role in a VPC dedicated to the Dev environment.
  • Wrote Cucumber test scripts that check the data ingested into various applications.
  • Worked on the design of a clinical data lake in AWS S3 with various zones for storing data consumed by analytics tools.
  • Worked with JIRA for creating Projects, assigning permissions to users and groups for the projects & Created Mail handlers and notification Schemes for JIRA.
  • Expertise in deploying JBoss, Tomcat, and Apache servers through a DevOps pipeline including GitHub, Jenkins, Artifactory, and CA Release Automation.
  • Building the CI/CD process from scratch.
  • Extensive knowledge in continuous integration tool Jenkins with different plugins like Github, Artifactory, SonarQube, SauceLab, CARA etc.
  • Responsible for creating database object like Procedure, Function, Trigger and Cursor.
  • Responsible for code optimization and database performance tuning.
  • T-SQL programming; identified data sources, constructed data decomposition diagrams, provided data flow diagrams, and documented the process. Additionally, wrote code for database access, modification, and construction, including stored procedures.
  • Requirement gathering and putting in technical frame work.
  • Supported existing applications of the DTCC risk engine, e.g., FIR, IMM, NYPC VAR, SPAN, etc.
  • Used Perl scripting for data validation.
  • Volunteered to design an architecture for a dataset in Hadoop with an estimated data size of 2 PB/day.
  • Integrated Splunk reporting services with Hadoop eco system to monitor different datasets.
  • Used Avro, Parquet and ORC data formats to store in to HDFS.
  • Developed workflow in SSIS to automate the tasks of loading the data into HDFS and processing using hive.
  • Develop alerts and timed reports Develop and manage Splunk applications.
  • Provide leadership and key stakeholders with the information and venues to make effective, timely decisions.
  • Establish and ensure adoption of best practices and development standards.
  • Communicate with peers and supervisors routinely, document work, meetings, and decisions.
  • Work with multiple data sources.
  • Designed and Created Hive external tables using shared Meta-store instead of derby with partitioning, dynamic partitioning and buckets.
  • Implemented Apache PIG scripts to load data to Hive.
  • Set up the DevOps pipeline to compile and deploy code through Chef, using Jenkins for continuous integration, Artifactory as the binaries repository, and Red Hat OpenStack for provisioning cloud servers.
  • Completed set up of the CD environment with a focus on UrbanCode uDeploy.
  • Possessed domain knowledge of all the platforms of Microsoft Azure cloud technology.
  • Worked on installation, configuration, and maintenance of Red Hat and CentOS.
  • Monitored Azure cluster health pre- and post-deployment.
  • Onboarding different applications into Jenkins environment for CI and managing Jenkins server.
  • Experienced with CI tools and Version Control Tools or Source Code tools.
  • Managing day to day activity of the cloud environment, supporting development teams with their requirements.
  • Worked on Versions controller like GIT and integration tools Jenkins.
  • Worked on Java and .NET applications, automating their builds end to end, including integrating test tools such as SonarQube and Visual Studio using Jenkins.
  • Developed Ansible playbooks to configure, deploy, and maintain software components of the existing infrastructure.
  • Creation and maintenance of content for the Ansible community and implementation of Ansible modules based on customer and community requirements.
  • Working on migrating legacy, on premise applications on various Cloud platforms like OpenStack, OpenShift, Container orchestration .
  • Very good understanding and working knowledge of Orchestrating Applications Deployments with Ansible.
  • Worked with Ansible playbooks for virtual and physical instance provisioning, Configuration management and patching through Ansible.
  • Used Jira as ticket tracking and workflow tool
  • Wrote Ansible playbooks with python SSH as the wrapper to manage Configuration of AWS nodes and tested playbooks on AWS instances using python. Run Ansible scripts to provide Dev servers.
  • Ansible setup, managing hosts file, authoring various playbooks and custom modules.
  • Managing Nexus and Sonarqube server for uploading the artifacts and code quality analysis.
  • Deploy and monitor scalable infrastructure on Amazon Web Services (AWS)& configuration management using Puppet.
  • Performance monitoring, resolving network issues, and tuning the system using tools such as top, iostat, vmstat, netstat, truss, sar, ndd, ethtool, dtrace, and strace.
  • Designed and implemented automation deployment using Urbancode and Cruise to promote major releases, monthly releases, and patches from Development -> Integration/QA -> Staging/UAT -> Production.
  • Installed and configured monitoring tools Nagios for monitoring the network bandwidth and the hard drives status.
  • Worked on DevOps group running Jenkins in a Docker container with slaves in Amazon AWS cloud configuration.
  • Managed Amazon Web Services (AWS) infrastructure with automation and configuration management tools such as Ansible; designed cloud-hosted solutions, with specific AWS product suite experience.
  • Worked with vmstat, iostat, sar, tnsping, netstat, and tcpdump to determine system and network health; deployed Linux and Windows virtual machines from pre-configured templates.
  • Configuring the Docker containers and creating Docker files for different environments.
  • Experienced in Gitlab CI and Jenkins for CI and for End-to-End automation for all build and CD.
  • Dockerizing of existing applications; Docker image development with size and speed optimization.
  • Used Docker containers for eliminating a source of friction between development and operations.
  • Installation, Configuration and administration of VMware.
  • Configured Yum repository server for installing packages from centralized server.
  • Installed Fuse to mount the keys on every Debian Production Server for Password-less authentication.
  • Installed and configured a DHCP server to give IP leases to production servers.
  • Applied the clustering Topology that meets High Availability and Failover requirements for performance and functionality.
  • Installation, configuration, and administration of DNS, LDAP, NFS, NIS, and Sendmail on Red Hat Linux/Debian servers.
  • Installation and configuration of PostgreSQL and MariaDB databases on Red Hat/Debian servers.
  • Configuration and Administration of Apache Web Server and SSL.
  • Created and maintained network users, user environment, directories, and security.
  • Provide the support of building the server, patching, user administration tasks, deployment, software installation, performance tuning and troubleshooting and KVM.
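The Ansible playbook runs described above are normally driven from the `ansible-playbook` CLI; this hypothetical Python helper only builds the command line (it does not execute anything), with the playbook, inventory path, and variables as illustrative placeholders:

```python
import shlex

def ansible_playbook_cmd(playbook, inventory, limit=None, check=False, extra_vars=None):
    """Build (but do not run) an ansible-playbook command line.

    The flags used (-i, --limit, --check, -e) are standard
    ansible-playbook options; all values here are illustrative.
    """
    cmd = ["ansible-playbook", playbook, "-i", inventory]
    if limit:
        cmd += ["--limit", limit]       # restrict the run to a host group
    if check:
        cmd.append("--check")           # dry run: report changes without applying
    for key, value in (extra_vars or {}).items():
        cmd += ["-e", f"{key}={value}"]  # pass extra variables to the play
    return cmd

cmd = ansible_playbook_cmd("site.yml", "hosts/dev", limit="webservers",
                           check=True, extra_vars={"app_version": "1.4.2"})
print(shlex.join(cmd))
```

In practice such a helper would hand the list to `subprocess.run(cmd, check=True)`; building the argument list instead of a shell string avoids quoting bugs.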

Environment: AWS, Ansible, Puppet, Red Hat, CentOS, VMware, Git, Bash Scripting, Shell Scripting, DHCP Server, KVM, Jenkins, SonarQube, Nexus

Confidential

DevOps Developer / Engineer

Responsibilities:

  • Interacted with client teams to understand client deployment requests.
  • Coordinate with Development, Database Administration, QA, and IT Operations teams to ensure there are no resource conflicts.
  • Worked closely with project management to discuss code/configuration release scope and how to confirm a successful release.
  • Build, manage, and continuously improve the build infrastructure for global software development engineering teams including implementations of build Scripts, continuous integration infrastructure and deployment tools.
  • Managed code migration from TFS, CVS, and StarTeam to Subversion repositories.
  • Implemented continuous integration using Jenkins.
  • Installed, configured, managed and monitoring tools such as Splunk, Nagios and Graphite for Resource monitoring, network monitoring, log trace monitoring.
  • Using Jira, Confluence as the project management tools.
  • Successfully collaborated with cross-functional teams in the design and development of software features for enterprise satellite networks using C/C++, leading to a senior role in the organization.
  • Created repositories according to the structure required with branches, tags and trunks.
  • Attended sprint planning sessions and daily sprint stand-up-meetings.
  • Configured applications servers (Apache Tomcat) to deploy the code.
  • Installation and configuration and setup of Docker container environment.
  • Created a Docker Image for a complete stack and created a mechanism via Git workflow to push the code into the container, setup reverse proxy to access it.
  • Used Kubernetes to deploy, load balance, scale, and manage Docker containers with multiple namespaced versions.
  • Prototype CI/CD system with GIT on GKE utilizing Kubernetes and Docker for the runtime environment for the CI/CD systems to build and test and deploy.
  • Experienced in Docker orchestration framework, Troubleshooting of Docker based applications.
  • Exposure to Mesos, Marathon & ZooKeeper cluster environments for application deployments and Docker containers.
  • Designed and Developed Bamboo Build deployments on Docker containers.
  • Installed Docker registry for local upload and download images and even from Docker Hub.
  • Used Submodules in the GIT and educated users working with sub modules in GIT.
  • Involved in migration of the Bamboo server, Artifactory & Git server.
  • Used Chef to configure and manage infrastructure, wrote cookbooks to automate the configuration setups, Deployments and implementation of Chef for infrastructure as code initiative.
  • Good in provisioning and deployment tools like Chef.
  • Worked on installation and configuration of Chef server and Chef clients (nodes).
  • Repaired broken Chef recipes and corrected configuration problems with other Chef objects.
  • Installed applications and load-balanced packages on different servers using Chef.
  • Developed unit and functional tests in Python and Ruby.
  • Developed and maintained Perl/Shell scripts for build and release tasks.
  • Integrated Maven with Jenkins for the Builds as the continuous Integration process.
  • Involved in upgrades of the Bamboo & Artifactory servers.
  • Maintained JIRA for tracking and updating project defects and tasks.
  • Manage and document all post deployment issues utilizing the post Deployments Issue Log.
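The Docker image workflow above (a complete-stack image pushed through a Git workflow) usually starts from a Dockerfile; this hypothetical sketch renders one in Python so the pieces are easy to see (base image, port, and commands are placeholders, not from any specific project):

```python
def render_dockerfile(base="python:3.11-slim", port=8000):
    """Render a minimal Dockerfile for a hypothetical app stack.

    The instructions used (FROM, WORKDIR, COPY, RUN, EXPOSE, CMD)
    are standard Dockerfile syntax; the values are illustrative.
    """
    lines = [
        f"FROM {base}",
        "WORKDIR /app",
        "COPY requirements.txt .",
        "RUN pip install --no-cache-dir -r requirements.txt",
        "COPY . .",
        f"EXPOSE {port}",
        'CMD ["python", "app.py"]',
    ]
    return "\n".join(lines) + "\n"

dockerfile = render_dockerfile()
```

Written to disk, such a file is built and pushed with `docker build -t myapp .` followed by `docker push`, which is the mechanism the Git workflow would trigger.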

Environment: Chef, Apache Tomcat, GIT, Python, Ruby, Bamboo, Perl, Shell, Maven, Jenkins, JIRA, Kubernetes, Docker.

Confidential

SAP consultant

Responsibilities:

  • Installed SUSE Linux on Cisco Hardware for SAP HANA deployment.
  • Experience with Linux installation, configuration management and patch administration as member of a production support team.
  • Strong knowledge of Linux Kernel configuration, performance monitoring, and tuning.
  • Good knowledge of LVM, which include creating PVs, VGs, LVs and file systems and trouble shooting.
  • Configuration and maintenance of common applications such as NFS, DHCP, NTP, SSH, DNS, and SNMP.
  • Strong knowledge of large-scale Linux deployment methodologies, kernel configuration, performance monitoring, and tuning.
  • Experience with SAN/DATA Centre Migration and Consolidation implementations.
  • Experienced in Strong Consolidation/Migration in an ENTERPRISE environment.
  • Involved in complete Administration tasks on UNIX, Red Hat Linux and Solaris and documentation for the projects executed.
  • Responsible for installation, configuration, and administration of Sun Solaris 9 and Red Hat Enterprise Linux on x86 architecture.
  • Installed required software patches and software.
  • Used RPMs to install, update, verify, query and erase packages from Linux Servers.
  • Configured Kick start server to Install Red Hat Linux on multiple machines.
  • Experience using Kick start and modified Kick start based on server profiles and hardware specifications.
  • Experienced Installing, Configuring and supporting VMware ESX 3.4 and 4 versions.
  • Installed, monitored and supported Web and application Servers on AIX and Linux environments.
  • Implementations and setup of local Linux disk backups using open Source applications.
  • Consolidated multiple Linux servers onto 2 virtualized physical servers.
  • Installation and troubleshooting on VMware running Linux (Red Hat) and Windows (windows 7, XP, Vista).
  • Worked with DBA for installation Oracle on Linux and Solaris.
  • Worked on installations on Power path on all Linux boxes.
  • Participate in installing and configuring UNIX/LINUX based Oracle 10g products.
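The LVM workflow mentioned above (PVs, VGs, LVs, then filesystems) follows a fixed command sequence; this hypothetical helper just assembles the commands as strings without running them (disk, group, and volume names are illustrative):

```python
def lvm_commands(disk, vg, lv, size, fstype="ext4", mountpoint="/data"):
    """Return the shell commands (as strings, not executed) for a basic
    PV -> VG -> LV -> filesystem workflow. All names/sizes are placeholders.
    """
    return [
        f"pvcreate {disk}",                    # initialize the physical volume
        f"vgcreate {vg} {disk}",               # create the volume group on it
        f"lvcreate -L {size} -n {lv} {vg}",    # carve out the logical volume
        f"mkfs.{fstype} /dev/{vg}/{lv}",       # put a filesystem on the LV
        f"mount /dev/{vg}/{lv} {mountpoint}",  # mount it
    ]

steps = lvm_commands("/dev/sdb", "vg_data", "lv_app", "50G")
```

On a real host these would be run with root privileges, one at a time, verifying each step with `pvs`, `vgs`, and `lvs` before continuing.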

Environment: VMware, Solaris, Kickstart, SUSE Linux, LVM, Oracle 10g products, NFS, DHCP, NTP, SSH, DNS, SNMP.
