AWS DevOps Engineer Resume
SUMMARY
- Over 10 years of experience in IT industry management, Continuous Integration and Continuous Delivery management, and AWS Infrastructure-as-Code using Terraform and AWS CloudFormation.
- Extensively working with AWS services like S3, EC2, ELB, EBS, Lambda, Auto Scaling, Route 53, CloudFront, IAM, CloudWatch, and RDS.
- Developed Jenkins pipelines to automate Chef cookbook uploads and generate the environments.json file based on applied Chef recipe attribute changes.
- Experience in developing Ansible playbooks to provision AWS EC2 instances, S3 buckets, ELB, and Route 53; integrated Ansible with Jenkins to perform continuous deployment across multiple environments.
- Experience using AWS services including EC2 with Auto Scaling for launching instances, Elastic Load Balancer, Elastic Beanstalk, S3, Glacier, CloudFront, RDS, VPC, Route 53, CloudWatch, security groups, CloudFormation, IAM, and SNS.
- Experience in branching, tagging, and maintaining versions across environments using SCM tools like Subversion (SVN) and Git in UNIX and Windows environments.
- Orchestrated and migrated CI/CD processes using Cloud Formation and Terraform Templates and Containerized the infrastructure using Docker.
- Involved in setting up Continuous Integration and Continuous Deployment with Jenkins and Hudson from scratch for end-to-end automation.
- Worked with message queues using MQ commands and the MQMON monitoring tool.
- Experience working with Apache Hadoop, Kafka, Spark, and Logstash.
- Worked with Apache Kafka for high-throughput publishing and subscribing with disk-based persistence.
- Experience in securing AWS accounts by provisioning IAM roles and policies, cross-account roles, KMS, AWS Config, S3 bucket encryption, and VPC endpoints.
- Experience in deploying Kubernetes clusters using Kops, kubeadm, and Ansible roles, and in managed Kubernetes cloud services like AWS EKS and Azure AKS.
- Assisted in Elasticsearch, Jira, and Chef Server setup for partners and trained other DevOps engineers on best-practice methods and procedures.
- Experience in creating Docker images, using Docker Hub, and handling multiple images primarily for middleware installations and domain configurations.
- Used Jenkins pipelines to drive all microservice builds out to the Docker registry, then deployed to Kubernetes, creating and managing pods with Kubernetes.
- Involved in building and maintaining Docker container clusters managed by Kubernetes on AWS using Linux, Bash, Git, and Docker. Utilized Kubernetes and Docker as the runtime environment of the CI/CD system to build, test, and deploy.
- Created customized dashboards for teams.
- Developed Confluence pages.
- Provided security and managed user access and quota using AWS Identity and Access Management (IAM) which included creating new Policies for user management in JSON.
- Experienced in a centralized logging and monitoring stack of Elasticsearch, Logstash, and Kibana.
- Involved in container-based deployments using Docker, working with Docker images, Docker Hub, Docker registries, and Kubernetes.
- Worked extensively on an Ansible automation engine that automates cloud provisioning, configuration management, application deployment, intra-service orchestration, and many other IT needs.
- Experienced in writing Shell, Bash, Perl, Python and PowerShell scripts to automate the deployments.
- Proficient in troubleshooting and system patching.
- Experience working with the Software Development Life Cycle (SDLC) and Agile methodologies, with validations to ensure quality assurance and control.
- Skilled in bug tracking and raising tickets in Jira and escalating tickets in ServiceNow.
- Experienced with ITIL and ITSM in ServiceNow.
- Proficient in networking and configuring TCP/IP, DNS, Samba, SSH, SMTP, HTTP/HTTPS, and FTP.
- Skilled in administration of production, development, and test environments running operating system platforms such as Windows, Ubuntu, Red Hat Linux, CentOS, UNIX, and IOS.
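The CI/CD bullets above describe driving microservice builds to a Docker registry and then deploying to Kubernetes by creating pods. As a minimal illustration (not taken from any specific project here), a Kubernetes Deployment manifest can be rendered as JSON with only the Python standard library, since Kubernetes accepts JSON as well as YAML; the service name, registry URL, and image tag below are hypothetical.

```python
import json

def deployment_manifest(name, image, replicas=2):
    """Build a minimal apps/v1 Deployment manifest as a plain dict."""
    labels = {"app": name}
    return {
        "apiVersion": "apps/v1",
        "kind": "Deployment",
        "metadata": {"name": name, "labels": labels},
        "spec": {
            "replicas": replicas,
            "selector": {"matchLabels": labels},
            "template": {
                "metadata": {"labels": labels},
                "spec": {"containers": [{"name": name, "image": image}]},
            },
        },
    }

# Hypothetical service and registry; a pipeline would template these per build.
manifest = deployment_manifest(
    "payments-api", "registry.example.com/payments-api:1.4.2", replicas=3
)
print(json.dumps(manifest, indent=2))
```

A Jenkins stage could write this JSON to a file and apply it with `kubectl apply -f`.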
TECHNICAL SKILLS
Source Code Management: Git, Bitbucket, SVN
Build/ CI Tools: Maven, Jenkins.
Monitoring Tools: ELK, Splunk, CloudWatch, Grafana.
Database Management: Oracle, SQL, PL/SQL, Redshift
Web/Application Servers: Apache Tomcat, Apache HTTP Server.
Containerization Tools: Docker, Kubernetes, AWS EKS
Orchestration Tools: Terraform, CloudFormation
Configuration Tools: Ansible
Cloud Services: AWS, GCP
Bug Tracking: Jira (Kanban).
Programming Languages: Shell, Python, Bash Scripting
Operating Systems: Windows, UNIX, Linux, Ubuntu
Testing Tools: Selenium
Version Control Tools: Subversion (SVN), Git, GitHub, GitLab
PROFESSIONAL EXPERIENCE
Confidential
AWS DevOps Engineer
Responsibilities:
- Worked with the implementation team to build and deploy multiple releases (maintenance, enhancement, and emergency releases) for Linux and Windows OS on AWS.
- Configured and tested SSO (single sign-on) connections between the client and third-party vendors.
- Coordinated with the team to deploy updated code into different environments using Jenkins and AWS CodeDeploy.
- Configured AWS EC2 instances using AMIs and launched instances to meet the requirements of specific applications.
- Developed AWS infrastructure from the AWS CLI to support Data Warehouse hosting, including EC2, Virtual Private Cloud (VPC), S3 buckets, public and private subnets, security groups, Elastic Container Service, route tables, Elastic Load Balancer, CloudWatch, CloudTrail, and a security management VPC.
- Provided security and managed user access and quota using AWS Identity and Access Management (IAM) which included creating new Policies for user management in JSON.
- Automated backups using Python scripts to transfer data to S3 buckets.
- Automated data flow from S3 buckets to designated data stores using Lambda scripts.
- Extended an existing Python script to add a new field, loaded it into the Athena table, and created tables using DynamoDB and S3 buckets.
- Validated scripts developed for the data ingestion process.
- Created an S3 bucket with a bucket policy to store the log files.
- Worked on Monitoring/Alerting tools such as Prometheus, Grafana.
- Worked with the team to build out automation templates in AWS CloudFormation.
- Configured and managed Elastic Load Balancing to avoid single point of failure of applications, thus providing high availability and network load balancing.
- Configured Elastic Load Balancers with EC2 Auto Scaling groups.
- Managed security groups and VPCs specific to each environment.
- Involved in maintenance and performance of Confidential instances.
- Helped Scrum Masters across the company customize Jira for their requirements.
- Participated in on-call rotation for deployments.
- Worked with containerization tools, implemented the transition to Docker, and developed a distributed cloud system using Kubernetes.
- Responsible for converting existing systems to infrastructure as code (Terraform) while maintaining platform stability.
- Monitored and alerted on production and corporate servers such as EC2 and storage such as S3 buckets using AWS CloudWatch and CloudFront.
- Developed Dev, Test, and Prod environments for different applications on AWS by provisioning Kubernetes clusters on EC2 instances using Docker, Ansible, and Terraform.
- Provided consistent environments using Kubernetes for deployment, scaling, and load balancing of applications from dev through production, easing the code development and deployment pipeline by implementing Docker containerization with multiple namespaces to divide cluster resources among multiple users.
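The CloudFormation work above (automation templates, an S3 log bucket with a bucket policy) can be sketched as a stdlib-only template generator; CloudFormation templates are plain JSON, so they can be assembled as dicts. The bucket name, logical IDs, and the deny-insecure-transport statement below are illustrative assumptions, not the actual templates used.

```python
import json

def s3_log_bucket_template(bucket_name):
    """Render a minimal CloudFormation template (JSON) for a log bucket.

    Hypothetical sketch: a real template would add encryption,
    lifecycle rules, and a tighter bucket policy.
    """
    return {
        "AWSTemplateFormatVersion": "2010-09-09",
        "Resources": {
            "LogBucket": {
                "Type": "AWS::S3::Bucket",
                "Properties": {"BucketName": bucket_name},
            },
            "LogBucketPolicy": {
                "Type": "AWS::S3::BucketPolicy",
                "Properties": {
                    "Bucket": {"Ref": "LogBucket"},
                    "PolicyDocument": {
                        "Version": "2012-10-17",
                        "Statement": [{
                            "Sid": "DenyInsecureTransport",
                            "Effect": "Deny",
                            "Principal": "*",
                            "Action": "s3:*",
                            "Resource": [{"Fn::Sub": "arn:aws:s3:::${LogBucket}/*"}],
                            "Condition": {"Bool": {"aws:SecureTransport": "false"}},
                        }],
                    },
                },
            },
        },
    }

template = s3_log_bucket_template("example-app-logs")
print(json.dumps(template, indent=2))
```

The rendered JSON could then be deployed with `aws cloudformation deploy --template-file ...`.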
Confidential
DevOps Engineer
Responsibilities:
- Experience working with Ansible versions 1.8 and 2.0 and Ansible Tower version 2.1.
- Experience in creating inventory, job templates and scheduling jobs using Ansible Tower.
- Worked as a Release Engineer for a team that involved different development teams and multiple simultaneous software releases.
- Configured and maintained Jenkins to implement the CI process and integrated the tool with Maven to schedule the builds.
- Designed, built, and coordinated an automated build-and-release CI/CD process using GitLab, Jenkins, and Ansible on hybrid IT infrastructure.
- Used Flume and Kafka to aggregate log data into HDFS.
- Developed a stream filtering system using Spark Streaming on top of Apache Kafka.
- Designed a system using Kafka to auto-scale the backend servers based on event throughput.
- Built Jenkins pipeline to drive all microservices builds out to the Docker registry and then deployed to Kubernetes by creating and managing pods using Kubernetes.
- Worked with Red Hat OpenShift Container Platform for Docker and Kubernetes to manage containerized applications using its nodes, ConfigMaps, selectors, and services, and deployed application containers as pods.
- Performed source code management using Git from the master repository and managed containers with Docker by creating Docker images.
- Built, tested, and deployed patches for the OpenStack and Kubernetes codebases.
- Scanned for, identified, and fixed security defects in the OpenStack codebase.
- Created additional Docker slave nodes for Jenkins using custom Docker images and pushed them to ECR.
- Set up Splunk monitoring on project environments to monitor application and system logs.
- Set up automated email alert rules in Splunk for error messages.
- Installed Splunk forwarders on project instances to monitor application and system logs.
- Managed Nagios for notifications.
- Created scripts for system administration and AWS using languages such as Bash and Python.
- Built RESTful APIs in front of diverse types of NoSQL storage engines.
- Installed, configured, and administered the Jenkins CI tool on Linux machines.
- Worked with Development Team Leads and Testing teams to establish a build schedule, execute the builds and troubleshoot build failures, if any.
- Implemented Ansible to manage all existing servers and automate the build/configuration of new servers.
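One item above describes a Kafka-driven system that auto-scales backend servers based on event throughput. A simplified sketch of such a scaling policy follows; the 20% headroom factor and fleet bounds are assumptions for illustration, not the production values.

```python
import math

def desired_backends(events_per_sec, per_server_capacity,
                     min_servers=2, max_servers=20):
    """Size a consumer fleet for observed Kafka throughput.

    Sketch of a scaling policy: provision for current throughput plus
    20% headroom, then clamp to the allowed fleet size. Thresholds are
    illustrative assumptions.
    """
    needed = math.ceil(events_per_sec * 1.2 / per_server_capacity)
    return max(min_servers, min(max_servers, needed))

# 10k events/s against servers that each handle 1.5k events/s -> 8 servers
print(desired_backends(10_000, 1_500))  # 8
```

In practice a controller would read throughput from Kafka consumer-lag or broker metrics, call a policy like this, and adjust an Auto Scaling group accordingly.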
Confidential
Oracle Developer/Analyst
Responsibilities:
- Wrote conversion scripts using SQL, PL/SQL, stored procedures, functions, and packages to migrate data from a SQL Server database to an Oracle database.
- Performed database administration of all database objects including tables, clusters, indexes, views, sequences, packages, and procedures.
- Implemented Oracle 11g and upgraded the existing database to 11g.
- Involved in logical and physical database layout design.
- Set up and designed backup and recovery strategies for various databases.
- Performed performance tuning of Oracle databases and user applications.
- Used SQL*Loader as an ETL tool to load data into the staging tables.
- Provided user training and production support.
- Improved application performance by rewriting SQL queries.
- Wrote packages to fetch complex data from different tables in remote databases using joins, subqueries, and database links.
- Created database objects like tables, views, materialized views, procedures, and packages using Oracle tools like Toad, PL/SQL Developer, and SQL*Plus.
- Partitioned the fact tables and materialized views to enhance performance.
- Extensively used bulk collection in PL/SQL objects to improve performance.
- Created records, tables, and collections (nested tables and arrays) to improve query performance by reducing context switching.
- Used Pragma Autonomous Transaction to avoid the mutating-table problem in database triggers.
- Extensively used advanced PL/SQL features like records, tables, object types, and dynamic SQL.
- Worked on user-opened tickets and job failures.
- Retriggered failed jobs as soon as possible to prevent downstream effects.
- Periodically checked Control-M jobs to ensure no failures were missed.
- Manually ran important schema procedures when a job failed repeatedly.
- Worked on automation for continuously failing jobs.
- When a script failed, retrieved it from the server, modified it, and migrated it back to the server.
- Performed deep investigation of scripts that were not updating data.
- Modified various existing packages, procedures, and functions according to new business needs.
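The conversion scripts above moved data between databases. As an illustrative analogue only (sqlite3 standing in for the real SQL Server source and Oracle target, with a hypothetical schema), the batched read/insert/commit pattern such scripts typically follow looks like this:

```python
import sqlite3

# In-memory stand-ins for the source and target databases.
src = sqlite3.connect(":memory:")
dst = sqlite3.connect(":memory:")

src.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
src.executemany("INSERT INTO customers VALUES (?, ?)",
                [(1, "Acme"), (2, "Globex"), (3, "Initech")])

dst.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")

BATCH = 2  # commit in small batches so a failure loses little work
cur = src.execute("SELECT id, name FROM customers ORDER BY id")
while True:
    rows = cur.fetchmany(BATCH)
    if not rows:
        break
    dst.executemany("INSERT INTO customers VALUES (?, ?)", rows)
    dst.commit()

migrated = dst.execute("SELECT COUNT(*) FROM customers").fetchone()[0]
print(migrated)  # 3
```

Against Oracle, the same batch-and-commit structure would sit inside PL/SQL using BULK COLLECT with a LIMIT clause, which is what the bulk-collection bullets above refer to.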
Confidential
Responsibilities:
- Gathered requirements from the business team.
- Converted functional requirements into technical documents per business standards.
- Analyzed the existing code and performed impact analysis.
- Created database objects like tables, views, indexes, sequences, synonyms, stored procedures, functions, packages, cursors, ref cursors, and triggers per business requirements.
- Modified various existing packages, procedures, functions, and triggers according to new business needs.
- Wrote SQL queries using joins, subqueries, and correlated subqueries to retrieve data from the database.
- Used SQL*Loader to upload data into the database and the UTL_FILE package to write data to files.
- Worked on explain plans for query optimization and validated code per client requirements.
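One bullet above mentions correlated subqueries. As a small illustration (sqlite3 standing in for Oracle, with a hypothetical schema), the following correlated subquery selects each department's highest-paid employee by re-evaluating the inner MAX per outer row:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE emp (name TEXT, dept TEXT, salary INTEGER)")
conn.executemany("INSERT INTO emp VALUES (?, ?, ?)", [
    ("ann", "eng", 120), ("bob", "eng", 100),
    ("carl", "ops", 90), ("dana", "ops", 95),
])

# The subquery is correlated: e2.dept refers to the current outer row.
rows = conn.execute("""
    SELECT e.name, e.dept, e.salary
    FROM emp e
    WHERE e.salary = (SELECT MAX(e2.salary) FROM emp e2 WHERE e2.dept = e.dept)
    ORDER BY e.dept
""").fetchall()
print(rows)  # [('ann', 'eng', 120), ('dana', 'ops', 95)]
```

The same SQL runs unchanged on Oracle; an explain plan would show the subquery driving a per-row probe, which is where the tuning work mentioned above usually starts.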