Sr. AWS DevOps Engineer Resume
Pleasanton, CA
PROFESSIONAL SUMMARY:
- Over 10 years of experience in DevOps, Release Engineering, Configuration Management, Cloud Infrastructure, Automation, and Amazon Web Services (AWS).
- Experience in infrastructure development and operations on the AWS Cloud platform (EC2, EBS, S3, VPC, RDS, SES, SNS, SQS, ELB, Auto Scaling, CloudFront, CloudFormation, ElastiCache, CloudWatch), Microsoft Azure Cloud, and CI/CD pipelines.
- Experience in core Java concepts such as object-oriented programming, JDBC, and multi-threading, and in advanced Java technologies such as JSP, Servlets, Hibernate, Struts, Spring, and Web Services.
- Extensive GUI design and development experience for financial and banking applications using HTML5, CSS3, AJAX, JavaScript, and XML.
- Extensive experience with essential DevOps practices: continuous integration, continuous deployment, continuous delivery, build pipeline support, release management, configuration management (Infrastructure as Code), and cloud computing.
- Good experience writing ad-hoc scripts in Python, Ruby, and Shell.
- Solid understanding of Software Development Life Cycle methodologies including Waterfall, Agile, and Scrum.
- Experience with monitoring tools such as Splunk and Nagios, and used CloudWatch to monitor and analyze AWS infrastructure.
- Expert in deploying code to web and application servers such as WebSphere, WebLogic, Apache Tomcat, and JBoss.
- Deep expertise in building and breaking cloud-scale systems, with a focus on information security, user authentication, network security, key management, and resource isolation.
- Highly organized and able to multi-task, with the ability to work individually, within a team, and with other groups.
- Experience working with various Azure services, such as Data Lake, to store and analyze data.
- Special interest and experience in AWS cloud infrastructure database migrations, particularly PostgreSQL, including converting existing Oracle and MS SQL Server databases to PostgreSQL, MySQL, and Aurora.
- Experience writing Ansible playbooks to automate tasks across multiple servers, managing configurations of AWS nodes, testing playbooks on AWS instances using Python, and writing Ansible scripts to support development servers.
- Extensively worked with change-tracking tools such as Remedy and JIRA for tracking, reporting, and managing bugs.
- Proficient in authoring and managing configuration management frameworks such as Chef, Ansible, and Puppet.
- Experience creating containerized instances with Docker and working with Docker components such as Docker Engine, Hub, Machine, Compose, and Registry.
- Created and managed a Docker deployment pipeline for custom application images in the cloud using Jenkins.
- Experience setting up build and deployment automation for Terraform scripts using Jenkins.
- Provisioned highly available EC2 instances using Terraform and CloudFormation, and wrote new plugins.
- Migrated on-premises data (Oracle, SQL Server, DB2, MongoDB) to Azure Data Lake Store (ADLS) using Azure Data Factory (ADF V1/V2).
- Automated and extended continuous delivery for applications using Chef; responsible for creating and managing Chef cookbooks, and wrote several cookbooks to automate environment provisioning and middleware infrastructure installations.
- Prior experience working with common developer toolchains (Jenkins, Bamboo, TeamCity) to achieve continuous integration.
- Cloud platforms: Azure (API Management Services, Data Factories, App Services, Data Lake Store, SQL Databases, and Virtual Machines).
- Extensive experience in ETL and BI testing for many data extraction and data migration projects.
- Developed automation and processes to enable teams to deploy, manage, configure, scale, and monitor their applications in data centers and in the cloud.
- Implemented continuous integration and continuous deployment using Jenkins for end-to-end automation of all builds and deployments, and implemented a Blue-Green deployment methodology to minimize downtime in the production environment.
- Implemented a serverless architecture using API Gateway, Lambda, and DynamoDB, and deployed AWS Lambda code from Amazon S3 buckets. Created a Lambda deployment function and configured it to receive events from an S3 bucket (see the sketch following this list).
- Transformed traditional environments into virtualized environments using AWS (EC2, S3, EBS, EMR, ELB, Kinesis, Redshift), Matillion, Chef, Puppet, Jenkins, Jira, Docker, Vagrant, OpenStack (Nova, Neutron, Swift, Cinder), and VMware.
- Extensive experience with build automation tools such as Ant, Maven, and Gradle.
- Self-starter with an in-depth understanding of the strategy and practical implementation of AWS and OpenStack technologies. Hands-on experience in AWS provisioning and good knowledge of AWS services such as EC2, S3, Route 53, CloudFormation, Elastic Beanstalk, VPC, and EBS, as well as application deployment and data migration on AWS.
- Experience in branching, merging, tagging, and maintaining versions across environments using source code management tools such as SVN, Git, and CVS.
- Implemented Kubernetes for container cluster management in the Jenkins CI/CD pipeline.
- Worked on many aspects of Kubernetes cluster deployment and cluster health services, from developing cloud services to deployment on top of runtime infrastructure.
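To illustrate the serverless pattern referenced above (a Lambda function configured to receive S3 events), the following is a minimal Python sketch, not project code; the handler name and logging behavior are assumptions.

```python
# Minimal, illustrative AWS Lambda handler (Python 3) for S3 event notifications.
# This is a generic sketch of the pattern described above, not actual project code.
import json
import logging

logger = logging.getLogger()
logger.setLevel(logging.INFO)

def lambda_handler(event, context):
    """Log each object that triggered the S3 event notification."""
    records = event.get("Records", [])
    for record in records:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        size = record["s3"]["object"].get("size", 0)
        logger.info("Received object s3://%s/%s (%d bytes)", bucket, key, size)
    return {"statusCode": 200, "body": json.dumps({"processed": len(records)})}
```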
TECHNICAL SKILLS:
Operating Systems: Windows Server 2000/2003/2008, Windows XP, Windows 7, Linux (RHEL), Ubuntu, CentOS.
CI Tools: Jenkins, Bamboo.
Build Tools: ANT, MAVEN, Gradle.
Version Tools: SVN, Bitbucket, GIT, TFS.
CM Tools: Chef, Ansible, Puppet, Terraform.
Cloud Technologies: Amazon Web Services (AWS), EC2, EBS, VPC, RDS, ELB, Autoscaling, S3, Microsoft Azure Cloud, CloudFront, CloudFormation, ElastiCache, SNS.
Database: MySQL, MongoDB, Cassandra, SQL Server, Oracle.
Languages/ Scripting: C, C++, Java/J2EE, HTML5, CSS3, XML, JavaScript, Python, Ruby, Shell Scripting.
SDLC: Agile, Scrum, Waterfall.
Ticketing Tools: Jira, Remedy.
Repositories: Nexus, Artifactory, JFrog.
Monitoring Tools: Nagios, Splunk, CloudWatch.
Web / Application Servers: Apache Tomcat, JBOSS, WebSphere Application Server.
Web technologies: Servlets, JDBC, JSP, HTML, JavaScript, XML.
Virtualization: VMware, VMware vSphere, XEN, Orchestrator, Vagrant
PROFESSIONAL EXPERIENCE:
Sr. AWS DevOps Engineer
Confidential, Pleasanton, CA
Responsibilities:
- Designed, configured, and managed public/private cloud infrastructure on Amazon Web Services, including EC2, VPC, Route53, S3, SNS, Glacier, IAM, EBS, Redshift, Lambda, DynamoDB, subnets, security groups, route tables, Elastic Load Balancer, CloudWatch, and CloudTrail.
- Experience with EC2, CloudWatch, Elastic Load Balancing, and managing security on AWS.
- Launched and configured Amazon EC2 cloud servers using AMIs (Linux/Ubuntu) and configured the servers for specific applications using Jenkins.
- Configured Identity and Access Management (IAM) users and groups for secure access to AWS services.
- Guided and migrated PostgreSQL and MySQL databases to AWS Aurora.
- Designed AWS CloudFormation templates to create custom-sized VPCs, subnets, and NAT gateways to ensure successful deployment of web applications and database templates.
- Designed and implemented DevOps and CI/CD processes using Git, Bitbucket, Jenkins, Docker, Kubernetes, and AWS services.
- Created S3 buckets and managed their policies, and utilized S3 and Glacier for storage and backup on AWS.
- Troubleshot and resolved system problems, including issues with AWS services.
- Maintained build-related shell scripts for Maven builds; created and modified build configuration files including pom.xml.
- Configured inbound/outbound rules in AWS security groups according to requirements.
- Used MySQL, DynamoDB, and ElastiCache for basic database administration, and wrote Python scripts to move data from DynamoDB to a MySQL database (a simplified sketch follows this list).
- Involved in AWS EC2/VPC/S3/SQS/SNS automation through Terraform, Ansible, Python, and Bash scripts. Adopted new AWS features as they were released, including ELB and EBS.
- Experience working across Dev, QA, Staging, and Prod systems, in addition to managing requests and tickets for IT process management through the Jira ticketing tool.
- Worked with version control tools like GIT and SVN and integrated GIT into Jenkins to automate the code check-out process.
- Responsible for Continuous Integration ( CI ) and Continuous Delivery ( CD ) process implementation using Jenkins along with Python and Shell scripts to automate daily routine jobs.
- Worked on Ansible for configuration management and infrastructure automation.
- Wrote Ansible playbooks with Python SSH as the wrapper to manage configurations of AWS nodes, and tested playbooks on AWS instances using Python.
- Automated package installation from repositories and configuration changes on remotely accessed machines by creating Ansible playbooks; developed playbooks to install and configure Apache Tomcat and Jenkins.
- Utilized Ansible and GIT in the release pipelines for automated workflow.
- Worked with Docker and helped improve the continuous delivery framework to streamline releases and improve reliability.
- Implemented an AWS big data lake and data mart using Lambda, API Gateway, Apache Presto, and Spark-based ETL, storing trillion-row tables in Parquet.
- Used AWS Redshift, S3, Redshift Spectrum, and Athena to query large amounts of data stored on S3, creating a virtual data lake without having to go through an ETL process.
- Implemented Docker container clusters which are managed by Kubernetes. Utilized Kubernetes and Docker for the runtime environment of the CI/CD system.
- Involved in writing Maven build scripts and POM files for generating artifacts such as JAR, WAR and EAR .
- Worked with the Ambassador API gateway for microservices authentication and routing.
- Converted existing AWS infrastructure to a serverless architecture (AWS Lambda, Kinesis) deployed via CloudFormation.
- Integrated Kubernetes with Docker containers to create pods, ConfigMaps, and deployments.
- Used Kubernetes to deploy, scale, load-balance, and manage Docker containers with multiple namespaced versions.
- Implemented Jenkins build pipelines to push all builds to the Docker registry and then deploy them into Kubernetes.
- Validated the ETL Scripts in the Target Database (Oracle) for creating the cube before migrating it to SQL Server.
- Developed Test Cases for Deployment Verification, ETL Data Validation, Cube Testing and Report testing.
- Tested the ETL Informatica mappings and other ETL Processes (Data Warehouse Testing).
- Worked on Oracle Data Integration ETL data mappings for source data extraction.
- Implemented Keycloak SSO tool for Authenticating and Authorizing multiple services.
- Knowledge of SAML and OAuth2 protocols for SSO.
- Used CloudWatch Logs to move application logs to S3 and created alarms based on exceptions raised by applications. Managed network security using load balancers, Auto Scaling, security groups, and NACLs.
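The Python data-movement scripts mentioned above followed the general shape of the sketch below. It is a simplified illustration: the table name ("orders"), column names, hosts, and credentials are hypothetical placeholders, not values from the actual environment.

```python
# Illustrative sketch of moving data from DynamoDB to MySQL with boto3 and PyMySQL.
# Table and column names ("orders", "order_id", "payload") are hypothetical placeholders.
import json
import boto3
import pymysql

dynamodb = boto3.resource("dynamodb", region_name="us-west-2")
table = dynamodb.Table("orders")

conn = pymysql.connect(host="mysql.internal", user="etl", password="***", database="reporting")

def copy_items():
    """Scan the DynamoDB table page by page and upsert each item into MySQL."""
    scan_kwargs = {}
    with conn.cursor() as cur:
        while True:
            page = table.scan(**scan_kwargs)
            for item in page["Items"]:
                cur.execute(
                    "REPLACE INTO orders (order_id, payload) VALUES (%s, %s)",
                    (item["order_id"], json.dumps(item, default=str)),
                )
            conn.commit()
            if "LastEvaluatedKey" not in page:
                break
            scan_kwargs["ExclusiveStartKey"] = page["LastEvaluatedKey"]

if __name__ == "__main__":
    copy_items()
```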
Environment: EC2, VPC, S3, Route53, Glacier, SNS, IAM, Lambda, RDS, Elastic Load Balancing, CloudWatch, CloudTrail, GIT, Jenkins, Docker, Ansible, Terraform, Ambassador API gateway, Kubernetes, Jira, Bitbucket, Shell, Maven, SAML, OAuth2, ETL, Aurora, Keycloak.
AWS DevOps Engineer
Confidential, Des Moines, IA
Responsibilities:
- Responsible for design and maintenance of the GIT repositories, views, and access control strategies and used GIT to keep track of all changes in source code.
- Created release branches in GitHub and performed Git administration, including branching, reporting, and assisting with project and end-user support.
- Installation, integration and configuration of Jenkins CI/CD, including installation of Jenkins plugins.
- Responsible for Continuous Integration ( CI ) and Continuous Delivery ( CD ) process implementation using Jenkins along with Python and Shell scripts to automate daily routine jobs.
- Exposed to the full development life cycle, participated in an Agile team, and gained hands-on experience with quality assurance methods.
- Implemented a CI/CD pipeline with Docker, Jenkins, and GitHub, virtualizing the Dev and Test environment servers with Docker and configuring automation through containerization.
- Worked with an ETL tool suite to design and develop workflows, transformation mappings, and UNIX scripts.
- Created a Python process hosted on Elastic Beanstalk to load the Redshift database daily from several sources.
- Setting up the build and deployment automation for Terraform scripts using Jenkins.
- Provisioned the highly available EC2 Instances using Terraform and Ansible Playbooks.
- Developed environments of different applications on AWS by provisioning on EC2 instances using Docker, Bash and Terraform.
- The objective of this project was to build a cloud-based data lake in AWS using Apache Spark and to provide visualization of the ETL orchestration using the CDAP tool.
- Performed data profiling, mapping, and integration from multiple sources to AWS S3, RDS, and Redshift.
- Prepared Master Detailed ETL and BI Test Plan for the entire Project.
- Current development toolset includes stored functions and triggers, SnapLogic, Aurora PostgreSQL, S3, Redshift, Snowflake.
- Performed end-to-end testing for the entire ETL and Business Intelligence solution.
- Developed a testing strategy for all upcoming data feeds for both ETL and BI arena.
- Set up the Jenkins master, added the necessary plugins, and added more slaves to support scalability and agility.
- Designed the pipeline using Jenkins for continuous integration and continuous deployment into different Web and Application Servers.
- Managed automated tasks in Jenkins that are triggered when code is developed, unit-tested locally, and checked into Git.
- Experience with build management tools such as Ant for writing build.xml files and Maven for pom.xml files.
- Used Ant for building artifacts and deploying them to JFrog Artifactory and Nexus.
- Experience handling web application servers such as WebSphere, WebLogic, JBoss, and Apache Tomcat for deployments.
- Ability to write scripts in Bash/Shell, Perl, Ruby, and Python to automate cron jobs and system maintenance.
- Helped migrate existing MySQL and Oracle databases to the cloud using AWS Database Migration Service and schema conversion.
- Integration between DB2, Oracle and SQL Server utilizing Web Services and MQ/JMS.
- Worked on Oracle Data Integration ETL data mappings for source data extraction.
- As a BI tester, was responsible for business requirements, ETL analysis, and ETL routines, and developed test strategies and designed the flow and logic for the data warehouse project.
- Wrote shell scripts for automation and Python scripts to automate log rotation for multiple server logs (a simplified sketch follows this list).
- Developed multiple Chef Cookbooks from scratch.
- Implemented automation with Vagrant, Chef on AWS for application testing, deployment, and development. Prepared documentation describing the configuration state of each item that was under CM control.
- Installed, Configured and administered Oracle WebLogic Server 10.0 MP1, 10.0 MP2, 11g and Webservers like Apache in Development, Test and Production Environments.
- Deployed web applications using Chef by developing the Cookbook. Also responsible for creating and importing all the environments required to run the project.
- Installed the Chef workstation, bootstrapped nodes, wrote recipes and cookbooks and uploaded them to the Chef server, and managed on-site OS/applications/services/packages using Chef, as well as AWS resources (EC2/S3/ELB) with Chef cookbooks.
- Experience in Chef Configuration management tool & created Chef Cookbooks using recipes to automate system deployment and scaling operations.
- Experience with setting up Chef Infra, bootstrapping nodes, creating and uploading recipes, node convergence, data bags, attributes, cookbooks and templates in Chef SCM.
- Experience working on several Docker components like Docker Engine, Docker Hub, Docker Machine, Docker Swarm and Docker Registry .
- Worked on Docker container snapshots, attaching to a running container, removing images, and managing directory structures and containers.
- Used Docker to implement a high-level API providing lightweight containers that run processes in isolation, and created Docker container images, tagging and pushing them to the Docker repository.
- Virtualized the servers using Docker for the test environment and development environment needs and also configuring automation using Docker containers.
- Installed and configured Nagios monitoring tool to monitor the network bandwidth and the hard drives status.
- Used Jira tool for issue tracking and bug tracking.
- Good Knowledge in Software Testing Life Cycle and use of Agile Methodology.
- Experience with Linux systems, virtualization in a large-scale environment, experience with Linux Containers and Docker.
- Developed Splunk queries and dashboards targeted at understanding application performance and capacity analysis.
- Installed, tested and deployed monitoring solutions with Splunk for log analyzing and improving the performance of servers.
- Experience working with LDAP Services and using various network protocols like HTTP, TCP/IP, FTP, SSH, UDP and SMTP.
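A simplified sketch of the log-rotation automation mentioned above, assuming a hypothetical application log directory, size threshold, and retention count; the actual scripts, paths, and policies are not reproduced here.

```python
# Illustrative Python log-rotation sketch of the kind mentioned above.
# Paths, retention count, and size threshold are assumed values, not project settings.
import gzip
import shutil
from pathlib import Path

LOG_DIR = Path("/var/log/myapp")       # assumed application log directory
MAX_BYTES = 50 * 1024 * 1024           # rotate logs larger than 50 MB
KEEP_ROTATIONS = 5                     # keep roughly the five most recent archives

def rotate(log_file: Path) -> None:
    """Compress an oversized log into a numbered .gz archive and truncate the original."""
    if log_file.stat().st_size < MAX_BYTES:
        return
    # Shift older archives: app.log.4.gz -> app.log.5.gz, ..., app.log.1.gz -> app.log.2.gz
    for i in range(KEEP_ROTATIONS - 1, 0, -1):
        older = Path(f"{log_file}.{i}.gz")
        if older.exists():
            older.rename(f"{log_file}.{i + 1}.gz")
    with open(log_file, "rb") as src, gzip.open(f"{log_file}.1.gz", "wb") as dst:
        shutil.copyfileobj(src, dst)
    log_file.write_text("")  # truncate in place so the writing process keeps its handle

for path in LOG_DIR.glob("*.log"):
    rotate(path)
```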
Environment: GIT, Jenkins, ANT, Bash/Shell, Perl, Ruby, Python, WebSphere, WebLogic, JBoss, Tomcat, Agile, JFrog, Nexus, Redshift, Chef, Docker, Splunk, Nagios, Jira, LDAP.
Confidential, Houston, TX
Build and Release Engineer
Responsibilities:
- Responsible for creating software builds and releases, including the design and development of builds, scripts, installation procedures and systems, including source code control and issue tracking.
- Wrote Shell and Perl scripts for the automation of deployments, including build and release tasks, and integrating them with continuous integration tools.
- Maintain Jira for issue reporting, status, and activity planning.
- Developed build and deployment scripts using Maven as the build tool in Jenkins to promote builds from one environment to another.
- Configured email and messaging notifications, managed users and permissions, and system settings by using Jenkins .
- Experience creating and managing pipelines using Azure Data Factory, copying data and configuring data flow in and out of Azure Data Lake Store according to technical requirements.
- Educated developers on how to commit their work and how to make use of the CI/CD pipelines in place.
- Helped individual teams set up their repositories in Bitbucket, maintain their code, and set up jobs that make use of the CI/CD environment.
- Prepared rollback strategies for various deployment activities in Subversion (SVN) and performed all necessary Subversion support, including branching, merging, tagging, checkouts, imports, and exports.
- Managed the code migration from TFS, CVS to Subversion repository.
- Used Nexus for periodic archiving and storage of the source code for disaster recovery, sharing artifacts and handling dependency management within the company.
- Working experience with release engineering and build/configuration management in a JBoss web application environment.
- Managed the implementation team to coordinate installation, template/table development, interface efforts, training, roll-out dates and on-going support for EPM/EMR.
- Provided implementation transition of EMR systems for providers and associated end users.
- Worked on build management and release engineering for JBoss web applications.
- Manually created and configured a JBoss template for deployment onto multiple servers.
- Managed Maven pom.xml files and scripts for repository management tools Artifactory and Nexus .
- Configured SSH, SMTP, Build Tools, and Source Control repositories in Jenkins. Installed multiple plugins to Jenkins. Configured Proxy to get auto updates.
- Maintained RedHat Linux servers on VMware.
- Experience in implementing Data warehouse solutions in AWS Redshift, worked on various projects to migrate data from one database to AWS Redshift, RDS, ELB, EMR, Dynamo DB and S3.
- Created Documentation for Application Deployment ( WAR, JAR, EAR ) in Domain and Clustered environments to achieve High Availability and Fail-over functionality.
- Worked with the infrastructure engineering team on Terraform templates for provisioning AWS resources such as EMR clusters, S3, and ECS.
- Created and scheduled cron jobs for backups, system monitoring, and removal of unnecessary files (a simplified sketch follows this list).
- Responsible for maintaining the Nagios monitoring tool to check log files and rectify errors.
- Provided 24/7 support for troubleshooting production issues and involved in Monitoring, tracking, coordinating and managing issues.
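A simplified sketch of the kind of backup-and-cleanup job scheduled via cron above; the source and destination paths and the 14-day retention window are assumed values rather than actual project settings.

```python
# Illustrative backup-and-cleanup sketch of the kind scheduled via cron above.
# Source/target paths and the 14-day retention window are assumed values.
import tarfile
import time
from datetime import datetime
from pathlib import Path

SOURCE = Path("/opt/app/config")          # assumed directory to back up
BACKUP_DIR = Path("/backups/app-config")  # assumed backup destination
RETENTION_DAYS = 14

def create_backup() -> Path:
    """Write a timestamped tar.gz archive of the source directory."""
    BACKUP_DIR.mkdir(parents=True, exist_ok=True)
    archive = BACKUP_DIR / f"config-{datetime.now():%Y%m%d-%H%M%S}.tar.gz"
    with tarfile.open(archive, "w:gz") as tar:
        tar.add(SOURCE, arcname=SOURCE.name)
    return archive

def prune_old_backups() -> None:
    """Remove archives older than the retention window."""
    cutoff = time.time() - RETENTION_DAYS * 86400
    for old in BACKUP_DIR.glob("config-*.tar.gz"):
        if old.stat().st_mtime < cutoff:
            old.unlink()

if __name__ == "__main__":
    create_backup()
    prune_old_backups()
```

Such a script would typically be scheduled with a crontab entry such as `0 2 * * * /usr/bin/python3 /opt/scripts/backup_config.py` (path shown for illustration only).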
Environment: Maven, Jira, Red hat Linux, JBoss, Subversion, Jenkins, VMware, EMR, CVS, TFS, Shell, Perl, Nagios.
Confidential, St. Paul, Minnesota
Linux/System Administrator
Responsibilities:
- Worked Extensively on various networking protocols like TCP/IP, FTP, HTTP, HTTPS, DHCP .
- Wrote shell scripts to automate daily tasks, document changes in the environment and on each server, and analyze error logs and user logs (a Python sketch of the same idea follows this list).
- Set up LDAP, DNS, and DHCP servers along with effective group and system-level policies and roaming profile features using Samba and NFS servers.
- Worked on troubleshooting network problems. Designed and developed Jenkins Build deployments.
- Developed UNIX and Bash scripts for manual deployment of code to the different environments and kept the team updated when builds completed.
- Performed disaster recovery activities and extracted disk ISOs for critical production environments.
- Involved in the design, configuration, installation, implementation, management, maintenance, and support of corporate Linux servers running RHEL 3, 4, and 5, CentOS 5, and Ubuntu.
- Managed routine system backups, scheduled jobs, enabled cron jobs, and enabled system and network logging of servers for maintenance.
- Provided 24x7 system administration support for RedHat Linux 3.x and 4.x servers and resolved trouble tickets on a shift rotation basis; supported server builds, patching, user administration, deployment, software installation, performance tuning, and troubleshooting.
- Updated and upgraded Red Hat Enterprise Linux servers using YUM and RPM repositories; created Logical Volume Manager (LVM) volumes for Linux operating systems; managed virtual memory and swap space on RedHat Linux servers.
- Configured sudo and granted root permissions to backup admins/DBAs to perform root-related activities.
- Installed and Configured the network servers DNS, NIS, NFS, SENDMAIL and application servers Apache, JBoss on Linux.
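The daily-task scripts above were written in shell; the sketch below shows the same log-analysis idea in Python for illustration. The log directory, error patterns, and report path are assumptions, not actual values.

```python
# Illustrative sketch (in Python, whereas the originals were shell scripts) of
# scanning server logs for error lines and writing a short daily summary.
import re
from collections import Counter
from pathlib import Path

LOG_DIR = Path("/var/log")                      # assumed log directory
PATTERN = re.compile(r"\b(error|failed|critical)\b", re.IGNORECASE)
REPORT = Path("/tmp/daily-error-summary.txt")   # assumed report location

counts = Counter()
for log in LOG_DIR.glob("*.log"):
    try:
        for line in log.read_text(errors="ignore").splitlines():
            if PATTERN.search(line):
                counts[log.name] += 1
    except PermissionError:
        continue  # skip logs this user cannot read

with REPORT.open("w") as out:
    for name, n in counts.most_common():
        out.write(f"{name}: {n} error lines\n")
```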
Environment: Red hat Linux, CentOS, DHCP, DNS, Apache, JBoss, Logical Volume Manager, Jenkins, HTTP, Samba, NFS, NIS, Shell, Bash.
Release Engineer
Confidential
Responsibilities:
- Developed Web Applications using HTML5, CSS3, JavaScript, Angular.js and jQuery .
- Worked on a team developing a Windows application (.exe) using .NET.
- Created all JavaScript logic (validation, animations, transitions, templating, date picking).
- Considered key problem solver for complex JavaScript issues.
- Created sliders, menus and front pages designs using JavaScript .
- Developed the application using Eclipse IDE and used eclipse standard/plug-in features for editing, debugging, compiling, formatting, build automation, test case template generation, mutator/accessor code generation and version control ( SVN ).
- Implemented J2EE standards, MVC2 architecture using Struts Framework.
- Used the lightweight container of the Spring Framework to provide architectural flexibility through Inversion of Control (IoC).
- Used Spring IoC for dependency injection with the Hibernate and Spring frameworks.
- Involved in writing Hibernate queries and Hibernate specific configuration and mapping files.
- Coded JDBC programs for connection to the MySQL/Oracle Database.
- Developed Servlets and JSPs based on the MVC pattern using the Struts Action framework.
- Deployed into WebSphere Application Server.
- Wrote unit tests using the JUnit framework and implemented logging using the Log4J framework.
- Used XML, XSLT, XPATH to extract data from Web Services output XML.
- Used ANT scripts to fetch, build, and deploy application to development environment.
Environment: HTML5, CSS3, JavaScript, Angular.js, jQuery, Eclipse IDE, SVN, MVC, Struts, Spring, Hibernate, MySQL, Oracle, JSP, WebSphere, Junit, Log4J, XML, XPATH, ANT.