AWS/DevOps Engineer Resume
Tampa, FL
SUMMARY
- Certified AWS Solutions Architect and AWS Developer Associate with 5+ years of experience in Information Technology as an AWS Engineer, DevOps Engineer, Cloud Engineer, and Systems Engineer.
- 5+ years of experience in System Administration, Build & Release Management, and DevOps engineering, working with multiple flavors of Linux including Red Hat, CentOS, and Ubuntu.
- Set up Docker on Linux and configured Jenkins to run under a Docker host.
- Experience in administration and maintenance of version control systems SVN and Git.
- Experience in installing and administering CI/CD tools like Jenkins.
- Experience in using configuration management tools like Chef, Puppet, and Ansible.
- In-depth experience with AWS cloud services (EC2, S3, EBS, ELB, CloudWatch, Elastic IP, RDS, SNS, SQS, Glacier, IAM, VPC, CloudFormation, Route 53) and managing security groups on AWS.
- Experienced in Amazon Web Services, specifically leveraging Docker, CloudFormation, VPC, Route 53, Elastic Beanstalk, Elastic Load Balancers, S3, SES, SNS, IAM, Amazon Container Service (ECS), API Gateway, Direct Connect, CloudWatch, RDS, CloudFront, Snowball, Redshift, and DynamoDB.
- Experience with monitoring tools like Nagios, Splunk, and AWS CloudWatch to health-check deployed resources and services.
- Experienced in designing Virtual Private Clouds (VPCs) and grouping resources.
- Experience with version control systems like Subversion and Git, using source code management client tools such as Git Bash, GitHub, ClearCase, Git GUI, and other command-line applications for branching, tagging, and maintenance in UNIX and Windows environments.
- Experience with Jenkins and Maven for build management to automate software builds.
- Extensive experience with Jenkins as a Continuous Integration tool.
- Ability to work closely with teams, to ensure high quality and timely delivery of builds and releases.
- Experience with Scrum and Agile Environments for weekly releases.
- Knowledge of Azure and Rackspace clouds.
- Experienced in setting up DNS records and hosted zones on Route 53 to host websites with high availability while minimizing operational cost.
- Configured Elastic Load Balancers and Auto Scaling policies to add EC2 instances for load management and fault tolerance.
- Hands-on experience setting up CloudFront Content Delivery Network (CDN) to deliver data from edge locations.
- Hands-on experience creating S3 buckets, storing data, and creating lifecycle policies to transition data periodically to Amazon Glacier.
- Experienced in creating VPC peering connections with other VPCs and setting up VPN connections.
- Experienced in hosting DNS records using Route 53 and setting up services such as HTTP, FTP, SMTP, and NFS.
- Experienced in data center relocation, updating client servers, and migrating data centers to the cloud.
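As an illustration of the Route 53 work described above, the following is a minimal Python sketch of the change batch used to create a DNS record in a hosted zone. The domain names are hypothetical placeholders; in practice the payload would be passed to boto3's `route53.change_resource_record_sets()`.

```python
def build_record_change(domain, record_type, value, ttl=300, action="UPSERT"):
    """Build a Route 53 ChangeBatch for a single resource record set."""
    return {
        "Changes": [
            {
                "Action": action,
                "ResourceRecordSet": {
                    "Name": domain,
                    "Type": record_type,
                    "TTL": ttl,
                    # Route 53 expects each record value wrapped in a dict.
                    "ResourceRecords": [{"Value": value}],
                },
            }
        ]
    }

# Example: a CNAME pointing the site at a load balancer (hypothetical names).
batch = build_record_change("www.example.com.", "CNAME", "lb.example.com.")
```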
TECHNICAL SKILLS
Operating System: Windows, UNIX, Linux, RHEL/CentOS 5.x/6.x/7, macOS
Versioning Tools: Subversion, ClearCase, Git
Configuration Management Tools: Jenkins, Chef, Puppet, Ansible
Build Tools: ANT, Maven
Bug Tracking Tools: JIRA, Rally, Remedy, IBM ClearQuest
Languages: Java/J2EE, PL/SQL
Scripting: Shell scripting, Python, Ruby, Perl
Web Technologies: HTML, JavaScript, XML, Servlets, JDBC, JSP
Web/App Servers: Apache Tomcat, JBoss, WebLogic, WebSphere
Database: Oracle 9i/10g/11g, SQL Server, MySQL
PROFESSIONAL EXPERIENCE
Confidential, Tampa, FL
AWS/DevOps Engineer
Responsibilities:
- Managed configurations and automated application installation by creating and maintaining Puppet modules.
- Automated most build-related tasks using Jenkins to maximize the throughput and performance of the build system.
- Automated Active Directory tasks on servers to generate management- and administration-level logs using PowerShell scripts.
- Monitored real-time logs and metrics of all AWS services with Amazon CloudWatch and Datadog, and created individual dashboards for each service.
- Cloned databases, files, and data from on-premises to AWS by creating a Virtual Private Cloud (VPC) and grouping all resources.
- Set up a large Cloud VPC network configured with SNS, SQS, SES, DynamoDB, S3, and IAM.
- Supported web programming tasks for development by setting up MySQL and Oracle databases.
- Controlled swap by tuning Linux kernel parameters and automated system administration jobs using Bash shell scripting.
- Troubleshot and monitored application performance issues raised through the Remedy ticketing system.
- Configured Git for Jenkins build jobs and contributed JSON templates for CloudFormation and Ruby scripts for Chef to Git repositories.
- Hosted web servers on AWS using Route 53 by creating DNS records and managing name server records.
- Developed a scalable database that can store any amount of data and archive database snapshots to Amazon Glacier.
- Configured Git repositories with merging, tagging, notifications, and branching.
- Created fully configured Amazon AMIs, bundled them to Amazon S3, and attached them to Auto Scaling policies to make the application highly available.
- Managed the entire Virtual Private Cloud using the AWS CLI and wrote API scripts to migrate on-premises data centers to the cloud.
- Developed scripts making Application Program Interface (API) calls to stop, terminate, and start EC2 instances on scheduled timings.
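The scheduled stop/start scripts mentioned above reduce cost by running instances only during working hours. A minimal sketch of the scheduling decision, with illustrative hours; in practice the chosen action would be executed against the tagged instances via the EC2 API (e.g. boto3's `ec2.start_instances` / `ec2.stop_instances`).

```python
def scheduled_action(hour, start_hour=8, stop_hour=20):
    """Decide what to do with non-production EC2 instances at a given hour.

    Returns "start" at the beginning of the workday, "stop" at the end,
    and "noop" otherwise. The 8:00/20:00 window is an assumption.
    """
    if hour == start_hour:
        return "start"
    if hour == stop_hour:
        return "stop"
    return "noop"
```

A scheduler (cron, or later a CloudWatch Events rule) would call this once per hour and issue the corresponding API call.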
Environment: Puppet, Python, CloudWatch, SQS, SNS, Route 53, Amazon Glacier, Git, Jenkins, AWS CLI, EC2, S3, DynamoDB, MySQL, Amazon AMI.
Confidential, Dallas, TX
AWS Engineer
Responsibilities:
- Created a new Virtual Private Cloud (VPC) and set up all resources required for the project using the CloudFormation service.
- Migrated applications and databases from on-premises to the cloud using Docker and the API Gateway service.
- Assigned IAM roles to user accounts to automate instance access and developed new policies using Identity and Access Management (IAM).
- Used Route 53 to create DNS records, CNAMEs, and hosted zones for the company website after transferring it to the cloud.
- Designed Auto Scaling policies, in combination with Elastic Load Balancers (ELB), to bear website load at peak times.
- Monitored Auto Scaling metric logs using the CloudWatch service and sent periodic reports to the responsible teams based on the errors generated.
- Configured Simple Storage Service (S3) to store all data and designed a lifecycle policy to transition data older than 30 days to Amazon Glacier.
- Attached S3 to CloudFront, a content delivery service, to decrease latency between users and the application.
- Created user data templates to decrease instance provisioning time, and set up relational database storage operated through instances.
- Monitored the relational database, generated periodic snapshot backups, and stored them in Simple Storage Service (S3).
- Assigned Security Groups and Network ACLs to instances so that only authorized users could access the project, restricting all other inbound and outbound traffic.
- Provided 24/7 support in rotation to meet project requirements and to support a highly available, scalable, and fault-tolerant environment.
- Designed the front-end content layout of the website and maintained website traffic using ELB and Auto Scaling of instances.
- Designed the end-to-end solution to host the web application on Amazon Web Services in combination with the S3 storage service.
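The 30-day Glacier transition described above is expressed as an S3 lifecycle configuration. A minimal sketch of that rule as a Python dict (the `logs/` prefix is a hypothetical example); in practice it would be applied with boto3's `s3.put_bucket_lifecycle_configuration()`.

```python
def glacier_lifecycle_rule(prefix="", days=30):
    """Build an S3 lifecycle configuration that transitions objects
    older than `days` days under `prefix` to the GLACIER storage class."""
    return {
        "Rules": [
            {
                "ID": f"archive-after-{days}-days",
                "Filter": {"Prefix": prefix},
                "Status": "Enabled",
                "Transitions": [{"Days": days, "StorageClass": "GLACIER"}],
            }
        ]
    }

# Example: archive log objects after 30 days (hypothetical prefix).
config = glacier_lifecycle_rule(prefix="logs/", days=30)
```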
Environment: CloudFormation, EC2, CloudWatch, S3, AWS IAM, DynamoDB, Security Groups, VPC, GitHub, Docker, Amazon ECS, Amazon EMR
Confidential, Sacramento, CA
Build and Release Engineer
Responsibilities:
- Release Engineer for a team that involved different development teams and multiple simultaneous software releases.
- Participated in weekly release meetings with technology stakeholders to identify and mitigate potential risks associated with releases, using version control tools like Rational ClearCase and Rational Team Concert (RTC).
- Imported and managed multiple corporate applications in TortoiseSVN.
- Worked with Development, Quality Assurance, and Management teams to ensure cross-communication and confirmed approval of all production changes.
- Provided end-user training for all TortoiseSVN and JIRA users to use the tools effectively.
- Built scripts using ANT and Maven in Jenkins, with SonarQube analysis, to promote builds from one environment to another.
- Deployed J2EE applications to application servers in an Agile continuous integration environment and automated the entire process.
- Designed the continuous build process using Jenkins to prevent build failures.
- Developed Perl and shell scripts to automate the build and release process.
- Edited existing ANT/Maven build files in case of errors or changes in project requirements.
- Configured and supported monitoring tools like Splunk.
- Managed Maven project dependencies by creating parent-child relationships between projects.
- Used Jenkins as a Continuous Integration tool to automate daily processes.
- Documented the entire build and release engineering process and provided on call support.
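One recurring task in the release automation above is stamping each build with a new version. A minimal sketch of that step in Python (the originals were Perl/shell; the MAJOR.MINOR.PATCH scheme is an assumption):

```python
def bump_release(version, part="patch"):
    """Bump one component of a MAJOR.MINOR.PATCH version string,
    resetting the lower-order components to zero."""
    major, minor, patch = (int(x) for x in version.split("."))
    if part == "major":
        return f"{major + 1}.0.0"
    if part == "minor":
        return f"{major}.{minor + 1}.0"
    return f"{major}.{minor}.{patch + 1}"
```

In a Jenkins job, the returned string would feed the build's tag name and the Maven `-Dversion` property.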
Environment: Maven, TortoiseSVN, Jenkins (CI/CD), SonarQube, Java/J2EE, ANT, Vagrant, WebSphere, Perl scripts, Shell scripts, XML, UNIX, Oracle 10g/11g, JIRA, Python.
Confidential
Unix/Linux Administrator
Responsibilities:
- Responsible for installation, configuration management, maintenance, and systems development of Unix/Linux systems.
- Automated server builds by Kickstarting RHEL 6 and Jumpstarting Sun Solaris 10.
- Monitored systems and administered servers for day-to-day problems: patches, user administration, hardware failures, log file monitoring, backups, software upgrades, configuration changes, and documentation.
- Performed disk management with the Logical Volume Manager (LVM).
- Installed, upgraded, and configured AIX servers using NIM Master and Red Hat Enterprise Linux 5/6 using Kickstart.
- Hands-on experience working with production servers at multiple data centers.
- Installed, configured, and administered VMware virtual environments with ESX 5.x and vCenter 5.x.
- Troubleshot production servers and configured standalone production servers for testing.
- Performed migration and/or preservation installations and upgrades from AIX 5.3 to AIX 6.1 and AIX 7.1.
- Built software packages (RPM) on Red Hat Linux.
- Installed, upgraded, and configured Sun Solaris 9/10 on Sun servers using Jumpstart and Red Hat Enterprise Linux 5/6 using Kickstart.
- Performed extensive Logical Volume Management (LVM) tasks.
- Managed file systems using VERITAS Volume Manager 5.0.
- Performed centralized management of Linux boxes using Puppet.
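As an illustration of the LVM work above, a small Python sketch that builds the `lvcreate` command line for carving a logical volume out of a volume group (the VG/LV names are hypothetical; in practice the list would be handed to `subprocess.run`):

```python
def lvcreate_cmd(vg, lv, size_gb):
    """Build the argv list for creating a logical volume of size_gb
    gigabytes named `lv` in volume group `vg`."""
    return ["lvcreate", "-n", lv, "-L", f"{size_gb}G", vg]

# Example: a 20 GB volume for application data (hypothetical names).
cmd = lvcreate_cmd("vg_data", "lv_app", 20)
```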
Environment: RHEL 4/5, CentOS 4/5, Fedora 9/10/11 (beta), Ubuntu 8.10/9.04 Server, Debian Sid, VMware ESX, Veritas File System, Veritas Volume Manager, Veritas Cluster Server.
Confidential
Quality Analyst
Responsibilities:
- Facilitated requirement sessions with Business team to capture requirements on customization of screens/reports related to Member, Enrollments, and Providers.
- Translated business needs, wants and objectives in the form of user stories, created Use Case Document and Functional Requirements Document (FRD).
- Created wireframe and process flow diagrams using Microsoft Visio to clearly communicate the business requirements.
- Regularly updated sprint and product backlogs.
- Created Use Cases, Activity Diagrams, State Chart Diagrams, Sequence Diagrams, and Collaboration Diagrams thus defining the Data Process Model and Business Process Models.
- Conducted weekly meetings and daily stand-up meetings with project team, business, SMEs and vendor.
- Worked on comparing data stored in SQL tables and reports generated from those tables, cross-referencing them with data keyed into the application.
- Performed Gap Analysis, Impact Analysis, and prepared As-Is and To-Be documents.
- Wrote SQL queries to extract data from customer databases to validate customer information.
- Responsible for loading Requirements and Test Cases into HP ALM and mapping of Requirements and Test Cases.
- Developed Requirements Traceability Matrix (RTM) to track requirements against test cases during the QA Phase.
- Facilitated formal defect review meetings with project teams and developers to report, demonstrate, prioritize and suggest resolution of issues discovered during testing.
- Played a key role in the planning, testing, and implementation of system enhancements and conversions.
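The SQL validation queries described above typically flag rows with missing or inconsistent customer data. A self-contained sketch using Python's stdlib sqlite3 as a stand-in database (table name and columns are hypothetical):

```python
import sqlite3

# In-memory stand-in for the customer database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, email TEXT)")
conn.executemany(
    "INSERT INTO customers VALUES (?, ?)",
    [(1, "a@example.com"), (2, None), (3, "c@example.com")],
)

# Validation query: customers missing an email address.
missing = conn.execute(
    "SELECT id FROM customers WHERE email IS NULL"
).fetchall()
```

The same query shape, run against the production tables, feeds the defect reports raised in the review meetings.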