AWS Engineer Resume
Tampa, FL
SUMMARY
- Certified AWS Solutions Architect and AWS Developer Associate with 5+ years of experience in Information Technology as an AWS Engineer, DevOps Engineer, Cloud Engineer, and Systems Engineer.
- 5+ years of experience in system administration, build and release management, and DevOps engineering, working with multiple Linux distributions including Red Hat, CentOS, and Ubuntu.
- Set up Docker on Linux and configured Jenkins to run under a Docker host.
- Experience in administration and maintenance of version control systems such as SVN and Git.
- Experience in installing and administering CI/CD tools such as Jenkins.
- Experience using configuration management tools such as Chef, Puppet, and Ansible.
- In-depth experience with AWS cloud services (EC2, S3, EBS, ELB, CloudWatch, Elastic IP, RDS, SNS, SQS, Glacier, IAM, VPC, CloudFormation, Route 53) and with managing security groups on AWS.
- Experienced with Amazon Web Services, specifically leveraging Docker, CloudFormation, VPC, Route 53, Elastic Beanstalk, Elastic Load Balancing, Amazon S3, Amazon SES, Amazon SNS, AWS IAM, Amazon EC2 Container Service (ECS), API Gateway, AWS Direct Connect, CloudWatch, Amazon RDS, CloudFront, AWS Snowball, Amazon Redshift, and DynamoDB.
- Experience with monitoring tools such as Nagios, Splunk, and Amazon CloudWatch to health-check deployed resources and services.
- Experienced in designing Virtual Private Clouds (VPCs) and grouping the related resources.
- Experience with version control systems such as Subversion and Git, and with source code management clients such as Git Bash, GitHub, ClearCase, and Git GUI for branching, tagging, and maintenance in UNIX and Windows environments.
- Experienced with the Nagios monitoring tool.
- Experience with Jenkins and Maven for build management to automate the software build process.
- Extensive experience with the continuous integration tool Jenkins.
- Ability to work closely with teams to ensure high-quality and timely delivery of builds and releases.
- Experience with Scrum and Agile environments for weekly releases.
- Knowledge of Azure and Rackspace clouds.
- Experienced in setting up DNS records and hosted zones on Route 53 to host websites, minimizing operational cost and providing high availability.
- Configured Elastic Load Balancers and Auto Scaling policies across EC2 instances to manage load and provide fault tolerance.
- Hands-on experience setting up the CloudFront content delivery network (CDN) to deliver data to edge locations.
- Hands-on experience creating S3 buckets, storing data, and defining lifecycle policies that periodically transition data to Amazon Glacier (see the sketch at the end of this summary).
- Experienced in creating peering connections between VPCs and setting up VPN connections.
- Experienced in hosting DNS records using Route 53 and setting up services such as HTTP, FTP, SMTP, and NFS.
- Experienced in data center relocations, updating client servers, and migrating data centers to the cloud.
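A minimal sketch of the S3-to-Glacier lifecycle configuration referenced above, using Python and boto3; the bucket name and rule ID are placeholders, and it assumes AWS credentials are already configured in the environment:

```python
import boto3

s3 = boto3.client("s3")

# Transition every object older than 30 days to the Glacier storage class.
s3.put_bucket_lifecycle_configuration(
    Bucket="example-archive-bucket",          # hypothetical bucket name
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "archive-after-30-days",
                "Filter": {"Prefix": ""},      # apply to every object
                "Status": "Enabled",
                "Transitions": [
                    {"Days": 30, "StorageClass": "GLACIER"}
                ],
            }
        ]
    },
)
```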
TECHNICAL SKILLS
Operating Systems: Windows, UNIX, Linux (RHEL/CentOS 5.x/6.x/7), Mac OS
Versioning Tools: Subversion, ClearCase, Git
Configuration Management Tools: Jenkins, Chef, Puppet, Ansible
Build Tools: ANT, MAVEN
Bug Tracking Tools: JIRA, Rally, Remedy, and IBM Clear Quest.
Languages: Java/J2EE and PL/SQL
Scripting: Shell scripting, Python, Ruby, Perl.
Web Technologies: HTML, JavaScript, XML, Servlets, JDBC, JSP.
Web/App servers: Apache Tomcat, JBOSS, Weblogic, WebSphere
Database: Oracle 9i/10g/11g, SQL SERVER, MySQL.
PROFESSIONAL EXPERIENCE
Confidential, Tampa, FL
AWS Engineer
Responsibilities:
- Managed configurations and automated application installation by creating and maintaining Puppet modules.
- Automated most build-related tasks using Jenkins to maximize the throughput and performance of the build system.
- Automated Active Directory tasks on the server to generate management- and administration-level logs using PowerShell scripts.
- Monitored real-time logs and metrics of all AWS services with Amazon CloudWatch and Datadog, and created individual dashboards for each service.
- Cloned databases, files, and data from on-premises to AWS by creating a Virtual Private Cloud (VPC) and grouping all the resources.
- Set up a large cloud VPC network environment configured with SNS, SQS, SES, DynamoDB, S3, and AWS IAM.
- Supported web programming tasks for development by setting up MySQL and Oracle databases.
- Controlled swap usage by tuning Linux kernel parameters and automated system administration jobs using Bash shell scripting.
- Troubleshot and monitored application performance issues raised through the Remedy ticketing system.
- Configured Git for Jenkins build jobs and contributed to Git repositories of JSON templates used for CloudFormation and Ruby scripts used for Chef.
- Hosted web servers on AWS using Route 53 by creating DNS records and managing name server records.
- Developed a scalable database that can store large volumes of data and archive database snapshots using Amazon Glacier.
- Configured Git repositories with merging, tagging, notifications, and branching.
- Created fully configured Amazon AMIs, bundled them to Amazon S3, and attached them to Auto Scaling policies to make the application highly available.
- Managed the entire Virtual Private Cloud using the AWS CLI and wrote API calls to migrate on-premises data centers to the cloud.
- Developed scripts that call the API to stop, terminate, and start EC2 instances on a schedule (a sketch follows this role's environment list).
Environment: Puppet, Python, CloudWatch, SQS, SNS, Route 53, Amazon Glacier, Git, Jenkins, AWS CLI, EC2, S3, DynamoDB, MySQL, Amazon AMI.
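A minimal sketch of the kind of scheduled start/stop/terminate script described in this role, written in Python with boto3; the instance IDs and region are placeholders, and it assumes credentials are supplied by the environment (for example, an IAM role or ~/.aws/credentials):

```python
import sys
import boto3

INSTANCE_IDS = ["i-0123456789abcdef0"]   # hypothetical instance IDs
ec2 = boto3.client("ec2", region_name="us-east-1")

def main(action: str) -> None:
    """Apply the requested lifecycle action to the configured instances."""
    if action == "start":
        ec2.start_instances(InstanceIds=INSTANCE_IDS)
    elif action == "stop":
        ec2.stop_instances(InstanceIds=INSTANCE_IDS)
    elif action == "terminate":
        ec2.terminate_instances(InstanceIds=INSTANCE_IDS)
    else:
        raise SystemExit(f"unknown action: {action}")

if __name__ == "__main__":
    main(sys.argv[1])
```

A cron entry such as `0 19 * * 1-5 python ec2_schedule.py stop` could then run the script on the desired schedule.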
Confidential, Dallas, TX
AWS Engineer
Responsibilities:
- Created a new Virtual Private Cloud (VPC) and set up all the resources required for the project using the CloudFormation service (see the sketch after this role's environment list).
- Migrated applications and databases from on-premises to the cloud using Docker and the API Gateway service.
- Assigned IAM roles to user accounts to automate access to instances and developed new policies using Identity and Access Management (IAM).
- Used Route 53 to create DNS records, CNAMEs, and hosted zones for the company website after transferring it to the cloud.
- Designed the Auto Scaling policy, in combination with an Elastic Load Balancer (ELB), to handle the website's load at peak times.
- Monitored the Auto Scaling policy's metrics and logs using the CloudWatch service and periodically sent reports to the responsible staff based on the errors generated.
- Configured Simple Storage Service (S3) to store all the data and designed a lifecycle policy to transition data older than 30 days to Amazon Glacier.
- Attached S3 to CloudFront, a content delivery service, to decrease latency for end users.
- Created user data templates to decrease instance setup time, and provisioned relational database storage, operating the database through the instance.
- Monitored the relational database, generated periodic backups in the form of snapshots, and stored them in Simple Storage Service (S3).
- Assigned security groups and network ACLs to the instances to allow only specific users to access the project, restricting all other inbound and outbound traffic.
- Provided 24/7 support on a rotation basis to meet project requirements and to support an environment that is highly available, scalable, and fault tolerant.
- Designed the layout of the website's front-end content and managed website traffic by installing ELB and Auto Scaling for the instances.
- Designed the end-to-end solution to host the web application on Amazon Web Services in combination with the S3 storage service.
Environment: CloudFormation, EC2, CloudWatch, S3, AWS IAM, DynamoDB, Security Groups, VPC, GitHub, Docker, Amazon ECS, Amazon EMR
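A minimal sketch of creating a VPC stack through the CloudFormation service with boto3, as referenced in this role; the stack name and the tiny inline template are illustrative placeholders, not the project's actual template:

```python
import json
import boto3

# Deliberately small template: a single VPC resource.
TEMPLATE = {
    "AWSTemplateFormatVersion": "2010-09-09",
    "Resources": {
        "ProjectVpc": {
            "Type": "AWS::EC2::VPC",
            "Properties": {"CidrBlock": "10.0.0.0/16"},
        }
    },
}

cfn = boto3.client("cloudformation", region_name="us-east-1")
cfn.create_stack(
    StackName="project-vpc",                  # hypothetical stack name
    TemplateBody=json.dumps(TEMPLATE),
)

# Block until stack creation finishes.
cfn.get_waiter("stack_create_complete").wait(StackName="project-vpc")
```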
Confidential, Sacramento, CA
Build and Release Engineer
Responsibilities:
- Release Engineer for a team that coordinated multiple development teams and simultaneous software releases.
- Participated in weekly release meetings with technology stakeholders to identify and mitigate potential release risks, using version control tools such as Rational ClearCase and Rational Team Concert (RTC).
- Imported and managed multiple corporate applications in TortoiseSVN.
- Worked with Development, Quality Assurance, and Management teams to ensure cross-team communication and confirmed approval of all production changes.
- Provided end-user training for all TortoiseSVN and JIRA users so they could use the tools effectively.
- Built scripts using the ANT and Maven build tools in Jenkins, with Sonar analysis, to promote builds from one environment to another (a wrapper sketch follows this role's environment list).
- Deployed J2EE applications to application servers in an Agile continuous integration environment and automated the entire process.
- Designed a continuous build process using Jenkins to catch and reduce build failures early.
- Developed Perl and shell scripts to automate the build and release process.
- Edited existing ANT/Maven build files when errors arose or project requirements changed.
- Configured and supported monitoring tools such as Splunk.
- Managed Maven project dependencies by creating parent-child relationships between projects.
- Used Jenkins as a continuous integration tool to automate daily processes.
- Documented the entire build and release engineering process and provided on-call support.
Environment: Maven, TortoiseSVN, Jenkins (CI/CD), SonarQube, Java/J2EE, ANT, Vagrant, WebSphere, Perl scripts, shell scripts, XML, UNIX, Oracle 10g/11g, JIRA, Python.
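A minimal sketch of a build wrapper along the lines described in this role, shown in Python rather than Perl or shell; it assumes Maven is on the PATH, the working directory contains a pom.xml, and the environment-named profile is hypothetical:

```python
import subprocess
import sys

def run_build(environment: str) -> int:
    """Run a clean Maven build for the given target environment profile."""
    cmd = ["mvn", "clean", "install", f"-P{environment}"]  # profile name is a placeholder
    result = subprocess.run(cmd)
    return result.returncode

if __name__ == "__main__":
    # e.g. python build.py qa
    sys.exit(run_build(sys.argv[1] if len(sys.argv) > 1 else "dev"))
```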
Confidential
Unix/Linux Administrator
Responsibilities:
- Responsible for the installation, configuration management, maintenance, and systems development of Unix/Linux systems.
- Automated server builds using Kickstart for RHEL 6 and Jumpstart for Sun Solaris 10.
- Monitored systems and administered servers for day-to-day issues: patching, user administration, hardware failures, log file monitoring, backups, software upgrades, configuration changes, and documentation.
- Performed disk management with the help of LVM (Logical Volume Manager); see the sketch after this role's environment list.
- Installed, upgraded, and configured AIX servers using a NIM master and Red Hat Enterprise Linux 5/6 using Kickstart.
- Hands-on experience working with production servers at multiple data centers.
- Installed, configured, and administered VMware virtual environments with ESX 5.x and vCenter 5.x.
- Troubleshot production servers and configured standalone production servers for testing.
- Performed migration and/or preservation installations and upgrades from AIX 5.3 to AIX 6.1 and AIX 7.1.
- Built software packages (RPMs) on Red Hat Linux.
- Installed, upgraded, and configured Sun Solaris 9/10 on Sun servers using Jumpstart and Red Hat Enterprise Linux 5/6 using Kickstart.
- Performed extensive Logical Volume Manager (LVM) tasks.
- Managed file systems using Veritas Volume Manager 5.0.
- Performed centralized management of Linux boxes using Puppet.
Environment: RHEL 4/5, CentOS 4/5, Fedora 9/10/11(beta), Ubuntu 8.10/9.04 Server, Debian SID, VMware ESX, Veritas File System, Veritas Volume Manager, Veritas Cluster Server.
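A minimal sketch of the LVM disk-management tasks mentioned in this role, driven from Python via subprocess rather than an interactive shell; the device, volume group, logical volume, and mount point names are placeholders, and it assumes root privileges with the lvm2 tools installed:

```python
import subprocess

DEVICE = "/dev/sdb"                # hypothetical new disk
VG, LV, SIZE = "datavg", "applv", "10G"

def sh(*cmd: str) -> None:
    """Run a command and fail loudly on a non-zero exit code."""
    subprocess.run(cmd, check=True)

sh("pvcreate", DEVICE)                         # initialise the physical volume
sh("vgcreate", VG, DEVICE)                     # create the volume group
sh("lvcreate", "-L", SIZE, "-n", LV, VG)       # carve out a logical volume
sh("mkfs.ext4", f"/dev/{VG}/{LV}")             # build a filesystem on it
sh("mount", f"/dev/{VG}/{LV}", "/app")         # mount it (assumes /app exists)
```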
Confidential
Quality Analyst
Responsibilities:
- Facilitated requirements sessions with the business team to capture requirements for customization of screens/reports related to Members, Enrollments, and Providers.
- Translated business needs, wants, and objectives into user stories, and created the Use Case Document and Functional Requirements Document (FRD).
- Created wireframes and process flow diagrams using Microsoft Visio to clearly communicate the business requirements.
- Regularly updated sprint and product backlogs.
- Created use cases, activity diagrams, state chart diagrams, sequence diagrams, and collaboration diagrams, thereby defining the data process model and business process models.
- Conducted weekly meetings and daily stand-up meetings with the project team, business, SMEs, and vendor.
- Compared data stored in SQL tables with reports generated from those tables, cross-referencing them against data keyed into the application.
- Performed Gap Analysis, Impact Analysis, and prepared As-Is and To-Be documents.
- Wrote SQL queries to extract data from customer databases and validate customer information (a sample query is sketched at the end of this role).
- Responsible for loading requirements and test cases into HP ALM and mapping requirements to test cases.
- Developed a Requirements Traceability Matrix (RTM) to track requirements against test cases during the QA phase.
- Facilitated formal defect review meetings with project teams and developers to report, demonstrate, prioritize, and suggest resolutions for issues discovered during testing.
- Played a key role in teh planning, testing, and implementation of system enhancements and conversions.
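A minimal sketch of the kind of data-validation query described in this role, using an in-memory SQLite table as a stand-in for the customer database; the table and column names are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE members (member_id TEXT, enrollment_date TEXT)")
conn.execute("INSERT INTO members VALUES ('M100', NULL)")

# Flag members keyed into the application without an enrollment date.
rows = conn.execute(
    "SELECT member_id FROM members WHERE enrollment_date IS NULL"
).fetchall()
print("members missing enrollment date:", [r[0] for r in rows])
```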