AWS/DevOps Engineer Resume
New York
SUMMARY:
- Around 5 years of varied experience in Information Technology, focused on systems/network administration, DevOps, cloud computing, virtualization, and storage technologies.
- Experienced in configuration management tools such as Chef, Puppet, and Ansible, continuous integration with Jenkins, and managing code using distributed version control systems such as Git.
- Installed and maintained Jenkins for continuous delivery and automated Ansible playbook runs against production infrastructure.
- Hands-on experience with Terraform, a tool for building, changing, and versioning infrastructure safely and efficiently.
- Deployed and configured Chef server, including bootstrapping Chef client nodes for provisioning; created roles, recipes, cookbooks, and data bags for server configuration.
- Experienced with the OpenShift platform, managing Docker containers, Kubernetes clusters, and Mesos.
- Developed procedures to unify, streamline, and automate application development and deployment with Linux container technology using Docker Swarm.
- Experience in configuring and managing various AWS services like EC2, S3, Glacier, SNS, Route53, Auto Scaling etc.
- Experience in infrastructure monitoring tools such as Amazon CloudWatch, Nagios, and SolarWinds.
- Experienced in creating scripts for system administration and AWS using languages such as Python, Bash and Ruby.
- Experience with release and deployment of Java/J2EE, .NET, and ASP.NET web applications.
- Experienced in automating, building, deploying, and releasing code across multiple environments.
- Working knowledge on installing and configuring Apache, Nginx web servers and rolling updates using Chef.
- Experienced in generating Docker images from Dockerfiles, creating Docker containers, and performing operations such as setting up Docker bridge networking, container linking, and host networking.
- Worked with Ansible playbooks for virtual and physical instance provisioning, configuration management, patching and software deployment.
- Managed environments like DEV, QA and PROD for various releases and designed instance strategies.
- Experienced in performing VMware vMotion, storage vMotion, creating host profiles, applying patches etc.
- Installed and configured VMware vCenter 5.0 and 2.5, ESX 4.0/3.5/3.0, and ESXi 3.5 for High Availability, DRS, and vMotion.
- Experienced in supporting multiple Linux/UNIX platforms such as Ubuntu, RHEL, and Fedora, as well as Windows 98/NT/XP/Vista/7/8 and Windows Server 2012, across production, test, and development servers.
- Ability to multitask, work on many complex designs, and provide support for all ongoing initiatives.
TECHNICAL SKILLS:
Amazon Web Services: EC2, S3, RDS, VPC, Elastic Load Balancer (ELB), Route 53, CloudWatch, IAM, Amazon Glacier, CloudFormation, OpsWorks
Operating systems: RHEL 7/6, CENTOS 7, Solaris 8/9, HP-UX 10.x/11.x, Windows 2000/2003/2008 R2/2012, Unix, ESXi 6.0/5.5/5.0
Scripting Languages: Python, Shell, C, Ruby, Java
DevOps Tools: Chef, Puppet, Ansible, Jenkins, OpenShift, OpenStack, Docker, Git, Kubernetes, TeamCity, TFS, SonarQube, Splunk
Databases: MySQL, MongoDB, Cassandra, PostgreSQL, SQL Server
Protocols: FC, FCoE, iSCSI, FCIP, TCP/IP, FTP, NFS, CIFS, NIS, DNS, SSH, HTTP, DHCP
PROFESSIONAL EXPERIENCE:
Confidential, New York
AWS/DevOps Engineer
Responsibilities:
- Built and configured virtual data center on Amazon Web Services cloud-computing platform and hosted Enterprise Web Applications using AWS.
- Automated infrastructure in AWS and deployed Chef for provisioning and managing AWS EC2 instances, EBS volumes, DNS, and S3.
- Installed and configured Chef server, workstations, and nodes via CLI tools on AWS instances.
- Built various applications from the ground up in a Virtual Private Cloud on Amazon Web Services by using services like Elastic Load Balancer (ELB), EC2 instances and Amazon RDS database.
- Developed and implemented Auto Scaling with load balancing on Amazon EC2 instances.
- Launched and configured Amazon EC2 cloud servers using AMIs (Linux/Ubuntu) and configured the servers for specified applications.
- Worked on migrating the current Linux environment to AWS/RHEL Linux environment and used Auto Scaling feature to maintain the application availability and scalability.
- Configured EC2 instances to send instance status check data to Amazon CloudWatch and set up CloudWatch alarms to notify and alert when thresholds are reached or exceeded.
- Enabled versioning and set up lifecycle policies on S3 buckets to transition old objects to Amazon Glacier for archival.
- Built a VPC from scratch, creating public subnets, private subnets, and route tables, and attaching an internet gateway to the public subnet.
- Set up NAT HA failover, AWS security groups for EC2 instances, and Route 53 for web instances in the AWS environment.
- Worked with AWS Identity and Access Management (IAM) to create roles, users, and groups, and attached policies to groups to provide secure logins.
- Built and maintained Docker container clusters managed by Kubernetes on AWS (Linux, Bash, Git, Docker); utilized Kubernetes and Docker as the runtime environment of the CI/CD system to build, test, and deploy.
- Used Ansible to manage web applications, configuration files, databases, commands, users, mount points, and packages.
- Developed and maintained a repository of microservices on OpenShift pods and supported integration of OpenShift and CloudForms.
- Created a deployment procedure using Jenkins CI to run unit tests, build documentation with Natural Docs, and create RPM packages to install and set up the application and its dependencies.
- Worked on Terraform key features such as Infrastructure as code, Execution plans, Resource Graphs, Change Automation.
- Created cloud services on new-generation D-series instances on Microsoft Azure. Performed security patching on Microsoft Azure IaaS VMs and backed up/recovered Azure virtual machines from a Recovery Services vault.
- Performed various tasks in the Azure cloud, such as creating databases on SQL Azure and setting up the SQL Azure firewall.
- Set up connection strings and connected to SQL Azure databases from a locally installed SQL Server Management Studio (SSMS) for developers.
- Troubleshot connectivity between different AWS services and used iperf3 to test bandwidth on EC2 instances.
- Performed several database/application-level migrations from a traditional environment to AWS and used Python (boto3) to automate code deployment for test, dev, and prod environments.
- Created Python scripts for tagging EBS volumes and attaching them to the respective EC2 instances for business purposes.
- Performed installation and configuration of Red Hat Linux AS 5.0/6.0/7.0, Sun Solaris 8, 9, 10 and Oracle Linux 6.x, 7.x.
- Integrated multiple Python child scripts into a master script and scheduled it to run periodically using cron jobs.
- Performed daily system monitoring by verifying the integrity and availability of all hardware, server resources, systems and verifying completion of scheduled jobs such as backups.
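The EBS tagging work described above can be sketched roughly as follows; this is a minimal illustration, and the tag keys (Environment, Application) are hypothetical placeholders for the actual business tags:

```python
def build_volume_tags(environment, application):
    """Build the tag set applied to each EBS volume.

    The tag keys used here are illustrative examples, not the
    actual business tags."""
    return [
        {"Key": "Environment", "Value": environment},
        {"Key": "Application", "Value": application},
    ]


def tag_attached_volumes(instance_id, environment, application, region="us-east-1"):
    """Tag every EBS volume attached to a given EC2 instance.

    Requires AWS credentials; boto3 is imported here so the pure
    helper above can be used without the SDK installed."""
    import boto3  # AWS SDK for Python

    ec2 = boto3.client("ec2", region_name=region)
    volumes = ec2.describe_volumes(
        Filters=[{"Name": "attachment.instance-id", "Values": [instance_id]}]
    )["Volumes"]
    volume_ids = [v["VolumeId"] for v in volumes]
    if volume_ids:
        ec2.create_tags(
            Resources=volume_ids,
            Tags=build_volume_tags(environment, application),
        )
    return volume_ids
```

In practice a script like this would be run per environment (dev/test/prod) so each instance's volumes carry consistent business tags.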
Confidential, Little Rock, AR
AWS Engineer
Responsibilities:
- Experience working on various AWS services such as VPC, EC2, ELB, Auto Scaling, S3, Route 53, CloudWatch, SQS, SNS, RDS, DynamoDB, CloudFormation, Elastic Beanstalk, DMS, and IAM.
- Designed multiple EC2 instances to attain high availability and fault tolerance using Auto Scaling, Elastic Load Balancer and AMIs.
- Set up and built various AWS resources such as VPC, EC2, S3, IAM, EBS, security groups, Auto Scaling, and RDS in CloudFormation using JSON templates.
- Created and configured several S3 buckets for static website hosting, enabled versioning to protect against accidental data deletion, and set lifecycle policies to move data to cost-efficient archival storage such as Amazon Glacier.
- Enabled CORS (Cross-Origin Resource Sharing) on S3 buckets for web applications that are loaded in one domain to interact with resources in a different domain and used Cross Region Replication to replicate data across different S3 buckets.
- Migrated media (images and videos) to S3 and used CloudFront to distribute content with low latency and at high data transfer rates. Also migrated Microsoft SQL Server databases to AWS RDS and set up Multi-AZ deployments.
- Configured security parameters by managing AWS users and groups using AWS Identity and Access Management (IAM) by specifying the policies, roles to allow or deny access to AWS resources.
- Used the AWS CLI to automate backups of ephemeral data stores to S3 buckets and EBS, and created nightly AMIs of mission-critical production servers as backups.
- Utilized Python libraries such as boto3 and NumPy to deploy and manage AWS resources including EC2 instances, RDS databases, subnets, security groups, and IAM.
- Created Chef cookbooks and modules to automate system operations. Created monitors, alarms, and notifications for EC2 hosts using CloudWatch.
- Created documentation for Chef covering Chef basics, setup, coding standards, and local cookbook development and testing.
- Deployed Puppet, Puppet Dashboard, and Puppet DB for configuration management to existing infrastructure.
- Installed Puppet client software on RHEL 7.x servers and established communication between master and clients through SSL certificates.
- Contributed to the OpenStack project; responsible for maintaining incremental, weekly, and monthly backups of OpenStack databases and CSA databases.
- Utilized AWS CloudWatch and LogicMonitor to monitor resources such as EC2, EBS, ELB, RDS, and S3.
- Configured web servers (IIS, Nginx) to enable caching, and configured CDN-fronted application servers and load balancers.
- Experienced in creating Docker images, snapshots, attaching to a running container, Docker linking and Bridge Setup.
- Cloned the central Stash repository onto the local machine using Sourcetree and Git.
- Wrote Python scripts to make API calls to NetApp systems and committed/pushed changes from the local system to the central Stash (Bitbucket Server) repository using Git.
- Wrote various Python API scripts to pull capacity/performance metrics for environment auditing, such as aggregate/volume space usage, volume NFS IOPS, snapshot lifecycle audits, volume no-clone audits, and disk status audits, and created cron jobs to generate daily/weekly reports and emails.
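The S3 lifecycle work above (versioning plus Glacier transitions) can be sketched as follows; the rule ID, prefix, and day counts are illustrative assumptions, not the actual policy values:

```python
def glacier_lifecycle_rule(prefix="", transition_days=30, expire_days=365):
    """Build an S3 lifecycle configuration that transitions objects
    to Glacier after `transition_days` and expires them after
    `expire_days`. The day counts here are examples only."""
    return {
        "Rules": [
            {
                "ID": "archive-to-glacier",
                "Filter": {"Prefix": prefix},
                "Status": "Enabled",
                "Transitions": [
                    {"Days": transition_days, "StorageClass": "GLACIER"}
                ],
                "Expiration": {"Days": expire_days},
            }
        ]
    }


def apply_lifecycle(bucket_name, config):
    """Apply the lifecycle configuration to a bucket.

    Requires AWS credentials; boto3 is imported here so the pure
    rule builder above works without the SDK installed."""
    import boto3  # AWS SDK for Python

    s3 = boto3.client("s3")
    s3.put_bucket_lifecycle_configuration(
        Bucket=bucket_name, LifecycleConfiguration=config
    )
```

The same rule dictionary maps directly onto the `LifecycleConfiguration` block of a CloudFormation JSON template, so one definition can serve both paths.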
Confidential
DevOps Engineer
Responsibilities:
- Experienced in deploying central Chef Server, bootstrapping nodes, configuring workstations to communicate with Chef Server.
- Wrote several cookbooks and recipes to install and configure web servers such as Apache and Nginx, and managed the code using Git.
- Automated the front-end platform into highly scalable, consistent, repeatable infrastructure using Chef, Ansible, Vagrant, and Jenkins.
- Designed and built a continuous integration and deployment (CI/CD) framework for Chef Code using test-driven development.
- Built Chef Development workflow and best practices around configuration management as well as building a strong and diverse internal Chef community.
- Deployed a centralized log management system and integrated into Chef to be used by developers.
- Performed system administration and operations tasks using Chef and Nagios.
- Deployed multiple servers in testing and production environments using Jenkins, GIT and Docker.
- Generated Docker images from Dockerfiles, created Docker containers, and performed operations such as setting up Docker bridge networking, container linking, and host networking.
- Developed and implemented Software Release Management strategies for various applications according to the agile process.
- Researched alternative build strategies and platforms to enhance the reliability of the build process, therefore, reducing the lag time between code check-in and QA testing.
- Responsible for the development and maintenance of processes and associated scripts/tools for automated build, testing, and deployment of the products to various environments.
- Wrote custom monitoring and integrated monitoring methods into deployment processes to develop self-healing solutions.
- Installed workstations, bootstrapped nodes, wrote recipes and cookbooks and uploaded them to the Chef server; managed on-site OS, applications, services, and packages using Chef.
- Wrote automated build scripts using Ant (build.xml) for Java and J2EE applications.
- Diagnosed and corrected errors within Java/HTML/PHP code to allow connection to and utilization of proprietary applications.
- Installed NetApp NMSDK package on the CentOS system and used ZEDI tool to make API calls on NetApp system for collecting capacity and performance information.
- Installed MongoDB database on CentOS system to store the XML ZAPI output in the form of collections.
- Generated Python code to query the capacity/performance data stored in MongoDB and exported it to the dashboard.
- Scheduled cron jobs in CentOS to automate the ZAPI calls, MongoDB data storage, and Python querying.
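The ZAPI-to-MongoDB flow above can be sketched roughly as follows; the XML element names (`volume-info`, `name`, `size-total`, `size-used`) are assumptions for illustration, since actual ZAPI response fields vary by API version:

```python
import xml.etree.ElementTree as ET


def parse_volume_capacity(zapi_xml):
    """Parse a ZAPI-style XML response into plain dicts suitable
    for insertion into a MongoDB collection.

    The element names below are hypothetical stand-ins for the
    real ZAPI fields."""
    root = ET.fromstring(zapi_xml)
    records = []
    for vol in root.iter("volume-info"):
        records.append({
            "volume": vol.findtext("name"),
            "size_total": int(vol.findtext("size-total")),
            "size_used": int(vol.findtext("size-used")),
        })
    return records
```

A cron job would run this parser on each scheduled ZAPI pull and hand the resulting dicts to a MongoDB `insert_many` call for later querying.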
Confidential
SCM/ Build/Release Engineer
Responsibilities:
- Prepared the initial project structures in the SCM Tool based on requirements.
- Supported development teams with respect to the SCM Tool.
- Created baselines after successful builds and versioned them correctly.
- Set up network environments using TCP/IP, NIS, NFS, DNS, SNMP agents, DHCP, and proxies.
- Administered Linux servers for several functions, including managing WebSphere, Apache/Tomcat servers, mail servers, MySQL databases, SVN, builds, and firewalls in both development and production.
- Established and maintained the Software Configuration Management (SCM) requirements baseline for the projects.
- Planned, implemented, documented, and administered configuration control procedures across multiple development projects.
- Performed daily builds of the software code, which involved linking, packaging, merging, testing, verifying, documenting, and finally releasing the code to the testing team.
- Automated build and improved software quality through reduction of build time.
- Installed Kafka cluster with separate nodes for brokers.
- Imported real-time data to Hadoop using Kafka, implemented Oozie jobs, and scheduled recurring Hadoop jobs with Apache Oozie.
- Debugged code (Java, Unix, and Linux) when build errors occurred and resolved them along with the developers.
- Responsible for automated scheduled builds, emergency builds, and releases using Ant and Maven scripts for an enterprise J2EE application.
- Assisted the Developers in maintaining the Development Code in the Repository.
- Participated in the successful migration of the project and code transfers, artifact transfer.
- Built components and products as and when they successfully reached the completion stage.
- Provided individual working copies of the project to the team members efficiently.
- Troubleshot Linux network and security-related issues, capturing packets using tools such as iptables, firewalls, TCP wrappers, and Nmap.