
Snr. AWS Cloud Architect/DevOps Engineer Resume


SUMMARY

  • Amazon Web Services (AWS) certified, solutions-focused professional with 8+ years in the IT industry providing highly available business applications for enterprise customers through cloud computing, Linux system administration, and network and security management.
  • Proficient in supporting production cloud environments such as Amazon Web Services (AWS) and VMware, as well as traditional managed hosting environments.
  • Expertise across AWS's broad set of global cloud-based products: compute (EC2), networking (VPC), load balancing and scaling (ELB, Auto Scaling), storage and archiving (EBS, S3, Glacier), monitoring (CloudWatch), security (IAM), management (CloudFormation), and the AWS CLI (including --filters and --query).
  • Experienced in configuring sets of Amazon EC2 instances that launch behind a load balancer, monitoring EC2 instance health, deploying instances through command-line calls, and troubleshooting the most common instance problems.
  • Expertise in implementing new AWS instances, working with EBS, AMIs, and S3 storage, and using IAM to create accounts, roles, and groups.
  • Experienced in setting up databases in AWS using RDS, provisioning storage with S3 buckets, and configuring instance backups to S3.
  • Good familiarity with configuring Auto Scaling groups (ASG) using launch configurations for different use cases.
  • Expertise in managing Amazon instances by taking Amazon Machine Images (AMIs) and in administering and monitoring instances with Amazon CloudWatch.
  • Strong working experience with RDS Multi-AZ deployments and read replicas to provide high availability and automatic failover at the database level.
  • Experience with deployment, data security, and troubleshooting of applications using AWS services.
  • Good familiarity with setting up SNS notifications to track the status of resources.
  • Implemented custom monitoring dashboards and alerts through Splunk, CloudWatch, New Relic, and PagerDuty as part of the automated deployment process.
  • Experience using build tools such as Apache Ant and Apache Maven to produce deployable artifacts (JAR, WAR, and EAR files) from Java project source code.
  • Experience setting up cron jobs on production servers using Jenkins.
  • Good experience using Puppet, which supports site redundancy and release-management activities for large enterprise applications.
  • Experience working with bug-tracking and ITSM tools such as JIRA, Remedy, and ServiceNow.
  • Good working experience in system health and performance monitoring, troubleshooting, and remediation, including visualization tools such as Graphite, New Relic, Nagios/Icinga, and Datadog.
  • Experience in monitoring System/Application Logs of server using Splunk to detect Prod issues.
  • Experience installing and configuring Linux/Unix/Windows-based web/app servers such as Tomcat, JBoss, WebLogic, and WebSphere for application deployments.
  • Experienced in SQL queries and database table maintenance and support.
  • Experience in Linux/Unix system administration: system builds, server builds, installations, upgrades, patches, tuning, migration, and troubleshooting on RHEL 4.x/5.x.
  • Knowledge of common network protocols such as FTP, TCP, UDP, ICMP, SFTP, SSH, HTTP, HTTPS, and Connect:Direct.
  • Experienced in SOX audits with internal and external auditors; defined control objectives for risk and control items.
  • Experienced in incident, change, and release management using Remedy and JIRA.
  • Experienced in managing environments such as DEV, QA, and PROD across various releases, and designed instance strategies.

TECHNICAL SKILLS

Operating Systems: Solaris, Linux (RHEL), MS-DOS, Windows 2000/XP/Vista/7/8, macOS

Scripting Languages: Shell Scripting, Python, PowerShell

Programming Languages: Oracle SQL, PL/ SQL, C, C++

Web/App Servers: JBoss, WebLogic, WebSphere, Apache Tomcat

Build & CI Tools: Ant, Maven, Jenkins, Bamboo

Monitoring Tools: Nagios, Graphite, New Relic, Splunk, Remedy, Apigee, Akamai Edge Control, Citrix Applications, Gomez, Foglight, Omniture, Tivoli, EMC Smart tool, Autosys, Grafana, Datadog, PRTG

Bug Tracking Tools: Jira, BugZilla, Remedy

Databases: Oracle, MySQL, Microsoft SQL Server, IBM DB2, PostgreSQL, Redis, MongoDB, Cassandra, DynamoDB

Volume Managers: Veritas Volume Manager, LVM (Linux)

SAS Skills: SAS/BASE, SAS/GRAPH, SAS/MACRO, SAS/SQL

PROFESSIONAL EXPERIENCE

Confidential

Snr. AWS Cloud Architect/DevOps Engineer

Responsibilities:

  • Working on the Watch Technology team, which uses the latest technologies running on open-source big data platforms, deployed on a public cloud, to deliver products that provide audience measurement and analytics across all devices where content is consumed.
  • Build Spark applications that leverage massive data assets and run in cloud environments.
  • Involved in the complete big data flow of the application, from ingesting data from upstream sources through processing and analyzing the data and landing it in S3.
  • Implemented usage of Amazon EMR for processing Big Data across a Hadoop cluster of virtual servers on Amazon Elastic Compute Cloud (EC2) and Amazon Simple Storage Service (S3).
  • Develop various Python scripts to find vulnerabilities in SQL queries through SQL injection testing, permission checks, and performance analysis.
  • Partner with our data scientists to understand the logic behind Nielsen applications.
  • Engage with a team of engineers focused on leveraging the latest Agile and DevOps practices, while still maintaining high levels of quality.
  • Responsible for analyzing, designing, developing, implementing, supporting, and refining applications for the Digital Measurement Platform; collaborate with other developers, architects, and analysts to create solutions that meet Nielsen's business requirements and achieve high quality.
  • Develop solutions that incorporate data ETL (extract, transform, load) and statistical models, using specifications provided by Product Leadership.
  • Create build and deployment process scripts; work closely with architects and designers to design and implement reusable system solutions.
  • Developed Apache Airflow data pipelines, orchestrating workflows as directed acyclic graphs (DAGs) of tasks in a programmatic manner.
  • In depth understanding of the principles and best practices of software configuration management (SCM) in Agile, Scrum and Waterfall methodologies.
  • Built and deployed Docker containers for Airflow into microservices, improving developer workflow, increasing scalability, and optimizing speed
  • Experience with automation and continuous-integration tools such as Ansible, Docker, Jenkins, and Terraform.
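As a sketch of the kind of Python vulnerability-checking script described above (the patterns and function name are illustrative, not the actual tooling), a naive SQL-injection check might look like:

```python
import re

# Naive, illustrative patterns; a real scanner would enforce parameterized
# queries rather than pattern-match raw SQL text.
INJECTION_PATTERNS = [
    r"('|\")\s*or\s+\1?1\1?\s*=\s*\1?1",  # tautologies like ' OR 1=1
    r";\s*(drop|delete|truncate)\s",       # stacked destructive statements
    r"--",                                  # inline comment truncating a clause
    r"union\s+select",                      # UNION-based data extraction
]

def looks_injected(query: str) -> bool:
    """Return True if the query text matches a known injection pattern."""
    q = query.lower()
    return any(re.search(p, q) for p in INJECTION_PATTERNS)

if looks_injected("SELECT * FROM users WHERE name = '' OR 1=1 --"):
    print("flagged")  # prints "flagged"
```

In practice such checks complement, rather than replace, parameterized queries and database permission reviews.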

Confidential, Boston

Snr. AWS Cloud Architect

Responsibilities:

  • Migrated the existing Linux environment to an AWS RHEL Linux environment and used the Auto Scaling feature.
  • Design and deploy dynamically scalable, highly available, highly reliable and fault-tolerant systems on AWS
  • Migrate existing complex on-premises applications to the AWS platform
  • Select appropriate AWS services to design and deploy an application based on the business requirements
  • Implement cost control strategies on the AWS infrastructure
  • Work with Confidential -management to resolve issues and validate programming requirements within areas of responsibility
  • Participate in software development processes with quality assurance, version control and build processes
  • Assist the project manager in preparing estimates and justification for assigned tasks
  • Coordinating with the Turbot vendor, which provides guardrails, a CMDB, and policies and controls mapped to common compliance frameworks such as NIST, HIPAA, CIS, GxP, and PCI.
  • Testing and validating shared AMIs, bootstrapping backup and security agents, and supporting application teams with Turbot queries.
  • Working alongside the TCoE and IT quality teams to prepare compute, backup, and storage management SOPs that make GxP-related applications eligible for cloud migration.
  • Responsible for a POC to determine the most efficient cloud backup tool for enabling point-in-time recovery for production and GxP applications.
  • Helping the CloudOps team conduct a POC on the Symantec Cloud Workload Protection (CWP) security agent to provide anti-malware protection and OS hardening for EC2 instances.
  • Prepared Infrastructure Quality & Operational Quality (IQOQ) documents as per the GxP standards for Rubrik backup, Nasuni storage and Symantec CWP security agent tools.
  • Wrote CloudFormation and Ansible scripts to automate the provisioning of infrastructure and configure the applications.
  • Wrote PowerShell and Ansible scripts to bootstrap EC2 instances with security and backup agents at launch time; tested and verified custom AMIs against company standards before sharing them with all AWS accounts.
  • Worked with the DevOps team to design a DevOps process orchestrating the test, build, release, and deploy phases through CI/CD pipelines using Atlassian tools such as Bitbucket and Bamboo.
  • Establish the appropriate monitoring and alerting of solution events related to performance, scalability, availability, and reliability.
  • Evaluated the different monitoring tools solutions available in the market for monitoring Confidential cloud infrastructure
  • Conducted successful POCs to evaluate monitoring tools including Nagios, Grafana, Datadog, SignalFx, and PRTG.
  • Contribute to cloud strategy discussions and decisions on overall Cloud design.
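A minimal CloudFormation sketch of the launch-time agent bootstrapping and Auto Scaling provisioning described above; the AMI ID, instance type, group sizes, and agent installer path are placeholders, not the actual templates used:

```yaml
# Illustrative CloudFormation fragment (all names and IDs are placeholders)
AWSTemplateFormatVersion: '2010-09-09'
Resources:
  AppLaunchTemplate:
    Type: AWS::EC2::LaunchTemplate
    Properties:
      LaunchTemplateData:
        ImageId: ami-0abcdef1234567890   # shared, pre-validated custom AMI
        InstanceType: t3.medium
        UserData:
          Fn::Base64: |
            #!/bin/bash
            # bootstrap backup and security agents at launch time
            /opt/agents/install.sh
  AppAutoScalingGroup:
    Type: AWS::AutoScaling::AutoScalingGroup
    Properties:
      MinSize: '2'
      MaxSize: '6'
      LaunchTemplate:
        LaunchTemplateId: !Ref AppLaunchTemplate
        Version: !GetAtt AppLaunchTemplate.LatestVersionNumber
      AvailabilityZones: !GetAZs ''
```

The same provisioning can equally be expressed as an Ansible playbook; the template form makes the launch-time bootstrap auditable alongside the infrastructure definition.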

Confidential

Snr. AWS Cloud Engineer

Responsibilities:

  • Proposed and implemented branching strategy suitable for Agile development in Subversion.
  • Hands-on experience working with SaaS, PaaS, and IaaS cloud services, storage, web apps, and Active Directory, with in-depth practical knowledge of other cloud platforms such as Microsoft Azure and OpenStack.
  • Involved in designing and deploying applications utilizing almost the entire AWS stack (Including EC2, Route53, S3, RDS, SNS, SQS, IAM) focusing on high-availability, fault tolerance, and auto-scaling.
  • Created users and groups using IAM and assigned individual policies to each group.
  • Increased EBS-backed volume storage capacity when the root volume was full, using the AWS EBS volume features.
  • Created S3 backups with versioning enabled and moved objects to Amazon Glacier for archival purposes.
  • Created SNS notifications and assigned the topic ARN to S3 for object-loss notifications.
  • Created Elastic load balancers (ELB) and used Route53 with failover and latency options for high availability and fault tolerance.
  • Secured a private subnet by placing a NAT Gateway and Bastion Host in public subnet to allow internet access to private subnet via bastion host.
  • Implemented build frameworks for new projects using Jenkins and Maven.
  • Expertise in implementing new AWS instances and working with EBS and S3 storage and IAM.
  • Utilized S3 buckets and Glacier for storage and backup on AWS.
  • Managed users and groups using Amazon Identity and Access Management (IAM).
  • Configured Elastic Load Balancers with EC2 Auto Scaling groups.
  • Managed AWS EC2 instances utilizing Auto Scaling, Elastic Load Balancing and Glacier for our QA and UAT environments as well infrastructure for GIT server. Created Security groups and worked with Access Control Lists (ACLs), snapshots and Amazon Machine Images (AMIs) of the instances for backup and creating clone instances.
  • Supported the development team in automating builds and deployments using Chef.
  • Managed Amazon Web Services like EC2, RDS, EBS, ELB, Auto scaling, AMI, IAM through AWS console and API Integration with Chef Code.
  • Configured Apache webserver in the Linux AWS Cloud environment using Chef Automation.
  • Experience with Linux systems, virtualization in a large-scale environment, experience with Linux Containers (LXC) and Docker.
  • Scheduled the Database ETL batch jobs on a daily, weekly and monthly basis through Autosys.
  • Created machine environments with Chef and deployed cloud with node workstation.
  • Monitored CPU, memory, physical disks, hardware and software RAID, multipath, file systems, and the network using Nagios 4.0.
  • Maintained JIRA for tracking and updating project defects and tasks.
  • Created JIRA issues to prioritize and act on what's important, and stayed up to date with what's going on around the project.
  • Used Splunk for monitoring and metric collection for applications in a cloud-based environment.
  • Set up SNS notifications to track the status of resources.
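The per-group IAM policies mentioned above follow the standard IAM policy document shape; a minimal sketch granting a team group access to one S3 bucket (the bucket name is a placeholder) could be:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowTeamBucketAccess",
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:PutObject", "s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::example-team-bucket",
        "arn:aws:s3:::example-team-bucket/*"
      ]
    }
  ]
}
```

Note that `s3:ListBucket` applies to the bucket ARN while object actions apply to the `/*` ARN, which is why both resource forms appear.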

Confidential

Technical Lead (AWS SysOps Administrator)

Responsibilities:

  • Participated in weekly release meetings with Technology stakeholders to identify and mitigate potential risks associated with the releases.
  • Implemented AWS solutions using EC2, S3, Elastic Load Balancer, and Auto Scaling groups; optimized volumes and EC2 instances.
  • Configured Elastic Load Balancers (ELB) with EC2 Auto Scaling groups.
  • Created AWS launch configurations based on customized AMIs and used them to configure Auto Scaling groups.
  • Utilized S3 bucket and Glacier for storage and backup on AWS.
  • Using the Amazon Identity and Access Management (IAM) tool, created groups and permissions for users to work collaboratively.
  • Created, deleted, and managed user accounts and their roles for interacting with AWS, and set up their ACLs with Amazon IAM.
  • Used Elastic Load balancers (ELB) and Auto scaling groups (ASG) to handle the traffic at peak times.
  • Constructed regular EBS snapshots and rebuilt new EBS volumes from these existing snapshots to migrate and move applications.
  • Maintained monitoring and alerting of production and corporate servers using the CloudWatch service.
  • Exported CloudWatch logs to S3 and created alarms in conjunction with SNS to notify of resource usage and billing events.
  • Used Amazon RDS Multi-AZ for automatic failover and high availability at the database tier for heavy MySQL workloads.
  • Used AWS CloudFront (content delivery network) to deliver content from AWS edge locations drastically improving user experience and latency.
  • Deployed applications on AWS using Elastic Beanstalk.
  • Authored cron jobs to modify/initiate critical resources on the server at initial startup and reboot.
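Cron entries of the kind described in the last bullet might look like the following sketch; the script paths and schedule are hypothetical:

```
# hypothetical crontab entries
# re-initialize critical resources at startup/reboot
@reboot /opt/scripts/init_resources.sh >> /var/log/init_resources.log 2>&1
# nightly snapshot cleanup at 02:30
30 2 * * * /opt/scripts/snapshot_cleanup.sh
```

The `@reboot` keyword runs a job once at daemon startup, which is the usual cron mechanism for initializing resources after a server restart.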

Confidential

Linux administrator/Build and Release Engineer

Responsibilities:

  • Server builds and deployments (CentOS, RHEL, Solaris) using interactive and advanced installation methods (Kickstart and Jumpstart).
  • The operating systems involved in the migration were Red Hat Linux 5.5/6.2/6.5, AIX, and Solaris.
  • Installation and configuration of Red Hat virtual servers (ESXi 4/5) and Solaris servers (LDOMs) using scripts and Ops Center.
  • Addition and configuration of SAN disks for LVM on Linux, and Veritas Volume Manager and ZFS on Solaris LDOMs.
  • Created test scenarios for testing NIS, NFS, DNS and other functionality of the OS.
  • Implemented open source base monitoring tool Nagios 3.0 for servers, SAN switches, EMC SAN Storage and VMware ESX and ESXi.
  • Worked with DBAs on installation of RDBMS database, restoration and log generation.
  • Proficient in DB administration (Oracle, DB2, MongoDB, MySQL, Sybase, and SQL Server) for maintaining, pruning, and performing required DB tasks.
  • Maintained system security, including password checks, permission scans, and implementation of security tools in the Linux environment.
  • Day-to-day activities included handling security issues such as stale UNIX account cleanup, 90-day password changes, setting max and min password age, and creating a list of umask settings for various users.
  • Troubleshot users' login and home-directory issues; managed (added/removed) disks and partitions (LVM).
  • Provided system performance reports on a regular basis, kept all software at current version levels, and maintained a log of changes for tracking.
  • Supported engineering plans and schedules by providing build/release engineering services: building, deploying, developing scripts, overseeing branch and merge strategies, and building automated tools as necessary for the engineering team.
  • Deploy production packages to web servers and application servers per business needs.
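The LVM disk and partition management mentioned above can be sketched as a dry-run shell script; the device, volume-group, and logical-volume names are placeholders, not the actual environment:

```shell
#!/bin/sh
# Dry-run sketch: add a SAN disk to an LVM volume group and grow a filesystem.
# All names below are placeholders; set RUN=1 to actually execute the commands.
run() {
  if [ "${RUN:-0}" = "1" ]; then "$@"; else echo "would run: $*"; fi
}

DISK=/dev/sdc
VG=vg_data
LV=lv_data

run pvcreate "$DISK"                      # label the new disk for LVM
run vgextend "$VG" "$DISK"                # add it to the volume group
run lvextend -l +100%FREE "/dev/$VG/$LV"  # grow the logical volume
run resize2fs "/dev/$VG/$LV"              # grow the ext filesystem online
```

The dry-run wrapper makes the change plan reviewable before touching production storage, which suits the change-management processes described elsewhere in this resume.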

Confidential

Linux Administrator

Responsibilities:

  • Installed and deployed Red Hat Enterprise Linux 6.x/7.x and CentOS, and installed packages and patches on Red Hat Linux servers.
  • Created Kickstart images for different versions of the Red Hat Linux operating system (5.x and 6.x).
  • Created server profiles and made network and SAN virtual configurations using Virtual Connect in BladeSystem c7000 enclosures.
  • Configured and administered LDAP, NFS, FTP, Samba, and Postfix servers on Red Hat Enterprise Linux.
  • Set up NFS servers, diskless clients, and AutoClients, and automated file-system mounts using direct and indirect automount maps.
  • Wrote bash shell scripts for database startups and backups.
  • Installed a Check Point firewall; installed and configured IPFilter to protect a Linux system exposed to the Internet.
  • Monitored servers, switches, ports with Nagios monitoring tool.
  • Fine-tuned servers and configured networks for optimum performance.
  • Managed and troubleshot load-balancing issues, using F5 to balance load among the application servers.
  • Experience working with network engineers to install, manage, and configure NAS-based storage and ensure overall system and network security.
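An automount setup like the one described above (NFS with indirect maps) might be sketched as follows; the NFS server name and export path are hypothetical:

```
# /etc/auto.master -- illustrative indirect map for user home directories
/home   /etc/auto.home  --timeout=300

# /etc/auto.home -- each key mounts nfsserver:/export/home/<user> on demand
*       -rw,soft,intr   nfsserver:/export/home/&
```

The wildcard key (`*`) with the `&` substitution lets one map line serve every user, and the timeout unmounts idle home directories automatically.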

Confidential

System Administrator

Responsibilities:

  • Created and modified users, groups with SUDO permission, administered them and added necessary permissions for users.
  • Installed and configured Apache/Tomcat server and diagnosed the problems associated with DNS, DHCP, VPN, NFS, and Apache.
  • Worked with development team to load software onto Linux Enterprise servers and performed debugging on scripts.
  • Troubleshot networks using netstat, ping, nslookup, and traceroute.
  • Resolved TCP/IP network access problems for clients.
  • Developed, maintained, and updated various scripts for services (start, stop, restart, recycle, cron jobs) in UNIX shell.
  • Copied files remotely using SFTP, FTP, SCP, WinSCP, and FileZilla, and managed backups of server/client data.
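The DNS and connectivity troubleshooting described above (nslookup, ping, netstat) can be scripted; the sketch below uses Python's standard `socket` module, and the function names are illustrative rather than actual tooling:

```python
import socket
from typing import Optional

def resolve(host: str) -> Optional[str]:
    """Resolve a hostname to an IPv4 address (what nslookup would report)."""
    try:
        return socket.gethostbyname(host)
    except socket.gaierror:
        return None  # DNS failure: check /etc/resolv.conf or the DNS server

def port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """TCP connect test, a scriptable stand-in for service-reachability checks."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

print(resolve("localhost"))  # typically 127.0.0.1
```

Wrapping these checks in a script lets routine client connectivity reports be generated the same way every time instead of re-running the tools by hand.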
