AWS Architect / Cloud Engineer Resume
SUMMARY
- Experienced Cloud Infrastructure, Security, and Database Systems Engineer with 10 years of professional experience delivering the IT infrastructure and services required to achieve corporate milestones, including operational and cost efficiencies, cloud initiatives, automation, D/R programs, infrastructure tech refresh and consolidations, security, and audit. Maintained and operated IT in a manner that met customer service and timeline expectations, uptime goals, and cost targets while avoiding security/audit concerns and risks.
- Leveraged automation to increase the efficiency, quality, and velocity of repeatable operations. My professional experience is complemented by a number of industry-recognized process and technical certifications, such as AWS and Oracle.
- Infrastructure as code implementation (Git, Jenkins, Terraform, Ansible, CloudFormation).
- Designed and built out the environment infrastructure in AWS to support the application, integration, and connectivity requirements of applications.
- Strong background in Linux/Unix/Windows Administration.
- Proficient in writing CloudFormation Templates (CFT) in YAML and JSON format to build AWS services under the Infrastructure as Code paradigm (see the first sketch after this list).
- Designed and deployed automation to keep the operational overhead of managing the AWS environment low and make it as easy and self-service as possible for the groups that leverage it.
- Continuous deployment of cloud infrastructure.
- Event-driven security in AWS.
- Planning and executing technology proof-of-concept (POC) for different cloud services on AWS.
- Logging and monitoring infrastructure (Splunk and Elasticsearch).
- Implemented and enforced government security controls, with a good understanding of FISMA, NIST 800-53 controls, and CIS Benchmarks.
- Integrated Qualys with the existing Splunk environment.
- Created prerequisite documentation for onboarding new Tripwire agents, including instructions for installing agents onto Windows and Unix operating systems.
- Experienced with event-driven and scheduled AWS Lambda functions that trigger actions on various AWS resources (see the second sketch after this list).
- Integrated Tripwire with the existing Splunk environment.
- Conducting security control assessments in parallel with the development/acquisition and implementation phases of the life cycle permits the early identification of weaknesses and deficiencies and provides the most cost-effective method for initiating corrective actions.
- Developing Ansible playbooks for network device hardening, system remediation, system patching, and security compliance.
- Leveraged Ansible and Terraform to improve deployment velocity, production reliability, and incident management while also meeting regulatory compliance.
- Validating the effectiveness of application, system, and network protections along with policy compliance.
- Security controls assessments, vulnerability assessment, audit, and risk assessment of cloud applications.
- Deployed and migrated application environments and infrastructure on AWS (VPC, DynamoDB, SQS, ELB, EC2, SNS, SES, Redshift, Lambda, EMR, S3, Glacier, etc.). Load balancing with NGINX and ELB.
- Assessed, designed, implemented, automated, and documented security processes and solutions leveraging Amazon Web Service (AWS) and other third-parties.
- Interpreted scans (Nessus, Secure Code Review, Penetration Test, Security Control Assessment, and Security Configuration Compliance Data) and security documentation to prepare Federal ATO package development for cloud-based systems.
- Design and implementation of a large-scale distributed data processing pipeline for ingesting security events in Amazon Web Services (AWS) using OSSEC, Logstash, Fluentd, Kinesis, Elasticsearch, and S3.
- Leveraging Ansible for Security Technical Implementation Guides (STIGs), network device hardening, remediation, internal standards, and incident response.
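The first sketch below illustrates the CloudFormation-as-code workflow referenced above: a small YAML template deployed through boto3. The stack name, region, and bucket properties are illustrative placeholders, not values from any specific project.

```python
# Minimal sketch: deploy a small CloudFormation template (YAML) with boto3.
# Stack name, region, and resource properties are illustrative placeholders.
import boto3

TEMPLATE_BODY = """
AWSTemplateFormatVersion: '2010-09-09'
Description: Example bucket provisioned as infrastructure as code
Resources:
  ExampleBucket:
    Type: AWS::S3::Bucket
    Properties:
      VersioningConfiguration:
        Status: Enabled
"""

def deploy_stack(stack_name: str = "example-iac-stack") -> str:
    cfn = boto3.client("cloudformation", region_name="us-east-1")
    response = cfn.create_stack(
        StackName=stack_name,
        TemplateBody=TEMPLATE_BODY,
        OnFailure="ROLLBACK",
    )
    # Block until the stack finishes creating, then return its ID.
    cfn.get_waiter("stack_create_complete").wait(StackName=stack_name)
    return response["StackId"]

if __name__ == "__main__":
    print(deploy_stack())
```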
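The second sketch illustrates the event-driven security pattern mentioned above: a Lambda function, assumed here to be invoked by an EventBridge rule matching CloudTrail CreateBucket API calls, enforces default encryption on newly created S3 buckets. The event shape and the remediation step are assumptions for illustration, not a description of any specific deployment.

```python
# Minimal sketch: event-driven security with AWS Lambda.
# Assumes an EventBridge rule forwards CloudTrail "CreateBucket" events here;
# the handler then enforces default encryption on the new bucket.
import boto3

s3 = boto3.client("s3")

def handler(event, context):
    # Bucket name as delivered in a CloudTrail-sourced EventBridge event.
    bucket = event["detail"]["requestParameters"]["bucketName"]
    s3.put_bucket_encryption(
        Bucket=bucket,
        ServerSideEncryptionConfiguration={
            "Rules": [
                {"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "AES256"}}
            ]
        },
    )
    return {"bucket": bucket, "encryption": "AES256 enforced"}
```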
PROFESSIONAL EXPERIENCE
Confidential
AWS Architect / Cloud Engineer
Responsibilities:
- Responsible for architecting solutions and providing subject matter expertise on a cloud migration project.
- Developed automated processes to maintain and enforce controls for infrastructure configurations in AWS GovCloud.
- Designed and built out environment infrastructure in AWS to support application and integration requirements.
- Created various security policies using Cloud Custodian.
- Experienced in automating, configuring, and deploying instances in AWS cloud environments using Ansible playbooks.
- Designed and deployed systems to enforce real-time compliance with security policies (such as encryption and access requirements), tag policies, and cost management, leveraging native AWS security services such as AWS Lambda, AWS SSM, AWS CloudWatch Events, and AWS Trusted Advisor.
- Experience working with F5 to set up VIPs, pools, and nodes; used rules to customize traffic, manage certificates, configure frontend and backend SSL termination, and configure persistence profiles.
- Worked on Apache Tomcat application server for hosting web applications.
- Managed Virtual servers (Ubuntu, Linux and Windows) on AWS EC2 using Ansible.
- Create and maintain fully automated CI/CD pipelines for application deployments using Git, Ansible and Jenkins.
- Designed and developed NiFi flows for various types of data sources, flow groups, decision making, error handling, content & attribute transformation/modifications, scheduling, parameterization, and event processing.
- Designing data lake ingestion architecture (streaming and batch).
- Designing architecture and deciding standards for S3-based data lake including data ingestion, refinement, governance.
- Performed software and patch installation on Linux using the RPM and YUM package managers.
- Installed and implemented new patches to the Linux operating system software.
- Worked on Linux logical volume management: creating volume groups, logical volumes, and file systems, and troubleshooting.
- Utilized Bash and shell scripting to write scripts for backups and system updates.
- Developed microservices with AWS Lambda and other AWS services (SQS, Kinesis, Step Functions, etc.) using the boto SDK (see the sketch after this list).
- Engineered cloud governance by codifying policies and adherence to defined controls, leveraging open-source tools such as Cloud Custodian.
- Automated, configured, and deployed instances in AWS cloud environments using Ansible, JFrog Artifactory, and Jenkins.
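As a rough illustration of the boto-SDK microservice bullet above, the sketch below shows a Lambda handler that consumes an SQS batch and starts a Step Functions execution per message. The state machine ARN and payload shape are hypothetical.

```python
# Minimal sketch: a Lambda microservice that consumes SQS messages and hands
# each one to a Step Functions state machine via the boto SDK.
import json
import os

import boto3

sfn = boto3.client("stepfunctions")
# Hypothetical state machine ARN, normally injected via environment variables.
STATE_MACHINE_ARN = os.environ.get(
    "STATE_MACHINE_ARN",
    "arn:aws:states:us-east-1:123456789012:stateMachine:example",
)

def handler(event, context):
    started = []
    # SQS-triggered Lambdas receive a batch of messages under "Records".
    for record in event.get("Records", []):
        payload = json.loads(record["body"])
        response = sfn.start_execution(
            stateMachineArn=STATE_MACHINE_ARN,
            input=json.dumps(payload),
        )
        started.append(response["executionArn"])
    return {"executions": started}
```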
Confidential
AWS Data Engineer / Cloud Engineer
Responsibilities:
- Engineer, develop, deploy, and support existing platforms and services while introducing continued process and product improvements. Focus on cost-effective ways to improve services as measured from the end-user perspective, while minimizing labor hours via automation and/or improved practices.
- Infrastructure development on AWS using various services like EC2, S3, RDS.
- Performed data migration to the AWS cloud.
- Built a DevOps delivery pipeline with infrastructure and environment provisioning, deployment, and monitoring tools that supported, enhanced, and grew the DevOps model.
- Implemented detailed monitoring for the cloud environment and a notification system using CloudWatch, Splunk, and Simple Notification Service (SNS).
- Migrated databases from on-prem Oracle to RDS PostgreSQL/Oracle/MySQL/Redshift in AWS using DMS.
- Configured AppDynamics for monitoring and alerting of applications and RDS PostgreSQL and Oracle instances in AWS.
- Configured DMS for schema/table replication/conversion from on-prem Oracle to RDS PostgreSQL instances in the cloud.
- Used the orapki utility to set up and configure SSL/TLS connections over the TCPS protocol between on-prem Oracle databases and RDS instances in the AWS cloud for secure movement of data in transit.
- Acted as Trusted Advisor in areas of cloud compliance and security best practices.
- Responsible for implementing Cloudera Hadoop deployments, including deploying, configuring, and managing Hive, Spark, and Impala on the AWS cloud.
- Responsible for designing and implementing data lake security: data security during ingestion, data security at rest (encryption), and data access architecture.
- Used Ansible, Terraform, and CloudFormation for cloud automation and configuration management.
- Continuous deployment of cloud infrastructure.
- Implemented detailed monitoring of the AWS cloud environment and notifications using CloudWatch and SNS (see the first sketch after this list).
- Event driven security in AWS.
- Planning and executing technology proof-of-concept (POC) for different cloud services on AWS.
- Logging and monitoring infrastructure (Splunk and Elasticsearch).
- Leveraged Ansible and Terraform to improve deployment velocity, production reliability, and incident management while also meeting regulatory compliance.
- Security controls assessments, vulnerability assessment, audit, and risk assessment of cloud applications.
- Configured and automated a Glue ETL pipeline to process files from S3 and push them to AWS Redshift using Lambda (see the second sketch after this list).
- Encouraged ongoing participation in DevOps processes and activities (Createathon, Boss of the SOC).
- Design and implementation of a large-scale distributed data processing pipeline for ingesting security events in Amazon Web Services (AWS) using OSSEC, Logstash, Fluentd, ELB, Kinesis, Elasticsearch, and S3.
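The first sketch below shows the CloudWatch-plus-SNS monitoring pattern from this section as a boto3 snippet: an SNS topic receives notifications from a CPU alarm on an EC2 instance. The topic name, instance ID, and thresholds are placeholders.

```python
# Minimal sketch: detailed monitoring with CloudWatch and notification via SNS.
# Topic name, instance ID, and thresholds are illustrative placeholders.
import boto3

cloudwatch = boto3.client("cloudwatch")
sns = boto3.client("sns")

def create_cpu_alarm(instance_id: str) -> str:
    # Create (or reuse) the SNS topic that receives alarm notifications.
    topic_arn = sns.create_topic(Name="ops-alerts")["TopicArn"]
    cloudwatch.put_metric_alarm(
        AlarmName=f"high-cpu-{instance_id}",
        Namespace="AWS/EC2",
        MetricName="CPUUtilization",
        Dimensions=[{"Name": "InstanceId", "Value": instance_id}],
        Statistic="Average",
        Period=300,
        EvaluationPeriods=2,
        Threshold=80.0,
        ComparisonOperator="GreaterThanThreshold",
        AlarmActions=[topic_arn],
    )
    return topic_arn
```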
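The second sketch outlines the S3-to-Redshift Glue pipeline bullet: an S3 event notification invokes a Lambda function, which starts a Glue job run for each new object; the Glue job itself is assumed to load the data into Redshift. The job name and argument key are hypothetical.

```python
# Minimal sketch: S3-triggered Lambda that starts a Glue ETL job per new object.
# The Glue job (name is hypothetical) is assumed to load the data into Redshift.
import boto3

glue = boto3.client("glue")
GLUE_JOB_NAME = "s3-to-redshift-etl"  # hypothetical job name

def handler(event, context):
    runs = []
    # Standard S3 event notification structure: one record per new object.
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        response = glue.start_job_run(
            JobName=GLUE_JOB_NAME,
            Arguments={"--source_path": f"s3://{bucket}/{key}"},
        )
        runs.append(response["JobRunId"])
    return {"job_runs": runs}
```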
Confidential
AWS Data Engineer
Responsibilities:
- Provided guidance and best practices in designing and implementing AWS services, as well as infrastructure as code playbooks.
- Overall Big Data strategy for Business Intelligence and Operations.
- Migrated on-premise Kafka application to AWS.
- Spinning up data pipelines using Ansible and Terraform.
- Development of Spark applications (RDD and DStream APIs).
- Used Kafka for real-time processing of clinical, payer, and claims data from all of the client’s business affiliates (see the sketch after this list).
- Responsible for developing scalable and reliable data solutions using NiFi to move data across systems from multiple sources in real time as well as batch modes.
- Used Kibana for visualizing events, messages, and logs.
- Provisioning and managing Hadoop clusters using Cloudera Manager.
- Overlaid Storm and Spark for near real-time stream processing.
- Owned the end-to-end integrated solution for Big Data products with Tableau implementations within the organization.
- Led the design of the end product, driving consensus among IT, User Experience, product management, and customer teams.
- Acted as the techno-functional consultant for all Tableau solutions.
- Developed and implemented robust data models and data interfaces for high-volume systems.
- Pitched designs and solutions to the business, enhancing awareness of Tableau.
- Interacted with senior management and evangelized the departure from reporting towards visualizations and interactions with data.
- Worked at all levels and with all teams, from development and code review to building solutions and creating technical presentations/delivery.
- Sized hardware and defined topologies and models that suited business needs.
- Coordinated both onshore and offshore teams.
- Ensured solution was SOX and SSAE-16 compliant.
- Managed very large Oracle databases (over 10 TB) as well as DB2 databases.
- Implementation of High Availability solutions with Oracle 12c RAC, Physical and Logical Standby Databases (Data Guard), and replication using GoldenGate.
- Efficiently performed installation, setup, and creation of an 8-node RAC cluster with Oracle 12.1.0.2 databases using Grid Infrastructure with ASM file systems on RHEL 6.x.
- Administered 12c RAC environments, adding and removing cluster nodes, and handled performance tuning using AWR. Additional areas included space management, capacity planning, backup & recovery, disaster recovery, database performance tuning, memory tuning, application tuning, security administration, Data Guard, Oracle Advanced Replication, Real Application Clusters, and Oracle standby databases.
- Installed and configured Oracle Identity Manager and Oracle Role Manager.
- Installed and configured 11g and 12c Oracle HTTP Server, WebLogic Fusion Middleware, and Web Cache application servers. Applied patches and managed middleware application servers using Oracle 12c OEM.
- Performed an evaluation of the existing Oracle Identity Manager implementation and deployed performance and process improvements.
- Organized various migration review meetings with the application team to discuss the migration strategy based on their requirements.
- Created snapshot materialized views for table replication. Set up and monitored table-level replication using Oracle GoldenGate.
- Performed various tasks in administration of databases including capacity planning, instance creation.
- Design of databases following OFA: creating database schemas, profiles, database links, snapshots, and synonyms of objects as required to ensure data quality and security and to comply with the company's business rules. Oracle scripting on Windows, Linux, and UNIX platforms.
- Managed, tuned, and troubleshot database issues using the UNIX command line, SQL*Plus, the OEM client, Oracle 12c OEM, and other third-party tools (TOAD, SQL Developer).
- Monitored database systems and provided technical support for database system issues and application performance enhancement/improvement using the OEM client, Oracle 10g Database Control, SQL*Plus, and pre-developed UNIX shell scripts.
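As a rough sketch of the real-time Kafka processing mentioned in this section, the snippet below consumes a topic with the kafka-python client and hands each record to downstream processing. The topic name, broker list, and consumer group are placeholders.

```python
# Minimal sketch: consuming a Kafka topic for near-real-time processing with
# the kafka-python client. Topic, brokers, and group ID are placeholders.
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "claims-events",                       # hypothetical topic name
    bootstrap_servers=["localhost:9092"],  # placeholder broker list
    group_id="claims-processor",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    auto_offset_reset="earliest",
)

for message in consumer:
    record = message.value
    # Downstream handling (validation, enrichment, persistence) would go here.
    print(f"partition={message.partition} offset={message.offset} record={record}")
```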