
Senior AWS DevOps Engineer / Architect Resume


SUMMARY:

  • My career combines broad and deep experience covering the full spectrum of Information Technology.
  • I have always been attracted to the inherent novelty of technology, which makes learning new things enjoyable.
  • I follow best practices and proven methodologies with respect to Infrastructure as Code, Continuous Integration, Continuous Deployment, and Business Intelligence.
  • Throughout my long career in IT, I have undertaken the following roles: AWS Cloud Engineer, AWS Architect, AWS DevOps Engineer, SQL/NoSQL DBA, Windows/Unix/Linux Administrator, Node.js Programmer, Python Programmer, Tableau Developer, Informatica Developer, Data Warehouse Architect, Business Intelligence Architect, VMware Administrator, NetApp Storage Administrator, and Instructor.

TECHNOLOGIES:

Cloud: AWS

AWS Architecture: Virtual Private Cloud, Availability Zones, EC2 instances, Subnets, Routing Tables, Network Access Control List (NACL), NAT and NAT Gateway, Route 53 DNS, Elastic Beanstalk, Security Groups, Users, Groups, Roles, Policies, Auto Scaling, Elastic Load Balancer (ELB), CloudFront, CloudWatch, Simple Storage Service (S3), Elastic Block Store (EBS), Glacier, SNS, SQS, X-Ray, Lambda, Containers, AWS Shield, AWS WAF, Serverless Framework.

AWS DevOps:

Provisioning Tools: AWS CLI, Terraform, CloudFormation, Boto3.

Configuration Management Tools: Ansible and AWS Systems Manager.

Deployment Tools: Docker, Docker Compose, Kubernetes, Vagrant, Packer, GitHub, Jenkins, Node.js, Python, CloudFormation, CodeStar, CodeDeploy, CodePipeline, CodeCommit, CodeBuild, Blue/Green deployment and A/B testing.

Packaging Tools: npm, pip, pip3, Maven, Chocolatey.

Linux flavors: Red Hat, Ubuntu, CentOS.

Serverless Technology: AWS CodeStar, Serverless Framework (SLS), AWS API Gateway, Lambda.

Real-time Streaming Analytics: AWS Kinesis, Elasticsearch.

Programming Languages: Python (Boto3), Node.js, Java, Groovy.

Operating Systems: Unix, Linux, Windows, VMware.

BI Tools: AWS QuickSight, AWS Glue, Tableau, Informatica, Excel.

Database: MySQL, SQL Server, Oracle, PostgreSQL, Aurora (RDS), DynamoDB, Redshift.

Database Tools: Sqitch, Hive, Presto, Hue, Workbench/J, ElephantSQL, MySQL Workbench, SQLyog, JDBC, ODBC, Oracle Developer, Toad.

Web Data Sources: Stack Exchange, Kaggle.

Web Servers: Nginx, Tomcat, Apache.

Issue Tracking: Jira

Collaboration: Slack, Trello

Storage: NetApp

IDEs & Editors: Cloud9 IDE, PyCharm, Eclipse, IntelliJ IDEA, Atom, VS Code, Sublime, Brackets, Google Developer Tools.

Architecture Tools: Lucidchart, Concept Draw, Visio, Erwin.

EMPLOYMENT HISTORY:

Senior AWS DevOps Engineer / Architect

Confidential

Responsibilities:

  • Complied with AWS recommended best practices for securing IAM, S3, data in flight, data at rest, the network (VPC), Cognito, STS, endpoints, and Lambda, leveraging the AWS Encryption SDK CLI, KMS keys, SSM Parameter Store, AWS Secrets Manager, WAF, Shield, CloudFront, GuardDuty, CloudWatch Logs, and Elasticsearch.
  • Designed and created a Multi-region Cloud Architecture and implemented the blueprint through infrastructure as code utilizing Terraform.
  • Automated the process of creating Jenkins Servers using CloudFormation.
  • Implemented Continuous Integration and Deployment utilizing GitHub, Jenkins, and AWS CodePipeline. Streamlined a zero-downtime Blue/Green deployment process through which our new releases are deployed.
  • Automated release deployments for a Java Maven application on Elastic Beanstalk leveraging Git, Jenkins, and CodeDeploy.
  • Utilized AWS OpsWorks in conjunction with Ansible to perform automatic operational tasks driven by cron scheduling.
  • Performed Linux operations on over 400 EC2 instances utilizing Ansible, the Python SDK (Boto3), JMESPath queries in the AWS CLI, and AWS Systems Manager.
  • Wrote the required Lambda functions utilizing API Gateway, Node.js, and Python, incorporating AWS CloudWatch Events for purposes such as sending SNS messages to the team's Slack channel.
  • Created and configured an IPSec VPN tunnel between on-premises and AWS using Openswan VPN.
  • Utilized AWS migration services to migrate on-premises MySQL databases to MySQL RDS. Used AWS Secrets Manager to safely store the database passwords.
  • Utilized SQS and SNS to decouple an application which stores customer files in S3 and session information in DynamoDB.
  • Utilized AWS Kinesis Firehose to send CloudTrail logs to AWS Elasticsearch to proactively identify security-related issues. Also wrote SQL statements to query the data using Kinesis Data Analytics and Athena on S3.
  • Leveraged AWS Glue to extract data from MySQL and create tables in Athena to provide source data to QuickSight Analytics.
  • Automated the process of extracting data from an Oracle Database by leveraging Glue and Athena, populating tables in a Redshift Database.
  • Created several DynamoDB tables with global secondary indexes for JSON-formatted data coming from the Web APIs.
  • Set up Redis ElastiCache to improve database read performance.
  • Set up a CloudFront distribution with geo-restrictions to deliver data from edge locations, using an S3 static website as the origin secured via an Origin Access Identity.
  • Configured a backup/restore strategy with defined RTO and RPO. Automated the process of backing up EC2 EBS volumes and RDS instances.
  • Set up a warm standby site for critical operations, comprising web servers, app servers, and database servers.
  • Utilized AWS EFS as a shared file system among some of the instances for the purpose of establishing a central location to access critical log files.
  • Helped the developers with setting up an A/B testing environment to help them decide which version of the application they want to use for deployment. Utilized ghost inspector for testing.
  • Utilized AWS Config, Trusted Advisor, AWS Inspector for governance, auditing, vulnerability assessment, and security.
  • Created a Life Cycle Policy for backing up S3 objects to Glacier.
  • Installed agents on EC2 instances for CloudWatch Logs, AWS Inspector, Systems Manager, and Trend Micro to help with security, custom metrics, patching, and other objectives.
  • Utilized the Systems Manager Parameter Store and AWS Secrets Manager to securely store sensitive information such as database passwords.
  • Utilized AWS Certificate Manager and DNS to create certificates and apply them to ALB and CloudFront for SSL/TLS/HTTPS.
  • Integrated AWS CodeStar and Jira to facilitate bug tracking.
  • Created QuickSight dashboards accessing data in S3 through Athena and Redshift.
  • Created KMS Customer Master Keys (CMKs) to add an additional layer of security for storing private data at rest in S3.
  • Utilized AWS GuardDuty as a threat detection tool, AWS Shield to counteract DDoS, and AWS WAF as an application firewall to protect against layer 7 attacks (cross-site scripting, SQL injection).
  • Utilized Trend Micro Deep Security as IDS/IPS protection, installing the agent on the EC2 instances.
  • Utilized AWS ECS Cluster and registry for Docker containers, but recently switched to AWS EKS with Kubernetes.
  • Created Monitoring Dashboards in Tableau reading data from a Redshift Database.
  • Created Monitoring Dashboards in AWS using Cloudwatch Metrics.
  • Utilized an EMR cluster and Data Pipeline as an ETL tool. Used Hive, Presto, and Hue to query the data.
  • Created a Data Lake utilizing AWS Glue and Redshift.
  • Created a serverless Python project via AWS CodeStar and automated release deployment by utilizing CodeStar's native code deployment. The process incorporated CodeCommit, AWS Lambda, AWS API Gateway, and the Cloud9 IDE.
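
As an illustration of the notification flow described above (CloudWatch events fanning out through SNS to the team's Slack), the following is a minimal Python sketch of such a Lambda handler. The webhook URL, event shape, and function names here are illustrative assumptions, not the production configuration:

```python
import json
import urllib.request

# Hypothetical webhook URL -- in practice this would be read from
# SSM Parameter Store or Secrets Manager, never hard-coded.
SLACK_WEBHOOK_URL = "https://hooks.slack.com/services/EXAMPLE"

def build_slack_payload(event):
    """Turn an SNS-triggered Lambda event into a Slack message payload."""
    record = event["Records"][0]["Sns"]
    return {"text": f"*{record['Subject']}*\n{record['Message']}"}

def lambda_handler(event, context):
    """Lambda entry point: format the SNS record and POST it to Slack."""
    payload = build_slack_payload(event)
    req = urllib.request.Request(
        SLACK_WEBHOOK_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return {"status": resp.status}
```

The payload-building step is kept as a pure function so it can be unit-tested without AWS credentials or network access.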

Technologies: Ansible 2, Jenkins 2.15, Terraform 11, Python 2 & 3, Node.js 5 & 6, Groovy, Java 8, Docker, Boto3, Nginx 1.14, IPSec VPN, Route 53, Red Hat 7, Ubuntu 16, CentOS 7, Windows 2016, Tableau 10.2, Slack 3.3.4, Jira 7.13, Trello, Hue, Hive, Presto, Redshift, MySQL, Oracle, MSSQL, PostgreSQL, JMESPath Query, Sqitch, AWS QuickSight, AWS Glue, AWS Organizations, AWS Systems Manager, AWS Config, AWS GuardDuty, AWS Inspector, AWS Trusted Advisor, AWS Cognito, AWS EMR, AWS CodeStar, AWS X-Ray, AWS Kinesis, AWS Elasticsearch, AWS ElastiCache for Redis, AWS KMS Encryption, AWS Secrets Manager, AWS EKS with Kubernetes, AWS ECS, AWS SDK, AWS CodePipeline.

Data Warehouse Architect

Confidential

Responsibilities:

  • Configured Tableau to read data from MySQL and created Dashboards.
  • Designed the ETL processes.
  • Created Star schemas, and Fact tables.
  • Developed forecasting and various trend reports by extensively using Tableau advanced analytics features such as Reference Lines, Trend Lines, and Bands.
  • Automated the process of data extracts for Tableau.
  • Developed Tableau dashboards.
  • Designed the staging area’s tables for extracting the data from disparate sources.
  • Created the logical and physical data models of the Data Marts in accordance with the requirements as outlined in the functional specifications document.
  • Created the Star schemas and Fact tables for each subject area.
  • Designed the ETL architecture and data integration aspects of the Data Warehouse.
  • Created ETL workflows using Informatica Power Center.
  • Wrote Informatica Mappings, Worklets, and Workflows.
  • Cleansed the data as it was copied to the staging area as part of implementing data quality control.
  • Transferred and transformed the data from the sources to target Data Marts using Power Center.
  • Monitored the ETL jobs and improved throughput by tuning the process and optimizing the queries.
  • Worked closely with the project manager, product owner, and the scrum master in tailoring our development efforts around the Scrum agile methodology: creating and grooming the product backlog, establishing sprints, and going through the sprint review process.
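
The staging-area cleansing step described above can be sketched as a row-level routine applied as source data is copied in. The column names and quality rules here are hypothetical examples for illustration, not the actual project's rules:

```python
def cleanse_row(row):
    """Apply simple data-quality rules to a raw source row before it
    lands in the staging area: trim whitespace and replace empty
    strings with None so downstream loads see real NULLs."""
    cleaned = {}
    for key, value in row.items():
        if isinstance(value, str):
            value = value.strip()
            if value == "":
                value = None
        cleaned[key] = value
    # Illustrative rule: customer codes are stored upper-case.
    if cleaned.get("customer_code"):
        cleaned["customer_code"] = cleaned["customer_code"].upper()
    return cleaned
```

In a real Informatica flow the equivalent logic would live in expression transformations, but the same rules are easy to prototype and unit-test in Python first.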

Technologies: Tableau Desktop 7.0.3, Red Hat Linux 6.7, Ubuntu 12.4, Informatica Power Center 9.1, Oracle 11g.2.1, SQL Server 2012 & 2014, PyCharm, Visual Studio, Visio 2007, Python 2.7, Tomcat, Apache Web Server.

Data Warehouse DBA

Confidential

Responsibilities:

  • Configured ESXi hosts and created VMware 5.1 virtual machines.
  • Installed Oracle 12c on a Red Hat Linux VM on VMware.
  • Managed VMware 5.1 servers (VM guests) that host the on-premises databases.
  • Worked in a DBA capacity for the project, in charge of Oracle and SQL Server source and target databases.
  • Wrote Oracle PL/SQL and SQL Server T-SQL stored procedures to clean up the source data and improve data quality.
  • Designed a star schema comprising two fact tables and twelve dimension tables.
  • Developed Informatica mapplets, mappings, worklets, and workflows. Merged the data from the staging area into the star schemas. Scheduled Informatica jobs.
  • Managed 12 Oracle Databases (8i, 9i, 10g, and 11g), one of which is a Data Warehouse storing two terabytes of data; upgraded several Oracle Databases from 8i and 9i to 10g.
  • Designed and implemented the ETL and data integration aspects of the Data Warehouse.
  • Developed Informatica mappings, worklets, and workflows.
  • Developed Tableau dashboards.
  • Created action filters, parameters, and calculations for preparing dashboards and worksheets in Tableau.
  • Provided hands-on development, assisting users in creating and modifying Tableau worksheets and data visualization dashboards.
  • Defined best practices for Tableau report development.
  • Managed NetApp 7.2 storage; created aggregates, volumes, LUNs, and initiators.
  • Managed 10 SQL Server Databases (2005 and 2008); upgraded from 2000 and 2005 to SQL Server 2008 R2.
  • Installed a Windows 2008 R2 cluster on VMware and installed a SQL Server 2008 R2 cluster.
  • Managed VMware 4.1 servers (VM guests) that host the databases; upgraded vCenter 3.5 to 4.1, created clones and templates, and monitored the ESX servers.
  • Devised backup and recovery procedures for Oracle and SQL Server databases.
  • Created and monitored Database auditing jobs.
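
A central step in merging staging data into a star schema, as described above, is resolving each natural key to its dimension surrogate key during the fact load. A minimal sketch of that lookup, with hypothetical column names, assuming unmatched keys map to a reserved "unknown" dimension member rather than being dropped:

```python
def load_fact_rows(staging_rows, dim_lookup, unknown_key=-1):
    """Build fact rows by resolving natural keys from staging rows to
    dimension surrogate keys; misses fall back to the 'unknown' member
    so no measures are silently lost."""
    facts = []
    for row in staging_rows:
        facts.append({
            "customer_sk": dim_lookup.get(row["customer_id"], unknown_key),
            "amount": row["amount"],
        })
    return facts
```

The fallback-to-unknown design keeps row counts reconcilable between staging and the fact table, which simplifies load auditing.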

Technologies: Oracle 11g.2.1, SQL Server 2012, Visio 2007, Python 2.7, Tomcat, Red Hat Linux 6.7, Ubuntu 12.4, Informatica Power Center 9.1, Apache Web Server.

Informatica Developer

Confidential

Responsibilities:

  • Managed day-to-day activities of BI databases (source, target, and MicroStrategy), providing 24/7 support.
  • Developed Oracle packages, stored procedures, and triggers.
  • Developed Power Center mapplets, mappings, worklets, and workflows for membership data integration.
  • Monitored and tuned different components of the database servers such as shared memory, disk I/O, users, processes, logs, disk space, locks, and latches.
  • Automated database monitoring and reporting functions utilizing UNIX shell scripts and Perl scripts.
  • Set up and configured Oracle RMAN for database backup and recovery.
  • Cloned an Oracle database for testing purposes.
  • Upgraded Oracle 9i to Oracle 10g.
  • Performed data mappings from source systems to new target structures on the database platform.
  • Built the infrastructure and the star schema for the membership Data Mart.
  • Improved Data Warehouse daily refresh execution time by monitoring, measuring, and adjusting ETL components.
  • Participated in Informatica 8.5 Beta Testing program.
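
Tuning the daily refresh, as described above, starts with measuring which ETL jobs dominate the window. A small sketch of that triage step, assuming job timings have already been collected from the scheduler or session logs into a dict:

```python
def slowest_jobs(job_timings, top_n=3):
    """Given {job_name: elapsed_seconds} gathered from ETL session
    logs, return the top offenders (longest first) to target for
    tuning and query optimization."""
    return sorted(job_timings.items(), key=lambda kv: kv[1], reverse=True)[:top_n]
```

Focusing effort on the handful of longest-running mappings is usually what moves the overall refresh window, rather than micro-tuning every job.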

Technologies: ORACLE 9i 2.1.0 & 10g 10.1.2, Informatica Power Center 7.1.2 & 8.1, HP 9000, AIX 5.2, PL/SQL, Unix Shell, Perl, TOAD.

Consultant

Confidential

Responsibilities:

  • Participated in creating the Data Warehouse architecture for E-trade.
  • Established procedures pertaining to database design, security, and maintenance.
  • Built the infrastructure and the star schema for the Accounting Data Mart.
  • Implemented the Logical/Physical Data Model.
  • Defined and built the Logical and Physical stages for Data Transformation.
  • Designed and created the summary tables.
  • Developed ETL mappings in Power Center 7.1.2.
  • Performed data mappings from source systems such as Siebel to new target structures on the database platform.
  • Designed, developed and maintained Power Center workflow load processes to data warehouse.
  • Designed, wrote, and implemented procedures that controlled Data Warehouse refresh strategies.
  • Wrote shell programs for the automation of Data Warehouse Autosys Job Control.
  • Designed and implemented utilities, which automated several DBA functions and maintenance activities including setting up alerts and collection of important database health indicators.
  • Generated web reports using Shell scripts and HTML.
  • Monitored activities related to data growth, performance and availability.
  • Modified and Maintained existing Java programs.
  • Migrated a Sybase Database to an Oracle Database.
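
The script-generated web reports mentioned above (shell plus HTML) amount to rendering collected database health indicators as a static page. A minimal Python sketch of the same idea, with hypothetical metric names:

```python
def html_report(title, rows):
    """Render a list of (metric, value) pairs as a minimal HTML table,
    in the spirit of the script-generated web reports described above."""
    body = "\n".join(
        f"<tr><td>{name}</td><td>{value}</td></tr>" for name, value in rows
    )
    return (
        f"<html><head><title>{title}</title></head><body>"
        f"<table><tr><th>Metric</th><th>Value</th></tr>\n{body}\n</table>"
        "</body></html>"
    )
```

Writing the output to a file served by the web server gives operators a zero-dependency dashboard that any browser can read.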

Technologies: ORACLE 9i 2.1.0 & 10g 1.0.2, Informatica Power Center 7.1.2, Sun Solaris 8.1, Red Hat Linux AS release 3, PL/SQL, Unix Shell, TOAD, Erwin 4.5, SQL Server 2000.

Data Warehouse DBA

Confidential

Responsibilities:

  • Hand-coded ETL procedures and transformations in Oracle PL/SQL.
  • Elicited and documented business requirements from the user community.
  • Modified existing data models to accommodate new business requirements.
  • Planned and implemented Oracle 9i software installations, migration, upgrades, and patches.
  • Merged several databases into one database.
  • Performed system testing and assisted users in User Acceptance Testing.
  • Created test plans.
  • Implemented methods to ensure data integrity.
  • Performed database sizing and capacity planning.
  • Performed DBA tasks such as disk layout architecture, performance tuning, and backups.
  • Assisted developers with issue resolution and patch/fix identification and implementation.
  • Helped in Deploying of major software releases, feature enhancements, and bug fixes.
  • Implemented Oracle Data Guard (Standby Database) between the production and batch reporting systems.
  • Implemented data replication and Materialized Views between the production and Real-time reporting systems.
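
The replication to the real-time reporting system described above boils down to applying a captured change log to the reporting-side copy, much like an incremental materialized-view refresh. A simplified sketch under that assumption, with the change-log shape invented for illustration:

```python
def apply_changes(reporting_table, change_log):
    """Apply a change log captured on the production side to a
    reporting-side copy keyed by primary key: inserts and updates
    upsert the row, deletes remove it."""
    for change in change_log:
        if change["op"] in ("insert", "update"):
            reporting_table[change["pk"]] = change["row"]
        elif change["op"] == "delete":
            reporting_table.pop(change["pk"], None)
    return reporting_table
```

Shipping only changed rows keeps the reporting copy near-real-time without the cost of a full refresh.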

Technologies: Informatica 7.0.1, ORACLE 9i 2.1.0, Oracle OEM, Sun Solaris 8, HP 9000, Windows 2000, Apache Web Server, TOAD, Erwin 4.5, PL/SQL, Unix Shell.

Data Warehouse Architect

Confidential

Responsibilities:

  • Met with the team on a weekly basis and discussed progress, risks, and potential pitfalls.
  • Supervised and performed quality review of junior-level DBAs' work to ensure the success of all operations.
  • Managed all aspects of the technical implementation.
  • Worked closely with development and support staff, customers, business subject matter experts and business partners to gather requirements and translate them into technical designs.
  • Assisted in translating business requirements to report specifications.
  • Took the lead in instilling best practices into our development environment.
  • Established procedures and guidelines for the design, development, and administration of the warehouse.
  • Integrated IDX and MUMPS data with other medical data.
  • Wrote most of the code for the initial Data Warehouse population, and the monthly incremental refreshes and updates.
  • Collaborated in the review, selection, procurement, usage and maintenance of internally developed applications as well as purchased applications, particularly focusing on data and database administration functions as they relate to the associated application/project.
  • Ensured that IT business solutions meet the IT Strategy (i.e., making sure that general architectural issues are satisfied).
  • Mentored junior team members and enabled them to gain proficiency in Database Administration and Development.
  • Set standards and common conventions for the other Database Administrators in the team.

Technologies: SQL Server 2000, Windows 2000, Erwin, Perl, MS SQL DTS, Transact-SQL
