
Cloud Engineer/Data Architect Resume


Ashburn, VA

SUMMARY

  • Recognized as an Extraordinary Achiever; awarded Outstanding Achievement and High Impact Awards for sustained, long-term accomplishments exceeding expectations, which improved business processes, saved time and money, and increased productivity, efficiency and performance.
  • Led digital transformation of infrastructure, technology and data architectures at Confidential, Cummins, Confidential, OSSE, Freddie Mac, Coca-Cola, USPTO, Inovalon, CareFirst BCBS, Otsuka Pharmaceuticals, Ogilvy and Mather and George Mason University, improving business processes and procedures to reduce costs, increase agility and efficiency, and improve performance and productivity.
  • Completed AWS Solutions Architect, Data Science with Python (PySpark) and Python Bootcamp certifications from Udemy. Certified by Oracle as an Oracle Certified Professional (OCP) and Oracle 11g SQL Expert.
  • Proven, verified, hands-on technical knowledge and skill in various IT subject areas including Cloud Computing Platforms (AWS, Azure, and Salesforce), Data Science, Big Data, Databases (RDBMS, NoSQL, Graph, Appliances, and Datawarehouse), Data Architecture, Data Integration and Data Analytics and Visualization.
  • Provided thought leadership, guidance and mentoring to teams, delivering creative, innovative and practical solutions to complex problems through the identification, research and evaluation of software tools and talent.
  • IT professional with 18+ years of experience in data and database architecture, development and administration; software development (web and mobile); data integration; business intelligence; data visualization and reporting; data science; and cloud computing using Amazon Web Services (AWS), Azure and Salesforce.
  • Solid background, knowledge and hands on experience on AWS including Compute, Storage, Databases, Networking, Security, Identity and Access Management, Artificial Intelligence, Internet of Things (IoT), Risk and Compliance, Analytics, Mobile Services, Content Delivery, Messaging, Developer and Management Tools.
  • Experience and hands on expertise on AWS including EC2, S3, RDS, VPC, IAM, EMR (Hadoop and Spark Clusters), Route 53, CloudFront, CloudWatch, CloudTrail, Machine Learning, Redshift, Athena, Glue, SNS, SQS, SES, Database Migration Services.
  • Expert-level experience using Redshift for petabyte-scale data warehouses and Elastic MapReduce (EMR) for big data processing with MapReduce and Spark using Hadoop, Hive, HiveQL, Spark SQL and Python (PySpark).
  • Hands on Data Science experience with supervised and unsupervised machine learning (ML) algorithms (MLlib) using Spark and Python for big data with PySpark using Jupyter Notebooks and the Databricks platform. Experience working with Classification, Segmentation and Association ML algorithms using MLlib.
  • Experience in architecture, design and implementation of very large scale data architecture projects involving data acquisition, integration and ingestion. Exceptional knowledge on Operational Data Stores (ODS), Enterprise Data Warehouse (EDW), Master Data Management (MDM), Data Lake and data distribution across the enterprise.
  • Experience with relational and dimensional database design and data modeling techniques (normal forms, Star/Snowflake schemas) using conceptual, logical and physical data models. Knowledge and experience with the Inmon and Kimball approaches to data warehouse and data mart design.
  • Expert-level experience with multi-tenant RDBMS architecture, design, development and administration, including performance tuning, partitioning, advanced compression techniques, and backup, restore and recovery. Working knowledge of database/data warehouse appliances such as Netezza and Exadata.
  • Very strong experience with data modeling tools; ER/Studio, ERwin, IBM Data Architect, Visio, MySQL workbench. RDBMS skills using Oracle, Microsoft SQL Server, PostgreSQL and MySQL on Linux, Solaris, Ubuntu and Windows.
  • Implementation, design and development experience on Talend Suite of products including Data Integration (DI), Big Data, Data Quality, Data Preparation, Master Data Management (MDM), Enterprise Service Bus (ESB), Talend Open Studio and Talend Administration Center.
  • Excellent knowledge, hands on experience with Data Visualization tools using Tableau, Microstrategy and AWS QuickSight.
  • Working knowledge of Continuous Integration and Continuous Delivery (CI/CD) tools including Jenkins, Ansible and Puppet.
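As one illustration of the dimensional (Star schema) modeling noted above, a minimal sketch using SQLite; the fact and dimension tables, column names and sample data are all hypothetical:

```python
import sqlite3

# Toy sales star schema: one fact table joined to two dimension tables.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, full_date TEXT, year INTEGER);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, product_name TEXT, category TEXT);
CREATE TABLE fact_sales (
    date_key INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    quantity INTEGER,
    amount REAL);
INSERT INTO dim_date VALUES (20240101, '2024-01-01', 2024);
INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware');
INSERT INTO fact_sales VALUES (20240101, 1, 3, 29.97);
""")

# Typical star-schema query: aggregate the fact table by dimension attributes.
row = cur.execute("""
SELECT d.year, p.category, SUM(f.amount)
FROM fact_sales f
JOIN dim_date d ON f.date_key = d.date_key
JOIN dim_product p ON f.product_key = p.product_key
GROUP BY d.year, p.category
""").fetchone()
print(row)  # (2024, 'Hardware', 29.97)
```

The same shape scales up directly: in a warehouse such as Redshift, the fact table holds the high-volume measures while the small dimension tables carry the descriptive attributes used for grouping and filtering.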

TECHNICAL SKILLS

Languages: SQL, PL/SQL, T-SQL, C, C++, C#, Java, Python, Shell Scripting, PowerShell

Databases: Redshift, Oracle, MySQL, Microsoft SQL Server, PostgreSQL, Aurora, DynamoDB, DB2, HBase, Cassandra, Netezza, Exadata, Neo4j, Snowflake, Microsoft Access

Database Tools: Toad, OEM Grid, Database and Cloud Control, Oracle SQL Developer, Microsoft SQL Server Management Studio, SQL*Plus, DBSchema, SQL Workbench/J, JDBC, ODBC, Database as a Service (DBaaS), Aginity Workbench, pgAdmin 4, CloudBerry, IBM Data Studio, DBArtisan

Data Modeling: ER/Studio Data Architect, IBM Data Architect, CA ERwin, Toad Data Modeler, MySQL Workbench, Power Designer

ETL/ELT: Talend Data Integration, Data Preparation, Master Data Management, Data Quality, Profiling, Big Data, Enterprise Service Bus (ESB), Talend Open Studio, Talend Administration Center, Alteryx, Microsoft SSIS

Data Visualization and Reporting: Tableau Desktop, QuickSight, Microstrategy, Business Objects, Microsoft Excel

Cloud Computing: Amazon Web Services (AWS), EC2, S3, RDS, VPC, EMR, SNS, IAM, Redshift, CloudWatch, CloudTrail, SQS, SageMaker, Comprehend, TensorFlow, Lex, Polly, Rekognition, Route53, CloudFront, CloudSearch, Athena, Kinesis, QuickSight, Lambda, DevOps, Data Pipeline, Simple Email Service (SES), IoT, Direct Connect, Elastic Load Balancer, CodeCommit, CodeDeploy, CodeBuild, Machine Learning (ML), Auto Scaling, EBS, EFS, Glue, Elasticsearch, Database Migration Service

Big Data: Elastic MapReduce (EMR), HDInsight, Hadoop Distributed File System (HDFS), MapReduce, Hortonworks Data Platform (HDP), Cloudera (CDH), Spark, Spark SQL, Pig, Hive, HiveQL, Athena, Presto, Impala, Kafka, Sqoop, Zeppelin

Software Packages: Visual Studio Team Foundation Server, Database Projects, Data Compare, Schema Compare, Microsoft Project, Microsoft Visio, SPSS, MS Office

CI/CD: Jenkins, Ansible, Puppet

Others: Source Control, SVN, Git, GitHub, TortoiseSVN, VisualSVN, VPN, Google Apps (Docs, Email, Calendar, Hangouts), WebEx, SSH Secure Shell, WinSCP, PuTTY, Apache Tomcat, JBoss, Oracle VM VirtualBox, Parallels Desktop, SharePoint, Adobe Connect, SOA, Web Services, REST, JSON, XML, XSD, XPath, CrossFTP

PROFESSIONAL EXPERIENCE

Cloud Engineer/Data Architect

Confidential, Ashburn, VA

Responsibilities:

  • Launched EC2 instances; created and configured Amazon Machine Images (AMIs), volumes and snapshots, security groups, Elastic IPs, placement groups, key pairs and bootstrap scripts. Configured highly available, fault-tolerant and scalable services including Elastic Load Balancers (ELB) and Auto Scaling groups.
  • Created S3 buckets, configured versioning, logging, access control lists and object lifecycle management using S3, S3-Infrequent Access, Reduced Redundancy Storage and Glacier.
  • Launched Amazon RDS relational database instances including Oracle, SQL Server, MySQL and Aurora, as well as NoSQL and data warehouse databases such as DynamoDB and Redshift.
  • Configured and launched Hadoop and Spark clusters on Amazon EMR, and used Spark on the Databricks platform with Jupyter Notebooks for data science development using Python (MLlib) and PySpark on an AWS Ubuntu EC2 instance. Used VirtualBox virtualization with Ubuntu for local Spark installation and development, and a MySQL-backed Hive metastore with HiveQL, Spark SQL and PySpark for analysis and query execution.
  • Created virtual private clouds (VPCs), public and private subnets, network address translation (NAT) instances and NAT gateways. Configured route tables, network access control lists (ACLs), security groups and Internet Gateways for systems security and networking.
  • Created Users, Groups, Roles and configured Policy and Multi-Factor Authentication for the root account using Identity and Access Management (IAM) on AWS.

Cloud Architect

Confidential, New York, NY

Responsibilities:

  • Elite member of the Cloud Readiness Center of Excellence, an innovative approach to digital transformation from on-premises to the cloud, including AWS and Azure.
  • Conceptualized, designed, developed and demoed Proofs of Concept (POCs), Minimum Viable Products (MVPs) and pilots to introduce the latest technology and tools on the AWS and Azure cloud platforms. Delivered multiple POCs demonstrating the Kafka messaging platform on AWS and Azure.
  • Tech Lead for Data Architecture and Data Modeling, transforming and recreating the current-state data architecture into a distributed future state.
  • Analyzed a very complex problem by creating more than 100K lines of code to analyze and profile multiple distributed DB2 databases, identifying the database objects used by each application, and created a mapping between the UI elements and the database.
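Profiling work of the kind described above typically starts from the DB2 system catalog. A small, hedged sketch of one such helper (the function name and schema names are hypothetical; `SYSCAT.COLUMNS` is DB2's standard catalog view of table columns):

```python
# Hypothetical profiling helper: build a DB2 catalog query that inventories
# every column in a given set of schemas, as a first step toward mapping
# application/UI elements to database objects.
def column_inventory_query(schemas):
    placeholders = ", ".join(f"'{s.upper()}'" for s in schemas)
    return (
        "SELECT TABSCHEMA, TABNAME, COLNAME, TYPENAME "
        "FROM SYSCAT.COLUMNS "
        f"WHERE TABSCHEMA IN ({placeholders}) "
        "ORDER BY TABSCHEMA, TABNAME, COLNO"
    )

# Made-up schema names, for illustration only.
sql = column_inventory_query(["claims", "billing"])
print(sql)
```

The resulting statement would be executed over a JDBC/ODBC connection against each distributed database, and the inventories compared against the objects each application actually references.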

Software Architect

Confidential, Indianapolis, IN

Responsibilities:

  • Provided thought leadership, guidance and mentoring to team members, creating innovative and practical solutions to complex problems through identification of the right tools and talent.
  • Led teams in the research, evaluation and selection of rapid application development, software, technology and infrastructure tools and standards.
  • Executed multiple Proofs of Concept on PaaS, SaaS and IaaS cloud computing platforms, including Amazon Web Services and Salesforce CRM.
  • AWS Solutions Architect instrumental in the architecture, design and development of innovative Internet of Things (IoT) big data analytical products on Amazon Web Services, using S3 as the data lake, AWS Glue, Athena for querying and calculating metrics and KPIs, and QuickSight for data analytics, insights and visualization.
  • Data Architect responsible for creating the relational data model using the data modeling tools ER/Studio and ERwin for forward and reverse engineering. Data Architect on the Salesforce CRM platform, designing the data model for the iService application using the Schema Builder.
  • Installation and Administration of PostgreSQL, MySQL databases and database tools on local and AWS Cloud Computing platforms to execute the physical data models.
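The "forward engineering" step mentioned above (logical model to physical DDL) can be sketched in miniature. This is a toy illustration only; the entity, columns and types are hypothetical, and the real work used ER/Studio and ERwin rather than hand-rolled code:

```python
# Hypothetical logical model: table name -> list of (column, physical type).
MODEL = {
    "customer": [
        ("customer_id", "INTEGER PRIMARY KEY"),
        ("name", "VARCHAR(100) NOT NULL"),
        ("email", "VARCHAR(255)"),
    ],
}

def to_ddl(model):
    """Forward-engineer the model into CREATE TABLE statements."""
    statements = []
    for table, columns in model.items():
        cols = ",\n  ".join(f"{name} {dtype}" for name, dtype in columns)
        statements.append(f"CREATE TABLE {table} (\n  {cols}\n);")
    return "\n".join(statements)

ddl = to_ddl(MODEL)
print(ddl)
```

Reverse engineering is the same mapping run the other way: reading an existing database's catalog and reconstructing the model from it.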

Big Data ETL Architect

Confidential, Silver Spring, MD

Responsibilities:

  • Data Integration (DI) Technical Lead and Architect for the Audience Precision Platform, a data and predictive analytics application which builds custom models to allow advertisers to target customers based on their purchasing habits through digital and cable advertisements.
  • Provided leadership, guidance and mentorship on using Talend for Big Data, Talend Administration Center, Amazon Web Services EC2, Redshift, Redshift Spectrum, Oracle Exadata Database Appliance, Tableau and Microstrategy for Data Visualization and Simple Storage Service (S3) as Data Lake.
  • Designed, built, automated, scheduled, monitored and executed complex data integration jobs using Talend for Big Data, moving data from AWS Redshift into the target Exadata at volumes greater than a billion records.
  • Executed multiple deployments to production; created DDL and alter scripts, detailed deployment instructions and metrics. Monitored and provided proactive support for production systems, reviewing client feedback and resolving issues in a timely manner.
  • Executed a Proof of Concept using AWS Glue, S3, Redshift, RDS, IAM, CloudWatch and EMR. With S3 as the data lake, used Glue crawlers and classifiers to create AWS Glue Data Catalog databases.
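A Glue crawler of the kind described above is defined by a small configuration. This sketch uses made-up names, role ARN and S3 path; the dict matches the parameters of boto3's `glue_client.create_crawler(**crawler)`:

```python
# Hypothetical Glue crawler definition: crawl an S3 data-lake prefix nightly
# and register the discovered tables in a Glue Data Catalog database.
crawler = {
    "Name": "sales-s3-crawler",                               # made-up name
    "Role": "arn:aws:iam::123456789012:role/GlueCrawlerRole", # made-up role ARN
    "DatabaseName": "sales_lake",                             # target catalog database
    "Targets": {"S3Targets": [{"Path": "s3://example-data-lake/sales/"}]},
    "Schedule": "cron(0 2 * * ? *)",                          # nightly at 02:00 UTC
}
print(crawler["DatabaseName"])  # sales_lake
```

Once the crawler has populated the catalog, the same tables are immediately queryable from Athena and usable as sources in Glue ETL jobs.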
