
Sr. Cloud Architect Resume

Boston, MA

SUMMARY:

  • DevOps Engineer experienced in optimization, with an understanding of the melding of operations and development needed to deliver code to customers quickly. Experienced with cloud and monitoring processes as well as DevOps development on Windows, Mac, and Linux systems
  • Provisioned infrastructure with Terraform, with an emphasis on DevOps implementations
  • Worked within the Cloud for integration processes.
  • Performed DevOps for Linux, Mac, and Windows platforms.
  • Focused on automation and integration.
  • Monitored developed applications and fixed bugs.
  • Wrote code and designed continual updates.
  • Completed load and performance testing of extremely complex systems
  • Cloud Architect responsibilities go hand in hand with Big Data / Hadoop activities, where I am extremely hands-on as an Administrator (Cloudera / Hortonworks), Data Scientist, Hadoop Architect (CDH 5), Hadoop Security Specialist (Kerberos / Sentry / ACL), Cloud Architect (AWS), Cloud Security Professional (CCSP), and on the Google Cloud Platform (GCP) and Microsoft Azure platforms
  • Together, these form a multi-dimensional skill matrix that lets me architect, build, and execute enterprise-level solutions with minimal dependency on other technical resources.
  • 25+ years in the IT industry across various verticals, with an emphasis on data science and the certified skills below:
  • Hadoop Architecture related:
  • Cloudera Hadoop / Hortonworks / Data Science Analysis / Data Visualization skills / Amazon Web Services (AWS) Architect / Windows Azure Architect / Cloud Security (CCSP) / Google Cloud Platform
  • Working on a POC with Google Cloud Platform with features included:
  • Stackdriver (Monitoring / Logging / Error Reporting), App Engine, Compute Engine, Container and Networking, storage (Bigtable / Cloud SQL), API Manager, Cloud Launcher, and IAM & Admin activities; BigQuery, Dataproc, Dataflow, and Genomics for the Big Data side of data migration activities.
  • Familiar with Hortonworks, MapR with Lambda Architecture, IBM BigInsights environments
  • Over 5 years of experience in Big Data, Hadoop, HDFS, HBase, Hive, Pig, and Linux, with hands-on project experience in various vertical applications.
  • Expertise in HDFS Architecture and Cluster concepts.
  • Expertise in Hadoop Security and Hive Security.
  • Expertise in Hive Query Language and debugging hive issues.
  • Expertise in Sqoop and Flume.
  • Worked with Kafka messaging services; familiar with related messaging tools such as RabbitMQ.
  • Involved in implementation of a Hadoop multi-node cluster: installed Hadoop ecosystem software and configured HDFS.
  • Worked in a multi-clustered environment setting up the Cloudera Hadoop ecosystem, creating jobs to move data from RDBMS to HDFS and from HDFS back to RDBMS.
  • Experience in Big Data and Hadoop architecting.
  • Worked on Hadoop environment (HDFS) setup, MapReduce jobs, Hive, HBase, Pig, and NoSQL databases such as MongoDB
  • Software installation and configuration
  • Built automation and internal tools for Hadoop jobs.
  • Tableau 9.0 / Tableau 8.1, SQL Server 2008R2 & 2012, Excel, Access, Google Stack
  • Worked with all kinds of data sources (TDE, TDS, extracts, and live connections) using HBase, Hadoop HDFS, and Greenplum (GPDB), including data blends and joins with both relational and multi-dimensionally modeled data sources across heterogeneous databases.
  • Experience designing complex dashboards that take advantage of all Tableau functions, including data blending
  • Strong experience writing complex SQL and troubleshooting and tuning SQL for the best performance
  • Ability to drive insight by designing visualizations with logical and meaningful data flow
  • Experience doing full life cycle development, including business requirements, technical analysis and design, coding, testing, documentation, implementation, and maintenance
  • Experience implementing data visualization solutions over Hadoop, using actions and parameters
  • Big Data (MapReduce, Impala, Hive, etc.), JIRA project management suite
  • Data Modeling / Data Architecture
  • Multiple Reporting Structures / Dashboards
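As an illustration of the MapReduce experience listed above, a minimal word-count sketch in plain Python (the canonical Hadoop example; the function names are illustrative, not from any specific project):

```python
from collections import defaultdict

def map_phase(lines):
    # Map: emit a (word, 1) pair for every word in every input line.
    for line in lines:
        for word in line.lower().split():
            yield word, 1

def shuffle(pairs):
    # Shuffle: group all values by key, as Hadoop does between map and reduce.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: sum the counts for each word.
    return {word: sum(counts) for word, counts in groups.items()}

lines = ["big data big cluster", "data pipeline"]
counts = reduce_phase(shuffle(map_phase(lines)))
print(counts["big"])   # 2
print(counts["data"])  # 2
```

In a real Hadoop job the same three phases run distributed across the cluster, with the shuffle handled by the framework.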

TECHNICAL SKILLS:

Hadoop Security related skills: Apache Sentry / Apache Knox / Apache Argus (Hortonworks)

BI Dashboarding skills: Tableau 9.3 / Tibco Spotfire / Qlikview / Sisense BI / Kibana / Splunk / Watson Analytics / Pentaho / SSRS / OBIEE / Microstrategy

ETL Expertise: Informatica BDE 9.6 / Ab Initio / SSIS / Spark / ODI / DataStage

Databases / RDBMS: NoSQL, columnar DB, unstructured data, and SQL-related skills: Oracle, Solr, HBase, MongoDB, Cassandra, Greenplum (GPDB)

PROFESSIONAL EXPERIENCE:

Confidential, Boston, MA

Sr. Cloud Architect

Responsibilities:

  • Worked as a Cloud Engineer using Infrastructure as Code (IaC), which brings a number of immediate benefits to the deployment workflow:
  • Speed: automation replaces manually navigating through an interface to deploy and connect resources.
  • Reliability: across a large set of infrastructure, resources are configured exactly as declared, and implicit/explicit dependencies can be used to ensure the creation order.
  • Ease of experimentation: because infrastructure deploys so easily, experimental changes can be investigated with scaled-down resources to minimize cost, then scaled up for production deployments.
  • As a developer, I always look to employ known software engineering best practices wherever I can. Writing code to design and deploy infrastructure brings these practices to cloud provisioning: modular, configurable code committed to version control leads us to view our infrastructure as a software application in itself, and shifts us in the direction of a DevOps culture.
  • Building apps using Azure
  • Azure platform analytics and native AI
  • Building Azure cognitive APIs
  • Azure Data Factory / Databricks
  • Azure Data Warehouse / Service Bus, with automated administration of the infrastructure that runs the code
  • Automatic scaling
  • Orchestrate multiple functions
  • Azure Functions development, billed on requests served and compute time
  • Continuous Delivery using Spinnaker platform
  • Worked with Spinnaker cloud deployment tool to support Google Cloud along with Azure
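The creation-order guarantee mentioned above can be sketched as a topological sort over declared resource dependencies (the resource names here are hypothetical; tools like Terraform build the same kind of graph from the configuration):

```python
from graphlib import TopologicalSorter

# Declared resources mapped to their explicit dependencies (hypothetical names).
resources = {
    "vnet": [],                   # the network has no prerequisites
    "subnet": ["vnet"],           # the subnet depends on the network
    "storage": [],                # storage is independent
    "vm": ["subnet", "storage"],  # the VM needs both subnet and storage
}

# static_order() yields an order in which every dependency
# precedes the resource that declares it.
order = list(TopologicalSorter(resources).static_order())
print(order)
```

Whatever exact order is emitted, `vnet` always precedes `subnet`, and both `subnet` and `storage` precede `vm`, which is the point of declaring dependencies rather than sequencing steps by hand.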

Environment: Azure Data Factory, DevOps, Databricks, Terraform, Spinnaker, Hortonworks Hadoop, Cassandra, Azure Cloud Platform, PCF, Kafka, Flume, Splunk 6.2, DB2, Teradata, SQL Server, SQL, PL/SQL

Confidential

Hadoop / Cloud Architect

Responsibilities:

  • As a developer, I look to employ known software engineering best practices wherever I can. Writing code to design and deploy infrastructure brings these practices to cloud provisioning: modular, configurable code committed to version control leads us to view our infrastructure as a software application in itself, and shifts us in the direction of a DevOps culture, with automated administration of the infrastructure that runs the code
  • Automatic scaling
  • Orchestrate multiple functions
  • Functions-related development, billed on requests served and compute time
  • Continuous Delivery using Spinnaker platform
  • Worked with Spinnaker cloud deployment tool to support Google Cloud along with Azure

Environment: Spark/Scala, Hortonworks Hadoop, Cassandra, Azure Cloud Platform, PCF, Kafka, Flume, Splunk 6.2, DB2, Teradata, SQL Server, SQL, PL/SQL

Confidential, Cary, NC

Spark/Scala Architect

Responsibilities:

  • Automated administration of the infrastructure that runs the code
  • Designed solution for various system components using Microsoft Azure
  • Created Solution Architecture based upon PaaS Services
  • Created Web API methods for three adapters to pull data from systems such as Database, BizTalk, and SAP
  • Configured and set up a hybrid cluster to pull data from SAP systems
  • Orchestrate multiple functions
  • Custom Functions development, billed on requests served and compute time
  • Worked with Spinnaker cloud deployment tool to support Google Cloud along with AWS and other cloud
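The three data-pull adapters described above can be sketched behind a common interface (class and method names are hypothetical stand-ins; real adapters would wrap the actual Database, BizTalk, and SAP connectors):

```python
from abc import ABC, abstractmethod

class SourceAdapter(ABC):
    # Common contract every adapter implements, so the Web API layer
    # can pull from any backend the same way.
    @abstractmethod
    def fetch(self) -> list[dict]:
        ...

class DatabaseAdapter(SourceAdapter):
    def fetch(self) -> list[dict]:
        # Stub standing in for a SQL query against the source database.
        return [{"source": "database", "id": 1}]

class SapAdapter(SourceAdapter):
    def fetch(self) -> list[dict]:
        # Stub standing in for an RFC/BAPI call into SAP.
        return [{"source": "sap", "id": 2}]

def pull_all(adapters: list[SourceAdapter]) -> list[dict]:
    # The API method iterates adapters uniformly, unaware of each backend.
    return [row for adapter in adapters for row in adapter.fetch()]

rows = pull_all([DatabaseAdapter(), SapAdapter()])
print(len(rows))  # 2
```

Adding a third backend (e.g. BizTalk) is then just one more subclass; the API surface does not change.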

Environment: Spark/Scala, Kafka (real-time data processing), Hortonworks Hadoop, Cassandra, Azure Cloud Platform, PCF, Flume, Splunk 6.2, DB2, Teradata, SQL Server, SQL, PL/SQL

Confidential, Sunnyvale, CA

Cloud Architect (GCP) / Hadoop Architect

Responsibilities:

  • Google Cloud Admin / Architect using Google Cloud Platform:
  • Understood the various source systems; architected, designed, and developed each component of the architecture.
  • Extensively worked on Google Cloud Platform components (BigQuery, Bigtable, Google Cloud Storage), writing scripts in Cloud Shell as well as working through the GUI interface.
  • Worked with migration of data from on-premises to cloud.
  • Developed a JSON-Kafka-GCS-BigQuery (BQ) pipeline using GCP Pub/Sub and Dataflow.
  • Worked closely with domain experts and data scientists from other groups to identify their requirements and blend them into a common model, leveraging it to avoid redundant processes in providing the data.
  • Collaborated with other members of the practice to leverage and share knowledge that helped the implementation, using Unix shell scripting.
  • Worked with Google CloudML libraries, deployed models to predictive analytics.
  • Worked with TensorFlow and Cloud Machine Learning Engine managed infrastructure.
  • Trained machine learning models at scale
  • Hosted trained models to make predictions on cloud data.
  • Understood the data and modeled Cloud ML Engine features
  • Worked with Deep Learning / machine learning applications.
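The transform step of the JSON-Kafka-GCS-BigQuery pipeline above can be sketched in plain Python (the field names are hypothetical; in the real pipeline, Dataflow would apply a function like this to each Pub/Sub message before the BigQuery load):

```python
import json

def transform(message: bytes) -> dict:
    # Parse a JSON event and reshape it into a flat, BigQuery-friendly row.
    event = json.loads(message)
    return {
        "user_id": event["user"]["id"],       # flatten the nested field
        "event_type": event["type"].lower(),  # normalize casing
    }

# A sample message as it might arrive off the subscription.
raw = b'{"user": {"id": 42}, "type": "CLICK"}'
row = transform(raw)
print(row)  # {'user_id': 42, 'event_type': 'click'}
```

Keeping the transform a pure function of one message makes it easy to unit-test outside the pipeline and to run at scale inside it.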

Environment: Google Cloud Platform (GCP), Google BigQuery, Pub/Sub (Kafka), Google Dataflow (Spark), Bigtable, Cloud SQL

Confidential, Atlanta, GA

Cloud Architect and Security Consultant

Responsibilities:

  • Roles: MS Azure Architect / Azure Cloud IaaS Admin / PaaS Lead
  • Designed an Azure based solution using Web APIs, SQL Azure
  • Architect on Windows Azure and designing / implementing solutions.
  • Worked on components of Azure such as Service Bus (Topics, Queues, Notification Hubs), Blobs
  • Administered environments and applied technical design methodologies
  • Table Storage
  • Active Directory
  • Web / Worker Roles / Web Sites
  • ACS / Azure Diagnostics and Monitoring
  • Multi-tenancy
  • SaaS / SQL Azure
  • SQL Reporting
  • IaaS Deployment
  • PowerShell, etc.
  • Market Trends in DevOps
  • Worked with Delivery pipeline in DevOps and the ecosystem
  • DevOps Security options and notification management in Jenkins.
  • Well versed with GIT and Continuous Integration via Jenkins.
  • Worked with Containers and VMs
  • Image and Containers in Docker / Networking
  • Best practice implementation using Docker Volume
  • Specialized in Virtualization using Docker.
  • Master-Agent Architecture
  • Catalog Compilation in Puppet.
  • Worked hands-on with Puppet programs using the Puppet DSL along with third-party tools
  • Node classification using Hiera and an ENC
  • Worked with Puppet environment structure and configuration, puppet classes / puppet templates
  • Designed and implemented environment creation using Chef, Puppet, Nexus and Nolio.
  • Automated the Linux/cloud infrastructure with Chef, Python, and Bash scripts.
  • Configured and managed Production and Dev/QA Chef Servers
  • Configuration management tool experience with Chef
  • Worked with automated clusters of containers with Kubernetes Manager
  • Integrated Jenkins, Docker, and Puppet
  • Monitored systems and components using Nagios
  • Involved in setup, configuration and management of security for Hadoop clusters using Kerberos and integration with LDAP/AD at an Enterprise level
  • Developed the design for data migration from one cluster to another using DistCp.
  • Responsible for scheduling jobs in Hadoop using the FIFO, Fair, and Capacity schedulers
  • Possess good Linux and Hadoop System Administration skills, networking, shell scripting and familiarity with open source configuration management and deployment tools such as Puppet or Ansible.
  • Built data platforms, pipelines, and storage systems using Apache Kafka, Apache Storm, and search technologies such as Elasticsearch.
  • Worked with Pivotal Cloud Foundry (PCF) CLI for deploying applications and other PCF management activities.
  • Deployed apps in PCF, orchestrating objects that interact with other components to run ETL jobs in a secured mode
  • Worked with Azure Information Protection (AIP), a robust and structured Microsoft Azure service along similar lines to AD, but enforcing security at the more granular level of documents, components, and services.
  • The AIP services integrate with the cloud application model: permissions are passed down a parent-to-child hierarchy of application components, dependent on role-based security.
  • Azure Information Protection integrates with end users' existing workflows when the Azure Information Protection client is installed; actions create a corresponding Rights Management template, which can additionally be used with applications and services that support Azure Rights Management.

Environment: Azure Cloud Platform, DevOps, Terraform, Hortonworks Hadoop, PCF, Kafka, Flume, Splunk 6.2, DB2, Teradata, SQL Server, SQL, PL/SQL
