
Sr. Cloud Architect Resume


SUMMARY:

  • DevOps Engineer experienced in optimization who understands the melding of operations and development to deliver code to customers quickly. Experienced with the cloud and monitoring processes as well as DevOps development on Windows, macOS, and Linux systems
  • Provisioned Terraform-based infrastructure as part of DevOps implementations
  • Worked within the Cloud for integration processes.
  • Performed DevOps for Linux, macOS, and Windows platforms.
  • Focused on automation and integration.
  • Monitored developed applications and fixed bugs.
  • Wrote code and designed continual updates.
  • Completed load and performance testing of extremely complex systems
  • Cloud Architect responsibilities go hand in hand with Big Data / Hadoop activities, which I am extremely hands-on with as an Administrator (Cloudera / Hortonworks), Data Scientist, Hadoop Architect (CDH 5), Hadoop Security Specialist (Kerberos / Sentry / ACL), Cloud Architect (AWS), Cloud Security Professional (CCSP), and on GCP (Confidential Cloud Platform) and the Microsoft Azure platform
  • With all of the above skills, I bring a multi-dimensional skill set and the expertise to architect, build, and execute enterprise-level solutions with minimal dependency on other technical resources.
  • 25+ years in the IT industry across various verticals, with data-science-related skills emphasized alongside the certified skills below:
  • Cloudera Hadoop / Hortonworks / Data Science Analysis / Data Visualization skills / Amazon Web Services (AWS) Architect / Windows Azure Architect / Cloud Security (CCSP) / Confidential Cloud Platform
  • Working on a POC with Confidential Cloud Platform, with features including:
  • Stackdriver (Monitoring / Logging / Error Reporting), App Engine, Compute Engine, Containers and Networking, storage (Bigtable / Cloud SQL), API Manager, Cloud Launcher, and IAM & Admin activities; BigQuery, Dataproc, Dataflow, and Genomics on the Big Data side of the data migration activities.
  • Familiar with Hortonworks, MapR with Lambda Architecture, IBM BigInsights environments
  • Over 5 years of experience in Big Data, Hadoop, HDFS, HBase, Hive, Pig, and Linux with hands-on project experience in various vertical applications.
  • Expertise in HDFS Architecture and Cluster concepts.
  • Expertise in Hadoop Security and Hive Security.
  • Expertise in Hive Query Language and debugging hive issues.
  • Expertise in Sqoop and Flume.
  • Worked with Kafka messaging services, familiar with other related messaging tools - RabbitMQ.
  • Involved in implementation of a Hadoop multi-node cluster, installed Hadoop ecosystem software, and configured HDFS.
  • Worked on a multi-clustered environment, setting up the Cloudera Hadoop eco-system and creating jobs to move data from RDBMS to HDFS and from HDFS to RDBMS (see the Sqoop sketch after this list).
  • Experience in Big Data, Hadoop architecting.
  • Worked on Hadoop environment (HDFS) setup, MapReduce jobs, Hive, HBase, Pig, NoSQL, and MongoDB
  • Software installation and configuration
  • Built automation and internal tools for Hadoop jobs.
  • Tableau 9.0 / Tableau 8.1, SQL Server 2008R2 & 2012, Excel, Access, Confidential Stack
  • Worked with all kinds of Data Sources, TDE, TDS, Extracts, live connections using HBase, Hadoop HDFS, GPDB (Greenplum), data blends, joins with both relational model databases and multi-dimensional modeled data sources across heterogeneous databases.
  • Experience designing complex dashboards that take advantage of all Tableau functions, including data blending
  • Strong experience writing complex SQL and troubleshooting and tuning SQL for best performance
  • Ability to drive insight by designing visualizations with logical and meaningful data flow
  • Experience doing full life cycle development, including business requirements, technical analysis and design, coding, testing, documentation, implementation, and maintenance
  • Experience implementing data visualization solutions using Hadoop data sources, actions, and parameters
  • Big Data (Map Reduce, Impala, HIVE, etc.), JIRA project management suite
  • Data Modeling / Data Architecture
  • Multiple Reporting Structures / Dashboards
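
The RDBMS-to-HDFS jobs mentioned in the summary above are typically built with Sqoop; below is a minimal sketch of such a job, driven from Python purely for illustration. The JDBC URL, credentials file, and table names are hypothetical placeholders, not values from any engagement.

    import subprocess

    # Hypothetical connection details; replace with the real JDBC URL and table.
    JDBC_URL = "jdbc:mysql://db-host:3306/sales"
    TABLE = "orders"

    def sqoop_import():
        """Pull a relational table into HDFS with Sqoop (RDBMS -> HDFS)."""
        subprocess.run(
            [
                "sqoop", "import",
                "--connect", JDBC_URL,
                "--username", "etl_user",
                "--password-file", "/user/etl/.db_password",  # keep the password off the command line
                "--table", TABLE,
                "--target-dir", "/data/raw/orders",
                "--num-mappers", "4",
            ],
            check=True,
        )

    def sqoop_export():
        """Push processed HDFS data back to the RDBMS (HDFS -> RDBMS)."""
        subprocess.run(
            [
                "sqoop", "export",
                "--connect", JDBC_URL,
                "--username", "etl_user",
                "--password-file", "/user/etl/.db_password",
                "--table", TABLE + "_summary",
                "--export-dir", "/data/processed/orders_summary",
            ],
            check=True,
        )

    if __name__ == "__main__":
        sqoop_import()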

TECHNICAL SKILLS:

Hadoop Security related skills: Apache Sentry / Apache Knox / Apache Argus (Hortonworks)

BI Dashboarding skills: Tableau 9.3 / Tibco Spotfire / QlikView / Sisense BI / Kibana / Splunk / Watson Analytics / Pentaho / SSRS / OBIEE / MicroStrategy

ETL Expertise: Informatica BDE 9.6 / Ab Initio / SSIS / Spark / ODI / DataStage

Databases / RDBMS: NoSQL / columnar DB / unstructured data / SQL-related skills: Oracle, Solr, HBase, MongoDB, Cassandra, Greenplum (GPDB)

PROFESSIONAL EXPERIENCE:

Confidential

Sr. Cloud Architect

Responsibilities:

  • Worked as a Cloud Engineer using Infrastructure as Code (IaC); making IaC part of the deployment process has a number of immediate benefits to the workflow:
  • Speed - automating deployment is far faster than manually navigating through an interface to deploy and connect up resources
  • Reliability for large sets of infrastructure resources and provisioned services - with IaC the resources are configured exactly as declared, and implicit/explicit dependencies can be used to ensure the creation order.
  • With the ease at which the infrastructure can be deployed, experimental changes can be readily investigated with scaled down resources to minimize the cost and can be scaled up for production deployments.
  • As a developer, I always look to employ known software engineering best practices wherever we can. Writing code to design and deploy infrastructure facilitates this in the arena of cloud provisioning; established techniques such as writing modular, configurable code committed to version control lead us to view our infrastructure as a software application in itself and shift us in the direction of a DevOps culture.
  • Building apps using Azure
  • Azure platform analytics and native AI
  • Building Azure cognitive APIs
  • Azure Data Factory / Databricks
  • Azure Data Warehouse / Service Bus, with automated administration to manage the infrastructure that runs the code
  • Automatic scaling
  • Orchestrate multiple functions
  • Azure Functions development, billed per request served and per unit of compute time (see the sketch after this list)
  • Continuous Delivery using Spinnaker platform
  • Worked with Spinnaker cloud deployment tool to support Confidential Cloud along with Azure
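
As referenced in the Azure Functions bullet above, below is a minimal sketch of an HTTP-triggered Azure Function using the Python programming model; the parameter name and greeting logic are illustrative only, not taken from the engagement.

    import azure.functions as func

    def main(req: func.HttpRequest) -> func.HttpResponse:
        # Azure Functions bills per request served and per compute time consumed,
        # so small, single-purpose handlers keep costs predictable.
        name = req.params.get("name", "world")
        return func.HttpResponse(f"Hello, {name}", status_code=200)

This handler would be paired with a function.json HTTP trigger binding in the function folder; scaling is handled automatically by the platform.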

Environment: Azure Data Factory, DevOps, DataBricks, Terraform, Spinnaker, HortonWorks Hadoop, Cassandra, Azure Cloud Platform, PCF, Kafka, Flume, Splunk 6.2, DB2, TeraData, SQL Server, SQL, PL/SQL

Confidential, Boston, MA

Hadoop / Cloud Architect

Responsibilities:

  • As a developer, I look to employ known software engineering best practices wherever we can.
  • Writing code to design and deploy infrastructure facilitates this in the arena of cloud provisioning; established techniques such as writing modular, configurable code committed to version control lead us to view our infrastructure as a software application in itself and shift us in the direction of a DevOps culture.
  • Automated administration to manage the infrastructure that runs the code
  • Automatic scaling
  • Orchestrate multiple functions
  • Functions-related development, billed per request served and per unit of compute time
  • Continuous Delivery using Spinnaker platform
  • Worked with Spinnaker cloud deployment tool to support Confidential Cloud along with Azure

Environment: Spark/Scala, HortonWorks Hadoop, Cassandra, Azure Cloud Platform, PCF, Kafka, Flume, Splunk 6.2, DB2, TeraData, SQL Server, SQL, PL/SQL

Confidential, Cary, NC

Spark/Scala Architect

Responsibilities:

  • Automated administration to manage the infrastructure that runs the code
  • Designed solution for various system components using Microsoft Azure
  • Created Solution Architecture based upon PaaS Services
  • Created Web API methods for three adapters to pull data from various systems such as databases, BizTalk, and SAP
  • Configured and set up a hybrid cluster to pull data from SAP systems
  • Orchestrate multiple functions
  • Custom Functions-related development, billed per request served and per unit of compute time
  • Worked with Spinnaker cloud deployment tool to support Confidential Cloud along with AWS and other cloud

Environment: Spark/Scala, Kafka (real-time data processing), HortonWorks Hadoop, Cassandra, Azure Cloud Platform, PCF, Flume, Splunk 6.2, DB2, TeraData, SQL Server, SQL, PL/SQL
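
The environment above pairs Spark/Scala with Kafka for real-time processing; below is a minimal structured-streaming consumer sketch, shown in PySpark for brevity rather than the Scala used on the engagement. The broker address, topic name, and console sink are hypothetical placeholders.

    from pyspark.sql import SparkSession

    # Requires the spark-sql-kafka package on the Spark classpath.
    spark = (SparkSession.builder
             .appName("kafka-streaming-sketch")
             .getOrCreate())

    # Hypothetical broker address and topic name, for illustration only.
    events = (spark.readStream
              .format("kafka")
              .option("kafka.bootstrap.servers", "broker-1:9092")
              .option("subscribe", "adapter-events")
              .load())

    # Kafka delivers key/value as binary; cast the value to a string payload.
    payloads = events.selectExpr("CAST(value AS STRING) AS payload")

    # Console sink for prototyping; a production pipeline would target a
    # durable sink such as Cassandra or HDFS.
    query = (payloads.writeStream
             .format("console")
             .outputMode("append")
             .start())
    query.awaitTermination()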

Confidential, Sunnyvale, CA

Confidential Cloud Architect / Hadoop Architect

Responsibilities:

  • Understand the various source systems, architect, design and develop each component of the architecture.
  • Extensively worked on Confidential Cloud Platform components - BigQuery, Bigtable, and Confidential Cloud Storage - writing scripts in Cloud Shell as well as working through the GUI.
  • Worked with migration of data from on-premise to cloud.
  • Developed a JSON -> Kafka -> GCS -> BigQuery (BQ) pipeline using GCP Pub/Sub and Dataflow (see the load sketch after this list).
  • Worked closely with domain experts and other groups' data scientists to identify their requirements and blend them into a common model, avoiding redundant processes in providing the data.
  • Collaborated with other members of the practice to share knowledge that helped the implementation, using Unix shell scripting.
  • Worked with Confidential Cloud ML libraries and deployed models for predictive analytics.
  • Worked with TensorFlow and Cloud Machine Learning Engine managed infrastructure.
  • Train machine learning models at scale
  • Host trained models to make predictions on cloud data.
  • Understanding the data and modeling Cloud ML Engine features
  • Worked with Deep Learning / machine learning applications.
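
As referenced in the pipeline bullet above, below is a minimal sketch of the GCS-to-BigQuery leg of such a pipeline, using the google-cloud-bigquery client library purely for illustration; the bucket path and table ID are hypothetical placeholders, not values from the engagement.

    from google.cloud import bigquery

    client = bigquery.Client()

    # Hypothetical GCS path and destination table, for illustration only.
    GCS_URI = "gs://example-landing-bucket/events/*.json"
    TABLE_ID = "example-project.analytics.events"

    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
        autodetect=True,                   # infer the schema from the JSON
        write_disposition="WRITE_APPEND",  # append to the existing table
    )

    load_job = client.load_table_from_uri(GCS_URI, TABLE_ID, job_config=job_config)
    load_job.result()  # block until the load job finishes

    table = client.get_table(TABLE_ID)
    print(f"Loaded {table.num_rows} rows into {TABLE_ID}")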

Environment: Confidential Cloud Platform(GCP), Confidential BigQuery, Pub/Sub(Kafka), Confidential DataFlow(Spark), BigTable, CloudSQL

Confidential, Atlanta, GA

Cloud Architect and Security Consultant

Responsibilities:

  • Designed an Azure based solution using Web APIs, SQL Azure
  • Architect on Windows Azure and designing / implementing solutions.
  • Worked on components of Azure such as Service Bus (Topics, Queues, Notification Hubs), Blobs
  • Administered and applied technical design methodologies
  • Table Storage
  • Active directory
  • Web / Worker Roles / Web Sites
  • ACS / Azure Diagnostics and Monitoring
  • Multi-tenancy
  • SaaS / SQL Azure
  • SQL Reporting
  • IaaS Deployment
  • PowerShell etc
  • Market Trends in DevOps
  • Worked with Delivery pipeline in DevOps and the ecosystem
  • DevOps Security options and notification management in Jenkins.
  • Well versed with GIT and Continuous Integration via Jenkins.
  • Worked with Containers and VMs
  • Image and Containers in Docker / Networking
  • Best practice implementation using Docker Volume
  • Specialized in Virtualization using Docker.
  • Hands-on Puppet programming using the Puppet DSL along with third-party tools
  • Node classification using hiera and ENC
  • Worked with Puppet environment structure and configuration, puppet classes / puppet templates
  • Designed and implemented environment creation using Chef, Puppet, Nexus and Nolio.
  • Automate the Linux/Cloud Infrastructure by Chef, Python & Bash Script.
  • Configuring/Managing Production & Dev/QA Chef Server
  • Configuration management tool experience with Chef
  • Worked with automated clusters of containers with Kubernetes Manager
  • Integrated Jenkins, Docker, and Puppet
  • Monitor system using Nagios/components
  • Involved in setup, configuration and management of security for Hadoop clusters using Kerberos and integration with LDAP/AD at an Enterprise level
  • Developed the design for data migration from one cluster to another using DistCp (see the sketch after this list).
  • Responsible for scheduling jobs in Hadoop using FIFO, Fair scheduler and Capacity scheduler
  • Possess good Linux and Hadoop System Administration skills, networking, shell scripting and familiarity with open source configuration management and deployment tools such as Puppet or Ansible.
  • Built data platforms, pipelines, storage systems using the Apache Kafka, Apache Storm and search technologies such as Elastic search.
  • Worked with Pivotal Cloud Foundry (PCF) CLI for deploying applications and other PCF management activities.
  • Deployed apps in PCF, orchestrating objects that interact with other components to run ETL jobs in a secured mode
  • Worked with cloud storage from Cloud Shell, writing scripts as well as working through the GUI.
  • Worked with Azure Information Protection (AIP), a robust and structured Microsoft Azure offering along similar lines to AD but operating at a more granular level, defining the security enforced on documents, components, and services.
  • The AIP services integrate with the cloud application model, with permissions passed down a parent-to-child hierarchical model to dependent application components based on role-based security
  • Azure Information Protection integrates with end users' existing workflows once the AIP client is installed; protection actions create a corresponding Rights Management template, which can additionally be used with applications and services that support Azure Rights Management.
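
As referenced in the DistCp bullet above, below is a minimal sketch of a cluster-to-cluster copy, driven from Python purely for illustration; the NameNode URIs and paths are hypothetical placeholders, not values from the engagement.

    import subprocess

    # Hypothetical NameNode URIs and warehouse paths; adjust to the real clusters.
    SRC = "hdfs://source-nn:8020/data/warehouse"
    DST = "hdfs://target-nn:8020/data/warehouse"

    # On a Kerberized cluster (as above), authenticate with kinit and the
    # service keytab before launching the copy.
    # -update copies only new or changed files; -pugp preserves user, group,
    # and permissions across clusters.
    subprocess.run(["hadoop", "distcp", "-update", "-pugp", SRC, DST], check=True)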

Environment: Azure Cloud Platform, DevOps, Terraform, HortonWorks Hadoop, PCF, Kafka, Flume, Splunk 6.2, DB2, TeraData, SQL Server, SQL, PL/SQL

Confidential, Atlanta, GA

Cloud Architect and Security Consultant

Responsibilities:

  • Design and support of business infrastructure. Day to day support of entire IT product lines.
  • Design, integration and management of Amazon Web Services cloud solutions. Provisioning of EC2 instances, built via puppet, and integrated into local area offices in different time zones. Amazon RDS, VPC construction, Security Group policies, IAM, Route 53, CloudFormation, S3, Glacier, OpsWorks.
  • Market Trends in DevOps
  • Worked with Delivery pipeline in DevOps and the ecosystem
  • DevOps Security options and notification management in Jenkins.
  • Well versed with GIT and Continuous Integration via Jenkins.
  • Worked with Containers and VMs
  • Image and Containers in Docker / Networking
  • Best practice implementation using Docker Volume
  • Specialized in virtualization using Docker.
  • Automated administration to manage the infrastructure that runs the code
  • Automatic scaling
  • Orchestrate multiple functions
  • AWS Lambda, where you pay only for the requests served and the compute time
  • Worked with Spinnaker cloud deployment tool to support Confidential Cloud along with AWS and other cloud providers
  • Spinnaker integrated for Kubernetes, Azure, Cloudfoundry to make reliable deployments
  • Implemented Spinnaker deployment pipelines
  • Terraform based AWS Setup
  • Spinning and launching instances using Terraform
  • Worked with EC2 instances provisioned via Terraform
  • Terraform/Packer Jenkins integration
  • Terraform with Jenkins workflow using ECR and ECS
  • AWS Auto Scaling along with Terraform in command-line mode
  • Master-Agent Architecture
  • Catalog Compilation in Puppet.
  • Hands-on Puppet programming using the Puppet DSL along with third-party tools
  • Node classification using hiera and ENC
  • Worked with Puppet environment structure and configuration, puppet classes / puppet templates
  • Designed and implemented environment creation using Chef, Puppet, Nexus and Nolio.
  • Automate the Linux/Cloud Infrastructure by Chef, Python & Bash Script.
  • Configuring/Managing Production & Dev/QA Chef Server
  • Configuration management tool experience with Chef
  • Worked with automated clusters of containers with Kubernetes Manager
  • Integrated Jenkins, Docker, and Puppet
  • Monitor system using Nagios/components
  • Management of the build farm environment, and workflow management and administration, using Jenkins, GIT, Bamboo, Artifactory, Stash, Jira, Confluence, and various target build environments (Android, iOS, Windows, Linux)
  • Operational support, evaluation, and integration and administration of monitoring and availability support services such as Panopta, Pagerduty, Zendesk, New Relic, ONMS, Logstash, Kibana, and Redis.
  • VMWare management and support of server farms running in virtualization environments
  • SSL Security certificate management for enterprise, maintaining certificates across multiple ssl providers, and integrating certificates into products such as nginx, apache, tomcat, AWS-ELB
  • Management of enterprise puppet environments. Maintaining test and production catalogues and writing reusable modules for use in configuration management.
  • Management of enterprise Windows network. Support of domain controllers and internal LAN.
  • New product prototyping and evaluation.
  • Cost reduction strategies. Server and service consolidation, migration of legacy systems into virtualized environments.
  • Network and server room maintenance and support. Day to day operational support of server racks and networking infrastructure
  • Desktop/Laptop office support. Handled provisioning and customizations of user computing work environments that included all office infrastructure
  • Understand, implement, and automate security controls, governance processes, and compliance validation
  • Define and deploy monitoring, metrics, and logging systems on AWS
  • Implement systems that are highly available, scalable, and self-healing on the AWS
  • Design, manage, and maintain tools to automate operational processes
  • Deploying, managing, and operating scalable, highly available, and fault tolerant systems on AWS
  • Migrating an existing on-premises application to AWS
  • Implementing and controlling the flow of data to and from AWS
  • Selecting the appropriate engine / service based on compute, data, or security requirements
  • Identifying appropriate use of AWS operational best practices
  • Estimating AWS usage costs and identifying operational cost control mechanisms
  • Migrated data from DB2 and Teradata to Hadoop
  • Migrated data from Hadoop to AWS cloud storage
  • Wrote automated scripts to deploy compressed files into the cloud (see the sketch after this list)
  • Assigned roles and administered AWS Cloud users from Console
  • Functional, non-functional and performance tuning to flatten tables in AWS
  • Worked on configuration setup via Cloud Shell in AWS Cloud environment
  • As Big Data Architect, I am responsible for the creation, care, and maintenance of high-performance systems
  • Node configuration, cluster management processes.
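
As referenced in the compressed-file deployment bullet above, below is a minimal sketch of gzip-compressing exported files and pushing them to S3 with boto3, purely for illustration; the bucket name and export directory are hypothetical placeholders, not values from the engagement.

    import gzip
    import shutil
    from pathlib import Path

    import boto3

    # Hypothetical bucket and export directory, for illustration only.
    BUCKET = "example-datalake-landing"
    EXPORT_DIR = Path("/data/exports")

    s3 = boto3.client("s3")

    for source in EXPORT_DIR.glob("*.csv"):
        gz_path = source.parent / (source.name + ".gz")

        # Compress each export before upload to cut transfer time and storage cost.
        with open(source, "rb") as f_in, gzip.open(gz_path, "wb") as f_out:
            shutil.copyfileobj(f_in, f_out)

        # A production pipeline would typically partition the key by load date.
        s3.upload_file(str(gz_path), BUCKET, f"landing/{gz_path.name}")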

Environment: DevOps, Terraform, Spinnaker, HortonWorks Hadoop, Cassandra, AWS Cloud Platform, PCF, Kafka, Flume, Splunk 6.2, DB2, TeraData, SQL Server, SQL, PL/SQL

Confidential, San Ramon, CA

Hadoop Architect / Cloud Architect and Security Consultant (AWS)

Responsibilities:

  • Hadoop Admin Responsibilities:
  • Responsible for architecting Hadoop clusters.
  • Install, configure, and manage a Hadoop cluster spanning multiple racks.
  • Debug, remedy, and automate solutions for operational issues in the production environment.
  • Participate in the research, design, and implementation of new technologies for scaling our large and growing data sets, for performance improvement, and for analyst workload reduction.
  • Define job flows using fair scheduler.
  • HA implementation of Name Node Replication to avoid single point of failure.
  • Manage and review Hadoop Log files.
  • Set up automated 24x7x365 monitoring and escalation infrastructure for Hadoop cluster.
  • Load log data into HDFS using Flume.
  • Provide support to data analysts in running Pig and Hive queries.
  • Perform infrastructure services (DHCP, PXE, DNS, Kickstart, and NFS).
  • IoT role: Architected IoT components around weather-related data to make the application capable of predictive analysis and analytics for solar panel readings and the energy delivered on a day-to-day basis.
  • Worked with Maven, Ant, Jenkins/Hudson, Nexus Repo Management
  • Deployment automation using Jenkins and chef client
  • Written Chef recipes to manage deployments and automation of infrastructure
  • Configuration Management tool - Opscode Chef, puppet.
  • Responsible for technology research, budgeting, feasibility, evaluation and proof of concept.
  • Systems design and architecture to meet capacity and throughput demands, as well as performance requirements.
  • Integrate with management and development organizations to produce high quality enterprise level hardware and software solutions that meet or exceed client demands, with an eye for the most efficient and cost effective solutions available.
  • Focus on application/systems performance and capacity management.
  • Debugging and support of applications using profiling and monitoring solutions
  • Created a Technology Strategy to align with the client’s five-year Business Strategy for data lake integration
  • Implement and manage continuous delivery systems and methodologies on AWS
  • Understand, implement, and automate security controls, governance processes, and compliance validation
  • Define and deploy monitoring, metrics, and logging systems on AWS
  • Implement systems that are highly available, scalable, and self-healing on the AWS platform
  • Design, manage, and maintain tools to automate operational processes
  • Deploying, managing, and operating scalable, highly available, and fault tolerant systems on AWS
  • Migrating an existing on-premises application to AWS
  • Implementing and controlling the flow of data to and from AWS
  • Selecting the appropriate AWS service based on compute, data, or security requirements
  • Identifying appropriate use of AWS operational best practices
  • Estimating AWS usage costs and identifying operational cost control mechanisms
  • Migrated key systems from on-prem hosting to Amazon Web Services
  • Functional, non-functional and performance testing of key systems prior to cutover to AWS
  • Configured auto-scaling website platform with peak visitors of 14k per minute
  • As Big Data Architect, I am responsible for the creation, care, and maintenance of the high-performance indexing infrastructure.
  • In-depth understanding of the Hadoop ecosystem.
  • Responsible for designing the next-generation data architecture for unstructured data
  • Wrote, debugged, and analyzed the performance of many MapReduce jobs (see the sketch after this list).
  • Devised and led the implementation of the next-generation architecture for more efficient data ingestion and processing.
  • Proficiency with mentoring and onboarding new engineers on Hadoop and getting them up to speed quickly.
  • Experience with being a technical lead of a team of engineers.
  • Proficiency with modern natural language processing and general machine learning techniques and approaches
  • Extensive experience with Hadoop and HBase, including multiple public presentations about these technologies.
  • Experience with hands on data analysis and performing under pressure.
  • Designed and wrote a layer on top of MapReduce to make the task of writing MapReduce jobs easier and safer for junior engineers.
  • Contributed much of the code in our open source project.
  • Provide thought leadership and architectural expertise to a cross-functional team charged with deploying a host of customer-related applications and data to the cloud.
  • Conduct systems design, feasibility and cost studies and recommend cost-effective cloud solutions.
  • Administer discovery, user testing and beta programs to garner feedback prior to each major release.
  • Advise software development teams on architecting and designing web interfaces and infrastructures that safely and efficiently power the cloud environment.
  • Selected achievements - reduced overhead and infrastructure costs by 38 percent by consolidating and deploying 10 legacy applications to cloud platforms (Amazon Web Services)
  • Deliver major releases to stakeholders on time and under budget.
  • Successfully develop feature packages that include use cases, work-flows, requirements and functional specifications for hand off to development team.
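
As referenced in the MapReduce bullet above, below is a minimal word-count sketch of the kind of MapReduce job described, written with the mrjob Python library purely for illustration; the original jobs may well have been plain Java MapReduce or Hadoop Streaming, and the file and input paths shown are hypothetical.

    # wordcount_job.py (hypothetical filename)
    from mrjob.job import MRJob

    class WordCount(MRJob):
        def mapper(self, _, line):
            # Emit (word, 1) for every token on the input line.
            for word in line.split():
                yield word.lower(), 1

        def reducer(self, word, counts):
            # Sum the per-word counts produced by all mappers.
            yield word, sum(counts)

    if __name__ == "__main__":
        WordCount.run()

    # Run locally:   python wordcount_job.py input.txt
    # Run on Hadoop: python wordcount_job.py -r hadoop hdfs:///logs/part-*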

Environment: Cloudera Hadoop (CDH 4), AWS, MongoDB, Spark, Splunk 6.2, TeraData, SQL Server, SQL, PL/SQL, TOAD

Confidential, Chicago, IL

Hadoop / AWS Cloud Architect

Responsibilities:

  • Playing key role in designing Big data initiatives.
  • Understanding the data nature from different OLTP systems and designing the injection processes for HDFS
  • Using Informatica 9.6 and Sqoop.
  • Working on Hadoop File formats TextInputFormat and KeyValueTextInputFormat
  • Designing data models on HBase and Hive.
  • Creating MapReduce jobs for ad hoc data requests.
  • Partitioning and bucketing techniques in Hive to improve performance (see the sketch after this list).
  • Optimizing Hive and HBase queries.
  • Designing HBase column schemas.
  • Creating common data interface for Pig and Hive using Hcatalog.
  • Understanding the business requirements and needs and drawing the road map for Big data initiatives.
  • Driving POC initiatives to assess the feasibility of different traditional and Big Data reporting tools against the data lake (Spotfire, BO, Tableau, etc.)
  • Scheduling big data jobs using the in-house scheduler Appworx.
  • Using Kerberos and LDAP security authentications.
  • Implementing POC for big data tools like Mahout, Impala etc.
  • Driving initiative to automate the recurring manual activities for monitoring and operations using Unix Scripting.
  • AWS Cloud responsibilities:
  • Design and build of core platforms
  • Server configuration management via Puppet
  • Automated deployments using CloudFormation
  • Migrated key systems from on-prem hosting to Amazon Web Services
  • Functional, non-functional and performance testing of key systems prior to cutover to AWS
  • Configured auto-scaling website platform with peak visitors of 14k per minute
  • Capacity planning, Bottleneck identification
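
As referenced in the partitioning and bucketing bullet above, below is a minimal sketch of the kind of Hive DDL involved, issued here through PySpark with Hive support purely for illustration; the table, column names, and date are hypothetical placeholders.

    from pyspark.sql import SparkSession

    spark = (SparkSession.builder
             .appName("hive-partition-bucket-sketch")
             .enableHiveSupport()
             .getOrCreate())

    # Partition by order_date so date-filtered queries prune whole directories;
    # bucket by customer_id so joins and sampling on that key avoid full shuffles.
    spark.sql("""
        CREATE TABLE IF NOT EXISTS sales_orders (
            order_id    STRING,
            customer_id STRING,
            amount      DOUBLE
        )
        PARTITIONED BY (order_date STRING)
        CLUSTERED BY (customer_id) INTO 32 BUCKETS
        STORED AS ORC
    """)

    # Data loads into bucketed tables are typically performed from Hive itself
    # so that bucketing is enforced; queries filtered on the partition column
    # then read only the matching partition directories (partition pruning).
    daily = spark.sql(
        "SELECT customer_id, SUM(amount) AS total "
        "FROM sales_orders WHERE order_date = '2016-01-01' "
        "GROUP BY customer_id"
    )
    daily.show()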

Environment: Cloudera Hadoop, AWS, SQL Server, SQL, PL/SQL, TOAD
