Google Cloud Solutions Architect Resume
Austin, TX
SUMMARY:
- 15 years of experience architecting Google Cloud and AWS solutions, spanning NoSQL, big data architecture, Oracle Exadata, blockchain, data management and integration consulting, data warehousing, and database administration.
- Designed and implemented large data stores on-premises and in the cloud, at the scale of hundreds of terabytes, for analytics across domains including healthcare, finance, and telecommunications.
- Deep understanding of cloud computing and data technologies, business drivers, emerging computing trends, and deployment options
- Respond promptly to customer questions with deep, detailed technical explanations of product features and capabilities
- Partner with Sales team to articulate the overall value proposition, vision and strategy to customers.
- Building the evaluation plan, with detailed timelines and responsibilities.
- Showcased winning solutions and proof of concepts for multiple clients for multimillion-dollar deals.
- Presented Solutions/Case studies in seminars, trade shows, and webinars
- Individual contributor, problem solver and self-starter on cutting edge data technologies.
- Proven leadership, organizational, interpersonal, analytical and problem-solving skills
- Strong team building, partnership and collaboration skills
- Functional liaison between the Sales organization and Product Management/Development teams
- Demonstrate ability to prioritize/communicate conflicting demands in a fast-paced environment.
PROFESSIONAL SKILLS:
Cloud/ Platforms: GCP, AWS, Azure, Jenkins
Big Data Technologies: Dataproc, Kubernetes, DataStage, EC2, S3, BigQuery, Bigtable, Redshift, EMR, HDInsight, Azure Blob Storage, ADLS, Hive, HBase, Spark, Sqoop, Kafka
Databases: CouchBase, MongoDB, Oracle Exadata, Redshift, Cloud Spanner, BigQuery, MySQL
Blockchain: Amazon Managed Blockchain, Ethereum, Hyperledger Fabric
Languages: JavaScript, Python, SQL, PL/SQL, Perl, Node.js
Business Intelligence: Elasticsearch, Kibana, Cognos, SPSS, Erwin
Methodologies: Agile/Scrum, Waterfall
WORK SUMMARY:
Confidential, Austin, TX
Google Cloud Solutions Architect
Responsibilities:
- Load data into BigQuery from Google Cloud Storage using Dataproc.
- Design, develop, and deliver data integration, extraction, and migration from DataStage to GCP.
- Schedule pipelines end to end using the Cloud Composer service.
- Migrate ETL jobs to Google Cloud Platform.
- Maintain BigQuery datasets for reporting requirements.
- Hands-on with Google BigQuery, Google Cloud Storage, Dataflow, Cloud SQL, Dataproc, Pub/Sub, Sqoop, and PySpark.
- Building and maintaining data catalog in GCP
- Provision system-, user-, and data-level security for data in transit and at rest.
- Expertise in managing and working with large databases, including various data structures and common methods of data transformation, validation, and auditing.
- Established a team of data stewards, data analysts, and data scientists, providing actionable data insights to Sales and Marketing.
- Engage directly with customers’ development teams to understand their specific business and technology challenges around integrating distributed ledgers into new products and services.
- Conducted one-to-few and one-to-many sessions to transfer knowledge
- Engaged with customers and prospects to evaluate products, perform POCs, and effectively communicate key differentiators to stakeholders.
- Experience presenting to all job levels in an org, both technical and non-technical, C-level to individual contributor.
- Prepare and deliver customized solution and product demos.
- Technical coach and mentor to cross-functional team members.
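The GCS-to-BigQuery loading described above is typically driven through BigQuery's `jobs.insert` API. The following is a minimal, illustrative sketch of how such a load-job request body might be assembled; the project, dataset, table, and bucket names are hypothetical placeholders, not details from this role.

```python
import json

def build_bq_load_job(project, dataset, table, gcs_uris):
    """Assemble a jobs.insert request body for a BigQuery load job
    that pulls CSV files from Google Cloud Storage."""
    return {
        "configuration": {
            "load": {
                "sourceUris": gcs_uris,              # e.g. gs://bucket/path/*.csv
                "sourceFormat": "CSV",
                "skipLeadingRows": 1,                # skip the CSV header row
                "writeDisposition": "WRITE_APPEND",  # append to the target table
                "destinationTable": {
                    "projectId": project,
                    "datasetId": dataset,
                    "tableId": table,
                },
            }
        }
    }

# Hypothetical names, for illustration only.
job = build_bq_load_job("my-project", "reporting", "daily_sales",
                        ["gs://my-bucket/exports/2024-01-01/*.csv"])
print(json.dumps(job, indent=2))
```

In practice a payload like this would be submitted via the BigQuery client library or scheduled from a Cloud Composer DAG.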
Confidential, Raleigh, NC
Solutions Architect
Responsibilities:
- Rapidly understand and translate clients’ business challenges and concerns into a solution-oriented discussion
- Experience building and optimizing cloud-agnostic big data pipelines across AWS, GCP, and Azure data lakes.
- Building the Data Integration (ETL), data governance operating model and architecture.
- Experience working across the AWS/Azure/GCP stack (Datastore, BigQuery, Bigtable, Cloud Storage, AWS Glue, S3, Kinesis Data Analytics, EMR, Redshift, ADLS, HDInsight, Elasticsearch, Dataflow).
- Building the evaluation plan, with detailed timelines and responsibilities
- Engaged with customers and prospects to demonstrate our products and effectively communicate the key differentiators
- Experience delivering distributed and highly scalable applications on NoSQL/Sharded relational Databases/MapReduce.
- Data governance: integrated with ServiceNow to build a scalable data-approval process wherein business users search cataloged entities and request access on demand.
- AWS CloudFormation template development to deploy infrastructure and applications in AWS.
- Owned the technical sales process, owned and managed the development, delivery and orchestration of technical activities driving toward technical validation, technical selection and technical closure with the customer (technical presentations, POCs, demos)
- Transient/ephemeral Dataproc, EMR, and Azure HDI clusters: designed ZDP to launch compute clusters on demand; used ZDP workflows to launch EMR and HDI VMs on demand and process big data workloads.
- Experience presenting to all job levels in an org, both technical and non-technical, C-level to individual contributor.
- Respond promptly to RFIs/RFPs/RFQs with deep, detailed technical explanations of product features and capabilities.
- Prepare and deliver customized solution and product demos.
- Responding to tenders, RFIs, RFPs, proposals with respect to product/solution information
- Large-scale systems integration involving both on-premises technology and public cloud platforms.
- Technical coach and mentor to the partner Pre-Sales community
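The CloudFormation template development mentioned above can be sketched as follows: a minimal template declaring a single versioned S3 bucket, built as a plain Python dict and serialized to JSON. The logical ID and bucket name are hypothetical; a real data-lake template would add the EMR, Glue, and IAM resources.

```python
import json

# Illustrative CloudFormation template skeleton (not a production artifact).
template = {
    "AWSTemplateFormatVersion": "2010-09-09",
    "Description": "Data lake landing bucket (illustrative only)",
    "Resources": {
        "LandingBucket": {
            "Type": "AWS::S3::Bucket",
            "Properties": {
                "BucketName": "example-data-lake-landing",       # hypothetical name
                "VersioningConfiguration": {"Status": "Enabled"}  # keep object history
            },
        }
    },
    "Outputs": {
        # Expose the bucket ARN so other stacks can reference it.
        "BucketArn": {"Value": {"Fn::GetAtt": ["LandingBucket", "Arn"]}}
    },
}

print(json.dumps(template, indent=2))
```

A template like this would be deployed with `aws cloudformation deploy` or wired into a CI/CD pipeline.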
Confidential, Austin, Texas
Senior Principal Engineer
Responsibilities:
- Designed and developed data ingestion from Kafka using Spark batch and Spark Streaming jobs.
- Used Spark Structured Streaming to consume data from Kafka in near real time, apply transformations and actions, and move the data into HBase/Hive.
- Strong Linux experience with a focus on ETL, cloud data integration, analytics applications, and big data
- Designed data architecture for the Raw, Stage, Gold, and Insight layers of AWS/Azure data lakes.
- Implemented CloudWatch and Airflow DAGs for monitoring and job scheduling.
- Designed and developed scalable, distributed, multi-threaded processing systems; integrated big data MapReduce methodology into the overall architecture.
- Worked as Senior Principal Advanced Engineer providing solutions to customers of Oracle Engineered Systems, namely Exadata, Oracle Big Data Appliance, SuperCluster, and ODA.
- Supported Exadata hardware issues, collaborating with the hardware team on parts replacement when required.
- Supported and troubleshot InfiniBand switches/ports, HCA ports on DB/cell nodes, IB topology, and InfiniBand/RDS performance issues for Exadata and SuperCluster.
- Troubleshot Exadata cell/storage and DB/compute node performance issues.
- Performed root cause analysis (RCA) for node and instance evictions.
- Transformed, manipulated, and ingested data into downstream Hadoop using Sqoop.
- Designed & developed a prototype using Hadoop, Python, Spark
- Performed Data Architecture, Data Modeling, SQL coding, performance tuning of Oracle backend.
- Data import and export to Data Lakes/RDBMS using Sqoop.
- Scheduled daily/weekly processes using UNIX scripts and the Azkaban scheduler.
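The Sqoop-based import/export and UNIX scheduling above are usually wrapped in small scripts. The sketch below assembles a standard Sqoop 1 import invocation that pulls an RDBMS table into HDFS as Parquet; the JDBC URL, table, and target directory are hypothetical, while the flags (`--connect`, `--table`, `--target-dir`, `--num-mappers`, `--as-parquetfile`) are standard Sqoop 1 options.

```python
import shlex

def sqoop_import_cmd(jdbc_url, table, target_dir, mappers=4):
    """Build a Sqoop 1 import command line as a shell-safe string."""
    args = [
        "sqoop", "import",
        "--connect", jdbc_url,        # JDBC connection string to the source RDBMS
        "--table", table,             # source table to import
        "--target-dir", target_dir,   # HDFS destination directory
        "--num-mappers", str(mappers),# parallel map tasks
        "--as-parquetfile",           # write output as Parquet
    ]
    return " ".join(shlex.quote(a) for a in args)

# Hypothetical Oracle source, for illustration only.
cmd = sqoop_import_cmd("jdbc:oracle:thin:@//db-host:1521/ORCL",
                       "SALES", "/data/raw/sales")
print(cmd)
```

A wrapper like this lends itself to daily/weekly scheduling from a UNIX shell script or an Azkaban job.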
Confidential
Lead Database Engineer
Responsibilities:
- Built data models and prepared refined data for business users through the ETL process.
- Installed and Administered Oracle Exadata and MySQL databases
- Installed and configured Oracle enterprise manager dashboard for Performance monitoring and tuning, capacity planning
- Design and build database structures and objects that best support operational and analytics applications.
- Convert data requirements into database structures for optimal performance and availability.
- Communicate and cooperate with team members and management on the designed database structures and schemas.
- Design ETL framework and partner in implementation
- Work with development teams to design solution architecture and review.
- Troubleshoot and optimize complex workflows.
- Mentor and lead the team of engineers
Confidential
Senior Oracle DBA
Responsibilities:
- Implementation of Enterprise Oracle Data Warehouse, Golden Gate.
- Design ETL framework and partner in implementation.
- Work with development teams to design solution architecture and review
- Troubleshoot and optimize complex workflows.