
Cloud Consultant, Big Data Resume


NY

CAREER SUMMARY:

  • Decades ago, my career began with the introduction of computers into the worlds of scientific analytical research, design and manufacturing.
  • That work enabled the design, manufacture and use of many of the products and services we have today, and carried into recent decades of cloud computing: capturing information from the consumption of modern digital devices and unifying it in a parallelized, virtualized, pervasive, high-performance, high-volume, quantifiable fashion for the consumption of analytical intelligence.
  • Experienced computational enterprise technologist with expertise in the integration of data consumption, Big Data, analytics, distributed intelligent cloud computing, container intelligence automation, mission-critical high availability and high performance, grid, cluster, utility and business computing intelligence, analytic computing, probability intelligence, scientific analytics and solvers, software development, patent and copyright development, storage solutions and pervasive enterprises.
  • Scientific engineering and analysis, enterprise architecture, deployment and programming utilizing the latest technology advancements.
  • Expertise in computational solutions from supercomputer to cloud, research, analytics, business intelligence, scientific engineering, data sciences, distributed and parallel computing, IoT, CFD analytics, real-time process-modeling data intelligence and GDPR data security.
  • Currently hold a utility patent and an expired software copyright.
  • My experience spans new markets, start-ups through IPOs and mergers, and new technology deployments for business portfolio management.
  • Known for innovative, collaborative execution, intuitiveness and a dedicated approach to researching, designing, architecting, developing and deploying new enterprise business solutions, through programming, deployment, execution and ongoing management.
  • Accountable for researching, advising, leading, designing, developing and architecting many billion-dollar enterprise business solutions for the federal government and the public and private sectors.
  • Developed and conducted many Big Data Hadoop distributed-processing business objectives from a financial feature/function return-on-investment portfolio.
  • Development and proof of concept of the world’s largest big data distributed parallel computational solution for the human genome projects, with Hadoop, MapReduce, Cassandra, IBM and other technologies.
  • Migrated dozens of companies to the cloud, improving performance while reducing costs.
  • Accountable for the development and implementation of a real-time big data (streaming data) collaborative, intelligent bi-directional trading-pattern and financial analysis system for the SEC, using Hadoop, MapR, Bigtable, Cassandra and IBM technologies.
  • Over a decade of worldwide deployment of pioneering, leading-edge Big Data, HPC and cloud computational analytical technology, advancing a corporate technical sales model from $3M to $75M in sales.
  • Changed the ROI from red to black for many early Hadoop adopters.
  • Evaluated, developed and deployed a collaborative real-time medical radiology cloud for a nationwide real-time radiologist diagnostics solution.
  • Advised on, enhanced, developed and deployed Spark in dozens of Hadoop Big Data analytical deployments.
  • Engineered and programmed a big data distributed MapReduce rendering farm for the entertainment industry (Pixar and Sony), utilizing technology from Cray Research, SGI distributed big data file systems and IBM General Parallel File System (GPFS) technologies.
  • Acting CTO for a start-up (WAMNET) that obtained over $700M to build, develop and deploy private HPC clouds for the medical, entertainment, and print and prepress industries, prior to today’s big-three cloud providers.
  • Responsible for the development and deployment of a 100% digital-information (big data) computing solution for a Boeing 7xx-series airplane.
  • Responsible for delivery of a big data Hadoop Cassandra parsing (MapReduce) patent-abstract search solution for the USPTO.
  • Consulted on and engineered a next-generation security model: a level-5, physically-impossible-to-breach design with little to no computational resource needs, built with MIT, NIST and GDPR input and using double concurrent tunnels and walled gardens.
  • Engineered, architected and developed a utility patent valued at over $250M.
  • Programmed a high-performance MapReduce solution set for processing multiple computational fluid dynamics analyses.
  • Over a decade of worldwide deployment of pioneering, leading-edge CAE/CAD/CAM technology, advancing a corporate technical sales model from $3M to $75M in sales.
  • Engineered and developed the world’s largest computational utility data center.
  • Developed and obtained a copyright for a software package for an intelligent relational product-data management solution (PDM/PLM/ILM).
  • Programmed a series of MapReduce parsing code sets to enable the analysis of large (big data) files for computer-aided engineering and fluid dynamics analysis.
  • Achieved over 43 years of continual investment in ever-growing knowledge, abilities and successes, leading the digital frontier across decades.
  • Decades as a technology leader enabling the advancement and engineering of products like the pacemaker through CAD/CAE/CAM; then capturing data and intelligence from that device; and finally changing the pacemaker from reactive to proactive with a real-time collaborative medical cloud enterprise.
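The MapReduce parsing bullets above describe splitting large CAE/CFD result files into mapped records and reducing them into analysis summaries. As a minimal illustration only (not the original code; the `cell_id,pressure` record format and the max-pressure reduction are hypothetical stand-ins), the map/shuffle/reduce pattern looks like:

```python
from collections import defaultdict

def map_phase(lines):
    """Map: parse each raw record into a (cell_id, pressure) pair."""
    for line in lines:
        cell_id, pressure = line.strip().split(",")
        yield cell_id, float(pressure)

def reduce_phase(pairs):
    """Shuffle + reduce: group pressures by cell and keep the maximum."""
    groups = defaultdict(list)
    for cell_id, pressure in pairs:
        groups[cell_id].append(pressure)
    return {cell_id: max(values) for cell_id, values in groups.items()}

# Hypothetical CFD output records: cell id, pressure sample
records = ["c1,101.3", "c2,99.8", "c1,103.7", "c2,100.2"]
peaks = reduce_phase(map_phase(records))
```

In a real Hadoop deployment the map and reduce functions run across many nodes and the shuffle is handled by the framework; the single-process sketch only shows the shape of the computation.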

TECHNICAL PORTFOLIO:

  • AIX, IRIX, HP-UX, Solaris, CentOS, Linux, Red Hat, Windows, Chapel, Computational Fluid Dynamics
  • Computer Aided Design and Manufacturing, Enterprise Modeling, ITIL, ESA, Abacus, Rinno, Evolution
  • Computer Aided Engineering, Telco Intelligence DevOps, Cisco, Lucent, Cray X-MP, Chapel, Data Commodity Classifications
  • Data De-Duplication, Dynamic Offloading, GlusterFS, File Systems, GPFS, BigInsights, Cloudera Hadoop, Hortonworks Hadoop, Apache Hadoop, NiFi, Kafka, Greenplum, Teradata, Cassandra, High Performance Computing, Life Sciences High Performance Computing, HTML, HTML5, C++, Data Mining and Analysis, I/O Path Reduce Deployments, IBM GPFS, EDW, DM, Information Life Cycle Management, BI / BA
  • Low Voltage Computing, Nuclear Analysis, Lustre, Ceph, GPFS, Spark, Scala, Tachyon, Berkeley BDAS, Databricks
  • Machine code learning and Intelligence, WebSphere, Agile, Jira, Git, Informatica, Eclipse, Chef, Puppet, Ansible, Lustre FS, MPI, WSO2, ESB, Virtualization Technologies, Oracle, Informix, Shell, Perl, PHP, Java, Product Life Cycle Management, BMC Patrol, Computer Associates, SDLC, Terraform, Google Cloud, MuleSoft, RDBMS, Java/J2EE, Linux, PHP, Python, C, C++, Hadoop, Hive, Sqoop, HBase, Pig, MapReduce, MapR, Oozie, Yarn, Spark, Scala, Storm, Slurm, Knox, Atlas, Rocks clusters, R code, MongoDB & other Hadoop eco-system components, Data Warehouse, Data Archiving, BI / BA / SA and ETL tools and frameworks, Relational Intelligent Database Management and Deployments, Rendering and Radiology, SAP, Storage and Backup, Mobile Data Intelligence, Python Coding, Rendering, Scientific Computational Enterprises, Big Data Business Intelligence Analytics, SQL, NoSQL, SSIS, WebSphere, Azure, AWS, Elasticsearch, BlueMix, SSD Storage and Flash technologies, AMP, Talend, TIBCO, WSR ESB, AES, EMR, EKS, Glue, Lambda, Redshift, DynamoDB, MongoDB, MariaDB, RDS, VPC, AWS CLI, CloudWatch / CloudFormation / CloudFront, EC2, S3, EFS, OpsWorks, Elastic Beanstalk / ElastiCache, AWS Lustre, Kinesis, Spot instances, Glacier, Athena, AWS HDFS, Load Balancers, Cloud9, Machine Learning, Deep Lex, AWS Zeph, ECS, QuickSight, Ganglia, AWS Supercomputing, TensorFlow, OpenAI, Singa, Solr, Auto Scaling, Data Pipeline, Route 53, Snowball, Direct Connect, Azure, HDInsight, Cloudera, Hortonworks, AD, SQL, SSIS, Terraform, Azure Supercomputing, Cluster Computing, Utility Computing, Data Storage, Backup and DR / HA, Pentaho, Alteryx, Confluent, Teradata, Vertica, Pivotal, Wave Analytics, Tableau, Snowflake, Birst, Splice, MarkLogic, Pure Storage, WAVE Computing, D-Wave computers

PROFESSIONAL EXPERIENCE:

Confidential, NY

Cloud Consultant, Big Data

Responsibilities:

  • Conducted assessments of the Cloudera Big Data solution, determining security gaps, performance bottlenecks, various default settings and incomplete installations.
  • Architected, integrated and delivered a road map for a 1-to-7-year Big Data HPC cloud portfolio solution and an intelligent on-demand automated AWS deployment.

Confidential

Cloud Consultant, Big Data

Responsibilities:

  • A decade of extensive pre-production and production cloud deployments with GCP, AWS and Azure.
  • With 2ndWatch, I was exclusively focused on MuleSoft and Terraform intelligent automation integration for Big Data analytics and HPC on the cloud, working with their customers moving Big Data analytics to the cloud, as well as educating the company on cloud-based Big Data advantages, solution portfolios and best practices.
  • Engineered, integrated, built and deployed multiple POCs and MVPs for a fully automated Big Data analytical data-sciences on-demand portal: an AWS cluster built on the fly as needed, connected to data, offering a containerized pipeline for data scientists, and removed when completed.
  • Assessed customers’ needs through detailed discovery sessions to engineer, architect, build, configure, code and deploy multiple cloud (AWS and Azure) Big Data POCs and MVPs, from small and basic to large, complex, intelligent, real-time, high-performance scientific analytics.
  • Created and integrated an intelligent set of cloud artifacts, templates and protocols for use in all customer deployments of Big Data analytics to the clouds (AWS and Azure), from basic analytics to advanced scientific HPC analytics.
  • Consulted and advised 2ndWatch on Big Data, from its history through recent decades to today, focused on cloud advancements and a supportive Big Data technology portfolio.
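The on-demand portal described above builds an AWS cluster on the fly and removes it when the work completes. A minimal sketch of that idea, assuming Amazon EMR as the cluster service (the cluster name, instance types and release label below are hypothetical, not the actual portal configuration):

```python
def transient_emr_params(name, node_count, steps):
    """Build run_job_flow parameters for a transient EMR cluster that
    auto-terminates once its steps finish, matching the build-use-remove
    lifecycle of an on-demand analytics portal."""
    return {
        "Name": name,
        "ReleaseLabel": "emr-5.30.0",          # hypothetical release
        "Applications": [{"Name": "Spark"}, {"Name": "Hadoop"}],
        "Instances": {
            "InstanceCount": node_count,
            "MasterInstanceType": "m5.xlarge",  # hypothetical sizing
            "SlaveInstanceType": "m5.xlarge",
            # False = tear the cluster down when the last step completes
            "KeepJobFlowAliveWhenNoSteps": False,
        },
        "Steps": steps,
        "JobFlowRole": "EMR_EC2_DefaultRole",
        "ServiceRole": "EMR_DefaultRole",
    }

params = transient_emr_params("datasci-on-demand", 10, [])
# In a live deployment this dict would be passed to
# boto3.client("emr").run_job_flow(**params); the call is omitted here
# so the sketch stays runnable without AWS credentials.
```

The key design point is `KeepJobFlowAliveWhenNoSteps=False`: the cluster exists only while its submitted steps run, which is what makes the pay-per-use portal model economical.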

Confidential

Big Data Architect / Director

Responsibilities:

  • SME for cloud deployments with GCP, AWS and Azure.
  • I exclusively led the direction for Big Data analytics at Confidential, from discovery, development and deployment to delivery of an automated, interrelated (Terraform) cloud-based high-performance analytical data-lake collaborative for the credit union industry, changing that industry from a legacy 30-day analytical window to a real-time analytical intelligence HPC data lake.
  • Architected, integrated, developed and deployed a computational cloud HPC credit union Big Data lake, “CASPIAN,” the next-generation collaborative, intelligent analytical cloud framework.
  • Delivered software development life cycle plans to increase the performance and profit margins of the current OA M360 product 20-fold, and rolled out a plan to take this new software to a cloud intelligent container application for over 7,000 credit unions across the USA.
  • Changed the credit union digital reporting window from a 30-day report benchmark to daily, hourly and even real time.
  • Conducted business discovery, then architected, built and coded multiple cloud data lakes (POCs / MVPs / pre-production) for the credit union industry.
  • Organized and advised on GDPR, NIST, MIT and Harvard PII data-compliance needs for today and decades to come.
  • Created and conducted bi-weekly technology knowledge-transfer briefings with technical personnel, leadership and legal.
  • Developed multiple legal abstracts for copyrights and patents covering existing products and the CASPIAN Data Lake.
  • Created and conducted multiple video press releases, trade-show advertisements and podcasts for the OA CASPIAN Data Lake.

Confidential, Minneapolis, MN

Big Data Architect / Independent Cloud SME

Responsibilities:

  • Re-architected, integrated, rebuilt and redeployed the current Big Data analytical solution to incorporate advancements utilizing cloud technologies, offering an Internet of Things for the power generation and distribution services they provide worldwide.
  • I was the lead engineer and architect establishing a pseudo-real-time Big Data cloud, a Spark IoT connecting gas-turbine, diesel-powered and wind power-generation devices worldwide.
  • Directly responsible for the MVPs increasing performance by connecting power-generation devices worldwide to a private Big Data analytical real-time cloud.
  • I productionized an active POC for a multi-PB, 300-node real-time Spark Big Data analytical machine-learning solution for 911 source data, incorporating HPC technologies, cloud advancements and NLP.
  • Architected, integrated, designed and built two POCs (AWS and Azure) for changing the pacemaker digital process into a Big Data real-time intelligent collaborative medical analytical proactive solution.
  • Recovered, rebuilt and delivered the current Big Data solution, increasing service-level availability from 30% to 90%, followed by a re-architecture of the entire analytical Big Data foundation to incorporate on-premises and hybrid cloud technologies utilizing a new portfolio stack: Spark, Cloudera and Azure, to name a few.
  • Solution architect; evaluated, validated, condensed and designed a high-volume gaming real-time streaming business-intelligence Big Data analytical hybrid-cloud Spark MVP enterprise solution.
  • Engineered, architected and built an MVP supercomputer Azure cloud solution for scientific solvers, routines and computationally intense analytics.

Confidential, Las Vegas, NV

Big Data Enterprise Architect / Data Scientist SME

Responsibilities:

  • Led the Big Data analytical business-intelligence technology direction, from solution research and evaluations to proofs of concept and on to integrated operations and production-ready steady states.
  • Researched and delivered a long-term plan for high-volume, real-time multimedia information streams toward the ultimate goal of a Big Data automated-intelligence corporate enterprise solution.
  • Responsible for assessing business needs and designing, architecting and managing all of Confidential Entertainment’s Big Data computational needs for all properties, from the as-is on-premises environment to the to-be hybrid cloud.
  • Changed the Hadoop BI SLA from less than 50% to over 99.9%.
  • Accountable for evaluating and integrating Big Data technology and vendor offerings, building POCs and recommending Big Data technology portfolios.
  • Managed, administered and developed two Hadoop (HA) production clusters with greater than 500 TB of usable storage.
  • Supported a staff of script-code developers for MapR and EDW ETL solutions.
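The final bullet above covers scripted ETL into an EDW. As a bare-bones sketch of that extract/transform/load shape (the pipe-delimited record format, the player/cents fields and the in-memory "warehouse" are hypothetical stand-ins, not the actual scripts):

```python
def extract(raw_rows):
    """Extract: split raw pipe-delimited rows into field lists."""
    return [row.split("|") for row in raw_rows]

def transform(rows):
    """Transform: normalize names and cast dollar amounts to integer cents."""
    return [
        {"player": name.strip().upper(), "cents": int(float(amount) * 100)}
        for name, amount in rows
    ]

def load(records, warehouse):
    """Load: upsert records into the target table keyed by player."""
    for rec in records:
        warehouse[rec["player"]] = rec["cents"]
    return warehouse

# Hypothetical source rows: player name | amount wagered
raw = ["ada |12.50", "bob|3.25"]
edw = load(transform(extract(raw)), {})
```

Production ETL would pull from MapR files or database tables and load into an actual EDW, but each stage keeps this same contract: raw rows in, cleaned records out, idempotent upsert at the end.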

Confidential, Minneapolis, MN

Big Data Analytics / Cloud Technologist

Responsibilities:

  • A decade of extensive pre- and post-deployment cloud work with GCP, AWS, Azure, SAVVIS and BlueMix.
  • Designed, architected and validated the world’s largest Big Data distributed parallel high-performance computational solution for the Human Genome Project (bioinformatics).
  • Researched, integrated, engineered, architected and developed a utility patent valued at over $250M.
  • Accountable as the up-to-date subject-matter expert in the Big Data analytical, business-intelligence and high-performance computing industry.
  • Provided accountability and leadership for transitions from proofs of concept to production.
  • Created, developed and directed a medical distributed big data Hadoop and Cassandra MapReduce genome-analysis HPC solution set for the federal Human Genome Project.
  • Lead programmer and architect delivering multiple HPC scientific parsing and high-performance algorithm solution sets.

Confidential, Washington, DC

Enterprise Director

Responsibilities:

  • Accountable for architecting, integrating, developing and implementing a real-time Big Data fraud-analytics (streaming data) collaborative, intelligent bi-directional system for Confidential.
  • Assessed and researched multiple government agencies for a cloud computing direction.
  • Direct accountability and leadership; developed, deployed and administered multiple initial Big Data government initiatives for the National Science Foundation (NSF), SEC, DOJ and USPTO.
  • Direct accountability, leadership and recommendations for multiple initial federal government initiatives for Big Data analytics, on-premises and cloud adoption.
  • Responsible for delivery of a big data parsing patent-abstract search solution for the USPTO.
  • Created, deployed and delivered multiple big data commodity-coded MapReduce algorithm solution sets.
  • Conducted multiple information-technology steering-committee briefings for executive government officials, officers and various military leadership.
  • Direct accountability, research and recommendations for Big Data analytics integration initiatives, on-premises and in the cloud, for intelligent, automated and dynamic product replenishment and distribution.
  • Required to help customers understand their enterprise computational needs in order to architect an HPC data-storage, high-availability enterprise solution in an ITO (Information Technology Outsourcing) private-cloud environment.
  • Engineered, architected and executed a compute and storage migration / consolidation / enhancement solution from Minneapolis to Denver, on a private telco grid computational cloud of 750 servers and 1.5 PB.
  • Developed best-practice procedures and policies for ING with a private cloud grid for security protection, high-availability and disaster-recovery requirements.

Confidential, Minnesota

Chief Architect / Founder

Responsibilities:

  • Founder; architected, developed, integrated, deployed and managed a private BaaS (backup / recovery as a service) private telco cloud technology service offering.
  • Architected, programmed and deployed a cloud-technology enterprise solution for high-availability, data-protection and disaster-recovery mission-critical computing services.
  • Developed and executed the business plan that took Confidential to IPO status in 2004.
