“How Quantum & AI Will Shape the World’s Future” — Peter H. Diamandis Moonshots EP #123
Episode summary
Part one of the Jack Hidary two-parter lays the conceptual foundation for SandboxAQ's thesis: Large Quantitative Models (LQMs), AI trained on physics equations and numerical data rather than language, represent the next frontier beyond LLMs. Hidary walks through the history from Planck to Transformers, arguing that pairing quantum-mechanical models of atomic-scale behavior with AI will unlock breakthroughs in drug discovery, materials science, and energy that language models cannot touch.
Key arguments / segments
- [00:01:00] Hidary’s background: philosophy/physics/neuroscience at Columbia, quantum group at Alphabet, SandboxAQ spin-out with $500M and Eric Schmidt as chairman
- [00:05:02] Core thesis: both AI and quantum physics are compression engines — they take massive data and compress it into manageable, predictive models
- [00:06:01] History of neural networks from 1943 (McCulloch-Pitts paper) through RNNs to the 2017 Transformer paper (“Attention Is All You Need”) and GPU parallelization
- [00:10:01] LLMs as compression: they generalize patterns from language corpora but are fundamentally limited to recombining what exists in text; no new physics emerges from training on words
- [00:13:01] The majority of the world is numbers, not words — drug molecules, battery chemistry, fluid dynamics all require mathematical modeling that LLMs cannot provide
- [00:15:01] The realization that modern compute could bring quantum equation solving to production scale — the founding insight of SandboxAQ
- [00:22:00] History of quantum physics: Planck (1900), Einstein's photoelectric effect (1905), Schrödinger, Heisenberg, Bohr; corrects the misconception that quantum mechanics only describes small scales (the Schrödinger equation is written out after this list)
- [00:26:01] Valence electrons and drug design: immunotherapy drugs depend on molecular binding that must be modeled at the quantum level; LQMs can simulate candidate molecules computationally instead of relying on wet-lab trial-and-error (a code sketch follows this list)
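The "quantum equations" invoked throughout the episode are, at their core, the Schrödinger equation, named in the [00:22:00] segment. For reference, its time-independent form in standard notation (not quoted from the episode):

```latex
% Time-independent Schrodinger equation
% \hat{H} : the Hamiltonian (total-energy operator) of the system
% \psi    : the wavefunction, e.g. of a molecule's electrons
% E       : an allowed energy level
\hat{H}\,\psi = E\,\psi
```

For a molecule, the Hamiltonian bundles electron kinetic energy with electron-nucleus and electron-electron Coulomb terms; solving it exactly is intractable beyond tiny systems, which is why the compute-at-scale point at [00:15:01] matters.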
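As a concrete, hedged illustration of what "simulating candidate molecules computationally" can look like in practice, here is a minimal sketch using the open-source PySCF package. The episode names no specific tools; the library, molecule, basis set, and method are all illustrative choices:

```python
# Minimal sketch: approximate the ground-state energy of H2 by solving the
# electronic Schrodinger equation at the Hartree-Fock (mean-field) level.
# PySCF, H2, the STO-3G basis, and RHF are illustrative assumptions;
# the episode does not mention any of them.
from pyscf import gto, scf

# Build the molecule: two hydrogen atoms ~0.74 angstroms apart
# (near the experimental equilibrium bond length).
mol = gto.M(atom="H 0 0 0; H 0 0 0.74", basis="sto-3g")

# Restricted Hartree-Fock: a tractable approximation to the exact solution.
mf = scf.RHF(mol)
energy = mf.kernel()  # converged total energy, in Hartree

print(f"H2 RHF ground-state energy: {energy:.6f} Ha")
```

The LQM pitch at [00:26:01] is this idea scaled up: score many candidate drug molecules by computed binding energetics before committing to wet-lab experiments.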
Notable claims
- 4-5 simultaneous crises in late-1800s physics all resolved once Newtonian assumptions were abandoned for quantum mechanics [00:23:00]
- Current LLMs run up to 2-2.5 trillion parameters but remain fundamentally limited to probabilistic text generation [00:12:01]
- 40 years of Alzheimer's research have produced effectively nothing to show for it; quantum-informed drug discovery is positioned as the breakthrough path [00:03:01]
Guests
- Jack Hidary — CEO of SandboxAQ. Author of “AI or Die” and “Quantum Computing: An Applied Approach.” Founded the quantum group at Alphabet. XPRIZE trustee. Columbia-educated in philosophy, physics, and neuroscience.
Mapping against Ray Data Co
The LQM framing (Large Quantitative Models vs. Large Language Models) is a useful conceptual distinction for the newsletter. The compression-engine mental model — both AI and physics compress reality into actionable predictions — is a strong explanatory framework. The claim that language-trained AI cannot discover new physics or chemistry is relevant to any RDCO content about AI limitations and where the real frontier work is happening.
Related
- 2024-10-17-moonshots-ep124-jack-hidary-quantum-applications
- quantum-computing
- large-language-models