06-reference

moonshots ep183 eric schmidt superintelligence

Wed Jul 16 2025 · reference · source: Moonshots Podcast · by Peter Diamandis
eric-schmidt · superintelligence · energy-crisis · china-us-ai-race · nuclear-deterrence · open-source-risk · ai-proliferation · kissinger

Moonshots EP 183: Ex-Google CEO Eric Schmidt — What Artificial Superintelligence Will Actually Look Like

Summary

A deep conversation between Peter Diamandis, Dave Blundin, and Eric Schmidt (former Google CEO, author of Genesis, co-author with Henry Kissinger on AI deterrence). Schmidt's framing: AI is underhyped because it is a learning machine inside network-effect businesses, and its natural limit is electricity, not chips. He testified that the US needs 92 additional gigawatts for AI, equivalent to 92 large nuclear power stations, and notes that essentially zero new nuclear plants are being started.

Schmidt outlines the "San Francisco consensus": within 1-2 years, AI will produce world-class mathematicians and programmers, and since math and programming underpin everything else, this will accelerate physics, chemistry, biology, and materials science. He frames the timeline as 1.5-2x slower than Leopold Aschenbrenner's predictions, putting specialized AI savants in every field within 5 years.

On China, Schmidt admits he was wrong about the US holding a 2-year lead: DeepSeek's arrival proved that inference-time compute and distillation collapsed the gap faster than expected.

His most provocative framework is "Mutual Assured AI Malfunction" (MAIM, co-authored with Dan Hendrycks and Alexandr Wang): the AI equivalent of mutually assured destruction, in which nations maintain the capability to cyber-attack each other's AI infrastructure as a deterrent against crossing sovereignty-threatening capability thresholds. He compares the current moment to 1938: the Einstein letter has been written, and the conversation about deterrence needs to start before Chernobyl-level AI events occur.

Schmidt predicts the endgame is roughly 10 nationalized super-models in multi-gigawatt data centers guarded like plutonium facilities, with the major proliferation risk being open-source models that could eventually run on small servers. On the business side, he notes that MCP (Model Context Protocol) is letting LLMs connect directly to enterprise databases and write code, threatening 100,000 middleware companies.

Dave highlights that voice-AI customer-service conversations worth $10-$1,000 cost only 10-20 cents of compute.
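Dave's point is really a margin argument. A minimal sketch of the arithmetic, using the episode's figures (the $0.15 midpoint compute cost is an illustrative assumption, not a quoted number):

```python
# Value-to-cost ratio for voice-AI customer service, per Dave's claim:
# conversations worth $10-$1,000 cost roughly 10-20 cents of compute.
compute_cost = 0.15  # assumed midpoint of the 10-20 cent range, in dollars

for conversation_value in (10, 1_000):
    ratio = conversation_value / compute_cost
    print(f"${conversation_value} conversation at ${compute_cost:.2f} compute "
          f"≈ {ratio:,.0f}x value-to-cost")
```

Even at the low end ($10 conversations), the value delivered is roughly 67x the compute spend, which is the structural reason this market moves fast.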

Key Segments

Notable Claims