Moonshots EP 208: AI Costs Plummeting 40x — Why Costs Are Collapsing and What It Really Means
Summary
A wide-ranging WTF episode. The panel covers Anthropic's rise in enterprise LLM market share (overtaking OpenAI on API usage): Anthropic projects $70B revenue at 77% gross margins by 2028, versus OpenAI's projected $100B with no profitability until 2029. They explore Fei-Fei Li's World Labs and its 3D Gaussian splat approach to world generation, which is compute-efficient and renders client-side, in contrast to pixel-wise server-side approaches like Google's Genie 3. A Goodfire paper on separating memorized knowledge from reasoning weights draws excitement over the prospect of a sub-billion, or even million-parameter, model that retains general intelligence. Google's nested learning paper on higher-order meta-learning is framed as a step toward lifelong learning and a potential grand unified theory of ML. Cost-of-living and unemployment anxieties in the Global South provide the opening frame, with Salim reporting from Brazil on growing international demand.
Key Segments
- [00:00-01:00] Global anxiety frame — cost of living, unemployment, poverty as top concerns
- [03:00-06:00] Anthropic overtakes OpenAI in enterprise API share; code generation as critical path to recursive self-improvement
- [07:00-12:00] Anthropic $70B revenue projection at 77% margin vs OpenAI $100B unprofitable; Bezos strategy parallels
- [13:00-17:00] World Labs' Marble model — 3D Gaussian splats vs pixel-wise generation; client vs server compute tradeoffs
- [18:00-23:00] Goodfire paper on separating memorized data from reasoning weights; 90% parameter reduction through distillation
- [24:00-27:00] Google nested learning — higher-order meta-learning, compression as path to AGI
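The client-vs-server tradeoff in the World Labs segment comes down to what gets shipped over the wire: splat parameters once, or generated pixels every frame. A rough back-of-envelope sketch (the splat layout and counts here are illustrative assumptions, not Marble's actual format):

```python
from dataclasses import dataclass

@dataclass
class Gaussian:
    # A 3D Gaussian splat primitive: position, per-axis scale,
    # rotation quaternion, RGB color, and opacity.
    mean: tuple      # (x, y, z)
    scale: tuple     # (sx, sy, sz)
    rotation: tuple  # quaternion (w, x, y, z)
    color: tuple     # (r, g, b)
    opacity: float

FLOATS_PER_SPLAT = 3 + 3 + 4 + 3 + 1  # 14 floats per primitive
BYTES_PER_FLOAT = 4

def scene_payload_bytes(num_splats: int) -> int:
    """One-time download: the whole scene as splat parameters,
    which the client GPU then rasterizes locally at any frame rate."""
    return num_splats * FLOATS_PER_SPLAT * BYTES_PER_FLOAT

def pixel_stream_bytes(width: int, height: int, fps: int, seconds: float) -> int:
    """Pixel-wise server-side generation: raw RGB frames must be
    streamed continuously for as long as the user explores."""
    return int(width * height * 3 * fps * seconds)

# A hypothetical 1M-splat scene is a ~56 MB one-time payload, while an
# uncompressed 720p/24fps pixel stream costs ~66 MB every second.
scene = scene_payload_bytes(1_000_000)
stream_per_second = pixel_stream_bytes(1280, 720, 24, 1)
```

Compression changes the absolute numbers on both sides, but not the structure of the argument: splats amortize server compute into a one-time transfer, pixel-wise generation pays it continuously.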
Notable Claims
- Anthropic projects $70B revenue by 2028 at 77% gross margin
- OpenAI projects $100B revenue but unprofitable until 2029
- 90% reductions in parameter count through distillation are “pretty common” now
- Code generation may be the critical path to recursive self-improvement (Alex’s framing)
- Compression beyond a certain threshold produces a phase transition to general intelligence
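The "90% parameter reduction" claim refers to standard teacher-student distillation: a small student is trained to match a large teacher's temperature-softened output distribution rather than hard labels. A minimal sketch of the Hinton-style distillation objective (pure stdlib, illustrative only; not the Goodfire method, which separates memorization from reasoning weights before compressing):

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax; higher T softens the distribution,
    # exposing the teacher's "dark knowledge" about wrong-class similarities.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL(teacher || student) on temperature-softened distributions,
    scaled by T^2 so gradients stay comparable across temperatures."""
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return temperature ** 2 * kl
```

The loss is zero when the student exactly matches the teacher and grows as their distributions diverge; in practice it is mixed with a hard-label cross-entropy term.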
Guests
- Salim Ismail — Founder of OpenExO
- Dave Blundin — Co-host
- Alexander Wissner-Gross — Computer scientist, founder of Reified
RDCO Mapping
- Sanity Check angle: The Anthropic vs OpenAI business model divergence (margin vs growth) is a clean narrative hook
- Data point: 77% margin vs unprofitable-until-2029 is a concrete comparison for the newsletter’s “numbers that matter” thread
- Vault cross-ref: Connects to demonetization, AI cost curves, world models, and distillation efficiency threads