“First Neuralink Implanted & Where Other Tech Giants Are Headed w/ Salim Ismail” — Peter H. Diamandis Moonshots EP #85
Episode summary
Another WTF-in-tech roundup with Diamandis and Ismail covering: Google and Microsoft spending more on compute than on people ($30B and $50B respectively on data centers); the emerging humanoid robotics race (Figure AI receiving $500M from Microsoft/OpenAI, Tesla Optimus Gen 2, Ameca by Engineered Arts); self-driving progress (Tesla FSD shrinking from 300K lines of C++ to ~3K lines); X/Twitter’s potential to become the world’s biggest bank via crypto wallets; Neuralink’s first human implant; and a wide-ranging debate on whether humanoid is the right form factor for robots. Ismail argues humanoid robots face the same “Roomba problem” (too many edge cases in physical space) and prefers specialized forms; Diamandis counters that our world is built for human bodies, so the humanoid form is optimal. They agree surgical robots are the most compelling near-term application because of “learn once, apply a million times” distributed intelligence. Side discussions include Ralph Merkle’s thermodynamically reversible computation (10 additional orders of magnitude), Eric Drexler’s molecular assemblers, and the sea squirt that eats its own brain.
Key arguments / segments
- [00:03:00] Google spending $30B and Microsoft $50B on data centers; chip compute approaching/exceeding the equivalent of 8B human brains; billion-dollar companies possible with 1-3 people
- [00:06:00] Nanotechnology: Ralph Merkle’s thermodynamically reversible computation offers 10 additional orders of magnitude; molecular assemblers could build anything for ~$1/pound
- [00:10:00] Figure AI humanoid robot making coffee; generative AI watching humans and learning by repetition replaces manual programming
- [00:13:00] “Roomba problem” debate: Ismail argues physical-world adaptation is inherently hard; Diamandis counters with Tesla FSD progress (300K lines of C++ down to ~3K lines)
- [00:20:00] Optimus at $20K (a $500/month lease, roughly $17/day, for household labor); humanoid form needed because we live in a human-shaped world
- [00:29:00] Surgical robots: distributed learning is the killer app; every surgery uploads to the cloud, every robot benefits; “learn once, apply a million times”
Notable claims
- Google and Microsoft are collectively spending $80B on data centers, more than their combined personnel costs
- Sam Altman has floated the one-person billion-dollar company as a coming milestone
- Ralph Merkle’s reversible computation, encoding logic in molecular bonds, could extend Moore’s Law by another 10 orders of magnitude
- Elon’s target price for Optimus is $20K (equivalent to a ~$500/month lease)
- Sea squirt eats its own brain after it stops needing to move — brains evolved specifically for physical navigation
Bias / sponsor flags
- Fountain Life sponsorship: standard mid-roll by Diamandis
- Diamandis discloses his venture fund invested in Figure AI
- Both hosts are uniformly bullish on all technologies discussed
- No discussion of labor displacement concerns from humanoid robots
- Self-driving timeline optimism despite acknowledging they’ve been wrong for a decade
Relevance to Ray Data Co
Low-moderate. The “learn once, apply a million times” distributed robot learning model is an interesting parallel for how we think about AI agent learning. The compute-exceeds-brains milestone and Merkle’s reversible computation insight are useful framing for long-term AI infrastructure thinking. The one-person billion-dollar company thesis is relevant to how we structure our own operations.