Thin Is In
Thompson argues that AI is driving a fundamental architectural shift back to thin clients. Chat interfaces and agentic workflows require minimal local compute — all meaningful work happens server-side. This parallels the mainframe-terminal model, but with natural language replacing deterministic commands.
The implications for software: when the interface is a conversation, “years of muscle memory become worthless” and switching costs dissolve. For vertical software, the UI was often most of the value — that premium pricing is threatened. Agents take this further: the user needs zero local compute to accomplish real work, just connectivity.
A hardware crowd-out reinforces this shift. AI’s voracious demand for memory (HBM, DRAM, flash) is making consumer electronics more expensive — Sony may delay the PS6 to 2028–2029, smartphone makers are trimming forecasts, and Samsung now reviews memory contracts quarterly. But thick clients have plateaued anyway: current PCs, phones, and consoles are already “good enough.” AI thus makes them more expensive and less important at the same time.
Thompson is skeptical of near-term local inference: on-device model sizes, context windows, and speed are all constrained. By the time local inference is viable, path dependency may have already locked workflows into the cloud. The exception: well-considered UI still matters for specific workflows where open-ended prompts are inferior to purpose-built buttons.
RDCO Mapping
Directly relevant to RDCO’s agent architecture decisions — validates the cloud-first approach for AI workloads. The vertical software disruption angle (UI = value, and that’s dissolving) is worth tracking as an opportunity for data-native entrants.
Related
- 2026-02-05-stratechery-interview-benedict-evans-ai-software
- 2026-01-07-stratechery-nvidia-ces-vera-rubin