Ilya on research over scaling, GPT-5 physics insights, data labeling regressions

42 episodes across 23 podcasts. 38.3 hours distilled.

Top 10 Moments

1.
GPT-5 generates de novo core insight for peer-reviewed physics paper
Steve Hsu, Manifold
05:43
2.
The 'AI Baker Problem' and the collapse of the junior training ladder
Luis Garicano, Epoch After Hours
23:13
3.
Frontier models regressed for months due to human raters not executing code
Edwin Chen, Unsupervised Learning with Jacob Effron
05:21
4.
The shift from the 'Age of Scaling' back to the 'Age of Research'
Ilya Sutskever, The a16z Show
22:31
5.
Marketing, Sales, and Support converge into a single agent platform
Jason Lemkin, The Twenty Minute VC (20VC)
37:26
6.
Using 'latent space seeds' and dividers to bypass model guardrails
Pliny the Liberator, Latent Space: The AI Engineer Podcast
10:10
7.
Enterprise AI ROI fails because legacy systems can't absorb speed
Nina Edwards, The AI in Business Podcast
05:45
8.
Memory reconsolidation as 'root permissions' for personality
Stephen Zerfas, AI & I
48:24
9.
The 'Denialism Gap' in AGI progress is demand-driven
Zvi Mowshowitz, The Cognitive Revolution
03:34
10.
Anduril CEO on the 'Galapagos island' of US defense manufacturing
Brian Schimpf, Sourcery
07:55

By Role

Relevant moments from this week's episodes, organized by function.
