2025 was the year AI stopped feeling chaotic and started feeling buildable. In this Lightcone episode, the YC partners break down the surprises of the year, from shifting model dominance to why the re...
YC partners discuss how the AI landscape stabilized in 2025, with Anthropic overtaking OpenAI as the preferred model provider among YC startups (Claude at 52%, OpenAI's share declining). The conversation reveals a maturing ecosystem where model commoditization benefits application-layer startups, infrastructure buildout creates opportunities despite appearing bubble-like, and the playbook for AI-native companies has crystallized. Key insight: we're transitioning from the 'installation phase' of heavy CapEx investment to the 'deployment phase' in which the next generation of applications will proliferate.
Analysis of YC Winter 2026 batch data reveals Anthropic's Claude has become the #1 API choice (52%+) among applicants, surpassing OpenAI for the first time. The shift is driven by Claude's superior coding performance and its dominance in vibe-coding tools, with Gemini also climbing to a 23% share. The discussion covers model personalities and switching behavior among technical users.
Startups are building orchestration layers to swap models dynamically based on task-specific performance, moving away from single-vendor loyalty. Series B companies now use different models for different stages (e.g., Gemini for context engineering, OpenAI for execution), with proprietary evals determining the winner for each use case. This commoditization of models shifts value to the application layer.
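A minimal sketch of what such an orchestration layer might look like: route each task to whichever model scores best on your own evals, instead of hard-coding one vendor. All task names, model identifiers, and scores below are hypothetical placeholders, not figures from the episode.

```python
# Sketch of eval-driven model routing. Task names, model names, and scores
# are hypothetical; a real system would refresh EVALS as new models ship.
from dataclasses import dataclass

@dataclass
class EvalResult:
    model: str    # placeholder identifiers, e.g. "claude-x", "gemini-y", "gpt-z"
    task: str     # e.g. "context_engineering", "code_execution"
    score: float  # pass rate from your proprietary eval suite

EVALS = [
    EvalResult("gemini-y", "context_engineering", 0.91),
    EvalResult("gpt-z", "context_engineering", 0.84),
    EvalResult("claude-x", "code_execution", 0.93),
    EvalResult("gpt-z", "code_execution", 0.88),
]

def best_model(task: str) -> str:
    """Return the highest-scoring model for a task, per the latest evals."""
    candidates = [e for e in EVALS if e.task == task]
    if not candidates:
        raise ValueError(f"no eval data for task {task!r}")
    return max(candidates, key=lambda e: e.score).model

def run(task: str, prompt: str) -> str:
    model = best_model(task)
    # A real router would dispatch to the vendor SDK for `model` here;
    # this sketch only shows the routing decision.
    return f"[{model}] would handle: {prompt}"

if __name__ == "__main__":
    print(run("context_engineering", "summarize this 200-page contract"))
    print(run("code_execution", "implement the migration script"))
```

The point of the design is that vendor choice becomes a data-driven, per-task decision that can flip whenever a new model version changes the eval leaderboard.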
Partners argue the perceived 'AI bubble' in infrastructure spending (NVIDIA, OpenAI CapEx) actually creates opportunities for application-layer startups, similar to how telecom overinvestment in the 90s enabled YouTube. The glut of compute capacity, driven by competition among AI labs and chip makers, means lower costs and more resources for college students and founders building applications.
Power generation and land constraints are pushing data center innovation to space. StarCloud (YC S24) pioneered the concept, which Google and Elon Musk have since embraced. Boom Supersonic pivoted to using jet engines for data center power because of supply chain backlogs. The YC portfolio includes the 'trifecta': StarCloud (space data centers), Helion/Boom (energy), and Zephyr Fusion (space-based fusion).
The skills needed to train models are spreading, much as general startup know-how dispersed over the past decade. Open-source models combined with fine-tuning and RL let 8B-parameter domain-specific models beat OpenAI on vertical tasks. However, companies must maintain their own post-training infrastructure to keep pace as frontier models improve.
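A minimal sketch of the "open-weights base model plus supervised fine-tuning" recipe, using Hugging Face transformers/datasets. The base model, dataset, and hyperparameters are placeholders rather than anything recommended in the episode, and RL-based post-training would be a further step layered on top of this.

```python
# Sketch: supervised fine-tuning of an open-weights causal LM on a tiny
# stand-in for a proprietary vertical dataset. All choices are placeholders.
from datasets import Dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

BASE_MODEL = "gpt2"  # stand-in; in practice swap in an 8B open-weights base model

tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
if tokenizer.pad_token is None:
    tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(BASE_MODEL)

# Tiny illustrative dataset standing in for domain-specific data (e.g. legal Q&A).
raw = Dataset.from_dict({
    "text": [
        "Q: What is the notice period in clause 4? A: Thirty days.",
        "Q: Who bears liability under section 7? A: The supplier.",
    ]
})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

train_ds = raw.map(tokenize, batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="vertical-sft",
                           per_device_train_batch_size=1,
                           num_train_epochs=1),
    train_dataset=train_ds,
    # mlm=False makes the collator build causal-LM labels from the inputs.
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```

The catch the partners flag is the last sentence above: this pipeline has to be rerun and re-evaluated every time the frontier moves, or the specialized model's edge erodes.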
What started as observed founder behavior in early 2025 evolved into a major category, with companies like Replit, Emergence, and Google's Antigravity (led by YC alum Varun Mohan). While vibe coding isn't yet reliable enough to produce 100% of production code, it has gained significant traction, and even Sundar Pichai is publicly discussing it.
The AI economy has stabilized into clear layers (models, infrastructure, applications) with established playbooks for building AI-native companies. Unlike 2024's rapid ground-shifting changes, 2025 saw incremental model improvements without major disruptions. This maturation means finding startup ideas has returned to normal difficulty levels: the 'wait a few months for new capabilities' strategy no longer works.
Fast-takeoff predictions (like the AI 2027 report) haven't materialized, due to log-linear scaling laws and organizational resistance to change. First-wave AI companies like Harvey that raised massive rounds and did victory laps now face serious competition from second-wave startups like Legora and Giga. Despite AI efficiency gains, companies still hire at pre-AI rates because customer expectations rise faster than productivity.
What Surprised Us Most In 2025