Last updated Nov 29, 2025
AI economy
Friedberg predicts that as AI model performance improves and costs decline over the coming years, demand for AI compute and applications will grow nonlinearly (i.e., accelerating rather than saturating or shrinking).
"Look, I think as performance improves, as cost declines, like any economic model, there's a pretty nonlinear relationship with demand. So we'll find new ways to apply this technology. I think the demand is only going to go nonlinear."
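One way to make the quoted "nonlinear relationship with demand" concrete is a constant-elasticity demand curve: when demand for compute is price-elastic (elasticity above 1), each decline in cost per unit of compute raises the quantity demanded more than proportionally, so total spend grows even as unit prices fall. The sketch below is purely illustrative; the elasticity, cost-decline rate, and scale constant are assumptions, not figures from the transcript or from the sources cited below.

```python
# Hypothetical constant-elasticity demand model: Q = A * p**(-epsilon).
# All parameter values are illustrative assumptions, not figures from
# Friedberg or any source cited on this page.

A = 100.0                   # demand scale constant (arbitrary compute units)
epsilon = 1.6               # assumed price elasticity of demand (>1 = elastic)
p = 1.00                    # starting cost per unit of compute (arbitrary $)
annual_cost_decline = 0.30  # assume unit cost falls 30% per year

for year in range(6):
    quantity = A * p ** (-epsilon)  # units of compute demanded at price p
    spend = quantity * p            # total spend = quantity * unit price
    print(f"year {year}: unit cost {p:.2f}, demand {quantity:7.1f}, spend {spend:6.1f}")
    p *= 1 - annual_cost_decline

# Because epsilon > 1, quantity grows faster than price falls, so total
# spend rises every year: cheaper compute expands rather than shrinks the
# overall market (a Jevons-style effect).
```

With an assumed elasticity below 1, the same cost decline would shrink total spend, which is why the prediction hinges on demand being strongly elastic.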
Explanation

Evidence since mid‑2023 strongly supports Friedberg’s prediction that AI compute and application demand would go nonlinear as models improved and effective costs fell:

  • Explosive growth in AI/GenAI spending: Gartner forecasts worldwide GenAI spending to jump from about $365B in 2024 to $644B in 2025, a 76% YoY increase after an estimated 337% surge in 2024, explicitly attributing this to better foundational models and rising demand for AI products. (gartner.com) Gartner also projects overall AI-related spending reaching ~$1.5T in 2025 and surpassing $2T in 2026, indicating a rapidly accelerating curve rather than saturation. (gartner.com)
  • Server and data‑center build‑out is described as exponential/nonlinear: Gartner notes that GenAI will "nearly triple server sales from 2023 to 2028" and that data‑center systems are the highest‑growth IT segment due to GenAI, a classic nonlinear infrastructure ramp. (gartner.com) TrendForce reports that capex by the eight major cloud providers in 2025 alone will roughly equal their combined 2023–2024 spending and grow another 24% in 2026, driven specifically by AI GPUs and custom AI ASICs. (trendforce.com)
  • Chip and infrastructure vendors see surging, not plateauing, demand: NVIDIA’s revenue more than doubled year‑over‑year in fiscal 2025 (up 114%), with fourth‑quarter data‑center revenue up 93% year over year, on top of >100% growth the prior year, all driven by AI training and inference workloads. (nvidianews.nvidia.com) Dell and others report multi‑billion‑dollar AI‑server backlogs and rapidly rising AI‑server forecasts. (reuters.com)
  • Improved performance and effective cost reductions are explicitly cited as drivers: Market analyses tie rapid AI‑infrastructure growth to new, higher‑performance GPU generations (e.g., NVIDIA’s Blackwell) that deliver much more compute per dollar and thereby catalyze enterprise AI adoption. (prnewswire.com) Even where total frontier‑model training bills rise, research attributes this to training costs growing roughly 2.4× per year as organizations keep scaling up the compute they buy, not to demand flattening, which is another sign of escalating appetite for AI compute. (arxiv.org) A quick arithmetic check of these growth figures is sketched after this list.
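As a back-of-the-envelope check, the headline growth rates cited above compound quickly. The spending figures and growth rates in the sketch come from the bullets above; the five-year compounding horizon and the capex normalization are illustrative assumptions.

```python
# Back-of-the-envelope checks of the growth figures cited above.

genai_2024 = 365.0  # Gartner GenAI spending estimate for 2024, $B
genai_2025 = 644.0  # Gartner GenAI spending forecast for 2025, $B
yoy = genai_2025 / genai_2024 - 1
print(f"GenAI spend growth 2024 -> 2025: {yoy:.0%}")  # ~76%, matching the cited figure

# Frontier training costs growing ~2.4x per year, compounded over five years
# (the five-year horizon is an illustrative assumption, not a forecast).
print(f"2.4x/year compounded over 5 years: {2.4 ** 5:.0f}x")  # ~80x

# TrendForce: 2025 hyperscaler capex roughly equals combined 2023-2024 spend,
# then grows another 24% in 2026 (values normalized to the 2023-24 annual average).
avg_capex_2023_2024 = 1.0
capex_2025 = 2 * avg_capex_2023_2024   # about the two prior years combined
capex_2026 = capex_2025 * 1.24
print(f"Implied 2026 capex vs. 2023-24 annual average: {capex_2026:.2f}x")
```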

Across multiple independent data points, demand for AI compute and AI applications from 2023–2025 has grown at a rapid, accelerating (effectively nonlinear) pace rather than saturating or shrinking, aligning well with Friedberg’s prediction for the "coming years."