Last updated Nov 29, 2025
aitech
If AI capabilities keep compounding on a roughly 48–72 hour cycle, then within six months of April 14, 2023 (i.e., by mid-October 2023), the effective progress in AI will be comparable to 10–12 years of progress at traditional technology innovation rates.
And this is a perfect example where when you start to compound technology at the rate of 24 hours or 48 hours, which we've never really had to acknowledge, most people's brains break and they don't understand what six months from now looks like. And six months from now, when you're compounding at 48 or 72 hours is like 10 to 12 years in other technology solutions.
Explanation

Evidence cuts both ways, and there is no objective yardstick for “10–12 years of progress” or a well-defined “traditional innovation rate,” so the claim can’t be cleanly verified or falsified.

Why it might look roughly right (in spirit):

  • Between April and October 2023, there was a flood of new frontier and open(-ish) models and tooling: Anthropic’s Claude 2 (July 2023), Meta’s Llama 2 (July 2023), Alibaba’s Tongyi/Qwen (beta April, public release September 2023), IBM’s Granite models (announced September 2023), and many similar efforts, all riding the wave started by GPT‑4 in March 2023.(en.wikipedia.org)
  • 2023 is widely described as an “AI boom,” with large language models and generative AI becoming central to science and technology news.(en.wikipedia.org)
  • A major policy analyst reviewing 2023’s AI developments said it felt like “10 years worth of news” compressed into one year, reflecting a broadly shared sense of unusually fast change.(csis.org) This supports Chamath’s qualitative intuition that AI progress that year felt vastly compressed in time.

Why it might be overstated or wrong in detail:

  • Chamath framed this as “compounding at 48 or 72 hours,” implying extraordinarily fast, near-continuous leaps in capability. Contemporary quantitative work on language-model progress instead finds that, over 2012–2023, the compute needed to hit a given performance level halves roughly every 5–14 months (median ~8 months), not every 2–3 days.(arxiv.org) The back-of-envelope sketch after this list makes the size of that gap concrete.
  • By 2024, multiple analyses described generative AI as entering a more incremental phase, with plateauing qualitative gains and underwhelming real‑world usefulness relative to the hype, suggesting that 2023 did not trigger a sustained, runaway 48–72‑hour compounding regime.(wired.com)
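For scale, here is a minimal back-of-envelope sketch in Python. It assumes, purely for illustration, that “compounding” means one capability doubling per cycle, that “other technology solutions” double roughly every two years (a Moore’s-law-style baseline), and that six months is about 182 days; none of these assumptions comes from the quote itself.

```python
# Back-of-envelope check of the "compounding every 48-72 hours" premise.
# Assumptions (illustrative, not from the source): one doubling per cycle,
# "traditional" technology doubling roughly every 2 years, six months ~= 182 days.

SIX_MONTHS_DAYS = 182
TRADITIONAL_DOUBLING_YEARS = 2.0  # assumed Moore's-law-style baseline

def equivalent_traditional_years(cycle_days: float) -> float:
    """Years of 'traditional' progress implied by six months of doubling every cycle_days."""
    doublings = SIX_MONTHS_DAYS / cycle_days       # compounding periods in six months
    return doublings * TRADITIONAL_DOUBLING_YEARS  # same number of doublings at the slower rate

for cycle_days in (2, 3):  # 48-hour and 72-hour cycles
    doublings = SIX_MONTHS_DAYS / cycle_days
    print(f"{cycle_days * 24}h cycle: ~{doublings:.0f} doublings in six months "
          f"~= {equivalent_traditional_years(cycle_days):.0f} 'traditional' years")

# For comparison, the ~8-month halving rate cited above implies well under
# one doubling of effective compute efficiency in any six-month window.
print(f"~8-month halving rate: ~{6 / 8:.2f} doublings in six months")
```

Under those assumptions, a literal 48–72‑hour cycle would imply on the order of 60–90 doublings in six months, far beyond even the “10–12 years” claimed, while the empirically estimated ~8‑month halving rate yields less than one doubling over the same window; this is why the premise reads as rhetorical rather than quantitative.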

Because (1) the prediction is partly rhetorical, (2) the 48–72‑hour compounding premise is not supported by quantitative data, yet (3) many observers still describe 2023’s AI developments as feeling like many years of ordinary progress compressed into a single year, the claim cannot be judged clearly right or wrong using available evidence; it remains ambiguous.