So I can see like a couple of years from now, all these big enterprises are going to figure this thing out, and then you're not necessarily going to need to pay for the ChatGPT stuff. If there's an internal tool and an internal LLM that
As of November 30, 2025, only about a year and a half has passed since the June 2024 prediction, which explicitly set a horizon of "a couple of years" (roughly mid-2026). That window has not yet elapsed, so the outcome cannot be judged definitively.

Current data suggest the market has not shifted in the way described. Surveys show that around 80% of organizations pay for subscriptions to tools like ChatGPT or Microsoft Copilot and 63% use cloud AI APIs, while only 39% run open-source models on their own infrastructure and 27% train proprietary in-house LLMs. (siliconangle.com) Another survey, cited by the Financial Times, reports that only about one in eight commercial AI workloads runs on open models; most customers prefer paid, state-of-the-art proprietary systems, and even open-model users typically mix them with commercial ones. (ft.com) A separate study finds that open models account for only about 20% of usage and 4% of revenue in the AI market, despite being up to 84% cheaper to operate. (itpro.com) Enterprise spending on generative AI is still rising overall, with a majority of organizations planning to increase LLM spending and OpenAI and Google models remaining widely used. (electronicspecifier.com) Netskope telemetry likewise shows that enterprise AI use is dominated by SaaS gen-AI apps such as ChatGPT, Gemini, and Copilot, along with managed cloud AI platforms, rather than by internally hosted open-source stacks replacing external services. (reddit.com)

These indicators suggest the prediction has not clearly come true yet, but because the stated timeframe runs to around mid-2026, it is still too early to call it right or wrong.
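The timeframe arithmetic above can be checked with a short sketch. The exact dates are assumptions for illustration: the prediction is pinned to June 1, 2024, the evaluation to November 30, 2025, and the "couple of years" deadline to June 1, 2026 (mid-2026), since the source gives only month-level precision.

```python
from datetime import date

# Assumed dates (month-level precision in the source, pinned to specific
# days here for illustration only).
prediction_date = date(2024, 6, 1)   # when the prediction was made
check_date = date(2025, 11, 30)      # when the outcome is being assessed
deadline = date(2026, 6, 1)          # "a couple of years" ~ mid-2026

# Elapsed time since the prediction, in years.
elapsed_years = (check_date - prediction_date).days / 365.25

# Time still remaining on the prediction's window.
remaining_days = (deadline - check_date).days

print(f"Elapsed since prediction: {elapsed_years:.1f} years")  # ~1.5 years
print(f"Days until mid-2026 deadline: {remaining_days}")
```

This confirms the verdict's framing: roughly 1.5 of the ~2 years have elapsed, and about six months remain before the window closes.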