Friedberg's original statement: "There's no reason that in seven years that is not the standard, that I don't have the ability to say: go look at all the software that's out there in the world today, and help me build a tool that meets compliance standards, that meets all of my security standards."
The prediction's timeframe has not yet elapsed. Friedberg was effectively saying that, by roughly seven years after December 20, 2024 (around 2031–2032), it will be standard practice for organizations to use AI systems that can automatically design and generate production software (including security and compliance logic) from natural-language instructions and analysis of existing software.
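For concreteness, a minimal sketch of the deadline arithmetic (the December 20, 2024 episode date and the November 30, 2025 evaluation date are from this writeup; the seven-year horizon is Friedberg's):

```python
from datetime import date

# Date the prediction was made (episode date, per the source).
prediction_date = date(2024, 12, 20)

# Friedberg's horizon: "seven years" out, i.e., late 2031.
deadline = prediction_date.replace(year=prediction_date.year + 7)

# Evaluation date used in this assessment.
as_of = date(2025, 11, 30)

print(deadline)                 # 2031-12-20
print((deadline - as_of).days)  # roughly 2,200 days still remaining
```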
As of November 30, 2025:
- Advanced AI coding agents such as Devin can already plan, write, test, and refactor code autonomously; integrate with tools like Slack, Jira, and Linear; learn from existing codebases; and generate production pull requests for real companies, including large refactors at the digital bank Nubank. (devin.ai)
- These tools remain in early-access or pilot-style deployments, and they are framed as powerful assistants or teammates whose work must be reviewed before deployment, not as universally adopted, fully trusted, compliance-aware generators of end-to-end production systems across organizations. (newbits.ai)
However, the claim concerns what will be standard practice by around 2031–2032, and it is only 2025. While the prediction is clearly not yet fully realized, its ultimate correctness cannot be judged with the deadline still several years away.
Therefore, the appropriate status is: it’s too early to tell whether the prediction will ultimately be right or wrong.