The number of questions on Stack Overflow fell by 78 percent in December 2025 compared to a year earlier. Developers are switching en masse to AI tools.
Serious question here. LLMs trained on data from SO. Developers now ask LLMs for solutions instead of SO. New technology comes out that LLMs don’t have indexed. Where will LLMs get their data to train on for new technologies? You can’t exactly feed them a manual and expect them to extrapolate or understand (and for that matter, what manual?).
I am worried, because there are increasing cases of open source docs going offline: the projects can’t absorb the bandwidth costs of the big LLM bots recrawling hundreds of times per day. Wikipedia is also getting hammered. There is so much waste, and the returns are diminishing.
Yes, that is the major problem with LLMs in general. There is no solution aside from “train on a different source (like Reddit)”, but then we rinse and repeat.
There is a solution: RLVR (reinforcement learning with verifiable rewards).
I guess, though I’m pretty ignorant as to how RLVR would fix the issue that arises with new programming languages or even new major versions. I’m not sure how LLMs would ever get to a correct answer if they don’t have good reference material to start from.
The assumption seems to be that an LLM can’t figure out a manual or source code. If it can’t, then you have to pay people. But that’s not a universally valid assumption.
You can do that (feed it a manual and have it learn from it) to a degree, via RLVR. They are also paying human experts. But that’s the situation now; who knows how it will be in a couple more years. Maybe training AIs will be like writing a library, a framework, …
From the questions people ask and from online accounts.
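To make the RLVR replies above concrete, here is a minimal sketch of a verifiable-reward loop for code, in Python. Everything in it is illustrative: generate_candidates stands in for sampling from a model, and Task / verifiable_reward are hypothetical names, not any particular lab’s API. The point is that the reward comes from executing the candidate against tests (or a compiler, or a new language’s toolchain), not from a human-written reference answer such as a Stack Overflow post.

```python
# Minimal, illustrative sketch of a verifiable-reward signal (the "VR" in
# RLVR). The reward comes from executing a candidate against tests, not from
# a reference answer scraped from a Q&A site.
from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class Task:
    prompt: str                              # task description shown to the model
    tests: List[Callable[[Dict], bool]]      # verifiers run against the candidate


def run_candidate(source: str) -> Dict:
    """Execute candidate source in a fresh namespace (sandboxing omitted)."""
    namespace: Dict = {}
    try:
        exec(source, namespace)
    except Exception:
        return {}
    return namespace


def verifiable_reward(task: Task, source: str) -> float:
    """Fraction of tests passed: 1.0 means the candidate is verifiably correct."""
    namespace = run_candidate(source)
    if not namespace:
        return 0.0
    passed = 0
    for test in task.tests:
        try:
            if test(namespace):
                passed += 1
        except Exception:
            pass                              # a crashing test earns no reward
    return passed / len(task.tests)


def generate_candidates(prompt: str, n: int) -> List[str]:
    """Hypothetical stand-in for sampling n completions from an LLM."""
    return [
        "def add(a, b):\n    return a + b",   # correct candidate, reward 1.0
        "def add(a, b):\n    return a - b",   # wrong candidate, reward 0.0
    ][:n]


if __name__ == "__main__":
    task = Task(
        prompt="Write add(a, b) returning the sum of two numbers.",
        tests=[
            lambda ns: ns["add"](2, 3) == 5,
            lambda ns: ns["add"](-1, 1) == 0,
        ],
    )
    for candidate in generate_candidates(task.prompt, n=2):
        reward = verifiable_reward(task, candidate)
        print(f"reward={reward:.2f}\n{candidate}\n")
    # An RL step (e.g. a policy-gradient update) would use these rewards to
    # adjust the model; no human-written answer is consulted in the loop.
```

A real RLVR setup would feed these rewards into a reinforcement-learning update of the model; the sketch only shows where the training signal comes from, which is why no SO-style corpus is strictly required.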