Fun experiment: Ask Google if there are more stars in the solar system than grains of sand in a glass of water. See the AI confidently say “yes” and then refresh the query and see it confidently say “no”.
I have experimented and can very often get the AI to give whatever answer I want (e.g., on a yes/no question) by subtly rewording the query. The results are super easy to manipulate.
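The flip test described above can be sketched as a tiny harness: ask the same yes/no question several ways and check whether the answers agree. Everything here is hypothetical scaffolding, not any real API; `ask_model` is a stub standing in for whatever chat or search model you are probing, wired to flip on superficial rewording just to show the harness shape.

```python
def ask_model(prompt: str) -> str:
    # Hypothetical stub: a real test would call an actual model here.
    # This stub answers "yes" only for one phrasing, mimicking a model
    # whose answer flips when the question is superficially reworded.
    return "yes" if prompt.lower().startswith("are there more") else "no"

def flip_test(paraphrases: list[str]) -> bool:
    """Return True if the model gives the same answer to every paraphrase."""
    answers = {ask_model(p) for p in paraphrases}
    return len(answers) == 1

paraphrases = [
    "Are there more stars in the solar system than grains of sand in a glass of water?",
    "Is the number of stars in the solar system larger than the number of sand grains in a glass of water?",
]

print(flip_test(paraphrases))  # False for the stub: its answer is unstable
```

A model whose answers survive this kind of rewording passes the check; one that can be steered to either answer, as described above, fails it.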
The same is predictably true in research, which means a lot of the academic research being produced at the moment is complete crap.
https://arxiv.org/abs/2509.08825
Yes, it is annoying to see text that confidently asserts nonsense.