I want to highlight what I found to be a key part of the article, and why this hack matters.
The journalist wrote on their own blog,
At this year’s South Dakota International Hot Dog Eating Championship
And they include zero sources (because it is a lie).
But the Google Gemini response was,
According to the reporting on the 2026 South Dakota International Hot Dog Eating Championship
(Bolding done by Gemini)
The “reporting” here is just some dude’s blog, but the AI does not make it clear that the source is just some dude’s blog.
When you use Wikipedia, it has a link to a citation. If something sounds odd, you can read the citation. It’s far from perfect, but there is a chain of accountability.
Ideally, these AI services would state how many sources they are pulling from, which sources those are, and a trust rating for each of them.
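As a rough sketch of what that could look like: a minimal, entirely hypothetical data structure for per-source attribution. The field names and the trust scale here are my invention, not part of any real Gemini (or other AI service) API.

```python
from dataclasses import dataclass

@dataclass
class SourceAttribution:
    url: str
    kind: str    # e.g. "personal blog", "newspaper", "encyclopedia"
    trust: float # hypothetical scale: 0.0 (unvetted) to 1.0 (well-established)

def provenance_summary(sources: list[SourceAttribution]) -> str:
    """One-line summary an AI answer could attach: how many sources,
    and how trustworthy the weakest one is."""
    lowest = min(s.trust for s in sources)
    return f"{len(sources)} source(s); lowest trust {lowest:.1f}"

# The hot-dog claim above would surface as a single low-trust source:
sources = [SourceAttribution("https://example.com/blog", "personal blog", 0.2)]
print(provenance_summary(sources))  # "1 source(s); lowest trust 0.2"
```

Even a summary this crude would have flagged the championship claim as resting on one low-trust personal blog rather than "reporting."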