From the article:
This chatbot experiment reveals that, contrary to popular belief, many conspiracy thinkers aren’t ‘too far gone’ to reconsider their convictions and change their minds.
If they’re gullible enough to be suckered into it, they can just as easily be suckered out of it, but clearly the effect wouldn’t be permanent.
That doesn’t follow from the “if you didn’t reason your way into a belief, you can’t reason your way out” line. Considering religious fervor, I’m more inclined to believe that line than yours.
No one said the AI used “reason” to talk people out of a conspiracy theory. In fact I’d assume that’s incredibly unlikely, since AI in general is not reasonable.
Why? It works as a corollary - there’s no logic involved in any of the stages described.
I’ve always believed the adage that you can’t logic someone out of a position they didn’t logic themselves into. It protects my peace.
Logic isn’t the only way to persuade; in fact, all evidence seems to show it works on very few people.
Everyone discounts sincere emotional arguments, but frankly that’s all I’ve ever seen work on conspiracyheads.