A profound relational revolution is underway, not orchestrated by tech developers but driven by users themselves. Many of the 400 million weekly users of ChatGPT are seeking more than just assistance with emails or information on food safety; they are looking for emotional support.
“Therapy and companionship” have emerged as two of the most frequent applications for generative AI globally, according to the Harvard Business Review. This trend marks a significant, unplanned pivot in how people interact with technology.
This seems a bit far-fetched, don’t you think? There could be so many reasons why someone would rather use AI than go to another person for advice (this is not just about women).
Honestly, as someone who actually went to therapy (and yes, my therapist was a woman), I found it quite tough to open up and be vulnerable.
I think some people using AI might feel less vulnerable because it isn’t a person. However, they may not realize that their data is being gathered.
With all this, I can’t figure out whether you’re serious, trolling, or just writing randomly.