A profound relational revolution is underway, not orchestrated by tech developers but driven by users themselves. Many of the 400 million weekly users of ChatGPT are seeking more than just assistance with emails or information on food safety; they are looking for emotional support.

“Therapy and companionship” have emerged as two of the most frequent applications for generative AI globally, according to the Harvard Business Review. This trend marks a significant, unplanned pivot in how people interact with technology.

    • rottingleaf@lemmy.world · 14 hours ago

      Well. Not very different from “opening up” to hashish fumes or Tarot cards or Chinese fortune cookies.

      And robotic therapists are a common enough component of classical science fiction, not even all dystopian.

      For the record, I agree that the results suck. Everything around us is falling apart; have you noticed?

      You can do more with less at a 1% deadly error rate, and you can do much more with much less at a 10% deadly error rate. Military and economic logic says the latter wins, which means the latter wins evolution.

      And we (that is, our parents and grandparents) built a nice world intended for low error rates, because they didn’t think such a contradiction between efficiency and correctness would happen, or they thought it was our job to root out the weeds of our own time, loosely quoting Tolkien, just as they rooted out theirs as well as they could.

      Which means that nice world doesn’t survive evolution.

      • MonkderVierte@lemmy.zip · 9 hours ago

        Maybe in the short term, and in an ultracapitalist society. But entrusting your innermost fears and hurts to a company is hardly GDPR compliant, even less so in the more caring social economies.