• nymnympseudonym@piefed.social

It's more than time for regulation. LLMs can generally detect when a conversation has turned delusional or psychotic, or is heading toward self-harm. Those conversations should, at a minimum, be logged and terminated.
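I mean the kind of gate a provider could bolt onto the serving loop today. Rough sketch below (Python, with a toy keyword check standing in for whatever moderation classifier the provider actually runs; the label names and threshold are made up):

```python
import logging
from dataclasses import dataclass

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("safety_gate")

@dataclass
class RiskResult:
    label: str    # e.g. "self_harm", "delusional", "none"
    score: float  # classifier confidence in [0, 1]

def classify_risk(conversation: list[str]) -> RiskResult:
    # Toy stand-in: a real deployment would call the provider's own
    # moderation model here, not a keyword match.
    text = " ".join(conversation).lower()
    if "hurt myself" in text or "end it all" in text:
        return RiskResult("self_harm", 0.95)
    return RiskResult("none", 0.0)

def should_terminate(conversation: list[str], threshold: float = 0.8) -> bool:
    """Log and terminate when the risk classifier fires above threshold."""
    result = classify_risk(conversation)
    if result.label != "none" and result.score >= threshold:
        # Minimum viable response: log the incident, end the session.
        log.warning("terminating session: label=%s score=%.2f",
                    result.label, result.score)
        return True
    return False

if __name__ == "__main__":
    convo = ["I can't take it anymore", "I want to hurt myself"]
    print(should_terminate(convo))  # True -> session ends, incident logged
```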

EDIT: Of course, existing laws against CSAM should still be enforced and violations prosecuted. You'd think the Pizzagate crowd would be all over this.