• Lvxferre [he/him]@mander.xyz · 3 hours ago

    That is a lot of text for someone who couldn’t even be bothered to read the first paragraph of the article:

    Grok has the ability to take photos of real people, including minors, and produce images of them undressed or in otherwise sexually compromising positions, flooding the site with such content.

    There ARE victims, lots of them.

    You’re only rewording what I said in the third paragraph, while implying I said the opposite, and bullshitting that I didn’t read the text. (I did.)

    Learn to read, dammit. I’m saying this shit Grok is doing is harmful, and that people ITT arguing “is this CSAM?” are missing the bloody point.

    Is this clear now?

  • Atomic@sh.itjust.works · 2 hours ago

      Yes, it certainly comes across as you arguing for the opposite, since above you reiterated:

      The real thing to talk about is the presence or absence of a victim.

      Which has never been an issue. It has never mattered for CSAM whether it’s fictional or not; it’s the depiction itself that is illegal.