• ryper@lemmy.ca · +22/−1 · edited 9 hours ago

    It’s too hard to tell real CSAM from AI-generated CSAM. Safest to treat it all as CSAM.

    • greenskye@lemmy.zip · +8/−1 · 8 hours ago

      I get this and I don’t disagree, but I also hate that AI fully brought back thought crimes as a thing.

      I don’t have a better approach or idea, but I really don’t like that simply drawing a certain arrangement of lines and colors is now a crime. I’ve also seen a lot of positive sentiment toward applying this to other forms of porn as well, ones less universally hated.

      I’m not supporting this use case at all, and on balance I think this is the best option we have, but I do think thought crimes as a concept are just as concerning, especially given the current political climate.

      • shani66@ani.social · +7 · 7 hours ago

        Sure, I think it’s weird to really care about loli or furry or any other niche, but AI generating material of actual children (and unwilling people besides) is actually harmful. If they can’t have effective safeguards against that harm, it makes sense to restrict it legally.

    • mindbleach@sh.itjust.works · +4/−16 · 9 hours ago

      You can insist every frame of Bart Simpson’s dick in The Simpsons Movie should be as illegal as photographic evidence of child rape, but that does not make them the same thing. The entire point of the term CSAM is that it’s the actual real evidence of child rape. It is nonsensical to use the term for any other purpose.

      • deranger@sh.itjust.works · +11/−3 · edited 9 hours ago

        The *entire point* of the term CSAM is that it’s the actual real evidence of child rape.

        You are completely wrong.

        https://rainn.org/get-the-facts-about-csam-child-sexual-abuse-material/what-is-csam/

        “CSAM (“see-sam”) refers to any visual content—photos, videos, livestreams, or AI-generated images—that shows a child being sexually abused or exploited.”

        “Any content that sexualizes or exploits a child for the viewer’s benefit” <- AI goes here.

        • mindbleach@sh.itjust.works · +4/−11 · edited 8 hours ago

          RAINN has completely lost the plot by conflating the explicit term for Literal Photographic Evidence Of An Event Where A Child Was Raped with made-up bullshit.

          We will inevitably develop some other term like LPEOAEWACWR, and confused idiots will inevitably misuse that to refer to drawings, and it will be the exact same shit I’m complaining about right now.

          • deranger@sh.itjust.works · +4/−4 · 8 hours ago

            Dude, you’re the only one who uses that strict definition. Go nuts with your prescriptivism, but I’m pretty sure it’s a lost cause.

      • VeganBtw@piefed.social · +5/−3 · edited 9 hours ago

        Child pornography (CP), also known as child sexual abuse material (CSAM) and by more informal terms such as kiddie porn, is erotic material that involves or depicts persons under the designated age of majority.
        […]
        Laws regarding child pornography generally include sexual images involving prepubescents, pubescent, or post-pubescent minors and *computer-generated images that appear to involve them*.
        (Emphasis mine)

        https://en.wikipedia.org/wiki/Child_pornography

        • mindbleach@sh.itjust.works · +3/−5 · 9 hours ago

          ‘These several things are illegal, including the real thing and several made-up things.’

          Please stop misusing the term that explicitly refers to the real thing.

          ‘No.’