• bearboiblake@pawb.social · 11 hours ago

    inb4 “In a stunning 5-4 decision, the Supreme Court has ruled that AI-generated CSAM is constitutionally protected speech”

    • mindbleach@sh.itjust.works · 10 hours ago

      There is no such thing as generated CSAM, because the term exists specifically to distinguish anything made-up from photographic evidence of child rape. This term was already developed to stop people from lumping together Simpsons rule 34 with the kind of images you report to the FBI. Please do not make us choose yet another label, which you would also dilute.

      • deranger@sh.itjust.works · 9 hours ago

        Generating images of a minor can certainly meet the definition of CSAM. It’s a child, it’s sexual, it’s abusive, it’s material. It’s CSAM, dude.

        These are the images you report to the FBI. Your narrow definition is not the definition. We don’t need to make a separate term because it still impacts the minor even if it’s fake. I say this as a somewhat annoying prescriptivist pedant.

        • mindbleach@sh.itjust.works · 9 hours ago

          There cannot be material from the sexual abuse of a child if that sexual abuse did not fucking happen. The term does not mean ‘shit that looks like it could be from the abuse of some child, I guess.’ It means state’s evidence of actual crimes.

          • Sas@piefed.blahaj.zone · 8 hours ago

            It is sexual abuse even by your definition if photos of real children get sexualised by AI and land on xitter. And afaik, that is what’s happened. These kids did not consent to having their likeness sexualised.

            • mindbleach@sh.itjust.works · 8 hours ago

              Nothing done to your likeness is a thing that happened to you.

              Do you people not understand reality is different from fiction?

              • athatet@lemmy.zip · 25 minutes ago

                Please send me pictures of your mom so that I may draw her naked and post it on the internet.

                  • Sas@piefed.blahaj.zone · 5 hours ago

                    Your likeness, modified so you’re naked and being fucked, printed out and stapled to a tree in your neighbourhood, is okay then?

          • deranger@sh.itjust.works · 9 hours ago

            CSAM is abusive material of a sexual nature involving a child. Generated or real, both fit this definition.

              • deranger@sh.itjust.works · 8 hours ago

                You’re the only one using that definition. There is no stipulation that it’s from something that happened.

                Where is your definition coming from?

                • mindbleach@sh.itjust.works · 8 hours ago

                  My definition is from what words mean.

                  We need a term to specifically refer to actual photographs of actual child abuse. What the fuck are we supposed to call that, such that schmucks won’t use the same label to refer to drawings?

                  • deranger@sh.itjust.works · 14 minutes ago

                    I already did the “what words mean” thing earlier.

                    - involves a child
                    - is sexual
                    - is abusive (here’s your Simpsons exclusion, btw)
                    - is material

                    That’s literally every word of CSAM, and it fits.

                    “We need a term to specifically refer to actual photographs of actual child abuse”

                    Why? You’ve made a whole lot of claims that it should be your way, but you’ve provided no sources and no justification for why we need to delineate between real and AI.

              • queermunist she/her@lemmy.ml · 8 hours ago

                How do you think a child would feel after having a pornographic image generated of them and then published on the internet?

                Looks like sexual abuse to me.

        • mindbleach@sh.itjust.works · 9 hours ago

          ‘If you care about child abuse please stop conflating it with cartoons.’

          ‘Pedo.’

          Fuck off.

          • Leraje@piefed.blahaj.zone · 1 hour ago

            Someone needs to check your hard drive, mate. You’re way, way too invested in splitting this particular hair.

      • ruuster13@lemmy.zip · 8 hours ago

        The Rape, Abuse & Incest National Network (RAINN) defines child sexual abuse material (CSAM) as “evidence of child sexual abuse” that “includes both real and synthetic content.”