• deranger@sh.itjust.works · 9 hours ago

    Generating sexual images of a minor can certainly fulfill the definition of CSAM. It’s a child, it’s sexual, it’s abusive, it’s material. It’s CSAM, dude.

    These are the images you report to the FBI. Your narrow definition is not the definition. We don’t need to make a separate term because it still impacts the minor even if it’s fake. I say this as a somewhat annoying prescriptivist pedant.

    • mindbleach@sh.itjust.works · 9 hours ago

      There cannot be material from the sexual abuse of a child if that sexual abuse did not fucking happen. The term does not mean ‘shit that looks like it could be from the abuse of some child, I guess.’ It means state’s evidence of actual crimes.

      • Sas@piefed.blahaj.zone · 8 hours ago

        It is sexual abuse even by your definition if photos of real children get sexualised by AI and land on xitter. And as far as I know, that is what’s happened. These kids did not consent to having their likeness sexualised.

        • mindbleach@sh.itjust.works · 8 hours ago

          Nothing done to your likeness is a thing that happened to you.

          Do you people not understand reality is different from fiction?

      • deranger@sh.itjust.works · 9 hours ago

        CSAM is abusive material of a sexual nature involving a child. Generated or real, both fit this definition.

          • deranger@sh.itjust.works · 8 hours ago

            You’re the only one using that definition. There is no stipulation that it has to come from something that actually happened.

            Where is your definition coming from?

            • mindbleach@sh.itjust.works · 8 hours ago

              My definition is from what words mean.

              We need a term to specifically refer to actual photographs of actual child abuse. What the fuck are we supposed to call that, such that schmucks won’t use the same label to refer to drawings?

              • deranger@sh.itjust.works · 15 minutes ago

                I already did the “what words mean” thing earlier.

                - involves a child
                - is sexual
                - is abusive (here’s your Simpsons exclusion, btw)
                - is material

                That’s literally every word of CSAM, and it fits.

                “We need a term to specifically refer to actual photographs of actual child abuse”

                Why? You’ve made a whole lot of claims that it should be your way, but you’ve provided no sources and no justification for why we need to delineate between real and AI.

          • queermunist she/her@lemmy.ml · 8 hours ago

            How do you think a child would feel after having a pornographic image of them generated and then published on the internet?

            Looks like sexual abuse to me.