There is no such thing as generated CSAM, because the term exists specifically to distinguish anything made-up from photographic evidence of child rape. This term was already developed to stop people from lumping together Simpsons rule 34 with the kind of images you report to the FBI. Please do not make us choose yet another label, which you would also dilute.
Generated images of a minor can certainly fulfill the definition of CSAM. It’s a child, it’s sexual, it’s abusive, it’s material. It’s CSAM, dude.
These are the images you report to the FBI. Your narrow definition is not the definition. We don’t need to make a separate term because it still impacts the minor even if it’s fake. I say this as a somewhat annoying prescriptivist pedant.
There cannot be material from the sexual abuse of a child if that sexual abuse did not fucking happen. The term does not mean ‘shit what looks like it could be from the abuse of some child, I guess.’ It means state’s evidence of actual crimes.
It is sexual abuse even by your definition if photos of real children get sexualised by AI and land on xitter. And afaik that is what’s happened. These kids did not consent to have their likeness sexualised.
Nothing done to your likeness is a thing that happened to you.
Do you people not understand reality is different from fiction?
Please send me pictures of your mom so that I may draw her naked and post it on the internet.
My likeness posted for the world to see in a way I did not consent to is a thing done to me.
Your likeness depicted on the moon does not mean you went to the moon.
Your likeness modified to show you naked and being fucked, printed out and stapled to a tree in your neighborhood, is OK then?
A threat of murder is a crime without being the same thing as murder.
Meditate on this.
CSAM is abusive material of a sexual nature involving a child. Generated or real, both fit this definition.
CSAM is material… from the sexual abuse… of a child.
Fiction does not count.
You’re the only one using that definition. There is no stipulation that it’s from something that happened.
Where is your definition coming from?
My definition is from what words mean.
We need a term to specifically refer to actual photographs of actual child abuse. What the fuck are we supposed to call that, such that schmucks won’t use the same label to refer to drawings?
I already did the “what words mean” thing earlier.
- involves a child
- is sexual
- is abusive (here’s your Simpsons exclusion, btw)
- is material
That’s literally every word of CSAM, and it fits.
Why? You’ve made a whole lot of claims that it should be your way, but you’ve provided no sources nor any justification as to why we need to delineate between real and AI.
The Rape, Abuse, & Incest National Network (RAINN) defines child sexual abuse material (CSAM) as “evidence of child sexual abuse” that “includes both real and synthetic content.”
How do you think a child would feel after having a pornographic image generated of them and then published on the internet?
Looks like sexual abuse to me.
Dude, just stop jerking off to kids whether they’re cartoons or not.
‘If you care about child abuse, please stop conflating it with cartoons.’
‘Pedo.’
Fuck off.
Someone needs to check your hard drive, mate. You’re way, way too invested in splitting this particular hair.