• Cryophilia@lemmy.world
    6 days ago

    Search “AI woman porn miniskirt,”

    I did it with safesearch off and got a bunch of women clearly in their late teens or 20s. Plus, I don’t want to derail my main point, but I think we should acknowledge the difference between a picture of a real child actively being harmed and a 100% fake image. I didn’t find any AI CP, but even if I had, it’s in an entirely different universe of morally bad.

    r/jailbait

    That was, what, fifteen years ago? That’s why I said “in the last decade”.

    • LustyArgonian@lemmy.world
      5 days ago

      “Clearly in their late teens,” lol no. And since AI doesn’t have age, it’s possible that was seeded with the face of a 15yr old and that they really are 15 for all intents and purposes.

      Obviously there’s a difference between AI porn and real porn; that’s why I told you to search AI in the first place??? The convo isn’t about AI porn, but AI porn uses existing images, including CSAM, to seed its new images.

      • Schadrach@lemmy.sdf.org
        5 hours ago

        was seeded with the face of a 15yr old and that they really are 15 for all intents and purposes.

        That’s…not how AI image generation works? AI image generation isn’t building a collage from random images in a database - the model doesn’t contain a database of images at all. It just has a bunch of statistical weights and a network configuration, essentially a statistical model for classifying images, that is told to produce whatever input maximizes an output resembling the prompt, starting from a seed. It’s not “seeded with an image of a 15 year old”; it’s seeded with white noise and then repeatedly nudged toward something that looks like (in this case) “woman porn miniskirt”, until the resulting image is stable.
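        To make the “seeded with white noise” part concrete, here’s a rough sketch of that loop. The names (denoise_step, TOTAL_STEPS) and the toy math are stand-ins I made up for illustration, not any real model’s code - it’s just the shape of the process: noise in, repeated refinement toward the prompt, image out.

        ```python
        # Toy sketch of a diffusion-style sampling loop. The "seed" is random
        # noise; a trained network (stubbed out below) nudges it toward an image
        # the model scores as matching the prompt. Nothing here looks up or
        # stitches together stored source photos.
        import numpy as np

        TOTAL_STEPS = 50  # number of refinement passes (made-up value)

        def denoise_step(image, step, prompt):
            # Stand-in for one pass of a trained denoising network. A real model
            # would apply its learned weights, conditioned on the prompt; this
            # stub ignores the prompt and just pulls values toward a fixed
            # target so the loop runs end to end.
            target = np.zeros_like(image)
            return image + (target - image) / (TOTAL_STEPS - step + 1)

        def generate(prompt, size=(64, 64), seed=0):
            rng = np.random.default_rng(seed)
            image = rng.normal(size=size)      # start from pure white noise
            for step in range(TOTAL_STEPS):    # repeat until the image is stable
                image = denoise_step(image, step, prompt)
            return image

        img = generate("woman porn miniskirt")  # the prompt from the thread
        print(img.shape, float(img.std()))      # noise driven toward the stub's target
        ```

        The point of the stub is that the only inputs to the loop are the random noise and the model’s learned weights; at no step does it reach into a pile of source photos.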

        Unless you’re arguing that, somewhere in the millions of images tagged “woman” analyzed to build that statistical model, there is probably at least one person under 18, and that any image of a “woman” generated by such a model is therefore necessarily underage because the weights were impacted, however slightly, by that image or images. In which case you could also argue that all drawn images of humans are underage, because whoever drew them has probably seen a child at some point, so everything they draw is tainted by ever having been exposed to children.

      • Cryophilia@lemmy.world
        5 days ago

        It’s fucking AI; the face is actually like 3 days old because it is NOT A REAL PERSON’S FACE.

        • LustyArgonian@lemmy.world
          5 days ago

          We aren’t even arguing about this, you giant creep who ALWAYS HAS TO GO TO BAT FOR THIS TOPIC REPEATEDLY.

          It’s meant to LOOK LIKE a 14 yr old because it is SEEDED OFF 14 YR OLDS, so it’s indeed CHILD PORN that is EASILY ACCESSED ON GOOGLE, per the original commenter’s claim that people have to be going to dark places to see this - NO, it’s literally in nearly ALL AI TOP SEARCHES. And it indeed counts for LEGAL PURPOSES in MOST STATES as child porn, even if drawn or created with AI. How many porn AI models look like Scarlett Johansson because they are SEEDED WITH HER FACE? Now imagine who the CHILD MODELS are seeded from.

          You’re one of the people I’m talking about when I say Lemmy has a lot of creepy pedos on it. FYI to all the readers: look at their history.

          • Cryophilia@lemmy.world
            5 days ago

            I’m offended by stupid-ass shit and feel compelled to call it out. Thinking AI-generated fake images are just as harmful as actual children getting abused is perhaps the most obvious definition of “stupid-ass shit” since the invention of stupid-ass shit.