• flamingo_pinyata@sopuli.xyz
    1 day ago

    Humans suffer from the same problem. Racism and sexism are consequences of humans training on a flawed dataset, and overfitting the model.

    • rottingleaf@lemmy.world
      1 day ago

That’s also why LARPers of past scary figures tend to be crueler and trashier than their prototypes. The prototypes had a bitter solution to some problem; the LARPers are just trying to be as bad or worse, because that’s what gets remembered, and they perceive that as respect.

    • x00z@lemmy.world
      1 day ago

      Politicians shape the dataset, so “flawed” should be “purposefully flawed”.