• TheJesusaurus@sh.itjust.works · 8 hours ago

    Why confront the glaring issues with your “revolutionary” new toy when you could just suppress information instead?

    • Ex Nummis@lemmy.world · 8 hours ago

      This was about sending a message: “stfu or suffer the consequences.” Hence, the next people who encounter something similar will think twice about reporting anything.

      • Devial@discuss.online · 6 hours ago (edited)

        Did you even read the article? The dude reported it anonymously, to a child protection org, not Google, and his account was nuked as soon as he unzipped the data, because the content was automatically flagged.

        Google didn’t even know he reported this, and Google has nothing whatsoever to do with this dataset. They didn’t create it, and they don’t own or host it.

          • Devial@discuss.online · 5 hours ago (edited)

            They didn’t react to anything. The automated system (correctly) flagged and banned the account for CSAM, and, as usual, the manual ban appeal sucked ass and didn’t do what it’s supposed to do. Whilst this is obviously a unique case and the ban should have been overturned on appeal right away, it does make sense that the appeals team, broadly speaking, rejects “I didn’t know this contained CSAM” as a legitimate appeal reason. This is barely newsworthy. The real headline should be about how hundreds of CSAM images were freely available and shareable in this dataset.

              • Devial@discuss.online · 5 hours ago (edited)

                They reacted to the presence of CSAM. It had nothing whatsoever to do with the material being part of an AI training dataset, contrary to what the comment I originally replied to claims.