As a follow-up to this post in this community: The Future is NOT Self-Hosted

I have thought about how to set up local, community-hosted fediverse servers that respect privacy and anonymity while still guaranteeing that users joining the server are human beings.

The reasoning behind these requirements is:

  • You want anonymity to guarantee that people won’t face repercussions in real life for the opinions they voice on the internet (freedom of speech).
  • You want to keep the fediverse human, i.e. make sure that bot accounts are in the minority.

This might sound like an impossible and self-contradictory set of constraints, but it is indeed possible. Here’s how:

Have the local library set up a fediverse server. Once a month, there’s a “crypto party” where participants throw a piece of paper with their fediverse account name into a box. The box is then closed and shaken to mix all the tokens in it. Then each one is picked out, and the library confirms that this account name is indeed connected to a human. Since humans have to be physically present to throw in a paper, it is guaranteed that no bot army can just open a hundred anonymous accounts. Also, that way the papers are not associated with a particular person.
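For illustration, here is a minimal Python sketch of what one such monthly round could look like from the admin’s side once the box is emptied. The `approve_account` hook and the account names are hypothetical stand-ins; this does not model Lemmy’s actual admin API.

```python
import secrets

def registration_round(submitted_slips, approve_account):
    """Process one monthly "crypto party" round.

    submitted_slips: account names collected on paper from physically
    present participants (one slip per person).
    approve_account: hypothetical hook that marks an account as
    human-verified on the server.
    """
    # Shuffle the slips with a cryptographically strong RNG, mirroring
    # the shaking of the box: the approval order reveals nothing about
    # who dropped which slip.
    slips = list(submitted_slips)
    secrets.SystemRandom().shuffle(slips)

    # Approve each account name; the server only learns that *some*
    # attendee owns it, not which one.
    for account in slips:
        approve_account(account)

# Example round with three made-up slips.
if __name__ == "__main__":
    registration_round(
        ["@alice_example", "@bob_example", "@carol_example"],
        lambda acct: print(f"verified as human: {acct}"),
    )
```

The anonymity comes entirely from the physical collection and shuffle; the code only reproduces that shuffle so the server never sees an ordering that could be matched against who entered the room when.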

  • poVoq@slrpnk.net · 17 hours ago

    Admin review of account applications in Lemmy works fine. If you ask people to write a bit, it’s quite easy to sort out the bots, as there are always giveaways. And if people use LLMs to write the responses, then that’s on them 🤷

      • poVoq@slrpnk.net · 16 hours ago

        You can just make the questions more location- or theme-specific. There is no way a bot won’t slip up on stuff like that, and it doesn’t need to be 100% foolproof either.

        We get a lot of LLM bot applications on our instance, and even if it got 10x harder, they would still be really easy to spot.

  • rtxn@lemmy.world · 17 hours ago

    Have you heard of surveillance cameras and facial recognition? If a hostile actor knows in advance that members of a targeted online community will be physically present at a location at a given time, those people will be linked to the community. It doesn’t take a lot from there to link specific people to accounts.

    Besides, libraries are having a hard enough time just existing in America. They don’t need the burden of protecting the identities of dozens of people and fighting off lawyers and enforcers.

    • gandalf_der_12te@discuss.tchncs.de (OP) · 17 hours ago

      If the group of people registering their account names is big enough, facial recognition doesn’t do much, as it can only link a person to one of a hundred or a thousand possible account names.

      • rtxn@lemmy.world · 16 hours ago

        I don’t think you fully comprehend just how many footprints people leave behind on the internet. Users would have to practice perfect opsec – and I mean completely, absolutely perfect. One mistake, like using an e-mail address or an alias off-site, will link a person to the account. If that person cracks under legal threats, the entire operation is fucked. It’s happened before.

        Thinking you can solve the issue of privacy with a single idea is simply delusional.

  • skribe@aussie.zone · 18 hours ago

    Immediate flaws I can see:

    Cameras (or a human observer) undo any sense of anonymity. A bad actor could link a participant with an account.

    What’s preventing a MitM attack, where the BBEL (Big Bad Evil Librarian) replaces the participants’ addresses with bot addresses?

    • gandalf_der_12te@discuss.tchncs.de (OP) · 17 hours ago

      “Cameras (or a human observer) undo any sense of anonymity. A bad actor could link a participant with an account.”

      Hence the mixing (shuffling) of the paper cards before they are registered, so there’s no longer a one-to-one mapping between humans and account names.

    • gandalf_der_12te@discuss.tchncs.de (OP) · 17 hours ago

      You have to go there in person to cast a piece of paper with your account name on it, similar to casting a ballot. It’s anonymous because the pieces of paper cannot be associated with a particular human, just as voting ballots are anonymous, but it still conveys the information that a human registered this account.

      • SavvyWolf@pawb.social · 25 minutes ago

        What’s stopping someone from making a new account every month this way, or from going to many different libraries, and then just selling the accounts to bot farm operators?