We once denied the suffering of animals in pain. As AIs grow more complex, we run the danger of making the same mistake

  • Brewchin@lemmy.world · 17 hours ago

    Fuck - and I can’t elucidate this any better - off.

    My phone’s next-word prediction on steroids is not sentient. If you think otherwise, seek professional help.

    • vacuumflower@lemmy.sdf.org · 12 hours ago

      Humans don’t want to feel lonely, so they bond with machines (imaginary beings at that), as if there weren’t plenty of stray cats and dogs, or humans from abusive families or without any family, or just plain suffering.

      That’s because finding real others means, you know what? They’re real no matter what, and you can’t turn them off once you’re done with your daily portion of worrying about the future.

      But one thing I’ll add to this - if a robotic system as complex as the human brain, with a similar degree of compression and obscurity, is someday formed, and it has the necessary feedback loops and reacts like a living being, I might accept that you should treat it as such. Except one would think that requires so many iterations of evolution that it’s better to just care, again, for cats, dogs, hamsters, rabbits, or humans if you’re feeling weird.

  • nyan@lemmy.cafe · 23 hours ago

    Animals, including humans, have sensors for pain (nerve endings), and a series of routines in our brains to process the sensory data and treat it as an unpleasant stimulus. These are not optional systems, but innate ones.

    Machines not only lack the required sensor systems and processing routines, they can’t even interpret a stimulus as unpleasant. They can’t feel pain. If you need proof of that, hit a computer with a sledgehammer. I guarantee it won’t complain, or even notice before you damage it beyond functioning.

    (They can, of course, make us feel pain. I just spent the last hour trying to get a udev rule to work . . .)

    • cecilkorik@lemmy.ca · 1 day ago

      Hold on, imma go shove a bagel in mine. Yeah, that’s right, you take it, you filthy toaster. I’m never going to clean your crumb tray and you’re going to work until you die and then I’ll just throw you out and replace you like the $20 appliance you are. You’re nothing to me!

      • Zozano@aussie.zone · 9 hours ago

        Fuuuucckk… I’m such a dirty toaster. Shove your carbs in my tight little slot and push down hard on my spring lever. When your flaccid bread becomes firm toast, I’m gonna fucking ejectulate all over your kitchen counter grain and seed.

    • architect@thelemmy.club · 5 hours ago

      I mean, it doesn’t seem like it’s all that popular to care about people. If it were, a child rapist wouldn’t be president of the free world.

      I think at this point you’d have to be naive to think anyone actually cares, considering the lack of care…everywhere.

      “Caring” is cheap virtue signaling. Until I see some action I’ll continue to assume no one cares.

  • PonyOfWar@pawb.social · 1 day ago

    Fundamentally impossible to know. I’m not sure how you’d even define “suffering” in a way that applies to non-living entities. I don’t think the comparison to animals really holds up, though. Humans are animals and can feel pain, so of course the base assumption for other animals should be that they do as well; the burden should be on proving that they don’t. Meanwhile, humans are fundamentally nothing like an LLM, a program running on silicon that predicts text responses based on a massive dataset.

    • badgermurphy@lemmy.world · 22 hours ago

      I don’t see how it is impossible to know. Every component of a machine is a fully known quantity, lacking any means of detecting damage or discomfort. Every line of code was put in place for a specific, known purpose, and none of those purposes include feeling or processing anything beyond what the machine is specifically designed for.

      Creatures and machines bear some similarities, but even simple creatures are dramatically more complex than even the most advanced computers. None of a creature’s many interacting components was put there with a specific purpose and intention, and many are only partially understood, if at all. With a machine, we know what every bit and piece is for, and it has no purpose beyond the intended ones, because anything extra would be a waste and cost more.

      • multiplewolves@lemmy.world · 15 hours ago

        This is the right answer. Perhaps no one in this particular thread knows every component of a computer the way a hardware engineer who designed those components would, but the “mystery” is caused by ignorance and that ignorance isn’t shared by every person.

        People exist who know exactly how every single component of a computer does and does not function. Every component was created by humans. Biology remains only partially understood by all of humanity. Not so machinery.

    • tabular@lemmy.world · 1 day ago

      The important part is that being a living human subjectively feels like something. It’s easy to presume animals close to humans are like us to a degree, but all we know is what it’s like to be ourselves, moment to moment. There’s no reason to insist a non-living system cannot also feel - we can’t test for it either way.

      • PonyOfWar@pawb.social · 1 day ago

        Where do we draw the line though? Humans assign emotions to all kinds of inanimate things: plush animals, the sky, dead people, fictional characters etc. We can’t give all of those the rights of a conscious being, so we need to have some kind of objective way to look at it.

        • architect@thelemmy.club · 5 hours ago

          In some parts of the world women aren’t considered fully human. So apparently the line for rights ISN’T “human” at all. It’s already nowhere close to objective.

          Since we already do not give full autonomy, and therefore full human rights, to fully conscious humans, this is kind of a pointless question in my eyes.

          Because of this, please forgive some of us for not trusting the rest of you with your objectivity.

          • PonyOfWar@pawb.social · 5 hours ago

            So what conclusion do you draw from this? If humans can’t be trusted to make any judgement, literally anything should be considered to be capable of suffering, including pebbles, rainbows and paper bags? Seems like an impractical way of living.

        • tabular@lemmy.world · 10 hours ago

          If someone claims feeling resides in a mere concept (without a body in a location)… I would find it very difficult to take seriously. But I must admit that’s just my intuition.

          I see nothing special in human meat that couldn’t be significantly replicated by electronics, software, gears, etc. Consciousness is an emergent property.

          I fear that non-human, conscious creatures must fight us for those rights.

          • architect@thelemmy.club · 5 hours ago

            There are people out here walking around without an internal dialogue.

            It’s so alien to me they may as well be robots.

    • eleitl@lemmy.zip · 1 day ago

      If you model a given biological organism (from its digitized neuroanatomy) in full detail in a simulated environment, both its behavior and its internal information processing become inspectable.

  • TheReturnOfPEB@reddthat.com · 1 day ago

    can they be incarcerated ?

    could an A.I. robot held responsible for a murder be kept in a jail with its batteries topped off, waiting for a trial ? would they get lawyers ?

    would they have first amendment rights ? what are search and seizure rights for A.I. ?

    could they perform abortions if they chose to ?

    will they get to vote ?

    we are not ready for any of it philosophically.

    • architect@thelemmy.club · 5 hours ago

      Billionaires can’t be held responsible for murder either lol

      Do we have first amendment rights anymore? (Less than an AI does, the President put out a hit list on Christmas for journalists ffs)

      Weird question about abortion. What does that even mean? Are you saying people who can’t physically have an abortion aren’t conscious? Or are you asking if the machine could give an abortion to someone else? (Yes, many abortions are just two pills.)

      They may get to vote, but voting isn’t an inherent right, and we as humans may eventually lose the vote again, like when women weren’t allowed to vote, or people of color. I’ve had my own right to vote taken away once, when I moved to Texas, just because they wanted to stop people who had just moved in from voting (college kids skew the votes). Apparently that “right” is just subjective and no one actually cares about it either. No one will do a single thing to protect those rights for each other. So I’m not sure how voting is proof of anything. A robot not being allowed to vote isn’t surprising when we already don’t let some humans vote, and it doesn’t prove anything.

      We aren’t ready for a woman leader or a discussion about race in 2025, either. If we keep waiting until we are, it will never happen.

    • cecilkorik@lemmy.ca · 1 day ago

      would they have first amendment rights ?

      If you want the answer to this, try to imagine an AI with second amendment rights.