I was having a conversation with my friend about this. We were discussing AI, and like so many others she believes AI will destroy all of humanity. I personally don’t believe that. I’m aware of the theories and the multitude of ways it could happen, and I understand that, in theory, we wouldn’t understand an AI’s goals and so wouldn’t know how it would destroy us. But again, that’s just a theory.

There’s also the constant fear of a massive nuclear holocaust in WWIII, but I also don’t believe we’d realistically get to the point of using nukes on each other, knowing the implications of what would happen. It made me realize that we’re constantly fearful of mass extinction, to the point where some people fight tooth and nail and will not try to look at things from a more positive or optimistic perspective. It’s all death, or you’re wrong.

Please help me understand this. I’m here with open ears.

  • trustnoone@lemmy.sdf.org · 1 year ago

    Eh, I don’t think we’re obsessed with it; it’s more that it’s the likely outcome given human nature, largely pushed by:

    • greed
    • differences
    • only asking “could we” and not “should we”
    • intensely_human@lemm.ee · 1 year ago

      I’m sorry, human nature? As in humans’ tendency to stop existing? To all just die out rather than proliferate everywhere and master new levels of reality at an accelerating rate?

      What about human nature suggests we won’t survive?