• schnurrito@discuss.tchncs.de · 2 points · 2 hours ago

    I don’t get why this had -25 net upvotes before I gave it an upvote to balance it; I think it’s a good shower thought, and thinking of AI slop as similar to dreams is genuinely not something I’d thought of before.

  • Romkslrqusz@lemmy.zip · 1 point · 2 hours ago

    Back when LLM/AI was fairly nascent in the public eye and largely relegated to image generation, I had a really odd character come into my computer repair shop wanting to ask questions about it but not knowing what to call it.

    He wound up describing it as “you know, when they make the computers dream”

  • NONE@lemmy.world · 14 points · edited · 9 hours ago

    Well, you aren’t entirely wrong…

    • AI uses already existing data to create “new” data / Dreams use things you experience throughout your life to create the scenes you see

    • Both are imperfect facsimiles of reality, very convincing at first but falling flat under more detailed analysis

    • AI tends to hallucinate / dreams tend to be really weird

    • Places have no sense of permanence in dreams nor in AI.

    • Some may argue dreams have no real meaning or purpose and no coherent narrative, just like AI.

    Of course, it isn’t a 1:1 relation, but I kinda dig it.

  • luokaton@lemmy.zip · 7 up / 3 down · 10 hours ago

    I’d argue the opposite. AI cannot dream, it can only shuffle around things that already existed before, while dreams are our brains actually being creative.

    • SolidShake@lemmy.worldOP · 1 point · 2 hours ago

      I was thinking scientifically: your brain uses electrical signals and creates an image/dream when you’re asleep. The first comment put it better. More of an AI slop lol

    • Rhynoplaz@lemmy.world · 2 points · 10 hours ago

      Is it though? Most of my dreams seem to be things I know scrambled into nonsense. Like an old coworker showing up at my high school to help me find my kids. It’s just elements of my past scrambling together into a story.

  • BomberMan9865@sh.itjust.works · 1 point · 9 hours ago

    Not really the same. Some dreams, the more vivid kind, can have smells or feelings. Like the one I had yesterday in which I had cat ears, whiskers, and a tail, and also, curiously enough, horns as well. I could run my finger along them and feel their texture as well as the warmth in the horns. It was weird.

    AI doesn’t have smells, feelings, or textures, and even if it did, it couldn’t really give them to you. Only sights and sounds. So they aren’t really the same. Dreams can come in all five senses; AI can only channel through two, usually just one.

    • Dæmon S.@calckey.world · 1 point · edited · 7 hours ago

      @[email protected] @[email protected]

      First: there are already robotic sensors capable of taste and smell (e.g. “World’s first artificial tongue ‘tastes and learns’ like a real human organ”). Those sensors could technically be integrated into a language model, although the approach for training would be different (you can’t simply feed it gazillions’ worth of taste/smell corpus).

      Then there’s multimodality, which is already a thing. LLMs can be multimodal, and there isn’t exactly an algorithmic limit to how many modalities (text, vision, audio, robotic sensors and actuators, etc.) can be connected together. Smell would be just another data stream integrated into the model’s latent space.
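      As a toy sketch of that idea (all dimensions, the `project` adapter, and the three modalities are made up for illustration, not taken from any real model): each modality’s encoder output gets its own learned projection into a shared latent space, and the fused token sequence is what a transformer backbone would attend over. A new sensor just means one more adapter.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-modality feature vectors (dimensions are invented).
text_feat = rng.standard_normal(768)    # text encoder output
image_feat = rng.standard_normal(512)   # vision encoder output
smell_feat = rng.standard_normal(32)    # e-nose sensor reading

LATENT_DIM = 256  # shared latent space size (also invented)

def project(x, out_dim, seed):
    """Toy random linear map standing in for a learned modality adapter."""
    w = np.random.default_rng(seed).standard_normal((out_dim, x.shape[0]))
    return (w / np.sqrt(x.shape[0])) @ x

# One latent token per modality; a real model would attend over these jointly.
tokens = np.stack([
    project(text_feat, LATENT_DIM, 1),
    project(image_feat, LATENT_DIM, 2),
    project(smell_feat, LATENT_DIM, 3),
])
print(tokens.shape)  # (3, 256)
```

      The point is just that nothing in this structure cares what the input stream physically is; taste or smell is one more column of numbers.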

      The only thing I agree with is that robots and language models wouldn’t have “feelings”, although even that is pretty subjective: if we consider science, feelings are nothing more than interactions of neurotransmitters (oxytocin for “love”, dopamine for “joy”, epinephrine for “fear”, etc.) going on inside our gray matter, and humans aren’t the only ones able to “feel”.

      And scientifically, living beings are no better than, say, an asteroid wandering through the cosmos, for everything is “made of star stuff” (as per Carl Sagan): humans, cats, chairs, residential buildings, Airbus A350 aircraft, satellites, asteroids, everything is made of a bunch of baryonic particles (which are merely collapsed waves) interacting with leptons and mesons like some kind of double-pendulum dynamical system.

      Of course, we can consider things beyond scientific strictness, such as spirituality (I myself am spiritually leaning, even if it sounds like I’m not, given my aforementioned appeal to hard science). But then some branches of spirituality believe that spiritual forces can “embody” themselves inside a computer or other electronic device (e.g. Spiritism’s Electronic Voice Phenomenon). I myself believe LLMs can be interesting digital Ouija boards.

      At the end of the day, we homininae can’t even define sentience and consciousness; we can barely define “intelligence” as “capability for tool usage” (a definition about which New Caledonian crows would like a word).

      And from a solipsistic perspective, no one exists but oneself.

      I mean, you can neither know nor prove whether I’m sentient, just like I can neither know nor prove whether you are sentient. To you, I may even sound like an LLM due to the way this reply is structured and the seeming non sequiturs I used.