• 0 Posts
  • 218 Comments
Joined 2 years ago
Cake day: April 3rd, 2024

  • Not what I said at all. I said the few limitations I know of do not matter for 99% of what I would want to do. There are a few use cases I can think of (and that I had in the past) where X11 offered something Wayland does not, but they are such corner cases that, for all intents and purposes, they do not matter.

    Really did not expect such a passive-aggressive response to that…






  • An LLM trained on all books ever written would probably take romance novels, books by flat earthers, or even “Atlas Shrugged” as truth, just as current AIs consider all Stack Overflow comments to contain useful and accurate information.

    Thinking about it, your question comes back to the very first and original instance of a computer and the question interested people asked about it:

    If you put into the machine wrong figures, will the right answer come out?

    Now, if we allow ourselves the illusion of assuming that an AGI could exist, and that it could actually learn by itself in a similar way to humans, then that quote alone leads us to these two truths:

    • LLMs cannot help being stupid; they just do not know any better.
    • AGIs will probably be idiots, just like the humans asking the above question, but there is at least a chance that they will not.







  • What You’re Proposing by Status Quo.

    I know they have this whole minimalist gig going on and all, but… damn guys, really, the same words over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over again, and then some.

    I do like some of their other songs, I really do, but I hate this one with a passion.