• Catoblepas@piefed.blahaj.zone · 7 hours ago

    If LLMs can’t do whatever you tell them based purely on natural language instructions, then they need to stop advertising them that way.

    It’s not just the advertising that’s the problem: do any of them even have user manuals? How is a user with no experience prompting LLMs (which was everyone 3 years ago) supposed to learn how to formulate a “correct” prompt without any instructions? It’s a smokescreen for blaming any bad output on the user.

    Oh, it told you to put glue on your pizza? You didn’t prompt it right. It gave you explicit instructions on how to kill yourself because you talked about being suicidal? You prompted it wrong. It completely made up new medical anatomical terminology? You have once again prompted it wrong! (Don’t make me dig up links to all those news stories.)

    It’s funny how the fediverse tends to come down so hard on the side of “RTFM” for anything Linux-related, but with LLMs it’s somehow the user’s fault for not realizing they were being sold a fraudulent product that ships with no manual at all.

    • fonix232@fedia.io · 4 hours ago

      Sounds like you’re the kind of person who needs the “don’t put your fucking pets in the microwave” warnings.