• sramder@lemmy.world
    7 months ago

    […] will only take a few hallucinations before no one trusts LLMs to write code or give advice

    Because none of us have ever blindly pasted some code we got off Google and crossed our fingers ;-)

    • Avid Amoeba@lemmy.ca
      7 months ago

      It’s way easier to figure that out than to check ChatGPT hallucinations. There’s usually someone explaining why a response on SO is wrong, either in another answer or in a comment. You can filter out most of the garbage right at that point, without having to put it in your codebase and discover the problem the hard way. You get none of that information with ChatGPT. The data it spits out is not equivalent.

    • Seasm0ke@lemmy.world
      7 months ago

      Split a segment of data without PII to a staging database, test the pasted script, then completely rewrite the script over the next three hours.
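
The staging-first workflow described in that comment can be sketched roughly as follows. This is a minimal illustration only: the `orders` table, its columns, and the use of SQLite are assumptions for the example, not details from the thread.

```python
import sqlite3

# Hypothetical "production" database holding one PII column (email).
prod = sqlite3.connect(":memory:")
prod.execute("CREATE TABLE orders (id INTEGER, email TEXT, total REAL)")
prod.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, "a@example.com", 9.99), (2, "b@example.com", 25.00)],
)

# Staging copy: same table, but the PII column is excluded entirely.
staging = sqlite3.connect(":memory:")
staging.execute("CREATE TABLE orders (id INTEGER, total REAL)")
rows = prod.execute("SELECT id, total FROM orders").fetchall()
staging.executemany("INSERT INTO orders VALUES (?, ?)", rows)

# A pasted, untrusted script can now be exercised against staging
# without touching production data or exposing PII.
count, total = staging.execute(
    "SELECT COUNT(*), SUM(total) FROM orders"
).fetchone()
```

The point is simply that the untrusted script runs against a disposable, PII-free copy first, so a hallucinated or buggy query costs a rebuild of staging rather than a production incident.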