• howrar@lemmy.ca · 1 month ago

    LLMs cannot:

    • Tell fact from fiction
    • Accurately recall data from their training set
    • Count (see the sketch after this list)
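
    To make the counting point concrete, here is a toy Python sketch (no assumptions beyond the standard library): ordinary code counts characters exactly and deterministically, while an LLM sees tokens rather than letters and has no reliable counting mechanism.

        # Deterministic code counts exactly, every time.
        word = "strawberry"
        print(word.count("r"))  # always 3; an LLM working on tokens may well say 2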

    LLMs can:

    • Translate
    • Get the general vibe of a text (sentiment analysis; see the sketch after this list)
    • Generate plausible text
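
    As a rough illustration of the sentiment-analysis point, here is a minimal Python sketch using the Hugging Face transformers pipeline (assumes the transformers package is installed; which pretrained model it downloads by default is an implementation detail).

        # Off-the-shelf sentiment analysis via a pretrained transformer.
        from transformers import pipeline

        classifier = pipeline("sentiment-analysis")
        print(classifier("The plot dragged, but the acting was superb."))
        # e.g. [{'label': 'POSITIVE', 'score': 0.98}]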

    Semantics aside, they're very different skills that require different setups to accomplish. Just because counting is an easier task than analysing text for humans doesn't mean it's the same for an LLM. You can't use that as evidence of its inability to do the "harder" tasks.