Chat boxes get a system prompt describing them as a helpful assistant, and then a blank they need to predict how to fill. Then they get the exact same system prompt, the word they just filled in, and a new blank. Repeat until the blank becomes an ending token.
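Roughly, the loop looks like this (a toy sketch, not any real model's API; predict_next_token here just fakes the model with a random choice):

```python
import random

END_TOKEN = "<|end|>"  # hypothetical end-of-sequence marker

# Toy stand-in for the model: a real LLM is a neural net that assigns
# a probability to every possible next token given the text so far.
def predict_next_token(context: str) -> str:
    vocab = ["You", " could", " try", " X", "!", END_TOKEN]
    return random.choice(vocab)

def generate(system_prompt: str, conversation: str, max_tokens: int = 50) -> str:
    prefix = system_prompt + conversation   # the fixed part of the prompt
    filled = []                             # everything the model has filled in
    for _ in range(max_tokens):
        # same prefix every round, plus all previously filled blanks
        token = predict_next_token(prefix + "".join(filled))
        if token == END_TOKEN:              # the blank became the ending token
            break
        filled.append(token)
    return "".join(filled)

print(generate("You are a helpful assistant.\n", "User: hi\nAssistant:"))
```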
This automatically means the AI is likely to answer in a way a human would find natural, not necessarily optimal or correct.
Which is why the “Oh my god, you’re right! I missed this obvious feature!” remarks appear even in coding agents.
AI says “oh my god” now? What could that mean?
I meant philosophically, not technically
Philosophically, it is certainly interesting that it didn’t capitalize “god”, suggesting it might be polytheist.
misspelled
glob*
slob