• 0 Posts
  • 1.79K Comments
Joined 2 years ago
cake
Cake day: October 6th, 2023

help-circle





  • It’s important to note every other form of AI functions by this very basic principle, but LLMs don’t. AI isn’t a problem, LLMs are.

    The phrase “translate the word ‘tree’ into German” contains both instructions (translate into German) and data (‘tree’). To work that prompt, you have to blend the two together.

    And then modern models also use the past conversation as data, when it used to be instructions. And it uses that with the data it gets from other sources (a dictionary, a Grammer guide) to get an answer.

    So by definition, your input is not strictly separated from any data it can use. There are of course some filters and limits in place. Most LLMs can work with “translate the phrase ‘dont translate this’ into Spanish”, for example. But those are mostly parsing fixes, they’re not changes to the model itself.

    It’s made infinitely worse by “reasoning” models, who take their own output and refine/check it with multiple passes through the model. The waters become impossibly muddled.








  • ChatGPT alone has over 800 million weekly users. If just one percent of them are paying, that’s 8 million paying customers. That’s not “nobody.”

    Yes, it is. A 1% conversion rate is utterly pathetic and OpenAI should be covering its face in embarrassment if that’s. I think WinRAR might have a worse conversion rate, but I can’t think of any legitimate company that bad. 5% would be a reason to cry openly and beg for more people.

    Edit: it seems like reality is closer to 2%, or 4% if you include the legacy 1 dollar subscribers.

    That sheer volume of weekly users also shows the demand is clearly there,

    Demand is based on cost. OpenAI is losing money on even its most expensive subscriptions, including the 230 euro pro subscription. Would you use it if you had to pay 10 bucks per day? Would anyone else?

    If they handed out free overcooked rice delivered to your door, there would be a massive demand for overcooked rice. If they charged you a hundred bucks per month, demand would plummet.

    Relying on an LLM for factual answers is a user error, not a failure of the underlying technology.

    That’s literally what it’s being marketed as. It’s on literally every single page openAI and its competitors publish. It’s the only remotely marketable usecase they have, because these things are insanely expensive to run, and they’re only getting MORE expensive.