• brucethemoose@lemmy.world

        I think it’s highly contextual.

        • Like, let’s take Lemmy posts. LLMs are useless here because the whole point is to affect the people you chat with, right? An LLM has no memory of the exchange, so there is a philosophical difference even if the comments/posts are identical.

        • …Now let’s take game dev. If a system realizes the creator’s intent, does it matter what the system is? Isn’t it better if the system is more frugal, so they can spend precious resources on other components and not go into debt?

        • TV? It could lead to horrendous corporate slop, a “race to the bottom.” OR it could be a killer production tool that lets indie makers break the shackles of their corporate masters. Realistically, the former is more likely at the moment.

        • News? I mean… Accurate journalism needs a lot of human connection/trust, and LLM news is just asking to be abused. I think it’s academically interesting, but utterly catastrophic in the real world we live in, kinda like cryptocurrency.

        One can waffle like this about all sorts of content: novels, fan fiction, help videos, school material, counseling, reference material, research, and advertising, the big one.

        …But I think it’s really hard to generalize.

        ‘AI’ has to be looked at a la carte, and engineered for very specific applications. Sometimes its output is indistinguishable from a human’s, or might as well be. But trying to generalize it as a “magic lamp” like the tech bros do, or as the bane of existence like their polar opposites do, is what’s making it so gross and toxic now.


        And I am drawing a hard distinction between this and actual artificial intelligence. As a tinkerer who has done some work in the space too… Frankly, current AI architectures have precisely nothing to do with AGI. Training transformer models with glorified linear regression is just not the path; Sam Altman is full of shit, and the whole research space knows it.