• 0 Posts
  • 3.99K Comments
Joined 3 years ago
Cake day: August 27th, 2023


  • It’s a whole new kind of software.

    A pile of examples can become a working program. Neural networks are universal approximators, and anyone with a video card can now make them. The work they do feels like hard science fiction written by comedians.
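    The "universal approximator" bit isn't hand-waving. A minimal toy sketch (my own, not anything from a real product): one hidden layer of tanh units, fit to sin(x) with hand-rolled gradient descent. A pile of examples in, a working function out.

    ```python
    # Toy universal-approximation demo: a one-hidden-layer tanh network
    # fit to sin(x) by plain full-batch gradient descent.
    import numpy as np

    rng = np.random.default_rng(0)

    # Training examples: sin(x) sampled on [-pi, pi]
    x = np.linspace(-np.pi, np.pi, 200).reshape(-1, 1)
    y = np.sin(x)

    # One hidden layer of 32 tanh units
    w1 = rng.normal(0.0, 1.0, (1, 32))
    b1 = np.zeros((1, 32))
    w2 = rng.normal(0.0, 1.0, (32, 1))
    b2 = np.zeros((1, 1))

    lr = 0.1
    for step in range(5000):
        h = np.tanh(x @ w1 + b1)      # hidden activations
        pred = h @ w2 + b2            # network output
        err = pred - y
        loss = float(np.mean(err ** 2))
        if step == 0:
            loss0 = loss              # remember the starting error
        # Backpropagation by hand
        g_pred = 2.0 * err / len(x)
        g_w2 = h.T @ g_pred
        g_b2 = g_pred.sum(axis=0, keepdims=True)
        g_h = g_pred @ w2.T
        g_pre = g_h * (1.0 - h ** 2)  # tanh derivative
        g_w1 = x.T @ g_pre
        g_b1 = g_pre.sum(axis=0, keepdims=True)
        w2 -= lr * g_w2; b2 -= lr * g_b2
        w1 -= lr * g_w1; b1 -= lr * g_b1

    print(loss0, "->", loss)  # error shrinks as the net soaks up the examples
    ```

    Anyone with numpy and a laptop can run this; scale the same recipe up a few billion times and you get the weird stuff.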

    For some reason we’ve only seen two models taken seriously: spicy autocomplete and a denoiser. One is a chatbot that’s just smart enough to get in trouble. The other is CGI for dummies that could make movies as cheap as pen and paper.

    The problem in full is the world’s most obvious bubble forcing these technologies on people. On everybody. The folks who choose this for themselves aren’t the ones to worry about. Where it doesn’t work out, they’ll pretend it never happened. Where it works, neat. Again: the problem is the force and the scale.

    So yes, an artificial tornado beside your house is intolerable, but it’s obviously not a fundamental problem with the technology. Even an identical quantity of GPUs could simply be spread out, so that many buildings merely hum.

    And vegan local models will arise, made only from bespoke licensed data, trained by distributed amateurs. But the big boys shove fancier models into your hands so often that it’d be archaic before it begins… and most of the people loudly complaining would just keep complaining.

    The identitarian performance has to stop. Even folks mumbling ‘it’s awful, you should never’ usually end with ‘but anyway, here’s how I use it.’ The tech is fine. It doesn’t belong in your browser. It doesn’t belong on your keyboard. It doesn’t belong in your goddamn e-mail, before you’ve even read it. But curmudgeons and iconoclasts alike have found utility in this Yes Man improv partner who kinda knows C++. And animators will get real quiet when some product magically in-betweens their drawings.

    Sam Altman is a fraud. Facebook can burn. CUDA must become open-source after Nvidia craters. But five years from now, this wave of AI will still be so commonplace that it’s boring. We will take for granted that computers perform dubious witchcraft.



    They know people spit slop slop slop slop like a thirsty dog. Every public defense is protesting too much; every quiet effort is consciousness of guilt. The nature of bad faith is that there is no right answer.

    We each need private vigilance against participating in public harassment campaigns. Is there any reason these people’s behavior changed, or that they were keeping things quiet, besides the fear of dealing with you?


  • > entire product loudly denigrated because of new tool used

    Yeah can’t imagine why they’d remove the ‘come have an argument at me’ label.

    I want the bubble to burst so this moral panic will end. Programs can code, now. That’s not going away. Make your peace. We can either leverage this new ability to describe code into existence, and improve on all the ways it demonstrably works okay - or we can pretend that wasn’t the goal of compilers and high-level languages the whole time.

    Oh but this new thing is different; yeah it’s always different, that’s what new means. Neural networks sounded great for decades but had a hard time existing. We finally accepted the bitter lesson that power scales better than cleverness - and hey presto, ‘what’s the next symbol?’ is as smart as a junior developer.
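    ‘What’s the next symbol?’ at its absolute crudest is just counting. A toy bigram predictor (my own illustration, nothing like a real transformer, but the training objective is genuinely the same shape):

    ```python
    # Crudest possible next-symbol predictor: count which character
    # follows which, then predict the most frequent follower.
    from collections import Counter, defaultdict

    def train_bigram(text):
        """For each character, count the characters observed after it."""
        following = defaultdict(Counter)
        for a, b in zip(text, text[1:]):
            following[a][b] += 1
        return following

    def predict_next(model, ch):
        """Return the most frequent character seen after ch."""
        return model[ch].most_common(1)[0][0]

    model = train_bigram("the theory then thawed")
    print(predict_next(model, "t"))  # prints 'h' - it follows 't' every time here
    ```

    Swap the counter for a transformer and the toy corpus for the internet, and power beats cleverness; that’s the bitter lesson in one line of code.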

    If you think these fumbling efforts are the best this tool will ever be, we can still extract useful work from it. It’s already a punchline in videos that build some crazy thing the hard way, then have an LLM effortlessly switch languages for speed. Or fight integration hell on their behalf. We’re not doing anyone favors by pretending the problem is the tech. Or by harassing people who work for free on things you like.