Joshington?
Jhleswig-Holstein?
I think some people are so eager to offload all critical thinking to the machine because they’re barely capable of it themselves to begin with.


It’s glorified autocorrect (/predictive text).
People fight me on this every time I say it, but it's literally doing the same thing, just with a much longer lookbehind.
In fact, there’s probably a paper to be written about how LLMs are just lossily compressed Markov chains.
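To make the "predictive text" comparison concrete, here's a deliberately tiny word-level Markov chain sketch. This is my own toy illustration of next-token prediction, not how an LLM actually works; the function names and the greedy "first observed continuation" rule are just for demonstration.

```rust
use std::collections::HashMap;

// Map each word to the list of words observed immediately after it.
fn build_chain(corpus: &str) -> HashMap<&str, Vec<&str>> {
    let words: Vec<&str> = corpus.split_whitespace().collect();
    let mut chain: HashMap<&str, Vec<&str>> = HashMap::new();
    for pair in words.windows(2) {
        chain.entry(pair[0]).or_default().push(pair[1]);
    }
    chain
}

// Greedy generation: always follow the first continuation seen in training.
fn generate<'a>(
    chain: &HashMap<&'a str, Vec<&'a str>>,
    start: &'a str,
    len: usize,
) -> Vec<&'a str> {
    let mut out = vec![start];
    let mut cur = start;
    for _ in 1..len {
        match chain.get(cur).and_then(|v| v.first()) {
            Some(&next) => {
                out.push(next);
                cur = next;
            }
            None => break,
        }
    }
    out
}

fn main() {
    let chain = build_chain("the cat sat on the mat");
    println!("{}", generate(&chain, "the", 4).join(" ")); // prints "the cat sat on"
}
```

An LLM's context is vastly longer and its "table" is compressed into learned weights, but the interface is the same: given what came before, emit a likely next token.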
That’s funny, I didn’t notice that lol
Animals never had a war
Guess they call what happened in Tanzania in 1974 a “chimp police action” instead.


I’ve long maintained that actually writing code is only a small part of the job. Understanding the code that exists and knowing what code to write is 90% of it.
I don’t personally feel that gen AI has a place in my work, because I think about the code as I’m writing it. By the time I have a complete enough understanding of what I want the code to do in order to write it into a prompt, the work is already mostly done, and banging out the code that remains and seeing it come to life is just pure catharsis.
The idea of having to hand-hold an LLM through figuring out the solution itself just doesn’t sound fun to me. If I had to do that, I’d rather be teaching an actual human to do it.
“AI bubble causing economic collapse” here.
But at a certain point, it seems like you spend more time babysitting and spoon-feeding the LLM than you do writing productive code.
There’s a lot of busywork that I could see it being good for, like if you’re asked to generate 100 test cases for an API with a bunch of tiny variations, but that kind of work is inherently low value. And in most cases you’re probably better off using a tool designed for the job, like a fuzzer.
I’ve maintained for a while that LLMs don’t make you a more productive programmer, they just let you write bad code faster.
90% of the job isn’t writing code anyway. Once I know what code I wanna write, banging it out is just pure catharsis.
Glad to see there’s other programmers out there who actually take pride in their work.
That’s probably based on the first definition because you can play either an ascending or descending scale.
Also the music staff kinda looks like a ladder.
Didn’t know DoorDashers could contact you through WhatsApp
I was thinking about this the other day and realized something:
Back when the modern Santa character was first being developed, coal was a genuinely useful thing. It was fuel for the stove which heated your house and cooked your food. It was a basic necessity of life.
If you were naughty, Santa didn’t just give you nothing. You weren’t going to get an awesome toy, but he made sure you weren’t going to freeze to death on Christmas, either.
Santa believes everyone deserves to live. That having a warm place to sleep is a basic human right.
This might be /r/im14andthisisdeep material, but I just thought that was interesting.


Why would you bring up C# in a thread about kernel programming?


You go ahead and write an OS kernel in C# then.


Because Rust lets you choose when something is unsafe, rather than everything being unsafe code all the time.
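A minimal sketch of what "opt-in unsafe" means in practice (my own example, not from the kernel codebase): the compiler enforces memory safety everywhere except inside blocks you explicitly mark, and convention is to justify each one with a SAFETY comment.

```rust
// Sum a slice, skipping bounds checks inside a scoped unsafe block.
fn sum(slice: &[u32]) -> u32 {
    let mut total = 0;
    for i in 0..slice.len() {
        // SAFETY: `i` comes from 0..slice.len(), so it is always in
        // bounds and get_unchecked cannot read past the slice.
        total += unsafe { *slice.get_unchecked(i) };
    }
    total
}

fn main() {
    println!("{}", sum(&[1, 2, 3, 4])); // prints "10"
}
```

Outside that one marked block, the borrow checker and bounds checks still apply, so reviewers know exactly which lines need extra scrutiny; in C, effectively the whole file is that block.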
Note the other 159 kernel CVEs issued today for fixes in the C portion of the codebase
Yeah but on the second incarnation, wouldn’t that put you right back where you started?
What happens to the guy that was driving it? Does he just blink out of existence when the car shuts off? That’s my question. You might argue that there is no such thing, but my own conscious experience proves to myself that there’s something else there. I want to know what happens to that part.
Hell, for all I know, you might just be a soulless meatbag automaton, and there really is no one in the driver’s seat for you. Or I could just be the only actual human talking in a thread full of bots. With 90% of the training data going into LLMs being vapid contrarian debates on social media, I could easily see that being the case here.
The Doctor is hilarious and I want B’Elanna to step on me