A customer paying a recurring subscription just to do their job.
Local models will win. They’re half-assed, but the big boys only provide fractionally more ass. LLMs will become just another tool you can call on when you’d rather read code than write it.

A Raspberry Pi can run small local models. You don't need 64 gigs of VRAM and an RTX 5090.
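The back-of-envelope arithmetic supports this: weight memory is roughly parameters times bits per weight divided by eight. This sketch is illustrative only (the function name is mine, and it ignores KV-cache and runtime overhead), but it shows why a 4-bit quantized model in the 1B-7B range fits in a Pi's 8 GB of RAM:

```python
def weight_footprint_gb(params: float, bits_per_weight: int) -> float:
    """Rough weight-only memory footprint of a quantized model, in GB.

    Ignores KV-cache, activations, and runtime overhead, so treat the
    result as a floor, not a budget.
    """
    return params * bits_per_weight / 8 / 1e9

# A 1B-parameter model quantized to 4 bits per weight:
print(weight_footprint_gb(1e9, 4))  # 0.5 GB -- trivial for an 8 GB Pi
# A 7B model at 4 bits:
print(weight_footprint_gb(7e9, 4))  # 3.5 GB -- still fits, tightly
```

The point stands even if you double those numbers for overhead: small quantized models live comfortably inside commodity hardware.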