

If heroin were fully legalized, zero restrictions, we’d be much better off than the current situation with the war on drugs, fentanyl analogs, and xylazine. Full stop.
If we hadn’t invaded Afghanistan and started importing heroin in bulk through Ahmed Wali Karzai’s mafia connections, we wouldn’t have had tons of cheap heroin to hook people on in the first place. Also, we did have functionally fully legalized, zero-restriction opioids back under Bush Jr.
That’s what OxyContin was.
If you want to describe the US as a criminal narco-state, you can start with the Florida pill mills that flooded the country with hundreds of billions of dollars’ worth of highly addictive prescription drugs and made the Sackler family some of the wealthiest people on the planet.
Based on this, I’m not gonna read the rest of the article.







I mean, ymmv. The historical flood of cheap memory changed developer practices. We used to code around keeping the bulk of our data on the hard drive and only using RAM for active calculations. We even used to lean on “virtual memory” on disk, caching calculations and scrubbing them over and over, to simulate more memory than we had on the stick. SSDs changed that math considerably: we got a bunch of very high-efficiency disk space at a significant markup, built on much the same underlying technology as our RAM. So there was a point at which one might have nearly as much RAM as storage (I had a friend with 1 GB of RAM in a device that had only a 2 GB hard drive). The incentives were totally flipped.
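(For flavor, here’s a minimal sketch of that old disk-first style in Python; the file name and record size are made up. Memory-mapping lets the OS page in only the bytes you actually touch, which is the same trick as leaning on virtual memory to fake more RAM than the machine had:

    import mmap

    RECORD_SIZE = 4096  # hypothetical fixed-size records

    # Instead of reading the whole dataset into RAM, map the file and
    # let the OS fault pages in on demand -- the bulk of the data
    # stays on the hard drive.
    with open("records.bin", "rb") as f:  # hypothetical file
        with mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) as mm:
            # Slicing out one record only pages in the bytes backing it.
            record = mm[1000 * RECORD_SIZE : 1001 * RECORD_SIZE]
            print(len(record))

Once RAM got cheap, nobody bothered with this kind of care anymore.)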
I would argue that the low-cost, high-efficiency RAM induced the system bloat, as applications could run very quickly even on a fraction of available system memory. Meanwhile, applications that were RAM hogs appeared to run very quickly compared to applications that needed to constantly read off the disk.
Internet applications added to the incentive to bloat RAM usage, since you could cram an entire application into a website and just let it live in memory until the user closed the browser. Cloud storage played the same trick. Developers were increasingly inclined to ignore the disk entirely. Why bother? Everything was hosted on a remote server, lots of the data was pre-processed on the business side, and you were just serving the results to an HTML/JavaScript GUI in the browser.
Now it seems like tech companies are trying to turn the entire computer interface into a dumb terminal to the remote data center. Our migration to phones and tablets and away from laptops and desktops illustrates as much. I wouldn’t be surprised if someone finally makes consumer-facing dumb terminals a thing again, something we haven’t really experienced since the dawn of personal computers in the 1980s.
But TL;DR: I’d be more inclined to blame “bloat” on web browsers and low-cost memory post-’00s than on AI-written code.