Windows 11 often requires new hardware, but for a while that new hardware is going to be either extremely pricey or ship with very little RAM.
I don't believe a single competent person works at Micro$oft anymore, but maybe, just maybe, this could lead them to make a less shitty OS?
And garbage software like Adobe Creative Cloud too?
They obviously don't care about users, but the pain could become too big.
It still costs more to rewrite all your existing code sooo no.
Naaaah, you are just going to have to run it in the cloud optimised by AI for the low low price of both your kidneys so Bezos, Mark and Elon can continue partying.
Linux FTW
Wouldn’t that be nice! Yeah I think it’ll totally work.
Hey, I think I see someone right now, they’re switching from writing in Python to writing in assembly! “Hey buddy, don’t forget to clear that register! And don’t forget you’ll need to write this all over from scratch to get it to work on any other platform!”
One of those little truisms folks forget is that optimising software takes a LOT longer than making something that just works.
I’m currently running Fedora Linux with Firefox and YouTube opened up. The whole system uses ~4GB of memory. That’s totally fine and I couldn’t care less about what Microsoft is doing with their OS.
With that said, I don't think we'll see a lot of optimization in commercial software. Maybe a few improvements here and there, but a lot of developers nowadays don't even know how to optimize their code, especially people working in web development and adjacent stacks. Let's just throw hundreds of npm packages into one project and bundle them up with webpack, here's your 12MB of JavaScript - take it or leave it (there's a sketch below of how you'd even measure that). Projects like this aren't the exception, they are the norm.
Even if the devices that can run that code without running out of memory get more expensive, companies will just pay for those and write them off on the taxes. And if not, more apps will just get pushed into the cloud.
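For what it's worth, the bloat is at least easy to make visible. Here's a minimal sketch, assuming a standard webpack 5 setup plus the webpack-bundle-analyzer package (the entry path and options are just illustrative):

```ts
// webpack.config.ts - a rough sketch for seeing what's actually in the bundle.
// Assumes webpack 5 and the webpack-bundle-analyzer npm package; the entry
// point used here is made up for illustration.
import { BundleAnalyzerPlugin } from 'webpack-bundle-analyzer';
import type { Configuration } from 'webpack';

const config: Configuration = {
  mode: 'production',
  entry: './src/index.ts', // hypothetical entry point
  plugins: [
    // Writes an interactive treemap report showing which npm packages
    // make up the final bundle, so you can see where the 12MB comes from.
    new BundleAnalyzerPlugin({ analyzerMode: 'static', openAnalyzer: false }),
  ],
};

export default config;
```

It won't make anyone optimize, but a treemap of exactly which dependencies add up to 12MB is usually a sobering place to start.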
🤣 Nah, they’ll enforce mandatory cloud computing.
You’ll just have a “terminal”
It’s crazy that people don’t see this is where computers are heading.
The day tech bros realized they could squeeze recurring monthly subscriptions out of you for basically increasingly banal shit, the writing was on the wall. The end game is that you have a chromebook and 800 subscriptions: streaming services for your OS, music, movies, TV, games, image editing software, music DAWs, plugins for both of those, subscriptions for the hardware associated with the software (e.g. drawing tablets or MIDI keyboards), and so on, covering every niche you can possibly think of, not just graphic art and music.
And when you bitch about it, tech bros and weird alphas and young zoomers who were raised on this ecosystem and indoctrinated by it will go "well, you see, it's fair because updates cost money to develop," as if the old system of expecting bug fixes and security patches to be free, but not necessarily feature updates, was unfair. Like if I buy a car and it's fucked up, I expect it to be fixed for free, but I don't expect them to feature-match the next model year.
Tech workers are disproportionately highly paid and so whiney when they have to provide even a modicum of support, because then they might have to cut into that disproportionately high pay. Like "oh no, I make $80-150,000+ a year, but if I support this I'll have to work more without generating sales and will maybe only make $60-130,000+. The horror!" Fuck those libertarian shitstains that are literally overthrowing an entire government (and possibly more) with technofascism so that they can justify their "I know Python, I should be able to earn as much as I want, fuck ethics, I never emotionally matured past 16" bullshit.
Username checks out and I love it
Just like the early days of programming when you really had to manage your memory, often down to the last bit. Those were the days when programming was more difficult. Now it’s mostly just point and click for middle-schoolers.
I’m presuming you know nothing about programming because this is complete and utter nonsense.
Well, the point and click part was a bit extreme. Still true in some rare cases, but actual programming still requires a keyboard.
However, the RAM thing is interesting. I haven't actually written any code in the '70s and '80s, but from what I've heard from people who did, RAM was a huge bottleneck. Well, pretty much everything was. Even the bandwidth between your terminal and the mainframe was a bottleneck that made you suffer.
Back in those days, programmers were painfully aware of the hardware limitations. If you wanted your code to run within a reasonable amount of time, you absolutely had to focus on optimizing it.
Not when AI is writing the code.
Maybe it’ll write native apps instead of garbage web/electron/chrome apps
Narrator:
‘It didn’t’
There’s plenty of “unbloated” software available. It’s just not on Windows.
Which unbloated browser do you use?
(This isn’t a dig or a gotcha, I’m serious, I’m looking to switch browsers)
Shouldn't Firefox or a fork of Firefox like Mike Waterfox or ZenBrowser be fine?
Michael Waterfox is pretty chill yeah
No, everything will just become subscription based.
And powered by the cloud
Why is this painful truth not the top comment? Maybe people are still hopeful after all this time?
Linux Mint Cinnamon is pretty easy to move to…
As someone who recently made the switch with zero Linux experience, I completely agree.
Not likely. I expect the AI bubble will burst before those software optimization gears even start to turn.
Even returning to JVM languages would be huge over the current JS-based electron slop. Things are so bad that "optimized software" doesn't need to mean C++ or Rust.
With Rust getting popular, the architecture is there to make huge savings without having to be a rocket scientist.
The rocket scientists are also getting involved, and are regularly outperforming even optimised C code.
Not just that, all of their AI slop code will be even more unoptimized.
Yeah, the systems in place right now took 40 years to build
Yes, but with AI, you can build it in 4 hours, and with all those extra RAMs, it could drop to 2
Big AI is a bubble but AI in general is not.
If anything, the DRAM shortages will apply pressure on researchers to come up with more efficient AI models rather than more efficient (normal) software overall.
I suspect that as more software gets AI-assisted development we'll actually see less efficient software at first, but eventually more efficient software as adoption of AI coding assistance becomes more mature (and probably more formalized/automated).
I say this because of experience: If you ask an LLM to write something for you it often does a terrible job with efficiency. However, if you ask it to analyze an existing code base to make it more efficient, it often does a great job. The dichotomy is due to the nature of AI prompting: It works best if you only give it one thing to do at a time.
In theory, if AI code assist becomes more mature and formalized, the “optimize this” step will likely be built-in, rather than something the developer has to ask for after the fact.
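In case it's not clear what that would look like, here's a minimal sketch of the two-pass idea; the `ask` callback is a made-up stand-in for whatever LLM client you actually use, not any particular vendor's SDK:

```ts
// Hypothetical two-pass AI code-assist flow: the first pass only generates
// working code, the second pass only optimizes it. `ask` is a placeholder
// for a real LLM client; nothing here is vendor-specific.
type Ask = (prompt: string) => Promise<string>;

export async function generateThenOptimize(spec: string, ask: Ask): Promise<string> {
  // Pass 1: one job only, make it work.
  const draft = await ask(
    `Write code that does the following. Focus on correctness, ignore performance:\n${spec}`,
  );

  // Pass 2: one job only, make the existing code efficient.
  const optimized = await ask(
    `Here is working code. Reduce its memory use and CPU time without changing behavior:\n${draft}`,
  );

  return optimized;
}
```

The point isn't the code itself, it's that the "optimize" step is a separate, explicit pass rather than something you hope the model does while it's busy getting the feature to work.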
It’s not just garbage software. So many programs are just electron apps which is about the most inefficient way of making them. If we could start actually making programs again instead of just shipping a webpage and a browser bundled together you’d see resource usage plummet.
In the gaming space, even before the RAM shortage, I've seen more developers begin doing optimization work again thanks to the prevalence of the Steam Deck and such, so the precedent is there and I'm hopeful other developers start considering lower-end hardware too.
Probably a super unpopular take, but the Switch and Switch 2 have done more for game optimization than the Steam Deck ever could, by sheer volume of consoles sold. I agree the Steam Deck pushed things further, but the catalyst is the Switch/Switch 2.
I take it the Switch/S2 has many non-Nintendo games shared with other consoles? Hard to search through 4,000 titles on Wikipedia to find them at random, but I did see they had one Assassin’s Creed (Odyssey) at the game’s launch. I never really had Nintendo systems and just associate them with exclusive Nintendo games.
I’m choosing to believe the Steam Machine will do more of the same for PC games. Maybe it won’t force optimization at launch, but I hope it maintains itself as a benchmark for builds and provides demand for optimization to a certain spec.
I try to follow the gaming space and I didn't really see anyone talk about optimization until the Steam Deck grew. I do wish more companies were open about their development process so we actually had some data. The Switch/Switch 2 very well could have pushed it, but I think with those consoles people just accept that they might not get all the full modern AAA games; they're getting Pokemon and Mario and such. Whereas with the Steam Deck, people want everything in their Steam library. I dunno.
I have no real data, just what I’ve seen people discussing.
I only own one Nintendo game on my Switch. I’m not going to sit here and pretend most of my games run great on it though. Slay the Spire and Stardew run well. But I’ve had quite a few crashes with Civilization and some hangs with Hades or Hollow Knight too
So the developers of PC games like Clair Obscur: Expedition 33, which doesn't have a Switch version of any kind, spent time, effort and money to optimize specifically for the Steam Deck… because of the Switch's market share? C'mon now bud, that's a straight-up ridiculous take.
Web apps are a godsend and probably the most important innovation to help move people off of Windows.
I would prefer improvements to web apps and electron/webview2 if I had to pick.
If those web apps were using the same shared electron backend then they could be "a godsend". But each of those web apps uses its own electron backend.
The beauty of it is that electron/webview2 will probably get improved and you don't need to fix the apps.
I don’t disagree with that. But the problem is having one electron backend for each web app and not one backend for all web apps.
Absolutely
Idk, I don't think the issue is electron apps using 100MB instead of 10MB. The kind of apps that you write as HTML/JS are almost always inherently low demand, so even 10x-ing their resources doesn't really cause a problem, since you're not typically doing other things at the same time.
The issue is the kind of apps that require huge system resources inherently (like graphically intensive games or research tools), or services that run in the background (because you’ll have a lot of them running at the same time).
You’re off by a large margin. I’ll use two well documented examples.
WhatsApp native used about 300MB with large chats. CPU usage stayed relatively low and constant. Yes, it wasn't great, but that's a separate issue. The new WebView2 version hits over a gig and spikes the CPU more than some of my games.
Discord starts at 1GB of memory usage and exceeds 4GB during normal use. That's straight from the developers. It's so bad they have started rolling out an experimental update that makes the app restart itself when it hits 4GB.
These are just two electron apps meant mostly for chatting. That's up to 5GB with just those two apps. Electron spins up a full Node.js runtime, and both Electron and WebView2 run multiple JavaScript heaps plus whatever GPU threads they need, and are exceedingly bad at releasing resources. That's exactly why they are the problem. Yes, the actual JavaScript bundles Discord and WhatsApp use are probably relatively small, but you get full Chromium browsers, and all of their memory usage issues, stacked on top.
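You can even watch that per-process overhead from inside an Electron app. A rough sketch using Electron's app.getAppMetrics() (the interval and log formatting here are just illustrative):

```ts
// Rough sketch: log how much memory every Chromium/Node process belonging
// to an Electron app is holding. app.getAppMetrics() is a real Electron API;
// the 30-second interval and the formatting are just for illustration.
import { app } from 'electron';

function logProcessMemory(): void {
  let totalMiB = 0;
  for (const metric of app.getAppMetrics()) {
    // workingSetSize is reported in kilobytes.
    const mib = metric.memory.workingSetSize / 1024;
    totalMiB += mib;
    console.log(`${metric.type} (pid ${metric.pid}): ${mib.toFixed(1)} MiB`);
  }
  console.log(`total: ${totalMiB.toFixed(1)} MiB across all processes`);
}

app.whenReady().then(() => {
  // Main + renderer(s) + GPU + utility processes all show up separately,
  // which is where "one chat app, a gigabyte of RAM" comes from.
  setInterval(logProcessMemory, 30_000);
});
```

Run that in any non-trivial Electron app and the per-process breakdown makes the numbers above look a lot less surprising.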
Right
But those are only problems because they use the resources in the background. When the foreground app uses a lot of resources it's not a problem, because you only have one foreground app at a time (I know, not really, but kinda). Most apps don't need to run in the background.
Yes, that's the problem? I'm confused what you're not getting here. Those programs are made to constantly run. Many people need both for various reasons. Add a main program like Photoshop and then you don't have enough RAM. People don't load Discord, check a message, close it, load WhatsApp, check it, close it, then load Photoshop.
The RAM usage doesn’t suddenly stop because you alt+tab to a different program.
There are, of course, bad offenders.
I’m just skeptical that “webapps that need a ton of resources and people leave open” is the norm. But I haven’t done any research on it so maybe it is.
It’s a really nice idea, but bad developers are already so deep in the sunk cost fallacy that they’ll likely just double down.
Nobody reassesses their dogma just because the justification for it is no longer valid. That’s not how people work.