

I remember stories of Google intentionally making it hard for people to fork the project and maintain updates on it.
Never heard anything like that about Firefox, but doesn’t mean they don’t.


In Canada (or BC anyway) they need to pay time and a half, and there’s technically no such thing as an overtime-exempt salary. If you work over 40 hours, even on salary, it’s overtime at 1.5x.
However, there is a clause in BC at least that says high tech workers are exempt from the 1.5x overtime rate.
It’s utter bullshit. Discrimination written right into the law.


I mean ya, that works if they know who you are.


Sorry, my bad. The messages will all be converted into AI slop when you submit them, so you aren’t actually communicating with anyone using your own written words. That gives you plausible deniability that you didn’t actually write any messages using encryption, which is now an offense punishable by death.


It’s because it can’t be taken down. What happens when they arrest everyone working at Signal? When they arrest you for hosting an E2E encrypted message relay?
That’s where language like this is headed.
Ethereum also recently introduced something called blobs, which are temporary data that last around 18 days. So it isn’t necessarily stored forever if you want a less permanent message. There are archives that will keep all of that as well, but it’s not maintained on the regular chain past ~18 days.
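If anyone wants to check my math on the ~18 days, it falls out of the consensus-layer blob retention window; here’s a rough back-of-the-envelope (the constant names are what I believe the spec uses, so treat them as assumptions):

```python
# Where the "~18 days" blob retention figure comes from (values assumed from the consensus spec).
SECONDS_PER_SLOT = 12
SLOTS_PER_EPOCH = 32
MIN_EPOCHS_FOR_BLOB_SIDECARS_REQUESTS = 4096  # minimum blob retention window, in epochs

retention_seconds = MIN_EPOCHS_FOR_BLOB_SIDECARS_REQUESTS * SLOTS_PER_EPOCH * SECONDS_PER_SLOT
print(retention_seconds / 86400)  # ~18.2 days
```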


I know this is gonna get downvoted but…
This is all going to lead to needing some open, blockchain-based communication system.
E2E encrypted, routed through a public blockchain they can’t block. Use one of Ethereum’s L2s to keep short messages under a penny or two, and route all payments through Tornado Cash so they can’t be tracked.
All open source, nothing to take down, uncensorable.
It won’t be for everyone at first, it would be too technical in a lot of cases, but it’s going to be the last option at the end of this road.
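For anyone curious what the E2E piece could look like, here’s a minimal sketch using libsodium sealed boxes via PyNaCl; it’s just an illustration, and the on-chain transport (L2 calldata, blobs, a contract event, whatever) is left out:

```python
# Minimal E2E sketch with PyNaCl sealed boxes; the chain only ever sees opaque ciphertext.
from nacl.public import PrivateKey, SealedBox

recipient_key = PrivateKey.generate()              # recipient keeps this private key offline
sender_box = SealedBox(recipient_key.public_key)   # sender only needs the recipient's public key
ciphertext = sender_box.encrypt(b"meet at the usual place")  # this is what you'd post on-chain

# Only the recipient can decrypt; anyone scraping the chain just sees random-looking bytes.
plaintext = SealedBox(recipient_key).decrypt(ciphertext)
assert plaintext == b"meet at the usual place"
```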


Wait for a PS6 that Sony will sell at a loss (they earn it back on games) and salvage that sweet, sweet RAM and GPU for a homemade PC lol.


At what point does removing garbage become so complicated that they just stop merging anything in at all?


Decades from now when we have Speed Racer type tracks they’ll regret that.


They also show ads unless you pay even more on top of the Prime membership.
I haven’t opened Prime since that day.


So, I just watched the F1 movie with Brad Pitt, and they mention F2 and F3.
I only just realized that F-Zero is likely meant to be the next step after F1, as in the new elite race rank.
That game was so fucking good.


I’ve used it to help me make some website stuff as a mobile dev. It’s definitely not the best webpage and I’ll probably want to redo it at some point, but for now it works.
That cost a freelancer a small job, or it would have taken me exceptionally longer than it did. But I really don’t have a big interest in web dev, so I probably would have ended up hiring someone.
The amount of stuff it gets wrong is still enormous. I can’t imagine what someone with zero programming skills would end up with if they only used it. It’d be so god awful.


Clearly their approach of rushing out AI-generated code is working well.


That was cool to watch


I had $10 million and now I’m broke, how did that ever happen?!?!?!?
Seems to happen so often for the rich celebrity type.


I’ve been enjoying my carbon steel more than cast iron. It’s the same as cast iron for seasoning and non-stick, but much lighter.


The power usage isn’t even that much on an individual basis once it’s trained; it’s that they have to build these massive data centers to train and to serve millions of users.
I’m not sure it’s much worse than if Nvidia had millions of customers on their game streaming service running 4080s or 4090s for hours on end, versus less than an hour of AI compute a day.
It’d be better if we could all just run these things locally and distribute the power and cooling needs, but the hardware is still too expensive.
Apple, with their shared GPU memory, is starting to give people enough graphics memory to load larger, more useful models for inference. In a few more generations, with better memory bandwidth and other improvements, the need for these data centers for consumer inference can hopefully go down. Those machines are low power as well.
They don’t use CUDA though, so they aren’t great at training, inference only.
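To put rough numbers on that comparison (every wattage and hour here is an assumption I’m making up just to show the shape of it):

```python
# Back-of-the-envelope daily energy comparison; all values are assumptions, not measurements.
streaming_watts = 400   # assumed draw of a 4080/4090-class card under game-streaming load
streaming_hours = 2.0   # assumed daily gaming session

inference_watts = 350   # assumed per-user slice of a datacenter GPU while generating
inference_hours = 0.5   # "less than an hour of AI compute a day"

print("streaming:", streaming_watts * streaming_hours / 1000, "kWh/day")  # ~0.8
print("inference:", inference_watts * inference_hours / 1000, "kWh/day")  # ~0.18
```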


With such scarce food up there, isn’t it always a good time to eat if it’s just a single human?


Military is a good example.
First people who were gay were removed.
Then don’t ask don’t tell.
Then it was okay.
Now it’s not and they’re being removed and many outed themselves once it was okay.
One day, you’re not a terrorist. Then on Sept 22 2025 you are because you don’t support fascism.