

It also discouraged you from finding/starting an open source solution for those problems, thus undermining the high-quality open knowledge ecosystem that it relied on in the first place.


As Cory Doctorow says: code is a liability, not an asset.
One project that can help with this is the OUI-SPY, a small piece of open source hardware. The OUI-SPY runs on a cheap Arduino-compatible chip called the ESP32. There are multiple programs available to load onto the chip, such as “Flock You,” which lets people detect Flock cameras, and “Sky-Spy,” which detects overhead drones. There’s also “BLE Detect,” which detects various Bluetooth signals, including ones from Axon, Meta’s Ray-Bans that secretly record you, and more. It also has a mode commonly known as “fox hunting” for tracking down a specific device.
https://www.eff.org/deeplinks/2026/01/how-hackers-are-fighting-back-against-ice


I was once a fool like you :)
Mike McShaffry’s book “Game Coding Complete” is a good guide to the practical side of using a game engine IRL to get things done.
It’ll give you a good idea of how things should be shaped in order to be useful, and which things you can “skip ahead” to. Off-the-shelf engines have to be extremely general to be flexible enough for many customers, so game devs have to put in the effort to make them more specific. You’ll have to start out specific if you want any chance of actually finishing something.
Eberly’s book “3D Game Engine Architecture” deals with the nuts and bolts, the rigorous academic engineering stuff. It’s pretty solid, but it’s aimed at making a general-purpose engine, which is beyond the scope of a one-person project.
Backing up though… You don’t have any language or library opinions? You might need 5-10 years of experience doing general programming (or game dev) before you can sustainably tackle this, or else you’re likely to paint yourself into a corner.
Edit: Probably the biggest PITA with game engine dev is testing. If you’re not already an expert in setting up test harnesses at multiple levels of detail, you’re gonna find it impossible to keep moving after a few months.
Good luck!
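To make the testing point concrete, here’s a minimal sketch (my own illustration, not from either book) of the kind of harness engine work needs: drive the simulation headless with a fixed timestep, no renderer attached, and assert on game state. The `Player` physics here is a made-up toy example.

```python
# Toy entity with Euler-integrated gravity -- stands in for real game logic.
class Player:
    def __init__(self):
        self.y = 0.0   # vertical position
        self.vy = 0.0  # vertical velocity

    def update(self, dt, gravity=-9.8):
        self.vy += gravity * dt
        self.y += self.vy * dt


def simulate(entity, seconds, dt=1.0 / 60.0):
    """Run the simulation for `seconds` of game time at a fixed timestep."""
    for _ in range(int(seconds / dt)):
        entity.update(dt)
    return entity


# Deterministic, renderer-free, runs in milliseconds -- the kind of test
# you can keep green for months as the engine grows underneath it.
p = simulate(Player(), seconds=1.0)
assert p.y < 0.0                # gravity pulled the player down
assert abs(p.vy + 9.8) < 0.2    # velocity after ~1s of free fall
```

The same pattern scales up: swap the toy entity for a real scene, keep the fixed timestep, and your tests stay deterministic.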


Yeah… Linear increases in performance appear to require exponentially more data, hardware, and energy.
Meanwhile, the big companies are passing around the same $100bn IOU, amortizing GPUs on 6-year schedules but burning them out in months, using those same GPUs as collateral on massive loans, and spending based on an ever-accelerating number of data centers which are not guaranteed to get built or receive sufficient power.


I like the way Ted Chiang puts it:
Some might say that the output of large language models doesn’t look all that different from a human writer’s first draft, but, again, I think this is a superficial resemblance. Your first draft isn’t an unoriginal idea expressed clearly; it’s an original idea expressed poorly, and it is accompanied by your amorphous dissatisfaction, your awareness of the distance between what it says and what you want it to say. That’s what directs you during rewriting, and that’s one of the things lacking when you start with text generated by an A.I.
There’s nothing magical or mystical about writing, but it involves more than placing an existing document on an unreliable photocopier and pressing the Print button.
I think our materialist culture forgets that minds exist. The output from writing something is not just “the thing you wrote”, but also your thoughts about the thing you wrote.


Money-making is an orthogonal issue. LLMs subvert engagement with open source projects, which is important for their health whether or not there’s anyone trying to monetize that engagement.


“If you put money in a vending machine and got two items instead of one, would you put additional money in for the second item?”
That is wild.
The vending company factors this into the prices they charge for the items, the amount they spend on the machine to ensure accuracy, and the amount they pay the people who stock the machines to do it properly.
If you take it upon yourself to unilaterally re-balance the equation, you’re not being noble, you’re just a fool.


Or the original upload: https://youtube.com/watch?v=39jsstmmUUs


Not to be confused with SOLID, SolidJS, or Solidity.
It’s a neat idea. But because apps still need to operate on data close to web servers and backend services, potentially for long timeframes, I think we’ll need a widely-adopted CRDT solution before something like Solid can really take off from a technical standpoint.
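For anyone unfamiliar with CRDTs, here’s a minimal sketch of the idea using a state-based grow-only counter (my own illustration, not anything from the Solid spec): replicas update independently and merge in any order, yet always converge.

```python
# G-Counter: the "hello world" of CRDTs. Each replica only increments its
# own slot; merge takes a pointwise max, so merges are commutative,
# associative, and idempotent -- replicas converge regardless of sync order.
class GCounter:
    def __init__(self, replica_id):
        self.replica_id = replica_id
        self.counts = {}  # replica_id -> that replica's increment total

    def increment(self, n=1):
        self.counts[self.replica_id] = self.counts.get(self.replica_id, 0) + n

    def value(self):
        return sum(self.counts.values())

    def merge(self, other):
        for rid, n in other.counts.items():
            self.counts[rid] = max(self.counts.get(rid, 0), n)


# Two replicas diverge, then sync in either order:
a, b = GCounter("a"), GCounter("b")
a.increment(3)
b.increment(2)
a.merge(b)
b.merge(a)
assert a.value() == b.value() == 5
```

Real systems (Automerge, Yjs) extend this machinery to maps, lists, and rich text, which is roughly what a personal-data-store model like Solid would need.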
And from a business standpoint, there’s really no upside. Sure, you offload some storage costs, but compute tends to be the more expensive part, and if you’re spending more time talking to these external data stores, it may be more expensive in the end.


“Issues are for work items only. Communications go in discussions.”
posts this communication as an issue
Seriously though, this approach makes sense to me and I wish more projects did it.


I’m gonna need to see that math


Conceptual analysis of proximity isn’t exactly what I expected to see when I joined Lemmy
But it’s… 😎 close


Gamehub Lite is pretty wild. It does take some fiddling, but it’s amazing how well (and relatively easily) you can get x86 Windows games to run on a $200 ARM Android device.
I’m 12/13 so far on getting games to work at an acceptable level.
Inexplicably, Vampire Survivors causes the entire device to crash. I guess they pull some pretty silly memory tricks to keep that game responsive with potentially hundreds of thousands of projectiles, so maybe it’s not so surprising.


Security guard is one. Had a friend in college who basically got paid 8hrs/night to do 2hrs of actual work and 6hrs of building his portfolio. It can definitely work well for some folks.


Got Megabonk working on my Retroid, and can’t stop playing it. I thought I would try getting some other games going, but I just play Megabonk instead.


Opus Magnum
That game scratches my brain in such a satisfying way


Sure, but do you need a discrete video card if you’re gaming on an ARM SoC? And we’ve seen from the struggles of x86 iGPUs that graphics APIs pretty much have to choose whether they’re going to optimize for dedicated VRAM or shared memory, cuz it has inescapable implications for how you structure a game engine. ARM APIs will probably continue optimizing for shared memory, so PCI-E GPUs will always be second-class citizens.
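To illustrate why this split shapes engine structure, here’s a hedged sketch (my own, not any real graphics API) of how a resource-upload path forks on unified vs. dedicated memory: with shared memory the engine writes in place, while dedicated VRAM forces staging buffers, copy commands, and fences, which changes buffer lifetimes and frame pacing.

```python
# Illustrative only: the operation names below are placeholders, not calls
# from Vulkan/Metal/etc. The point is that the two paths have different
# shapes, so an engine can't paper over the difference with one code path.
def upload_mesh(vertices, unified_memory):
    ops = []
    if unified_memory:
        # UMA / iGPU path: map GPU-visible memory and write directly.
        ops += ["map", "write", "unmap"]
    else:
        # Discrete-GPU path: write to a CPU-side staging buffer, record a
        # GPU copy, then fence before the staging buffer can be reused.
        ops += ["write_staging", "record_copy", "wait_fence"]
    return ops


triangle = [(0, 0, 0), (1, 0, 0), (0, 1, 0)]
assert upload_mesh(triangle, unified_memory=True) == ["map", "write", "unmap"]
```

An engine optimized for one path tends to bake that choice into its resource lifetimes, which is why APIs and hardware that disagree about memory models make each other second-class.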
Trying desperately to keep the ponzi scheme going, but his biggest customers already have warehouses full of GPUs that will never get connected.
The bubble is full, dude. Just try to minimize the damage from the pop so we don’t have to figure out what size pitchfork fits your dumb leather jacket.