The CentOS “eight pointed star”?


Menu bar at the top at least makes some sense - it’s easier to mouse to it, since you can’t go too far. Having menus per-window like Linux, or like Windows used to before big ugly ribbons became the thing, is easier to overshoot. (Which is why I always open my menu bars by pressing ‘alt’ with my left thumb, and then using the keyboard shortcuts that are helpfully underlined. Windows likes to hide those from you now since they’re ‘ugly’, and also makes you mouse over the pretty icons to get the tooltip that tells you what they are, which is just a PITA. Pretty != usable.)
Mac OS has had the menu at the top since before it was a multitasking OS. They had them there on the first Mac I ever used, a Mac Classic II back in 1991 or so, and it was probably like that before then too. It’s not like they’ve been ‘innovating’ that particular feature and annoying their users.


I had 32GB of RAM in my desktop as 4x8GB; one of the sticks failed a couple of years ago, and it was cheaper to replace it with 64GB = 4x16GB than it was to get a replacement 8GB.
That’s convenient for work purposes (in fact, I could actually do with more) but massive pointless overkill for most games. Even games which do “big loads” - Witcher 3, say - aren’t noticeably quicker from RAM cache than they are off of an NVMe drive.


Data centre GPUs tend not to have video outputs, and have power (and active cooling!) requirements in the “several kW” range. You might be able to snag one for work, if you work at a university or somewhere that does a lot of 3D rendering - I’m thinking someone like Pixar. They are not the most convenient or useful things for a home build.
When the bubble bursts, they will mostly be used for creating a small mountain of e-waste, since the infrastructure to even switch them on costs more than the value they could ever bring.


There’s times when I want to find “exact matches and nothing but” - searching for error messages, for instance - and that’s made much harder than it should be by AI bullshit search engines that don’t want you to switch off their “helpful” features. Considering moving to Kagi instead.
Mine was my local Forgejo server, NAS server, DHCP -> DNS server for ad blocking on devices connected to the network, torrent server, syncthing server for mobile phone backup, and Arch Linux proxy, since I’ve a couple of machines that basically pull the same updates as each other.
I’ve retired it in favour of a mini PC, so it’s back to being a RetroPie server: loads of old games available in the spare room for when we have a party, which amuses children of all ages.
They’re quite capable machines. If they weren’t so I/O limited, they’d be amazing. They tend to max out at 10 megabytes/second on SD card or over USB / ethernet. If you don’t need a faster disk than that, they’re likely to be ideal in the role.
I’ve always seen it as a “take turns at being the guesser, and whoever does best wins” kind of game. If you take six goes and your opponent takes seven, then taste that sweet victory.
A digression, but the “viking chess” game Hnefatafl basically guarantees a win for white as written. So you need to mix it up - play two games, see who wins fastest; or constrain it like backgammon, roll dice and those are the moves you must make.


Zelda 3? You get fast travel quite early and the world is packed with stuff, it’s not absurdly huge. Doesn’t have that bloody owl in it either, telling you the obvious at great length.
Certainly not Wind Waker, anyway. Now there is a slow game.


No unexpected crashes, no game breaking bugs. Performance was… dubious. It looks amazing, but UE5 has scalability issues. None of the graphics options seemed to do anything for frame count.


The studio is mostly ex-Ubisoft employees. So yeah, it’s their first game as that studio, but they’re by no means novice developers. Fair play to them for following their passion though, it’s paid off.


Best story, for sure. Most emotionally affecting is Majora’s, for me, but TP is close.
Don’t think the gameplay holds up. The Wii version is pure waggle, but even on the GameCube, there’s a lot of filler - empty space and backtracking. Doesn’t respect your gaming time.


Especially since any version of Git from the last few years has a passionate hatred of symlinks for this reason, which is a bit annoying if you’ve a legit use case. They’re either very out-of-date, or have done some very foolish customisation…
Criminal waste of elotes, though. I’ll have them if they don’t want them.


HDMI -> DP might be viable, since DP is ‘simpler’.
Supporting HDMI means supporting a whole pile of bullshit, however - lots of handshakes. The ‘HDMI splitters’ that you can get on e.g. Alibaba (which also defeat HDCP) are active, powered things, and tend to get a bit expensive at high resolutions / refresh rates.
The Steam Machine is already being closely inspected on price. Adding a fifty-dollar dongle to the package is probably out of the question, especially a ‘spec non-compliant’ one.


I’m going to guess it would require kernel support, but certainly graphics card driver support. AMD and Intel, not so difficult - just patch and recompile; NVIDIA’s binary blob, ha ha, fat chance. Stick it in a repo somewhere outside of the zone of copyright control, add it to your package manager, boom, done.
I bet it’s not even much code. A struct or two that map the contents of the 2.1 handshake, and an extension to a switch statement that says what to do if it comes down the wire.
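Something of this shape, I’d imagine - sketched in Rust purely for illustration (the real drivers are C, and every name below is made up rather than lifted from the actual spec):

```rust
// Hypothetical layout, not the real HDMI 2.1 data structures - just the shape:
// a struct mapping the capability block the sink advertises, plus one extra
// branch deciding what to do when it turns up.
#[derive(Debug)]
struct SinkCaps21 {
    max_frl_gbps: u8,         // fixed-rate-link bandwidth the display claims
    dsc_supported: bool,      // display stream compression available?
    vrr_range_hz: (u16, u16), // variable refresh range
}

#[derive(Debug, PartialEq)]
enum LinkMode {
    Tmds,                        // fall back to plain HDMI 2.0-style signalling
    Frl { gbps: u8, dsc: bool }, // the new 2.1 link
}

fn pick_link_mode(caps: Option<&SinkCaps21>) -> LinkMode {
    // The "extension to a switch statement": no 2.1 capability block from the
    // sink means carrying on exactly as before.
    match caps {
        None => LinkMode::Tmds,
        Some(c) if c.max_frl_gbps == 0 => LinkMode::Tmds,
        Some(c) => LinkMode::Frl { gbps: c.max_frl_gbps, dsc: c.dsc_supported },
    }
}

fn main() {
    let sink = SinkCaps21 { max_frl_gbps: 48, dsc_supported: true, vrr_range_hz: (40, 120) };
    assert_eq!(pick_link_mode(Some(&sink)), LinkMode::Frl { gbps: 48, dsc: true });
    assert_eq!(pick_link_mode(None), LinkMode::Tmds);
}
```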


Python tkinter interfaces might be inefficient, slow and require labyrinthine code to set up and use, but they make up for it by being breathtakingly ugly.
On account of Daniel Ek’s bullshit, I’ve cancelled Spotify this year in favour of Qobuz, and am much happier all round.
Last year’s ‘wrapped’ was just AI-generated slop. After a year of listening to metal and electronica, I got a top five of stuff that I’m not sure I’d listened to at all. Who would have thought the great plagiarism machine, trained to produce the most average output from any given input, would not do well on input that diverges from the mean?
I’d probably have preferred a completely random K-Pop selection; might have been an interesting listen, try out something new.


He did shake things up with a lot of new ideas. I’d like to think that proving him wrong has gotten us to a better place; it’s the fin de siècle version of being wrong on the internet - everyone writes to correct you. Kind of sucks for everyone that got the bad advice in the meantime, though.


True. Was thinking of indie games, of the kind I might develop myself, which would be limited to the languages I speak.
If you’re developing something where you’d expect enough international sales to hire a translation team, then Chinese would be a sensible first choice, followed by Spanish.
Indeed.
In some ways, this kind of thing is ideal for Rust. It’s at its best when you’ve a good idea of what your data looks like, and you know where it’s coming from and going to, and what you really want is a clean implementation that you know has no mistakes. Reimplementing ‘core code’ that hasn’t changed much in twenty years to get rid of any foolish overflows or use-after-free bugs is perfect for it.
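To make that concrete, a toy example (names made up): the classic length-prefixed buffer parse, the sort of thing that’s been behind overflow bugs for decades, comes out like this, with every failure mode funnelled into a None instead of into memory:

```rust
// Toy example: pull a length-prefixed field out of a buffer. The arithmetic
// can't silently wrap and the slicing can't run off the end - bad input just
// gives you back None.
fn read_field(buf: &[u8]) -> Option<(&[u8], &[u8])> {
    // Two-byte big-endian length prefix.
    let len = u16::from_be_bytes([*buf.first()?, *buf.get(1)?]) as usize;
    let end = 2usize.checked_add(len)?; // overflow becomes None, not wraparound
    let field = buf.get(2..end)?;       // out-of-range becomes None, not a read past the end
    Some((field, &buf[end..]))          // safe: get() already proved end <= buf.len()
}

fn main() {
    let packet = [0x00, 0x03, b'a', b'b', b'c', 0xff];
    assert_eq!(read_field(&packet), Some((&b"abc"[..], &[0xff][..])));
    assert_eq!(read_field(&[0x00, 0x09]), None); // claims 9 bytes, has none
}
```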
Using Rust for exploratory coding, or when the requirements keep changing? I think you’ve picked the wrong tool for the job. Invalidate a major assumption and you have to rewrite the whole damn thing. And like you say, an important consideration for big projects is choosing a tool that a lot of people will be able to use. And Windows is very big.
They’re smoking crack, anyway. A million lines per dev per month? When I’m doing major refactoring, a couple of thousand lines per week in the same language, mostly moving existing stuff into a new home, is a substantial change. Two orders of magnitude more, with a major language conversion on top? Get out of here.