right? a US friend of mine messaged me the other day “is VIC on fire?”… like… na, and actually this summer has been pretty mild so far lol. i haven’t even seen a single melting road!


i thought this too, and i just started actually working with it and DAMN is it fast… i agree that it’s kinda a technical “what the fuck are you doing?!?” but… yeah… i can’t even really explain why


absolutely! similar is true of node in v8 (though python imo is far more mature in this regard) and probably most other languages
exactly why things like numpy are so popular: yeah python is slow, but python is just the orchestrator
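rough sketch of what i mean by orchestration (numbers obviously vary by machine, and the workload here is just made up for illustration):
```python
# pure python runs every multiply/add through the interpreter one bytecode at a
# time; the numpy version makes a single call into compiled C and just collects
# the result - python is only orchestrating
import time
import numpy as np

n = 1_000_000
arr = np.arange(n, dtype=np.int64)

start = time.perf_counter()
total_py = sum(x * x for x in range(n))   # interpreted loop
py_time = time.perf_counter() - start

start = time.perf_counter()
total_np = int((arr * arr).sum())         # vectorised, work done in C
np_time = time.perf_counter() - start

assert total_py == total_np
print(f"pure python: {py_time*1000:.1f}ms, numpy: {np_time*1000:.1f}ms")
```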


further to that, i’d like to debate “demonstrably worse for the planet”: a huge amount of climate science is done with python-based tools because they’re far easier for researchers to pick up and run with - ie just get shit done rather than write good/clean code - so i’d argue python’s benefit to the planet is in the outputs it enables for significantly reduced input costs (or outputs that, in many cases, it outright enables)


yeah we have a “supply charge” that’s ~$1/day on top of that base rate too, so roughly the same situation :(
we’ve got this crap because of privatisation so it’s not likely to change any time soon.
i hope your energy prices come down when energy things stabilise in europe!


just sayin’ this is still so incredibly cheap… 8c/kwh… australian electricity prices are 24-43c/kwh (obv usd vs aud but the aussie $ isn’t that weak)
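rough conversion to make that apples-to-apples (the exchange rate is an assumption, plug in whatever it is today):
```python
# converting that 8c USD/kWh into AUD so it can be compared to the 24-43c range
usd_to_aud = 1.5        # assumed rate, roughly where it's been sitting lately
us_price_aud = 8 * usd_to_aud
print(f"~{us_price_aud:.0f}c AUD/kWh vs 24-43c AUD/kWh")
# -> ~12c AUD/kWh: still roughly 2-3.5x cheaper even after conversion
```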


enron sold plenty of gas and real things too: it’s the double handling that’s the problem; not the nature of the goods or services


openai has practically no value and that’s well known… nvidia is paying companies to buy their chips and playing bullshit shell games
the difference is openai is a pretty well known unprofitable company, and they aren’t doing quite as much of the bullshit shell games. nvidia is selling to basically everyone, taking stakes in companies, giving weird deals… it’s bloody impossible to track how much of their sales are real and how much those real sales are actually worth, and if those sales are loss leaders for some investment then those investments look a lot like openai
so not only is nvidia invested in a lot of very questionable AI bubble companies, their own sales figures are also… unreliable
they’re making billions upon billions because they’re using their own money multiple times. it’s kinda like leveraged trading with all the risk, and it’s incredibly arrogant at the scale nvidia is doing it


perhaps… i guess the single directional execution model would help to prevent memory leaks, and components would help keep things relatively contained… and also javascript in general avoids whole classes of c/c++ bugs… but it’s also incredibly slow. imo it’s just not something you should write core system components in
to be clear, it’s not react that’s the problem here: its execution model is an excellent way of structuring UI… but something as core as the start menu really isn’t somewhere you should be fucking around with slow languages
and that’s not to say that FOSS shouldn’t do it - they’re open, and thus something like react makes it easier for devs to write plugins and extend things etc… but that’s not an engineering concern for windows: they don’t get the luxury of using extensibility as an excuse


and thus in this case worse than useless: dangerous


little measurable difference? the last time they rewrote something they replaced the start menu with fucking react
the difference will be measurable and enormous


yeah they do certainly exist, but bog standard “red light cameras”… ie single purpose cameras are not that kind of problem… imo, as long as they’re deployed to combat actual issues they’re very much a beneficial tool
i think it’s important to differentiate these new kinds of cameras from the single purpose cameras so that arguments against them can be made independently


going straight from nothing to 30g/day (RDI) absolutely causes diarrhoea because it irritates your gut lining
if you follow the directions like metamucil’s - increase by a gram or 2 per week - then you’ll be right
you also need insoluble fibre, and psyllium only has soluble fibre
adjacent YSKs:


red light cameras - at least in australia - are stock standard canon DSLRs… they take images, but not video
there are some newer ones that do things like photos of people using their phones stopped at lights etc, but generally speed/red light and “single purpose” cameras will just be doing stills, and wouldn’t be too useful for anything other than a single photo when the sensor triggers it


yeah i remember that as well! considering the bandwidth netflix takes up i’m not surprised at all! i think it’s like 15% of global internet bandwidth or something crazy?


I’m guessing you dropped a zero or two on the user count
i was being pretty pessimistic because tbh i’m not entirely sure of the requirements of streaming video… i guess yeah 200-500 is pretty realistic for netflix since all their content is pre-transcoded… i kinda had in my head live transcoding here, but also i said somewhere else that netflix pre-transcodes, so yeah… just brain things :p
also added an extra zero to the wattage
absolutely right again! i had in my head the TDP eg threadripper at ~1500w - it’s 350w or lower


my numbers are coming from the fact that anyone who’s replacing all their streaming likely isn’t using a single disk… WD red drives (as in NAS drives) according to their datasheet use between 6 and 6.9w when in use (3.6-3.9w at idle)… a standard home NAS has 4-6 bays, and i’m also assuming that in a typical NAS setup they’re in some kind of RAID configuration, which likely means some level of striping so all disks are utilised at once… again, i think all of these are decent assumptions for home users using off the shelf hardware
i’m ignoring sleep here, because sleep for NAS drives leads to premature failure… this is why, if you buy WD green drives for your NAS for example and you use linux, you use hdparm to turn off sleep, avoiding the constant parking and unparking of the heads that significantly reduces drive life (afaik many NAS products do this automatically, or otherwise manage it)
the top end of that estimate for drives (6 drives) is 41.4w, and the low end (4 drives) is 24w… granted, not everyone will have even those 4 drives, so perhaps my estimate is a little off, but i don’t think 30w for drives is an unreasonable assumption
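same datasheet numbers as above, just laid out:
```python
# back-of-envelope for the drive power: WD red active draw per the datasheet,
# with 4-6 drives spinning together because RAID striping touches them all
active_w = (6.0, 6.9)   # per-drive active power range

for drives in (4, 6):
    low, high = drives * active_w[0], drives * active_w[1]
    print(f"{drives} drives active: {low:.1f}-{high:.1f}w")
# 4 drives: 24.0-27.6w, 6 drives: 36.0-41.4w -> ~30w for drives sits inside that band
```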
again, here’s where data centres just do better: their utilisation is spread much more evenly… the idle power of drives is not hugely less than their full speed read/write, so it’s better to have constant access over fewer drives, which is exactly what happens with DCs because they have fewer traffic spikes (and can legitimately manage drive power off for hours at a time because their load is both predictable, and smoother due just to their scale)
also, as someone else in the thread mentioned: my numbers for servers were WAY off for a couple of reasons, but basically
Back of the envelope math says that’s around 0.075 watts per individual stream for a 150w 2U server serving 2000 clients, which looks pretty realistic to my eyes as a Sysadmin.
that also sounds realistic to me, having realised i fucked up my server numbers by an order of magnitude for BOTH power use, and users served
servers and data centres are just in a class of their own in terms of energy efficiency
here for example: https://www.supermicro.com/en/products/system/storage/4u/ssg-542b-e1cr90
this is an off the shelf server with 90 bays that has a 2600w power supply (which even then is way overkill: that’s almost 29w per drive)… with 22tb drives (off the top of my head because that’s what i use, as it is/was the best $ per TB) that’s almost 2pb of storage… that’s gonna cover a LOT of people with that 2600w, and imo 2600w is far beyond what they’re actually going to be pulling
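and the same back-of-envelope sums laid out (the 150w / 2000 client figure is the one quoted above, the rest is from the supermicro spec and the 22tb drives i mentioned):
```python
# per-stream power from the quoted 2U figure
server_w, clients = 150, 2000
print(f"per-stream power: {server_w / clients:.3f} w")     # 0.075 w per stream

# the 90-bay supermicro box linked above
psu_w, bays, drive_tb = 2600, 90, 22
print(f"psu budget per bay: {psu_w / bays:.0f} w")         # ~29 w, far more than a drive pulls
print(f"raw capacity: {bays * drive_tb / 1000:.2f} pb")    # ~1.98 pb in one 4U chassis
```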


that’s all irrelevant though… the rule is the rule and they got caught
people should be allowed to have awards for games which only use humans, and if a game is caught cheating they should be disqualified
if they want to compete for some awards, these aren’t the awards for them: there are others
what kind of monster writes a script without a shebang?
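for anyone unsure what that means: the shebang is the first line of a script, telling the kernel which interpreter to run the file with… a minimal example (python here purely for illustration, same idea for bash/perl/whatever):
```python
#!/usr/bin/env python3
# the line above is the shebang: with it (plus chmod +x) you can run ./script.py
# directly and the kernel knows to hand the file to python3
print("hello from a script that declares its interpreter")
```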