I actually intended to post this to Reddit, but I figured I'd contribute it here instead to get the ball rolling and do my part.
Anyway, this is a Windows XP-era machine I have at work for testing. With the system powered off and just this monitor plugged into it, I saw the CPU fan trying to spin. I gave it a spin myself and it just kept going. When I disconnected the HDMI cable, it stopped.
The monitor is actually DisplayPort, going through a passive DP-to-HDMI adapter and then an HDMI cable to this PC. The GPU is just a PCI-E card, and the computer has some old ~2007 AMD CPU in it. The GPU doesn't actually seem to work anyway: the PC POSTs normally but there's no image from either the GPU or the onboard output, yet with either another GPU or no GPU at all, there's an image from the appropriate output.
I actually showed this to my IT lecturer and he said he hadn't seen anything like this either, but he went down the rabbit hole of HDMI specs and found that HDMI devices send out a small 5V supply (around 50mA), presumably to detect whether they're connected to something or not.
Somehow, that power must be running through the GPU, through the PCI-E slot, fuck knows where next, to the CPU fan header…
If you restart the system without the HDMI attached, does the fan still not turn on? Technically it could be an obscure software issue 🤔
I think that sounds logical apart from the system not being powered on at all lol
Only 50mA though? Is that enough to drive a fan?
I mean, it certainly seems plausible but that must be a hilariously badly designed motherboard, with also a surprisingly efficient route from the HDMI port to the CPU fan header?
I think the 50mA is only the maximum current a device is allowed to draw. I found a document from Texas Instruments saying devices may deliver up to 0.5A in a short-circuit condition, so the monitor could deliver more than 50mA. The graphics card is probably just drawing more than it's supposed to.
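Some rough back-of-the-envelope numbers for the discussion above. This is just a sketch: the nominal and short-circuit figures come from this thread, while the fan rating is an assumed typical value for a small CPU fan, not something measured on this machine.

```python
# Rough power budget for the HDMI +5V pin vs. a small CPU fan.
# 50mA nominal and 0.5A short-circuit figures are from the thread above;
# the fan rating is an assumed typical value (not measured).

hdmi_voltage = 5.0           # volts on the HDMI +5V pin
nominal_current = 0.050      # amps a sink is normally allowed to draw
short_circuit_current = 0.5  # amps, per the TI document mentioned above

nominal_power = hdmi_voltage * nominal_current           # 0.25 W
worst_case_power = hdmi_voltage * short_circuit_current  # 2.5 W

# A small CPU fan is often rated around 0.1-0.2 A at 12 V (~1-2.4 W).
# Run at only 5 V it draws less and produces little torque -- which would
# be consistent with a fan that needs a nudge to keep spinning.
print(f"nominal: {nominal_power} W, worst case: {worst_case_power} W")
```

So at the nominal 50mA limit there's only a quarter of a watt available, but if the monitor really can source up to 0.5A, that's a couple of watts, which is in the right ballpark to weakly turn a fan.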