Hue bulbs (and any other RGB LED) can display (almost) any color perceptible to the human eye, as they combine the three wavelengths of light our eyes can detect (red, green and blue) and blend them at different brightnesses. The “millions of colors” sell comes from 16-bit color found all over the place in technology. Here’s more info: https://en.m.wikipedia.org/wiki/High_color
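A minimal sketch of the "three brightnesses" idea, in Python: a 24-bit colour is nothing more than three brightness levels, one per LED channel. The hex colour and the helper name here are made up for illustration; this is not a Hue API call.

```python
# Illustrative only: split a 24-bit hex colour into per-channel brightness.

def channel_brightness(hex_colour: str) -> dict:
    """Return red/green/blue brightness as a percentage (0-100%)."""
    value = int(hex_colour.lstrip("#"), 16)
    r = (value >> 16) & 0xFF
    g = (value >> 8) & 0xFF
    b = value & 0xFF
    return {name: round(level / 255 * 100, 1)
            for name, level in (("red", r), ("green", g), ("blue", b))}

print(channel_brightness("#FF8800"))
# {'red': 100.0, 'green': 53.3, 'blue': 0.0}
```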
16-bit colour gives us around 65,000 colours (2^16 = 65,536); 24-bit colour gives us the millions mentioned above (2^24 ≈ 16.8 million).
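The arithmetic behind those counts is just powers of two; a quick sketch (the 30-bit line is included only because it comes up later in the thread):

```python
# Bits per pixel -> number of distinct colours that can be encoded.
for bits in (16, 24, 30):
    print(f"{bits}-bit colour: {2 ** bits:,} colours")
# 16-bit colour: 65,536 colours
# 24-bit colour: 16,777,216 colours
# 30-bit colour: 1,073,741,824 colours
```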
Also, raw footage is described as 10-bit (per channel), not 30-bit.
So is it called 24-bit or 8-bit? I feel like most monitors have 8-bit color and the fancy ones have 10, not 24 and 30.
It’s just the sum. Monitors have 8 bits per color channel, making for 24 bits per pixel, which gives the millions mentioned. 16-bit is typically 5 bits per color plus an extra bit for green (5-6-5). But this has downsides, as explained in the article, when going from a higher bit depth to a lower one.
HDR is 10 bits per color channel, and upwards for extreme uses. So it’s sorta true they are 24 or 30 bit, but usually this isn’t how they are described; people normally talk about the bit depth of the individual color channels.
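A small sketch of how the per-channel depth adds up to the per-pixel figure, assuming the common RGB888 (8+8+8 = 24-bit) and RGB565 (5+6+5 = 16-bit "high color") layouts; the function names are made up for illustration:

```python
# Per-channel bit depth vs. per-pixel bit depth, shown by packing pixels.

def pack_rgb888(r: int, g: int, b: int) -> int:
    """Pack three 8-bit channels into one 24-bit pixel value."""
    return (r << 16) | (g << 8) | b

def pack_rgb565(r: int, g: int, b: int) -> int:
    """Quantize 8-bit channels down to 5-6-5 and pack into 16 bits."""
    return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)

pixel24 = pack_rgb888(255, 136, 0)
pixel16 = pack_rgb565(255, 136, 0)
print(f"24-bit pixel: {pixel24:#08x}  (8+8+8 bits per channel)")
print(f"16-bit pixel: {pixel16:#06x}  (5+6+5 bits per channel)")
```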