a single 1500w server can probably service ~20 people in a DC
I’m guessing you dropped a zero or two on the user count, and added an extra zero to the wattage (most traditional colocation datacenters max out at around 2,000 concurrent watts per 48U rack, so each server is going to target around 50-75w per rack unit of average load).
Netflix is going to be direct-playing pre-transcoded streams, so the main constraint would be bandwidth. If we average all streams out to 5Mb/s, that’s about 200 streams per gigabit of network bandwidth. Chances are that server has at least 10 gigabit networking, probably more like 50 gigabit if they have SSDs storing the data (especially with modern memory caching). That’s between 2,000 and 10,000 active clients per server.
Back of the envelope math says that’s around 0.075 watts per individual stream for a 150w 2U server serving 2,000 clients, which looks pretty realistic to my eyes as a sysadmin.
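If you want to sanity-check that, here’s a quick back-of-envelope script; the bitrate, NIC speeds, rack budget, and server wattage are all the assumptions from this thread, not real Netflix figures:

```python
# Back-of-envelope numbers from the thread above -- all assumed, not measured.
STREAM_MBPS = 5       # assumed average per-stream bitrate, in Mb/s
SERVER_WATTS = 150    # assumed average draw of a 2U streaming server
RACK_WATTS = 2000     # assumed colo power budget for a 48U rack
RACK_UNITS = 48

# ~42w of budget per rack unit -- so a rack can't actually be packed
# wall-to-wall with 75w-per-U servers; some of it stays empty or is switching
print(f"rack budget: ~{RACK_WATTS / RACK_UNITS:.0f}w per rack unit")

for nic_gbps in (10, 50):
    streams = nic_gbps * 1000 // STREAM_MBPS    # 1 gigabit = 1000 megabits
    print(f"{nic_gbps}G NIC: {streams} streams, {SERVER_WATTS / streams:.4f}w per stream")

# rack budget: ~42w per rack unit
# 10G NIC: 2000 streams, 0.0750w per stream
# 50G NIC: 10000 streams, 0.0150w per stream
```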
Granted, for a service the size of Netflix we aren’t talking about individual servers; we’re looking at a big orchestrated cluster of servers. Most of that, though, is handling basic web server tasks that are a completely solved problem, and each individual server is probably serving a few million clients thanks to modern caching and acceleration features. The real cost and energy hit is going to be in the content distribution, which I covered above.
I’m guessing you dropped a zero or two on the user count
i was being pretty pessimistic because tbh i’m not entirely sure of the requirements of streaming video… i guess yeah 200-500 is pretty realistic for netflix since all their content is pre-transcoded… i kinda had in my head live transcoding here, but also i said somewhere else that netflix pre-transcodes, so yeah… just brain things :p
also added an extra zero to the wattage
absolutely right again! i had the TDP in my head, eg threadripper at ~1500w… it’s actually 350w or lower
Hey, if you were thinking live-transcode I can definitely see why you’d think around 20 clients per server for CPU transcode, and I can also see where such a high wattage would come from!
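To put rough numbers on the live-transcode case: real-time software encoding has to keep pace with playback, so total encoder throughput is the hard ceiling. A sketch, where the aggregate fps figure is a loose guess for 1080p x264 on a big multi-core CPU, not a benchmark:

```python
# Why live transcode tops out in the tens of clients, not thousands.
TOTAL_ENCODE_FPS = 600   # assumed aggregate 1080p software-encode fps across all cores
PLAYBACK_FPS = 30        # each live transcode must keep pace with playback

print(TOTAL_ENCODE_FPS // PLAYBACK_FPS)   # ~20 concurrent transcodes
```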
Edit: fun bonus fact! Netflix offers caching servers to ISPs (the Open Connect program) that they can place on their side of the interconnect to mutually reduce bandwidth costs. From memory of a teardown I saw on reddit like a decade ago, it was a pretty standard 1U single-socket server (probably a Supermicro whitebox if we’re being real) with 4-6 HDDs to serve the media files
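For fun, a rough ceiling on what a box like that could push, assuming ~150MB/s of mostly-sequential reads per drive (a guess for HDDs of that era):

```python
# Rough throughput ceiling for a 1U cache box with 4-6 HDDs.
HDD_MB_PER_S = 150   # assumed sequential throughput per HDD, a guess for the era

for disks in (4, 6):
    gbps = disks * HDD_MB_PER_S * 8 / 1000   # MB/s -> Gb/s
    print(f"{disks} HDDs: ~{gbps:.1f} Gb/s, ~{int(gbps * 1000 / 5)} x 5Mb/s streams")

# 4 HDDs: ~4.8 Gb/s, ~960 streams
# 6 HDDs: ~7.2 Gb/s, ~1440 streams
```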
yeah i remember that as well! considering the bandwidth netflix takes up i’m not surprised at all! i think it’s like 15% of global internet bandwidth or something crazy?