SnausagesinaBlanket@lemmy.world to Ask Lemmy@lemmy.world · 9 hours ago
Is there currently an accurate way to say how much power per prompt LLMs use?
fizzle@quokk.au · 5 hours ago
Most of the power consumption comes from training and optimising models. You only interact with the finished product, so power per query is very low compared to that required to develop the LLM.
lime!@feddit.nu · 1 hour ago
While this is true in isolation, the number of users means that inference now uses more power than training for the large actors.
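The two comments above can be reconciled with a back-of-envelope calculation: training is a large one-time cost, but inference energy accumulates with every query. The figures below are rough public estimates and assumptions, not numbers from this thread — a widely cited estimate of ~1,287 MWh for training GPT-3, and an assumed average of 0.3 Wh per prompt:

```python
# Back-of-envelope: when does cumulative inference energy exceed training energy?
# All figures are rough public estimates / assumptions, not measured values.

TRAINING_ENERGY_WH = 1_287 * 1_000_000   # ~1,287 MWh, a widely cited GPT-3 training estimate
ENERGY_PER_QUERY_WH = 0.3                # assumed average energy per prompt, in Wh

# Number of queries after which inference has consumed as much energy as training.
break_even_queries = TRAINING_ENERGY_WH / ENERGY_PER_QUERY_WH
print(f"break-even at ~{break_even_queries:,.0f} queries")

# At an assumed 100 million queries per day, days until inference overtakes training.
QUERIES_PER_DAY = 100_000_000
days_to_break_even = break_even_queries / QUERIES_PER_DAY
print(f"~{days_to_break_even:.0f} days at {QUERIES_PER_DAY:,} queries/day")
```

Under these assumptions the break-even point is roughly 4.3 billion queries, or about six weeks at that query volume — which is why, at the scale of the large actors, total inference energy now exceeds training energy even though each individual prompt is cheap.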