Hacker News

I mostly agree, though I actually wonder if the energy difference is as big as you say. Yeah, the big LLM company datacenters consume tremendous amounts of power, but that's mostly for either training or serving millions of requests at once. I wonder what the actual net power consumption would be for a single machine doing about as much work as an ordinary person, with just enough hardware to run the necessary models. Or what the average energy per interaction is at one of the big shared datacenters - they reportedly have a lot of optimizations to fit more requests onto the same hardware. I think it might be only one order of magnitude greater than a human brain. It might actually be pretty close to equal if you compare total energy used against work done, since the human brain needs to be kept alive all the time but can only effectively do work for a limited part of the day.
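The "total energy vs. work done" comparison above can be sketched with a quick back-of-envelope calculation. All the numbers here are rough assumptions for illustration only: a brain draws roughly 20 W around the clock, published per-query LLM energy estimates vary widely (roughly 0.3-3 Wh), and the assumed task duration and productive hours are placeholders, not measurements.

```python
# Back-of-envelope: energy per "task" for a human brain vs. one LLM query.
# Every constant below is an assumption chosen for illustration.

BRAIN_POWER_W = 20.0        # brain draws ~20 W continuously, awake or asleep
PRODUCTIVE_HOURS = 8        # assume ~8 h/day of effective cognitive work
LLM_WH_PER_QUERY = 1.0      # placeholder mid-range per-request estimate (Wh)
TASK_MINUTES = 10           # assume one "task" takes a human ~10 focused minutes

# Amortize the brain's full-day energy over only its productive hours.
brain_wh_per_day = BRAIN_POWER_W * 24                        # 480 Wh/day
brain_wh_per_productive_hour = brain_wh_per_day / PRODUCTIVE_HOURS
brain_wh_per_task = brain_wh_per_productive_hour * (TASK_MINUTES / 60)

ratio = LLM_WH_PER_QUERY / brain_wh_per_task
print(f"brain: {brain_wh_per_task:.1f} Wh/task, LLM: {LLM_WH_PER_QUERY} Wh/query")
print(f"LLM / brain energy ratio: {ratio:.2f}")
```

With these particular assumptions the two end up within an order of magnitude of each other, which is the point of the comment: once you charge the brain for its always-on upkeep, the gap may be much smaller than raw datacenter power figures suggest. Different assumed values shift the ratio considerably.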


Great points. I was thinking only about the noggin inside our heads, without considering all the infrastructure it relies on for support!



