Data centers are big consumers of energy. Most modern data centers have a mix of vector and scalar compute, because ML/AI is a grab bag of workloads, most of which were already ubiquitous a decade ago.
In the limit case where Prineville just gets 100k H100s slammed into it? The absolute best you're going to do is have Brendan Gregg looking at the cost. He's the acknowledged world expert on profiling and performance tuning on modern gear in the general case. There are also experts in particular verticals (for low-latency C++/SG14, you want to watch Carl Cook).
I've been around the block, and my go-to on performance trouble is "What does the Gregg book say here?" Make it your first stop.
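Not the main point, but as a toy illustration of the "measure before you tune" workflow Gregg advocates, here's a minimal sketch using Python's built-in cProfile (the function is hypothetical; on real gear you'd reach for perf and flame graphs on the actual workload):

```python
# Minimal sketch: profile first, tune second.
# hot_loop is a made-up stand-in for whatever your real hotspot is.
import cProfile
import pstats

def hot_loop(n):
    total = 0
    for i in range(n):
        total += i * i
    return total

profiler = cProfile.Profile()
profiler.enable()
hot_loop(10_000_000)
profiler.disable()

# Sort by cumulative time and show the top 5 entries,
# so you tune the thing that actually costs you.
stats = pstats.Stats(profiler).sort_stats("cumulative")
stats.print_stats(5)
```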
The data source is linked and is based on the ARM data center energy prediction. But I don't think it's too far-fetched.
The compute needed for digital twins, simulating a whole army of robots and then uploading the results to the robots, which still need a ton of compute themselves, is not unrealistic.
Cars like Teslas have A TON of compute built in too.
And we have seen what suddenly happens to an LLM when you scale up the number of parameters. We were in an investment lull where it was not clear what to invest in (the crypto, blockchain, and NFT bubbles had burst), but AI opened up the sky again.
If we continue like this, it is not far-fetched that everyone will have their own private agent running and pay for it (private/isolated for data security), plus a work agent.
Is that implying that by 2030 they expect at least 20% of all US energy to be used by AI?
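For scale, a quick back-of-envelope, assuming the 20% refers to electricity rather than total primary energy, and assuming US electricity consumption of roughly 4,000 TWh/year (a 2022-era figure):

```python
# Back-of-envelope: what would "20% of US electricity by 2030" mean in absolute terms?
# Assumption: US annual electricity consumption is roughly 4,000 TWh.
US_ELECTRICITY_TWH = 4_000
AI_SHARE = 0.20  # the 20% share implied by the chart being discussed

ai_twh = US_ELECTRICITY_TWH * AI_SHARE
# Convert annual TWh to average continuous power draw in GW:
# 1 year ~ 8,760 hours, so GW = TWh * 1000 / 8760
ai_gw = ai_twh * 1_000 / 8_760

print(f"Implied AI consumption: {ai_twh:.0f} TWh/year, ~{ai_gw:.0f} GW continuous")
# -> roughly 800 TWh/year, i.e. ~90 GW of continuous draw
```

That's on the order of ninety large power plants running flat out for AI alone, which gives a sense of how aggressive the prediction is.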