
onlyrealcuzzo wrote:

> Google is pretty useful. It uses 15 TWh per year.

15 TWh per year is about 1.7 GW of average draw.

Assuming the above figures, that means OpenAI and Nvidia's new plan will consume about 5.8 Googles' worth of power, by itself.
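The arithmetic behind the "5.8 Googles" figure can be sketched in a few lines. The ~10 GW number for the OpenAI/Nvidia plan is an assumption here, taken from the figures discussed upthread:

```python
# Back-of-the-envelope power arithmetic from the thread.
HOURS_PER_YEAR = 365 * 24  # 8760

google_twh_per_year = 15  # figure quoted upthread

# TWh/year -> GWh/year -> average GW
google_avg_gw = google_twh_per_year * 1000 / HOURS_PER_YEAR

openai_nvidia_gw = 10  # assumed: the ~10 GW figure discussed upthread

print(f"Google average draw: {google_avg_gw:.2f} GW")   # ~1.71 GW
print(f"Googles of power: {openai_nvidia_gw / google_avg_gw:.1f}")  # ~5.8
```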

At that scale, there's a huge opportunity for ultra-low-power AI compute chips (compared with current GPUs), and right now there are several very promising technology pathways to get there.



" there's a huge opportunity for ultra-low-power AI compute chips (compared with current GPUs), and right now there are several very promising technology pathways to it"

Sharing an example would be nice. How much power reduction are we talking about here?


This one datacenter should be able to perform a 51% attack on any of the big cryptocurrencies with that much compute.

An interesting hedge in case the AI bubble pops.


Someone did the math above and said all of it would only be about 0.05 percent for Bitcoin.

I'm not sure about the GPU PoW coins, though.


Nope, anything besides ASICs is useless for crypto mining.


Kansas City shuffle?


You're downvoted but it's a real threat. Imagine hackers or state-sponsored entities using one of these mega data centers to destroy a few cryptocurrencies.


They are nothing compared to BTC Asics


Comparing 100 duck-sized horses to 1 horse-sized duck. Or perhaps the number of GPUs is in the ratio of 1,000:1.



