It's that we are living in a period where there are very real consequences of nearly a century of unchecked CO2 emissions from human industry.
And AI (like crypto before it) requires considerable energy consumption. Because of this, I believe we (people who believe in AI) need to hold companies accountable by requiring them to transparently disclose those energy costs.
> need to hold companies accountable by very transparently disclosing those energy costs.
And if they do, then what? If it is "too high" do we delay research because we need to keep the world how it is for you? What about all the other problems others face that could be solved by doubling down on compute for AI research?
> And if they do, then what? If it is "too high" do we delay research because we need to keep the world how it is for you?
First, it's keeping the world how it is for all of us, not just me.
Second, to answer your question, I think that is a decision for all of us to weigh in on, but before we can do that, we must be informed as best as we can.
Do sacrifices have to be made for the greater good? Absolutely. Do for-profit mega corporations get to make those decisions without consent from the public? No.
Many people don't want to live in the world how it is. They would rather see risks taken for accelerated progress. Stop trying to pretend your take is the humanitarian take.
I know this is not an uncommon opinion in tech circles, but I believe it is an insane thing to hang humanity's hopes on. There's no reason to think AI will be omnipotent.