
The point isn't IMO rules.

It's that we are living in a period with very real consequences from nearly a century of unchecked CO2 emissions from human industry.

And AI (like crypto before it) requires considerable energy consumption. Because of this, I believe we (people who believe in AI) need to hold companies accountable by requiring them to transparently disclose those energy costs.



They are 100% accountable for paying the bill from their electricity provider.


This is a very salient point.


> need to hold companies accountable by very transparently disclosing those energy costs.

And if they do, then what? If it is "too high" do we delay research because we need to keep the world how it is for you? What about all the other problems others face that could be solved by doubling down on compute for AI research?


> And if they do, then what? If it is "too high" do we delay research because we need to keep the world how it is for you?

First, it's keeping the world how it is for all of us, not just me.

Second, to answer your question, I think that is a decision for all of us to weigh in on, but before we can do that, we must be informed as best we can.

Do sacrifices have to be made for the greater good? Absolutely. Do for-profit mega corporations get to make those decisions without consent from the public? No.


Many people don't want to live in the world how it is. They would rather see risks taken for accelerated progress. Stop trying to pretend your take is the humanitarian take.


> Many people don't want to live in the world how it is.

Got a source to support your assertion that many people are okay with the effects of climate change?

https://www.scientificamerican.com/article/more-climate-laws...

> Stop trying to pretend your take is the humanitarian take.

That's a straw man. Regardless, I cannot believe many humans support an uninhabitable world.


What if at some point AI figures out a solution to climate change?


Why would we be any more likely to implement it, relative to the solutions that humans have already figured out for climate change?


Well we can be confident in the knowledge that techbros might finally take the issue seriously if an AI tells them to!


I know this is not an uncommon opinion in tech circles, but I believe it is an insane thing to hang humanity's hopes on. There's no reason to think AI will be omnipotent.


There is not, but there is plenty of historical evidence that scientific and technological progress has routinely addressed humanity's crisis du jour.


You mean new and incredibly effective ways to shape public opinion? I guess it might. But then someone would still have to use it for that purpose...



