Yes, but that electricity consumption benefits an actual person.
I'm surprised by how often I find myself having to explain this to AI boosters: people have more value than computers.
If you throw a computer in a trash compactor, that's a trivial amount of e-waste. If you throw a living person in a trash compactor, that's a moral tragedy.
The people who build, maintain, and own the data centers. The people who work at and own the companies that make the hardware in the data centers. The people who work to build new power plants to power the data centers. The truck drivers who transport all the supplies to build the data centers and power plants.
Call me crazy, but I'd rather live in a world with lots of artists making art and sharing it with people than a world full of data centers churning out auto-generated content.
One thing I've noticed - artists view their own job as more valuable, more sacred, more important than virtually any other person's job.
They canonize themselves, and then act all shocked and offended when the rest of the world doesn't share their belief.
Obviously the existence of AI is valuable enough to pay the cost of offsetting a few artists' jobs; it's not even a question to us, but to artists it's shocking and offensive.
It's shocking and offensive to artists and to like-minded others because AI labs have based the product that is replacing them on their existing labor, with no compensation. It would be one thing to build a computerized artist that out-competes human artists on merit (arguably happening now); this has happened to dozens of professions over hundreds of years. But the fact that it was built directly off of their past labors, with no offer, plan, or even consideration of making them whole for their labor in the corpus, is unjust on its face.
Certainly there are artists with inflated egos and senses of self-importance (many computer programmers with this condition too), but does this give us moral high ground to freely use their work?
How many people is it OK to exploit to create "AI"?
Every piece of work is built off of previous work. Henry Ford designed his car based off of the design of previous cars, but made them much more efficiently. No difference here. It's always been the case that once your work is out in the world the competition is allowed to learn from it.
Compensation for previous products just hasn't been the norm, and if it becomes the norm, countless humans will suffer due to the slowing of progress.
We will never be able to automate anything, because the risk:reward ratio simply won't be there if you have to pay off millions or billions of people. Progress will grind to a halt, but I guess we'll have preserved some archaic jobs while our children are denied a better world.
Certainly the Industrial Revolution would never have happened with this mindset.
I read this comment as implying a similar kind of exceptionalism for technology, but expressing a different set of values. It reminds me of the frustration I’ve heard for years from software engineers who work at companies where the product isn’t software and they’re not given the time and resources to do their best work because their bosses and nontechnical peers don’t understand the value of their work.
The opposite is also true: the tech world views itself as more sacred than any other part of humanity.
You say it's obvious that the existence of AI is valuable enough to offset a few artists' jobs, but it is far from obvious. The benefits of AI are still unproven (a more hallucinatory Google? a tool to help programmers make architectural errors faster? a way to make ads easier to create and sloppier?). The discussion of whether AI is valuable is common even on Hacker News, so I really don't buy the "it's obvious" claim. Furthermore, the idea that it is only offsetting a few artists' jobs is also unproven: the future is uncertain, and it may devastate entire industries.
> One thing I've noticed - artists view their own job as more valuable, more sacred, more important than virtually any other person's job.
> They canonize themselves, and then act all shocked and offended when the rest of the world doesn't share their belief.
You could've written this about software engineers and tech workers.
> Obviously the existence of AI is valuable enough to pay the cost of offsetting a few artists' jobs, it's not even a question to us
No, it's not obvious at all. Current AI models have made it 100x easier to spread disinformation, sow discord, and undermine worker rights. Those concerns matter more to me than being able to more efficiently Add Shareholder Value.
I have noticed this, but it's not coming from artists themselves. It's mostly coming from people who have zero artistic talent but really wish they did.
I would be fine if data centers paid the full cost of their existence, but that isn't what happens in our world.
Instead, the cost of pollution is externalised and placed on the backs of humanity's children. That includes the pollution from datacentres running off fossil fuel generators, because gas was cheaper in the short term than investing in solar capacity and storage that pays back over the long term; the pollution from building the semiconductors in servers and GPUs that will likely have less than a 10-year lifespan in an AI data center, as newer generations have lower operating costs; the water for evaporative cooling being pulled from aquifers at an unsustainable rate, because that's cheaper than deploying more expensive heat pumps in a desert climate... and the pollution of the information on the internet with AI slop.
The short term gains from AI have a real world cost that most of us in the tech industry are isolated from. It is far from clear how to make this sustainable. The sums of money being thrown at AI will change the world forever.
Given that data centers use less energy than the human labor they replace, they actually reduce pollution. Replacing those GPUs with more efficient models also reduces pollution, because the replacements use less electricity for the same workload than the units they replaced.
This is such a wild take. You're 100% correct that AI-generated art consumes fewer resources than humans making art and having to, you know, eat food and stuff.
Obviously, the optimal solution is to eliminate all humans and have data centers do everything.
> I'm so surprised that I often find myself having to explain this to AI boosters but people have more value than computers.
That is true, but it does not survive contact with Capitalism. Let's zoom out and look at the larger picture of this simple scenario of "a creator creates art, another person enjoys art":
The creator probably spends hours or days painstakingly creating a work of art, consuming a certain amount of electricity, water, and other resources. The person enjoying that work derives a certain amount of appreciation, say N "enjoyment units". If payment is exchanged, it would reasonably be some function of N.
Now an AI pops up and, prompted by another human, produces a similar piece of art in minutes, consuming a teeny, teeny fraction of what the human creator would. This Nature study about text generation finds LLMs are 40-150x more efficient in terms of resource consumption, dropping to 4-16x for humans in India: https://www.nature.com/articles/s41598-024-76682-6 -- I would suspect the ratio is even higher for something as time-consuming as art. Note that the time spent by the human prompter is probably even less: just the time taken to imagine and type out the prompt and maybe refine it a bit.
So even if the other person derives only 0.1N "enjoyment units" from the AI art, in purely economic terms AI is a much, much better deal for everyone involved... including for the environment! And unfortunately, AI is getting so good that it may soon exceed N, so the argument that "humans can create something AI never could" will apply to an exceptionally small fraction of artists.
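To make that back-of-the-envelope comparison concrete, here is a minimal sketch in Python. The numbers are purely illustrative assumptions carried over from the paragraph above (a 100:1 resource-cost ratio and 0.1N enjoyment for the AI piece), not measured values:

    # Illustrative back-of-the-envelope comparison; all numbers are assumptions.
    human_cost, human_enjoyment = 100.0, 1.0   # human piece: 100 resource units, N = 1.0
    ai_cost, ai_enjoyment = 1.0, 0.1           # AI piece: 1 resource unit, 0.1N

    # Enjoyment produced per unit of resource consumed.
    human_efficiency = human_enjoyment / human_cost
    ai_efficiency = ai_enjoyment / ai_cost

    print(ai_efficiency / human_efficiency)    # -> 10.0

Under these assumed numbers, even at a tenth of the enjoyment the AI piece comes out an order of magnitude ahead per resource unit, which is the whole point of the economic argument above.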
There are many, many moral arguments that could be made against this scenario, but as has been shown time and again, the definition of Capitalism makes no mention of morality.
But it sounds like in this case the "morality" that capitalism doesn't account for is basically just someone saying "you should be forced to pay me to do something that you could otherwise get for 10x cheaper." It's cartel economics.
In isolation that makes sense, but consider that these AIs have been trained on a vast corpus of human creative output without compensating the human creators, and are now being used to undercut those same humans. As such there is some room for moral outrage that did not exist in prior technical revolutions.
Personally, I think training is "fair use", both legally and practically -- in my mind, training LLMs is analogous to what happens when humans learn from examples -- but I can see how those whose livelihood is being threatened can feel doubly wronged.
The other reason I'm juxtaposing Capitalism and morality is the disruption AI will likely cause to society. The scale at which this will displace jobs (basically, almost all knowledge work) and replace them with much higher-skilled ones (basically, you need to be at the forefront of your field) could be rather drastic. Capitalism, which has already led to such extreme wealth inequality, is ill-suited to solving this, and as others have surmised, we probably need to explore new ways of operating society.