Every artist, and every creator of anything, learned by engaging with other people's work. I see training AI as basically the same thing: instead of training an organic mind, you're training a neural network. If it reproduces works that are too similar to the originals, that's obviously an issue, but the same standard applies to human artists.
For-profit products are for-profit products, and they are required to compensate rights holders if they are derivative of other works. In this case, there would be no AI product without the upstream training data, which checks the box for being derivative.
If you would like to change the laws, fine. But simply breaking them and saying "but the machine is like a person" is still... just breaking the laws and stealing.
This is a bad-faith argument, but even if I were to indulge it: human artists can and do get sued for mimicking the works of others for profit, which is precisely what AI does. Secondly, many of the works in question carry explicit copyright terms that prohibit derivative works. These companies have built a multi-billion-dollar industry on theft at scale. I don't see a more charitable interpretation.
You can't call something a bad-faith argument just because you disagree with it. I mean, you can, but it's not at all convincing.
As I said, if AI companies reproduce copyrighted works, they should be sued, just as a human artist would be. I haven't seen that happen in my own interactions with LLMs, though I've never really tried to achieve that result either. I don't pirate anymore, but torrents are a far easier and cheaper way to commit copyright infringement than an AI tool is.
LLMs don't have to be able to mimic things to be useful. And go ahead and sue OpenAI and Anthropic! It won't bother me at all. Fleece those guys; take their money. It won't stop LLMs, even if we bankrupted OpenAI and Anthropic.