Search engines appear to care more about being good "Netizens". It's not like GoogleBot never crashed a site, but it's rare. Search engine bots check whether they need to back off for a bit, they check ETags, and they notice when a page changes infrequently and slow down their crawl frequency accordingly.
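Roughly, the polite version looks something like this (a minimal Python sketch using requests; the bot name, the in-memory ETag cache, and the backoff numbers are just assumptions, not how any real crawler is built):

    import time
    import requests

    etag_cache = {}  # url -> last ETag we saw (hypothetical in-memory cache)

    def polite_fetch(url, min_delay=1.0):
        headers = {"User-Agent": "ExampleBot/1.0 (+https://example.com/bot)"}
        # Conditional GET: only ask for the body if the page actually changed.
        if url in etag_cache:
            headers["If-None-Match"] = etag_cache[url]

        resp = requests.get(url, headers=headers, timeout=10)

        if resp.status_code == 304:
            return None  # unchanged since last crawl, nothing to re-fetch

        if resp.status_code in (429, 503):
            # Server is struggling or rate-limiting us: honor Retry-After.
            retry_after = resp.headers.get("Retry-After", "60")
            time.sleep(int(retry_after) if retry_after.isdigit() else 60)
            return None

        if resp.headers.get("ETag"):
            etag_cache[url] = resp.headers["ETag"]

        time.sleep(min_delay)  # never hammer the same host back to back
        return resp.text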
If you train an LLM, it's not like you keep a copy of every page around, so there's no point in checking whether you need to re-scrape a page: you always do, because you store nothing.
Personally I think people would be pretty indifferent to the new generation of scrapers, AI or otherwise, if they at least behaved and slowed down when they noticed a site struggling. If they had the slightest bit of respect for others on the web, this wouldn't be an issue.
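Even something this simple would go a long way (again just a hypothetical sketch; the class name and thresholds are made up): grow the per-host delay whenever the site starts erroring or responding slowly, and drift back to normal once it recovers.

    import time
    import requests

    class AdaptiveThrottle:
        """Grow the per-host delay when the site looks like it's struggling."""

        def __init__(self, base_delay=1.0, max_delay=120.0):
            self.base_delay = base_delay
            self.max_delay = max_delay
            self.delay = base_delay

        def fetch(self, url):
            time.sleep(self.delay)
            start = time.monotonic()
            try:
                resp = requests.get(url, timeout=10)
            except requests.RequestException:
                # Connection problems: back off hard.
                self.delay = min(self.delay * 2, self.max_delay)
                return None
            elapsed = time.monotonic() - start

            if resp.status_code >= 500 or elapsed > 5.0:
                # Server errors or very slow responses: double the delay.
                self.delay = min(self.delay * 2, self.max_delay)
            else:
                # Healthy response: drift back toward the normal crawl rate.
                self.delay = max(self.base_delay, self.delay * 0.9)
            return resp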
Most offer ways to opt out; some don’t. Scraping somebody’s website might be annoying or problematic traffic-wise, but that’s a far (very far) step removed from saying scrapers should be criminalised. The latter claim is outright laughable.
How do you think search engines work?