Calling nearly naked, non-consensual imagery of real children “not CSAM” is a dangerous avenue to follow. For a child, this can easily lead to bullying, substantiate rumors that are otherwise false, or normalize their unwilling participation in sexual activity.
I think you may be coming to this view from the position that this is just the AI using imagination/hallucination, so it’s “art.” A better approach would be to treat it like a real photo taken secretly, because absent overt labeling of its AI origins, that is exactly how the world will treat it.
I disagree. If we solve this problem, we’ll also solve the more serious “real” problem.
The only distraction comes from people saying this is okay, and those people invariably resort to motte-and-bailey arguments to hide the fact that they are generally okay with CSAM.
The same people who defended “creep shots” and “family nudism photos” are now defending GrokAi.
Choosing not to counter them merely allows them to normalize grey areas, which they can then use as cover to freely associate with others trading in more obscene and obviously illegal content.