
> Why even care if it is written by a machine or not? I am not sure it matters as much as people think.

You don't see the writing on the wall? OK, here is a big hint: it might make a huge difference from a legal perspective whether some "photo" showing child sexual abuse (CSA) was generated using a camera and a real, physical child, or by some AI image generator.



I don't think all jurisdictions make that distinction to begin with, and even if they did and societies really wanted to go there, I'm not sure why a licensing regime for generators, with associated cryptographic provenance information embedded in the images, couldn't work. We don't have to be broadly permissive, if permissive at all.
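For concreteness, a minimal sketch of what "cryptographic information in the images" could mean: a licensed generator signs each output with its private key, and anyone can later verify the signature against a public key registered with the licensing body. The key names and workflow below are illustrative assumptions, not an existing standard (real-world efforts in this direction look more like C2PA content credentials).

    # Hypothetical sketch: a licensed generator signs its outputs so provenance
    # can be verified against a registered public key. Illustrative only.
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import (
        Ed25519PrivateKey,
        Ed25519PublicKey,
    )

    # Key pair held by the licensed generator; the public key would be
    # registered with a hypothetical licensing authority.
    generator_key = Ed25519PrivateKey.generate()
    registered_public_key = generator_key.public_key()

    def sign_output(image_bytes: bytes) -> bytes:
        """Generator side: produce a signature to ship alongside the image,
        e.g. embedded in its metadata."""
        return generator_key.sign(image_bytes)

    def verify_output(image_bytes: bytes, signature: bytes,
                      public_key: Ed25519PublicKey) -> bool:
        """Verifier side: check the image really came from the registered generator."""
        try:
            public_key.verify(signature, image_bytes)
            return True
        except InvalidSignature:
            return False

    image = b"...rendered image bytes..."
    sig = sign_output(image)
    assert verify_output(image, sig, registered_public_key)

This only proves which tool produced an image (and breaks if the image is re-encoded or stripped of metadata), so it's a building block for a licensing regime rather than a complete answer.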


I agree with you, but in some jurisdictions AI-generated material and actual photographs of child abuse are treated very similarly; either way, possessing either can result in what England and Wales calls a "sexual harm prevention order" (SHPO). To me, the idea that someone could be served such an order without ever possessing real CSEM (or "child porn"), never mind ever being near a child, is rather worrying.



