
Seems like there is rather a lot of misunderstanding about what Apple are doing here, which is a bit disappointing. They're running an algorithm to detect known child sexual abuse material (CSAM): hashes of your photos are computed on your phone and compared against a database of hashes of known CSAM images.
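For what it's worth, my mental model of the matching is something like the sketch below. Loud assumptions: the real system uses a perceptual hash (NeuralHash) plus private set intersection so the device never sees the raw blocklist, whereas this uses an ordinary cryptographic hash and a plain set lookup, and the threshold value is only illustrative.

    # Hypothetical sketch of on-device hash matching; NOT Apple's actual code.
    # perceptual_hash() stands in for a real perceptual hash (robust to resizing,
    # recompression, etc.); SHA-256 is used here only to keep the sketch runnable.
    import hashlib

    def perceptual_hash(image_bytes: bytes) -> str:
        return hashlib.sha256(image_bytes).hexdigest()

    def count_matches(photos: list[bytes], known_csam_hashes: set[str]) -> int:
        # Compare each photo's hash against the database of known-image hashes.
        return sum(1 for p in photos if perceptual_hash(p) in known_csam_hashes)

    def should_flag_account(photos: list[bytes], known_csam_hashes: set[str],
                            threshold: int = 30) -> bool:
        # Nothing is reported until the number of matches crosses a threshold
        # (the value 30 is illustrative, not a claim about Apple's actual number).
        return count_matches(photos, known_csam_hashes) >= threshold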

Separately, if you are a child (and I presume you are not if you are reading this), photos sent in Messages are scanned to check whether they are sexually explicit. You can choose not to send, but if you send anyway your parents are informed. So this is relevant if you are a child, or if you have a child with an iPhone. Otherwise, it is not relevant to you.
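To be concrete about the Messages part, here is how I read the flow, as a sketch; every name here is a placeholder I made up, not an Apple API.

    # Hypothetical sketch of the Messages flow for a child account.
    def is_sexually_explicit(photo: bytes) -> bool:
        return False  # stand-in for the on-device classifier

    def outgoing_photo_decision(photo: bytes, child_confirms_send: bool) -> tuple[bool, bool]:
        # Returns (send_photo, notify_parents).
        if not is_sexually_explicit(photo):
            return True, False     # not flagged: photo sends, nobody is notified
        if not child_confirms_send:
            return False, False    # child opts not to send: nothing happens
        return True, True          # child sends anyway: parents are informed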

Please help me understand what is so outrageous about these systems that would make you throw away your Apple products and move to something (what?) else.



Beyond the controversial implementation with on-device "processing", there is the fact that a third party, a private corporation funded by the DOJ ($30+ million), can change the hashes. Nobody can publicly audit this because of the sensitive nature of the material. And this system will be exported to the whole world.


They require hash hits from two companies. And there is a human at Apple checking final hits, so that is a way of verifying nonsense isn’t being added to the database.
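If I understand it correctly, the point of requiring hits from two sources is that the effective blocklist is the intersection of the two databases, so a single organization can't unilaterally slip a hash in. A rough sketch of that idea (my own illustration, not Apple's code):

    # Only hashes present in BOTH providers' databases go into the blocklist,
    # so neither provider alone can add an arbitrary hash.
    def build_blocklist(provider_a_hashes: set[str], provider_b_hashes: set[str]) -> set[str]:
        return provider_a_hashes & provider_b_hashes  # set intersection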


Ok. It is ok. For you. But not for me and a growing crowd of "screeching voices".



