Hacker News

Oh come on, Apple has had numerous privacy issues as well.

What about uploading every photo to their server to check for child porn?

What about uploading every binary you run to their server to check if it's “approved”?

People really need to take off their “sheeple” glasses. Apple is a huge public company like all the others, beholden to making a profit for its shareholders, and that's it.

All that virtue signalling that they do is just a smokescreen. They are not any better or worse than MS/Google/Amazon etc…



> What about uploading every photo to their server to check for child porn?

I don't believe that's true. If you use iCloud, your photos go to their servers, but if you aren't backing up to their service, they are not checking. What was controversial was that they were going to check on the device itself, so your phone would be scanned even if you didn't use iCloud.


Using iCloud is the incentivized default (like using an online account on Windows), so it applies to most users.


> What about uploading every binary you run to their server to check if it's “approved”?

My understanding was that macOS only checked the signature of third-party executables before first run.

Windows may let you run it first, but any unknown executable (not just a hash, the file itself) gets sent to Microsoft by default. [0]

[0] https://medium.com/sensorfu/how-my-application-ran-away-and-...


The child porn check is there to please the government, not shareholders.


This. Apple did this against its own interests as a privacy-first company, as a way to reduce a real-world problem. Personal privacy has always been at odds with law enforcement, and Apple is one company that has taken real steps to restrict the unimpeded flow of personal information to governments. The CSAM database and checking were still implemented in a privacy-preserving way that made every effort to restrict that same flow while not enabling child sexual exploitation.


Is there any law that required Apple to do this? On a global scale even?


While I'm unsure about laws on a global scale, I do know that law enforcement has repeatedly and publicly pressured Apple for keys-to-the-kingdom-style access to customer devices. The justification is often a case that seems irredeemable to the average Joe. I know this from articles that have previously hit HN's front page.


> They are not any better or worse than MS/Google/Amazon etc…

You sure? I would be surprised if MS, Google, and Amazon were the same when it comes to how they handle and use PII. Note that I say "I would be surprised" because, without having gone to the length of issuing a GDPR right-of-access request with any of them, this is more a gut feeling based on personal experience than hard fact.

> What about uploading every binary you run to their server to check if it's “approved”?

You are either being dishonest or misinformed. For one, they don't upload binaries. Second, the more important function is checking that the executable isn't known malware, and third, they no longer log PII like IP addresses or user IDs when performing the check. [1]

1: https://support.apple.com/en-us/HT202491
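To illustrate the distinction being drawn here (a generic sketch, nothing Apple-specific — `binary_fingerprint` is a hypothetical helper name): a reputation lookup keyed on a hash transmits a fixed-size digest, not the file's contents, so the server never receives the binary itself.

```python
import hashlib

def binary_fingerprint(path: str) -> str:
    """Compute the SHA-256 digest of an executable.

    A reputation check that sends only this fixed-size digest
    reveals nothing about the binary's contents, unlike uploading
    the whole file the way the linked Windows write-up describes.
    """
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Stream in 64 KiB chunks so large binaries don't load into memory.
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()
```

The digest is all a lookup service would need to match against a known-malware list.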



