
It kind of feels like there's a bit too much noise around this topic.

I'm getting the same feeling I did years ago when it was discovered that the iPhone had a historical database of all the locations you'd been to. There were rather a lot of articles about how Apple were "tracking you everywhere you went" and so on.

The reason it's similar – they are both dumb, technically bad, and privacy-compromising decisions, and in both cases much of the public discussion about it has been a little hysterical and off-base.

Apple should 100% be criticised for this particular failure. It's obviously a bad implementation from a technical and usability point of view; the privacy implications are bad, and this feature should not have made it out as-is.

But I've legitimately seen people describe this as "Apple's telemetry" which is just obvious nonsense and distracts from the actual problem – how did such a bad implementation of a useful feature end up in a major commercial product, and how are they going to make sure it doesn't happen again?



It’s actually not at all obvious how a local list of locations used to power suggestions in Maps or Siri is in any way a compromise of privacy or technically bad.

The only thing that made it sound bad were people saying things like “Apple stores your location history”, knowing that it would create the false impression that Apple was uploading location data to their servers.

This situation is similar in that there are people posting misleading innuendo about Apple having some hidden agenda, but the difference is that there do seem to be real design problems with the mechanism this time.


Every iOS device connects to Apple's push service and stays connected. The client certificate it uses is tied to the serial number of the device itself, established when it registers for the push service.

Apple sees the client IP of the push connection, naturally.

Therefore, based on IP geolocation, Apple really does have coarse location history for every single iOS device by serial number.

Apple is indeed storing your (coarse) location history.
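To make the claim above concrete, here is a hypothetical sketch: given APNS-style connection logs of (serial, client IP, timestamp) and an IP-prefix-to-region table, a coarse per-device location history falls out of a simple join. All names, prefixes, and data here are invented for illustration; real geo-IP databases are far larger and more granular.

```python
# Sketch only: invented data, stdlib only. Shows how connection logs keyed
# by hardware serial plus IP geolocation yield a coarse location history.
import ipaddress
from collections import defaultdict

# Invented prefix -> region table (stand-in for a real geo-IP database).
GEO = {
    ipaddress.ip_network("203.0.113.0/24"): "Berlin",
    ipaddress.ip_network("198.51.100.0/24"): "Paris",
}

def region_for(ip: str) -> str:
    """Coarse geolocation: match the IP against known prefixes."""
    addr = ipaddress.ip_address(ip)
    for net, region in GEO.items():
        if addr in net:
            return region
    return "unknown"

def location_history(logs):
    """logs: iterable of (serial, ip, timestamp) -> {serial: [(ts, region)]}."""
    history = defaultdict(list)
    for serial, ip, ts in logs:
        history[serial].append((ts, region_for(ip)))
    return dict(history)

logs = [
    ("F17ABC123", "203.0.113.7", "2020-11-14T09:00Z"),
    ("F17ABC123", "198.51.100.9", "2020-11-15T18:30Z"),
]
print(location_history(logs))
# {'F17ABC123': [('2020-11-14T09:00Z', 'Berlin'), ('2020-11-15T18:30Z', 'Paris')]}
```

The point of the sketch is that no extra "tracking" machinery is needed: retaining the logs is sufficient.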


> Therefore, based on IP geolocation, Apple really does have coarse location history for every single iOS device by serial number.

This conflates technical possibility with an implemented system which stores that data and the insinuation that this is used for purposes other than what the user enabled. Do you have an evidence that Apple stores this data and uses it in violation of their privacy policy? You're apparently in Europe so you should be able to file a GDPR request to see exactly what they're storing.


Everything I described is implemented today, and is required for APNS to work. Whether Apple does or does not mine the data in some way that you personally find offensive is not relevant; the fact is that they are presently logging IPs for APNS connections, which have unique identifiers that are related directly to hardware serial numbers in their database. Because IPs generally equal location, they are in possession of location history for each iOS device serial number.

I'm not sure why these (plainly factual) statements are controversial.


There’s no question that they need connections to operate the service, but they do not need to retain that information, and while you have repeatedly asserted that they do, you have been unable to support that claim. This would be covered by privacy laws in many places, so it should be easy to point to their privacy disclosures or the result of an inquiry showing that they do in fact retain connection logs for more than a short period of time.


It stands to reason that a user of the service who has agreed to the TOS governing APNS and the App Store (which involve sending unique device hardware serial numbers to Apple) has also (legally) consented to IP address collection. IP addresses are less unique identifiers than globally unique device serials, so I assume Apple has already secured what passes for "consent" under the relevant privacy laws.


Again, nobody is questioning their access to that information. Where you typically go wrong is by asserting without evidence that they are performing additional activities without disclosing that. Surely you understand that having IPs be visible does not automatically mean retaining those records, much less building a searchable database?


I’m with you on this. If Apple genuinely views privacy as a human right (and FWIW I believe that the right people at Apple do), then they need to learn about privacy by design.


I think people who have this viewpoint are largely ignorant of the amount of tracking data that comes out of a Mac or iPhone, even in first-party apps, that you cannot turn off at all.

This is not an isolated incident, and their own OS services are explicitly whitelisted to bypass firewalls and VPNs in Big Sur.

There’s telemetry in most Apple apps, now, and you can’t disable it or opt out, or even block or VPN it in some cases. I encourage you to read their disclosures when you first launch Maps or TV or App Store. Every keystroke in the systemwide search box hits the network by default, too.


This is a good example of what I mean - it’s a rant about telemetry that has nothing to do with the issue in question and as a result conflates a whole mess of different issues.

I am pretty well-informed about most of the data that is being generated and transmitted by my machine. This is an issue where it is totally reasonable to pressure Apple and all other companies to design for privacy as a priority and ensure that any of this data collection can be disabled. I don’t think an effective means to do that is to deliberately conflate and mislead about issues like the one under discussion.


System services bypassing VPN and firewall because they're on a vendor whitelist (that can't be modified due to OS cryptographic protections) is a) fact not rant and b) nothing to do with telemetry.


This is a basic by-the-RFC implementation. The developer who was assigned this just used existing libraries and followed the protocol. This was a rational move on their part, especially when mucking with X.509 has been historically fraught with vulnerabilities.

OCSP has since been improved to increase privacy and security, but the extensions to enable that only considered OCSP in the context of TLS.


Just to correct a slightly incorrect perception: there is nothing inherently insecure or vulnerable about X.500/ASN.1/BER/DER parsing; in fact it is probably a more sane format to parse than JSON. The perception that it is somehow fraught with parser vulnerabilities comes from various implementations that tried to build a BER/DER parser by transforming something more or less equivalent to the ASN.1 grammar into actual parser code via C preprocessor macros, which is a somewhat obviously wrong approach to the problem, at least in a security context.
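To illustrate why DER is tractable to parse safely: every element is an explicit tag-length-value triple, with no escaping or ambiguity, so a defensive parser is short. The sketch below is a minimal stdlib-only Python illustration of one TLV step, not a full ASN.1 decoder; the function name and examples are invented.

```python
# Minimal DER tag-length-value parser (sketch, stdlib only). DER elements
# carry an explicit length, so bounds checking is straightforward.

def parse_tlv(data: bytes, offset: int = 0):
    """Parse one DER TLV at `offset`; return (tag, value, next_offset)."""
    tag = data[offset]
    offset += 1
    first = data[offset]
    offset += 1
    if first < 0x80:                      # short-form length: one byte
        length = first
    else:                                 # long-form: next N bytes hold the length
        n = first & 0x7F
        length = int.from_bytes(data[offset:offset + n], "big")
        offset += n
    value = data[offset:offset + length]
    if len(value) != length:
        raise ValueError("truncated DER value")
    return tag, value, offset + length

# Example: a SEQUENCE containing INTEGER 5 and BOOLEAN TRUE
encoded = bytes([0x30, 0x06, 0x02, 0x01, 0x05, 0x01, 0x01, 0xFF])
tag, body, _ = parse_tlv(encoded)
assert tag == 0x30                        # 0x30 = SEQUENCE
t1, v1, nxt = parse_tlv(body)             # INTEGER 5
t2, v2, _ = parse_tlv(body, nxt)          # BOOLEAN TRUE
assert (t1, v1) == (0x02, b"\x05") and (t2, v2) == (0x01, b"\xff")
```

The vulnerabilities the parent mentions came from generating parser code mechanically via macros, not from anything in the encoding itself.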




