Security and privacy are not parallel concerns; they’re orthogonal. Strong security absolutely does not imply utmost privacy. I find this to be the most dangerous misconception of the recent privacy trend. You can’t just turn both the security and privacy dials to 11. They’re actually two ends of the same dial, or opposing poles of the same sphere. To increase privacy you must move away from perfect security.
Why? Because security is all about who you trust (and who you don’t). Privacy is about concealing things from people you trust (and especially from those you don’t). Security is best served by strong identity and periodic integrity checks/monitoring. Privacy is best served by anonymity and opacity. If something is private, by definition it lacks the transparency needed to audit its integrity.
So what I lament is not that a company is trying to achieve both, but rather that as a consumer I’m neither educated on the topic nor given a choice as to where I want to set the dial.
You will continue to see “headlines” like this so long as we, as a society, are obsessed with trying to implement both security and privacy and fall for marketing suggesting some service provides the maximum of both.
If you trust Apple to verify the integrity of apps on your devices and to secure your system from unwanted software, then you trust Apple to maintain the privacy of the data needed to achieve that. That’s the whole value prop of their platform and ecosystem. It’s a walled garden with abundant privacy Kool-Aid fountains.
The only reason this is news is that people don’t understand the privacy-vs-security dichotomy, and that Apple does not provide a way for consumers to choose just how much security they’re comfortable with.
If you don’t trust Apple then stop pretending you do by using their hardware/ecosystem.
While security and privacy are not parallel, they are not orthogonal either.
You can have both. Integrity checks can be done anonymously (e.g., you could have a p2p network of devices sharing a signed database of certificate revocations).
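To make that concrete, here’s a minimal sketch of the idea (mine, not anything deployed) using Go’s standard-library Ed25519. The publisher key handling and the one-hash-per-line database format are purely illustrative assumptions; the point is that the signature gives integrity while the lookup never leaves the device.

```go
package main

import (
	"crypto/ed25519"
	"crypto/rand"
	"fmt"
	"strings"
)

// verifyAndLookup checks the database's signature against the publisher's
// public key, then searches it entirely locally. No per-lookup query ever
// leaves the device, so the check itself is anonymous.
func verifyAndLookup(pub ed25519.PublicKey, db, sig []byte, certHash string) (bool, error) {
	if !ed25519.Verify(pub, db, sig) {
		return false, fmt.Errorf("bad signature on revocation database")
	}
	for _, entry := range strings.Split(string(db), "\n") {
		if entry == certHash {
			return true, nil
		}
	}
	return false, nil
}

func main() {
	// Stand-in for the publisher's long-term signing key.
	pub, priv, err := ed25519.GenerateKey(rand.Reader)
	if err != nil {
		panic(err)
	}

	// Toy revocation database: one revoked cert hash per line.
	db := []byte("deadbeef\ncafebabe")
	sig := ed25519.Sign(priv, db)

	revoked, err := verifyAndLookup(pub, db, sig, "cafebabe")
	fmt.Println(revoked, err) // true <nil>
}
```

The devices still trust whoever signs the database, but because the whole list is replicated and checked offline, the signer never learns which entries anyone looks up.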
Encrypting something gives me privacy. Signing something gives me security. Encrypting a signed package gives me both.
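A toy sketch of that composition, assuming Go’s standard library, Ed25519 for the signature, and a pre-shared AES-256 key (all my assumptions, not anything from the thread): sign first for integrity, then encrypt the signed blob for confidentiality.

```go
package main

import (
	"crypto/aes"
	"crypto/cipher"
	"crypto/ed25519"
	"crypto/rand"
	"fmt"
)

// sealSigned signs msg (security: the recipient can verify who wrote it),
// then encrypts signature||msg under key (privacy: eavesdroppers see noise).
func sealSigned(priv ed25519.PrivateKey, key, msg []byte) ([]byte, error) {
	signed := append(ed25519.Sign(priv, msg), msg...)

	block, err := aes.NewCipher(key)
	if err != nil {
		return nil, err
	}
	gcm, err := cipher.NewGCM(block)
	if err != nil {
		return nil, err
	}
	nonce := make([]byte, gcm.NonceSize())
	if _, err := rand.Read(nonce); err != nil {
		return nil, err
	}
	// Prepend the nonce so the recipient can strip it off, decrypt,
	// and then verify the inner signature with the sender's public key.
	return gcm.Seal(nonce, nonce, signed, nil), nil
}

func main() {
	_, priv, err := ed25519.GenerateKey(rand.Reader)
	if err != nil {
		panic(err)
	}
	key := make([]byte, 32) // toy pre-shared AES-256 key
	if _, err := rand.Read(key); err != nil {
		panic(err)
	}

	sealed, err := sealSigned(priv, key, []byte("hello"))
	fmt.Println(len(sealed), err) // opaque bytes without the key
}
```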
They _are_ orthogonal. You’re just saying that you can have some of both, which is exactly my point about them existing on a spectrum.
And it’s not as simple as encrypting data. You have to trust somebody to determine what good integrity looks like and then to verify that the integrity information is fresh. The same privacy concern exists whether you run OCSP against ciphertext or plaintext: you still have a stream of all the things people do. Bad for privacy.
Running a decentralized system means you have to trust all the nodes not to store data or collude. Same problem in a different form. You simply cannot achieve integrity verification if you don’t trust anyone to do it, and this is why they are orthogonal. Trust is not compatible with doubt.
The issue here is that Apple took off-the-shelf OCSP and applied it in a way it was not designed for. So there _are_ actual problems with their recent implementation, and they should be fixed. And personally I think OCSP is kinda dumb, because it mechanically defeats the advantage of certs (you don’t need a cert if you’re going to phone home on every invocation; just check a hash), but meh.
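For illustration only, roughly what that “just check a hash” model looks like; the endpoint is a placeholder I made up, and the trade-off under discussion is right there in the code: every launch still emits a query.

```go
package main

import (
	"crypto/sha256"
	"fmt"
	"net/http"
	"os"
)

// checkBinary hashes an executable and asks a trusted service about it.
// Functionally this replaces the cert+OCSP dance, but every invocation
// still tells the server "this machine is about to run this program".
func checkBinary(path string) (string, error) {
	data, err := os.ReadFile(path)
	if err != nil {
		return "", err
	}
	sum := sha256.Sum256(data)
	// example.invalid is a placeholder; no such service exists.
	resp, err := http.Get(fmt.Sprintf("https://example.invalid/check/%x", sum))
	if err != nil {
		return "", err
	}
	defer resp.Body.Close()
	return resp.Status, nil
}

func main() {
	status, err := checkBinary(os.Args[1])
	if err != nil {
		fmt.Println("check failed:", err)
		return
	}
	fmt.Println("verdict:", status)
}
```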
You're using the extreme cases of each to argue they are "orthogonal". Perhaps at some extreme that no one actually lives at, they are orthogonal. Most people don't need extreme privacy or extreme security, so in the cases that matter this observation (which seems to be a major crux of your argument) is not important.
The point is that the more private you make something, the less ability you have to audit its integrity. If sending a 3rd party a list of hashes is a privacy problem, then security is what takes the hit in order to preserve privacy. That’s not an “extreme case”; it’s what’s being discussed in the essay and in this thread.
Similar examples include DNS over TLS vs. DNS filtering for content security, client certs for mutual TLS vs. the personal information those certs expose, secure neighbor discovery, and IPv6 (can’t have a stable global IP because someone might track it); the list goes on.
I’m not saying we should pursue security at all costs or privacy at all costs, far from it. I am saying exactly that there’s a balance between the two, and moreover that the balancing point may differ from person to person, which leads to arguments like the one we’re seeing here between people who calibrate toward the security end and people who prefer extreme privacy. And in my experience people very often conflate the two, which makes it hard to have a productive discussion.
Finally, I’d venture to say that the recent privacy push is hurting the ability to deploy strong identity, because much of the privacy wave lacks the nuance to distinguish between entities you trust (with which it’s okay to maintain a stable, secure identity) and those you don’t. Instead the trend lately has been to remove stable identifiers (e.g., Apple’s move to randomized MAC addresses, and GDPR treating IPs as PII) and to conceal everything no matter what (TLS 1.3 and DoT/DoH, though props to Mozilla for making it possible to configure at the network level via DNS).
One major security concern is identity theft, so your claim that privacy is orthogonal to security is very likely not generally true: the number of parties you share information with increases the risk.
So we need to look at each scenario individually, because a general rule might not exist. I think the mechanism of signature checking, and its consequences, should be more transparent.
> If you don’t trust Apple then stop pretending you do by using their hardware/ecosystem.
Fair point, but the trust I put in Apple is that I know their intentions, which are quite obvious: they want to sell their products, and they want their users to be safe and content. I trust them for that. Can this trust be extended to not creating barriers that might increase their revenue? Or to not sharing user info for profit? Certainly not. I think Apple is better here than its competitors, but it is a common error not to attach restricted scopes to security tokens.
edit: In this case there are trivial signature-checking schemes that solve the problem without any privacy violation. Maybe Apple's process does preserve privacy because they don't cross-reference any data. Maybe they just use it for statistical purposes, to determine which apps are used; I would be fine with that. But your statement suggests there is no privacy-preserving alternative, and I think that is technically wrong.