Some of Signal's designs for contact privacy, including in the new usernames feature, rely on trust in SGX.
If anyone (including Signal) can pretend to be a secure SGX environment, you're back to trusting Signal's personnel/operations, rather than Intel/SGX, for some of the metadata/contact privacy they've historically touted.
Just to expand on this, since it wasn't originally clear to me from reading your post: the contact-discovery feature uses SGX enclaves to populate your known contacts on Signal. When you log into Signal for the first time, your phone locally has all of your known contacts, and the Signal app wants to know which of those contacts already have Signal accounts. The secure enclave is the mechanism for this: you upload your entire contact list from your phone to the Signal servers, and they send back the subset of those contacts that actually have Signal accounts. The point of the enclave is that this is all done in a way where Signal can't see which contacts you sent them, nor can they determine which contacts were matched and sent back to you.
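To make that concrete, here's a toy model of the intersection step that would run inside the enclave. Everything here is illustrative: the function names are made up, and real contact discovery does much more (remote attestation, rate limiting, and oblivious memory access so even the matching pattern leaks nothing). This only shows the basic "upload set, get back the overlap" shape.

```python
import hashlib

def h(phone: str) -> str:
    # Hash identifiers so the (hypothetical) enclave handles digests,
    # not raw phone numbers. Note: phone numbers have low entropy, so
    # hashing alone is brute-forceable -- which is exactly why Signal
    # needed an enclave rather than plain hashed uploads.
    return hashlib.sha256(phone.encode()).hexdigest()

def discover_contacts(client_contacts, registered_users):
    """Toy model of the code inside the enclave: intersect the client's
    uploaded contact list with the set of registered users and return
    only the matches. The untrusted server around the enclave is meant
    to see neither plaintext set nor the result."""
    registered = {h(u) for u in registered_users}
    return [c for c in client_contacts if h(c) in registered]

matches = discover_contacts(
    ["+15551230001", "+15551230002"],   # the client's address book
    ["+15551230002", "+15551239999"],   # accounts registered with the service
)
print(matches)  # only the overlap comes back: ["+15551230002"]
```

The whole privacy argument rests on the intersection happening inside hardware that Signal's operators supposedly can't inspect, which is what makes the SGX trust question above matter.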
It wasn't In-Q-Tel, but it was still essentially the CIA. Signal received millions in funding from the Open Technology Fund, a funding arm of Radio Free Asia, an organization founded as a CIA propaganda front.
Agreed. No US corporation will be allowed to operate unless it gives the government access to the data it holds (cf. Lavabit). I've never really understood the trust the tech community gives Signal when it requires a hard identity (a phone number) and immediately asks you to upload your whole contact list on first run.
We know from the Snowden disclosures that metadata about who is communicating with whom, and when, is one of the NSA's most valuable data streams. While Signal may not be able to turn over the contents of your messages, it absolutely sits on a rich stream of metadata.
You mean the telephone contacts / contact list stored in iCloud and Google cloud?
It's probably a hard problem to solve, and using SGX is Signal's best guess at an acceptable approach.
I think the NSA has simpler ways to access the data in question than going through Signal.
Users could generate key pairs themselves, once, and use them to sign architectural enclaves post factum: each enclave's cryptographic hash of its contents would only be computed and signed at that point. This way, users are only as secure as they want to be. The code would need to be verified against each user's public key, but we're talking about specialised software here.
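A minimal sketch of that idea, with assumptions stated up front: instead of real signatures, this uses a pinned SHA-256 measurement (a stand-in for SGX's MRENCLAVE) that the user personally approves, so trust decisions move from Intel/the vendor to the user. All names here are hypothetical.

```python
import hashlib

# Hypothetical per-user allowlist: measurements of enclave builds this
# user has chosen to trust, e.g. after auditing or building them locally.
TRUSTED_MEASUREMENTS: set[str] = set()

def measure(enclave_binary: bytes) -> str:
    # Stand-in for MRENCLAVE: a hash over the enclave's initial contents.
    return hashlib.sha256(enclave_binary).hexdigest()

def trust(enclave_binary: bytes) -> None:
    # The user vouches for this exact build, after the fact ("post factum").
    TRUSTED_MEASUREMENTS.add(measure(enclave_binary))

def verify(attested_measurement: str) -> bool:
    # At connection time, accept only builds the user personally approved;
    # the vendor's own signing key plays no role in this decision.
    return attested_measurement in TRUSTED_MEASUREMENTS

good_build = b"enclave v1.2 code"
trust(good_build)
print(verify(measure(good_build)))        # approved build -> True
print(verify(measure(b"tampered code")))  # unapproved build -> False
```

In a real design the user's private key would sign the measurement and the verifier would check that signature with the user's public key; the pinned-set version above just shows where the trust decision would live.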
Compare this to the de facto standard, where US corporations hold the private keys to everything in hardware (off the top of my head: processor microcode, UEFI) and everything in software (TLS root certificates, IP address allocation, DNS).
We already have libpairip and Play Integrity on Android; let's not bring that over to desktop processors.
More info (2020): https://medium.com/@maniacbolts/signal-increases-their-relia...