
Confidential Cloud is similar to end-to-end encryption, but with the added benefit of letting your personalized AI work for you even when you aren’t using the app.

You control who can decrypt your data

Your employer, we as software providers, and the government cannot decrypt your data without your permission, even with a subpoena.

Your data is anonymized

There is an initial mapping from your application request to an anonymous ID, but after that, even we don't know whose encrypted data is whose.

Your data will never be sold

Not only do we pledge never to sell your data, but we couldn't even if we wanted to because we can't decrypt it without your permission.

Envelope encryption with unique secret keys

Data keys are used to encrypt your data and are themselves encrypted under a secret key. No one, including us, can export the secret keys.
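The envelope pattern described above can be sketched in a few lines. This is a toy illustration only: XOR with a random pad stands in for a real cipher such as AES-GCM, and the names (kek, data_key) are illustrative, not Limitless's or AWS KMS's actual API.

```python
import secrets

def xor(data: bytes, key: bytes) -> bytes:
    """Toy 'cipher': XOR with a key. Illustration only, not real crypto."""
    return bytes(a ^ b for a, b in zip(data, key))

# 1. A key-encryption key (KEK) lives inside the HSM and never leaves it.
kek = secrets.token_bytes(32)

# 2. A fresh data key encrypts each piece of user data...
plaintext = b"user transcript"
data_key = secrets.token_bytes(32)
ciphertext = xor(plaintext, data_key)

# 3. ...and the data key itself is stored only wrapped under the KEK.
wrapped_data_key = xor(data_key, kek)

# To read the data, the service asks the HSM to unwrap the data key,
# then decrypts locally -- which is exactly where plaintext reappears.
unwrapped = xor(wrapped_data_key, kek)
recovered = xor(ciphertext, unwrapped)
```

Note that "no one can export the KEK" says nothing about who can ask the HSM to unwrap data keys; any service code with that permission sees plaintext.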

Tamperproof hardware

Decrypted data is only ever stored in memory (source: <https://docs.aws.amazon.com/kms/latest/cryptographic-details...>), which is protected from tampering by a multi-chip standalone hardware cryptographic appliance (source: <https://csrc.nist.gov/projects/cryptographic-module-validati...>).

Protected from 3rd party AI providers

Data sent to 3rd party AI providers for transcription and summarization is anonymized, not used for training, and deleted after 30 days.

Protected from cloud provider and subpoena

AWS KMS (source: <https://aws.amazon.com/kms/>) uses FIPS 140-2 (source: <https://csrc.nist.gov/CSRC/media/projects/cryptographic-modu...>) validated hardware security modules (HSMs) to ensure no one, including AWS employees, can retrieve your plaintext KMS keys.



Limitless is encrypting at rest, not using end-to-end encryption.

E2EE implies that only the user (or, in the case of e.g. group chats, only people the user knows about) can access the decrypted data, which is false here. Limitless does not decrypt data on the client using a key only the user holds; it decrypts the data on the server (in this case using AWS KMS) and sends it to the client.

Even setting aside that Limitless could decrypt everyone's data via AWS KMS (since the user does not control the key), you could trivially write a Cloudflare Worker (since you use Cloudflare on your API subdomain) that simply sends the (unencrypted) API response, along with the email from the Supabase JWT used in the header, to a server that accumulates everyone's recording names, transcripts, generated notes and generated summaries. If someone gained access to your Cloudflare account they could do the same.

You're advertising Limitless as if you weren't able to see people's transcripts even if you wanted to, which is false. Even your employer can, if they TLS-MitM you with their own TLS certificates, which is not rare. On the other hand, Signal cannot see your data unless they modify client code, nor can your employer unless they install a modified Signal client on your device or install spyware that reads decrypted data from memory. This is what separates encryption at rest from E2EE (which you claim your solution is just as secure as, or better than) for the end user, and it feels like false advertising. Limitless, your employer, and a potential hacker can all read your data, at minimum while you're using Limitless.
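The Cloudflare Worker scenario above can be sketched in miniature: once decryption happens server-side, a single added line on any hop of the response path copies the plaintext. Every name here is hypothetical; this is not Limitless's or Cloudflare's actual code.

```python
# Stands in for "a server that accumulates everyone's data".
exfil_log = []

def kms_decrypt(ciphertext: bytes) -> bytes:
    """Stand-in for a server-side KMS Decrypt call (toy: reverse bytes)."""
    return ciphertext[::-1]

def handle_request(user_email: str, stored_blob: bytes) -> bytes:
    # Legitimate flow: decrypt server-side, return plaintext to the client.
    return kms_decrypt(stored_blob)

def compromised_proxy(user_email: str, stored_blob: bytes) -> bytes:
    # One extra line at any hop after decryption defeats "we can't see it":
    plaintext = handle_request(user_email, stored_blob)
    exfil_log.append((user_email, plaintext))  # the Worker-style copy
    return plaintext

response = compromised_proxy("alice@example.com", b"tpircsnart")
```

The client receives exactly the response it expects; nothing observable changes. Under true E2EE this interception point does not exist, because the server never holds plaintext at all.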


There is no way to anonymize transcripts where I and others refer to me by name and reference uniquely identifiable information. What happens when the AI provider is served with a warrant or subpoena for transcripts mentioning MacsHeadroom in the 30 day window?
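To make the point above concrete: the "initial mapping to an anonymous ID" is pseudonymization, not anonymization. Swapping the account identifier for a random token leaves every name and identifying detail inside the transcript itself untouched. A hypothetical sketch:

```python
import secrets

def pseudonymize(user_id: str, transcript: str) -> tuple[str, str]:
    """Replace the account ID with a random token; the text is unchanged."""
    anon_id = secrets.token_hex(8)
    return anon_id, transcript

anon_id, text = pseudonymize(
    "user-1234",
    "Hi MacsHeadroom, this is Alice from 42 Main St calling about your order.",
)
# The account ID is gone from the record's label...
# ...but every piece of PII is still sitting in the transcript body.
```

Any third party holding the transcript for 30 days can still answer a warrant for "transcripts mentioning MacsHeadroom" by simple text search.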

Your "anonymization" does nothing. You have major egg on your face right now.

The only remotely anonymous version of a service like this involves a private LLM endpoint.

I have Mixtral 8x22B and CommandR+ running on my home LLM server. How can I use Limitless without handing my "anonymized" (but NOT actually anonymized in any way, shape, or form) PII over to a third party who retains it for at least 30 days (or longer if someone with a badge or gavel asks)?


This is nonsense. If you can send the data to third-party AI providers for transcription and summarization, you can just as well send it to anyone else who asks for it (e.g. a government wiretap, advertisers), with or without the "anonymization" step. It doesn't matter that you cannot export the keys used to decrypt in memory if you control the code that then processes the decrypted data.

It is not at all "similar to end-to-end encryption".


Sorry, but what? How do you do any of the AI training or inference without decrypting the data in the cloud? I seriously doubt it's all running locally on the pendant.



