Hacker News
Private AI Compute (blog.google)
18 points by neehao 20 days ago | 3 comments


Trusted Execution Environments are interesting technology. There are a few research papers detailing how they can be attacked—aside from the obvious risk that the TEE manufacturer holds the keys and could, if compelled or willing, share access with others.

In practice, TEEs provide a strong privacy guarantee. For businesses handling sensitive data, they let you credibly claim that the data is protected during processing, even when the computation happens in the cloud.
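
For what it's worth, the thing that makes that claim checkable is remote attestation: before sending anything, the client verifies a signed measurement of the code running inside the enclave. A toy sketch of that check in Python (the report format, field names, and keyed MAC here are made up for illustration, not any real vendor's attestation API, which uses quotes signed against the vendor's certificate chain):

    import hashlib
    import hmac

    # Measurement of the enclave binary we audited and expect to be running
    # (hypothetical value; real measurements come from the TEE hardware).
    EXPECTED_MEASUREMENT = hashlib.sha256(b"enclave-binary-v1.2.3").hexdigest()

    def verify_report(report: dict, shared_key: bytes) -> bool:
        # 1. Only talk to an enclave running exactly the code we expect.
        if report["measurement"] != EXPECTED_MEASUREMENT:
            return False
        # 2. Check the report is authentic (keyed MAC here stands in for the
        #    vendor's signature over the quote).
        expected_mac = hmac.new(
            shared_key, report["measurement"].encode(), hashlib.sha256
        ).hexdigest()
        return hmac.compare_digest(expected_mac, report["mac"])

    # Only release plaintext to the enclave if verify_report(...) returns True.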


This article was written weirdly:

> For decades, Google has developed privacy-enhancing technologies (PETs) to improve a wide range of AI-related use cases.

They introduce this random "PETs" acronym after a random string of three words and then never use it again. In general, this article makes some weird stylistic choices (WSCs).


I'm not sure how this is any more robust than any other privacy guarantee from a cloud provider.

Rule of thumb for knowing whether your data is private: unless you can physically destroy the drives/tapes/film yourself, it's not private.



