Hacker News

When a company has the ability to push OTA updates to a device locked down with trusted computing, it's not even a backdoor at that point, it's a frontdoor.

I agree political action here is totally fruitless. The UK government and Apple could already be cooperating and you would have no way of telling the difference.



> When a company has the ability to push OTA updates to a device locked down with trusted computing, it's not even a backdoor at that point, it's a frontdoor.

Ideally, everything that runs outside of an app sandbox would be 100% open source. Anything short of that is not sufficient to give people full confidence against a backdoor. (Even that relies on people paying attention, but it at least creates the possibility that someone outside the company, not just an internal whistleblower, could catch and flag a backdoor.)


I think so too. It should include fully free and open specifications for the hardware, FOSS for all software running outside the sandbox, and probably FOSS for most of the software running inside the sandbox as well. This shouldn't be the only measure taken, but it would be a very important part.


I'll go even further and bring up Trusting Trust: the whole toolchain needs to be open source and verifiable.

And you need to be able to compile each and every part of it.


Open source alone isn’t enough. You also need a way to build and deploy the code yourself.


Agreed. And demonstrated reproducibility, showing that the result is bit-for-bit identical.


> you would have no way of telling the difference

If only specific individuals are targeted, I agree. But if it's pushed to all users, wouldn't we expect a researcher to notice? Maybe not immediately, so damage would be done in the meantime, but sooner rather than later.


> But if it's pushed to all users, wouldn't we expect a researcher to notice?

Think of the security a games console has: every download arrives encrypted, all storage is encrypted, RAM is encrypted, and security hardware in the CPU ensures everything is signed by the corporation before decrypting anything, all to prevent cheating and piracy.

Modern smartphones are the same way.

We can't expect independent researchers to notice a backdoor when they can't access the code or the network traffic.


How long was Heartbleed exploitable? How many people looked at that code? Now take the source away and make the exploit intentional.




