Uncomfortable Questions About App Signing (commonsware.com)
86 points by interpol_p on Sept 28, 2020 | 37 comments


Maybe I'm misunderstanding something, but we are talking about non-rooted consumer Android, right? Google already has complete control over those systems. That hypothetical oppressive regime could simply ask them to embed its spyware in the next OS update (or have Google remote-install a spy app of its choice via the normal Play Services process[1]).

What more potential for abuse would this change bring that isn't there already?

[1] https://www.quora.com/How-Google-play-remote-installation-of...


I don't understand either. It's not like they don't also control the code that performs the signature verification.

I don't really know how app submitting/building/signing works on Android, but I'd say the main issue it could add would be that the application could be tampered with between the moment it's built and the moment it's signed by Google.


It's surprising to me how pervasive this idea is that app code can be completely trusted because a developer signs it at some point.

All those super secure end-to-end encrypted messaging apps are just one automatic app update away from uploading any locally stored conversations and keys. And turning off automatic updates isn't enough, Google and Apple have ways to force an update if required.

Maybe for now Apple/Google can convince the courts to not coerce them into using this power, but it is present.


Personally, I would say app signing is the smallest worry. Far bigger are:

1. Mandatory Xcode for iOS build/debug
2. Mandatory yearly payments to Apple if you want to release an iOS app
3. Mandatory DRM for iOS apps, even for open-source apps
4. Apple's horrible certificate/provisioning-profile system if you have multiple devices


Sure, but the alternative is currently worse: a nightmare landscape of Trojans and malware.

I actually agree with your basic premise, and think Apple should be required to make a bootcamp for iPhone.

But then, as a community, we're going to have to do the work to make a new ecosystem that is both open and safe enough for regular people.

Right now it is one or the other, but not both.


Complete user freedom includes the freedom to activate the orbital footgun array (e.g. kernel level debuggers), it is inherently unsafe.

So we can't have both at the same time.

But what we can have is more stops along the safe-powerful gradient and provide safe APIs/wrappers for the most popular use-cases that are developed through unsafe means.


All security relies on trust.

Right now, iOS security relies on trusting Apple to evaluate the trustworthiness of third parties.

Fully open systems rely on trusting yourself to evaluate the trustworthiness of third parties.

I believe it’s possible to engineer a framework where there are more options for delegating trust than just these two.

We don’t currently have such an infrastructure, but we need one.


I mean, doesn't Android already have that middle ground? If I install F-Droid, I have to assess whether I trust it, but thereafter I delegate trust to them and can install from their repos without having to re-decide every time.
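The delegation pattern being described boils down to one trust decision per repository, after which anything that repository signed is accepted. A toy sketch of that idea (the repo names and key fingerprints here are made up for illustration, not real F-Droid values):

```python
# Toy model of store-level trust delegation: the user makes one explicit
# trust decision per repository, pinning that repo's signing key, and
# thereafter accepts any package carrying a matching signature.
# All names and fingerprints below are hypothetical.

trusted_repos = {
    "f-droid.org": "A1B2C3D4",  # fingerprint pinned when the repo was added
}

def can_install(repo: str, signer_fingerprint: str) -> bool:
    """Allow installation only if the package's signer matches the key
    pinned for a repository the user already trusts."""
    return trusted_repos.get(repo) == signer_fingerprint

print(can_install("f-droid.org", "A1B2C3D4"))   # True: trusted repo, pinned key
print(can_install("f-droid.org", "DEADBEEF"))   # False: trusted repo, wrong key
print(can_install("evil.example", "A1B2C3D4"))  # False: unknown repo
```

Note how coarse the granularity is: trust attaches to the whole repo, not to individual developers or apps, which is the limitation the reply below this comment points out.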


To a certain extent yes, but delegation at the store level is extremely coarse grained.

Also, no automatic updates.


> To a certain extent yes, but delegation at the store level is extremely coarse grained.

I suppose that's true. I'm curious what level of precision you would prefer to see? Accepting a developer's key?

> Also, no automatic updates.

Only because Google decided to be uncooperative in their builds. If you control your own system partition, I believe you can still inject the privileged extension and have this. Google has also implied that this might change with Android 12 (https://android-developers.googleblog.com/2020/09/listening-...).


macOS lets you install a kernel level debugger, though.


Apple never sufficiently instruments code during app review; the curation doesn't actually prevent malware, it just lets them remove it once it's well known.

If this weren't the case, you wouldn't hear about people sneaking behavior Apple doesn't like into published apps.


This is exactly how iOS works today; in fact, Apple recommends you upload your LLVM IR to them and tag your assets as well, so they can recompile and recombine your apps for hardware you don't yet know exists. Which is nice if you trust Apple… if you don't, then it is very difficult to verify that what you're downloading from the App Store is actually what you submitted to the company. With resigning and FairPlay and all the wrappers that Apple applies, it is really difficult to do any sort of verification here :(


Would it be possible to download the app from the store, decompile/unbundle it and compare it with your dev version?


Not easily, unfortunately. Accessing the app files themselves is usually not possible on a normal iOS device, and even then they are encrypted with FairPlay DRM (which is easy to reverse, but only on a jailbroken device).
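The comparison being asked about reduces to hashing the build you submitted and the bytes you pull off a device, which is exactly what the FairPlay wrapper defeats. A minimal sketch, using stand-in byte strings rather than real app bundles:

```python
import hashlib

def digest(data: bytes) -> str:
    """SHA-256 hex digest of a binary blob."""
    return hashlib.sha256(data).hexdigest()

# Stand-ins for real app binaries: the build the developer submitted
# versus the bytes dumped from a device.
submitted = b"my app binary v1.0"
downloaded_plain = b"my app binary v1.0"  # what a decrypted dump should yield
downloaded_encrypted = b"\x8f\x02 opaque FairPlay-wrapped bytes"

print(digest(submitted) == digest(downloaded_plain))      # True: verifiable
print(digest(submitted) == digest(downloaded_encrypted))  # False: DRM in the way
```

This is why the Telegram approach mentioned further down still requires a jailbroken device: you need the decrypted executable before a hash comparison means anything.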


Not really, and if I recall correctly, that's the main reason why the App Store is considered incompatible with the GPL.


I believe the main reason that it was incompatible was a EULA that had incompatible sections and was unwaivable, although those parts are gone now so whether it's still incompatible is not clear.


It would still be incompatible with v3, wouldn't it?


Why?


IANAL and I have no involvement in the Apple ecosystem, so take with a grain of salt:

My understanding is that GPLv3 requires that anyone who gets a binary can also get the source to it, and can then build and run that source on the same device. Even if Apple now allows distribution of software that demands to also share its source code, it's my understanding that you can't build that code and run the result on your iPhone without either rebuilding/reinstalling every 7 days or paying Apple. That certainly seems to be against the intention of the license, although I admit it may technically squeak by the exact requirements.

On a different note, https://en.wikipedia.org/wiki/GNU_General_Public_License#Leg... also suggests that there's an issue around a person having a copy of an app not being able to share it with other people.


Yes, it is with a jailbroken device.


Telegram does provide verifiable builds on iOS, but you need a jailbroken device to dump the decrypted executable.

https://core.telegram.org/reproducible-builds#reproducible-b...


I’m starting to question the whole app store distribution mechanism. The web is clearly not the best platform for mobile, but as mobile developers, maybe we should take a stand and only develop for the web, pushing mobile OS manufacturers to improve their platforms for the web instead of having us cooperate in publishing to their own private gardens.


Native apps on these devices are locked down to the point that they don't really provide any advantage over the web (and it's extremely rare that the few advantages they do provide (such as guaranteed caching) are actually used in a helpful way.)


Their ads are much harder to block, and they have better integration with the OS for things like camera, clipboard (even if we have seen there's too much clipboard access), notifications, etc. However, the fact that you can't block in-app ads without a network-level ad blocker means they're probably here to stay.


Are you sure the “web” is not already largely Google’s private garden? They basically control every aspect of the web now. Nothing your webapp does or sees for a large majority of its users is outside of Google’s vision...


If Google has their way, a big chunk of the web will be absorbed by AMP. I don't understand why so many talented people at Google continue to support this kind of takeover. It goes completely against the hacker ethic that built the internet.


>I don't understand why so many talented people at Google continue to support this kind of takeover.

It's easy to talk yourself into a position that boils down to "I trust us."

Not only are people uninterested in designing around that, most people don't think of it as a problem, partially because they don't always engage in systems thinking to that extent, and also because it's grueling to admit that your ability to make decisions should not be trusted.

Sure, it's easy to say "I don't trust myself" when asking someone else for a code review. It's expected that you'll miss some things, and motives are not really in question. Removing the ability to change goals after the fact, mislead, or break promises is different in kind.


Why would you expect Google to behave differently?

Anyone who had an old school hacker ethic retired from Google years ago.


> Anyone who had an old school hacker ethic retired from Google years ago.

Some of the Go developers fall into this category, and they aren't retired yet.

However, they are walled off inside one particular part of Google, and it's a part that is carefully isolated from the kinds of things that raise the issues under discussion.


Yes, it’s true that the faceless corporation would allegedly not care, but there are still human beings working there who would benefit from keeping the open web alive (so would their grandmothers, grandchildren, friends, etc). Google isn’t completely run by Skynet yet.


Sure but google selects against those people.


I would expect someone with principles to behave differently. There certainly are principled people at Google (the recent-ish google walkouts come to mind). Maybe the principles I'm thinking of are no longer valued at Google.


So, where did the old school hackers go to then?

Also, this seems odd. Being inside Google is still the position where you can influence this stuff the most.


They got rich and retired, and were replaced by hungrier people with less of the old school mentality.

As this process continued, the amount of influence the old guard had diminished and the appeal of retiring and working on hobby projects increased.


Linux distros work this way: some group of people (distinct from the software's authors) compile and sign the packages, and distribute them.

I guess in this case the code will not be publicly available, so a regular Joe will not be able to inspect and recompile it.

So rather than app signing, the real issue here is lack of trust for the company.
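The distro flow can be sketched roughly as below. Real distros use asymmetric GPG keys; HMAC stands in here only to keep the sketch stdlib-only, and the key and package contents are made up:

```python
import hashlib
import hmac

# Stand-in for the distro's signing key. In reality this would be an
# asymmetric GPG key pair: the distro keeps the private half, and users'
# package managers ship with the public half.
DISTRO_KEY = b"hypothetical-distro-signing-key"

def sign_package(package: bytes) -> bytes:
    """Distro build infrastructure: compile upstream source, sign the result."""
    return hmac.new(DISTRO_KEY, package, hashlib.sha256).digest()

def verify_package(package: bytes, signature: bytes) -> bool:
    """User's package manager: accept only packages the distro signed."""
    expected = hmac.new(DISTRO_KEY, package, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

pkg = b"compiled contents of foo-1.2.rpm"
sig = sign_package(pkg)
print(verify_package(pkg, sig))                # True: untampered package
print(verify_package(pkg + b"evil", sig))      # False: modified after signing
```

The trust question the parent raises lives entirely in who holds `DISTRO_KEY`: the scheme proves the signer vouched for the bytes, not that the bytes match any publicly auditable source.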


Also, in (most?) Linux distros it's easy to add additional keys to the system's "trusted" list or bypass it completely (AFAIK, `dnf localinstall ./foo.rpm` doesn't care at all about signatures), which makes a significant difference in user freedom IMO.



