
Shouldn't perhaps (with caveats, and therein lies the rub), but is there a reason to believe that they won't?

I'm torn on this. On the one hand, yes, a "regular" user should be using a distro that has a wide array of natively packaged software, and relying on that as much as possible. But not all software is distributed this way.

And many "regular" users will be coming from a Windows background, meaning they're not going to recognize the fact that the site they found when googling for "Install Spotify on Ubuntu" that tells them to open a command prompt and paste this command or download this .deb file is actually malicious.

In practice, they're susceptible to the same kinds of attacks they would be on Windows.



That blame still appropriately lies with maladaptive behaviors learned from Windows. The only way to completely stop users being susceptible to the "attack" of them phrasing their desires as web searches and then blindly following whatever malicious instructions come up is to fully remove administrator privileges and lock them out of "their" computers. But doing this at the level of the OS producer is utterly at odds with the foundation of a free and open society.

The incremental way to solve this problem is through various rules based around users engaging with details of the OS. One very simple one of these is "only install software through the system package manager". If users violate those rules, short of the above "solution", there is literally nothing that can be done to help them.


Yeah, this conversation is borderline philosophical. What does "secure software" mean? As a software engineer I've always thought of secure software as software that does not have bugs that can be exploited by non-authorized users. Be it privilege escalation, code injection, remote code execution, etc.

As an end-user, I choose to use Linux because it does not stand between me and my computer. I am the master of the machine. I tell it what to do, and it obeys. That is the relationship I want to have with a piece of tangible property that I paid money for.

So if I do something unsafe, even through ignorance or naivety, I still see that as being my fault. Not the software's. In other words, the software was behaving as expected. There were no bugs. It did what the authorized user told it to do.

But I can see the point of view that secure software could also mean software that makes it difficult for the authorized user to do dangerous things. Especially in an organization setting where the user is not actually the owner of the machine, but is using company equipment and software.


> That blame still appropriately lies with maladaptive behaviors learned from Windows.

As a Linux nerd who started tinkering around 2001, I’m having a hard time with this framing. Being a Linux user around that era involved regularly compiling code that I practically couldn’t review. While the steps involved were different, the behaviors associated with installing software were not inherently more secure than Windows at the time.

The main differences were: 1) I was more technically savvy and thus not as likely to install something obviously dangerous and 2) One just wasn’t very likely to run into a repo impacted by supply chain compromise or somesuch. Those same behaviors today are pretty risky in comparison, leading to threads like this one.

I’d argue that it’s less about windows being a source of maladaptive behavior, and more about windows attracting users who don’t know better than to behave the way they do. If those same low-tech users were somehow using Linux at the time, I don’t think they’d have learned anything more adaptive, and/or Linux would have been a much larger target, changing the threat landscape entirely.

But to go a step further, I guess the question that comes up for me is: why does it matter if it’s the fault of Windows? The typical “OS war” rhetoric never made much sense to me.

To me, the point of running Linux or really any desktop OS is to allow me to go beyond walled gardens.

My brother is a non-technical artist/creator type. If he relied only on the system package manager, he might as well use an iPad.

What you’re framing here as “violating rules” is really just “using the computer for its intended purpose”.

Following those rules may fix the security problem, in the same way that never leaving my apartment will probably protect me from most seasonal illnesses.

We need better ways to universally indicate trust in distributed software beyond system package managers.


> The main differences were: 1) I was more technically savvy and thus not as likely to install something obviously dangerous and 2) One just wasn’t very likely to run into a repo impacted by supply chain compromise or somesuch. Those same behaviors today are pretty risky in comparison, leading to threads like this one.

Yes, and I would add 3) there was no meaningful malware made for Linux back then, because it wasn't a target for malware authors. Otherwise you (and I!) would have been hit. I didn't understand half the stuff I installed on my Linux box back then, and many install instructions were positively arcane.

> To me, the point of running Linux or really any desktop OS is to allow me to go beyond walled gardens. My brother is a non-technical artist/creator type. If he relied only on the system package manager, he might as well use an iPad.

Fully agreed! Furthermore, anyone using Linux back then would have found your opinion entirely uncontroversial. The point of Linux back then was freedom to do whatever. Gardens -- even non-walled gardens -- were not the point.


Sure, if you turned the 90's/2000's herd of Windows users loose on Linux, you'd likely run into many of the same issues. But that's still modulo the needed software being available in a package manager, or at least having the higher barrier to entry of building from source, rather than needing to use sketchy warez or "legitimate" download sites that bundled in less-severe malware.

But the real condemnation of Windows isn't just an "OS war", but rather the fight is about the virtue of understanding the software in front of you. This is critical to security - not in the small (the idea that an individual can audit every line of code they run is a fallacious straw man), but rather on the large scale. Having published source code allows the building of a distributed consensus that something can be trusted, not beholden to the existing power structure, whereas without that you're stuck trusting the word of a single company and/or their auditors.

Like today we see malware continually being bundled with proprietary software, so often that it's considered banal and just how things are (eg web surveillance). Yet in Libre land when something like this happens it's still seen as an exceptional occurrence that needs to be stomped down hard. That directly follows from the underlying attitudes of unilateral "trust us" versus community consensus.

> We need better ways to universally indicate trust in distributed software beyond system package managers.

I'd love to hear you expound on this. The only two solutions I can see are trusting and sandboxing. Sandboxing and capability security were a radical pie-in-the-sky idea two decades ago, but now we've got two extremely popular implementations - web JavaScript and Android. So it feels a lot less like a lofty perfect solution, despite the current failings (both of those implementations have shirked being secure against many types of attacks!).

Is there some third way, especially that isn't just a combination of the two basic existing approaches? I'd love to hear one!


What about things that are not in their package manager, like most games?

"Only download through this walled garden [Steam, GOG Galaxy, etc]"? So walled gardens are the answer?


Gardens are the solution, but people shouldn't be locked into any garden against their will. Users should be free to choose the garden they prefer any time they wish, or to start their own garden and invite others to visit it.

I choose the F-droid garden and the OpenSUSE garden. Other people may prefer other gardens, and they should be free to choose the ones they prefer as I am free to choose mine.

When people criticize walled gardens, it's because the wall is like the Berlin Wall: a wall designed to keep people in against their will.


> When people criticize walled gardens, it's because the wall is like the Berlin Wall; a wall designed to keep people in against their will.

Fair enough. You are right there.

But in essence, it's not that Linux is "safer" than Windows against malware. It's that it's a nerdier culture with different practices that don't translate well to the mainstream. Like user kbenson above who suggested "reviewing the installer"... I hope we all agree that's ridiculous, right?


That is not what I said, there was a conditional on that sentence. The real point I was making was the last paragraph of what I posted. Don't use that method if you care about security, it's never, ever been the sole option in any case I've seen which wasn't meant to be nefarious (at a minimum, there should be directions to manually do step by step what the script automates).

I think the fact that everyone jumped on the "review what you run" part, because they weren't paying attention to what I actually said and it looked similar to other arguments that come up, has less to do with what I said and more to do with people wanting to have that argument yet again.

The bottom line is that Linux is no different than Windows in this respect (look at the Deno install directions if you want to see the PowerShell equivalent of curl piped to bash), and this is more a matter of the developer communities being okay with this method, and promoting it regardless of OS. In that respect, saying "Linux security best practices of curl | bash" was just inflammatory and wrong, and deserved to be called out as such. It's not only a Linux thing, it's not anything like a best practice (it's a convenience method provided that trades away security), and as such the statement is just plain wrong.


Ok, I'll readily admit I misunderstood that part of your comment then. I stand corrected.

So let me rephrase what I think is the key point here:

Linux is only "safer" than Windows because it has fewer users and they tend to be more technically minded.

However, were Linux to somehow become as mainstream a desktop OS as Windows, two things would happen:

- The userbase would become less technically minded and security-aware. I don't want to call them "stupid", though; they likely know other things instead. I can't drive a car, for example; am I stupid?

- It would become a juicier target for malware creators, and therefore malware would be as widespread as in Windows-land.

There's nothing magical in Linux that would protect a large and careless enough userbase.


I would say Linux and Windows are likely roughly equivalent in almost all safety concerns for regular users (if we normalize for how good a target they are which makes Windows get targeted more in absolute terms, and ignore technical merits of the kernel and secure access/ACL systems, which I think aren't really what this discussion is about).

Linux might have a slight edge in that the main way of getting software, the base OS repos and package manager, is about as trustworthy as you can get (if you trust the OS enough to run it in the first place), and distros generally package, build, and vet far more software themselves than Windows does, which means the regular user doesn't have to lower that trust as often.

Windows is getting closer, in that they have their store now and you can even use winget to install things from it, but those things aren't packaged by MS (even if they might be vetted to some degree), so it's not quite the same. In some aspects it's better and in some worse, actually (there's a benefit to the OS maintainers tracking and building stuff themselves).

Beyond the main OS software, it gets into trust relationships and quickly becomes the exact same regardless of OS, as long as you're allowed to install arbitrary software. This is as opposed to a walled garden, which is explicitly trading away convenience (of one type) and choice for security (and uniformity, but that's not a given), in the same way but the opposite direction as curl piping a script to bash which is trading away security for convenience. I wrote elsewhere in this thread (multiple times, to various degrees) how the trust relationship to the source is the real question, and not really different in the desktop OS cases (in case I wasn't clear and you want more words on it to clarify my opinion).

> There's nothing magical in Linux that would protect a large and careless enough userbase.

Nope, there isn't, just as there isn't in Windows or Mac OS (yet, but they're both heading that direction to some degree with their stores).


Yes, gardens are one answer, and likely the best one currently, for non-webapps. Distro package repositories themselves are the original gardens. People tend to give them a pass because having good incentives have kept them decently honest, but distro package repositories are fundamentally gardens.

Gardens allow you to make a small number of trust decisions, and then trust all the software they have vetted by extension.

Note that I'm leaving out "walled" because multiple software sources can coexist. "Walled" only comes about when some company tries to constrain you to their singular source.


If someone gives you a guarantee of safety, you get to blame them when things go wrong. If you demand to strike out on your own, you have no one to blame but yourself. And you should honestly be proud of taking the risk; it's literally the only reason to use all this proud, evocative language about being trapped and needing to be free.

You want to be cutting edge, but not get cut.


Wait. Linux users "strike on their own" all the time!

Who here is a Linux user and never downloaded stuff outside the repo, or compiled sources and ran them without reviewing every security loophole? Linux users are the most "demand-ey" of users, even starting flamewars over being forced to do things this way or that way!

I'm really skeptical that this wouldn't introduce malware if malware authors deemed Linux a worthy target.


Just gardens. Package repositories are just that, you can pick whatever you want.

Games are a bit of a special case, as they don't exactly play nice with Linux, and many of them also run through compatibility layers like Proton.


But that's just it. Games downloaded from dubious sources are one of the primary infection vectors. It doesn't happen much on Linux because there aren't enough Linux users to make it a worthwhile target for malware authors!


For this and other reasons (including compatibility issues), when I want to download a game, I will prefer a version for DOS, NES/Famicom, Game Boy Advance, etc, instead of native code on Linux or Windows. (I also prefer FOSS when available, especially for native code; I will not run any non-FOSS program on my computer that is not running as native code.) (And, in some cases, the game might follow rules that I can just reimplement myself instead. For some types of puzzle games, it is relatively easy to reimplement it in Free Hero Mesh, so I am glad I wrote that program. Other times, other reimplementations can be written.)


<citation needed>

Most people don't go on random torrent sites to get their games.

And most people that do also don't get infected because usually the top rated torrent is just fine. I pirated a lot as a dumb kid without income and haven't managed to catch anything at least.


> That blame still appropriately lies with maladaptive behaviors learned from Windows. The only way to completely stop users being susceptible to the "attack" of them phrasing their desires as web searches and then blindly following whatever malicious instructions come up is to fully remove administrator privileges and lock them out of "their" computers.

And making it a class at school. We have universal education in most places, we can use it for something useful. There's no reason that we have to capitulate to corporations and their moats. We can teach children how the devices that surround them and order them around work, and how to deal with the predators that they'll encounter while interacting with them.


The way of solving it would be streamlining adding new repositories for the 3rd party stuff.

Way too often it's "download some dumbass script running some half-assed autodetection just to add a line of text to a config and a GPG key."
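For comparison, here's what the non-script version of that looks like on a Debian/Ubuntu-style system: just the key and the one line of config. The repo and key URLs, keyring name, and package name are all placeholders for whatever third-party repo you're actually adding.

```shell
# Fetch the publisher's signing key into its own keyring file,
# rather than the shared system-wide trusted keyring.
sudo install -d /etc/apt/keyrings
curl -fsSL https://repo.example.com/key.asc \
  | gpg --dearmor \
  | sudo tee /etc/apt/keyrings/example.gpg >/dev/null

# The "line of text in a config": the repo URL plus the only key
# allowed to sign packages coming from it.
echo "deb [signed-by=/etc/apt/keyrings/example.gpg] https://repo.example.com stable main" \
  | sudo tee /etc/apt/sources.list.d/example.list

sudo apt update && sudo apt install example-package
```

After this, packages from that repo are verified against that one key on every update, which is exactly the property the autodetection scripts obscure.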


I would argue that to make a distro that has a wide array of natively packaged, up-to-date software in a world of significant (let's say 25% and above) desktop Linux use, it would be necessary to discard the level of curation that currently provides the security such distros uphold.

That is to say, if desktop Linux use became significant enough to challenge Windows and macOS, then distros would either lag with out-of-date packages, not cover widely requested packages, or lose security-through-curation. Any of these would likely lead to users downloading packages to install manually.

I say this as someone that has been running desktop linux on and off since 1992; and I wholeheartedly believe that there's no inherent reason linux desktop use cannot be popular.


It's easy: Windows insecurities are the fault of Windows. Similar Linux insecurities are the fault of Linux users.

That's a general theme in OS wars, Windows users are eager to criticize Windows, while Linux and macOS users are eager to defend their OS.


In the specific case of running something blind with curl| bash instead of using a package manager or using something that maintains a degree of isolation (flatpak, docker containers, ...), yes it's the fault of users.

I don't criticize Windows when a user installs malware and clicks "Yes" when prompted to allow the app to make changes to the computer. In the same way, I don't criticize Linux distros for allowing users to install something by piping curl directly into bash without checking. I appreciate not having an OS that's a walled garden, unlike what happens with phones, but that does put some responsibility on users.

For a real-world example: if someone gets an STD from having unprotected sex, the blame lies on them for doing so.


A package manager is less safe than curl|bash: they do the same thing, but the package manager does it with root privileges, while curl|bash can run in a user context.


> Package manager is less safe than curl|bash, they do the same thing

This is so far from true, I wish this misconception didn't exist.

First, a package in a distro repo is maintained by known people, the package maintainers. Some do an exceptional job, some put in less effort, but at the very least there is some oversight on what gets into an official distro package. And there is an audit trail of who approved what and when.

Distro packages are signed and verified by the tooling, so inserting a malicious package is much more difficult.

Distro packages are also versioned and prior versions remain available, so if there is an issue we can trace who/when/why it was introduced and when it was fixed.

None of these protections exist when you just curl some random executable off some third-party website and hope for the best.

Realize that the website might return a different executable every time. Even if you & I curl it within seconds, we might be served different content. It might return malware in a tiny percentage of cases to avoid detection. It may return code without any versioning, or code that claims a constant version even though the content changes. It is impossible to have reproducible installs when you just retrieve something via curl without any validation or versioning.

(You could, of course, curl it to a file and then compute a hash on the file, assign it a version number, store and look up known hash/versions and so on... but now you're down the path of building an adhoc package manager to resolve the problems with curl|bash. So... might as well use the mature package manager your distro already has because it already solved these problems.)
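The download-then-hash step in that parenthetical is easy to sketch. The URL and filenames here are placeholders; the point is that the content you reviewed is the content you run, now and on the next machine.

```shell
# Instead of `curl ... | bash`, download to a file first.
curl -fsSL https://example.com/install.sh -o install.sh

# Actually read what you're about to run.
less install.sh

# Pin the exact content you reviewed by recording its hash...
sha256sum install.sh > install.sh.sha256

# ...and re-verify before running, so a silently swapped file
# on the server fails loudly here instead of executing.
sha256sum -c install.sh.sha256 && bash install.sh
```

This is the "ad hoc package manager" in miniature: it gets you integrity pinning, but none of the maintainer vetting, versioning, or audit trail a real distro repo provides.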

> while curl|bash can run in user context

No, it runs as whatever user you run it as. You can find plenty of websites saying their installer needs to be run as root.


Package manager != distro repo.


That only makes sense if you don't understand all the things package managers generally do or why curl|bash is as bad as it is, and specifically why it's even worse than 'curl > file.sh; . file.sh'

- Package managers generally sign their packages, and provide ways to prove the integrity of the downloaded packages, so even if someone hijacks your DNS or exploits the box you're downloading your packages from and injects a malicious one, it will be rejected on your side. Yum/dnf do this with GPG, for example.

- Package managers usually track what was done and what files were put where, so there's an easy way to see what was installed and clean it up (which may also be automated). This isn't perfect, as the packages can run scripts as part of install usually, but it is helpful.

- Packages from package managers are vetted by the team that puts them out, and depending on the system built by that same team. For stores, often it's just vetted in some way and the submitter builds it, but package systems are almost always vouching for the packages you get. In the case of the OS package manager, you obviously trust them if you're running the OS, otherwise this discussion has no meaning.

- Whether things are installed as root or as a user matters (commonly the latter, though you can curl|bash as root too, and many instructions prefix it with sudo), but I think it's usually more important how trusted the sources are. The OS packagers, or a well-trusted group or company with something to lose, are more likely to have put systems in place to ensure safety (and to protect themselves from problems if they are hacked). See above. Regardless of whether they're run as root or not, I think packages are somewhat safer and more trustworthy, but this is a personal choice, since it's largely based on your own trust of the sources and the systems used in the chain of getting the software to you.

Ultimately, it's less an OS issue and more an issue of what the individual is comfortable with, which spans operating systems. People do stuff just as unsafe as curl|bash on windows all the time too.


>Package managers generally sign their packages

At least on Ubuntu and CentOS I had no problem installing unsigned packages, no warnings, nothing; it's allowed by design. Try it yourself.

>Packages from package managers are vetted by the team that puts them out

It's called distro repo, package managers don't contain packages.

>In the case of the OS package manager, you obviously trust them if you're running the OS, otherwise this discussion has no meaning.

This has nothing to do with security, you can trust your system to run malware correctly too, just like any system does it.


> At least on ubuntu and centos I had no problem installing unsigned packages, no warnings, nothing, it's allowed by design. Try it yourself.

Manually, through yum localinstall, by referencing a local RPM with a manual rpm command, or via an unsigned package in the remote repos? I've seen repos configured with signing keys have problems where the install command fails, so I assume you mean manually.

I think what you're seeing is that the package manager utilities (yum/dnf/apt/whatever) are all capable of verification, but are also happy to install things without verification in many cases. But if the RPM you downloaded is signed and RPM has had signatures loaded into it and there's a mismatch between what the rpm utility knows about and what the RPM you're installing is signed with, rpm will complain very loudly and fail (I have had to add --nosignature to rpm commands in some cases and import keys into rpm in others).

In addition to the RPM level, the repos themselves often indicate a GPG key that packages are signed with, which the system package maintainers (which is what I was somewhat ambiguously referring to in my prior comment as "package managers", meaning the managers of the system's packages) use to sign all packages they publish, so the integrity of updates and additional software they provide can be confirmed.

Given that, I'm not sure how you can maintain that the package management utils on systems do the same thing as piping arbitrary internet content to a bash prompt. I think you were just possibly a bit mistaken about what the package management utilities are really doing and enforcing with their signatures.
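For anyone wanting to see this enforcement themselves, the RPM-level checks described above look roughly like the following. The key URL and package filename are placeholders.

```shell
# Import the publisher's signing key; rpm can only verify
# signatures from keys it has been told about.
sudo rpm --import https://repo.example.com/RPM-GPG-KEY-example

# Verify the signature on a downloaded package before installing.
# This fails loudly if the signature doesn't match a known key.
rpm --checksig example-1.0-1.x86_64.rpm

# Inspect which key a local package was signed with.
rpm -qpi example-1.0-1.x86_64.rpm | grep -i signature
```

With a repo configured with gpgcheck enabled, yum/dnf runs the equivalent of `--checksig` automatically on everything it downloads, which is the behavior curl|bash has no counterpart for.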

> It's called distro repo, package managers don't contain packages.

It's called the packages the OS provides. It's called many things. My terminology was somewhat ambiguous.

> This has nothing to do with security, you can trust your system to run malware correctly too, just like any system does it.

Sure it does. If you trust your system to run malware from the OS providers, then you don't have to care what other software you're downloading and running, you've already set your trust level of the system to "none".

If you do trust the OS provider (whether that be a Linux distro, MS for Windows, or Apple for macOS) to not be malicious (and please, let's forestall any digression into privacy; we're talking about malicious intent not allowed by EULAs), then you should trust the other software they provide that's verifiably from them.


I disagree. With the package manager you know that it's been vetted by the distribution maintainers: they ensure that when you install something through the package manager, the package comes from them and the binaries do not change. The package maintainers will also remove packages that have major security issues, and the package manager tracks the apps installed and automatically updates them if there's a security fix.

Curl | bash usually runs in a user context (although I've seen curl | sudo bash in the wild), but there are also concerns about vetting who wrote the instructions, whether they actually point to the app or to a modified version containing malware, etc...


It's called a distro repo; package managers don't contain packages and can be used to install packages from any source. Some version-sensitive programs are distributed as standalone packages, because the distro repo moves slowly.



