I have been using Debian GNU/Linux since 2006. Debian 3.1 Sarge was the first version I used. The primary reason for choosing Debian back then was the large number of packages it had in its package repositories. Its exceptional stability came as a bonus.
Over these 17 years, I have used Debian for frontend servers, backend servers, desktop computers, personal laptops, personal virtual machines, team virtual machines, etc. I run my static HTML websites, Common Lisp web applications, ZNC IRC network bouncer, Exim4 MTA, Matrix-IRC bridging clients, etc. all on Debian.
I also spin up fresh new Debian VMs to serve as clean rooms for testing my open source projects and technical guides. For example, when I published my "Lisp in Vim" article, Debian turned out to be a nice system to quickly install all the necessary packages and test out all the steps in my article thoroughly. When I write a shell script to be used across a variety of shells, I run my test suite for the script on bash, ksh, zsh, dash, posh, and yash, all of which can be easily installed on Debian.
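A minimal sketch of that multi-shell workflow (the test script path here is a placeholder, not my actual project layout):

    # Install the shells under test; all of them are regular Debian packages.
    sudo apt-get install bash ksh zsh dash posh yash

    # Run the same test suite under each shell and stop on the first failure.
    for sh in bash ksh zsh dash posh yash; do
        echo "=== $sh ==="
        "$sh" ./run-tests.sh || exit 1
    done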
Debian has served me very well for a wide variety of activities ranging from things like software development and testing to personal digital chores like writing documents, presentations, splitting and merging PDFs, editing audio/video files, document format conversions, editing photographs, etc. and even having fun, say by raytracing with POV-Ray, running vintage DOS games on DOSBox, or recording live music from my digital piano. No matter what the nature of the problem is, the solution often begins with a simple "apt-get install" command.
In all these years, I have never ever faced any problem with dependency management or package installation in the stable branch. Yes, the packages can sometimes begin to get old by a few years but the whole system remains outstandingly stable. It is a great distribution for long running servers. Debian is also quite good for a lean personal computing environment. In my opinion, the Debian stable version sets the gold standard for stability in the world of Linux distributions.
Same... Debian and derivatives everywhere, including Raspbian (now Raspberry Pi OS, I think) on an army of Pis, "RasPBX" back in the day (running Asterisk + FreePBX to power Cisco VoIP phones from a Raspberry Pi 1!), and even Devuan (a Debian fork without systemd) on two machines. Well, since the nineties and Debian 1.1...
> Yes, the packages can sometimes begin to get old by a few years but the whole system remains outstandingly stable.
I then compile from source. I do it notably for Emacs, and at times I've recompiled the kernel (for example, at one point I was running a kernel with a patched MagicSysRq).
Nowadays I like to use the stock, signed, Debian kernel: even if it's not a full UKI yet I like to see "SecureBoot on" (I know it's a touchy subject and I know it's not a panacea and I know I should really run a signed UKI but I haven't set that up... yet!).
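(For anyone wanting to check the same on their own box, it's a one-liner, assuming the mokutil package is installed:)

    # Reports "SecureBoot enabled" or "SecureBoot disabled" from the EFI variables.
    mokutil --sb-state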
I don't bother to "dist-upgrade", except to go from the "hard freeze" to the stable one. Typically I re-install from scratch when a new release comes out. YMMV.
They don't have the same guarantees as a real rolling release distro. My experience of SID is things randomly breaking and having to be very careful whenever you update. And because it is not a "production" intended rolling release, they don't have the same guarantees for security fixes either - they can be quite unhurried.
Many DDs are running unstable, so it should be non-breaking enough for most people. The most common breakages are file conflicts. It's a shame we haven't fixed such issues yet, as the problem is a bit difficult to solve, but adding testing as an additional source and rolling back the problematic package should fix it (see the sketch at the end of this comment). I don't remember any other kind of breakage (other than upstream regressions, but this is a rolling release) for the past five years. If you tried unstable 10 years ago, it is now vastly different.
On the other hand, I remember Arch's transition from readline 6 to 7 (2016) making the system unbootable if you upgraded at the wrong moment. I don't know if packages containing libraries are now versioned to avoid such issues.
I think the main difference you get by using Arch is the AUR (we don't have that in Debian) and, I think, fresher packages (mainly because it is easier to package for Arch than for Debian).
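To make the rollback idea above concrete, a sketch (package name hypothetical): with testing in sources.list, apt can pull a broken package back to the version that already cleared testing.

    # Downgrade the problematic package to whatever version testing carries.
    sudo apt install somepackage/testing

    # Or pin an exact known-good version explicitly.
    sudo apt install somepackage=1.2.3-1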
> They don't have the same guarantees as a real rolling release distro.
I can guarantee my distro will bring world peace, help you lose ten pounds and get 10% better gas mileage.
Unlike rolling release distros, Debian doesn't try to pretend that its rolling branch is reliable - it's exactly what it says on the tin. Both are probably equal in terms of reliability, but Debian is actually honest about what that reliability level is.
IME Sid mostly causes issues if you try to cherry-pick programs or libraries from it, because you get into dependency hell. I have successfully done it with aptitude's (not apt's) package solver, but it's a pain. If you run full Sid for everything, ironically, it's more stable.
That said, on my servers I just run stable and then compile anything I want that differs, making sure it doesn't clobber the system's version - like altinstall for newer Python versions. I've yet to have an issue this way.
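A rough sketch of that approach for Python (the version number is just an example; a real build also wants a pile of -dev libraries for optional modules):

    sudo apt install build-essential    # compiler toolchain; add -dev libs as needed
    wget https://www.python.org/ftp/python/3.12.0/Python-3.12.0.tgz
    tar xf Python-3.12.0.tgz && cd Python-3.12.0
    ./configure --prefix=/usr/local
    make -j"$(nproc)"
    # altinstall installs python3.12 but deliberately skips the "python3" symlink,
    # so the distro's own interpreter is never clobbered.
    sudo make altinstall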
> IME Sid mostly causes issues if you try to cherry-pick programs or libraries from it
Instead of doing this, create a Debian Stable (or Testing, if that's what you're running) repository on OBS¹. Fork the packages you want from Debian Unstable into your repo, and let them build against your preferred Debian release. If some of them need newer deps, pull in their deps, too. Add your repository to your system and upgrade that way.
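Consuming the result on the Debian side then looks roughly like this (project name and URL purely illustrative; OBS publishes per-distro apt repos, and you'd also need to import the project's signing key):

    # Hypothetical OBS home project built against Debian 12.
    echo 'deb http://download.opensuse.org/repositories/home:/youruser/Debian_12/ /' \
        | sudo tee /etc/apt/sources.list.d/obs-backports.list
    sudo apt update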
Or if you're not that attached to only using DEBs, just use Nix or Guix to install newer software.
I moved to Linux last year (my T430s was unbearable with Windows, but turns out it's still great with Linux). I briefly tried some Arch distros (which felt like a time black hole I don't care for) and Ubuntu before settling on Debian. Partly because it seems like a pretty "pure"/minimalistic OS, partly because I figured it's a good OS for servers too so I might as well get familiar with it, and partly because I wanted to run i3 and so wanted to avoid the "bloat" from Ubuntu (which I now realize is negligible).
While it's been serving me well, it still feels quirky, and a lot of hardware stuff just didn't work out of the box, so I've accumulated a fair few hacks, and I dread having to butt my head against those again when this laptop inevitably craps out. Sometimes it's just finding the package you need, but I also had to do a lot of googling and change specific lines in specific config files for a bunch of things: my volume buttons, mute button, Bluetooth, Bluetooth devices, recorder, brightness buttons, and others I forget. My volume buttons still don't work through a Bluetooth device, but I'm just putting up with it because I couldn't find an answer fast enough, and on a new connection I have to explicitly go into pavucontrol (or whichever it was; "firefox no sound bluetooth debian" is seared into my search history). I never managed to get Steam for Linux working, and when I try again it seems to clash with the remnants of some old attempt.
Maybe I just don't have enough experience and it would be the same on any Debian or Linux distro but AFAIK Ubuntu does much better at handling drivers out of the box.
I like the simplicity of it at face value, but I'm not sure why I wouldn't go with Ubuntu next time to avoid the hassles, or whether I'm just being stubborn by not going for Windows with WSL (assuming I don't buy another old horse). Worst of all, I still need to run a Windows VM when I'm compiling something to .exe or using some Windows-only software.
Debian in my opinion is best on servers which will have uptimes of months or even years on wired connections.
In such setups, the hardware driver quirks were most probably solved decades ago and as a bonus you don't have to babysit the server. Just install the OS, forget about it and focus on whatever is really bringing you value.
What drove you away from Arch Linux on your laptop?
Can't say I was driven away from Arch, as I just tried Manjaro for a bit while figuring out which distro I wanted to stick with. But it seemed to me that it required more googling to do just about anything, and since a lot of Linux programs are made with Ubuntu in mind, it seemed more straightforward to get going with a Debian-based distro. I might have misread the situation since I had zero Linux experience, but I figured sticking with Debian would minimize how much time I'd have to spend setting up and fixing my system. I think there was something in particular I was struggling to get working, but my memory is hazy on that.
Probably gonna try again some time but for now I try to think as little as possible about my OS as long as I can code on it.
Most of the info you need is on wiki.archlinux.org, which is so good that users of other distros prefer it to other sources of Linux info. (I'm a Fedora user, but refer to it so often that I have a "shortcut" in my search engine for "site:wiki.archlinux.org", and others on this site have praised it.)
I was on Debian years ago, now on Ubuntu, switching back as soon as I find some time. The direction Ubuntu is taking (especially snap) is simply not something I want to experience. Besides, not much (or any) added value I can see.
Debian is a boring system in the best sense of the word. I've only become a more prolific Debian user in the last ten years or so, but in that time frame, I've found Debian to Just Work(tm) and to reliably keep on working as long as the hardware isn't misbehaving. It's the type of thing we only tend to notice when it doesn't work that way, but it's an impressive achievement.
For a category of software as "boring" as an OS is, I really agree that "just working" is the ultimate, almost sacred quality. I simply can't stand anymore the incessant ritual of software updates, and even the rhythm of something cutting edge like VS Code, on my Ubuntu LTS, is slightly too frequent for my taste.
No you don't, or at least I don't. I need secure software. Frequent security upgrades are a good indicator that the vendor didn't do a good job at the start. Once in a while is fine, and Debian supports a special channel for security upgrades for this very purpose.
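For the record, that channel is just another line in sources.list; the bookworm version looks like this:

    # Dedicated security archive; fixes land here independently of point releases.
    deb http://security.debian.org/debian-security bookworm-security main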
I do agree that with perfect software, written by perfect developers, updates aren't needed. I mean, has anyone ever had to "update" the Mona Lisa? I hold my web browser to the same standard.
I don't get the contradiction. Are you implying that security updates don't 'just work', or that Debian doesn't push security updates?
The kind of 'just works' that most people refer to is that they turn their computer on and start working, rather than turning their computer on, being forced to log in to a net account, being forced into a tutorial dialogue for a newly added feature, and then realizing that BigCo deprecated your file format support in the last forced update because they replaced it with TheNewWay.
I recently bought a NUC (well, two, but only because they're cute) and I needed a boring OS to run my Harbormaster containers on, so I installed Debian.
It's been perfect in the few days I've had it, because I don't have to dodge Ubuntu's ads left and right, or worry about what the hell snaps are doing. Otherwise, it's basically the same as Ubuntu. I very much recommend it.
There's a bunch of stuff in the MOTD every time you log in, and then apt keeps telling you of all the nice security upgrades you could be enjoying if you were paying them, in a very "nice system you've got there, shame if anything happened to it" way.
I remember the days of dependency hell in the early 00’s. I was using Red Hat at the time, and flipping through Linux magazines at a newsstand regularly to keep up on developments. Although things improved gradually on both sides, Debian won the race to effectively solve dependency hell and I rushed to download 9(?) CD-ROM images over dial-up and burn them.
I've recently noticed a new market which Debian is dominating: "official" Docker container images. If you "docker run golang" or "docker run ruby" or "docker run python" (or many of the images tagged as docker official images) then you're running Debian.
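Easy to check from the images themselves:

    # Each of these prints a Debian /etc/os-release.
    docker run --rm golang cat /etc/os-release
    docker run --rm ruby   cat /etc/os-release
    docker run --rm python cat /etc/os-release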
I've had several issues with Ubuntu on servers and Raspberry Pis, so recently I've decided to switch (back) to Debian.
In particular, Ubuntu snapd decided to fill my /tmp folder with gigabytes of data for no apparent reason and that made me 'snap' as well.
I also dislike the new Ubuntu auto-install installer[0], which is an absurdly complex kludge that doesn't provide me any benefit over the old and trusted Debian installer.
Android is by far the most commercially successful piece of software to come out of the Linux kernel (granted, pretty far detached at this point).
If that’s not pure enough for you, I have to think that Valve’s Arch-based SteamOS drives more revenue than Canonical’s support contracts.
Estimates point to around 3 million Steam Deck sales and then the software revenue on top of it, which should amount to at least 10x Canonical’s revenue.
Then you look at the elephants in the room with enterprise Linux: I have to think that RHEL pulls in more support contracts than Canonical, Amazon Linux similarly has a place as a default option for Linux in AWS, and you’ve also got Oracle in the mix in terms of not-Debian enterprise distributions with widespread adoption.
I wonder if it was influenced by the fact that apt-get was much more mature than yum/whatever at the beginning of the mass broadband age. Before that, Red Hat and RPM had almost pushed Debian aside.
It might be Debian's slow release cycle. The people who comment and review love new things, because new things give them something to comment on and review. But that creates a massive bias against the silent majority, who dislike upgrades and don't want to think about the OS.
Compare Debian to Fedora. Fedora has multiple releases per year; with Debian you have the protection of being in the herd for a good 2 years. Each release is a much more known quantity, and it is easier to find bugs and documentation for whatever it is that got broken in Bookworm. Investing in workarounds makes more sense because it won't be fixed in 6 months. Etc.
It's the only software project I know of where moving slowly seems likely to be a positive. But slow seems to work well for Debian.
As for Red Hat, it seems a bit dodgy to get involved with a company as an unpaid user. The incentives are never going to line up. Love their work, I just wouldn't use their system if I wasn't paying them.
EDIT Also, woah! Anticipating something like this sort of manoeuvre is a good reason to have avoided RHEL. Corporations. Not long term bastions of freedom.
"However, in 2023, Red Hat decided to stop making the source code of Red Hat Enterprise Linux available to the public. The code is still available to Red Hat customers, as well as developers using free accounts, though under conditions that forbid redistribution of the source code." - https://en.wikipedia.org/wiki/Red_Hat_Enterprise_Linux
> "However, in 2023, Red Hat decided to stop making the source code of Red Hat Enterprise Linux available to the public"
That's not actually true, it's just not being provided in such a way as to be trivially rebuildable. CentOS Stream is the upstream for RHEL and as such you can reconstruct RHEL from CentOS Stream [1], it just requires more effort to track the correct package versions.
[1] yes, I'm aware that has broken on occasion, almost exclusively for EL8 where the upstream / downstream process wasn't fully developed yet
Amen. An OS should get out of the way of the real work as much as possible. Any time spent on the OS itself by an end user is essentially wasted time that could have gone into writing or using a better application. An OS is essentially a delivery mechanism, not a goal in its own right.
> As for Red Hat, it seems a bit dodgy to get involved with a company as an unpaid user. The incentives are never going to line up. Love their work, I just wouldn't use their system if I wasn't paying them.
As a Fedora user, I like that someone pays for Fedora development, which (hopefully?) enforces standards in terms of code quality and QA, but also things like workplace policies etc. (e.g. I couldn't use software whose staff face sexual assault allegations).
I think the only non-corporate distro I'd use is Arch (which I did for a while, but Fedora is just so easy to use!)
In the early 2000s, several things happened at around the same time.
* Red Hat Linux (previously the go-to desktop Linux) was discontinued in favour of RHEL/Fedora in ~2003.
* Ubuntu was first released in 2004.
* Around 2000 a new release would come as a CD image and the normal way of upgrading was to wipe out everything except your home directory. This meant trying out a new distro was pretty much as easy as upgrading your existing one.
* As you say, although Yum was released in 2002, apt-get had been around since 1998, and so at that time was more mature.
* Wifi became increasingly common, and almost always needed proprietary binary blobs. Proprietary ATI and NVIDIA Linux drivers became available. And of course MP3 was widespread.
Ubuntu arrived at an opportune moment - former Red Hat users looking for a new home, users used to wiping their system and installing a new distro, and clear benefits if wifi, graphics, and MP3 all work out of the box.
In my circles, and for a long time, "Ubuntu" meant "I can't install Debian". Because Debian used a text based install that required a bit more knowledge (and reading), and Ubuntu was more user friendly at the time.
Such a weird flex. Debian sure wasn't Slackware, it was one of the easiest distros to install and manage. It took years until commercial distros like RedHat could catch up.
I remember my first time trying to install Debian, being a total noob to the whole thing. I thought parts of the installer were broken, because I couldn't toggle TUI checkboxes. Every time I tried, the installer would instead jump to the next screen.
Years later, I figured out that you're supposed to use the spacebar to select/toggle UI elements, not the Enter key. Nobody told me that, and the text-mode installer certainly doesn't tell you!
Ubuntu 11.04, on the other hand, was no trouble to install.
I think that it's because it's boring, which in terms of economics translates directly into savings and safety. Look at Red Hat: it has some really nice tools and frankly always had, but if you had made it the backbone of your platform engineering team for the previous 30 years, you would have needed to reschool/retrain your engineers 3 or 4 times as the toolchain changed drastically. Which is obviously not as dramatic as it may sound, but compare it to Debian, where things operate pretty much how they always have.
For some organizations there is benefit to continuously improving their platform engineering, but for many, technology is a tool that's required to do something else. They don't want to keep up with changes, and sometimes the changes aren't useful for them. On Hacker News most of the content and users are technology focused, but in the real world it's the opposite. There is a reason why many enterprise organizations still run stuff on mainframes, as an example. It's not because they want to; many have even spent hundreds of millions trying to exit. But the thing that matters most to them is stability for their core business.

Now, I don't want you to get the impression that I'm drawing a comparison between Debian and mainframes; my point is simply that to many organizations, a stable platform that supports the core purpose of the organization is more important than anything else. Debian provides that.
I think it was more that Ubuntu was easier to use on the desktop, so developers would use that locally or in VMs and RedHat/CentOS on the server. But inevitably, they would also then push to deploy on Ubuntu servers to reduce friction. This was true for me, I hated dealing with CentOS because of the minor incompatibilities with Ubuntu, so whenever I had the chance, I pushed/chose Ubuntu on the servers. So little by little, you hook a generation on Ubuntu.
It's a bit like the old Photoshop strategy, you let them pirate the software when young to hook them up, and then they would demand it professionally.
Never seriously used Linux desktop, but across several versions of RHEL, packages were so much older than Ubuntu's that it was pretty impractical to use CentOS. Yet people who didn't know a thing about servers always chose CentOS, because when they googled around, newbie blogs tended to use CentOS for no reason other than Red Hat probably being more recognized as a vendor name.
Ubuntu was always easier on the server, with the root user disabled by default, and unattended-upgrades did its job without hassle, with newer packages. (It made quite a difference whether that meant PHP 5 or 7.)
Here in France, established companies like dealing with big established providers and feel happy with licence contracts. Plus, a version lasts at least five years (to compete, Ubuntu had to come up with a four-year LTS...). So when they have to use a GNU/Linux distribution, they tend to choose RHEL. (Novell could have been a good choice, but for some reason they're not well known.) Many countries in Europe and in French-speaking Africa also love RHEL.
Consequence: when you couldn't subscribe to RHEL, you picked CentOS.
I started my Linux journey (having previously been a user on SunOS) on an IBM ThinkPad 755CX with a Pentium-S. I installed Slackware from floppies; it took literally days to compile a kernel or a large program. Eventually I was using `slapt-get`, found that Ubuntu could have KDE installed, thought I'd had enough of configure, make, make install (or `checkinstall -S`), and installed Ubuntu+KDE (and later Kubuntu).
I still prefer .deb packages, so now I'm looking at jumping ship, probably to MX Linux, as Ubuntu is using more and more snaps and so far they've just caused me problems; I also have philosophical objections (not necessarily well-founded in logic!) to monolithic packages.
Not sure why I started that reply, ... get off my lawn!
Don't forget that in '96-'98 we had S.u.S.E as the fourth biggest distro. And I remember that YaST managed dependencies with RPM in a user-friendly way. Back in 1998-99 it was my favourite distro because of that.
In the early days, if you got a Red Hat CD-ROM, you wouldn't be able to install half of the packages on it because they would conflict with the other half.
And any update would change the conflicts, to the point that updating simply wasn't a thing.
But technically, rpm was way more complete than dpkg. This changed with apt-get, and it took the RPM distros many years to create yum.
More mature in what way, exactly? In the fact that it's faster? Because I wouldn't call starting services by default, or replacing conflicting services, very mature. Maturity I would associate with stability and predictability.
And regarding the speed, which is what most people mention online when they compare the two, it's only faster because apt-get update is a separate step.
I doubt "apt-get" was more advanced, but the breadth and quality of Debian's packages were unparalleled. Debian "stable" always just worked, and if you preferred newer packages, Testing and Unstable were usually a good compromise between stability and shiny.
I haven't run Linux on my local machines for a couple years though, so maybe things have changed a bit, but if we're talking about the years where Debian gained "market" (mind?) share, then these are the reasons IMHO.
Yes, it was actually Python for a long time. And yet it still had very useful features like rollback and history - things large government contracts kinda required.
That's mostly true for Fedora: because it's a fast rolling release, they don't keep old packages around. It's not true for RHEL, which strives to maintain backwards compatibility and stability.
I use Fedora as my daily driver for work and DNF rollback might as well not exist. It's atrocious. Even when turning on the local repo cache function (which keeps RPM packages on your system for some time in case you want to roll back), I have never been able to successfully roll back an upgrade that contained more than ~10-20 packages.
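For context, this is the standard flow that keeps failing for me (the transaction ID varies; keepcache is the dnf.conf knob I mean by the local repo cache function):

    # keepcache=True in /etc/dnf/dnf.conf keeps downloaded RPMs around for rollback.
    sudo dnf history list       # find the transaction ID of the bad upgrade
    sudo dnf history undo 42    # attempt to revert transaction 42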
They don't require it at the OS level, though. Contracts like that require it for the state of the system as a whole, and the typical way to deal with this is versioned backups and audit trails.
I am thankful that, despite what I see as The Bloating or heaviness of recent mainstream distros/direction, we still have the stable underlying it_just_works framework of Debian as a common factor, enabling user choice across a range of options.
Essential anecdata : for the few computers in my household, the user experience and stability of certain mainstream Debian-based Linux distros cratered hard after, let's call it 20.x, and I doubt it would be possible to achieve the "I set up a 16.x machine for my parents in 2016, and to this day, It Just Runs, No Questions Asked, Ever" in the same way with today's equivalents - unless stepping back to a simpler incarnation of the Debian family. I'm thankful to be able to do that, and in my house, that thanks goes to the ever-present Debian.
"Debian was founded by Ian. First announced on August 16, 1993, by Ian Murdock, who initially called the system "the Debian Linux Release". The word "Debian" was formed as a portmanteau of the first name of his then-girlfriend (later ex-wife) Debra Lynn and his own first name.
What an incredibly disrespectful article. The details surrounding his death, and the nature of the event, are not at all in the public interest, and likely serve to cause more anguish to his family and friends. Even the title is written inappropriately.
In Australia at least, the journalistic standard [1] states that reports should avoid sensationalising suicide and should not report on the methods. It's also expected for support details to be added at the end of the story (this addendum is often the only indication that the person's death was the result of suicide).
For my daily driver I went FreeBSD -> Ubuntu -> Debian and have been really happy since. I tend to run older machines that are gifted to me by friends. With Debian combined with Xfce, I rarely have performance issues, and it's actually a way better experience than other distros specifically targeted at older hardware (Lubuntu, DSL, etc.).
This article makes a point that the Free in FOSS stands for free of cost just as much as for freedom. That's not the primary tenet of FOSS, even if it is for Debian.
The problem with Debian is their insistence on "stability" by not updating packages. In the modern world, where software development has picked up quite a lot of steam, new releases of software happen much more often than distribution life cycles.
That's not a problem, that's a feature. If you're on Debian or Ubuntu LTS, you (generally) don't have to worry about something breaking by running `apt upgrade` to keep up with security fixes - only release upgrades can and will give you trouble, but eh, that's to be expected.
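You can even automate the security part away entirely; a minimal sketch on Debian:

    # The default config applies only security-suite updates automatically.
    sudo apt install unattended-upgrades
    sudo dpkg-reconfigure -plow unattended-upgrades    # enables the periodic job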
Not all software keeps working for eternity without updates. I happen to be the author of a piece of software packaged by Debian that relies on a third party HTTP endpoint that changes semi-frequently, necessitating fixes to keep it working. We also liberally bump the major version number when there are big new features in addition to whatever fixes are necessary. The end result is Debian refuses to update to a working version, and the version in stable is completely broken more often than not. So we had to discourage Debian (and derivatives) users from installing the distro version, and close all issues they open.
It was a long time ago so my memory is fuzzy, I think the DD did briefly touch on that option, but nothing came of it. As a user who has used backports a few times, the experience isn’t that great since it needs to be explicitly configured.
> Not all software keeps working for eternity without updates.
True. And yes, Debian stable packages do not work so well if a package needs frequent updates. As others have pointed out, there are workarounds in place, like backports. Or if you want an Ubuntu / Arch like experience, you could try Debian testing. But they are kludges, and for some packages the pain got so great that Debian was forced to break its own rules and push a new version into stable. That's rare, though - Debian stable is so popular because it's stable.
The underlying reason for that focus seems to be that Debian is a distribution put together by sysadmins. A large chunk of the DDs (Debian Developers) are sysadmins pooling their resources to produce a distribution they can use in their day jobs. The two things that matter to them are security and stability. It's not an accident that Debian led the way on reproducible builds. This mob really cares about rock solid and secure.
To me the Debian model of "sysadmins pooling their resources" is remarkable. It's one of the few projects that combine open source and commercial objectives. At its heart is a whole pile of people teaming up in an open source project to create a tool (a server OS) on which they base their respective (and disparate) commercial products. The open nature of Debian - its ethos, its transparency - is critical to why this works so well. It allows participants who might well be producing competing products to trust each other and continue to work alongside each other, for decades now, to produce a distribution. It also means it's always going to have a strong server focus.
That focus isn't to say Debian can't be used for a desktop. A stable base like that is a very attractive thing to build a desktop on, and that's what happened. It's my daily driver, and I think it works wonderfully - but I also hand-install some things from outside Debian (like yt-dlp), and make liberal use of Docker containers when I need to pin certain versions for development. It's a bit of work, but as a developer it's stuff I have to do anyway. It created an opportunity for a derivative that caters to the desktop users who aren't like me and just want the latest shiny - and Ubuntu stepped into it.
No idea. We talked to the DD about it. Maybe they’re content with the software working in testing and sometimes in stable.
Of course in theory they could also try to backport the changes, but I doubt anyone would bother for anything that’s not hugely popular. I’m tired of distro patches generating support load anyway.
If I were to guess, youtube-dl fits the bill... just like browsers, it has the problem that it needs to update far more frequently than any OS/distro schedule allows.
Except Debian backports yt-dlp for Stable. That's why the answer is backports. If it works for something like yt-dlp, it should work for something similar, especially if it's already in Testing (since that is usually the source for the rebuild for Stable).
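For completeness, enabling backports on bookworm and opting in per package is two steps:

    # Add the backports suite for the current stable release.
    echo 'deb http://deb.debian.org/debian bookworm-backports main' \
        | sudo tee /etc/apt/sources.list.d/backports.list
    sudo apt update

    # Backports are never picked by default; -t opts in explicitly.
    sudo apt install -t bookworm-backports yt-dlp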
Why not make an AppImage for Debian and be done with it? I mean, let Debian do what Debian is gonna do, but have an AppImage and wash your hands and worries of it?
Many times I've had to add external/weird repositories to get some functionality I really needed (e.g. appropriate codecs for Bluetooth audio, and a version of BlueZ that supported my Bluetooth headsets), and that kind of tainting really endangers the longevity of a Debian-based system.
I understand that the fault is completely on the external repositories and on the user... however, the choice then is between being able to use my hardware (or, in general, to do the kind of computing I need) or not.
These days I'm using Fedora, btw, which has been really stable while providing fairly up-to-date packages.
> That's not a problem with debian, that's just a mismatch in needs, you're probably better off with a "less" stable operating system.
I'll be honest, I haven't had any stability issues with Fedora either, and I've been upgrading as often as I would with Debian. And in both cases, I'm using fairly old and well-known hardware (years-old ThinkPads, the T440 and the X270).
In which case was it an actual problem for you? What about switching to testing or unstable?
In my case I like to mix stable with a few packages from testing / unstable if needed (see the pinning sketch below). The current stable is relatively recent, so right now I just have golang from testing, for instance.
I agree with the sibling comment that it's more of a feature: it's more efficient for me to work around the bugs / missing features of packages that won't be updated for a while and stick to those workarounds, rather than constantly adapt to new bugs (even if old ones are fixed) or to changing features.
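The pinning sketch mentioned above: with testing added as a second apt source, a preferences file keeps stable as the default, so only packages you explicitly request come from testing.

    # /etc/apt/preferences.d/prefer-stable
    Package: *
    Pin: release a=stable
    Pin-Priority: 700

    Package: *
    Pin: release a=testing
    Pin-Priority: 200

Then e.g. `sudo apt install -t testing golang` pulls just that package (and whatever dependencies it really needs) from testing.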
>In the modern world, when software development speed picked up steam quite a lot, releases of new software happen much more often than distribution life cycles.
It's not just a modern thing. On no other platform are application versions so tightly coupled with the OS version. Mac or Windows users would tell you to take a hike if you told them they couldn't get new software without upgrading their OS. Similarly, Android apps typically target somewhat older platform API versions to support users who haven't yet upgraded to the latest base system.
That's one of Debian's key value propositions. "In the modern world", updates introduce zero-day vulnerabilities, critical bugs and unproven features. Debian is the rock solid foundation you can always count on.
On the flip side, Debian doesn't upgrade to a new version when a vulnerability is found in the shipped version. Instead, they try to backport just the fix.
If Debian shipped version n and the upstream fixes the vulnerability in version n+5 (not unusual, as debian stable is routinely out of date for 2-4 years depending on the length of the testing freeze and the point in time of the stable lifecycle), this can range from trivial to near impossible in a timely manner.
They also tend to be opinionated about the software they ship and do include many Debian-specific patches. Most of them are harmless or just small changes to make the software work better with the Debian way of configuring things; sometimes much more catastrophic: https://github.com/g0tmi1k/debian-ssh
The issue with old packages I run into is that I don’t live in a Debian Linux only bubble — I use Windows and macOS too, which makes major version discrepancies between platforms irritating and occasionally a problem (data format changes, feature differences, etc).
For desktop use, distros like Fedora strike a decent balance in my opinion, being much more up to date without sitting on the bleeding edge and usually updating without issue.
Debian stable is best for servers and computers where you don't always need the latest versions of programs but still need security updates. If you're e.g. a desktop computer power-user who does want to use the newest packages, then you may want to look into Debian testing instead.
That's what I do on my server. I run Debian stable to keep everything running smoothly.
If I want the latest version of a service app, I check for a Docker container and run that.
If it's a CLI tool, I'll check for an apt repo, or build it from source then copy it to /usr/local/bin/.
I feel like I'm getting the best of both worlds: a rock-solid OS, and with a tiny bit of effort I can run the newest version of any tool I want. By not doing this systematically on every single package in the OS, there's little risk of breakage.
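A sketch of both patterns (image and binary names hypothetical):

    # A service at its latest upstream version, isolated from the host's packages.
    docker run -d --name myservice --restart unless-stopped \
        -p 8080:8080 example/myservice:latest

    # A CLI tool built from source, installed where dpkg never writes.
    sudo install -m 0755 ./mytool /usr/local/bin/mytool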
As always, it's a tradeoff. Sometimes it's a problem, sometimes the same thing is a critical feature.
Which is why it's so important that you have multiple choices. If you need the latest packages, use Arch. If you need stability over multiple years, use Debian.
Good arguments! I don't disagree with them. But let's not forget security updates, and let's not forget that a lot of IT work requires modern kernel features. Deep learning is one of them, I think (if I didn't misread a few articles extremely badly, which is very likely). Another is modern databases that need e.g. `io_uring`.
I appreciate Debian for what it is, and I use it in VMs and containers. But for my own usage I prefer Arch (mostly Manjaro) - yes, even for servers. By the way, if you run a full system update at least once a week, you will never stumble upon the possible problems of rolling distro upgrades.