> Maybe the bar was just really low, and you've got a strong argument if you say it shouldn't be anymore, but we are making progress...
I was one of those people for longer than I can remember. Last year I decided it was time for me to make this argument. I don't regret the time I spent learning, tinkering, and sometimes wrestling Linux into working. However, while Linux has made a ton of progress, the gap between what "just works" and what users expect has only grown. My time is just too valuable to me now to spend messing around with Linux.
You and others make a good point about getting going faster on Windows, but one thing a lot of people don't factor in is the time spent fiddling with Windows as well: getting it working, fixing problems, and sorting out other configuration issues. And not all hardware works well under Windows, or under the latest version of Windows. There is real time spent configuring Windows too.
The main problem the article presents is that Windows acts like malware, which brings a new element to your desktop. I remember getting a popup notification from Facebook, and this was after I thought I had turned all of that stuff off. You don't face that in Linux. You did with some earlier versions of Ubuntu, but not anymore.
No doubt, Windows brings computing to people who aren't technical and makes it easy, but anything beyond the simplest configuration will take a lot of time as well.
I have a Windows PC and an Ubuntu system running on the same NUC hardware, as well as a higher-end desktop that I use. I have to twiddle with the Ubuntu system pretty much once every 3-4 months after an update breaks something. The Windows 10 box keeps running... My big desktop has been upgraded from 7 -> 8 -> 8.1 -> 10 and is still running, though I'm considering a clean install, as some of the cruft from experimenting with WSL and Docker for Windows containers has left things a little funky.
In the end, I find I have to spend far more time tinkering with my HTPC box running Ubuntu than I ever really had to with Windows. I know there are other distros, but Ubuntu is pretty much the king of desktop Linux here. I have considered switching to Debian proper, or to an Ubuntu derivative, but haven't done so.
Oh, and don't get me started on the pain of getting an MCE remote working halfway properly under Kodi... that was a really painful experience. I'm just glad that HDMI audio now survives suspend/resume, where it used to get lost before some recent changes (daily Intel driver PPA).
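For the curious, the modern version of that remote fiddling goes something like this (ir-keytable ships with v4l-utils; the keymap name and path depend on your receiver and distro, so take this as a sketch rather than a recipe):

    # list detected IR receivers and which protocols they're decoding
    sudo ir-keytable

    # press buttons and watch which keycodes (if any) actually arrive
    sudo ir-keytable -t

    # load the stock RC6 MCE keymap (location varies by distro)
    sudo ir-keytable -w /lib/udev/rc_keymaps/rc6_mce

And after all that, you still get to map whatever arrives onto Kodi actions in its keymap XML, which is where most of my hours went.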
Apple has really pushed the bar up in terms of what users expect from their computers out of the box, and that has affected Windows as well, so in some sense Linux has been swimming upstream? treading water?
In the mid-90s I remember endlessly fiddling with CONFIG.SYS and AUTOEXEC.BAT to squeeze out that extra 30 KB of conventional memory to run some game. Buying peripherals was such a crapshoot that 'Plug and Play' was something that needed to be advertised, and much of the time you were still stuck manually juggling IRQs, moving cards to different slots, etc.
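For anyone who missed that era, the ritual looked roughly like this (reconstructed from memory, so the driver names and paths are illustrative, not gospel):

    REM CONFIG.SYS: load DOS into high memory and open up upper memory blocks
    DEVICE=C:\DOS\HIMEM.SYS
    DEVICE=C:\DOS\EMM386.EXE NOEMS
    DOS=HIGH,UMB
    REM cram device drivers into upper memory to free conventional RAM
    DEVICEHIGH=C:\DRIVERS\CDROM.SYS /D:MSCD001

    REM AUTOEXEC.BAT: LH (LOADHIGH) does the same for TSRs
    LH C:\DOS\MSCDEX.EXE /D:MSCD001
    LH C:\DOS\MOUSE.COM

Then you'd discover the game wanted EMS after all, swap NOEMS for RAM, reboot, and start over.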
Nowadays, one can buy any Windows or Mac computer/peripheral and generally expect it to work, with some bare minimum of reading reviews.
So while Linux has made tremendous strides in terms of driver support (you mean my wifi actually works now?), things are still very far from the zero-conf experience users have become accustomed to.
Time is valuable, and 2-10 hours spent experimenting with various driver packages, editing text config files, recompiling the kernel, etc. is literally money out of the user's pocket, not to mention the learning curve if you don't already know how to do all these things. That doesn't include the time for extra research to see what hardware is well supported.
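To make that concrete, the research/diagnosis step usually starts with something like this (standard pciutils/usbutils tools; nothing here is distro-specific):

    # which kernel driver, if any, has bound to each PCI device
    lspci -k

    # the same question for USB devices
    lsusb -t

    # check whether a driver loaded but couldn't find its firmware
    dmesg | grep -i firmware

A wifi card with no "Kernel driver in use:" line is the one that's about to eat your weekend.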
Here's an example of the install instructions for a pretty well-supported laptop that runs Linux [1]. I'd guesstimate an hour of time if you've already done this before, and anything up to 10 hours if you run into unexpected difficulties or are totally new to this.
I like how you picked Arch, which is notorious as one of the few distros without an installer program, and the most difficult to install. I also highly doubt that recompiling kernels is the norm, though I may just be going off of a decade or so of experience...
I've been running Linux for about 4 years now (Ubuntu -> Mint -> Fedora -> Arch over the first year or so; I've been on Arch ever since), and I've never had to recompile my kernel. I did choose to do it once, but that was because I was interested in trying it out, not because something wasn't working; I'm a bit masochistic that way.
I'm a big fan of Arch myself, but if you look at the distro's philosophy, manually installing it is actually the point. There are enough other distros if you want to avoid that.
I mentioned Arch because its repos/kernel seem the most up to date and hence most likely to support newish hardware. Plus its docs for getting it running on new laptops seem pretty comprehensive [1].
Looking for directions for that laptop on Ubuntu returns a bunch of frustrated users and rather conflicting testimony: