Before the inevitable comments regarding Apple's moves with macOS and how the hacker 'tide' is supposedly turning back against Apple, I think I'll chime in with my two cents. I switched from my 2015 MBP to a Surface Book 2 at the end of 2019, and switched back to my same, six-year-old MBP just a few months ago. To keep things short, whilst WSL puts Windows miles ahead of where it once was, I find that the same old rough-edges still remain.
I often dock between standard-DPI displays and portable mode (with the high-DPI/'retina' display). I first noticed back in 2019 that Windows doesn't correctly handle DPI switching for window borders, Explorer, and notifications (the old 2x scaling level remains when switching from portable to docked), and filed feedback with the built-in app on Windows. I worked around this by killing dwm.exe and explorer.exe every time I docked my Surface. This issue was still present earlier this year, and, deciding I'd had enough of dealing with all these little Windows 'quirks' wherever they arose, I switched back to my old Mac.
It turns out that SIP and Gatekeeper aren't nearly as much of a problem as I was led to believe; neither of these features has hampered me once. The Big Sur interface changes, whilst I thought I'd never get used to them, have actually grown on me. Since switching back, I've discovered a lot of quality native apps that simply have no analog on Windows–OmniFocus stands out here. And as always, Homebrew still exists and works just as well as it always has for most of my *nix related tasks.
Hearing about the M1 performance improvements, I can see myself staying on Mac for quite some time yet–I'll upgrade to an M1 MacBook Air once this 2015 MBP (six years old!) kicks the bucket.
Edit: For some context, I'm mainly a .NET developer. .NET Core/5 is a game-changer for cross-platform work, and development is first class on practically any system nowadays. I've settled on JetBrains Rider for my IDE and find myself generally happier than I was with VS on Windows.
It's funny that it's been around nine years now since Apple introduced Retina MacBooks, and their HiDPI implementation was good from the start, but Windows and Linux, while getting better, still need to catch up.
It’s because Apple was very forward-looking in 1999 and decided to use floating point coordinates for the CoreGraphics API. Nothing on the Mac draws using exact integer pixels (since the death of the transitional Carbon API anyway). Scaling was built in from scratch.
The scaling was quite glitchy in 2009 [0], and in 2021, the only allowed scale factors are 100% and 200% — so you don’t need floating-point numbers for it.
If you want to position an @2x view (1 pt = 2 px) you need to use fractional points. A view at (0, 0.5) in points is (0, 1) in pixels. Resolution Independence was a separate concept.
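A minimal sketch of that point/pixel mapping, using the C-level CoreGraphics types (illustrative only; build on macOS with -framework CoreGraphics):

    // The @2x backing scale is just an affine transform; because CGFloat
    // coordinates are fractional, (0, 0.5) in points is representable and
    // maps cleanly onto the pixel grid.
    #include <CoreGraphics/CoreGraphics.h>
    #include <stdio.h>

    int main(void) {
        CGAffineTransform backing = CGAffineTransformMakeScale(2.0, 2.0); // @2x
        CGPoint pt = CGPointMake(0.0, 0.5);                    // position in points
        CGPoint px = CGPointApplyAffineTransform(pt, backing); // position in pixels
        printf("(%.1f, %.1f) pt -> (%.1f, %.1f) px\n", pt.x, pt.y, px.x, px.y);
        // should print: (0.0, 0.5) pt -> (0.0, 1.0) px
        return 0;
    }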
Well, that's the whole thing. HiDPI in and of itself works well. I use it with a 24" UHD monitor and I absolutely love the setup.
But the main issue with Linux, at least with X11, is that it doesn't handle multiple displays with different DPI settings.
As you noted, Qt seems to work fine, as do some other toolkits that have dynamic scaling according to the monitor size. Alacritty does this too, for example (don't know what toolkit they use, though).
But GTK in particular is terrible. Yes, you can configure it to use scaling, in which case it works well, IF you don't need fractional scaling (though recent versions of Gnome seem to support it somehow).
But now, if you're going to use, say, a HiDPI laptop with a "regular", larger external monitor, you're gonna have a bad time. If you enable scaling, the widgets will be huge on the LoDPI display. If you disable it, widgets will be teeny tiny on the HiDPI one.
Note that the problem isn't really with X11 (or at least Xorg) since it does provide the necessary information via RandR to perform per-monitor scaling.
The issue is really with window managers and toolkits. What needs to happen is that window managers ask the applications (the top-level windows, really) to scale themselves - this can be done not just for DPI purposes - and then the applications should apply that scaling.
(applications should expose that they can support this - like they expose that they support handling a 'close window' event - so that compositing window managers can do it themselves for applications that do not support it, which is also why this needs to go through the window manager)
Qt can do this, but AFAIK window managers do not, because there aren't any standard events for it - I remember an email about this exact topic when I took a look at the Xorg mailing list for WMs some time ago, but it didn't seem to be going anywhere (it barely had any replies).
So it really is about standardizing how it is to be done. I think the main reason it isn't done is that such setups are comparatively rare (I mean, HiDPI displays themselves are very rare - if you check any desktop resolution stats they are often barely a blip - and having both a HiDPI display and a regular-DPI one is even less common than that), so developers aren't that interested in implementing such a thing.
One way it could be done, however, is also pushing the idea that this isn't just for multi-monitor setups: as I wrote above, you can use it for generic scaling, which is useful even for single-monitor regular-DPI setups (if anything it can be very useful for low-resolution setups - like 1366x768 and similar, which are way more common than HiDPI - when faced with applications with a lot of padding, etc).
The issue with Linux is that nothing is part of the system. There's no single standard GUI toolkit. There's no single standard windowing system. There's way too many variations of everything.
For decades, the standard windowing system was X11; and not just for Linux, for other systems as well.
The issue was elsewhere - X11 provided a standardized wire protocol and you could not break it. You could extend it, but applications not aware of the extension would not be able to benefit. Windows and Mac, on the other hand, kept the protocol proprietary and you had to use the supplied libraries; you could not talk to the display server directly. So Microsoft and Apple could update these libraries; on Linux there was no such option, and all the apps using the wire protocol instead of client libraries would be left in the dark.
This isn't really an issue, you just need people to agree on doing some common stuff (even if they do their own stuff elsewhere).
And this is really not that dissimilar under Windows: nowadays you have a lot of different GUI toolkits and applications still need to "opt-in" and explicitly support stuff like scaling, so you have a lot of applications that do not do that.
They don't even agree on what they're supposed to support from freedesktop.org.
On Windows, Win32 doesn't do that by default because it would break backwards compatibility with Windows 95 binaries being run on a Windows 10 2021 edition, so it must be a conscious decision to enable it.
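A minimal sketch of what that opt-in looks like in plain Win32 (assumes Windows 10 1703+ and a recent SDK; an application manifest entry is the equivalent declarative route) - without it, the process is treated as 96 DPI and bitmap-stretched:

    // Per-monitor DPI awareness must be requested explicitly; unaware legacy
    // binaries keep getting scaled by the system for compatibility.
    #define WINVER 0x0A00
    #define _WIN32_WINNT 0x0A00
    #include <windows.h>
    #include <stdio.h>

    int main(void) {
        if (!SetProcessDpiAwarenessContext(DPI_AWARENESS_CONTEXT_PER_MONITOR_AWARE_V2)) {
            printf("per-monitor v2 awareness not available here\n");
            return 1;
        }
        // Once opted in, a window procedure is expected to handle WM_DPICHANGED
        // and re-layout itself (e.g. using GetDpiForWindow and the suggested RECT).
        UINT dpi = GetDpiForSystem(); // 96 = 100%, 120 = 125%, 144 = 150%, ...
        printf("system DPI: %u (%.0f%% scaling)\n", dpi, dpi * 100.0 / 96.0);
        return 0;
    }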
> Note that the problem isn't really with X11 (or at least Xorg) since it does provide the necessary information via RandR to perform per-monitor scaling.
It is an issue with X11. If you have a WordPerfect binary from 1999, it still has to work; it does not matter that RandR provides the information, old clients will not ask for it, and without it they will be broken.
That's why Xwayland does upscaling by itself.
> i mean hidpi displays themselves are very rare - if you check any desktop resolution stats they often barely are a blip - and having both a hidpi display and a regular dpi one is even less common than that
Cause and effect. They are rare because using them sucks, making them even rarer. On systems where they do not suck, they are not that rare. This attitude makes the entire platform worse off.
> It is an issue with X11. If you have a WordPerfect binary from 1999, it still has to work; it does not matter that RandR provides the information, old clients will not ask for it, and without it they will be broken.
This is why I wrote "so that compositing window managers can do it themselves for applications that do not support it, which is also why this needs to go through the window manager". You do not need X11 support for that; functionality that exists can be used right now to do it - assuming the applications and WMs cooperate to specify the necessary attributes and events, of course.
> That's why Xwayland does upscaling by itself.
You can also do this with a compositor for applications that do not support scaling themselves (which would be the default state - again, this isn't any different from the 'supports close event' flag that has already existed for decades). Without a compositor you're out of luck (for the most part)...
...However!
Keith Packard worked on server-side scaling some time ago [0] (not sure if it was merged into mainline), which would allow this to be done even without a compositor (well, kinda - technically the server does the compositing just for the scaled windows). So even without a regular desktop compositor this should still be implementable, assuming that functionality was (or will be) merged into mainline Xorg. But note that this is only for the case where one wants to run applications that do not support scaling, wants HiDPI scaling, and is not using a compositor. The rest can already be implemented with Xorg as-is.
Those patches were never merged, and I doubt they will be, because there seems to be almost no interest in supporting high DPI displays on X without a compositor. If there was interest in those, at the very least somebody would have to adapt them and figure out how to use them with XWayland. In my opinion, any patch that doesn't fix this for Xwayland/Xwin/Xquartz is probably not going to fly. If you want this, you actually do need some level of protocol support for this, and I want to address a common misconception:
>X11 provides the necessary information via RandR to perform per-monitor scaling
This is not true. If it were true, it would be significantly easier to get DPI scaling to work on X11 and XWayland already. If you're referring to the physical size measurements, there are numerous problems with those, and they should never be used to do DPI scaling. If you're developing for Wayland, you should just ignore those measurements and use the scale provided by the server. If you mean that applications can draw themselves at arbitrary scales, they could always do that, without even bothering with RandR, but that will break down when you try to handle the compositing cases, e.g. stretching a window across monitors and expecting the display/input coordinates to scale correctly. So regardless of what you're trying to get the apps to do, some more work needs to be done there.
Even if those problems were fixed, DPI scaling would still not work without more changes in the server. The major reason why it can't even work is because of input coordinate scaling -- the server needs to be able to scale coordinates based on a factor provided by the client, and RandR does not provide this, and it doesn't make sense to add it to RandR either. If you want to standardize how this is done, the best way would just be to copy the method used by Wayland back into the X server. Part of it might be merging those patches by Keith that you linked, part of it might be somebody finishing this MR: https://gitlab.freedesktop.org/xorg/xserver/-/merge_requests...
> Those patches were never merged, and I doubt they will be, because there seems to be almost no interest in supporting high DPI displays on X without a compositor.
I don't know about that, though the only person I personally know with a HiDPI display uses X without a compositor (they just use a theme with everything being big).
> If there was interest in those, at the very least somebody would have to adapt them
These are only part of the puzzle; they are not really that useful by themselves without the rest of what I wrote, which would actually be the more involved problem. Having scaling without a compositor is more of a "cherry on top" than anything else.
> This is not true. If it were true, it would be significantly easier to get DPI scaling to work on X11 and XWayland already.
I think mixing DPI and scaling is the root of the problem here. Scaling is something independent from DPI settings, though DPI settings can be used for setting up the defaults.
However scaling is useful regardless of DPI.
> If you're referring to the physical size measurements, there are numerous problems with those, and they should never be used to do DPI scaling.
Why not? They can be used to calculate defaults. I've heard that by default the official X server misreports DPI, but many distributions apply patches to fix that - not sure why the X server does this, but it is something to be fixed. Personally, I never had issues with how the X server reported my monitors' DPI, and I've used a bunch of different monitors over the decades.
Regardless this is something that can be fixed anyway - after all if Wayland can get the proper information, so can X11.
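For what it's worth, this is roughly what "calculate defaults from RandR" means in practice - a sketch only, and it inherits whatever the EDID claims (link with -lX11 -lXrandr):

    // Per connected output, RandR reports the active mode's pixel size and the
    // physical size in millimetres, so a per-monitor default scale can be guessed.
    #include <X11/Xlib.h>
    #include <X11/extensions/Xrandr.h>
    #include <stdio.h>

    int main(void) {
        Display *dpy = XOpenDisplay(NULL);
        if (!dpy) return 1;
        XRRScreenResources *res = XRRGetScreenResources(dpy, DefaultRootWindow(dpy));
        for (int i = 0; i < res->noutput; i++) {
            XRROutputInfo *out = XRRGetOutputInfo(dpy, res, res->outputs[i]);
            if (out->connection == RR_Connected && out->crtc && out->mm_width > 0) {
                XRRCrtcInfo *crtc = XRRGetCrtcInfo(dpy, res, out->crtc);
                double dpi = crtc->width / (out->mm_width / 25.4);
                printf("%s: %ux%u px, %lu mm wide -> ~%.0f DPI -> default scale ~%.2f\n",
                       out->name, crtc->width, crtc->height, out->mm_width,
                       dpi, dpi / 96.0);
                XRRFreeCrtcInfo(crtc);
            }
            XRRFreeOutputInfo(out);
        }
        XRRFreeScreenResources(res);
        XCloseDisplay(dpy);
        return 0;
    }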
> If you mean that applications can draw themselves at arbitrary scales, they could always do that, without even bothering with RandR, but that will break down when you try to handle the compositing cases, e.g. stretching a window across monitors and expecting the display/input coordinates to scale correctly. So regardless of what you're trying to get the apps to do, some more work needs to be done there.
The idea is for RandR to be used by the window manager as an initial/default scaling setup for windows per monitor. The users should be able to alter that, of course, and even alter per-window (I mean top-level window here) scaling via their window manager.
> Even if those problems were fixed, DPI scaling would still not work without more changes in the server. The major reason why it can't even work is because of input coordinate scaling -- the server needs to be able to scale coordinates based on a factor provided by the client, and RandR does not provide this, and it doesn't make sense to add it to RandR either.
IIRC this was actually fixed some time ago to allow scaling via compositors. Outside compositors we're back to Keith's patches. But this is also an issue for applications that do not support scaling.
Basically, what I mean is this, to summarize:
* Applications either support scaling or they don't. If they do support scaling, they set some attribute on their toplevel windows to indicate that (rough sketch below).
* The window manager sends events to toplevel windows that support scaling to set up their scaling whenever the scaling changes (the window is dragged to another monitor, the user requests a different scale level via an icon menu or shortcut or whatever). If a window doesn't support scaling (note that an application can have some windows that support scaling and some that do not - not that it matters much, but it is a good idea to avoid associating application support with window support), it gets scaled by either the window manager (if it is a compositor), a dedicated compositor (some compositors can work with other window managers) or the server (with Keith's patches).
There could also be support for toolkits to do the scaling themselves if the window manager doesn't support scaling (e.g. window managers that support scaling can use an attribute in the root window to indicate that support), but that is more complex and may not be really necessary.
Note that the only case where the current server functionality isn't adequate is an X application that doesn't support scaling running without a compositor. Everything else should already be there, but it needs support from toolkits (to handle the relevant events and, of course, scale their contents) and window managers (to implement scaling events). Perhaps RandR's DPI info isn't reliable (though that hasn't been my experience), but that can be fixed; the harder bits are getting toolkits and window managers to support that stuff.
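A rough Xlib sketch of the first bullet, just to show the moving parts - the atom names here are invented for illustration, since no such standard exists yet (which is the whole point):

    // The client advertises "I can scale myself" on its toplevel window, much
    // like listing WM_DELETE_WINDOW in WM_PROTOCOLS, then waits for the WM to
    // write the desired scale. Hypothetical atoms; link with -lX11.
    #include <X11/Xlib.h>
    #include <X11/Xatom.h>
    #include <stdio.h>

    int main(void) {
        Display *dpy = XOpenDisplay(NULL);
        if (!dpy) return 1;
        Window w = XCreateSimpleWindow(dpy, DefaultRootWindow(dpy),
                                       0, 0, 400, 300, 0, 0, 0xffffff);

        // 1. Advertise scaling support on the toplevel window.
        Atom supported = XInternAtom(dpy, "_EXAMPLE_SCALE_SUPPORTED", False);
        long yes = 1;
        XChangeProperty(dpy, w, supported, XA_CARDINAL, 32, PropModeReplace,
                        (unsigned char *)&yes, 1);

        // 2. The WM would set this property (or send a ClientMessage) with the
        //    scale it wants, e.g. 2 on a HiDPI monitor; the client re-renders.
        Atom scale = XInternAtom(dpy, "_EXAMPLE_WINDOW_SCALE", False);
        XSelectInput(dpy, w, PropertyChangeMask);
        XMapWindow(dpy, w);
        XFlush(dpy);

        XEvent ev;
        for (;;) {
            XNextEvent(dpy, &ev);
            if (ev.type == PropertyNotify && ev.xproperty.atom == scale)
                printf("WM requested a new scale; re-render at that scale here\n");
        }
    }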
I mean in terms of developer interest, I haven't seen anybody working towards getting those patches (or similar things) merged. It would be easiest to get this working in a rootless server (like XWayland) first and then go backwards from there. In the case of compositing on multiple monitors, scaling isn't independent of DPI settings. They are the same problem, because at some point you will get an app that is mismatched and you'll have to scale it along with its input coordinates.
The physical measurements could be used to inaccurately guess defaults - some Wayland servers might do that - but it's not preferred. The X server doesn't "misreport" DPI in the sense that it's purposefully sending a wrong value; there's a process it goes through (I may not be remembering this exactly) where it tries to read from the EDID, which could contain wrong values. If the values are there but wrong, they just get quietly passed through; if they are zero, it then tries to fill in measurements based on a default DPI. Random X clients with access to RandR can also change the measurements stored in the server for whatever reason -- basically there is no possible way for this value to always be correct if you just want a number that tells you what DPI to target.
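To make that last point concrete: any client can rewrite the stored physical size, and every other client's idea of the "DPI" changes underneath it (a sketch; some servers/configurations may refuse the request):

    // Keep the pixel size, but claim the screen is physically 1000mm x 1000mm.
    // Link with -lX11 -lXrandr.
    #include <X11/Xlib.h>
    #include <X11/extensions/Xrandr.h>

    int main(void) {
        Display *dpy = XOpenDisplay(NULL);
        if (!dpy) return 1;
        int scr = DefaultScreen(dpy);
        XRRSetScreenSize(dpy, RootWindow(dpy, scr),
                         DisplayWidth(dpy, scr), DisplayHeight(dpy, scr),
                         1000, 1000);
        XSync(dpy, False);
        XCloseDisplay(dpy);
        return 0;
    }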
You're right, this could be fixed, just like every bug could technically be fixed given enough attention, but somebody needs to take the time to backport the Wayland method to X, and I don't have high hopes of that happening any time soon. The main thing you're missing in the server, as I said, is scaling of input coordinates. The compositor or the toolkit can't do this; it has to happen in the server, because those events are sent directly from the server to the client. It's probably a better approach to get that working, then put a prop on the root window, and then hack kwin/mutter/whatever to use that, but I don't think anyone has tried to implement it that way yet.
> i mean hidpi displays themselves are very rare - if you check any desktop resolution stats they often barely are a blip
Do you have a link/cite for this? I couldn't easily find anything summarizing pixel-density stats.
"Desktop resolution stats" is not the same thing. All the Retina displays have at least a 2x pixel ratio, so the tables from a cursory Google search are clearly lumping 13-inch MacBook Pros into the 1280x800 bucket, for instance.
You got a reference to something that clearly is accounting for pixel ratio?
I can tell because a number of the available reports show basically no resolutions higher than 1920x1200, when we know there are enough Retina displays out there that they should be listed.
As to the linked report: it's hard to discount bias from a browser with 3% market share that, looking further, seems to be overwhelmingly installed on outdated Windows 8 machines.
What exactly is wrong with "scaling" on Linux? I use Qt apps on Plasma and everything just works. Nearly all my displays are HiDPI now.
If you use 2x scaling (or other integer scaling), everything is fine. If you need fractional scaling, things break down pretty quickly, at least on GNOME.
Once you use fractional scaling (e.g. 1.5), GTK applications will scale correctly, but are slightly slower and consume more battery. However, all XWayland applications are unusably blurry.
Unfortunately, fractional scaling is really needed on 14" 1080p laptops or on 4K screens around 27". With 1x scaling everything is tiny and with 2x scaling gigantic.
The only workaround that worked OK for me was using 1x scaling and using font scaling in GNOME. Many controls and icons are tiny that way, but at least text is readable and not blurry. Of course, this only works up to some extent, and only when both screens need the same amount of scaling (since font scaling cannot be configured per screen).
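For reference, that font-scaling workaround is just GNOME's text-scaling-factor key; here is a minimal C sketch using GIO, equivalent to `gsettings set org.gnome.desktop.interface text-scaling-factor 1.25` (assumes the GNOME schemas are installed):

    // Sets GNOME's global text scaling factor (what the "font scaling"
    // workaround above adjusts). Build with:
    //   cc scale.c $(pkg-config --cflags --libs gio-2.0)
    #include <gio/gio.h>
    #include <stdio.h>

    int main(void) {
        GSettings *iface = g_settings_new("org.gnome.desktop.interface");
        if (!g_settings_set_double(iface, "text-scaling-factor", 1.25))
            fprintf(stderr, "failed to set text-scaling-factor\n");
        g_settings_sync();        // flush the change to dconf before exiting
        g_object_unref(iface);
        return 0;
    }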
On a non-HiDPI display (I use one that is 1680 by 1050 and one that is 1280 by 1024) fractional scaling works fine on my machine running Gnome 34 with apps that bypass XWayland and talk directly with Wayland (which is true of all the apps I use). (Google Chrome needs to be started with a couple of command-line switches to bypass XWayland.)
But your comment makes me hesitate to buy a HiDPI monitor!
I guess this is the issue with saying "Linux" supports or doesn't support things. My software stack of Xorg/Qt/Plasma works fine. Your stack of Wayland/GTK/GNOME doesn't. Should we hold it against "Linux"? Is that a fair comparison to MacOS, where everything uses Cocoa? For that matter, how do GTK apps look on MacOS?
They’ve had wonderful trackpads for even longer and PC laptop OEMs still have yet to catch up. No, I’m entirely uninterested in the touchscreen gimmick of the week.
Funny enough, I did a bunch of projects that required pushing something like 16K output (a video wall, not a personal setup), and at that point Windows becomes the only option because it requires a high-end GPU and an array of outputs.
I feel like Apple's approach to hardware made their job much easier -- they just ensured that every retina Mac had a display with exactly 2x the pixels as its pre-retina equivalent, so they never had to deal with fractional scaling like Windows often does.
They deal with lots of fractional scaling - at first they may have done simple pixel doubling but now the screens are often at non integer ratios.
In fact you can speed it up slightly by making sure the display resolution is set to exactly "half" the actual resolution, so that macOS doesn't have to downscale its internal 2x render first.
The article clearly states that macOS always draws at 2x and just displays the result, individual pixels be damned. Anyone can confirm this simply by taking a screenshot and looking at the resolution (screenshots are always the size of the underlying framebuffer).
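An easy way to see the split, besides the screenshot trick: ask CoreGraphics for the current mode's logical size versus its backing pixel size (a sketch; build on macOS with -framework CoreGraphics). On a scaled Retina mode the backing typically stays 2x the logical size, which is exactly the framebuffer a screenshot captures:

    #include <CoreGraphics/CoreGraphics.h>
    #include <stdio.h>

    int main(void) {
        CGDisplayModeRef mode = CGDisplayCopyDisplayMode(CGMainDisplayID());
        printf("logical: %zux%zu pt, backing: %zux%zu px\n",
               CGDisplayModeGetWidth(mode), CGDisplayModeGetHeight(mode),
               CGDisplayModeGetPixelWidth(mode), CGDisplayModeGetPixelHeight(mode));
        CGDisplayModeRelease(mode);
        return 0;
    }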
Windows has a particular knack for giving the user the impression it doesn't care about them at all. The things that irritate me now are the same things that irritated me last year, five years ago, and in a few cases twenty years ago. They make you go "Jesus is everybody asleep over there?"
For example, if you want a notification icon to always be shown, you have to open a menu, open another menu, scroll to find the one you want, open an option list, and finally hit "show icon and notifications". Why can't you just right click an icon and hit "show icon and notifications"? Because Windows doesn't give a shit about you.
All the ads, the file search that searches Bing and can’t be disabled, the telemetry that sends the websites you browse to Microsoft. There must be some product manager over there whose bonus is based on how much ad revenue they can get out of Windows, no matter the cost. Installing the Windows 11 preview is just depressing, like you say it really makes me feel like they don’t care and expect people just to take it.
For your specific notification icon issue, I thought you can just drag the icon from the pop up tray to the main system tray to keep it always visible?
The insanely deep nesting of little configuration windows is probably the number one thing that makes Windows feel old to me. The new settings app is an improvement, but sadly one that lives in isolation.
And often that's because MS decided not to replicate some capabilities of the old WinNT-vintage configuration GUI in the pretty new touch-friendly interface. Probably the classic example is the trek you have to make, through two or three dialogs of nice-looking but most often useless new GUI, to a cryptic "Change adapter options" button and on to where Control Panel\Network and Internet\Network Connections survives, still the vital dialog and still largely in its original late-'90s form. Oh, and then there's the decision that having separate default devices and default communication devices for audio is too nerdy, so hip young Windows 7 kids don't need to see anything about any default communication device in the shiny new sound GUI. Hilarity then ensues when the VR headset somehow automatically becomes the default-comms-device microphone and you mysteriously become barely audible on voice calls. Of course other big, important sections of the GUI (file permissions?!) seem to have been junked completely without any replacement.
Those settings windows are part of the API, and Windows has to provide them for legacy software in many cases.
What was absolutely insane was not replicating the functionality into the new control panel setup, leading to power users feeling dumb because they can't find network settings under network settings.
Or the point that many of these configurations can ONLY be accessed via the GUI. That's one thing that makes Windows (and even macOS to some degree) so awkward to use sometimes.
Almost every setting in macOS can be accessed from the command line, and many things only there. Actually, I don't think I've found anything so far that can't be set from the command line, and I do a fair amount of that kind of tweaking.
Xcode license agreements? It took me literally more than an hour the first time: something I installed from the terminal refused to finish because I had never heard of Xcode, or that you need to open it manually in order to confirm a license agreement. I was just installing the Ruby stack and it needed some compilers via Xcode, I guess?
That was on an MBP from around 2015. The error message was not helpful at all; I had to ask someone with experience or I would still be installing Ruby to this day.
> For example, if you want a notification icon to always be shown, you have to open a menu, open another menu, scroll to find the one you want, open an option list, and finally hit "show icon and notifications".
Or, you know, just drag and drop it into the always shown area.
Windows 11 goes a long way towards fixing this. There are still multiple ways to change things, but all of the touch screen settings are gone and the new settings app is laid out logically.
"switched from my 2015 MBP to a Surface Book 2 at the end of 2019, and switched back to my same, six-year-old MBP just a few months ago. To keep things short, whilst WSL puts Windows miles ahead of where it once was, I find that the same old rough-edges still remain."
I had a similar feeling when given a new Windows laptop in 2020 to work with that was supposedly better spec'ed than my late-2013 MacBook 15". The problem has been Windows and the lack of flow in its OS. I've been trying to document workflows & apps here for the past year to see if alternatives are emerging: https://docs.google.com/spreadsheets/d/148zTJUwfVv9xfDcpSoH3... . For me it's not so much the development environment but whether I can quickly deal with administrivia tasks (organizing files, copying things from one form to another, etc.). I'm surprised Windows 11 still lacks Miller columns: https://en.wikipedia.org/wiki/Miller_columns
My concern about the Apple platform going forward isn't so much technical as material - my 2013-era MacBook died recently, and the fact that I could remove the SSD and plug it into a similarly old Mac was still a savior, despite my having multiple backups. It's driven me to consider installing Linux on an Intel Windows tablet over buying an M1 MacBook - I'm getting leery of the lack of repairability, and I really want a tablet on a stand plus a detached keyboard/trackpad for the sake of ergonomics.
> The Big Sur interface changes, whilst I thought I'd never get used to them, have actually grown on me.
That’s the thing with some people and Apple; I generally moan whenever Apple changes their UI substantially (the change to the “flat” iOS comes to mind first), but soon I come to wonder how I ever liked anything else. :)
They also (intentionally or not) make many of the changes “reversible” with tweaks and other programs. This allows the people who have a visceral reaction to revert and usually over time it softens, and then you find yourself not bothering with it.
The biggest change for me was the switch to zsh which ended up in me changing my default shell everywhere else.
> how the hacker 'tide' is supposedly turning back against Apple [...] To keep things short, whilst WSL puts Windows miles ahead of where it once was, I find that the same old rough-edges still remain.
This feels like a bit of a false dichotomy, WSL isn't the only alternative.
We are very far from being limited to Windows vs. Mac for the desktop today; it's easier than ever to use Linux or even a BSD as a daily driver on a PC laptop. Even the article in 2005 noted hackers switching to Intel boxes (not Intel Macs) for FreeBSD and Linux at the time.
Well, if you are self-employed maybe. Or for your own personal home machines, sure. But most employers with IT departments wouldn't be too happy if you had a "nonstandard" OS on your work machine because they wouldn't know how to secure it. Heck, many of them don't even allow Macs.
I know what you mean, and that is probably still true for the majority, for the US at least, and in large corporations, but I think the world is changing.
Linux is the standard OS at my work. We use it on servers, so it makes sense to use it on the desktop too for maximum dev familiarity and compatibility, since it now works well there... it's even started to be used by some of the non-devs to reduce windows/mac update fatigue and general performance issues even though they only need office apps... I'm at a small company but I know of other large companies with big IT depts applying the "locked down OS image" approach to Linux for desktop too.
I find the concept of Windows as the only safe OS image IT depts are willing to use fairly antiquated. Much like IE used to be the only browser you were allowed to use in many big old corps and gov agencies, for "security" reasons.
Although this is an aside, the original context of the article was only "hackers" machines, and whatever your interpretation of that is, it probably doesn't involve fortune 500s.
We use it on servers, so it makes sense to use it on the desktop too for maximum dev familiarity and compatibility, since it now works well there...
I am always surprised by these things. I mean, even trying to use something widely-used (nowadays) as Zoom, basically gives me two options (on a HiDPI screen):
- Use X11, but when I share a window and use drawing tools, whatever I draw is displaced by a seemingly constant offset. It's caused by scaling for HiDPI, because if I disable scaling, things work.
- Use Wayland, but I can only share the whole screen. Drawing works, but the overlay is incredibly flaky and often incorrectly registers clicks.
Besides that, on both Wayland and X11, doing a video call makes the fans blow at full speed and quickly drains the battery when the laptop is not plugged in (probably no hardware acceleration of video encoding/decoding). Oh, and I probably want to use wired headphones, because Bluetooth headsets regularly drop out.
Zoom is just a single example, but there are just so many paper cuts if you want to work outside a solo developer context.
(Windows has a lot of issues as well, but at least basic workflows generally work.)
Zoom does seem to be a piece of crap, but it lets you use the browser which seems to be less bad on linux. At my work we all use google meets and only zoom for some external customers, the experience seems to be much worse with zoom in general regardless of the OS - that said, it seems to work OK on Linux in the browser once you actually get into the meeting; getting into the meeting is the hardest part and the worst UX experience ever.
I haven't used HiDPI monitors on linux yet, being able to use linux easily is more important to me for dev, it's a known area of weakness, and I just don't need the extra pixels for dev so it's easier to just avoid it. If you are doing graphics linux is probably not the best OS for the job tbh, and that's a shame but just the way it is currently.
> Zoom is just a single example, but there are just so many paper cuts if you want to work outside a solo developer context.
I suppose it depends upon how much crap software the workplace forces upon their workers, I understand that it's often mandatory to use a bunch of proprietary software for various types of communication and management stuff.. i.e stuff that's not central to actual dev work. If you can get away with a browser, then it's not bad at all... and when there is an option for a native version, use the browser, because usually it's some horrific electron thing that's even worse.
At my work we all use google meets and only zoom for some external customers, the experience seems to be much worse with zoom in general regardless of the OS
I strongly disagree. Zoom on macOS works very well (I currently use it for a remote course). My experiences have been far, far better than with Teams, Google Meet, Skype, Jitsi, etc., most of which I have had to use for work or privately. My wife is hoping that her employer will relent sometime and offer Zoom, because she prefers it so much over Google Meet, whatever Blackboard uses, etc.
It could be that your experience is colored by the Zoom web interface, which I only used once on Linux and it was terrible then.
I haven't used HiDPI monitors on linux yet, being able to use linux easily is more important to me for dev, it's a known area of weakness, and I just don't need the extra pixels for dev so it's easier to just avoid it.
For me it's one of those things that can't be unseen. I happily used lo-DPI screens for decades, but I can't really stand them anymore since I got my first Retina MacBook. Even though I am doing development 80% of the time, I want my fonts to be crisp.
If you can get away with a browser, then it's not bad at all... and when there is an option for a native version, use the browser, because usually it's some horrific electron thing that's even worse.
I guess that's true on Linux. Unfortunately, very few web apps beat native applications like OmniGraffle, the Affinity Suite, or even PowerPoint.
OK, but all of your use cases and priorities are so far away from the original premise of the article that it's clear your needs are more in line with those of a regular Mac or Windows user than a "hacker" or dev... (just saying) it's not for everyone.
Also, which video conferencing app "is the worst" seems to be pretty subjective, but tbh they are all pretty crappy and replaceable; I'm not really sure what this has to do with OS choice.
> i.e stuff that's not central to actual dev work.
That's true for developers that only care about the POSIX CLI and daemons; others among us don't care about that and do graphics and GUI programming instead, where the screens and GPGPUs play a big role.
Same here, 10.6 was my last MacOS, it felt like it lost direction and focus after that.
For a 2009 MBP it was a nice experience trying out FreeBSD and various linux distros before settling on Debian. The only stumbling block on that machine was the macEFI, which there were eventually plenty of solutions for - I hear the later machines are a bit of a nightmare to run anything other than MacOS though.
> It turns out that SIP and Gatekeeper aren't nearly as much of a problem as I was led to believe; neither of these features has hampered me once.
They can also both be turned off! Gatekeeper in 30 seconds and SIP in less than five minutes. I legitimately don't understand why people get worked up about optional features.
I’ve made a serious effort to switch away - using a retina Mac pro with Ubuntu, and a bunch of raspberry PIs.
I love Linux, and I love the Pis, but I can't see any reason to stick with the Ubuntu desktop. When a pro-level M1 or M2 iMac is released, I will be moving back.
> i can’t see any reason to stick with the Ubuntu desktop
Then don't. Linux has hundreds of alternative desktops that you can install and try out in minutes. KDE, i3, Awesome, bspwm, Openbox and many many more are all at your fingertips.
It’s all frankly half baked and clunky, and the time investment in configuration is insane.
Yes, you can find individual features that are better than MacOS spread across them, but you can’t get good design, aesthetics and features in any single one of them.
You can certainly get by with it, but why would I want to just ‘get by’?
Oh yes so much this. 20 years here being told that today is the day of Linux on the desktop. Like hell.
Apple stuff is consistently good. It's not perfect but there's not really anything that kicks you in the balls every day to the point it makes you want to burn all your technology and live in the woods. About 5 minutes with any recent Gnome release on a Linux laptop makes me want to do that.
Linux is always 50% done. That last 30% to get it to Apple's 80% done is boring so they just rewrite everything again or fork some new clothes for the emperor.
The worst experience is windows. There's several layers and each one gets to about 33% done and is replaced with a new one. All the old layers sit there like rings in a tree, occasionally having to be gouged out to fix an obscure issue.
My approach with Linux has been to not even try to make the Linux desktop try and compete with macOS or even Windows. Instead, I really like the "OS as IDE" approach, which has caused me to settle into using most of the Suckless stuff -- dwm, st, dmenu. Live in your terminal and keep each application's function and jurisdiction as small as possible, and Linux is really great. For programming. Only.
You'd probably be surprised to hear that I mostly agree with what you're saying. But here's the difference: MacOS will never feel like home to me. I've used it for months, and tried my hardest to make its workflow feel natural. Every time I try to rationalize a concern with the OS, though, I'm buried by workarounds/paid apps/subscription services that will supposedly fix the issue for me. Unfortunately, that makes the last 20% of MacOS feel so frustrating to fill: you're always taking one step forward and two steps back, fighting against a company that wants to take control away from you with every update.
On Linux, I just have a script that I run that sets everything up the way I want it. I'll admit, it took a few hours to make (and has received a few updates over the years), but it feels much closer to that "100%" mark than a fully-customized Mac to me. Plus, I'd rather not rely on the whims of yet another "move fast and break things" company jockeying to take over more and more of my digital life. I'm good here.
It's a two way street. Why would I want to "get by" without a package manager, or "get by" without 32-bit libraries, or "get by" without a functional graphics API? I'm the user: the choice should be mine. If Apple doesn't present that choice, I don't consider Apple an option. Simple as that. As a developer, these points are non-negotiable. Apple has built computers for 30 years now, they should be well aware of that.
> Why would I want to "get by" without a package manager,
You don’t have to on MacOS.
> or "get by" without 32-bit libraries
Why would you want 32 bit libraries? 32-bit support was also ditched in popular Linux distributions (Ubuntu, Fedora, Red Hat) in the same year as MacOS.
> or "get by" without a functional graphics API?
To claim there is no functional graphics API on MacOS is utterly absurd.
Years ago I started using Macs to have a stable Unix desktop that also played well with commercial software, while keeping Linux on the side as a hobby. Somewhere around 2010 or so I think I finally switched away from Macs to pure Linux after my Mac, which worked just fine, was dropped from support for the latest version of OS X, and by that point Linux on the desktop seemed far more stable and I was working with it much more professionally anyway. Linux started getting really good as a desktop OS and I think it peaked for me around Ubuntu 16.04 and got very much to a "just works" state that I enjoyed with Macs, and Steam was on Linux along with a handful of games I cared about (mostly Civ 5). Then a weird thing started happening and new updates on the same hardware started having weird issues on my laptop, mostly around sleep and power saving. It was possible to fix most of the issues, but the brief "just works" periods seemed to be over. It was still mostly fine on my desktop, and even improved when Valve launched Proton.
Eventually I used my Linux desktop mostly and later my Linux laptop went away in favor of a Chromebook, which supported their beta Linux environment. It wasn't perfect, but sleep/suspend worked and I could do most "Linux" things I needed to do. I also started using a Macbook Pro at work again after being off of OS X for 5 or 6 years and it was just a reminder to me that a Unix-like desktop that didn't need as much fiddling around existed(of course some of you may point out that the fiddling is more restricted on OS X, which is a fair point as well), but the thermal issues of the later Intel Macbooks kept me from buying one for personal use.
So fast forward to last year. I was moving, some other stuff was in flux, and my already long-in-the-tooth desktop needed to be put away for a while, and while my Chromebook was useful, I didn't like relying on it as a primary machine. Since I didn't play too many demanding games anymore, I thought this would be a good chance to kill two birds with one stone and get a laptop capable of playing the meager game library I had, which meant mostly better integrated graphics rather than discrete, and since it was before Intel Xe came out, a Ryzen 4000 series CPU seemed like the obvious choice, with a GPU that was roughly GTX 1050 Ti class, just enough for me. Sadly it appears that AMD pushed forward with some new kind of "hybrid sleep" or whatever they're calling it, and it wasn't supported by Linux very well at all (I'm not sure of the exact state at the time of writing this; there was some talk of support landing in 5.13, but I'm not sure if that happened or if it got pushed out to 5.14). So I did the Windows thing full time for the first time in a very long time with WSL, which I actually grew to like somewhat, but I still wasn't happy with the laptop.
By the end of 2020, MacBooks with the M1 were in the wild and the results seemed very good. And since I was still using a MacBook Pro for work during this time, I still had mostly positive feelings about macOS. It had its peculiarities, but I liked it better than Windows and it required less effort to keep running than Linux; I just didn't like the jet engine that my MacBook turned into whenever I tried to do anything remotely demanding. In fact I've always hated fans, and fan noise was one of the motivating factors in dumping my desktop. My "grail" machine has been a fast laptop with no fans or other moving parts. SSDs have been a thing for a while, but we've not had a really solid-performing fanless machine until the M1 MacBook Air (in the 21st century, I'm not counting the fast-for-their-time fanless machines of the 80s and early 90s). So I felt almost obligated to get that machine, and that's what I did and haven't regretted it for a second (though ask me again in 4 or 5 years if/when Apple starts dropping support for these first-gen ARM devices). It can silently keep up with any work I would do compared to my work 16" MacBook Pro, which runs much hotter and louder, and so far the M1 Air has been able to do what I've needed it to do.
Switching to Windows and expecting an improvement over macOS is silly. I switched to Linux from macOS in 2015 and never looked back. I can dual boot into Windows, but do so less than once a year these days.
This is why Linux exists. I was pretty disappointed by the compatibility/performance of the M1, so I switched back to my desktop with Manjaro Linux installed on it. Suffice to say, I doubt I'll be using my Macbook unless Apple revives 32-bit libraries or switches back to x86 in some capacity.
I'm glad that your MacBook Pro didn't get bricked by the Big Sur update! I heard that there was some issue with the HDMI port firmware, so haven't updated my 2014 MBP.
I have had major issues with trying to replace the battery. Beware of third-party replacements, which idle at 12V. The original battery idles at 2.2V and "wakes up". A voltage spike when waking from battery killed my logic board, and 3 professionals have failed to repair it so far. I've spent over NZ$1900 on 5 replacement batteries to try to get something that works, since September 2020. I just want a laptop that I can use on the bus, with a removable SSD so I won't lose my data when it breaks, and iTunes so I can sync my iPod and iPhone.
Instead of looking for a removable SSD consider setting up some kind of syncing and backup solution. A removable ssd doesn’t protect you from theft or ransomware. I use a combination of time machine, backblaze and syncthing to make sure all my data is synced across my machines and backed up locally and in the cloud.
A removable SSD does ensure that my laptop doesn't become trash after I use it long enough. For many people (myself included) no computer can truly feel "perfect" unless it can be run in relative perpetuity.
The music app in Big Sur still supports syncing to old iPods, I used it with an old 30-pin pre-video iPod nano that I found in a thrift store. It could even properly reformat it from windows FAT32 to Mac HFS format
Chock full of golden material and contrast/comparison points to today:
"If you want to know what ordinary people will be doing with computers in ten years, just walk around the CS department at a good university. Whatever they're doing, you'll be doing."
Your point being that this is completely untrue? Normal people have absolutely not learned how to program/automate using computers. All the companies I've seen from the inside either use Excel for everything or have dedicated personnel for automation. Excel is a good step but it still requires a lot of manual work.
I think it's true, just meant to be taken a little less literally. Things like communicating over the internet, taking notes and, yes, automation are all useful. However, when they start out, the UX is academic and inaccessible to a "normal person". The argument is that it takes ~10 years to construct the right abstractions so that normal people can do those useful things.
> So Dad, there's this company called Apple. They make a new kind of computer that's as well designed as a Bang & Olufsen stereo system, and underneath is the best Unix machine you can buy. Yes, the price to earnings ratio is kind of high, but I think a lot of people are going to want these.
Actually, I am writing this on an Asus laptop with so-called "Bang & Olufsen Technology". The laptop is fine, though showing its age, but the sound quality through the built-in speakers is awful, and always has been - I now use a cheap Anker bluetooth speaker which is far better.
And to be honest, I was never very impressed with the sound quality of B&O way back in the 1970s, when they were seen to be the height of sophistication.
I get the impression that Apple since Jobs' return has actively gone out of its way to try not to have pundits and the general public make the pretty obvious comparisons to B & O or Bose, because they rightly see it as not a flattering comparison for Apple.
Bose’ QC series of headphones is great. Incredibly comfortable and well tuned with noise canceling that’s still very competitive with more recent headphones.
The newer Bose 700 makes too many compromises to look nice IMO.
You should definitely try the newer Bang and Olufsens, if you‘re into high-end audio - like the Beolab 20 or Beolab 28. They‘re incredible, and actually compare favorably to B&W or KEF in the same price range. The laptop licensing shenanigans are awful though.
MacOS was a breath of fresh air to those of us who wanted Office, Adobe, and a more polished experience than Microsoft or Linux offered at the time.
My friend bought one of the last G3 iBooks. It ran Mac OS X 10.2 that was arguably the first “stable” release of Mac OS X.
I had such a great experience I bought a 15” October 2005 PowerBook as my first Apple laptop.
It was underpowered compared to Intel offerings of the day, but I badly wanted Mac OS X. I was thrilled when I could replace it with a 2007 MacBook Pro with a Core 2 Duo processor.
Now, the M1 MacBook Air is so unbelievably fast and power efficient.
> MacOS was a breath of fresh air to those of us who wanted Office, Adobe,
I used MS Word, Excel and Adobe FrameMaker on Windows in the 1990s (much less 2005) with very few problems - I was really impressed at how well such a complex piece of software as Frame ran on Windows, and I don't think it's supported on the Mac anymore.
Is it just me, or did all these "hackers" that apparently bought Macs back then not really produce all that much hacker software (or non-hacker software, come to that)? It seems to me that Windows and Linux both had much greater funds of developer software than did the Mac ecosystem.
As someone that uses both Mac and Windows regularly, I find the quality of the software I use on Mac to be generally higher. Cyberduck (FTP), Transmission (bittorrent), Charles Proxy (packet sniffing), Sourcetree (git gui), Pixelmator (image editing), Sketch (vector graphics), Things (todo), 1Password (password management) are all solid applications with polished user interfaces, where I struggle to find anything quite as nice on Windows (several of these have been or are being ported to Windows, with varying levels of polish carrying over).
This is a problem on Linux too. The lack of a Sketch equivalent for Linux bit me just the other day. Akira is the closest thing, but it’s so early in its dev cycle that it won’t be useable for several more years. A lot of people will point to Inkscape, but it’s much more of an Illustrator alternative and as such feels more geared for print.
I might’ve been able to fudge Figma to do what I needed, but even that wouldn’t have been ideal, with how it’s designed for collaborative prototyping more than a generic digital vector graphics editor. Aside from that, I don’t like using Figma simply because they don’t publish specs for their file format and try to lock you in (as opposed to Sketch, which uses a publicly documented file format that has plenty of converters).
As you say, several of these are also available on Windows - I can't think why anyone developing such software would not make them available on multiple platforms.
> it’s next to impossible to create a great cross platform app with a native feeling UI
And who do we have to thank for that? Perhaps there's some business out there that made a chocolate-themed UI toolkit that was notoriously proprietary and unworkable, so much so that it drove the industry into a software drought that only let up when x86 was fully standardized?
Sometimes because of APIs? Back when I had a Mac forced on me, I was surprised how fast you could discover files and software, something Windows back then simply had not managed to offer via its API. The 'finder', or whatever that tool was called, simply felt wrong on Windows, nothing like the original.
One different example may be the GNOME Shell desktop. Many of us who use it have basically moved past the Win95 approach to desktop design. There is, as far as I know, no equivalent on Windows or Mac, and a port would not be possible since you can't replace their window managers.
Then there is the whole Linux standard toolset, which, yes, works on Windows these days, but obviously feels very foreign and doesn't integrate well with the system.
It's not only a matter of porting software when some things are so fundamentally different.
CyberDuck (macOS) is inferior to Ipswitch WS_FTP (Windows). The CyberDuck interface is awkward, it's confusing to tell where your uploads and downloads will wind up.
Cornerstone (macOS Subversion client) is atrocious, God help you if you created a sandbox in the wrong place, deleting it could wipe out your repository. Tortoise (Windows) is so much better.
As a Windows developer recently porting my code to macOS, I am underwhelmed. I expected to be blown away.
Right, that was my point. The comment I was replying to said that all those hackers that supposedly switched to Mac in 2005 didn't seem to have produced any notable software, so I listed a whole bunch of nice software that they had produced.
There seems to be a lot more interesting software on the Mac than Windows or Linux and I've never really understood that.
Think about how many Mac enthusiast podcasts, websites, and blogs there are. Windows should have many more of these, but they don't. And Linux enthusiast sites are very technical and low level. There's no writing about being a Linux user that isn't also about being a developer.
Why are there no companies like the Omni Group, Cultured Code, and Panic turning out great applications for Windows? In theory, the market is an order of magnitude bigger, yet the reality seems to be that there are fewer Windows users willing to pay for great software. Is it because Windows users are not on that platform by choice?
Desktop Linux users are definitely there by choice, yet they too seem to be unwilling to pay for software.
> Why are there no companies like the Omni Group, Cultured Code, and Panic turning out great applications for Windows? In theory, the market is an order of magnitude bigger, yet the reality seems to be that there are fewer Windows users willing to pay for great software. Is it because Windows users are not on that platform by choice?
There are such companies! For video games. Video games and B2B/business-productivity software is where the money is on Windows. Your buyers are PC gamers or businesses.
There are also lots of unsophisticated low-needs users who are on Windows at home because it's what was cheap at the big box store, and/or they learned in the earlier PC days to shop based on spec numbers back when those mattered much for basic home users, and the Apple section of the store doesn't look very interesting if you shop that way.
AFAI can tell (from watching relatives fitting this profile, which is... all of my relatives, plus many relatives of friends) the only way to make money off them, since the death of actual boxed software in stores (which they did used to sometimes buy!), is to be Microsoft and use Win10 to spam them until they buy (maybe not even realizing what's happening or, once they've paid, what they bought or what it's for or how to ever use it—yes, really) or be a straight-up malware scammer type. The problem with those users, if you want to sell them software, is they basically just want/need a browser, MS Office, and perhaps something to play the unorganized folder of pirated MP3s of 50s-90s music the unsophisticated-user side of the family has been passing around for 20 years. They may want something to organize photos and such, but they'll never figure out how to use it right even if you show them how, unless it's high-lock-in, very automatic ("AI" tagging and such), and probably cloud-based (so, likely some Web crap, not desktop software, oh and because advertising and hoovering up user data to train AI is an endless money spigot if and only if you already have massive scale you'll likely be competing on price with "free", so, have fun with that).
> Desktop Linux users are definitely there by choice, yet they too seem to be unwilling to pay for software.
Developing for Linux means receiving support load for all its desktop brokenness (driver problems [mostly video drivers]; issues with any basic hardware+software systems your software relies on to work well for all but the most basic operation, like audio for example; xorg/wayland/DE instability) and every crazy configuration out there.
Sure you can say "fuck off, we support exactly and only [a couple major desktop distros] with the default config that you get if you keep clicking OK on the OS installer and then not changing anything, and only on [list of hardware], report on any other set-up and you will be ignored" but you're already on such a tiny platform that you can hardly afford to annoy/turn-off/generate-vocal-anti-fans-among users. So instead... you develop for literally anything else, or maybe toss out some electron garbage to Linux as an afterthought.
Linux is expensive per-user and doesn't bring you that many users to begin with. Plus, yeah, there may be less willingness to pay for software.
I'm glad you pointed out video games because that's a great counterpoint.
It just seems strange that with a billion desktops, there aren't a few million who would buy software like OmniFocus or Things if it existed on Windows.
Linux userland tools in the last 10 years are incredible; it's not easy to draw a direct line between that and the Mac, but I'm relatively certain there is a correlation.
I'm talking about castnow, youtube-dl and a whole slew of other software which integrates with proprietary things. In the 00s, things like those almost all ran exclusively on Windows.
From my purely anecdotal experience of attending/organizing a few 500-person-plus hackathons and dozens of smaller ones, at least half of the people I meet there use Macs. It's definitely still in fashion.
I switched to an M1 Mac last year after nearly 4 decades of low-level x86 work, incl. drivers, rootkits, and the like. These days I code in higher-level languages such as Rust or, ehm, Python, and the Mac offers a fine experience. Ironically, what pushed me over the fence was the pandemic; most of the video conferencing stuff causes Windows laptops to sound like vacuum cleaners.
The Mac's main advantage to devs/hackers is Darwin, and we mostly use it for Linux-compatible software. Heck, I switched from FreeBSD to OS X just to get better laptop hardware and peripheral support.
Most stuff we produce is *nix compatible. I'm no great hacker, but everything I wrote to make my life easier is shell based.
I think Windows with WSL will have the same impact, with more and more *nix compatible software being produced.
On the contrary, OS X initially supported Java via the Java Bridge and Apple's own JVM implementation, because Apple wasn't sure whether a developer community educated in Object Pascal and C++ would be willing to touch Objective-C, so they played it safe by bringing Java along for the ride, doubling down on Java uptake.
Objective-C was embraced by the developer community to such a degree that they quickly deprecated the Java Bridge and eventually got rid of the Java development costs altogether.
I remember running Java applications on a PowerPC G4. It was incredibly slow and ugly; no wonder developers still preferred Objective-C over Java on the Mac.
Then that was certainly not Apple's Java, as it had a good JIT compiler with a code cache and the same L&F as Objective-C applications; that definitely wasn't it.
I remember Java IDEs like Eclipse or NetBeans being incredibly slow and ugly compared to the same software on similarly priced computers running Windows or GNU/Linux, or compared to more native software like Xcode or TextMate. But perhaps the PowerPC G4 was simply a bad CPU for a JVM, and Java developers were not very good at designing UIs for the Mac.
I also remember having to wait a long time whenever the JVM was starting. It was annoying when a website contained a Java applet, because my web browser would be stuck until the JVM was ready.
It is. Its modality was an anti-inspiration, as in... how the hell can we not have modes (a GUI)... sort of... then everyone started burying crap under menus through feature accretion... hence we go full circle, because people get fed up with bloat, leading to both UX fatigue and performance fatigue, which eventually gives you a kind of appreciation for simple modal interfaces.
People really don't want to believe that we are chemical reactions and that Apple used psychology tricks on us through marketing campaigns.
It's a lot easier to justify buying an Apple product with an excuse than to realize that relentless advertising is what caused people to give Apple money.
I don't buy this. There was something special in that time of having a Unix system with a well considered cohesive UI. Windows was not an option, and 1 click install set-and-forget linux distros weren't (aren't? :P ) really common.
There was a clear gap in the market for a Unix system that was up and running with minimal fuss; Apple capitalised on (or accidentally happened upon) this.
Exactly! Psychology tricks would include supporting Linux commercially, which is a wildly successful business (even before counting the likes of AWS, which supports an OS as part of hosting).
They by definition market their product and win sales, and those sales funded Linux and a lot of OSS beyond what hobbyists could possibly have done; an inconvenient truth.
People who want to support their own installation of Linux or FreeBSD do so without considering a commercial OS and a few of them constantly shitpost about how anyone could possibly not make the same choice as they do.
I'm not sure about this. Devs are by far not the mass market of the PC industry; definitely not enough to achieve large adoption. Office users, casual users, students, etc. may not even know what a "Unix system" is. For them, marketing, being in the "cool crowd", aesthetics, and broad, easy compatibility are, IMO, the things that make them more comfortable making a purchase. Herd behavior is real, and the lines in front of stores, everyone they know talking about the products, etc. just drive more purchases. It makes users feel more secure and included in their purchase of your device.
More just an anecdotal opinion, but to me the iPad and iPhone (the ecosystem effect) and Apple's marketing are really what gave them the leg up on the personal computing level. I still remember those "I'm a PC, I'm a Mac" ads pitching the PC as the uncool choice. That's a powerful message to people, and they must have thought it would work, otherwise they wouldn't have run that ad; if not aimed at devs, then at the general public. Individuals were buying them locally as home machines way before companies really allowed them for work, because of this marketing and the easy integration with their other consumer devices; at least that's what I saw locally here (not US).
I was specifically responding to the Developer angle of Mac buying. If you want to talk mainstream it was clearly the iMac then the iPod then the iPhone then the iPad that captured society.
I would argue they aren't that separate. If the Mac hadn't got popular with the consumer market, it could be said that Linux distros could have been the popular choice for that market, as they're free, and the Apple laptop ecosystem could have languished. The success on the consumer side fuels the viability of the laptop on the dev side as well.
If it had never got popular on the consumer end and I couldn't manage my office docs and such on it, then while I might use it at home, many workplaces, and therefore the mainstream dev, might never have adopted the Mac. What I use at "home" is what I want to use when I come into the workplace. There needs to be a critical mass before most workplaces (where many devs work) adopt it and an ecosystem develops around it (e.g. your favourite IDEs, toolchains, etc.).
So that period of the Unix-based Mac started around 2002ish (maybe earlier), and was, as you say, unchallenged until 2007 by Dell. Even then, Ubuntu would still have problems with printers/external screens etc. (printers back then were more important than they are now, too).
Sometimes users have problems installing Linux, not because of lack of ability, but because of lack of driver support. This is especially true for laptops. I remember installing Linux on my desktop in 2004 as an 11th grader, and I remember having to purchase a $50 serial port modem at Fry's (yes, my parents were still on dial-up at the time) because the modem that shipped with my PC was a "Winmodem," one that had much of its core functionality implemented in software that was only available for Windows.
After my freshman year of college ended, in the summer of 2006, I replaced my desktop (which was dual-booting Windows XP and FreeBSD) with a Core Duo MacBook, which had been released just two months before. It felt great using a Unix machine where I didn't have to worry about driver support, and where I could run Unix applications and various proprietary software packages such as Microsoft Office without dual-booting, emulation, or virtualization.
I've stayed a Mac user since, though lately I'm in the slow transition of switching away; I just replaced my 2013 MacBook Air with a Microsoft Surface Pro 7 running Windows 10 (I love WSL!), and I plan to replace my 2013 Mac Pro with a Ryzen 7 or 9 build sometime in 2022 or 2023, which will most likely run FreeBSD. In the interim I've installed many Linux and FreeBSD systems for work and for play.
I have also installed and managed many Linux systems. That doesn't mean I want to on my main driver. I also don't want a cheap plastic computer with a terrible trackpad (regardless of how great the internals are), and until recently, that was pretty much all you could find in the non-Mac laptop world.
Yes, I'm sure a good majority of us were mucking with Linux when we were 14... that didn't change the slickness of OS X as a Unix daily driver. It took Linux UIs 10+ years after that to get competitive, and because of the lack of corporate software (Microsoft, et al.) it's still lacking.
Nah dude, I installed Slackware 7 in 1999 when I was 13, and used Linux up until 2003. Then I bought a G4 PowerBook and never looked back. People who chose OSX for its UNIX foundation weren't too dumb and stupid to use Linux, contrary to what you believe.
"In 1994 my friend Koling wanted to talk to his girlfriend in Taiwan, and to save long-distance bills he wrote some software that would convert sound to data packets that could be sent over the Internet. We weren't sure at the time whether this was a proper use of the Internet"
I also have a girlfriend in Taiwan, and met her while working there for 4 years, at a job partly made possible by PTT BBS (a very old Telnet-based forum). I'm so grateful for VoIP, fast Internet connections, and being able to chat with her every night (at 5:20 of course, which sounds like 我愛你 in Mandarin).
"If you want to attract hackers to write software that will sell your hardware, you have to make it something that they themselves use. It's not enough to make it 'open.' It has to be open and good."
Are there any such platforms nowadays, that aren't restricted by a walled garden? (says he, typing this on a 2014 MacBook Pro running 10.13, while charging his iPhone 4S running jailbroken iOS 6.1.3).
Yes, I remember the shift around that time. I had my first contact with a Mac at school in ca. 2001. OS X had just been released and the school received a new order of iMac G3s as lab computers. What can I say, I hated the machines. Everything was so weird and ugly. It didn't help that the teachers had to run OS 9 for compatibility with the school software. I had a brand new WinXP desktop machine at home which was leaps and bounds ahead of the iMacs ;) well, for me anyway. The only thing I really liked was iTunes. That is a story on its own ;).

It took a few more years for me to come around. It was with the release of Snow Leopard that I saw that macOS was way better than Windows. I switched to the Mac professionally in 2011 and still use it as my daily driver. But my love for the system that sparked with Snow Leopard dwindled. I didn't want to update to Catalina (I was forced in the end due to a new machine I got, and, well, Xcode) and I still can't stand the UI of Big Sur. I switched to Arch Linux on my personal machine but have to stay on macOS professionally, as the only other option my company IT can support is Windows. macOS is no longer the high ground for developers; WSL and Apple itself saw to that. But Windows is still a no-go area as a daily driver for me.
Apple pumping iBooks into schools was a big coup for them in the early 2000s. A lot of people my age got into Macs and never got out of them, just because of that early exposure.
Yep. For me, I always “dealt” with Windows at school. Computers were just a device you were meant to wrestle with to get a job done and that was just an immutable truth.
Then I moved to a new middle school that would bring in carts of iBooks for us to do work on. Some things were weird (lack of a right click for one), but other things just felt incredibly natural and the design didn’t scream “soulless device for office drones.”
When my parents got me my first computer the next year, it was a Mac, and I’ve stuck with Apple for 17 years since then.
I'm from Germany. Here Apple was never a huge player; I learned of its existence very late. My school was a so-called OSZ (Oberstufen Zentrum), which is something like a school where you can earn work-related diplomas; I'm really bad at describing this. They had a design and print department, and it ran entirely on Apple.
One of our developers had used Macs since the pre-Intel days. He used to develop mostly Mac-native apps. In recent years it has been mostly web dev on Linux.
He ran Linux through Vagrant on the Mac. Now he does the same on the PC. He says that the hardest thing about moving is that he lost his browser shortcuts.
PS: he has a company-provided PC plus Intel and M1 Macs on his home desk, so really he can use whatever he wants.
Oddly enough it was around 2005 that I was using one platform at my workplace (Windows) and another at home (Linux). And I've used Apple II, MS-DOS, 68k Mac, etc. I decided two things:
1. Any time spent learning the "innards" of an OS would be spent on Linux. Most of that learning has resulted from tinkering with the Raspberry Pi.
2. All of the software that I use on a daily basis would be platform independent, especially my programming tools.
As a result, right now I have the luxury of being ambivalent about platforms. I actually spend remarkably little time interacting with the platform, mainly setting up networking when I get a new machine. I can choose a new computer based on ergonomics and cost. Windows happens to have the best touch screen support right now, and refurb'd computers are not intolerably expensive.
Correct me if I'm misreading the article, but it's interesting that the differentiators Graham cites are relatively low-level (i.e. kernel stuff, CPU architecture, and OS hacking), whereas almost everyone in these comments is making statements about GUI elements and even display switching. Kinda goes to show just how abstracted away the concerns of even 16 years ago have become to today's ~devs~
Really love this footnote: "[1] These horrible stickers are much like the intrusive ads popular on pre-Google search engines. They say to the customer: you are unimportant. We care about Intel and Microsoft, not you."
The tipping point wasn't anything to do with OS X, IMO. It was iOS. Still, my absolute biggest grudge against Apple is how shitty and locked-in the iOS ecosystem is. I haven't wanted a Mac as a developer machine for the last 5 years, but since I have to support iOS, I don't have a choice. I had a year's "break" where I actually needed a Windows machine and it was a dream. The touchscreen was so incredibly useful.
While I have no issues with Apple/MacOS (we own several) or Windows (I think the nit-picking is silly) I still remember what Apple did in the PowerPC transition. That left a level of distrust for the company and how it makes decisions.
Context is important here. I wasn't running my own business at the time. I was working for a company that had somewhere around 250 Macs and maybe 10 or 20 PC's.
What happened?
Apple made the transition and, as a result, all software and hardware this company had invested in became obsolete, virtually overnight. We are talking about a non-trivial amount of money and resources.
I saw and experienced the pain that caused first hand. From that point forward I always had this in the back of my mind. As I moved to run my own business with limited funds, the last thing I wanted to face was making any investment that could be subject to that kind of a pole-shift effect. Macs, for the most part, were out.
It's interesting to see the level of nit-picking people on HN tend to apply to a PC running Windows. I think things change when you are responsible for your own bottom line and have to get practical. There's nothing wrong with the hardware or software. At least nothing wrong enough to be a deal-breaker. The proof? Probably tens to hundreds of millions of companies running all kinds of businesses just fine using PCs. Compatibility, long-term viability, cost, and cost of ownership (repairs!) are far more important than being able to right-click an icon to get a convenient function to work.
Microsoft/Windows has always been about long term compatibility. That means things evolve slowly. That's OK.
Aside from that, at least in our case, the engineering software we run won't work on anything else. In some industries you have no other options.
The Linux question and WSL: I don't understand the complaints. I run multiple Linux virtual machines on any of our powerful Windows desktops or laptops. No issues whatsoever. Some of us dual-boot. Other than the usual Linux hardware issues, no problems at all. In fact, we carefully select our hardware during builds (or when buying laptops) in order to ensure the greatest level of compatibility with both the Windows and Linux software we use.
If there is a solid justification for using an Apple machine, I am all for it. That's why we have several of them. No issues at all. I just don't think the nit-picking is valid or useful any more. If you are in business you just want to get shit done. There's nothing seriously wrong with quality PC hardware and the software ecosystem that runs on it, Linux or Windows.
> Apple made the transition and, as a result, all software and hardware this company had invested in became obsolete, virtually overnight. We are talking about a non-trivial amount of money and resources.
Are you talking about the 68k to PPC transition? Because Apple made an enormous amount of effort to ensure both forward and back compatibility so I don't think your statement is true at all. "Fat binaries" allowed new software to run on 68k systems for years after the transition - you could even run 68k Mac apps on OS X for a few years!
And future versions of MacOS remained compatible with 68k Macs up until (if my memory serves) Mac OS 8.1. Even if you didn't upgrade to OS 9, most software ran on 8 anyway just fine.
They had to draw the line somewhere, but you had a long time to deal with the transition thanks to Apple's efforts.
The company had very expensive media software suites that became incompatible. This isn't about word processors and browsers but rather about essential, and sometimes expensive, business tools.
For me it's more: Exodus from the Mac.
Or even better: Return of the Linux-converted Mac.
These are great machines that can be used for many more years if transformed.
The software philosophy doesn't relate to the (end of the) world we are living in.
The people who can afford them are an elite, and that is not the way to go.
Is Tim Cook planning to go to space just for fun too?
We have a responsibility to the planet!
I don't agree with this, based on what I've seen generally on the web.
The article was about *hackers* moving from a platform to OS X.
With the M1, I saw a lot of content producers, web devs, ... move to it, but not so many hackers, especially as the platform is *very* restrictive in terms of APIs / apps / etc.
A lot of my friends moved from macOS to a GNU/Linux distribution recently, especially as non-Apple hardware in general is more and more "as good" as a MacBook's.
But, again, this is my own opinion based on what I read and saw - I do not have stats to show how many hackers just passed from macOS to anything else (or stayed on an Intel powered macbook).
Nope, it's completely true. I left MacOS when they ditched support for 32-bit software, and was delighted to find that 95% of my software "just worked" on Linux. If you're looking for a true Unix box, you're shooting yourself in the foot by buying a Mac, especially today.
"the hacker" wants a suite of coreutils that are updated constantly. They want a package manager and an extensible system. They don't want their computer to second-guess them, and they don't want a large corporation to decide what's right for them.
This has nothing to do with the false claim about restrictive APIs.
> If you're looking for a true Unix box, you're shooting yourself in the foot by buying a Mac, especially today.
This has nothing to do with the false claim about restrictive APIs.
> "the hacker" wants a suite of coreutils that are updated constantly. They want a package manager and an extensible system. They don't want their computer to second-guess them, and they don't want a large corporation to decide what's right for them.
This is an ideological statement that has nothing to do with the false claim about restrictive APIs.
All of what you said may be true, but it doesn’t make the claim about restrictive apis any less false.
I agree. I used to do low-level Windows kernel hacks such as rootkits and custom memory and network sniffers, and that stuff doesn't travel, because macOS is much better designed. But for pretty much any other type of hacking, the good or the nasty stuff, one can make do with just about any platform. People make the mistake of assuming that if the OS is restrictive then they can't hack. Literally the opposite is true: if you want to truly hack, get an OS that tries to resist it :-) And for anything more common, like network-based hacking, using custom peripherals and the like, macOS is just fine. So no, choosing a platform for hacking is mainly a matter of personal preference.
Nothing, really. There's plenty that's restrictive about one of the several app stores on the platform (the one offered by Apple), but that's a store.
Actually using the OS, you can do pretty much anything you want, up to and including installing practically everything ever written for Linux (entire window managers and the whole lot; I've seen full GNOME installs on top of macOS). About the only thing you can't do is modify the proprietary binary blobs they give you, but that's just commercial software 101.
This dark idea of an "authoritarian apple" is the same sort of "conspiracy fantasy" that people project onto the motives of political parties they don't like. It's the same sort of "leaps of logic" to assume "oh, yeah, they restrict app store apps from adding kernel extensions, so — they must hate the idea of you having control over anything, so obviously you don't have root access to your own machine.". "Or okay, yeah, you still have that, but obviously they're about to rescind that, any decade now." They're not.
The prediction is wrong, because it's an extrapolation from a bad starting point — which is a total misread of their motives. The motive isn't about authoritarian control; it's about eliminating footguns.
---
For example, it's awful nice to be able to have a power failure, have my machine boot back up, and see all of my windows from the prior session right where I left them, and even have all the data in them refreshed from an autosave. As a system API, not just a per-app thing. Or, it's nice to install a program, and know that doing so is totally self-contained; it's not barfing a bunch of (potentially incompatible) new library dependencies into /usr/ or whatever that could screw with something else I've got installed. That I don't have to sweat over it "altering something" in my system when I do the install process. There are a lot of things like this; things where I basically feel like I've got a "wingman" or someone watching my back, because the folks who wrote it were primarily concerned about designing it so I'm highly unlikely to screw myself (i.e. the polar opposite of `rm -rf`).
It's just way lower stress to be able to focus on the actual problem I'm working on instead of also having to second-guess if my machine's going to betray me. That peace of mind isn't just a fluffy emotional thing; it also reduces cognitive load so I can work more effectively.
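The window-restoration point above maps onto a real AppKit mechanism. Purely as an illustration (a minimal sketch, not the commenter's code; the class name, nib name, and identifier below are made up), a window can opt into the system's state restoration roughly like this:

    import Cocoa

    // Minimal sketch: opting an AppKit window into macOS state restoration,
    // so it reappears (frame, screen, restorable state) after a relaunch,
    // reboot, or power failure. Names here are hypothetical.
    final class DocumentWindowController: NSWindowController, NSWindowRestoration {

        // Called by the system on the next launch to recreate the window.
        static func restoreWindow(withIdentifier identifier: NSUserInterfaceItemIdentifier,
                                  state: NSCoder,
                                  completionHandler: @escaping (NSWindow?, Error?) -> Void) {
            let controller = DocumentWindowController(windowNibName: "DocumentWindow")
            completionHandler(controller.window, nil)
        }

        override func windowDidLoad() {
            super.windowDidLoad()
            window?.isRestorable = true   // let the system persist this window's state
            window?.identifier = NSUserInterfaceItemIdentifier("document-window")
            window?.restorationClass = DocumentWindowController.self
        }
    }

The system persists the window's frame plus any app-provided restorable state and calls the restoration class back at the next launch, which is what makes the "all my windows come back after a power failure" experience a system-level feature rather than a per-app hack.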