Subpixel font rendering is critical for readability but, as the author points out, it's a tragedy that we can't get pixel layout specs from the existing display standards.
Only on standard resolution displays. And it's not even "critical" then, it's just a nice-to-have.
But the world has increasingly moved to Retina-type displays, and there's very little reason for subpixel rendering there.
Plus it just has so many headaches, like screenshots get tied to one subpixel layout, you can't scale bitmaps, etc.
It was a temporary innovation for the LCD era between CRT and Retina, but at this point it's backwards-looking. There's a good reason Apple removed it from macOS years ago.
Even on standard resolution displays with standard subpixel layout, I see color fringing with subpixel rendering. I don't actually have hidpi displays anywhere but my phone, but I still don't want subpixel text rendering. People act like it's a panacea, but honestly the history of how we ended up with it is pretty specific and kind of weird.
> ...I see color fringing with subpixel rendering.
Have you tried adjusting your display gamma for each RGB subchannel? Subpixel antialiasing relies on accurate color space information, even more than other types of anti-aliased rendering.
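For anyone wondering why gamma matters here: antialiased coverage has to be blended in linear light, or the fringe pixels come out perceptually too dark or too light per channel. A toy sketch of the difference (plain Python, using a simplified single power-law gamma rather than a real per-channel ICC pipeline):

```python
# Sketch: why gamma matters for antialiased blending. Mixing glyph
# coverage directly on gamma-encoded (sRGB-ish) values skews the
# perceived weight of fringe pixels; correct blending decodes to
# linear light, mixes, then re-encodes for the display.

GAMMA = 2.2  # simplified power-law model, not a full sRGB transfer curve

def to_linear(v):
    # gamma-encoded [0, 1] -> linear light
    return v ** GAMMA

def to_encoded(v):
    # linear light -> gamma-encoded [0, 1]
    return v ** (1.0 / GAMMA)

def blend_naive(fg, bg, coverage):
    # Wrong: mixes gamma-encoded values directly.
    return fg * coverage + bg * (1.0 - coverage)

def blend_linear(fg, bg, coverage):
    # Right: mix in linear light, then re-encode.
    lin = to_linear(fg) * coverage + to_linear(bg) * (1.0 - coverage)
    return to_encoded(lin)

# 50% coverage of black text on a white background:
print(blend_naive(0.0, 1.0, 0.5))   # 0.5 -> renders too dark
print(blend_linear(0.0, 1.0, 0.5))  # ~0.73 -> perceptually mid-grey
```

With subpixel AA the error lands on individual R/G/B channels instead of whole pixels, which is exactly what shows up as color fringing when the display's per-channel response doesn't match what the renderer assumed.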
> the world has increasingly moved to Retina-type displays
Not my world. Even the display hooked up to the crispy work MacBook is still 1080p (which looks really funky on macOS for some reason).
Even in tech circles, almost everyone I know still has a 1080p laptop. Maybe some funky 1200p resolution to make the screen a bit bigger, but the world is not as retina as you may think it is.
For some reason, there's actually quite a price jump from 1080p to 4k unless you're buying a television. I know the panels are more expensive, but I doubt the manufacturers are actually paying twice the price for them.
My desktop monitor is a 47” display … also running at 4k. It’s essentially a TV, adapted into a computer monitor. It takes up the whole width of my desk.
It’s an utterly glorious display for programming. I can have 3 full width columns of code side by side. Or 2 columns and a terminal window.
But the pixels are still the “normal” size. Text looks noticeably sharper with sub-pixel rendering. I get that subpixel rendering is complex and difficult to implement correctly, but it’s good tech. It’s still much cheaper to have a low resolution display with subpixel font rendering than render 4x as many pixels. To get the same clean text rendering at this size, I’d need an 8k display. Not only would that cost way more money, but rendering an 8k image would bring just about any computer to its knees.
It’s too early to kill sub pixel font rendering. It’s good. We still need it.
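The effective-resolution tradeoff above can be illustrated with a toy version of the technique. This is a deliberately naive sketch in plain Python; real implementations (ClearType, FreeType's LCD mode) also low-pass filter across neighboring subpixels to tame exactly the fringing this produces:

```python
# Toy subpixel rendering: a glyph rasterized at 3x horizontal
# resolution is folded into RGB triples, so each subpixel carries
# its own coverage sample -- tripling effective horizontal
# resolution without adding physical pixels.

def subpixel_row(coverage_3x):
    """Map a row of 3x-wide coverage samples (0.0-1.0, black text
    on a white background) to (r, g, b) output pixels, assuming a
    left-to-right RGB stripe layout."""
    assert len(coverage_3x) % 3 == 0
    pixels = []
    for i in range(0, len(coverage_3x), 3):
        # Each subpixel's brightness is just 1 - its own coverage.
        r, g, b = (1.0 - c for c in coverage_3x[i:i + 3])
        pixels.append((r, g, b))
    return pixels

# A vertical stem one subpixel wide, landing on a green subpixel:
row = subpixel_row([0.0, 1.0, 0.0,   0.0, 0.0, 0.0])
print(row)  # [(1.0, 0.0, 1.0), (1.0, 1.0, 1.0)]
```

The first output pixel is magenta rather than grey, which is the unfiltered fringing other commenters mention; the filtering pass trades some of that sharpness back for color accuracy.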
Reading this message on a 4k (3840x2160 UHD) monitor I bought ten (10) years ago for $250usd.
Still bemoaning the loss of the basically impossible (50"? I can't remember precisely) 4k TV we bought that same year for $800usd when every other 4k model that existed at the time was $3.3k and up.
Its black point was "when rendering a black frame, the set appears 100% unpowered" and its white point was "congratulations, this is what it looks like to stare into baseball stadium floodlights". We kept it at 10% brightness as a matter of course, and even then, playing arbitrary content obviated the need for any other form of lighting in our living room and dining room combined at night.
It was too pure for this world and got destroyed by one of the kids throwing something about in the living room. :(
Because Apple controls all their hardware and can assume that everyone has a particular set of features, without caring about those who don't. The rest of the industry doesn't have that luxury.
The artifacts created by subpixel AA are dumb and unnecessary when the pixel density is high enough for grayscale to look good. Plus, with display scaling, subpixel AA creates artifacts. (Not like display scaling itself doesn't also create artifacts - I cannot tolerate the scaling artifacts on iPad, for example)
Apple cannot guarantee the pixel density will actually be high enough. They make computers and tablets that can attach to any external monitor.
macOS looks *awful* on anything that isn't precisely 218ppi. Other than Apple's overpriced profit-machine displays, there are two displays that reach this: LG's Ultrafine 5K, and Dell's 6K (with its ugly, extraneous webcam attached to the top). Other 6K monitors were shown at CES this year but so far, I haven't actually found any for sale. EDIT: Correction, LG apparently doesn't sell the 5K Ultrafine anymore, at least on their website.
That means, the odds are incredibly high that unless you buy the LG, or drop a wad on an overpriced Studio Display or the even worse valued Pro Display, your experience with macOS on an external monitor will be awful.
That's even before we get into the terrible control we have in the OS over connection settings. I shouldn't have to buy BetterDisplay to pick a refresh rate I know my display is capable of on the port it's plugged into.
The funny thing is that in some ways it's true. Modern phones are all retina (because even 1080p at that size is indistinguishable from pixelless). Tablets, even cheap ones, have impressive screen resolutions. I think the highest-resolution device I own may be my Galaxy Tab S7 FE at 1600x2500.
Computers, on the other hand, have stuck with 1080p, unless you're spending a fortune.
I can only attribute it to penny-pinching by the large computer manufacturers: with high-res tablets coming to market at Chromebook prices, I doubt they'd be unable to put a similarly high-res display in a similarly sized laptop without bumping the price up by 500 euros like I've seen them do.
> like screenshots get tied to one subpixel layout
we could do with a better image format for screenshots - something that preserves vectors and text instead of rasterizing. HDR screenshots on Windows are busted for similar reasons.
It looks like the DisplayID standard (the modern successor to EDID) is at least intended to allow for this, per https://en.wikipedia.org/wiki/DisplayID#0x0C_Display_device_... . Do display manufacturers not implement this? Either way, it's information that could be easily derived and stored in a hardware-info database, at least for the most common display models.
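Deriving a stable key for such a database from the EDID blob is straightforward, since the vendor and product fields have a fixed encoding. A minimal sketch (the decode of bytes 8-11 follows the EDID spec; `SUBPIXEL_DB` and its entries are hypothetical examples, not real data):

```python
# Sketch: derive a vendor/product key from an EDID blob and look up
# the panel's subpixel layout in a local database. Bytes 8-9 encode
# three 5-bit letters (the PNP vendor ID, big-endian); bytes 10-11
# are the product code (little-endian). SUBPIXEL_DB is made up.

def edid_identity(edid: bytes) -> str:
    """Return 'VND:product' from the EDID vendor/product fields."""
    word = (edid[8] << 8) | edid[9]
    letters = [(word >> shift) & 0x1F for shift in (10, 5, 0)]
    vendor = "".join(chr(ord("A") - 1 + v) for v in letters)
    product = edid[10] | (edid[11] << 8)
    return f"{vendor}:{product:04x}"

# Hypothetical database; a real one would be community-maintained,
# like the EDID quirk lists that already exist in display drivers.
SUBPIXEL_DB = {
    "DEL:41a3": "rgb-horizontal",
    "SAM:0e5a": "bgr-horizontal",
}

def lookup_subpixel_layout(edid: bytes, default="unknown") -> str:
    return SUBPIXEL_DB.get(edid_identity(edid), default)

# Minimal fake EDID with vendor 'DEL' (0x10AC) and product 0x41a3:
fake = bytearray(128)
fake[8:12] = bytes([0x10, 0xAC, 0xA3, 0x41])
print(lookup_subpixel_layout(bytes(fake)))  # rgb-horizontal
```

A database like this would still need to account for rotation and panel variants, as the replies below-thread point out, but the identity key itself is cheap to obtain wherever the OS already reads the EDID.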
I don't think any OS exposes an API for this. There's a Linux tool I sometimes use to control the brightness of my screen that works by basically talking directly to the hardware over the GPU.
Unfortunately, EDID isn't always reliable, either: you need to know the screen's orientation as well or rotated screens are going to look awful. You're probably going to need administrator access on computers to even access the hardware to get the necessary data, which can also be a problem for security and ease-of-use reasons.
Plus, some vendors just seem to lie in the EDID. Like with other information tables (ACPI comes to mind), it looks almost like they just copy the config from another product and adjust whatever metadata they remember to update before shipping.
I don't understand why; this has been a thing for decades :(. The article is excellent and links to this "subpixel zoo" highlighting the variety: https://geometrian.com/resources/subpixelzoo/
“Tragedy” is a bit overstating it. Each OS could provide the equivalent of Windows’ former ClearType tuner for that purpose, and remember the results per screen or monitor model. You’d also want that in the inevitable case where monitors report the wrong layout.
Subpixel rendering isn't necessary in most languages. Bitmap fonts or hinted vector fonts without antialiasing give excellent readability. Only if the language uses characters with very intricate details such as Chinese or Japanese is subpixel rendering important.
HN username of an extremely talented software engineer and software-engineering communicator, justine.lol. Probably best known around here for her cross-platform C work (cosmopolitan) and, relatedly, redbean: a zip file that is also an executable web server hosting its own contents, and which can produce other such self-hosting cross-platform executable zip file servers.
I wrote a paper this year (after about a 15-year hiatus), and apparently submitted papers (this was to an Elsevier journal) now automatically go through a plagiarism detector of some kind that produces a score that can't exceed some threshold. We actually failed the first time because we'd published the paper on arXiv first, which seems like a big flaw in the system. I don't have a lot of faith in, nor do I support, automated detection like that, but it does happen systematically now.
Not without false positives, which are unacceptable (unless you're willing to put the work saved from plagiarism checking into absolving the innocent).
I was forced to submit essays to a plagiarism checker in high school (well over a decade ago). The company running this checker had, in their infinite wisdom, included a check for writing complexity under the assumption that a 10th grader who writes like an adult is probably cheating. This was embarrassing for everyone.
I had a high school English teacher who became suspicious of a piece of writing I did for this reason; no automated checking was necessary! He called me in to talk with him about the work, probed my knowledge a bit, satisfied himself that it wasn't plagiarism, and complimented me for writing at a high level.
False positives are not necessarily deal-breakers, as long as the teacher treats the situation carefully before taking any disciplinary action.
YouTube is effectively a Google Podcasts platform already.
Paying for YouTube Premium to avoid ads is some of the best money I spend each month. I wish I could do the same for Spotify podcast ads. I do pay for Spotify already to avoid ads in music, but I can't seem to avoid them for the Joe Rogan Experience.
Lots of podcasts do not exist on YouTube. There is no way to subscribe to feeds or auto-download episodes or any of the dozens of quality of life features that podcast users have been enjoying for over a decade.
I subscribe to YouTube Premium but to consider it a replacement to a proper podcasting app seems naive to me.
If you like things enshittified, then it seems like a great idea. Let's put big tech between me and the things I want to consume, when there's no need, and let them begin to try to make money off of it and make everything worse!
You mean like introducing ever-increasing ad loads and moving background playback behind a premium tier to create “value” for subscriptions, while simultaneously trying to resurrect cable television despite the Internet being a thing, for example?
I love how hard people will advocate for their YouTube subscriptions like it’s the best thing ever.
YouTube’s recent changes are so Hollywood-entertainment-exec-esque lately (hey, you’ve got to make earnings at all costs!), as they gradually enshittify the Internet’s most popular and important content type, and they can do pretty much whatever they want because, really, they are an effective monopoly for UGC video delivery.
People LOVE espousing their “ad free subscriptions” and rail about the cost of video infrastructure, because they probably don’t remember when copyright broke down and there was a glimmer of the lack of false scarcity digital delivery actually could have created. You can’t download a car.
Google and Comcast literally were the ones who killed off peer-to-peer and decentralized solutions and made sure no competitor emerged. Bram Cohen had it right, but freedom scared big tech and video killed the radio star.
But please tell me how awesome YouTube “ad free” accounts are. In the meantime, enjoy shorts!
I miss the free Internet and actual disruptive innovation so much.
Curiously, the reply to yours calling out all these "YT premium is great" replies as potential Google employees was flagged. Why? It's not an invalid assumption.
I err on the side of people have been so sick of YT's badgering that Premium truly feels like a great solution - despite that only being the case because of Google creating the problem in the first place.
“Please don't post insinuations about astroturfing, shilling, brigading, foreign agents, and the like. It degrades discussion and is usually mistaken.”
Enshittification is becoming a big deal on YouTube. The set of topics that YouTubers need to walk on eggshells around in order to avoid being demonetized is getting out of control, and is becoming increasingly reminiscent of the sort of constant, normalized self-censorship that's accepted as the norm on the internet in China. And it's entirely driven by the whims of advertisers.
But, also, I feel like they've sort of got me as a captive audience right now because YouTube's the only video platform that has a sizable international creator base.
> Paying for YouTube Premium to avoid ads is some of the best money I spend each month.
It’s a ridiculous amount of money to spend to financially support Google’s exploitation of content creators and abusive “copyright” system, just so you can watch content they (a) already stream for free and (b) is already full of product placements, stealth marketing, sponsored segments, etc.
No way in hell I’m paying a huge premium for that.
I loved the idea of Youtube Premium at first, but I feel €15/month is an exorbitant amount for what we're getting in return. Same for Facebook/Instagram, which will have a similar cost starting from February.
> but I feel €15/month is an exorbitant amount for what we're getting in return
I think it comes down to how often you use the platform. YouTube has been my primary media source for the last twelve years, and ads are what paid for that. Now that I can pay $14 (USD) per month to give the people I watch a bit more, get no ads, get downloads, and a few other nice little features, it's a no-brainer for me. I have always gotten _way_ more mileage out of YouTube than HBO or Netflix.
But if you don't, then it's probably not worth it, and ads will pay for it instead.
That seems high but I know pricing can vary significantly across countries. Is that somewhere in Europe for example? Or any government or Apple taxes included?
In the US, going through the official website, it shows $14 a month, or $11.66 a month if paying for the upcoming year at once. Premium and Music are part of the same product here and I'm not sure if you can unbundle them.
For comparison, Spotify Premium here looks to be $11/mo.
On one hand I wish they had an "occasional tech video" tier - maybe even the option for me to choose which creators get the credits.
At the same time I can't shake the feeling that Google just enshittified the experience so much that now "it's logical" to take out a sub to make it decent again.
I also left Spotify because of podcasts. I did not want to listen to podcasts, and they couldn't get that through their heads, shoving it down my throat, advertising it to me, even though I paid for an ad-free experience. Drove me crazy, so I left.
What did you switch to? I tried Tidal, loved the noticeably higher-quality audio, but there were two dealbreakers for me - no crossfade playback (for running) and no downloads to the desktop app (for travelling). I could just about have dealt with no downloads on the desktop app, but they actually removed the crossfade playback a while ago because, according to an official reply on their support forums "it didn't add to the customer experience", which to me seems like such a blatant lie to cover for something else like they found it difficult to do for some technical reason.
I switched to a combo: For myself, I'm back to a music library that I own. No DRM. For my wife, I signed up for Apple Music. She wants more variety than our library.
Thanks. I've been getting back into the Apple Music app a bit - I have a large library of CDs that I've digitised too, but I've discovered so much on Spotify over the years that I think I'd miss that side of it. I don't really have time to read music websites (and despite reading Melody Maker and NME quite a bit when I was younger) I find most of it overblown.
Like a sibling comment, I now also buy CDs and LPs of albums released in the pre-streaming era, depending on which format the album was made for. But I always immediately rip the CDs into FLAC because I'd forgotten how brittle they can be in terms of a slight scratch ruining one or more tracks.
And for more recent stuff that I really like, even if I hear it on Spotify, I try to buy it on Bandcamp or similar, so there's some cash going directly to the artist.
What all that doesn't solve for me is discovery, which I find is significantly better on Spotify than the Apple Music service. (Or the worse audio quality of Spotify compared to Tidal - my test album is always Selected Ambient Works 85-92 by Aphex Twin. There's no hiding when it comes to the synth hi-hats at the beginning of Xtal.)
> Seriously considering just getting a record player so I can get rid of these middle men though. It's too much.
Do it! I started buying CDs again, and it's great. I get the best quality sound — we can argue about that another time ;) — all the music is _mine_, and after I convert it to FLAC for my walkman or iPhone (play via VLC) it's as convenient.
Companies have redefined advertising to exclude advertising for their own platform (and people will defend this for them and claim that "promotions aren't ads" or something like that).
At this point, we need a bill in Congress to establish that "ads are ads" and that "ad-free must be ad-free".
I came here to remind everyone that college radio streams are almost all ad-free and still a great means of music discovery.
Also many many many hobby/community radio stations streaming narrowcasts focused on your favorite obscure musical genre.
Radio.Garden is a fun place to behold the breadth and depth of what's available, right now, subscription free.
I wonder if that's a sign of things to come. Nowadays it's (in most cases) pay so you don't see ads, but in the future it will probably shift towards pay so you see fewer ads kind of model.
I mean Spotify already kind of does it with JRE. It doesn't feel right that he's got his own ads despite the very lucrative deal he got, and I'm still forced to listen to him drone on about yet another generic vpn service even though I'm paying for a subscription.
So glad someone said it. All these "just pay for YouTube Premium" comments make me sad. Killing all competition with effectively endless funding for your own thing, then charging for the only game in town should be illegal.
It's extremely expensive to store and distribute every video online. Plus, I use it like 10x a day. I dunno, that makes it worth it for me over Netflix which I use maybe once a week.
I pay for content too, just never to companies that drove all competition out of business with their bottomless war-chest, and only then started adding ads and asking for money. Fuck. Google.
If you don't like YouTube, you can watch TV instead, it's still there. Or read a book. "Extortion" – give me a break... Video creators choose YouTube because it gives them a chance to make an income from their videos, or at least an easy way to upload and host them.
This is why people should never be given anything for free; ungratefulness and entitlement are the only reward. I'll greatly enjoy it when everything is behind a cheap paywall, giving creators their fair share and leaving penny-pinchers in the dust.
We're starting with react web, planning to expand from there to other surfaces, including React Native.
I suspect the gap from the current version to React Native support is rather small. It's open-source, so feel free to open an issue so that it can get upvoted/prioritized (or even better, you can work on it too ;) )
Yeah, or wireframing, or low-fidelity prototyping. Definitely not new advice, but I guess it'll be somebody's first exposure to the idea, so they'll learn a new thing today.
Yea, I find it kind of weird how many comments this article has drawn. Low-fidelity wireframing for workflows has been a UX/UI standard for like a decade now.
The public-at-large I'm exposed to seems more apt to assume they're aliens -- so frankly anyone who assumes it's just an innocuous weather balloon is a breath of fresh air for me.