Sorry if this comes across as overly facetious — I’m sure you have a reason for doing it that way! — but would it not be easier just to bow to convention and rename your .js files to .jsx?
Probably. It's just that I've always used .js for my projects (decades). Such a rename would likely require configuration changes in the other tools I use, though admittedly the conventional setup is better documented. When faced with a multiplicity of conventions I pick one and stick to it; I'm sure the tools are flexible enough to work with it -- the real issue is discoverability.
But it’s not just convention… JSX files are not valid JS files. Also, as a programmer, I would be annoyed to open a JS file and find out it’s actually something else.
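Not just a matter of convention, but of syntax: a plain JavaScript engine rejects JSX outright, and a transpiler has to rewrite it into ordinary function calls first. A minimal sketch of that transformation (this `createElement` is a hypothetical stand-in, not React's actual implementation):

```javascript
// JSX like:   const el = <div id="greeting">Hello</div>;
// is a SyntaxError to any plain JS parser. A transpiler (e.g. Babel)
// rewrites it into nested function calls, roughly equivalent to:

// Stand-in element factory (hypothetical; React's real createElement
// does more, e.g. key/ref handling).
function createElement(type, props, ...children) {
  return { type, props: props || {}, children };
}

const el = createElement("div", { id: "greeting" }, "Hello");

console.log(el.type);        // "div"
console.log(el.props.id);    // "greeting"
console.log(el.children[0]); // "Hello"
```

Which is why a tool that sees a .js extension and runs a strict JS parser will choke on the file, while a .jsx extension signals the extra parsing step up front.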
I'm seeing a lot of comments here about macOS/iOS unification, but I think people are getting worked up about nothing.
What do macOS window styles have to do with iOS? iOS (mostly) doesn't have windows!
What does the MacBook Neo have to do with iOS, other than coincidentally using some of the same components? Maybe Apple decided to make a cheaper Mac because they thought people might want to buy a cheaper Mac.
They are trying to use a common design language across all their devices, sure. But you would hardly expect them to do the opposite! They might try to make a hybrid tablet/laptop or something at some point, but none of their current moves point inevitably in that direction. Except maybe for software notarization, but that has nothing to do with window corners or cheaper laptops.
I dislike Tahoe too, but this particular thing is not new.
I just did an image search for "classic macos" and one of the first hits was from https://www.versionmuseum.com/history-of/classic-mac-os. Look at those System 1 screenshots, from 42(!) years ago -- round corners on Puzzle and Calculator, square corners on Note Pad and Control Panel! No consistency at all, isn't it infuriating?
Article author here. I think the quoted claim is somewhat misleading. There are at least two different ways to interpret a UI feature as "not new":
1) The feature has been in the operating system all along.
2) Something analogous existed 40 years ago and then disappeared long ago.
You're referring to 2, not 1.
The only reason I chose the Calculator app for my screenshot is that its window is very small, which allowed me to make a small screenshot, because people may be reading the blog post on small phone screens. In other ways, admittedly, Calculator is not a great example: its window is not actually resizable, so it's not the type of window that you would normally place in the corners of your screen, like a resizable document window.
Rounded corners on a "widget" type of app are not as objectionable. As other commenters have noted, the calculator in "classic" Mac was a special Desk Accessory. In contrast, on Tahoe, the varying corner radii affect ordinary document-based apps.
TextEdit, for example, did not start to have rounded bottom corners until Mac OS X 10.7 Lion, which was itself much maligned for bringing the iPhone UI to Mac.
You're right, what I had in mind was 2, although a bit more general; I think there have been similar kinds of inconsistency in the Mac UI since the beginning, in various forms, almost always intentional.
So I think it would be wrong simply to say "the UI has gone off a cliff, they've just thrown away their own Human Interface Guidelines." You can certainly dislike what they've done (and I do dislike it!) but they at least have a somewhat logical goal in mind -- in this case, making the corners neatly fit various different kinds of window content.
Having said all that, there are also some real bugs and unintentional glitches, like scroll bars and other widgets not fitting correctly. I'd agree that seems to be happening more often in recent years, so their quality control has gone downhill.
> So I think it would be wrong simply to say "the UI has gone off a cliff, they've just thrown away their own Human Interface Guidelines."
My own view is that Mac OS X 10.5 Leopard was the beginning of the end, and it's gone downhill ever since.
> they at least have a somewhat logical goal in mind -- in this case, making the corners neatly fit various different kinds of window content.
I would emphasize "somewhat", because, as noted, the corners do not neatly fit the window content at the bottom -- which is ironic, because Apple claims that their intention is to emphasize the content, yet their implementation actually clips it!
That led me to https://www.folklore.org/Desk_Ornaments.html which is a very fun read. Interesting to note that the UI style of the DAs is actually not consistent at all, some have round corners and some don't.
I particularly like this Bill Atkinson tidbit at the end:
Bill Atkinson complained to me that it was a mistake to allow users to specify their own desktop patterns, because it was harder to make a nice one than it looked, and led directly to ugly desktops. [...] So he made MacPaint allocate a window that was the size of the screen when it started up, and filled it with the standard 50% gray pattern, making his own desktop covering up the real one, thus protecting the poor users from their rash esthetic blunders, at least within the friendly confines of MacPaint.
(He was totally right, making your own desktop patterns was fun but the standard checkerboard was far and away the best choice.)
“Well actually” in System 1 and later classic Mac OS, the Puzzle and the Calculator are “Desk Accessories”, that is, applications that can run simultaneously with other apps, even though the operating system does not support multitasking. The rounded corners are there to distinguish them from the currently running application.
Yep, I'm aware. Just like Tahoe, it's intentional and there's a rationale behind it. It may or may not be immediately obvious depending on the user, and people may or may not like the way it looks.
Clarke’s book The Lost Worlds of 2001 goes into a lot of detail about the process (and is a great read in its own right). His take was that the book should say “a novel by Arthur C Clarke, based on the screenplay by Stanley Kubrick” and the movie should say “screenplay by Stanley Kubrick, based on the novel by Arthur C Clarke”.
I think Kubrick was very much the dominant force in the partnership, but they did work quite closely together.
I’d like to recommend Kate Beaton’s book Ducks to get a vivid feel for what these “man camps” are like. That book is about camps attached to oil fields in Alberta, but the “AI camps” described here sound very similar.
The existence of temporary accommodation for workers in construction projects should not be the issue. It seems like this is a necessary and sensible thing.
The problem is with the quality of that accommodation.
It is also worth noting that the fact that the provider also supplies accommodation for asylum seekers should not itself be an issue, because they should be providing acceptable accommodation to those people too.
You can probably add prisons to that list too.
Workers, immigrants, and prisoners all deserve reasonable living conditions. Why people are being housed in a place is irrelevant.
The AI link in this story seems to be simply that there are construction projects involving AI, which seems rather spurious. They won't be the first or last construction projects. Those workers deserve (and probably don't get) the support they need whether they are building a data center, a casino, or a hospital.
Or you could click the link in the article where they talk about the temporary housing for data centers, including the perks they’re including like “free steaks” and golf.
Oil fields in Alberta are a very different situation than high budget AI data centers in the US.
What makes it very different? It sounds quite similar to me. Each is a lucrative business that requires lots of physical infrastructure to be built out, and therefore needs a large but temporary influx of construction workers and engineers.
How is it not different? These aren’t remote oil fields. The workers could commute to the data centers if they didn’t want to stay at temporary housing.
The article and the one it links to say that the temporary housing is a perk that they’re offering to try to entice workers. It includes gyms, nice food, and activities like golf.
The comparison above to bad oil fields in Canada is arbitrary. Not all temporary housing must be like the accommodations in remote Canadian oil fields.
Well, hang on, the brief TechCrunch article we're discussing here links to two different Bloomberg articles. The first is from 2018 about "housing for men working in remote oil fields", the second from 2026 about a data center in Dickens County, Texas.
I think you're getting overly fixated on "remote Canadian" here. West Texas is plenty remote. Those temporary workers in Dickens County must far outnumber the local population. If people wanted to commute, where are they going to commute from? The closest big city is Dallas, four hours away. (Edit: I tell a lie, Lubbock is closer if that counts.)
It sounds like you're maybe envisaging a Googleplex, a cool campus where young college hires will want to come and hang out with like-minded peers (and work for long hours as a convenient side-effect). I definitely think it's going to be much more like an oil rig -- people will be paid well, and a decent amount of money will be thrown at entertainment and benefits, but fundamentally it's a place to house hundreds of men who have no reason to be there except that the work has to happen at that specific site.
This article and the linked ones specifically talk about "man camps", not even something like "company towns" where they're maybe trying to establish an actual long-term community.
> It sounds like you're maybe envisaging a Googleplex
No, I'm envisioning what the article is describing combined with my experience with construction projects. You're the one injecting other stories about Canadian oil fields into a story about something completely different.
I'm confused about how we can be interpreting the same short article so differently. It says: "This style of camp was popularized as housing for men working in remote oil fields." So the living conditions in Canadian oil fields seem perfectly relevant.
I didn't say they were running it locally. A single tab of ChatGPT eventually eats a few GB of RAM if you leave it open and use it constantly, which is how I see most "casual" people using it in my vicinity. I've seen it push 2-3 GB regularly.
People also tend to just leave a lot of tabs open all the time. Even when those tabs are paused, I constantly see people's computers slow to a crawl because of this.
I agree it should be enough, and it often is, but unless we're going to get everybody to change their habits and get websites to be more efficient with RAM usage (at work, every single HubSpot tab I have uses 300-400MB, which is ridiculous), 8GB will become an issue, especially on macOS and Windows. The way things are going it's going to be a real problem in 5 years, which is relevant when we're talking about e-waste. I just don't see an 8GB MacBook Neo lasting 8-10 years.
This has been my experience. I understand why other people disagree with me, but this is just what I’ve witnessed around me.
Obj-C does have a "nonnull" annotation now (apparently added to assist Swift interop). One of the final jigsaw pieces turning it into a really pleasant language.
It is a really pleasant language, but I think the nonnull annotation is for initialization only -- compiler checking against initializing an object pointer with a null value -- and does not prevent crashing when addressing an already-released object.
> does not prevent crashing when addressing an already released object
I don’t know what behavior you’d expect here or in what situation you’d encounter this for a nonnull reference. You’d have to be really living dangerously I’d imagine. The footgun was that nonnull isn’t enforced. And anyway, leaks were more the issue.
GP mentioned ‘chasing null pointer exceptions’, then parent mentioned that the language ’now does have nonnull annotation’, prompting me to explain that that does not prevent null pointer exceptions.
So, not living dangerously. All that can be held against me is being "dangerously" imprecise on HN -- definitely not good either.
nonnull doesn't really do anything in pure objc. It warns if you assign the nil literal to a nonnull pointer and that's it. The annotation is almost entirely for the sake of Swift interop (where it determines if the pointer is bridged as an Optional or not).
Yeah, Objective-C++ is surprisingly great. It sounds like a terrible idea, but the bridging works pretty much seamlessly, and Obj-C and C++ don't actually overlap all that much so they don't step on each other's toes. Each language has strengths that shore up the weak spots in the other.
+1 to Objective-C++. It makes for some surprisingly clean, compact code, best of both worlds, really. And the bridging between ARC and CF types is really quite magical, more languages should have that ability to be expressed in an older language without stripping everything out.
I just wish there were Objective-C bindings for more CF classes without having to mess with C.
For better or worse, he's mostly known in the "street/urban art" world (which is much bigger than graffiti). And one of the features of a lot of the art in that scene is high technical mastery paired with "low" / populist motifs and composition.
That's an interesting distinction. I hadn't really noticed that but it makes a lot of sense.
I suppose Banksy would be close to the crossover point between those two worlds? The ideas and the chutzpah are the main attraction, but generally 'low' populist motifs, without high technical mastery. Someone you could either look up to or sneer down on from either side.
It's a good question whether Banksy really is a crossover, or only a crossover in market terms. I would definitely call him a high master of stencil technique though, some of that stuff is pretty hard to pull off.
As clever as his art is, I think he's still very much an outsider in the capital-W Art World, which for his part he's often trying to prank. (Which they richly deserve, see Exit Through the Gift Shop.)
Things like the self-destroying painting were high-concept but also completely staged. For another artist getting rich off his contempt for the Art Market, but solidly on the Art World side of the fence, see Maurizio Cattelan.
One person with a foot in both worlds is Alex Face but he's mostly known in South-East Asia. I have a feeling it'd be easier to find examples in Asia than in the West.
They said "shallow and uninspired" but that's separate from "requires immense skill and patience". The point is, whether or not the process is cool and impressive, is the end product really very interesting?
It can be valid to criticize something as uninspired even if you're not capable of doing it yourself. Movie critics would have a hard time otherwise.
In this case I wouldn't be quite as dismissive, personally. But if you've seen one, have you seen them all? Probably yes.