
I agree with the overall sentiment. There's a loud minority of developers out there who are nostalgic for something they never actually experienced. It's a reaction to the explosion in the complexity of computers and the increasing depth of the software stack we must depend on to get anything done. It makes us vulnerable, because we have to rely on leaky abstractions; there's too much software to fully understand all of it. And sometimes there are bugs, and it only takes being burned once or twice before you become suspicious of any software you haven't (or couldn't have) written yourself. But starting from scratch kills your productivity, maybe for years!

I'm truly sympathetic. The simplicity of the microcomputer era has a certain romance to it, especially if you never lived through the reality of it. There's a sense of freedom to the model of computing where only your program is running and the entirety of the machine is your own little playground. So I get why people make stuff like PICO-8 and uxn.

I agree with the criticism of Jon Blow's rhetoric, even though the tone is harsher than strictly necessary. Blow's criticisms and proposed solutions seem overly opinionated to me, and he's throwing out the baby with the bath water. He describes things like compile-time and runtime hygienic macros as if they were new inventions, rather than something that has existed in Lisp since before he was born.

However, I think targeting uxn is unfair. Uxn is better viewed as an experiment in minimalism and portability; I think of it more as an art project.

It's unfair because the author is comparing mainstream languages that benefit from countless users' input and innovations over the course of 60 years to a niche system with a small handful of developers that has existed for maybe 2 years or so. That's a strawman if I ever heard one.



"There's a loud minority of developers out there who are nostalgic for something they never actually experienced."

Having actually experienced it, all the way back to writing business applications in mainframe assembler, I am not nostalgic for it. Today I write mostly in Rust.


Likewise, as a child born in the early 80s, my family's first computer was an Apple IIGS, and I routinely used Apple IIe computers in school from 1st through 7th grade. I wrote Apple II assembler during that time (albeit heavily relying on routines copied from books and magazines). And while occasionally fooling around with an Apple II emulator with an Echo speech synthesizer, or even just listening to one [1], makes me nostalgic for my childhood, I don't miss the experience of actually using or programming those computers. Things are so much better now, even (or perhaps especially) for disabled folks like me.

[1]: https://mwcampbell.us/a11y-history/echo.mp3 This file has a real Echo followed by an emulator. Unfortunately, I forgot where I got it, so I can't provide proper attribution.


I’m from the modern web dev generation, so to speak, but just yesterday I had an amazing conversation with my dad, who did what you describe.

He explained to me what a “patch” actually means, or rather what it meant back then. He was talking about debugging and implementing loaders for new hardware and such, and then he mentioned a “patch area”. I asked what that meant, and apparently release cycles back then were so slow that a binary shipped with some dedicated empty space for you to hack a fix directly into. He would then change the machine code directly to jump into that patch area, so he could fix a bug or extend the behavior.
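
To make that concrete, here's a very loose sketch of the idea in C (the function-pointer slot is my own analogy for the overwritten jump instruction; what he actually did was hand-edit machine code into literal spare bytes in the binary):

    #include <stdio.h>

    /* Shipped routine with a bug: it was supposed to add a 10% surcharge. */
    static long bill_cents_v1(long cents) {
        return cents; /* oops: surcharge forgotten */
    }

    /* The fix, written after release into space reserved for patches. */
    static long bill_cents_patched(long cents) {
        return cents + cents / 10;
    }

    /* Stand-in for the jump the old-timers overwrote by hand:
       every call site goes through this slot. */
    static long (*bill_cents)(long) = bill_cents_v1;

    int main(void) {
        printf("before patch: %ld\n", bill_cents(1000)); /* 1000 */

        /* "Applying the patch": redirect the jump so existing call sites
           land on the fix - no rebuild, no reshipped binary. */
        bill_cents = bill_cents_patched;

        printf("after patch:  %ld\n", bill_cents(1000)); /* 1100 */
        return 0;
    }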

Contrast this to the arguably wasteful CI/CD setups we have now.


> Contrast this to the arguably wasteful CI/CD setups we have now.

To clarify, are you arguing that modern build and release processes should patch binaries in-place, or something else?


As a kind of middle ground: CI/CD burns our abundant cycles out of band. That’s a good place to spend extravagantly.

I’m all for burning more cycles on tests and static analysis and formal verification etc. before the software goes on the user’s machine.

But we all live with “good enough” on our machines every day. I think there’s a general consensus that too much software spends my CPU cycles like a drunken sailor.


There's also the question of what the burned CPU cycles actually buy. Cycles burned on testing buy us less buggy programs. Cycles burned on your CPU doing things like out-of-bounds checks or garbage collection buy that too.

But most of them are burned on layers upon layers of abstraction that don't actually do anything useful safety- or correctness-wise; they're there solely because we've turned code reuse into a religion. Which wouldn't be bad by itself, if that religion had a firm commandment not to reuse bad solutions - yet that is exactly what we keep doing, again and again, patching it all with the software engineering equivalents of duct tape and glue to keep it from falling apart. Why is C still at the bottom of virtually every software stack we run? Why do we keep writing desktop apps in HTML/JS? Does a simple app really need 20 dependencies of uncertain quality?

JavaScript is a good example. It's not a bad language because it's high-level - to the contrary, that's the best part of it! It's a bad language because, despite being high-level, it still gives any user ample opportunities to mess things up by accident. We need something just as (if not more) high level but better for general-purpose software development.


I like bitching about JS as much as the next guy, and as someone who has implemented ECMA-262 I guess I’m more entitled than most.

But let’s not get carried away with it. Eich had 9 days from unreasonable manager to shipping physical media. To do kind of a cool Scheme/Self thing from scratch in 9 days? I’ve been on some high-G burns, but that’s fucking nuts.

But since there’s no scientific way to quantify any of this, I’ll throw my anecdote on the scale in favor of my opinion and note that Brendan Eich was a hard-ass Irix hacker at SGI before he followed Jim Clark to Netscape.


I'm not blaming Eich. And I very much doubt that anyone originally involved with that project thought that their tech would be the foundation for large server-side and desktop apps.

But, regardless of how we got here and whose fault it is, we're here - and it's not a good place.


I’m not arguing. Just observing the difference. Different times, different needs and practices.

For example, back then it was common to understand the entire machine code of a binary. We’re talking no abstractions, no runtimes. Portability and virtual memory were luxuries.

I definitely think CI/CD could be less wasteful, but I don’t necessarily think we should manually patch binaries in place.


There were other moments in time between mainframes and today.

For example, when I scroll a two-page document in Google Docs, my CPU usage on an M1 Mac spikes to 20% - and that's for an app whose overall functionality is probably less than that of Word 95.


Rust can also be used to write business applications that will compile cleanly to mainframe assembler (at least if your mainframe is 64-bit and runs Linux).


That is better served by mainframe languages; thankfully, C was never a relevant language on mainframes.


That Jonathan Blow talk is awful. Repeatedly Blow stands up what is barely even a sketch of a straw man argument for a position and then just declares he's proven himself right and moves on. I can hardly see why even Jonathan himself would believe any of this, let alone how it could convince others.

And at the end it's basically nostalgia, which is a really old trap, so old that the warnings are well posted for many centuries. If you're stepping into that trap, as Blow seemingly has, you have specifically ignored warnings telling you what's about to happen, and you should - I think - be judged accordingly.


Assuming you're talking about the "collapse of civilization" talk, I think the primary appeal is that he's confirming the feeling many of us have that software, particularly user-facing software on personal computers, is going downhill. And I use that metaphor deliberately, because it reminds us that things get worse by default, unless we put in the work to counteract the relevant natural forces and make things better.

Whether he has any real solutions to the decay of modern software is, of course, another question. It makes intuitive sense that, since previous generations were able to develop efficient and pleasant-to-use software for much more constrained computers than the ones we now have, we can gain something by looking back to the past. But those previous generations of software also lacked things that we're no longer willing to give up -- security, accessibility, internationalization, etc. That doesn't mean we have to settle for the excesses of Electron, React, etc. But it does mean that, at least for software that needs qualities like the ones I listed above, we can't just go back to the ways software used to be developed for older computers. So, I think you're right about the danger of nostalgia.


> But those previous generations of software also lacked things that we're no longer willing to give up -- security, accessibility, internationalization, etc. That doesn't mean we have to settle for the excesses of Electron, React, etc. But it does mean that, at least for software that needs qualities like the ones I listed above, we can't just go back to the ways software used to be developed for older computers. So, I think you're right about the danger of nostalgia.

I think that's precisely the danger. It's the danger of using nostalgia to feed into a purity spiral. We can simultaneously acknowledge that there's a problem where we create bad software that wastefully uses resources on a user's computer, while understanding that part of modern development has made computing much safer and more accessible than it used to be. Instead of looking _backward_, we can look _forward_ to a future where we can continue to be safe and accessible while not being as wasteful with a user's resources.


We need to look both backward and forward, because the past still has so many useful lessons to teach (which is a far better way to learn them than making the mistake that prompted them in the first place!). The problem is blindly repeating the past, not looking into it.


> He describes things like compile-time and runtime hygienic macros as if they were new inventions, rather than something that has existed in Lisp since before he was born.

I suspect that this is where some of the inspiration comes from, because he has mentioned being a fan of Scheme at a young age. At the same time, he wants strong static typing, fine-grained control, and power (nothing restrictive).


> It's a reaction to the explosion of complexity of computers and the increasing depth of the stack of software we must depend on to get anything done.

People say this with a straight face, and I don't know if it's an elaborate joke of some kind or not.

We're building a Tower of Babel that barely runs even on supercomputers, and somehow we end up defending it.


In most English translations, the Tower of Babel was struck down by God because it represented the ability of humans to challenge the power of God. If we are indeed building a Tower of Babel, that's cool; it means that "nothing that they propose to do will now be impossible for them."¹

The comment you're replying to is not defending the explosion of complexity but pointing out that we resent it precisely because we find ourselves dependent on it. The article is pointing out that we tend to take bad or counterproductive paths when we try to free ourselves from that complexity though.

¹https://www.bible.com/bible/2016/GEN.11.NRSV


> There's a sense of freedom to the model of computing where only your program is running and the entirety of the machine is your own little playground.

These days, we call that little playground a 'sandbox'. But I think OP's point is that sandboxes can be a lot more efficient than what they see w/ uxn. It's not exactly a secret that the whole FORTH model does not easily scale to larger systems: that's why other languages exist!


Quick question: What's the "uxn" you're referring to? Hard to google 3 letters.


Only what the article was about.



