
> The story about evolution is totally wrong. What really happened is that, time after time, each generation of computers developed until it was very capable and sophisticated, and then it was totally replaced by a new generation of relatively simple, stupid ones. Then those were improved, usually completely ignoring all the lessons learned in the previous generation, until the new generation gets replaced in turn by something smaller, cheaper, and far more stupid.

Beautifully said. But for the last couple of decades not even that is true. It's just been more and more of the same OSes, doing basically the same things. What it means to develop a native app is pretty fundamentally unchanged since the Windows 3.11 or ISX days.

Software has changed, but only on the neo-mainframes. All the groupware and multi-tenant and scale-out software systems the author decries as lost? Rebuilt as online systems, powered by communicative capitalism and vast data-keeps ("the cloud").

> We run the much-upgraded descendants of the simplest, stupidest, and most boring computer that anyone could get to market. They were the easiest to build and to get working.

Unsaid here, I think, is where we really got onto the fixed path: what begat homogenization. Before that, everyone seemed to be competing to build better hardware+software+peripheral systems against everyone else (with some close alliances here and there).

1989 changed everything. The EISA bus was the reformation of a thousand different competing computing industries into one vaster, cohesive, competitive ecosystem, via the Gang of Nine: https://en.wikipedia.org/wiki/Extended_Industry_Standard_Arc... It commodified what used to be closed systems. Everyone either adapted to the homogenized common systems or folded: one by one, the DECs and Suns and every other computer maker went under or got sold off.

The business model, to me, defines what happened. Giving up on being proprietary, giving up on controlling "your" ecosystem, was the hinge point. Computing had been boutique, special systems competing on unique and distinct features and capabilities. With the rise of interchangeable computing parts, it transitioned to a more collaborative and much more competitive system. Distinction carried the risk of being too far afield, of your stuff not being compatible with everyone else's. You had to play in the big tent, versus all the other upstarts playing against you. Meanwhile the legacy players with featureful, unique-and-distinct business models could never get enough market share to survive, could certainly never compete downmarket, and slowly had their positions eroded further and further upmarket until those last upmarket niches collapsed too.

The love of Lisp here has many of the same pitfalls. Sure, it's an elegant, modifiable system, one that can be shaped like a bonsai tree into distinct and beautiful forms. And I hope very much that we see the rise again of soft software, of big overarching ideas of cohesive computing where the OS and apps and the user's will blend together in dynamic fashion. But the peril is disjointedness: a lack of cohesion, where different users have vastly different bespoke systems and commonality is lost.

I'm tempted to stop here, while I've said little that feels controversial. Putting in anything people don't want to hear risks the greater message, which is that the new ways could arise via many, many means. But I do think: the brightest, most malleable, most user-controlled system we have is the web. Userscripts are very powerful, and they grant us power over disjointed, chaotic industrial systems that have no desire to let the user shape anything. I believe that, with a little effort, with tools already available like web components, we can build very sympathetic forms of computing. HTML, like Lisp, can make visible and manifest all the component pieces of computing, can be an S-expr-like system, as we see with modern front-end-derived router technology, for example. The whole web might not be terraformed in our lifetime, and we might not dislodge the contemporary industrial web practices enough to make all the web excellent, but I think many stable, wonderful media forms that spread and interoperate can take root here. I think they already resemble the alive system of the gods the author so nicely speaks to. I too speak for malleable systems everywhere (not mine, but https://malleable.systems ), and for the web as one conduit to bring us closer to the gods.
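To make the web-components point concrete, here's a minimal sketch of the kind of thing a userscript can already do today: define a new element and drop it into a page whose developers never planned for it. Everything here is standard browser API; the names (GreetingCard, greeting-card) are invented for illustration.

  // A userscript-sized custom element. Registering it makes the
  // <greeting-card> tag usable anywhere in the document.
  class GreetingCard extends HTMLElement {
    connectedCallback(): void {
      // An open shadow root keeps the markup contained but inspectable,
      // so users and other scripts can still see how the piece is built.
      const shadow = this.attachShadow({ mode: "open" });
      const name = this.getAttribute("name") ?? "world";
      shadow.innerHTML = `<p>Hello, ${name}!</p>`;
    }
  }
  customElements.define("greeting-card", GreetingCard);

  // Injecting an instance reshapes the page without touching its code.
  const card = new GreetingCard();
  card.setAttribute("name", "HN");
  document.body.append(card);

The toy itself doesn't matter; the point is that the substrate is user-shapeable by default, no vendor permission required.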



First, your comment was a nice addition to the lproven article and was thought-provoking.

>> The story about evolution is totally wrong.

Actually, this is exactly the story of evolution. Mammals were the simple, stupid things that showed up while the MUCH smarter and more capable dinosaurs ruled the world. But there is a HUGE amount of homology in the biological tree of life. That is true in computer evolution too, though more in core hardware and less in platforms and software, which is what lproven was mostly writing about. Core hardware evolves like basic metabolic and signaling pathways and channels: very slowly and with stability. Hardware platforms evolved in pretty dramatic step functions, and some of those steps did effectively obsolete everything that came before, like the EISA bus. But computer software is very much like human software: it can be largely rewritten in an evolutionary blink of the eye.

> But for the last couple decades not even that is true.

It's not true in computers for the same reason it's no longer true of humans in ecological evolution: we've won. There will likely be no next evolutionary step biologically. That's a change from the history of the last several hundred million years, and one whose ramifications we don't yet fully understand. In computer tech, it's why we now have the Magnificent Seven, and why they differ from the Nifty Fifty. The Nifty Fifty didn't have an evolutionary moat.

I was very active in the computer industry from '85 to 2000 and witnessed first-hand the very Darwinian evolution at work in computer tech. My first job (88-90) was at Westinghouse, building a bespoke nuclear-plant operating system running on minicomputers and Sun workstations. Then a new species arrived, running on the EISA bus and the Windows OS, and it kicked our ass even though we'd been doing that type of work for two decades.

I also saw it at my second job (91-95; I studied neuroscience in between) at Dataviews, the dominant computer-workstation software vendor at the time. We ran on all the major and most of the minor Unix workstations. We'd port the software to your workstation for about $100K and had many takers, which meant it ran on basically everything. But it didn't run on Windows. And so we hit a cost/productivity wall and got our ass kicked by Windows-native companies, just as had happened at Westinghouse.

The web arrived, and HTML/JavaScript kicked the ass of everything else. I was an X/Motif expert and trainer in the early 90s. Great tech. Dead end, because its license couldn't compete against free.

Is machine learning different? Certainly, there have been some step functions ("Attention Is All You Need"), but mostly it's been a slow evolution of Moore's Law and better backpropagation stacks. I did my early independent research in AI at CMU in '88 and mostly ran my backpropagation models in Excel. Biologically, this isn't so different from a slug brain vs. a human brain: same basic hardware, but 100,000x more capable. If I'd told my 20-year-old self that in 40 years we'd have backprop models with a trillion parameters that could run on PCs, I'd have said "yeah right, in some very distant future." But 40 years was, and is, a pretty distant future.
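For anyone to whom "backpropagation stacks" is abstract: the core loop an Excel model from '88 and a trillion-parameter model share is small enough to write out. A minimal sketch in TypeScript; the network shape (2-2-1, learning XOR), the initial weights, and the learning rate are all my own arbitrary choices for illustration.

  // Toy backpropagation: one hidden layer of two sigmoid units trained
  // on XOR. The update rule (chain rule + gradient descent) is the same
  // one modern frameworks apply at vastly larger scale.
  const sigmoid = (x: number): number => 1 / (1 + Math.exp(-x));

  // Small fixed starting weights so the run is reproducible.
  const w1 = [[0.5, -0.4], [0.3, 0.8]]; // input -> hidden
  const b1 = [0.1, -0.1];
  const w2 = [0.7, -0.6];               // hidden -> output
  let b2 = 0.05;
  const lr = 0.5;                       // learning rate

  const data: Array<[number[], number]> =
    [[[0, 0], 0], [[0, 1], 1], [[1, 0], 1], [[1, 1], 0]];

  for (let epoch = 0; epoch < 20000; epoch++) {
    for (const [x, target] of data) {
      // Forward pass.
      const h = [0, 1].map(j => sigmoid(w1[j][0] * x[0] + w1[j][1] * x[1] + b1[j]));
      const y = sigmoid(w2[0] * h[0] + w2[1] * h[1] + b2);

      // Backward pass: deltas via the chain rule (squared-error loss).
      const dy = (y - target) * y * (1 - y);
      const dh = [0, 1].map(j => dy * w2[j] * h[j] * (1 - h[j]));

      // Gradient-descent updates.
      for (const j of [0, 1]) {
        w2[j] -= lr * dy * h[j];
        w1[j][0] -= lr * dh[j] * x[0];
        w1[j][1] -= lr * dh[j] * x[1];
        b1[j] -= lr * dh[j];
      }
      b2 -= lr * dy;
    }
  }

  // Outputs should now sit near the XOR targets 0, 1, 1, 0.
  for (const [x] of data) {
    const h = [0, 1].map(j => sigmoid(w1[j][0] * x[0] + w1[j][1] * x[1] + b1[j]));
    console.log(x, sigmoid(w2[0] * h[0] + w2[1] * h[1] + b2).toFixed(3));
  }

Everything since has been scaling that loop: better hardware, better numerics, better architectures feeding it.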




