Excuse me for sounding rough, but - isn't this reinventing comp-sci, one step at a time?
I learned about distributed, incrementally-monotonic logs back in the late '90s, along with many other ways to do guaranteed transactional database actions. And I'm quite certain these must have been invented in the '50s or '60s, as these are the problems that early business computer users had: banking software. These are the techniques that were buried in legacy COBOL routines and needed to be slowly replaced by robust Java core services.
I'm sure the Restate designers will have gained terribly useful insights into how to translate these basic principles into a working system given the complexities of today's hardware/software ecosystem.
Yet it makes me wonder: are young programmers only being taught the "move fast and break things" mentality, with no SW engineers left who can build these guarantees into their systems from the beginning, standing on the shoulders of the ancients who invented our discipline so that their lore is actually used in practice? Or am I just missing something new in the article that describes some novel twist?
When I was in school I had an optional requirement: you had to take one of two or three classes to graduate. The candidates were compiler design, which was getting terrible reviews from my peers who were taking it the semester before me, or distributed computing. There might have been a third, but if so it was unmemorable.
So I took distributed computing. Which ended up being one of the four classes that satisfied the 80/20 rule for my college education.
Quite recently I started asking coworkers if they took such a class and was shocked to learn how many not only didn’t take it, but could not even recall it being an option at their school. What?
I can understand it being rare in the ’90s, but the ’00s and on were paying attention to horizontal scaling, and the 2020s are rotten with distributed computing concerns. How… why… I don’t understand how we got here.
So many people I work with don't "get" distributed systems and how they interplay and cause problems. Most people don't even know that the ORDER in which you take potentially competing (distributed) locks matters -- which is super important if you have different teams taking the same locks in different services!
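Here's a minimal in-process Java sketch of the ordering problem (the lock names and the two "team" code paths are made up for illustration); the same principle applies when the locks live in a database or a coordination service instead of the JVM:

```java
import java.util.concurrent.locks.ReentrantLock;

public class LockOrderDemo {
    // Two shared locks; the names are hypothetical.
    static final ReentrantLock ACCOUNTS = new ReentrantLock();
    static final ReentrantLock LEDGER   = new ReentrantLock();

    // Team A's code path: ACCOUNTS first, then LEDGER.
    static void teamA() {
        ACCOUNTS.lock();
        try {
            LEDGER.lock();
            try { /* ... do work ... */ } finally { LEDGER.unlock(); }
        } finally { ACCOUNTS.unlock(); }
    }

    // Team B's code path: LEDGER first, then ACCOUNTS.
    // Run teamA and teamB concurrently and each thread can end up holding
    // one lock while waiting forever for the other: a classic deadlock.
    static void teamB() {
        LEDGER.lock();
        try {
            ACCOUNTS.lock();
            try { /* ... do work ... */ } finally { ACCOUNTS.unlock(); }
        } finally { LEDGER.unlock(); }
    }

    public static void main(String[] args) {
        new Thread(LockOrderDemo::teamA).start();
        new Thread(LockOrderDemo::teamB).start();
        // The usual fix: agree on one global acquisition order
        // (say, ACCOUNTS before LEDGER) and make every team follow it.
    }
}
```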
The article is well written, but they still have a lot of problems to solve.
I went too far the other way. Concurrent things just fit my brain so well that I created systems that made my coworkers have to ask for help. One coworker who still sticks in my mind after all these years wanted to ask me to knock it off but lacked the technical chops to make it a demand. But I could read between the lines. He was part of my process of coming around to interpreting all questions as feedback.
There’s about 20% of reasonable designs that get you 80% of your potential, and in a world with multiple unrelated workloads running in parallel, most incidental inefficiencies are papered over by multitasking.
The problem is that cloud computing is actively flouting a lot of this knowledge and then charging us a premium for pretending that a bunch of the Fallacies don’t exist. The hangover is going to be spectacular when it hits.
> The hangover is going to be spectacular when it hits.
I'm honestly looking forward to it. We constantly deal with abstractions until they break and we are forced to dive into the concrete. That can be painful, but it (usually) results in a better world to live in.
Cloud will come back in your lifetime and maybe mine. Everything in software is cycles and epicycles. Hyperscaler hardware is basically a supercomputer without the fancy proprietary control software, which is third party now.
I think your points are pretty spot on - most things have already been invented, and there's too much of a move-fast-and-break-things mentality.
Here's a follow-up thought: to what extent did the grey-beards let us juniors down by steering us down a different path? A few instances:
DB creators knew about replicated logs, but we got given DBs, not replicated log products.
The Java creators knew about immutability ("I would use an immutable whenever I can" [James Gosling, 1]), but it was years before someone else provided us with pcollections/javaslang/vavr. And they're still far from widespread, and nowhere near the standard library.
Brendan Eich supposedly wanted to put Scheme into browsers, but his superiors had him make JS instead.
James (my source was an insider in the Java team at Sun, pre-Marimba) wrote java.util.Date, which I had my one assistant (Ken Smith of Netscape) translate from Java to C for JS's first Date object, regrets all around but it "made it look like Java".
I wish James had been in favor of immutability in designing java.util.Date!
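For anyone who hasn't been bitten by it: java.util.Date is mutable, so any caller holding a reference can rewrite it in place, while the java.time types added in Java 8 are immutable and return new objects instead. A small illustration:

```java
import java.time.Instant;
import java.util.Date;

public class DateMutability {
    public static void main(String[] args) {
        // Mutable: anyone holding a reference can change the value underneath you.
        Date deadline = new Date();
        Date alias = deadline;
        alias.setTime(0L);                      // silently rewrites "deadline" as well
        System.out.println(deadline);           // now Jan 1, 1970

        // Immutable (java.time, since Java 8): "modifications" return new instances.
        Instant start = Instant.now();
        Instant later = start.plusSeconds(60);  // start itself is untouched
        System.out.println(start + " -> " + later);
    }
}
```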
This is certainly building on principles and ideas from a long history of computer science research.
And yes, there are moments where you go "oh, we implicitly gave up xyz (i.e., causal order across steps) when we started adopting architecture pqr (microservices). But here is a thought on how to bring that back without breaking the benefits of pqr".
If you want, you can think of this as one of these cases. I would argue that there is tremendous practical value in that (I found that to be the case throughout my career).
And technology advances in zig zag lines. You add capability x but lose y on the way and later someone finds a way to have x and y together. That's progress.
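For readers who want the idea in code, here is a hand-wavy sketch of what "bringing back causal order across steps" can look like. This is not Restate's actual API, just the generic journaling pattern with made-up names, and the journal is kept in memory for brevity:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Supplier;

// Illustrative only: a real system persists the journal and fences
// concurrent retries; here the "journal" is just an in-memory map.
public class DurableSteps {
    private final Map<Integer, Object> journal = new ConcurrentHashMap<>();
    private int stepIndex;

    // Run a step at most once per journal: if a previous attempt already
    // recorded a result at this position, reuse it instead of re-executing.
    @SuppressWarnings("unchecked")
    private <T> T step(Supplier<T> action) {
        int idx = stepIndex++;
        return (T) journal.computeIfAbsent(idx, k -> action.get());
    }

    // After a crash the handler is simply re-invoked from the top. Completed
    // steps are replayed from the journal, so side effects are not repeated
    // and later steps still see the same values in the same order.
    public void handler() {
        stepIndex = 0;                                          // number steps from the top on every attempt
        String paymentId = step(() -> chargeCard());            // journaled on first successful run
        String receiptId = step(() -> sendReceipt(paymentId));  // replay reuses paymentId, skips re-charging
        System.out.println(receiptId);
    }

    private String chargeCard()           { return "pay-123"; }
    private String sendReceipt(String id) { return "receipt-for-" + id; }

    public static void main(String[] args) {
        DurableSteps h = new DurableSteps();
        h.handler();   // first attempt
        h.handler();   // simulated retry: replays the journal, no double charge
    }
}
```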