
I used to love these kinds of articles three decades ago. Then you get a programming job with budgets and deadlines and even stupid decisions based on politics, and you hate all that at first, until through experience you realize that even poorly engineered cars can get product from point A to point B, and do so all over the world.

Free markets eventually only have time for "perfect" solutions. And a perfect solution, according to markets, is the one that does the job for the lowest cost over time. For a website that will support a two-week marketing campaign, you don't need anything talked about in this article. In fact, the only responsible decision is to ignore this type of approach, just build the damn thing with someone who has a track record, and then throw it out and move on.

Based on empirical evidence, there isn't really a problem at all with the way people program. Markets have already mostly figured out the rare cases when such robustness is really needed. And it's rare. The only "right" way to program is to take as many stakeholder requirements into consideration as possible, and those requirements are rarely around program correctness. So this article is good (although I think you'd really be looking at functional programming by this point in history), but first make sure program correctness actually is a top priority before getting into the mode suggested by the article.

One final stakeholder requirement that's always a priority: you have to be able to find qualified developers, and what developers learn is based on popularity and fashion. It's a real-world constraint, even if it's distasteful to the idealistic and well-intentioned types who write these articles.


And then, five years after you've left the company and some system inevitably collapses with nobody having a clue as to what went wrong, you'd finally realize the wisdom of all that.

But it's no longer your problem.

So, please don't take it personally, but 'the well intentioned types who write these articles' tend to be the people who then get called in to clean up the mess.

And then - belatedly - the job gets done properly to keep the company in business, assuming there is still time enough to do so.

Just this week I had a nice inside look at the kind of mess that gets left behind when the original duct-tape-and-spit guy leaves the company and leaves his former co-workers to clean up his mess. It isn't pretty, and a Chapter 11 isn't an unlikely scenario, so forgive me if I take a harsher-than-usual look at the attitude that causes this sort of thing.

Note that the 'free market' doesn't have a horizon much longer than the next quarterly shareholder report, and that your typical software product lives a multiple of that interval. So software made with short term goals in mind will create long term headaches.

Your two week marketing campaign gets a pass. But your decade long backend project does not, nor does your real time medical device controller, ECU, database system or operating system.


> And then, five years after you've left the company and some system inevitably collapses with nobody having a clue as to what went wrong you'd finally realize the wisdom of all that. But it's no longer your problem.

If that were a problem in reality, the markets would be punishing companies where that happens. It's not a real problem. Management pretends to be upset, but in reality it's not a huge deal. Entropy is normal in apps and everything else. To continue the analogy from my original comment: do companies really go into crisis mode when one of the many cars in their fleet inevitably "collapses"? No. They build or buy a new one and life goes on.

> nor does your real time medical device controller, ECU, database system or operating system.

Yep. Those are the rare cases I talked about. It's a tiny fraction of total programmers building databases and stuff like that.


> If that were a problem in reality, the markets would be punishing companies where that happens.

Oh they do, I can show you plenty of examples. But it is never the problem of the people that created the issue in the first place.

Think of these things as time-bombs of technical debt. They'll blow up sooner or later, usually later, and that makes it that much harder to deal with the fall-out.

Also, for all the lessons about economics made here: I would happily argue that doing things right is actually cheaper in the long run, and possibly also cheaper in the short term. By applying the techniques described in proper measure you can save yourself a ton of headache.

But of course that would first require a basic understanding of what the article is trying to put across, which if your time horizon is short and your deadlines are looming likely isn't going to be on your agenda.

> It's a tiny fraction of total programmers building databases and stuff like that.

Software you build tends to live longer than you think and tends to be incorporated into places that you can not foresee when you make it.

The 'tiny fraction of total programmers building databases' should include the huge fraction of programmers building embedded systems, APIs, operating systems, libraries and so on. All of those will have life-spans in the decades if they're done halfway right.


You seem to have a poor understanding of both entropy and markets. Even a perfectly built program will soon become useless. Car companies are quite profitable building cars that "collapse" far before their actual potential, which might be a car that lasts 50 years but can no longer pass emissions tests... Zuck is about to surpass Buffett as the third richest person in the world. On an app built with PHP! I don't think much more needs to be said to support my original point.


> You seem to have a poor understanding of both entropy and markets.

And you're being a bit presumptuous and rude. Your argument also isn't as bulletproof as you want to make it sound. What is the argument here, anyway? That there's no need to improve the average programmer's technique because an outlier system (Facebook) is written in a language commonly associated with poor programming practices, with some handwaving about markets and entropy sprinkled on top?


Sorry if I came off that way. I was in a rush on the way to an event and I thought I was just being honest about the weakness in his argument.

What's the argument here? That stakeholders have requirements that don't have to do with robustness like budget and deadlines and that your software has a shelf life and sometimes it's ok if it eventually breaks, just like cars and even the laptop I'm typing this on will. Is that an unreasonable perspective?

And Facebook is an outlier? Really? Even when we add WordPress, Wikipedia, Flickr, MailChimp and a long list of the most successful websites in the world to that list?


> And Facebook is an outlier? Really? Even when we add WordPress, Wikipedia, Flickr, MailChimp and a long list of the most successful websites in the world to that list?

Yes, FB is an outlier -- one out of millions of companies. Only 5-10 companies out of those millions made this current model work. So their existence and "success" proves absolutely nothing.

You have a strange understanding of the word "successful".

Facebook is certainly not "successful" because it neglects good tech. If anything, they rewrote PHP itself so as not to have to rewrite their customer-facing software. How is that for your "tech excellence is not important" argument? They rewrote the damned runtime and even added a compiler.

So please define what "successful" means to you. "A lot of people using FB" is a temporary metric, even if it lasts for decades. It's not sustainable per se. It relies on hype and network effect. These fade away.

@jacquesm's points are better argued than yours. Throwing around words like "free market" and "entropy" does not immediately prove a point.

I will grant you the historical fact that there are many throwaway projects, but he's also right that the fallout from the tech debt they incurred is almost never faced by the original author. Throw into the mix the fact that many businessmen are oblivious to what exactly the techies do in their work hours, and one can easily be misled into thinking that technical perfection is not important. It seems you were.

Final point: I am not arguing for 100% technical excellence. That would be foolish. We would still be refining HTTP and the WWW in general even today, and the internet at large would not exist. But the bean counters have been allowed to negotiate tech efforts down to the bare minimum for far too long, and it shows everywhere you look.

(An everyday example: the smartphone-like devices the waiters at my favourite local restaurant use for taking orders are faulty to this day, because some idiot bought a cheap consumer-grade router AND made the software non-fault-tolerant.)


> Only 5-10 companies out of those millions made this current model work

Stats? Evidence? I mean, hundreds of thousands of companies use PHP and other forms of less-than-perfect tech.

Websites all over the world seem to get the job done even when JavaScript, with all its warts, is used. I like JS, for the record, but it does have warts.

> even if it lasts for decades.

You're saying the same thing I said: that stuff breaks, that companies come and go in and out of fashion. I also think it's interesting that you're calling FB an example of tech excellence but saying it's going to fade away. Choose one?

> How is that for your "tech excellence is not important" argument?

I never made any such argument. Not even close. I only said quality is not the only requirement and might sometimes not be a requirement at all.

Most of the code I write is high quality. I put a lot of effort into code reviews too. I mentor more junior devs around quality. My original post is actually much more nuanced than you are claiming.

> Final point: I am not arguing for 100% technical excellence. That would be foolish. We would still be refining HTTP and the WWW in general even today, and the internet at large would not exist.

Exactly. That's in the spirit of my original post. Maybe re-read it to see that we mostly agree instead of making my position into something it really isn't?


> Stats? Evidence? I mean, hundreds of thousands of companies use PHP and other forms of less-than-perfect tech.

Oh, I meant companies at the scale of Facebook. There aren't too many of them, would you not agree?

> I also think it's interesting that you're calling FB an example of tech excellence but saying it's going to fade away. Choose one?

FB does a lot of open-source projects. Their devs are excellent. That doesn't mean their main value proposition isn't built on code of the kind you speak about. No need to choose one; both can coexist in a company as huge as FB.

> I never made any such argument. Not even close. I only said quality is not the only requirement and might sometimes not be a requirement at all.

Well, alright then. I'm not here to pick a fight, but you should be aware that you came off a bit more extreme to me, and to a bunch of others, than you claim. These things happen, though; I can't judge your intent from a few comments, that's true.

The point that I and several others are making is that quality plays a bigger part than you seem to claim. I've also known many devs who decided they wouldn't ask for permission to take the [slightly / much] longer road, and that decision paid off many times over in the following months and years.

Sometimes businessmen simply must not be listened to. I can ship it next week, alright. But only by skipping a few vital details, thanks to some stupid micromanagement attempts to teach me how to do my job ("nobody cares about this arcane thing you call an 'error logger' or 'fault tolerance', just get on with it already!"). Such toxic workplaces should be left to rot, but that is a separate problem.


> You seem to have a poor understanding of both entropy and markets.

You're hilarious. On an annual basis I end up being the deciding factor in the allocation of a fairly large sum of VC money and tech quality is a big deciding factor in that.

Fortunately there are plenty of successful companies that do a much better job than what you describe yourself doing.


What I described is taking stakeholder requirements into consideration: budget, deadlines, and the expected useful life of the software. That's not best practice? If it's not, could you describe what I should be doing differently? I thought it was a step up when I finally realized software quality is not the only requirement competing for attention, but since I've spent my entire career paying close attention to best practice, I'm also willing to learn from you and improve.

Should every piece of software be high quality, even a throwaway website used for two weeks? Would you expect the programmer to give a mathematical proof of the website's code?

I also described that stuff does eventually break, like the laptop I'm writing this on, and it's not the end of the world. We expect stuff to break.

Can you explain why my wife and almost every accountant in the world amortize intangible assets like in-house developed software and give it a useful life span?


> Can you explain why my wife and almost every accountant in the world amortize intangible assets like in-house developed software and give it a useful life span?

That's learned behaviour, post factum. Had we (the software industry at large) done better, they wouldn't have the countless examples to learn from and turn into a habit. Don't conflate things.

> I also described that stuff does eventually break, like the laptop I'm writing this on, and it's not the end of the world. We expect stuff to break.

You are arguing extremes. The fact that a physical object will eventually suffer wear and tear no matter what has no bearing on the fact that most software can be much more robust and long-lived; only extreme time and money constraints prevent it from being so.

Our points of view can meet, but not until you admit that learned behaviour can be changed if enough people with money stop turning corner-cutting into an Olympic sport.


> Had we (the software industry at large) done better then they wouldn't have the countless examples to learn from and turn them into a habit.

Nonsense. Stuff breaks. Everything. Even stuff made to a very high standard.

> most software can be much more robust and long-lived

It can't. Not because it couldn't be made more robust, but because most software is simply obsolete after a few years, or maybe ten if you're lucky. Software doesn't live in a static world where nothing changes. Laws change, accounting practices are modernized, entire industries come and go, and everything is in a constant state of change.

Maybe you haven't been around long enough to see it. I have. I've seen perfectly built in-house custom inventory software replaced a few years later by something like SAP, because upper management decided the pros of having an integrated logistics system far outweighed the features of one in-house app.

Sorry, but I've been in the business world far too long to fall for the idea that software is ever long-lived. There are some rare cases where it can be: the 40-year-old COBOL programs some banks run to process massive amounts of transactions overnight. And guess what? As long-lived as they are, they are being rewritten little by little, because it's damn hard to find anyone under 65 who is actually interested in maintaining them. Software does not live in a vacuum.


Oh well, guess I better go into research and find academic investments because all this "make this web app yesterday" crap is getting very old and annoying...


tl;dr I have thousands of horse carriage wheels, built to the highest standard, that will last another 500 years. Any interest? Quite a few AM radios too...


I don't, but I'd still like to learn a thing or two from their creators; they obviously knew how to build stuff that lasts. ;)


They didn't know that. Things that last also retain their usefulness; that was my entire point, really. As an aside, let's hope you didn't use up extra non-renewable resources making something so durable yet eventually useless. Would I favor regulations that prevent the opposite, stuff that breaks before the end of its useful life just to create another purchase? I would consider it. Both ends of the spectrum create problems.


You give too much credit to the businessmen. They don't care about the environment; they care about more sales, and there are no lengths they would not go to in order to achieve them.

I'd say producing one durable piece of tech vs. 5-10 non-durable pieces of tech is still more sustainable for the environment, would you not agree?


I love strong regulations that try to reduce or eliminate negative externalities (like pollution and toxic waste), especially when the cost of those externalities is collected at the point of purchase. They aren't very popular though. When we try things like a sugar tax to reduce the billions of dollars a year that diabetes costs us, people start screaming about the nanny state and their freedom.

I'm not sure how we would require products to be durable, though. Who would be the czar of how durable things must be? It seems that person would have a huge amount of power over which industries are profitable and by how much. I'm open to ideas though.


As an Eastern European I would definitely entertain the idea that totalitarianism is not entirely wrong. IMO the world needs politicians with stronger wills who are less obsessed with the next election and who religiously prosecute criminals and opportunists... but the politicians are themselves among those criminals, so yeah, welcome to the 21st century.

In any case, regulations have proven time and again to mean absolutely nothing unless enforced very strictly and with a very heavy hand (flat percentages of the offender's gross income, and I mean 20% and up, not some petty 1-2%). But that won't happen -- lobbies, rings of companies, "anonymous" donations, things like that... the status quo is too deeply entrenched. But we can dream, right?


But just imagine how much richer Zuck would be, had he written Facebook in C and assembly!


I agree with you whole-heartedly.

My programming career has gone from "I don't know the best way to write something", to "I know the best way to write something and I'll do it this way every time", to "often it doesn't fucking matter, just get it done and move along to the next problem".


For some applications it does not matter, so if you can afford to write throwaway code then more power to you.

But for many applications it does matter and for those cases you are essentially creating technical debt that will someday be somebody else's problem.


It depends on the part of the industry one works in. I'm basically a backend developer of web and mobile applications. I don't only code them; I also design, or help design, most of the ones I work on.

Most of what we do in this market is read some form data, JSON, or XML; parse it; read/write to a database or some other API; gather results; and send them back to the browser as either HTML or JSON.

I checked just now when I last had to devise some clever algorithm. It was at the end of 2015, in a Rails project. I remembered all the stuff about invariants from my CTO back in 1995 and it really did help. However, in this market that's a once-in-ten-years occurrence, unless one keeps grinding through coding interviews on artificial problems (because those companies are affected by the streetlight effect).

So, from my point of view "for most applications it does not matter, but for a few of them it does." It probably matters when writing some of the algorithm in the web browser I'm using right now.
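The kind of invariant reasoning mentioned above can be sketched in a few lines. This is a generic, illustrative Python snippet (not code from that Rails project): a binary search where the loop invariant is written out as an assertion.

```python
def binary_search(items, target):
    """Return the index of target in a sorted list, or -1 if absent."""
    lo, hi = 0, len(items)
    while lo < hi:
        # Invariant: if target is in items at all, its index lies in [lo, hi).
        # The bounds themselves must also stay well-formed.
        assert 0 <= lo <= hi <= len(items)
        mid = (lo + hi) // 2
        if items[mid] < target:
            lo = mid + 1      # everything up to mid is too small
        elif items[mid] > target:
            hi = mid          # everything from mid onward is too large
        else:
            return mid
    return -1                  # invariant + empty range => target absent
```

The point of writing the invariant down is that when the loop exits, the invariant plus the exit condition together imply the result is correct, which is exactly the sort of reasoning that helps on the rare "clever algorithm" day.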


It's clear a lot of people are taking my initial posts and comments wrong. I work really hard to not create technical debt. All I said was that there are competing requirements you should be weighing and not all of those requirements are technical. And that even the highest quality stuff eventually does break. Of course you should always strive to build high quality and even beautiful code. I take pride in my work. But part of that pride comes from being able to juggle multiple competing requirements and make the best decision for the company.

Sometimes creating technical debt is the right decision. Sometimes it's "get over this budget hump using two devs instead of five or we go out of business". And then you do get over it and all of a sudden the company is hugely successful, which is a good problem to have, but you're working really long hours just trying to keep up with all that cash flowing in... the real world rarely makes conditions so perfect that you can write perfect code.

I strongly dislike it when I make the decision to create tech debt, but I will at least leave comments or documentation for the next guy on the parts I think could use more love.

And it's rare I do create it. I actually spend a large part of my time refactoring code and making it better and reducing technical debt. It's one of my favorite things to do. And that points right back to my original post. You know what's interesting about the code I refactor? It's working code. It solves the problem. I wouldn't want to build something bigger on it without refactoring. But I also wouldn't curse out the guy who wrote it. He solved the problem at the time within the budget and time constraints and other competing requirements he was juggling. Good for him.


> Sometimes it's "get over this budget hump using two devs instead of five or we go out of business". And then you do get over it and all of a sudden the company is hugely successful...

Seems you are judging from the SV / startup / USA bubble. Most of the world works in VERY different conditions than that.

I mean yeah, bosses whipping their devs for maximum throughput happens everywhere, but the combination of factors you describe seems to be specific to the USA.


I've worked in Europe, Asia, the USA, for quite a few startups, huge corps, and have started a couple of successful companies here and there myself. I have a long and varied career that I am very grateful for.

A lot of startups are actually obsessed with quality to the point of failing. I've seen that. I've also seen the opposite. There is a lot of variation in the myths founders create that they follow like a religion because they believe it's the one trick they need for success ;-)

You do eventually run out of money. It's so much more important to ship something to customers to get some feedback than it is to get it perfect. It's a very tricky balance to get right though. It has to look and work well enough not to scare potential customers away.


> It's so much more important to ship something to customers to get some feedback than it is to get it perfect.

I don't disagree, that's unequivocally true.

As a programmer myself however I know that "I'll get back to it later and fix it" is usually a lie...

Haven't founded a business yet, and though I think I'll do that eventually, it also seems that "do your market research first and foremost" is a universal rule.

EDIT: As a European I should add that most of us never start a business unless we already have several customers willing to pay, lined up in an orderly queue. I feel that too many Americans (and probably not only them) start a business based on pure enthusiasm and the hand-waving that motivated them during a few business events where people vaguely expressed interest in their idea.


I feel like you're essentially saying the same thing I did, just worded differently?


Sometimes you just don't get a choice to do it the right way, or the most elegant way.

You've got a project manager breathing down your neck, pressure to deliver functionality, you've explained the technical debt issues etc - and you're still told to do it the wrong way.

Then years later, the only name on the code is yours, next to some very dodgy-looking decisions. That project manager is probably still kicking around, blaming people and earning a fortune, hehe.


> You've got a project manager breathing down your neck, pressure to deliver functionality, you've explained the technical debt issues etc - and you're still told to do it the wrong way.

Do the right thing and quit that toxic workplace. You will help natural selection: extreme corner-cutters should go out of business. I knew a guy (a programmer) who destroyed a company simply by leaving.

You can only tell a shoe craftsman to create shoes out of cow dung and grass for so long.


"If that were a problem in reality, the markets would be punishing companies where that happens."

Quite the opposite: markets have been rewarding it for some time. The richest companies mostly had buggy software; what got them revenue was everything but flawless quality. Then, once their customers were locked in via other tactics, the customers kept paying as long as the software continued to work, because switching would cost too much. They also often patented anything that could block competitors.

Even quality-focused customers often want specific features, even if that leads to occasional downtime. They also want releases that improve on features fast. I think Design-by-Contract with automated testing can help plenty there, at the pace necessary for competitiveness in a lot of product areas. The markets don't care about perfection, though. The company's priorities had better reflect that.
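For what it's worth, the Design-by-Contract style doesn't require Eiffel. Here's a rough Python sketch of the idea; the `withdraw` function and its contracts are hypothetical, purely to show the shape:

```python
def withdraw(balance, amount):
    """Contract-style withdrawal: preconditions on entry, postconditions on exit."""
    # Preconditions: the caller's obligations.
    assert amount > 0, "precondition violated: amount must be positive"
    assert amount <= balance, "precondition violated: insufficient funds"

    new_balance = balance - amount

    # Postconditions: this function's obligations to the caller.
    assert new_balance >= 0, "postcondition violated: balance went negative"
    assert new_balance == balance - amount, "postcondition violated: wrong arithmetic"
    return new_balance
```

In Eiffel these would be `require`/`ensure` clauses checked by the runtime; plain asserts exercised by an automated test suite are a rough stand-in, and they document who is responsible when something blows up.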


The market doesn't care about security either. That does not mean it shouldn't be a priority.


Why should it be a priority? Who should pay for it if customers are OK with the status quo? Where is the competition offering to fill the market gap with security-minded products? I'm not in love with free markets as the be-all and end-all for every problem worth solving, but I think these questions are worth answering. It's either customers willing to pay for something, or taxes. Security will probably end up being much like national defense: no one willing to voluntarily pay for it, but it being in everyone's best interest to be "forced" to pay for it.


Because otherwise one day you might find yourself facing bankruptcy.

I'm a strong advocate for liability for software producers because it seems we as an industry are categorically incapable of doing the right thing. Until it directly affects the bottom line this likely won't change.

Customers are not 'ok with the status quo', they're clueless, and the only thing that changes is corporate profits.

In the end the difference between doing it right and doing it wrong is more related to long term vs short term thinking than that it would affect the bottom line in a more dramatic fashion (such as would be the case with liability).


> Because otherwise one day you might find yourself facing bankruptcy.

> I'm a strong advocate for liability for software producers because it seems we as an industry are categorically incapable of doing the right thing. Until it directly affects the bottom line

These two statements seem to contradict each other. If it's not directly affecting the bottom line today, how would one go bankrupt?

I do agree with you that there should be some force pushing to eliminate this negative externality. We could compare poor security practice to toxic waste. In general, the force I'm talking about is a government that creates smart regulations. You'd like to do it by allowing consumers to sue after the damage has already been done. I'm not going to get into that debate, but both of us have proposed solutions, and I agree either would be an improvement over what we have today.


Yeah, it has a priority: low priority. Nobody wants to pay for it and it's not costing them in the market.


That's exactly my point. The markets pay for what they care about and ignore/punish what they don't. They rarely pay for security. They rarely punish insecurity. Even in security, it's usually just enough to not look incompetent when a breach or lawsuit happens. Both consumers and businesses care very little about software quality or security if assessing by what they buy, use, and stick with. You can easily prove this by giving them choices between feature- and security-focused products. Even when the latter are free/cheap and highly usable, the market still decides against them massively. The voters also don't push for regulation or liability of this stuff. Many straight-up vote against it.

So, the management at these companies operates in a market that barely cares about security or mostly cares about appearances/perception. The incentive structure rewards working against quality or security. The costs are externalized with little happening to counter that. So, the rational actors ignore quality/security as much as they can. Programmers should act no different in a system if maximizing selfish gain or minimizing work.

Personally, I'm a utilitarian who considers security a public need. I strongly favor regulations and liabilities to increase the baseline of our security. Just cover the basics like memory safety, input validation, secure admin interfaces, error handling, backups, and recovery, if nothing else: the stuff we can already do today with free tools, which suppliers just don't care about. That's not what the market is, though. So I can't blame people in it for giving it what it wants when they risk losing money or perishing by focusing on idealistic goals. I do encourage those doing business in the utilitarian style, though; it ranges from easy to hard work they don't even have to do. I'm also especially glad when I'm one of their customers. :)


Mothers used to die from doctors not washing their hands. The lack of a price signal didn't mean it wasn't a problem, it meant none of the doctors understood how to solve the problem (and neither did the patients).


Just to add: when Lister introduced antiseptic methods he was met with strong resistance from those same doctors, who were equal parts annoyed with the messenger and the message. It's a hard thing to realize that you'd been killing thousands of people in your ignorance, after all. It took quite a long time for his methods to be widely accepted and put into practice. Even when understanding emerges, you have to watch out for entrenched interests defending themselves against change.


The market will care about security when it's more profitable to do so.

The market isn't a static, designed thing. It's an organic beast that will change and consume you if you don't change with it.


The truth is somewhere in the middle. What I’ve noticed over the years is that if you allow yourself to get in the habit of writing quick and dirty code you learn the wrong habits and gradually lose the ability to write correct code for complex problems. So I do favor the correctness approach.

But ... code has to be maintainable, meaning it should be simple for the person who maintains it next. Typically that means no cleverness and no obscure languages or frameworks. Choosing Eiffel only makes sense if you know the next maintainer will be proficient in Eiffel.


There is no need to resort to tools such as Eiffel to take away some very good lessons from what the article is trying to say. Time has moved on since then, and Eiffel has had its day, but just like Smalltalk and other obscure languages, there is some underlying truth there that is well worth studying.


> Choosing Eiffel only makes sense if you know the next maintainer will be proficient in Eiffel.

Choosing Eiffel makes the next maintainer proficient in Eiffel by definition, because that will be the job requirement for the maintainer position. Unless the people responsible for hiring cheap out, that is.

What you're advocating here is optimizing solutions strongly towards being maintainable by cheap, interchangeable workforce. It's a valid goal - presumably one the management would like - but sometimes (often) it's not worth the extra cost in complexity, both early and later on.

(Tangentially: programming is a profession. It should be entirely expected of people to be able to learn new things on the job.)


There's another issue here: many companies, when finding themselves in need of a proficient Eiffel programmer, will just take one of their good programmers in other languages and ask: Would you like to learn Eiffel? And programmers usually enjoy learning new things, so someone will say yes.

I personally did that a few times in my career: accepting a job with a technology I barely knew at the time, because I thought it would be fun to learn it.

You can imagine how that usually works out: software written by someone who was learning the programming language on the job.


> What I’ve noticed over the years is that if you allow yourself to get in the habit of writing quick and dirty code you learn the wrong habits and gradually lose the ability to write correct code for complex problems

As the original commenter who started this thread, I'd like to make it clear that I agree with you and I don't write quick and dirty code. Or at least very rarely. Even for stuff that has a very short shelf life, I write code that usually has very few bugs and that I'm usually proud of because of exactly what you said: I've done it so often it's a habit now.

I've always strived to write the best quality code possible within the constraints. Sometimes those constraints were even my own lack of knowledge. But after three decades of doing this I'm starting to think I'm actually getting to be pretty good at it. ;-)

So I wasn't suggesting to just write bad code in my original comment. Just to have a broader view of where quality goals sit within the many competing stakeholder requirements. A programmer who doesn't let perfect be the enemy of good is a better programmer.


The idea that it doesn't matter seems like survivorship bias. People see all the companies getting their databases dumped without a hit to their stock price/valuation and declare caution a waste of time, but what of the companies you never hear about that died because someone messed up in the rush to ship?


One indicator is how many companies are bought by turnaround specialists, or end up in fire sales or acquihires to competitors.

Those are pretty good bellwethers for the effects of bad technical management.


Five years as an example is probably the outer limit and so it supports your case well. But I've seen badly designed software cost twice as much to put right (compared to putting the effort in at the start) within a few months. It's a false economy over and over but that gets buried in all the drama that follows.


>>For a website that will support a two week marketing campaign, you don't need anything talked about in this article

>[...]

>And then, five years after you've left the company and some system inevitably collapses with nobody having a clue as to what went wrong you'd finally realize the wisdom of all that.

So a guy puts up a web site for $500 in consulting fees as a 2-week project. It makes the company $7 million over the next 72 months because it becomes literally the biggest inbound channel.

Are you saying he shouldn't have built it for $500? What should he have done?


You should probably read the whole comment.


I suppose - and this applies generally, not just to programming - people (myself included) don't exactly like the market's idea of "perfect". Because free-market perfect isn't just the nice-sounding "solution that does the job for the lowest cost over time" - it's the borderline cheapest, ugliest, worst solution that's only barely fit for its purpose, and if it was any worse, it would be unsellable. It's aiming for the absolute lower bound, as any possible way to make it cheaper is taken.

The problems with market!perfection are many - mostly revolving around short-term optimization, externalities, and lack of alignment between market values and human values. But we have brains that can be used to get a better outcome than what the market incentivizes by default.


> The problems with market!perfection are many - mostly revolving around short-term optimization, externalities

Almost all bugs, security or otherwise, are just negative externalities. I think the public has been conditioned to accept that software not only comes with bugs, but many of them, and that there's nothing that can be done about it. Companies/developers/publishers are not penalized much at all for buggy software, so much so that it's a common business tactic to deliver crappy software that doesn't even accomplish what it claims to do, much less do it bug-free and without security problems, with the understanding that it can be fixed up after delivery with few negative consequences.


Totally agree. After decades of programming my "aesthetics" around what makes a high quality program are far higher than what the market will (usually) pay for.

This applies to many professions we consider a craft. I'm sure some guys slamming up 2x4s for carbon copy houses in the suburbs would rather be building timber frame homes with inlaid custom woodwork.


The problem starts when your construction guys start to cheap out to the point that they violate some expectations you have about your house without you realizing it. Like that walls should actually support the load with some margin, so that the whole thing doesn't collapse after the first hole you drill. Or that electrical cabling shouldn't be aluminum wrapped in paper. It may sound like a strawman, but we have tons of regulation (paid for in blood) in this space precisely because markets incentivize people to cheat if they can get away with it.

I see markets like combustion. A powerful force if you can contain and channel it, but an absolute disaster if you let it roam free.


I don't think that's correct. You're describing the marginal solution that the market produces, not the average solution. And because demands are always changing, the market never converges on that optimum you are presenting as a bogeyman. It's not realistic any more than all profits being competed away to zero.


Yes, I've described the solution in the limit, in the same sense a physicist could write: lim t -> ∞ ... A limit doesn't itself happen in real life, but shows you where a system is going.

As for a proof that this is happening, there are plenty of examples if you look at the highly competitive spaces and consider goods you usually buy, and how they evolved over past decades. Food and tools are two obvious cases that come to mind.


> One final stakeholder requirement that's always a priority: you have to be able to find qualified developers, and what developers learn is based on popularity and fashion.

But popularity and fashion are not disembodied forces controlled by the whims of the gods. They are the sum total of decisions made by people. We could make different choices, but the belief that this is futile is a self-fulfilling prophecy.


How could we make different choices? These choices are rarely made by developers unless it's their own startup. For most IT jobs these kinds of decisions are made by upper management who have to answer to a board of directors, and ultimately to shareholders. No one ever got fired for using Java. (I personally quit C++ and Java many years ago because I thought both were a bit of a mess. Maybe things have improved with Java - I haven't followed it much).

Upper IT management can't just say "we're switching to Haskell because fewer bugs". They have to sell the idea to the folks who will pay for those massive changes which include retraining existing employees. And management does take into consideration how big the hiring pool is. That affects costs. A surge in demand for Haskell programmers would of course create a higher cost for hiring and keeping them.

So they continue to hire Java programmers and students graduating from uni make sure they know Java.

Where is the opportunity for us to make different choices?

Should students refuse jobs in Java? Or, without picking on a language, jobs at companies that have the highest software quality standards? That's a lot of idealism and responsibility to ask of someone who is just hoping they can get a foot in the door and start their career.

Or is there someone else you have in mind that could be making different choices? It does seem like my own personal choice to give up on Java has had zero effect on its popularity.

There is far more at play than the "the belief that this is futile is a self-fulfilling prophecy" although I agree that too is a factor.

I do have hope though. One thing I've seen happen is more and more programmers who were introduced to functional programming at uni and who manage to sneak ideas from that in wherever they can. I think we are slowly moving away from the worst parts of the object oriented paradigm and adopting the easier and better parts of functional programming.

I'm looking forward to seeing what we end up with. So far, I think the evolution has headed in the right direction and we are getting better and better at this thing we call coding.


> How could we make different choices?

By exercising our free will. By having the courage to stand up and say, "Yeah, I get that we have a huge investment in Perl code, but Perl really sucks so how about we do this new project in Python instead? Or Scheme? Or Common Lisp?" Or, "Yeah, I get that we can get it done faster by doing it in Java, but that will incur a huge amount of technical debt, and also make it so that the really cool kids, the ones who get why Java sucks, won't want to work for us. So how about we make an investment in our future and try Clojure or Scala or Rust instead?" If enough people do that, eventually one of these overtures will get a green light. Whatever organization does that first will eventually accrue a competitive advantage, and that will in the fullness of time change the dynamic.

But it won't happen if no one even tries.


> By exercising our free will. By having the courage to stand up and say

Did you miss the part of my comment where I said I gave up Java and C++ for reasons of code quality? That was well over 15 years ago. I am doing my part actually.

I've had the "courage" to talk to management countless times in my career about what I think the best choices would be to improve quality. I don't even think it takes courage. It's just a conversation and attempt to sell an idea. Happens all the time in business.

Do you think my description of the challenges around getting change to happen was more accurate than the parent's "the belief that this is futile is a self-fulfilling prophecy"?

I hardly think my comment claims it's futile when I concluded with being hopeful that change has and is happening.


> Did you miss the part of my comment where I said I gave up Java and C++ for reasons of code quality?

No, but the bulk of your comment sounded pretty defeatist to me:

"These choices are rarely made by developers"

"No one ever got fired for using Java"

"Where is the opportunity for us to make different choices?"

"It does seem like my own personal choice to give up on Java has had zero effect on its popularity."

It's true: a single person's decision to stand up against the system is unlikely to have an effect. But if everyone makes that choice, it will have an effect. And that is more likely to happen if more people stand up and say that you should make that choice rather than whine about how they tried and failed.


Being realistic about what you are up against is not defeatist. Is there anything I mentioned that's not a realistic assessment? Is it me that's negative, or is it the actual situation? And I also mentioned where I think the promising opportunities are coming from.


Realism and defeatism are not mutually exclusive. That's the thing about self-fulfilling prophecies: if you believe in them, then they are actually true.


You haven't yet answered any of my questions. Since this doesn't seem like a conversation but more like you trying to take my comments in the most ungenerous way possible, I'm going to stop here. But if you want to answer some of my questions and discuss the massive challenges around changing the global dev culture I'll be happy to continue.


> You haven't yet answered any of my questions.

That's because you're asking the wrong questions.

"Is there anything I mentioned that's not a realistic assessment? Is it me that's negative, or is it the actual situation?"

Those are the wrong questions. The answers to those questions don't get you any closer to a solution to the problem. They only get you to the conclusion that the situation is hopeless and that you should give up.

OK, you want an answer? It is both you and the situation that is negative. But the reason that the situation is negative is because of people like you who have decided that the situation is negative and that the only reasonable thing to do is to give up. And you know what? You're right. That is the only reasonable thing to do. Which means that the only way to solve the problem is to be a little unreasonable and carry on despite the fact that it makes no sense.

That's why I'm doing things like this:

https://github.com/rongarret/ergolib


I haven't given up. You've still refused to acknowledge that my very first post talked about where I think the promise is coming from. Universities and young programmers who have learned and embraced functional programming.

If you think you have the right answers to change a globe full of programmers and to make businesses focus more on software quality, then by all means you should go on to prove to the world you are right.

I'll continue to assess the situation realistically and do my thing. I've never once failed to point out to management when I think things are being done poorly. I'm often the new guy on the team who starts measuring and reporting tech debt to management. You're super quick to judge a stranger based on a few comments in a forum.

When I put the concerns of the company first (instead of being idealist and obsessive about quality as the only goal that ever matters) management takes what I say seriously. So when I do speak up about quality they listen. That's my path. Fine that yours is different.

Maybe if you asked more questions and attacked less you'd find that those who you think are your enemies are actually allies and just have a different idea about the best way to improve things.


> You've still refused to acknowledge that my very first post talked about where I think the promise is coming from.

So I went back and re-read your first post:

> How could we make different choices? These choices are rarely made by developers unless it's their own startup.

That sounds very negative to me.

> For most IT jobs these kinds of decisions are made by upper management who have to answer to a board of directors, and ultimately to shareholders.

Ditto.

> No one ever got fired for using Java.

Ditto.

> Upper IT management can't just say "we're switching to Haskell because fewer bugs". They have to sell the idea to the folks who will pay for those massive changes which include retraining existing employees. And management does take into consideration how big the hiring pool is. That affects costs. A surge in demand for Haskell programmers would of course create a higher cost for hiring and keeping them.

Ditto ditto ditto.

> So they continue to hire Java programmers and students graduating from uni make sure they know Java.

Ditto.

> Where is the opportunity for us to make different choices? Should students refuse jobs in Java? Or without picking on a language, jobs at companies that have the highest software quality standards? That's a lot of idealism and responsibility to ask of someone who is just hoping they can get a foot in the door and start their career.

Ditto.

> Or is there someone else you have in mind that could be making different choices? It does seem like my own personal choice to give up on Java has had zero effect on its popularity.

Ditto.

> There is far more at play than the "the belief that this is futile is a self-fulfilling prophecy" although I agree that too is a factor.

Mostly ditto.

And then, after wading through that sea of negativity, we finally get to this:

> I do have hope though.

Very well, I acknowledge that in your first post you talk about where you think the promise is coming from. But do you see how someone might come away with the impression that you were not entirely optimistic about it?


Yeah, you're being very ungenerous. Just for example, it's not negative to say that startups have the power to make their own language choices.

I've never thought an accurate description of the current environment and the challenges around changing it is a pessimistic view of things. Nor optimistic. Just realistic. I prefer realism. Is that bad?

If you want to see it as negativity, I think that says something more about you than me.

One of the reasons I described all of those things AND asked questions is that I was hoping someone would actually come back with some stories about how they have overcome those challenges.

Saying that Java is still the most popular programming language only because people haven't tried enough seems to me to be both inaccurate and dishonest, and does nothing to help change that. Clearly what we've done in the past has changed nothing. Time to look at what we missed and to try something other than "just do it".

If we really do want to use better programming languages and techniques (I'm pretty into TDD myself) then it's very important to understand what management thinks about those ideas and why, and how we might influence the real decision makers. I've tried the wrong way enough times in my career to understand the right way. Selling an idea to upper management usually has to come with low risk and a guaranteed return on investment.

That's neither optimistic nor pessimistic, neither positive nor negative. It's just the simple truth. Don't kill the messenger.


> And a perfect solution according to markets is the solution that does the job for the lowest cost over time.

Yep. If we actually valued correctness, the market would place some cost on incorrectness. Security is a common example of this. If we wanted, we could punish software developers or publishers for shipping security bugs (for example, making them liable), and then we would see an immediate shift in how software was written.

That's not necessarily an endorsement of that position (it lacks enough nuance to even be remotely considered a good idea), it's just a fairly illustrative example.


The market can't solve the quality problem; that's a fantasy.

Software quality is so poor across the board that there is almost no one producing high-quality, reliable software. My bloody phone can't properly select text in this box, for example.

Software companies have optimised their production for decades to develop software that's just good enough to ensure profit, and they've trained customers to expect and accept poor quality.

Is anyone surprised when Windows reboots for updates in the middle of an important presentation? No, that's how Windows works.

Companies will even reduce quality to improve profits and still get away with it if the customers don't notice, don't understand or are too apathetic to take action.

Often there is no action to take that will sufficiently punish the offending company.

Security is a great example: it's been bad for so long that core pieces of technology are fundamentally insecure and can't be significantly improved without total rewrites. Punishing developers and companies would result in a collapse of commercial software development, so humanity collectively accepts that software is by nature insecure.


> but first make sure a top priority is program correctness

That's the thing: this doesn't happen in the real world, and it's questionable whether the concept of correctness is useful at all, since it is based on unvalidated and incomplete assumptions about the world that are never fully correct themselves. A sort of correctness of incorrectness.


Correctness can just mean "does what it's intended to do". This is fairly unambiguous in cases such as implementing A* search. If you hit an NPE and your program explodes, or if you aren't calling your heuristic (central to A*), then you have skimped on correctness. Furthermore, basically any time you hit something like a runtime error, you likely did not intend to hit it. Therefore, regardless of problem domain, taking approaches that minimize these types of errors is an uncontroversial way to prioritize correctness.


People don't even think consistently, so correctness is an ever-moving target (aka a negotiation).


Case in point: from reading discussions about the C standard, my impression is that a compliant compiler is "correct" if it reads your program that dereferences a null pointer and blows up the world.


If a null pointer handler in a nuclear warhead firmware caused immediate detonation, that would certainly be correct according to C standard ;).


I agree with your analysis, but please notice that this new trend has completely redefined the idea of a programming job, to the extent that, had I known about it in advance (say, 30 years ago), I wouldn't have chosen this profession in the first place. So the honest advice to "well intentioned types" (as you put it) is to switch to something else, otherwise you will feel miserable for the rest of your life. (I just bring your post to its logical conclusion, which I happen to agree with entirely.)


It makes me a little sad to read this because I don't think there are zero opportunities for jobs where quality is the absolute priority. People write code for medical devices for example. In which case you have a legal obligation and an ethical obligation to write the highest quality code possible.

I wasn't making the case that we should always abandon quality. Just that we understand its relative importance to other business requirements. That we don't build a $2,000 lock for a $10,000 safe to protect $1.

I'd like to think the majority of code I write is high quality. I am proud of most of the code I write. And I spend a lot of time helping junior devs refactor their code to be more robust, deal with edge cases, and be more maintainable, etc.

I've also been in the industry for a few decades now, and I'm more excited than ever about the opportunities for quality. The code we were writing in COBOL back in the day is nowhere near the level of quality of code that can be written today. Hell, we weren't even measuring bugs and quality back then.

I encourage you to have a look around for opportunities where your employer's values and goals are better matched to your own values and goals. If you are super interested in software correctness, you shouldn't be building throw-away websites for short-lived marketing campaigns. But that doesn't mean those jobs aren't important too. Just that you shouldn't be the one doing them.


The world needs idealists and well intentioned types like these, because the improvements in software quality won't come from the burnt out, disillusioned, just ship it already types like you.

What software wisdom would you share with us instead? That projects have deadlines and budgets? That the perfect solution does the job at the lowest cost over time? That the right way to program is to take the customer's requirements into consideration?!

Honestly it looks like you gave up a long time ago and you're just trying to convince the rest of us that mediocrity is the way to go. No thanks.

And it's not like Meyer is advocating for some hardcore formal verification... he's merely pointing out that design by contract can improve software quality. The same DbC that has first-class support in Eiffel and could be implemented in many languages. Even the C++ Boost library recently added support for DbC; it's kind of ugly and probably bloats the object files, but it's there.
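For anyone who hasn't seen DbC outside Eiffel, here's a minimal sketch of the idea in plain Python. The `require`/`ensure` helpers and the `withdraw` example are my own illustrative names, not a standard API or anything from the article:

```python
def require(cond, msg="precondition violated"):
    """Caller's obligation: checked before the routine does any work."""
    if not cond:
        raise AssertionError(msg)

def ensure(cond, msg="postcondition violated"):
    """Supplier's obligation: checked before the routine returns."""
    if not cond:
        raise AssertionError(msg)

def withdraw(balance, amount):
    # Precondition: the request must be positive and covered by the balance.
    require(0 < amount <= balance)
    new_balance = balance - amount
    # Postcondition: exactly the withdrawn amount is accounted for,
    # and the balance can never go negative.
    ensure(new_balance == balance - amount and new_balance >= 0)
    return new_balance
```

A violated precondition blames the caller; a violated postcondition blames the implementation. That assignment of blame, more than the runtime check itself, is the part of Meyer's idea that survives translation into any language.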


> burnt out, disillusioned, just ship it already types like you

I love programming. And I've been at it for three decades. Zero burn out. I don't believe in just ship it, and I take a lot of pride in my work. And yes, part of that pride comes from having a better understanding of customer requirements than someone idealistically insisting we cannot ship working software because the aesthetics of the code aren't pleasing enough yet. That's why I get paid a lot more than idealistic junior devs. Whose idealism I do appreciate, and who I enjoy mentoring. Most of that mentoring being around how to improve the quality of their code... I never said to ignore quality.

Does it sound like I'm saying the world doesn't need idealists when I said this in my original comment?

"So this article is good (although I think you're really looking at functional programming by this time in history) but first make sure a top priority is program correctness before getting into the mode suggested by the article"

Maybe it's your own cynicism that gave you such an ungenerous interpretation of what I thought would be helpful advice to other devs. Whatever it is, that's on you, not me.


Maybe you are a champion of quality in real life and a mentor of the less experienced, but that's not how you come off in your comments, and I'm not the only one noticing that.

Yeah we know that not all projects can be built to the most stringent quality standards. The reality is that a lot of them have piss-poor quality though: they crash, lose data, get easily hacked, etc.

And when that's the reality, the software engineering profession doesn't need somebody preaching about requirements and keeping a balance between quality and other concerns; it needs quality fanatics.


> that's not how you come off in your comments and I'm not the only one noticing that.

I agree there are some people willingly ignoring the positive things I said in my original comment, and the other positive things I've said in other comments. My guess is that they would rather my view not be nuanced because otherwise there's not much left to attack. I'm happy to try to correct any misinterpretation, but I don't think it's all on me because it's not all comments that disagree with me. Quite a few comments show that people are getting my nuanced view. I would encourage you to re-read my very first comment with an open mind and decide if it's a balanced view based on decades of experience, or if I was actually really saying quality isn't important. If I was, it's very strange I would say the article in question is good. Which I did say.

I disagree that the software profession needs quality fanatics. The software profession needs quality fanatics when it makes business sense. Nobody ever stayed employed by building a $2,000 lock on a $1,000 safe to protect a $1 bill. What you built might be of the highest quality, but when you build with blinders on and ignore every other business requirement you aren't a fanatic for quality. You're irresponsible. I take pride in my ability to do what's best for the company. It's often difficult and requires making tough decisions and tradeoffs and risk management and making sure everyone involved is aware of all of those things. That's what a professional looks like. Not a perfectionist obsessed with quality to the point they become difficult to work with. Don't let perfect be the enemy of good.

> The reality is that a lot of them have piss-poor quality though: they crash, lose data, get easily hacked, etc.

Not the projects I'm on. I refuse to ship code that is that bad. I will get fired first.


> Honestly it looks like you gave up a long time ago and you're just trying to convince the rest of us that mediocrity is the way to go. No thanks.

That's exactly how I summarized most of his comments yet he still claims it's not true. I don't know, I am interested if he will reply to some of my comments so we can better judge what he actually had in mind. Him claiming that everybody misunderstood him is not a helpful discussion starter. :)


I've been thinking on similar lines for a while. It seems like the number one thing it comes down to is that more deep thought is required for a number of these 'obviously better' techniques (I think they generally come down to being more declarative than imperative, saying how the results should look instead of describing how to get them).

So lots of people are inclined to say things like, "then don't be lazy, do the work and think deeply"—but for one thing, 'deep thought' should be considered as a conserved resource and you have to choose what to spend it on (it's not automatically better to always spend it); and for another, I think the actual depth required for perhaps the majority of real-life problem domains is impractical, or perhaps sometimes the domain is even fundamentally not amenable to an elegant mathematical solution. So you have to contort the 'better' language to do inelegant things it's not really suited for anyway.

I think the constraint on the 'triples_from' structure from the article is a bit misleading and makes a good example of what I'm talking about. It looks super simple, but if you note how this constraint

  across tf.item as tp all tp.item.source = tf.cursor_index end
is actually working, it's not so much that the language gives a particularly elegant/powerful means of specifying these constraints as it is that 'triples_from' was structured in such a way that it would be easy to specify the constraint. My feeling is that trying to do this for real-life things would rarely ever be so neat.
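To make the comparison concrete: the Eiffel `across ... all ... end` above is just a universal quantifier over a container, which most languages can express too. In Python it might look like this (the `Triple`/`TriplesFrom` names below are my own stand-ins for the article's structure, not its actual code):

```python
from dataclasses import dataclass

@dataclass
class Triple:
    source: int   # index the triple was generated from
    target: int
    label: str

class TriplesFrom:
    """Stand-in for the article's 'triples_from': the triples collected
    while the cursor sat at a single index."""

    def __init__(self, cursor_index, triples):
        self.cursor_index = cursor_index
        self.triples = triples

    def invariant(self):
        # Rough equivalent of:
        #   across tf.item as tp all tp.item.source = tf.cursor_index end
        return all(tp.source == self.cursor_index for tp in self.triples)
```

The quantifier itself is trivial; the real work, as argued above, was shaping 'triples_from' so that there is a single obvious property to quantify over in the first place.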

If the domain you're modeling has mathematical elegance to begin with—absolutely, do the deep thought and uncover the invariants. But if it doesn't, then you'd be 'programming wrong' by trying to use such techniques.

The alternative I've been thinking about is just developing far more powerful/pervasive visualization tools so that you can write relatively mindless imperative code and just see where things are going wrong more easily. (consider the author's example of multiple data structures which must be kept in sync, but it's hard to tell when they get out of sync). Not trying to self-promote too much but you can follow my profile to the project if you're curious.


Regarding the constraints, the article also makes a very important point: it's not all or nothing, you can have _some_ invariants and pre/post conditions easily.

E.g. the size of arrays, the "nonblankness" of strings, or balanced invoices.

The simple fact of thinking about the constraints will help design the code.
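As a hedged sketch of what one of those partial checks might look like, here is the "balanced invoice" invariant in Python (this `Invoice` class is my own illustration, not code from the article):

```python
class Invoice:
    """Class invariant: the stated total always equals the sum of the lines."""

    def __init__(self, line_amounts):
        self.line_amounts = list(line_amounts)
        self.total = sum(self.line_amounts)
        self._check()

    def add_line(self, amount):
        self.line_amounts.append(amount)
        self.total += amount
        self._check()  # re-establish the invariant after every mutation

    def _check(self):
        assert sum(self.line_amounts) == self.total, "invoice out of balance"
```

Even this much forces a design decision up front: is `total` stored or derived? That's exactly the kind of pressure on the design that thinking about constraints provides, independent of any particular language support.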


> Based on empirical evidence, there isn't really a problem at all with the way people program.

I disagree strongly. I would say that software development is a tire fire. Every day there are reports of bugs in software causing all kinds of real-world problems. These days software bugs can kill people[1]. And there are only going to be more CPUs and more software and more bugs going forward.

> Markets have already mostly figured out the rare cases when such robustness is really needed.

I don't see how this is a supportable statement in a world that includes the "Toyota Unintended Acceleration" bug? (Among so many, many others.) When you say, "a perfect solution according to markets is the solution that does the job for the lowest cost over time" aren't you effectively saying we should only expect software to be as good as it has to be to protect the liability of some corporation? Not to take a cheap shot, but uh, I think the families of the people who died in the car accidents-- excuse me, collisions ("'Accident' implies there was no one at fault.") --might have a different attitude to software correctness?

In any event, your entire argument is predicated on the idea that correct software is expensive. But this is only true because we, as an industry, have not made effective use of the available tools to write correct software!

It's not expensive to write correct code. It doesn't take longer either. We just don't do it.

Dr. Margaret Hamilton figured out how to write bug-free software as a side effect of her work on the Apollo 11 program years ago and nobody noticed.[2] (She's the person who coined the term "Software Engineer" BTW.) Byte-for-byte, flawless code is no more expensive than buggy code. And maintenance costs are near zero, so it's actually cheaper. You just have to use the right method (which has existed for about 30~40 years, nearly totally ignored.)

Please PLEASE don't make excuses. We can get this right, even if "the market" doesn't notice or care.

To reuse my lumberjack metaphor from my other comment: Chainsaws exist. They are safer and faster than axes. There is in fact no way that cutting down trees with an ax is objectively better than cutting down trees with a chainsaw.

You are maintaining that, since trees are felled well enough with axes today, there's no sense in investing in chainsaws and chainsaw training, not now, nor ever in the future.

You're saying that "the market" only really needs wood cut by axes and doesn't value wood cut by chainsaws because all the furniture made with it is good enough, and we should just live with splinters and chairs that break and drafty houses, etc...

Frankly, it's kind of a lame argument to find on a pro-tech forum. Fancy sophisticated technology is awesome, unless it threatens to obsolesce your favorite ax, eh?

> you have to be able to find qualified developers, and what developers learn is based on popularity and fashion.

This is the source of the problem: we're not "engineers", we're not even car mechanics! We're barely above the level of kids building go-carts in their backyards out of junkyard parts. We should be ashamed of our fashion-driven-ness. The amazing thing is that we got the business people to go along with this![3]

Maybe someone should just start offering chainsaw-cut wood at cheaper prices than the hand-hewn crap and see if "the market" likes that noise? (In case I'm being too arch, that's exactly what I'm about right now. I'm so freaking passionate about this.)

"Program correctness" shouldn't have to be "top priority" because it should be the default. You should get it "for free" along with the software for the same price because it costs nothing extra.

[1] For example: "A Case Study of Toyota Unintended Acceleration and Software Safety" Prof. Phil Koopman. September 18, 2014. Carnegie Mellon University https://users.ece.cmu.edu/~koopman/pubs/koopman14_toyota_ua_...

[2] Nearly nobody noticed. I know it sounds crazy but it's true: There's a simple, easy method to develop bug-free software. James Martin wrote it up in a book "System design from provably correct constructs: the beginnings of true software engineering." in 1985. That's probably the best source if anyone wants to read up on it.

[3] A programmer in language or framework A who cannot or will not learn language or framework B is not a good programmer. (I don't mean putting "Won't do PHP" on your resume, I mean that you can't or won't learn it in the first place.) Q: "What if we can't find Python devs? There are so many Java devs. We should use Java." A: "Why would you hire a Java guy who can't do Python? Even to do Java?" I've had that conversation a bunch of times.


A lot of things are wrong with your arguments. When you say correctness, you are pushing a specific method for reducing the number of problems caused by defects, while those problems are what we actually care about when we talk about reliability, not defects, not bugs, not correctness. Yours is just one such method, and one of the expensive ones. This is a very important distinction if you want to understand why "proving correctness" cannot get anywhere. For a lot of software, even when we need reliability, there simply exist objectively better ways to achieve it.

Business is another thing: nobody convinced businesses to go along with anything; it's the other way around. The incentives to produce software the way it is produced come from businesses, as they are the ones paying for it. Engineers just go along with it.

I get that the software industry is very dogmatic and it's hard to see things for what they are. But we should at least try.


> When you say correctness, you are pushing for a specific method for reducing amount of problems caused by defects, while these problems is what we actually care about when we talk about reliability, not defects, not bugs, not correctness.

I'm using "correctness" to mean software that is bug-free.

The system that the software is a part of may still have problems, but none of those problems should be caused by bugs in the software components. It's also possible to build bug-free software that solves some other problem than the one you have. Both of those issues are orthogonal to the issue of why the industry doesn't adopt methods that generate bug-free code.

> Yours is just one such method, and one of the expensive ones.

I expect to be able to train normal, non-programmer folks to be able to use it (a HOS-like system that permits elaboration of a top-level spec into bug-free working code) to develop bug-free programs. If that works it may be so cheap that it depresses the market for "real" programmers. I should be so lucky.

But regardless, these methods (HOS, Cleanroom, etc.) just aren't that dreadfully expensive. And bug-free code, once written and paid for, can be reused. The cost analysis has been on the side of "provable correctness" for longer than my lifetime.

> This is very important distinction if you want to understand why "proving correctness" cannot get anywhere.

Well I don't accept your assumption that '"proving correctness" cannot get anywhere'. My whole point is that it has gotten places and we're all ignoring it because we prefer to use e.g. C and Java and {{POPULAR LANGUAGE}}...

> For a lot of software even if we need reliability there simply exist more objectively better ways to achieve it.

If you are talking about "reliability" of software I don't know what that means other than bug-free.

I know you can build reliable systems out of unreliable parts, but to do that there still has to be some reliable system orchestrating them and compensating for failures.

But again, I'm not saying there's a method for reliable systems, only reliable software. The absence of the former doesn't invalidate the existence or desirability of the latter.

> Business is another thing, nobody convinced them to go along with anything, it's the other way around. Incentives to produce software the way it is produced actually come from businesses as they are the ones paying for it. Engineers just go along with it.

The "suits" aren't to blame for this one. They only know what we tell them (to a first approximation). If you tell your boss, "I can chop better with this ax than that chainsaw," you're lying or ignorant. How is management supposed to even know the possibility is there if the programmers don't tell them? Or they bring it up and the response is "Sure, I'd love to use {{TECH}} but {{EXCUSE}}"?

There's a cost for bugs. If you can get bug-free programs for the same upfront cost as buggy ones (and my whole argument is that we can, but don't) then there's no upside and only downside to accepting buggy software, and methods that permit buggy software to be written.

We could use chainsaws; instead we use axes and claim chainsaws are too expensive and axes cut fine.

It's not because the bosses, who only care about board-feet[1] per worker per hour, are too cheap to buy chainsaws.

[1] "The board-foot is a unit of measure for the volume of lumber in the United States and Canada. It is the volume of a one-foot length of a board one foot wide and one inch thick. " ~https://en.wikipedia.org/wiki/Board_foot


I've never heard of many ax-related deaths, and I'd imagine an out-of-control chainsaw is far more dangerous than an out-of-control ax. The vast majority of lumber industry deaths are due to falling trees. If we took the chainsaws away and replaced them with axes and hand saws, far fewer trees would fall. It would greatly reduce production, but I'm not convinced it would result in more deaths. Quite the opposite.

I do get the point of your analogy, but it's really a very poor analogy.

> If you can get bug-free programs for the same upfront cost as buggy ones (and my whole argument is that we can, but don't)

This sounds like a massive market opportunity. One then wonders why absolutely no one has exploited the opportunity. There's something missing here.


(In this metaphor, you can even use your ax to carve a chainsaw out of a tree in your spare time, and then use that! But we don't. "It's too expensive."

As an example, here's a way to get Logical Paradigm programming in your favorite language http://minikanren.org/ It's simple, easy to understand and implement, and you can use it to do things like type inference and type checking with flexible and powerful constraints to define and ensure invariants and stuff like that. This stuff isn't expensive or even that challenging, we just don't do it.)
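To give a flavor of what sits at the heart of a miniKanren-style system: structural unification, the same machinery that drives type inference as constraint solving. This is not miniKanren's actual API, just a toy Python unifier I wrote to illustrate the idea (variables are strings starting with "?"):

```python
def walk(term, subst):
    """Resolve a variable through the substitution until a value or an unbound var."""
    while isinstance(term, str) and term.startswith("?") and term in subst:
        term = subst[term]
    return term

def unify(a, b, subst):
    """Toy structural unification. Returns an extended substitution, or None on failure."""
    a, b = walk(a, subst), walk(b, subst)
    if a == b:
        return subst
    if isinstance(a, str) and a.startswith("?"):
        return {**subst, a: b}
    if isinstance(b, str) and b.startswith("?"):
        return {**subst, b: a}
    if isinstance(a, tuple) and isinstance(b, tuple) and len(a) == len(b):
        # Unify element-wise, threading the substitution through.
        for x, y in zip(a, b):
            subst = unify(x, y, subst)
            if subst is None:
                return None
        return subst
    return None

# Unifying ("pair", ?x, 2) with ("pair", 1, ?y) binds ?x=1 and ?y=2.
bindings = unify(("pair", "?x", 2), ("pair", 1, "?y"), {})
```

Once you have this, declaring an invariant is just asserting that two terms unify; the solver does the bookkeeping.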


I did a little digging and even started reading "System design from provably correct constructs: the beginnings of true software engineering". From what I can see, it's mostly garbage used to sell the products of the HOS company. James Martin was on the board of Higher Order Software, Inc., a company that, despite claiming to be able to provide "bug free software", went out of business.

Here's Edsger Dijkstra debunking the books written about HOS.

https://www.cs.utexas.edu/users/EWD/transcriptions/EWD08xx/E...

An evaluation by the US Navy concluded: "the HOS literature tends to advertise their ideas and products more than making a contribution in substance to the field of Computer Science. The author recommends that USEIT not be used in the TRIDENT program or any program development at NSWC. Even for a high level system specification, USEIT is not seen as a good choice. A mathematical functional notation or a PROLOG-like notation appears better suited for that purpose. The examples in the Appendices of this report, especially Appendices C, D, and E, show that a LISP-style mathematical notation is more compact and normally easier to read than the control map notation of USEIT. On a more positive note, the author considers the functional approach to system development and programing very promising. Systems so conceived and programs so constructed are more amenable to analysis and therefore, in principle, more reliable and better manageable."

http://www.dtic.mil/dtic/tr/fulltext/u2/a198753.pdf

So are we just talking about functional programming when we say HOS? If so, the pros and cons of functional programming have been discussed at great length.

Here is an interview with Simon Peyton Jones (one of the creators of Haskell) talking about why Haskell is "useless". It's a short interview worth watching if you want to understand the nuances and challenges and costs around creating high quality and bug-free software.

https://www.youtube.com/watch?v=iSmkqocn0oQ


Cheers! (I replied on the sibling comment.)


> I would say that software development is a tire fire. Every day there are reports of bugs in software causing all kinds of real-world problems.

All of human history is a tire fire then. Most stuff around the world is poorly engineered and just gets the job done. Wooden bridges with rope and wire holding them together and no analysis whatsoever done on what load it can bear. For most of history those bridges made up the majority of the bridges in the world. And it worked just fine until modern transport put higher demands on bridges. Yet, you still find clunky wooden bridges all over the undeveloped world, and they continue to work.

Should we try to do better? Of course we should. But someone has to pay for it. It doesn't happen magically.

Regarding the Toyota Unintended Acceleration bug, that's gross negligence if the top priority coming from management wasn't quality and if someone can prove that, they should end up in jail. And I would not excuse the developers either. I would never ship code that I know might kill someone. I would rather quit and work as a cashier. Please re-read my original post because I never said quality isn't important. I only said that quality is not always a top priority, and that in some cases it should be a low priority. A website for a 2 week marketing campaign will never kill anyone. It would be a waste of resources to insist on anything other than just shipping it once it works.

If you're honest with yourself and look around, you do it all the time in your own life too. You draw a diagram on a piece of paper or a whiteboard to explain a concept and then you throw it away or erase it. You don't carve it in stone just because it will last longer and someone a hundred years from now might find it useful. You put up a simple rope barrier to keep people from stepping on newly planted grass. You don't erect the Great Wall of China. Temporary "low quality" solutions that will later be disassembled and thrown out (or save the rope for reuse at least) are often the best fit based on the requirements.

> You just have to use the right method (which has existed for about 30~40 years, nearly totally ignored.)

I'm very skeptical that markets would completely ignore an opportunity to beat the competition if the costs were exactly the same but the results were higher quality. But I'm going to look into this. Thanks for the reference. I'm going to read James Martin's book and see if I learn something new that will help me write better code.


> All of human history is a tire fire then.

Yes. (We have spent the last 10,000 years recovering from the Younger Dryas and we are only just now getting back on our feet. Heck, most of us still think agriculture is a good idea when really it's about the dumbest way imaginable to relate to the soil. But I digress.)

> Most stuff around the world is poorly engineered and just gets the job done. Wooden bridges with rope and wire holding them together and no analysis whatsoever done on what load it can bear. For most of history those bridges made up the majority of the bridges in the world. And it worked just fine until modern transport put higher demands on bridges. Yet, you still find clunky wooden bridges all over the undeveloped world, and they continue to work.

Ah, but none of those bridges are built out of electrified math.

Software is electrified math and it can be perfect.

And it's self-referential: we can write perfect meta-code that emits only perfect code.

> Should we try to do better? Of course we should. But someone has to pay for it. It doesn't happen magically.

My point is not that we never try. My point is that the world contains many attempts and most of them have been ignored by most working programmers.

> Regarding the Toyota Unintended Acceleration bug, that's gross negligence if the top priority coming from management wasn't quality and if someone can prove that, they should end up in jail. And I would not excuse the developers either. I would never ship code that I know might kill someone. I would rather quit and work as a cashier. Please re-read my original post because I never said quality isn't important. I only said that quality is not always a top priority, and that in some cases it should be a low priority. A website for a 2 week marketing campaign will never kill anyone. It would be a waste of resources to insist on anything other than just shipping it once it works.

Let's assume, for the sake of argument, that I'm wrong and correct software always costs more than incorrect software. In this scenario (which may well be the REAL scenario) you have put your finger on the important bit: we're talking about the location of the inflection point.

Allow me to reference Randall Munroe, "Is It Worth the Time?" https://xkcd.com/1205/ It's a handy chart that shows, "How long can you work on making a routine task more efficient before you're spending more time than you save? (Across five years)"

It's not precisely what we're talking about, but it's got the same flavor: how much do you expect to use the buggy software vs. the cost of correctness...

Now my point would be: The industry should have had a house-on-fire urgency around reducing the cost of correctness, to shift the inflection point downward so that all but the most trivial software can be made correct economically.

We should have been doing that since forever (or at least sometime after the Apollo 11 mission.) Instead we generally ignore these sorts of things.

> If you're honest with yourself and look around, you do it all the time in your own life too. You draw a diagram on a piece of paper or a white board to explain a concept and then you throw it away or erase it. You don't carve it in stone just because it will last longer and someone a hundred years from now might find it useful. You put up a simple rope barrier to keep people from stepping on newly planted grass. You don't erect the wall of China. Temporary "low quality" solutions that will later be dissembled and thrown out (or save the rope for reuse at least) are often the best fit based on the requirements.

Have you been to Daiso? It's the Japanese dollar store. Pretty much any human problem that can be solved by ten ounces of plastic can be solved in Daiso for $1.50. I'm not generally into consumer culture, but I love Daiso.

You don't need 'temporary "low quality" solutions' if you have Daiso.

It's not that I don't use hacks, or don't respect them, it's that we're so far behind where we should be in terms of off-the-shelf solutions (to programming) and we don't seem to be quick on the uptake...

> I'm very skeptical that markets would completely ignore an opportunity to beat the competition if the costs were exactly the same but the results were higher quality. But I'm going to look into this. Thanks for the reference. I'm going to read James Martin's book and see if I learn something new that will help me write better code.

God bless you! (I collect powerful ideas and I cannot tell you how many times people have said, "If $FOO is so great, why doesn't everybody use it already?"... I don't know! I don't freakin know! It makes me sad. All of human history is a tire fire, indeed.)

- - - - - - - - - - - - -

This is my reply to your later comment on this same thread.

First, wow, I'm impressed. You are actually doing the homework and I tip my hat to you with great respect. Seriously, that's the nicest thing you could have done and I really appreciate it.

Second, yes the language and presentation around these "HOS" ideas has apparently always been really bad, with the issues you describe. It also doesn't help that the necessary background knowledge and jargon wasn't wide-spread at the time.

Third, yes, it was panned by Dijkstra and the Navy. I've read both of those reviews, and their objections are not without merit. But, and I say this as someone who has huge respect for Dijkstra, they were both wrong: they both missed the fundamental advantages, or "paradigm" if you will, of how HOS et al. work.

(Also, I've seen that Simon Peyton Jones interview. And no, we're not just talking about Functional Programming; that's kinda orthogonal. E.g. Haskell helps you write code with fewer bugs; HOS prevents them in the first place. Another way to differentiate them is that if you're typing text in a text editor to make software, you're not doing HOS, regardless of the language.)

So, poor marketing, bad reviews, obscure principles and the general disinterest of industry led to this powerful technology languishing.

Yet, I insist there's something there. Let me try to convey my POV...

In modern terms I can describe the crucial insights of the HOS system concisely. Here goes:

Instead of typing text into a flat file of bytes and hoping it describes a correct program, the HOS method presents a tree of nodes that is essentially an Abstract Syntax Tree (but concrete and there's no driving syntax because there's no source text.) The developer edits the tree using only operations that maintain the correctness of the tree.

This is like "ParEdit"[1] in emacs, or a little bit like some of what J. Edwards is attempting with Subtext[2], or the old "syntax-directed programming environment called Alice"[3] for Pascal. (Again, it's not that no one has ever tried anything like this; my whole point is that powerful techniques for writing software with fewer bugs have been around for a long time and we, in general, don't use them.)

The main difference from these is that HOS uses a very simple and restricted (but Turing complete) set of operations to modify the tree: Sequence, Branch, Loop, Parallel. (There are some "macros" built out of these operation for convenience but underneath it's just these four.)

Starting with a high-level node that stands for the completed program you gradually elaborate the tree to describe the structure of the program and the editor/IDE enforces correctness at each step. You literally cannot create an incorrect program.

Apparently normal people, accountants and such, could sit down in front of the IDE and, with a little training and coaching, learn to describe their own work processes in it and essentially write programs to automate (parts of) their own work.
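Here's a toy sketch of that editing discipline (Python; all names and the representation are mine, reconstructed from the description above, not from HOS itself). The point is that the only available edit is "refine a hole with one of the four control forms," so every intermediate tree is a well-formed program skeleton by construction:

```python
# The four allowed control forms and how many children each demands.
OPERATIONS = {"sequence": 2, "branch": 2, "loop": 1, "parallel": 2}

def hole(label):
    """An unelaborated node: a placeholder awaiting refinement."""
    return {"kind": "hole", "label": label, "children": []}

def refine(node, op):
    """Replace a hole with one of the four control forms.
    Refusing anything else is what makes malformed trees unrepresentable."""
    if node["kind"] != "hole":
        raise ValueError("can only refine an unelaborated node")
    if op not in OPERATIONS:
        raise ValueError(f"unknown operation {op!r}")
    node["kind"] = op
    node["children"] = [hole(f"{node['label']}.{i}") for i in range(OPERATIONS[op])]
    return node

def well_formed(node):
    """The invariant the editor maintains: every non-hole node has the arity its kind demands."""
    if node["kind"] == "hole":
        return True
    return (len(node["children"]) == OPERATIONS[node["kind"]]
            and all(well_formed(c) for c in node["children"]))

# Elaborate top-down from a single node standing for the whole program.
program = hole("main")
refine(program, "sequence")            # main = step 0, then step 1
refine(program["children"][0], "loop")  # step 0 is a loop over a sub-hole
```

At every step, `well_formed(program)` holds, not just at the end; a real system would, of course, also track types and data flow across the nodes.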

I've been working towards bringing this to market, on and off, for years now. In fact, my first programming job was the result of a talk I gave on a prototype IDE at a hacker convention about fifteen years ago. I have just finished implementing type inference and type checking for my latest vehicle: a dialect of the Joy programming language. It has been slow going (I lead a chaotic life) but I'm on the cusp of having something I think will be really great. If it works, it will revolutionize software development.

Quixotic, I know, but somebody's gotta tilt at those windmills...

Anyway, thank you again for taking the time to look into this. I can't tell you what that means to me personally. I know the "Provably Correct" book is terribly written, but I urge you to try to look beyond that. All I can really honestly tell you is that I'm convinced there's something really important and useful there.

[1] "ParEdit (paredit.el) is a minor mode for performing structured editing of S-expression data." https://www.emacswiki.org/emacs/ParEdit

[2] http://www.subtext-lang.org/

[3] "In a syntax directed editor, you edit a program, not a piece of text. The editor works directly on the program as a tree -- matching the syntax trees by which the language is structured. The units you work with are not lines and characters but terms, expressions, statements and blocks." ~https://www.templetons.com/brad/alice.html


That's about all the time I have left to spend on this. If you truly have discovered a way to do what you're claiming and it has just been a victim of bad luck and poorly written books in the past, then there is a huge market opportunity and I wish you the best in bringing it to market.

Interesting discussion. Thanks. I will keep my eye on this space from time to time.


Well met. You're in rare company to have gone and looked, I appreciate that. Cheers!

Check my submissions in a month or two, I'll put in a "Show HN" when I have something to show.



