> Do you know why we have pointless language wars, and meaningless arguments about what is good in practice?

As a veteran language warrior, I resent the claim that my efforts are "pointless". There's a lot of terrible software out there, and one of the reasons for this is that inappropriate choices were made at the outset.

Now, I'll be the first to say that there isn't "one language to rule them all". I recently got into a debate over C++ (which I dislike) where I learned that there are people making judicious use of this language, which is still inappropriate for 90% of the purposes to which it is put, but seems to suit some needs perfectly in a way that neither a high level language nor C can.

Software engineering is a generalist discipline: how to solve problems and add value to organizations using software. It applies whether you're writing performance-critical weather simulations or web apps. It has rules (such as "prefer immutable data and referentially transparent functions") which you must know to be a decent software engineer but that you will end up breaking 1 to 10 percent of the time, because to break said rule (e.g. using mutable state) will be the most tasteful decision.
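
To make the "rule vs. tasteful exception" concrete, here is a minimal sketch in Python (my own hypothetical example, not from the thread): a referentially transparent moving average, and a variant that keeps a mutable running total internally for speed while staying pure from the caller's point of view.

    # A minimal, hypothetical sketch (not from the thread) of the rule
    # and the tasteful exception. Both functions compute a moving average.

    def moving_average(xs: list[float], window: int) -> list[float]:
        """Follows the rule: referentially transparent, never mutates the
        caller's data, same input always gives the same output."""
        return [sum(xs[i:i + window]) / window
                for i in range(len(xs) - window + 1)]

    def moving_average_fast(xs: list[float], window: int) -> list[float]:
        """Breaks the rule internally: keeps a mutable running total so each
        step is O(1) instead of O(window). Still pure as seen by the caller,
        which is what makes the rule-break tasteful rather than sloppy."""
        if window > len(xs):
            return []
        out: list[float] = []
        total = sum(xs[:window])
        for i in range(window, len(xs) + 1):
            out.append(total / window)
            if i < len(xs):
                total += xs[i] - xs[i - window]
        return out

    # Same contract, same results; only the internals differ.
    assert moving_average([1.0, 2.0, 3.0, 4.0], 2) == moving_average_fast([1.0, 2.0, 3.0, 4.0], 2)

The second function is the 1-to-10-percent case: the same observable behavior, with mutation kept local where it actually buys something.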

I think skill as a software engineer is pretty easy to measure-- increasing scopes of contribution, reliable competence on harder architectural problems. Domain-specific programming skills are much harder to measure, and skill of that variety is n-dimensional without question.

Software engineering, I'd argue, is the liberal arts of programming. You need to know about algorithms, but not necessarily become an expert. Same with AI, programming languages, operating systems and databases. You study all these things to learn how to learn, so that if you find yourself, some years later, needing to tune a NoSQL database, you know how to find the knowledge, autonomously if necessary, and apply it.



> Software engineering is a generalist discipline: how to solve problems and add value to organizations using software. It applies whether you're writing performance-critical weather simulations or web apps.

I agree absolutely. I've done jobs in scientific computing (whoops! knew nothing about interfacing with hardware, had to learn), networking (whoops! knew nothing about SNMP, had to learn), and stock trading (wtf is an intermarket sweep order? time to learn again). There's a role for specialists, and I like the fact that I become more valuable and more independent as my domain knowledge increases, but even more I like that my skills transfer from one vastly different domain to another.

Someday maybe programming will be specialized. Like medicine: there's no way we would let a chimpanzee doctor operate on a human or even prescribe drugs to one. Right now there are natural areas of specialization, but specialization just isn't what it's cracked up to be. Anyone who has dealt with hired guns in software knows that specialists can't compete with smart generalists who are close to a problem. It's as if botanists performing surgery on close friends and family members always achieved better medical results than surgeons operating on people they didn't know. What's the use of specialization in a field like that?


> I recently got into a debate over C++ (which I dislike) where I learned that there are people making judicious use of this language, which is still inappropriate for 90% of the purposes to which it is put, but seems to suit some needs perfectly in a way that neither a high level language nor C can.

You just learned that?


"n-dimensional without question", huh?

Boy did you miss the point of that article.

Although I do know of a great position that might interest you: it's called a 'software architect'. Just let me take you round the back here, and ignore the guys with guns...



