
I think the GP is talking about something like C, which has a static type system at compile time. But at runtime, it's all ones and zeroes. In Clojure a symbol is just a symbol, and the (strong) types of the objects it points to are resolved at runtime. So Clojure's type system is, in a precise sense, the opposite of what the GP was talking about!


It depends, really. Clojure's compiler actually does perform static, compile-time type analysis in certain cases, notably on primitives. For a more traditional form of static type analysis, I imagine you'd use the macro system to provide something like

http://en.wikipedia.org/wiki/Qi_(programming_language)#Type_...

That said, I'm personally of the opinion that compile-time static type verification isn't quite as powerful as is oft conjectured, in terms of safety and performance. Sufficiently expressive type systems pay some of the same costs as run-time verified types in terms of indirection, and in practice the difference rarely seems significant.

To be more explicit: I assert that C (and, for that matter, Java) do not have strong type systems, in the sense of power, not rigidity. There are huge classes of bugs in kernel code that could in principle be caught by static type verification but aren't, because the C type system simply isn't powerful enough to express the invariants required. Yet, somehow, people write successful operating systems in C, and they work not because of the type system but because of careful thought and thorough testing.
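To make the "inexpressible invariants" point concrete, here's a small sketch — in Rust, purely as an illustration, not one of the languages under discussion — of an invariant kernel code cares about: user-space and kernel-space addresses must never be mixed. In C both are just `void *`, so the compiler can't object (Linux resorts to the `__user` annotation plus an external checker); with distinct wrapper types, swapping the arguments is a compile error. The names `UserAddr`, `KernelAddr`, and `copy_from_user` are hypothetical, loosely echoing the Linux helper of the same name.

```rust
// Hypothetical sketch: wrap raw addresses in distinct newtypes so the
// compiler itself rejects mixing user- and kernel-space pointers.
#[derive(Debug, Clone, Copy, PartialEq)]
struct UserAddr(usize);

#[derive(Debug, Clone, Copy, PartialEq)]
struct KernelAddr(usize);

// Destination must be a kernel address, source a user address.
fn copy_from_user(dst: KernelAddr, src: UserAddr, len: usize) -> usize {
    // Real code would validate the range and copy; here we just report len.
    let _ = (dst, src);
    len
}

fn main() {
    let k = KernelAddr(0xffff_8000_0000_0000);
    let u = UserAddr(0x7fff_0000);
    // copy_from_user(u, k, 16); // would not compile: arguments swapped
    assert_eq!(copy_from_user(k, u, 16), 16);
    println!("ok");
}
```

The wrappers cost nothing at runtime — both compile down to a bare machine word — which is exactly the sense in which this invariant is free to check statically but invisible to C's type system.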

Even more to the point: We have written operating systems in dynamically-typed Lisp before, even down to the microcode level. The Symbolics Lisp machines should illustrate that it is perfectly possible to do what the parent asserts is infeasible.

All this said, I certainly wouldn't write an OS in Clojure; though I love the language, the JVM just wouldn't be an appropriate substrate. The instant someone announces an LLVM Clojure compiler, though... ;-)


You make excellent and carefully argued points. I want to say that even though I do not agree with your opinion on static type systems, I enjoyed reading your whole argument.

One thing is not clear to me, though. It reads to me like you are saying that indirection incurs only some of the cost of dynamic typing, so why do you follow with "and in practice the difference rarely seems significant"? I can't connect the latter part; clearly I have missed something.

I am of the opinion that you can write complex systems in C because, although it is itself complex as an abstraction of a complex system, there is none of the incidental complexity found in C++. Its small size makes it simple for those who have understood the system it abstracts. I still believe that a more expressive type system would reduce some of the effort and uncertainty in testing and using C.

Do you consider the Clean, ATS, Haskell and OCaml type systems to be expressive? These languages have powerful, expressive type systems and tend to be very fast, often without departing from idiomatic code. In my experience I have been saved many times by helpful type systems, and I follow them like gradients, which lets me hold more in my head without getting lost in the minute details of correct piping, invocation and application. I find I am much slower with dynamic typing, and pressed by a feeling of paranoia.

I can't explain why preferences are often expressed so viscerally, my only guess is that there must be something biological that influences what type of type system you prefer.


I actually quite like the Haskell and OCaml type systems! I had them in mind while writing my post above. Personally, I dislike type systems which are rigid but weak. Strong compiler verification is awesome in a language like Haskell which offers union types, bounded types, contra- and covariant parameterized types (not commonly referred to as such, but implicit in functors), etc. I agree with you that it makes it easier to handle complex codebases, catches huge classes of bugs readily, and allows you to solve problems in less code. [1] Good type systems are programmer amplifiers.
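The "union types" point can be sketched in a few lines — here in Rust as a stand-in for the Haskell sum types being discussed. An enum is a tagged union, and the compiler rejects any `match` that forgets a variant, so adding a new case forces every call site to be updated at compile time rather than failing at runtime. `Shape` and `area` are made-up names for illustration.

```rust
// A tagged union: a value is exactly one of these variants.
#[derive(Debug)]
enum Shape {
    Circle { r: f64 },
    Rect { w: f64, h: f64 },
}

fn area(s: &Shape) -> f64 {
    match s {
        Shape::Circle { r } => std::f64::consts::PI * r * r,
        Shape::Rect { w, h } => w * h,
        // Omitting a variant here is a compile error, not a runtime surprise.
    }
}

fn main() {
    assert_eq!(area(&Shape::Rect { w: 2.0, h: 3.0 }), 6.0);
    println!("ok");
}
```

This exhaustiveness checking is one of the "huge classes of bugs caught readily" above: forgetting a case is impossible, not merely unlikely.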

I also love the concision and flexibility that comes with Clojure's idiomatic eschewing of type information: it helps me focus on functional composition instead of the particular data. Both have their advantages. Java just makes me angry. ;-)

Regarding the difference in cost: I meant to say that many "strongly typed" compilers are not yet smart enough to elide many of the run-time indirections and safety checks that dynamic languages must use. Really good compilers for really good type systems, like Haskell's, are different: precomputing finite-domain functions into lookup tables, finding fixed points, etc.
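The "precomputing finite-domain functions" idea can be shown concretely — sketched here in Rust rather than Haskell, just for illustration. A function over `u8` has only 256 possible inputs, so its entire graph can be evaluated at compile time into a table, replacing per-call computation with an array index. The names `popcount` and `TABLE` are made up for the example.

```rust
// A finite-domain function: count the set bits in a byte.
const fn popcount(mut x: u8) -> u8 {
    let mut n = 0;
    while x != 0 {
        n += x & 1;
        x >>= 1;
    }
    n
}

// Evaluate all 256 inputs at compile time into a lookup table.
const TABLE: [u8; 256] = {
    let mut t = [0u8; 256];
    let mut i = 0;
    while i < 256 {
        t[i] = popcount(i as u8);
        i += 1;
    }
    t
};

fn main() {
    assert_eq!(TABLE[0b1011], 3); // three bits set
    assert_eq!(TABLE[255], 8);    // all bits set
    println!("ok");
}
```

Because the types guarantee the domain is finite and the function is pure, this transformation is safe for the compiler to do on its own — exactly the kind of optimization a dynamic runtime can't commit to up front.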

This is not a domain I understand very well, but the comments I've read from language folks (for instance, the Dart VM designers) suggest that type checks in particular have a relatively small impact on performance. Polymorphism still leads to things like vtables, etc, and as I understand it modern x86 is pretty good at handling these cases.

Again, I know very little about actually writing compilers/vms, so if you have further comments I'd be interested to hear 'em!

[1] That said, Haskell's type system drives me nuts in its proliferation of types which are almost but not quite compatible; nothing worse than trying to use two libraries which will only interact with their own particular variant of a String or ByteString.


Thanks, that clarified things. Unfortunately I too know little about actually writing compilers so my depth is limited. But I will offer some thoughts anyways.

The slowdown in dynamic languages is about more than type checks, though. It exists because dynamic languages are like some crazy awesome dream world where anything can change at any given moment and things are not necessarily what they seem. So VMs must be extra vigilant, checking many things on every operation: assignments, possible exceptions, whether an object is still the same class, whether it still has the same methods, etc. The term dynamic is almost an understatement! Add to that boxing and the possibility of heterogeneous collections, and slowdowns are the price to pay for all that flexibility.

This makes things very hard to predict, both at the compiler level and at the CPU level in terms of branching, which alone is very costly. Dynamic language programs are not easily compressed. There are ways around that, and languages like Clojure offer a sort of compromise: by being able to lock certain parts down into a solid reality, structures (in the sense of regularities) coalesce and can be used to speed up parts of your code. You can choose where to trade flexibility for speed.

You're right that (subtype) polymorphism does incur a cost, but modern CPUs can handle it well. With parametric polymorphism and value types, though, you get no runtime hit, and some free theorems to boot.
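A tiny illustration of the parametricity point, sketched in Rust as a stand-in for the languages under discussion. Because a generic function knows nothing about its type parameter, it cannot invent or inspect values of that type — the only interesting thing it can return is something taken from its input. That property holds for every instantiation ("for free"), and since generics are monomorphized, there is no runtime tag check or boxing. The helper name `first` is made up for the example.

```rust
// Parametrically polymorphic: `first` cannot examine or fabricate a T,
// so any non-None result must come from the input slice itself.
fn first<T>(xs: &[T]) -> Option<&T> {
    xs.first() // delegates to the built-in slice method
}

fn main() {
    assert_eq!(first(&[10, 20, 30]), Some(&10));

    let empty: &[i32] = &[];
    assert_eq!(first(empty), None);

    println!("ok");
}
```

Contrast this with subtype polymorphism through a vtable, where every call is an indirect jump the branch predictor has to guess at; here the compiler emits a direct, specialized call per concrete type.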

That said, how you think has to have some influence. I have never found static types to be constraining; I actually feel like they allow me to more easily plan future consequences. I suppose I trade implementation freedom for the ability to build consequence trees of greater depth and quickly eliminate unproductive branches.


That's a really good observation. I think in cases like Clojure, the dynamic problems aren't quite as bad because of the emphasis on immutability: the compiler can readily generate single-assignment forms from let and def bindings. With type hints, method calls on variables should then be computable at compile-time. (Where type hints are absent, obviously, you pay the runtime reflection cost.)

One thing I don't quite understand is how expensive the protocol system is; e.g., if I extend a type with a new protocol at run time, perhaps concurrent with the use of an object of that type, how does the compiler handle it? IIRC protocols are handled as JVM interfaces, so it may just be an update in the interface method table, which is resolved... by invokeinterface, right? I imagine you could pay a significant cost in terms of branch misprediction for the JVM's runtime behavior around interfaces...



