Hacker News | lkitching's comments

Neither defprotocol nor deftype introduce static typing into Clojure. Errors in their usage are not checked statically and are only discovered at runtime.


No, defprotocol and deftype have the same properties as Java interfaces and classes, and the types are checked at compile time. This is static typing. Period. Clojure is a compiled language; it does check types during compilation.


Protocols do not work like Java interfaces or classes. Their methods are compiled into regular functions which look up the implementation to use at runtime based on the runtime type of the receiver. Compilation will check that the named function exists but doesn't do any further checking. Given the following protocol and implementation:

  (defprotocol P
    (method [this ^Integer i]))

  (extend-protocol P
    String
    (method [s i] (.substring s i)))
both (method "test" "call") and (method 1 2) will be accepted by the compilation phase but will fail at runtime.

Of course, there's no requirement for Clojure code to be AOT compiled anyway, so in that case any name errors will still only be caught at runtime when the compilation happens.

Type-hinted bindings are only converted into a cast and are not checked at compile time either, e.g.

  (defn hinted [^String s] (.length s))
  (hinted 3)
will be accepted but fail at runtime.
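In Java terms, the hinted parameter behaves roughly like an untyped parameter with a cast inserted at the use site. This is a simplified sketch of the idea, not the actual bytecode Clojure emits:

```java
// Sketch: what a Clojure type hint roughly corresponds to.
// The parameter stays Object; the hint only inserts a cast where it's used.
public class HintSketch {
    // Analogous to (defn hinted [^String s] (.length s))
    static int hinted(Object s) {
        return ((String) s).length(); // cast checked at runtime, not compile time
    }

    public static void main(String[] args) {
        System.out.println(hinted("abc")); // 3
        try {
            hinted(3); // compiles fine; fails here like (hinted 3)
        } catch (ClassCastException e) {
            System.out.println("ClassCastException at runtime");
        }
    }
}
```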

deftype is only used for Java interop and is also not a form of type checking. The methods will be compiled into Java classes and interfaces, but the implementations defer to regular Clojure functions which are not type checked. You can only make use of the type information by referencing the compiled class files from Java or another statically typed language; using them from Clojure will not perform type checking.


deftype IS a Java class, it's not compiled into something else. What is a Clojure function? A Clojure function is a Java class. Clojure is a compiled language, so it does check types, just like Java checks types.

So if you use defprotocol and deftype for every domain object in your code, your code won't compile if there's a type error. Try it.

BTW, that's the way many Clojure libraries are implemented. These libraries rely on dispatch on type to work, so they are taking advantage of the type checking.

Of course, you will say, "oh, Clojure is not normally AOT, so it's not doing the checks", but that's another issue. The issue at hand is this: can you write Clojure such that types are checked at compile time? The answer is YES.

The compiler may run only when you run the program, that's a different issue. You are confusing these two issues.

If you want a separate compile stage, then basically you are already excluding runtime compilation, i.e. you are arguing against runtime compilation. So it's not really about typing, but about how you want to run the program. Isn't it? You want AOT for everything, you don't want runtime compilation. That's it. It has nothing to do with types.


Of course Clojure has to ultimately be compiled into a native format for the host platform, bytecode in the case of the JVM implementation, but that doesn't require type checking in the same way Java does.

Clojure functions are compiled into implementations of clojure.lang.IFn - you can see from https://clojure.github.io/clojure/javadoc/clojure/lang/IFn.h... that this interface simply has a number of overloads of an invoke method taking variable numbers of Object parameters. Since all values can be converted to Object, either directly for reference types or via a boxing conversion, no type checking is required to dispatch a call. With a form like

  (some-fn 1, "abc", (Object.))
the some-fn symbol is resolved in the current context (to a Var for functions defined with defn), the result is cast (not checked!) to an instance of IFn and the call to the method with the required arity is bound. This can go wrong in multiple ways: the some-fn symbol cannot be resolved, the bound object doesn't implement IFn, the bound IFn doesn't support the number of supplied arguments, or the arguments are not of the expected type. Clojure doesn't check any of these, whereas the corresponding Java code would.
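The call-site behaviour can be sketched in Java with a stand-in for IFn (a simplified interface with a single three-argument overload, rather than the real clojure.lang.IFn):

```java
// Simplified stand-in for clojure.lang.IFn's Object-based invoke overloads.
interface Fn {
    Object invoke(Object a, Object b, Object c);
}

public class InvokeSketch {
    // Roughly what a call site like (some-fn 1 "abc" (Object.)) does:
    // cast the resolved value to the function interface (unchecked at
    // compile time) and invoke it with plain Object arguments.
    static Object callSite(Object resolvedVar) {
        return ((Fn) resolvedVar).invoke(1, "abc", new Object());
    }

    public static void main(String[] args) {
        // Works when the var really holds a function...
        System.out.println(callSite((Fn) (a, b, c) -> "called with " + a));
        // ...but a non-function only fails here, at runtime.
        try {
            callSite("not a function");
        } catch (ClassCastException e) {
            System.out.println("runtime failure");
        }
    }
}
```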

Protocol methods just get compiled into an implementation of IFn which searches for the implementation to dispatch to based on the runtime type of the first argument, so it doesn't introduce static type checking in any way.
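That dispatch can be pictured as a lookup keyed on the receiver's runtime class. This is a deliberately simplified Java sketch (Clojure's real implementation caches call sites and handles inheritance), but the idea is the same:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.BiFunction;

public class ProtocolSketch {
    // extend-protocol registers an implementation per class.
    static final Map<Class<?>, BiFunction<Object, Object, Object>> impls = new HashMap<>();

    static {
        // Analogous to (extend-protocol P String (method [s i] (.substring s i)))
        impls.put(String.class, (s, i) -> ((String) s).substring((Integer) i));
    }

    // The protocol "method" is an ordinary function that dispatches on
    // the runtime class of its first argument.
    static Object method(Object self, Object arg) {
        BiFunction<Object, Object, Object> impl = impls.get(self.getClass());
        if (impl == null) {
            throw new IllegalArgumentException("No implementation for " + self.getClass());
        }
        return impl.apply(self, arg);
    }

    public static void main(String[] args) {
        System.out.println(method("test", 1)); // "est"
        // method(1, 2) compiles but fails here: no implementation for Integer
    }
}
```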


But if you add a type hint to the signature, it does check the type. Basically, if you specify the type, it will check the type. Just like any language that doesn't automatically infer types, e.g. Java. So it is the same as Java.

You guys make it out like Clojure is doing something extra to hide Java types, but it doesn’t. What Clojure does is really minimal on top of Java. It barely hides anything.

If you give it a type, it will check the type. If you don't give a type, it falls back to a default type, Object, which IS a TYPE. The fact that the Clojure compiler cannot deal with the GraalVM SVM Pointer type tells you that it's checking types, because Pointer is not an Object! I found this out the hard way: https://yyhh.org/blog/2021/02/writing-c-code-in-javaclojure-...

“One limitation that one needs to be aware of when writing native image related Clojure code, is that most things in the GraalVM SDK inherit from org.graalvm.word.WordBase, not from java.lang.Object, which breaks the hidden assumption of a lot of Clojure constructs.”


I've already shown that type hints do not constitute type checking:

  (defn f [^String s] (.length s))
  (f 3)
is a valid Clojure program that fails at runtime with a cast error.

  class X { public static int f(String s) { return s.length(); } }
  X.f(3)
is not a valid Java program at all. Clojure compilation generates bytecode to dispatch dynamically and all but the most basic checks are handled at runtime by the JVM. This is fundamentally different to the static type checking that languages like Java and Scala do. It's not that Clojure is hiding something from Java, but rather that it isn't doing the considerable amount of effort the Java type checker does to analyse the program before execution. This is by design - Clojure has deliberately avoided adding a static type system in favour of things like spec.


So what happens at runtime if errors are found?


you get runtime errors with a long JVM stacktrace


If you paste into Claude, it will instantly tell you what's going on.


If the core team had ever addressed the decade of surveys showing that error messages/stacktraces were people's top complaints, you wouldn't need Claude.


Which OCaml features exist in Rust but not Haskell? The trait system looks very similar to Haskell typeclasses, but I'm not aware of any novel OCaml influence on the language.


> Which OCaml features exist in Rust but not Haskell?

Rust's most important feature! The bootstrapped implementation.


I'm not convinced the implementation language of the compiler counts as a feature of the Rust language. If the argument is that Rust wouldn't have been invented without the original author wanting a 'systems OCaml' then fine. But it's possible Rust would still look similar to how it does now in a counterfactual world where the original inspiration was Haskell rather than OCaml, but removing the Haskell influence from Rust as it is now would result in something quite different.


Rust isn't just a language, though.

Additionally, unlike some languages that are formally specified before turning to implementation, Rust has subscribed to design-by-implementation. The implementation is the language.


That just means the semantics of the language are defined by whatever the default implementation does. It's a big stretch to conclude that means Rust 'was' OCaml in some sense when the compiler was written with it. Especially now the Rust compiler is written in Rust itself.


You're overthinking again. Read what is said, not what you want it to say in some fairytale land.


The way in which monads are monoids isn't really helpful for understanding them for programming.


I dunno, it helped monads "click" for me at a theoretical level (they're just monoids in the category of endofunctors, what's the problem?) but by then I'd acquired a strong intuition of them already, so maybe it's not helpful for a novice. I'm pretty sure Applicatives help tho, LYAH seems to think so anyway. They sure would have helped me: the "Gentle Introduction to Haskell" was anything but.


Monads being monoids isn't very interesting in a practical sense - being able to associatively combine steps in a program doesn't buy you much. (We do it all the time when we extract a subset of instructions as a procedure.)

I'd say monoids are more interesting, mainly due to implications for massively parallel algorithms and caching and how finding operations that fit the laws let you reap massive benefits in both aspects.

I've yet to see something as practically interesting for monads. Perhaps monadic parsers or STM qualify. What would you say makes monads just as interesting in a practical sense?
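The parallelism payoff of monoids can be sketched with Java streams: reduce only needs an associative combining operation with an identity element (i.e. a monoid) to be free to split the input into chunks and combine partial results in any grouping. An illustrative sketch:

```java
import java.util.stream.IntStream;

public class MonoidSketch {
    // (int, +, 0) is a monoid: + is associative and 0 is its identity.
    // That is exactly what lets reduce fold each chunk of the range
    // independently (possibly on separate threads) and combine the
    // partial sums in any grouping without changing the answer.
    static int sumTo(int n, boolean parallel) {
        IntStream range = IntStream.rangeClosed(1, n);
        if (parallel) {
            range = range.parallel();
        }
        return range.reduce(0, Integer::sum);
    }

    public static void main(String[] args) {
        System.out.println(sumTo(100, false)); // 5050
        System.out.println(sumTo(100, true));  // 5050, however the work was split
    }
}
```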


Since this defines an interface, does this solve the null problem? e.g.

    Maybe<String> foo = null;
    foo.map(s -> "bar");
explicit pattern matching is usually discouraged anyway though, and you can do this now with Optional

    Optional<String> foo = Optional.empty();
    String message = foo.map(s -> "Hello " + s).orElse("Fine, leave me hanging");
Other languages like Scala also have an Option.fold method for this specific case.
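Java's Optional has no fold method, but map plus orElseGet covers the same two-branch case. A hypothetical fold helper (the name and signature are mine, modelled loosely on Scala's Option.fold):

```java
import java.util.Optional;
import java.util.function.Function;
import java.util.function.Supplier;

public class OptionalFold {
    // Hypothetical helper: handle the empty and present cases in one call,
    // like Scala's Option.fold, built from Optional's existing map/orElseGet.
    static <T, R> R fold(Optional<T> opt, Supplier<R> ifEmpty, Function<T, R> ifPresent) {
        return opt.map(ifPresent).orElseGet(ifEmpty);
    }

    static String greetOrDefault(String name) {
        return fold(Optional.ofNullable(name),
                    () -> "Fine, leave me hanging",
                    s -> "Hello " + s);
    }

    public static void main(String[] args) {
        System.out.println(greetOrDefault("world"));
        System.out.println(greetOrDefault(null));
    }
}
```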


How is this related to Brexit? Ireland has had a low corporation tax rate for the last 20 years, long before the Brexit vote.


Assuming you're referring to *1, *2, *3 and *e, these are only defined within the REPL and are never used in real programs.


The OP is contrasting between a 'validation' function with type e.g.

    validateEmail :: String -> IO ()
and a 'parsing' function

    validateEmail :: String -> Either EmailError ValidEmail
The property encoded by the ValidEmail type is available throughout the rest of the program, which is not the case if you only validate.


The post is suggesting that parsing and validation are different things, since the output of a parser captures the properties being checked in the type, and validation does not. Downstream consumers of validated input cannot rely on the properties that were validated since the representation type doesn't encode them e.g. the non-emptiness of a list.
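The contrast might look like this in Java terms (ValidEmail is a hypothetical wrapper type, and the check is deliberately naive, for illustration only):

```java
import java.util.Optional;

public class ParseDontValidate {
    // The parsed type: a ValidEmail can only be obtained via parseEmail,
    // so holding one is proof downstream that the check already happened.
    public static final class ValidEmail {
        private final String value;
        private ValidEmail(String value) { this.value = value; }
        public String value() { return value; }
    }

    // "Parsing": the result type carries the checked property forward.
    static Optional<ValidEmail> parseEmail(String raw) {
        if (raw != null && raw.contains("@")) { // naive check, illustration only
            return Optional.of(new ValidEmail(raw));
        }
        return Optional.empty();
    }

    // "Validation": returns no evidence; downstream consumers of the raw
    // String must either trust that it was validated or re-check it.
    static boolean validateEmail(String raw) {
        return raw != null && raw.contains("@");
    }

    public static void main(String[] args) {
        parseEmail("a@b.com").ifPresent(e -> System.out.println("parsed: " + e.value()));
        System.out.println("validated: " + validateEmail("a@b.com"));
    }
}
```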


Arguably the resource is the dataset of stock exchanges, and the CSV representation is forced to omit all the metadata but the HTML representation isn't.


I understand what people are getting at, as it's not really that big of a logical leap. I think the fact that it is somewhat of a stretch, but still "in the lines," is exactly why it is a "trick": it isn't doing anything particularly invalid or hacky, it's just not necessarily what you'd imagine when reading the RFC. Content negotiation to me is more about serving the optimal content to a given agent, not really about selecting modalities for different use cases based on different types of user agents.

I think both cases are "valid" although I think it is inherently less tricky if the document talking about and previewing the dataset is referenced via a separate URL from the dataset itself. (Which, of course, entirely mitigates problems like Apache Spark having HTML in the Accept header.)


There's been a fair amount of churn in the ecosystem, even if the language itself has been very stable. Leiningen was ubiquitous 5-6 years ago, but if you switched to using deps for dependency management you lost the ability to run tests or build uberjars and had to manually re-create these on a per-project basis. Now there's tools.build, but that also requires you to manually write essentially the same tasks for basic functionality in each project. Leiningen and tools.deps also seem to resolve dependencies differently, so you can run into issues simply by migrating from one to the other.


That's not "massive amounts of toolchain pain"; this happens in every general-purpose language. I'd say that's "business as usual".

Like in JavaScript, for example, any lib composed with CRA (Create React App) is painfully difficult to re-use in non-CRA apps. You constantly run into dependency resolution pain with Haskell; Python has toolchain resolution problems; .NET has its own challenges. Well, at least .NET folks don't have to run three different, incompatible versions of Visual Studio anymore to compile a single project. I do remember those days.

If I had to run multiple versions of Clojure, Leiningen, or CIDER on a single machine to compile different projects, or if Clojure folks had to invent something like pyenv, yeah, I'd agree that there is a problem.

You just can't make everyone happy. People either complain that "Clojure is dying" because some lib hasn't been updated since March, or "Clojure has too much churn" because Cognitect rolled out a new lib.

Clojure earned its reputation for stability because you can pick any five- or six-year-old project and it will still compile. Now you're complaining that you've decided to switch to a different build tool and saying it's painful?

Do you know what's painful? Having to migrate from Angular to React and to keep them both in the same .js project during the transition phase. Clojure has nothing of that sort. So many times we slowly moved from one thing to another with virtually zero downtime.

I can compare the frustration you seem to describe with my own experience building things in different languages. Clojure by far is the least frustrating in that regard.

