I love the things I learned from Haskell, but I find it so painful to actually use :( Similar case with Erlang...
I wonder if there are any functional languages that aren’t weird? Like they call the first element in a list / the other elements “list[0]” and “list[1..]” instead of “car” and “cdr”? Bonus points for a C-inspired syntax rather than an abstract-mathematics-inspired syntax
For now I just write Rust and Python using functional-style design (const inputs, no side effects, etc), but I feel like I’m missing out...
I had been doing Ruby for a long time (also Java and Scala before and after that, now Python).
Gave Elixir and Phoenix framework a shot a month ago and it's a breath of fresh air: great language, outstanding tooling.
I've also done a bit of Haskell, OCaml and PureScript in my personal projects. I do miss having state-of-the-art typing in Elixir, but there is enough in Elixir to keep me happy:
* immutability everywhere
* ridiculously low latencies (not by HFT standards I guess)
I believe someone has made an OCaml BEAM compiler, but I don’t recall offhand if it’s just a hobbyist product or something that could be used seriously.
For me it was a similar story: after many years of Python I got deep into the functional world with Elm, F#, and some Haskell, and then I found out how complete the Elixir ecosystem was. But even a week later I could not stand going back to a dynamic, untyped language; perhaps I didn't insist enough with spec, but I couldn't go much beyond the second week. Did you have extensive usage of functional, strongly typed languages before Elixir, or just basic knowledge? Did that feeling of missing the types really get better after a while? For me the difference feels as jarring as falling from a full-featured IDE to an environment with Notepad, no source control, and no compiler errors/stack traces.
And I used to be the Python guru at my previous company and did a whole lot of language evangelizing, etc. I am afraid there is no way back for me.
I'm not sure where the line is between extensive and basic knowledge. Here is my more detailed exposure:
In commercial context:
* Of strongly typed ones, only Scala (with [shapeless]). Can reluctantly throw in Kotlin as well for its amazing structured concurrency.
In non-commercial context:
* Went through a few chapters of [Software Foundations] doing Coq proofs.
* Worked through most of [Types and Programming Languages] (writing typecheckers in OCaml)
* 3 services in Haskell (1 on Scotty, 2 on Servant). Loved persistent+esqueleto for the ORM layer, disliked Opaleye.
* 2 projects in PureScript (1 with Halogen, 1 with React bindings).
* 1 project in ReasonML (OCaml).
-
> I am afraid there is no way back for me
I see where you are coming from. In my case I can alternate between "I want all invariants properly expressed and checked" and "I just want to ship that barely-working piece of junk and iterate on it". I learned to adjust depending on organization needs. IMO, for many orgs, especially startups/scaleups, the latter is often the more fitting way. With that in mind, I'm willing to trade the guiding hand of great type systems for other productivity aspects (amazing runtime and cohesive web framework in Elixir's case).
Can’t speak for the GP, but the most difficult part of Erlang isn’t the syntax, but trying to clear my mind of how I would design something in C#/Java and embracing/understanding OTP.
>Bonus points for a C-inspired syntax rather than an abstract-mathematics-inspired syntax
Since functional languages are based on the mathematical idea of function application, this is a weird request. What would a C-style functional language even look like? Most functional languages have syntax for doing an imperative-style sequence of assignments before producing a result. Beyond that, the risk is that C-style code will make the programmer think that the language itself is like C, Java, or other imperative languages, which, fundamentally, it wouldn't be.
>imperative languages are based on the idea of a turing machine
In what way? I wouldn't count executing sequential instructions as "the idea of a Turing machine", and no major imperative language has programs with only finitely many states along with an infinite memory space that is both code and data.
You might enjoy Scala. It has a range of usable styles, from "concise Java" to "somewhat Haskell-ish with OOP-ier syntax" to "type astronaut", and you can get by anywhere in between (as long as you don't mind hearing others bicker a bit about what people ought to be doing). It can target JVM, JS, and native/LLVM. And, Scala 3 just reached RC stage; it tries to revamp some of the language aspects that people found to be pitfalls or otherwise confusing, so it might be a great opportunity to try it for the first time.
As a Scala practitioner (well, in my previous job) I found Scala code harder to understand and read than Haskell.
Haskell's syntax is actually quite minimal by comparison. Scala type signatures, in particular of library functions, can be a beast to understand.
Also, because many Scala devs start using it as a sort of "better Java", spaghetti code and imperative-style messes are relatively common. This doesn't happen with Haskell, because nobody starting Haskell tries to write Java-like code with it.
Common Lisp has functions first, second, last, etc. It has rest too.
(first '(a b c d)) returns 'a
(rest '(a b c d)) returns '(b c d)
(third '(a b c d)) returns 'c
(nth 0 '(a b c d)) returns 'a
(nth 1 '(a b c d)) returns 'b
(last '(a b c d) 2) returns '(c d)
I find these quite intuitive, and after a bit of use one gets accustomed to Lisp's three ancient functions: car, cdr, and cons. car is the same as first and cdr is the same as rest.
In comparison to handling lists in more modern languages, Lisp seems overly verbose and a bit cumbersome. To be fair, there has been over half a century of evolution in programming languages since Lisp was originally designed.
Early Lisp was an alternative to programming in Fortran II, Fortran IV, or Algol 60. These are very primitive languages with significant hurdles for programmers trying to do non-numeric computations (e.g. AI programs). Lisp's underlying memory model of garbage collected cons cells gave it so much flexibility compared to Fortran's fixed length arrays containing only numbers. Fortran had no dynamic lists, no dynamic vectors or slices, no dictionary/hashmap types, no records or structs, no variable length strings, no sets, no tuples. Assembly language often appeared as a reasonable alternative.
Here are a few Fortran if statements appearing in a popular programming book of the 60's (I still have it on my bookshelf):
IF (x - 2.1) 40, 40, 30
This is Fortran's three-way arithmetic IF: it jumps to the first label if the expression is negative, the second if zero, and the third if positive. Here the first two labels are the same, so it always jumps to the line labeled 40 (x <= 2.1) or the line labeled 30 (x > 2.1).
Here's another:
IF(I.GE.20.AND.I.LE.42.AND.J.GE.20.AND.J.LE.42) GO TO 702
Lisp's cond (the special form used like if) may seem odd now, but compared to the other languages of the time it was so much more expressive.
Because I'm an Emacs user, I use Lisp frequently just to keep my configuration tweaked the way I like it, but if I was creating an editor like Emacs today I think it would be better to use Python, Javascript, or Lua for the underlying programmable part of the editor.
Perhaps Haskell is heading for the same kind of niche that Common Lisp occupies today, remaining important for the ideas it explored and of historical interest but never becoming more popular than it is today.
It's also worth noting that along with nth which takes the index first, there's elt which takes a sequence as the first parameter and index as the second. aref is similar but restricted to arrays and permits multiple numbers for the subscripts since arrays can be multidimensional. char, which accesses characters in a string, takes the string first and index second as well. bit takes the bit array as the first parameter and the subscripts follow it.
>Like they call the first element in a list / the other elements “list[0]” and “list[1..]” instead of “car” and “cdr”?
car and cdr are remnants of an ancient instruction set (https://en.wikipedia.org/wiki/CAR_and_CDR)... first/rest, head/tail are better. Actually doesn't Haskell use head/tail for lists? Except even there what is in the prelude is busted since head/tail are partial.
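To make the "partial" complaint concrete, here's a minimal Haskell sketch; safeHead/safeTail are names I made up, while uncons is the Maybe-returning function base actually ships in Data.List:

import Data.List (uncons)

-- Prelude head/tail crash on the empty list:
--   head ([] :: [Int])  ==>  *** Exception: Prelude.head: empty list
-- Total alternatives return a Maybe instead of throwing:
safeHead :: [a] -> Maybe a
safeHead []      = Nothing
safeHead (x : _) = Just x

safeTail :: [a] -> Maybe [a]
safeTail []       = Nothing
safeTail (_ : xs) = Just xs

main :: IO ()
main = do
  print (safeHead [1, 2, 3 :: Int])  -- Just 1
  print (safeTail ([] :: [Int]))     -- Nothing
  print (uncons "abc")               -- Just ('a',"bc")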
Maybe I'm weird, but I even prefer car/cdr to, say, Clojure's first/rest. Even though first/rest is completely justified, car/cdr have a kind of abstract, esoteric place for a linear walk over a sequence; they're also symmetric: three letters that differ only by their middle.
In 21 or so years of Lisp coding, I've noticed that my use of car, cadr, cadar, caddr, and all those has been greatly increasing. I hardly ever type second instead of cadr. A cursory scan of my git history confirms it.
One reason is that second and third and such are oriented toward sequences. But sequences are often generic. In TXR Lisp, I have [x 2], so why would I ever write (third x)? It's verbose, like using Roman numerals instead of Arabic.
Now if I'm processing tree structure, I know that it is made of conses, so (caddr x) makes sense.
Part of the reason it makes sense is that when we are processing tree structure, such as code syntax, we cannot just evaluate (caddr x) out of the blue. We can only do that if (cddr x) has been confirmed to be a cons cell: (consp (cddr x)). The syntax could be bad. It could contain the dotted notation in an unexpected place, or be missing required arguments.
It is also easier to read and verify. We know caddr is right because it just adds an a to cddr, the last cell which was tested.
There is an impedance mismatch between validating (cddr x) and then extracting (third x), which isn't there when (caddr x) is used.
Anyway, a lot of that kind of code is avoided by pattern matching.
(when-match (@nil @nil @elem . @nil) x
  elem)
That also avoids traversing the structure multiple times. Unless the compiler is clever about doing CSE between these functions, (cddr x) starts scanning at x.
> Bonus points for a C-inspired syntax rather than an abstract-mathematics-inspired syntax
Out of curiosity, could you give an example of what you find "abstract-mathematics-inspired" in Haskell's syntax? Is it just the one-letter identifiers (which are more a common practice than part of the syntax), the symbols used as function names (which remind me of C++'s "<<" and the like), or what?
My own personal opinion is that Haskell's syntax has some pitfalls with indentation, but other than that it's not particularly difficult. It's just not C-like, but that's a separate issue.
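To make the indentation point concrete, here is a tiny sketch of the kind of layout rule that bites newcomers (the names are mine):

-- The bindings of a let block must start in the same column.
ok :: Int
ok =
  let x = 1
      y = 2   -- lines up exactly under x, so this parses
  in x + y
-- Shift y one column left or right and GHC reports a parse/indentation
-- error; the same layout rule applies inside do and where blocks.

main :: IO ()
main = print ok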
Similarly, Lisp-like languages have barely any syntax at all!
There are advantages in the syntax, and reasons for all these things to be the way they are. I would recommend starting with a sweet intermediate step, picking the domain you most want to program: if you are into web frontend, go with Elm; if you like the .NET ecosystem, server programming, and mobile, go with F#; and you may want to check out Clojure or Prolog in order to understand why a homoiconic syntax becomes important later. Make the effort; it pays off when your brain clicks.
I've not learned its Erlang base, so I can't tell my Erlang from my Elixir, but... in Elixir you can Enum.at(list,n), which with:
at = &(Enum.at(list, &1))
Could simplify into:
at.(n)
If that doesn't suit I'm sure some equally simple such thing could create exactly what you're after. (As I understand it, various kinds of metaprogramming are an intended base element of the language itself.)
car/cdr have historic origins going all the way back to the fifties; the names made sense for familiarity when the language was created. In essence a cons cell is a pair of data and can be used in any way you feel like. Understanding that makes it a powerful tool, which is why many functional experts so stubbornly hang on to it. Many functional languages have types as first-class citizens, and you can define abstractions on top of what you already have. If you put another cons cell in the car, you have created a tree structure, for example one where car points to the children of a node and cdr points to the next sibling.
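A rough Haskell sketch of that left-child / right-sibling idea, just to make it concrete (the Cell type and all names here are mine, not a standard library type):

-- A cell is just a pair; nesting cells in the car position gives a tree.
data Cell a
  = Nil                      -- end of a sibling chain
  | Atom a                   -- a leaf value
  | Cons (Cell a) (Cell a)   -- car = first child, cdr = next sibling
  deriving Show

-- The s-expression (a (b c) d) built out of nested cells:
example :: Cell Char
example =
  Cons (Atom 'a')
       (Cons (Cons (Atom 'b') (Cons (Atom 'c') Nil))
             (Cons (Atom 'd') Nil))

main :: IO ()
main = print example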
Elm doesn't have quick list access like that, but it does have Arrays if you need them. Also accessing a list or array returns a Maybe of the contained type, rather than potentially throwing as in Haskell.
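For contrast, a small Haskell sketch of the difference being described: Prelude (!!) throws on an out-of-range index, while a Maybe-returning version (the behaviour Elm's Array.get gives you) makes the failure visible in the type. safeIndex is my own name, not a standard function:

safeIndex :: [a] -> Int -> Maybe a
safeIndex xs i
  | i < 0     = Nothing
  | otherwise = case drop i xs of
      (x : _) -> Just x
      []      -> Nothing

main :: IO ()
main = do
  print (safeIndex "abc" 1)  -- Just 'b'
  print (safeIndex "abc" 5)  -- Nothing
  -- print ("abc" !! 5)      -- would throw: Prelude.!!: index too large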