Hacker News | Apollidore's comments

Haskell is fun, but bear in mind that it follows a purely functional programming paradigm, meaning you either do things the Haskell way or you don't do them :). But it's a fun language to learn, and there are lots of free, quality resources on the internet


What do you mean by "either do things the Haskell (way) or you don't do them"? Isn't that the same with every language? I still don't understand why people put all sorts of unfair labels on Haskell. By labelling Haskell with "fun", we almost make it sound like it can't be used in production.


What I think he meant is more like "either do things the functional Haskell way or you don't do them", as in, there is no non-functional fallback or alternative in the language, which some functional languages have.


There are escape hatches here and there. If you need imperative code, use IO or ST. If you need debug output, there's Debug.Trace.

If you need to temporarily bypass purity, there's unsafePerformIO and friends (which, I believe, can occasionally even be used permanently if you do enough analysis).

It's a pretty practical language compared to its reputation.
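To illustrate the ST escape hatch mentioned above: it lets you write an imperative-looking loop inside a function that stays pure from the outside. This is a minimal sketch; the name sumST is just illustrative.

```haskell
import Control.Monad.ST (runST)
import Data.STRef (modifySTRef', newSTRef, readSTRef)

-- Mutation happens inside ST, but runST seals it off,
-- so sumST is an ordinary pure function to its callers.
sumST :: [Int] -> Int
sumST xs = runST $ do
  ref <- newSTRef 0                       -- mutable accumulator
  mapM_ (\x -> modifySTRef' ref (+ x)) xs -- imperative-style loop
  readSTRef ref
```

The type of runST guarantees the mutable reference can't leak out, which is why the result is still pure.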


Actually, someone said, “Haskell is the world’s best imperative programming language.”


Somebody said it, but it's glib, and not particularly true. I believe it was a comment on do notation, the bind operation, and programmable semicolons. The customizability is definitely cool, but nobody could possibly argue that imperative programming in Haskell is ergonomic.


Sure, IO and ST are escape hatches, but they're not escape hatches that beginners struggling with the concept of a higher order function are going to be able to use.


I imagined it was more in the "Have fun implementing Quicksort!" sense, where you're basically fighting the language's constructs to do what you want.


Haskell has a huge catalog of libraries you can reach for, so you don't have to reinvent the wheel. That said, the Haskell way can seem very different from what you see in most languages. As an example, I/O (or anything causing a side effect) has to happen in the IO monad. There are good reasons for this, but it takes some getting used to.
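As a tiny illustration of that separation (the names here are mine): a function's type says whether it can have effects, and anything that prints has to live in IO.

```haskell
-- Pure: the type Int -> Int promises no side effects.
double :: Int -> Int
double x = 2 * x

-- Effectful: putStrLn lives in IO, so this action does too.
printDouble :: Int -> IO ()
printDouble x = putStrLn (show x ++ " doubled is " ++ show (double x))
```

The compiler enforces the boundary: calling putStrLn from inside double simply wouldn't type-check.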


But let's say I want to print something in a deeply nested function whose return type isn't in IO. Now I have to convert the entire call chain into monadic style. Is there some tool to do this automatically?


For debug there is Debug.Trace which is about perfect.

For non-debug, you may be overestimating how useful it is to output from pure functions in production.

In general, when you're in a situation where you need to change the monad you're under, there are techniques and libraries for that (one way: you define the stack once, give it a name, etc.).

The biggest pain tends to be going from pure to monadic in the first place; last I used Haskell, there wasn't much for that but to just do it. There may be more tooling now, I haven't looked.
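For the debug case, Debug.Trace lets you log from deep inside a pure call chain without rewriting its callers into IO. A sketch, with a made-up function name; traceShow prints its first argument (to stderr) and then returns its second.

```haskell
import Debug.Trace (traceShow)

-- Logs its input as a side effect of evaluation, but keeps the
-- pure Int -> Int type, so no caller has to change.
deepHelper :: Int -> Int
deepHelper x = traceShow ("deepHelper got", x) (x * x)
```

Because of laziness, the trace only fires when the result is actually demanded, which can make the output ordering surprising.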


Why would you want to print inside a pure function though?


Because some non-technical manager came to you and said, "Can't you just print out the value here? And can I get it in the next 15 minutes?" and it was pretty clear that "No" was not an acceptable answer.


The parent poster is not lamenting that Haskell lacks a trivial sorting function.

They are lamenting that implementing a trivial sorting function in Haskell is difficult.


Different things are hard or easy in Haskell compared to other languages. Mergesort would be the first one I'd reach for in Haskell if I'm implementing from scratch.

Quicksort isn't really hard: you just do it in IO, and it looks about like what you'd see in any imperative language (or you can use ST, but then you have to know about ST).
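A sketch of that imperative route, using mutable unboxed arrays from the array package that ships with GHC. The function names are mine, and I've used the Lomuto partition scheme for brevity.

```haskell
import Control.Monad (when)
import Data.Array.IO

-- In-place quicksort over arr between indices lo and hi.
quicksortIO :: IOUArray Int Int -> Int -> Int -> IO ()
quicksortIO arr lo hi = when (lo < hi) $ do
  p <- partition arr lo hi
  quicksortIO arr lo (p - 1)
  quicksortIO arr (p + 1) hi

-- Lomuto partition: last element as pivot; returns its final index.
partition :: IOUArray Int Int -> Int -> Int -> IO Int
partition arr lo hi = do
  pivot <- readArray arr hi
  let go i j
        | j >= hi   = swap arr i hi >> return i
        | otherwise = do
            x <- readArray arr j
            if x <= pivot
              then swap arr i j >> go (i + 1) (j + 1)
              else go i (j + 1)
  go lo lo

swap :: IOUArray Int Int -> Int -> Int -> IO ()
swap arr i j = do
  a <- readArray arr i
  b <- readArray arr j
  writeArray arr i b
  writeArray arr j a
```

Apart from the monadic plumbing, it reads much like the C version; the mutation is just confined to IO.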


I disagree.

  quicksort [] = []  
  quicksort (x:xs) =   
      let smallerSorted = quicksort [a | a <- xs, a <= x]  
          biggerSorted = quicksort [a | a <- xs, a > x]  
      in  smallerSorted ++ [x] ++ biggerSorted


That is (infamously) not quicksort.


What's wrong with it?


Performance, and internal representation. You are not actually modifying things in place; those are the equivalent of linked lists, not arrays. That quicksort is not quick at all, because the append operation in Haskell is linear. You can't use those lists to implement a real quicksort.


Haskell is lazy. a ++ b === (head a) : ((tail a) ++ b) (plus the base case)

It is only linear when you force it to evaluate, which ideally happens only once (amortized) in your program, when the result is consumed. As long as you are careful, you don't get the bad quadratic performance that you'd get from something like eagerly appending a character to a string in a loop in Java. There are other gotchas, though, including relying on the compiler to do things in constant space that naively look linear, and also knowing when the compiler won't help you and you have to structure your computation manually.
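The careful-vs-careless distinction shows up with nested appends. A minimal illustration (both functions produce the same list; only the cost differs):

```haskell
import Data.List (foldl')

-- Left-nested: (((a ++ b) ++ c) ++ ...) re-traverses the
-- accumulated prefix at every step, so consuming the result
-- costs O(n^2).
concatSlow :: [[a]] -> [a]
concatSlow = foldl' (++) []

-- Right-nested: a ++ (b ++ (c ++ ...)) walks each element a
-- bounded number of times, so consuming the result costs O(n).
concatFast :: [[a]] -> [a]
concatFast = foldr (++) []
```

This is the same reason the standard library's concat is defined in the right-nested style.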


“Modifying things in place” is, from a logical standpoint, an unnatural and even dangerous thing. A person who is not familiar with (imperative) programming often has a hard time understanding variables and assignments (similar to some programmers not understanding pointers). Mathematics doesn’t have assignments, either. That should tell you something...


It mostly tells me that mathematics does not care about runtime or space performance. Which is fine, because mathematics is about solving abstract problems, not getting a Turing machine to solve concrete problems.


In that case, quicksort requires danger and artifice. In any case, it's not quicksort.


Won't this have to iterate through linked lists and create multiple new heap allocations on each recursive step? Doesn't that imply n log n heap allocations just to sort a list?


Yes, but due to lazy evaluation that won't happen until you do something with the sorted list. Your performance may just blow up in some seemingly unrelated place.

My biggest problem with Haskell is how GC and lazy evaluation make it very difficult to reason about what the hardware is actually doing at a given point. I know there are ways to inspect and control it, but I've found myself preferring languages that have simpler mental models.


Right, and by the time you have to control it, you are washing away all the advantages of using it.


Not that I know any Haskell, but wouldn't smallerSorted already include x due to <= operator?

edit: also, how is the pivot element selected there?


Both of your questions have the same answer. (x:xs) means x is the first element and xs is the rest.
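A minimal illustration of the pattern (the function name is made up):

```haskell
-- (x:xs) binds x to the head of the list and xs to the tail,
-- which is why x can't reappear among the filtered xs: the
-- pivot is already split off before the comprehensions run.
firstAndRest :: [Int] -> (Int, [Int])
firstAndRest (x:xs) = (x, xs)
firstAndRest []     = error "empty list"
```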


Aha! Thanks.

Obviously picking first element as pivot makes for a very poor Quicksort. Though I presume a better approach, say median of first, mid and last elements, would add but an extra line?


The reason the first element is used as the pivot is that these are linked lists, not arrays, so it would take linear time to access the middle and end. This isn't a real quicksort.


Spot on. There's a lot of wisdom in this comment.


But you do have to invent something or you wouldn't be writing a program in the first place.

As soon as you need to control the order of execution, memory usage, IO, or any combination of those, you are fighting the language: into the territory of things you can technically do, and out of the realm of things the language makes easy.


For 99% of programmers, Haskell can't be used in production.


This is nonsense.


I wish I could give you a thumbs up for speaking plain truth here. Anyway programmers will find out for themselves how best to satisfy their customer need.


> it follows a purely functional programming paradigm

I'd say that it's one take on that paradigm; the Way of Monads is not the only way to do purely functional programming. Clean uses uniqueness typing instead, though it otherwise has rather similar properties to Haskell.


I'd say start with an eager functional language such as OCaml, without the "object" part. Haskell code is hard to debug because you can't rely on good old print.

Monads themselves are never a silver bullet. Type systems and composability are the true power of FP, IMO.


Debugging declarative code is always hard, whatever the language. Haskell does allow you to use trace expressions, which will output values to standard error; see Debug.Trace.

This is no reason to avoid what is arguably the most popular (and state-of-the-art) functional language and implementation out there.


It took me way too long to come across trace; if it were more widely known, I don't think we'd see so many "impossible to debug" issues. It's great.


I think we can all agree that debugging languages with lazy evaluation is especially hard. To effectively use Debug.Trace you often need to add strictness, and that can be fundamentally incompatible with using the full power of lazy evaluation.


Debug.Trace won't solve the space leaks lurking around every corner of your program, and won't shave the hundreds of megabytes to gigabytes of RAM the compiler needs to compile a non-trivial program with dependencies.


It's for runtime debug logging after all, not intended to solve those issues.

Space leaks I'd like to read more about but writings about them are a bit hard to come by. Recommendations?


Though the Haskell Way includes the IO monad, which is imperative code in practice :)

