
"defer" has really weird semantics; it's based on imperatively pushing cleanups onto implicit per-function mutable state. (This shows up if you call "defer" in a loop, for example.) It's also possible to screw up and forget to use "defer", causing things like leaks of file handles. RAII, on the other hand, is based on lexical scope, without hidden mutation, and also makes it harder to forget to do things like close files.


That doesn't strike me as particularly strange. It just maintains a separate stack per function, then unwinds it at the end.


It's more complex to specify and implement than "this code statically runs at the end of the block". This can lead to surprises. (I would not have guessed what "defer" in a loop does until I learned.)

Because "defer" is tied to function definitions, it means that pulling code out into a new function or inlining a function will silently change its behavior in subtle ways.

Like most complex language semantics, "defer" is less easily optimizable than the simple semantics of RAII. With RAII the compiler statically knows what it is supposed to call, which helps exceptions (panicking goroutines in Go) because it allows their cleanups to be optimized into table-driven unwinding. But because Go chose "defer", the compiler must insert code that dynamically keeps track of cleanups to handle the general case. This results in indirect function calls and bookkeeping overhead, whereas RAII compiles to purely static function calls. The distinction is significant: it allows RAII to be zero-cost on the no-panic path (very important in C++'s case!), while I don't see any way to implement "defer" in a zero-cost way.
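
One rough way to see the bookkeeping cost is a Go micro-benchmark comparing a deferred call against a direct one (a sketch only; the exact gap depends on the Go version, since later toolchains reduced defer's overhead in simple cases):

    package deferbench

    import "testing"

    var sink int

    // The deferred increment pays defer's bookkeeping on every call.
    func deferred() {
        defer func() { sink++ }()
    }

    // The same work as a plain statement.
    func direct() {
        sink++
    }

    func BenchmarkDeferred(b *testing.B) {
        for i := 0; i < b.N; i++ {
            deferred()
        }
    }

    func BenchmarkDirect(b *testing.B) {
        for i := 0; i < b.N; i++ {
            direct()
        }
    }

Run with "go test -bench ." and compare the two ns/op figures.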



