
The examples are odd because he doesn't incorporate any notion of differentiability.

So a Generating RNN is not quite like foldr, since foldr has no notion of differentiability.

One needs to show examples that pull in some kind of automatic-differentiation capability.
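For concreteness, here is a minimal sketch of the analogy under discussion -- a recurrent cell folded over an input sequence -- written with JAX so the fold can actually be differentiated later. The names, shapes, and tanh cell are my own assumptions, not taken from the article:

    import jax
    import jax.numpy as jnp

    def rnn_cell(h, x, W_h, W_x):
        # One recurrent step: combine the previous hidden state with the input.
        return jnp.tanh(h @ W_h + x @ W_x)

    def rnn(params, xs, h0):
        # Fold the cell over the sequence, carrying the hidden state.
        # (jax.lax.scan is essentially a left fold that also collects outputs.)
        W_h, W_x = params
        def step(h, x):
            h_next = rnn_cell(h, x, W_h, W_x)
            return h_next, h_next
        h_final, _ = jax.lax.scan(step, h0, xs)
        return h_final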



When you pass a differentiable function into fold -- or most higher order functions, for that matter -- you get a function that is differentiable on everything but a measure zero set.

The mechanics of how you compute the derivatives are separate from this. Obviously, the efficient way is to use backprop (reverse mode AD), as we always do in deep learning. But you could also use discrete derivative approximations. The point is that the resulting function is differentiable, which is independent of how you compute the derivatives.
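To illustrate that separation with the rnn sketch above (again a hedged example with made-up shapes, not anyone's reference implementation): jax.grad gives the reverse-mode (backprop) derivative of a loss built from the fold, while a finite-difference quotient approximates the same derivative without any AD machinery.

    key = jax.random.PRNGKey(0)
    W_h = 0.1 * jax.random.normal(key, (8, 8))
    W_x = 0.1 * jax.random.normal(key, (4, 8))
    xs  = jax.random.normal(key, (10, 4))   # sequence of 10 inputs
    h0  = jnp.zeros(8)

    def loss(params):
        # Any smooth function of the folded result is differentiable
        # (up to a measure-zero set if the cell had kinks, e.g. ReLU).
        return jnp.sum(rnn(params, xs, h0) ** 2)

    grads = jax.grad(loss)((W_h, W_x))      # reverse-mode AD through the fold

    # The same derivative can be approximated discretely, e.g. one entry:
    eps = 1e-4
    e = jnp.zeros_like(W_h).at[0, 0].set(eps)
    fd = (loss((W_h + e, W_x)) - loss((W_h - e, W_x))) / (2 * eps)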



