Hacker News

Sure, but I take it the original comment wasn't exactly written by someone with an ML background. And getting to grips with log-likelihoods, (cross-) entropy, linear/logistic regression, evaluation metrics, and maybe even some Bayesian statistics might be rather helpful before jumping on the DL bandwagon.
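To make a couple of those prerequisites concrete, here's a minimal sketch (my own illustration, not from the thread) of logistic regression trained by minimizing the cross-entropy loss, i.e. the negative Bernoulli log-likelihood, with plain gradient descent. Names, data, and hyperparameters are all made up for the example:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cross_entropy(y, p, eps=1e-12):
    # Negative log-likelihood of Bernoulli labels y under predicted probs p.
    p = np.clip(p, eps, 1 - eps)
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

def fit_logistic(X, y, lr=0.5, steps=500):
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(steps):
        p = sigmoid(X @ w + b)
        # Gradient of the mean cross-entropy w.r.t. w and b.
        grad_w = X.T @ (p - y) / len(y)
        grad_b = np.mean(p - y)
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Toy 1-D data: label is 1 exactly when x > 0.
X = np.array([[-2.0], [-1.0], [-0.5], [0.5], [1.0], [2.0]])
y = np.array([0, 0, 0, 1, 1, 1])
w, b = fit_logistic(X, y)
probs = sigmoid(X @ w + b)
preds = (probs > 0.5).astype(int)
```

On this separable toy set the fitted model classifies every point correctly, and the final cross-entropy is well below the ln 2 ≈ 0.693 you get from predicting 0.5 everywhere.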

While there are far too many hardcore statisticians and academics who love their theorems more than anything, not all classes are like that. I think I'd have loved learning ML from today's MOOCs instead of the theorem provers and formula reciters I had to deal with (and the real-life exams you can't just retake every 8 hours...)



I don't have an ML background and I had no problem understanding the LeCun 1998 paper. Naturally, the more ML one knows the better; I'm just encouraging people to dive in and try without getting intimidated.


Anecdotally, one astonishing observation I often make is that "breakthrough" papers [1] are nearly universally among the most accessible, clear, and easy to follow. From Watson and Crick on DNA in MolBio, to backpropagation by Hinton in ML, to Cox's survival model in statistics, the most significant advances often tend to be the "easiest" to understand (in hindsight only, naturally).

[1] http://www.nature.com/news/the-top-100-papers-1.16224




