> I have never heard that Hopfield nets or Boltzmann machines were given any major weight in the history.
This is mostly because people don't recognize what these models are at a more abstract level (which is okay; ironically, ML people frequently don't abstract). But Hopfield networks and Boltzmann machines have been pretty influential in the history of ML. I think you can draw a pretty direct line from Hopfield networks to LSTMs to transformers: Hochreiter's group even showed in "Hopfield Networks is All You Need" that transformer attention is equivalent to the update rule of a modern continuous Hopfield network. You can also think of a typical artificial neural network layer (easiest to see with linear layers) as a special case of a Boltzmann machine: the hidden-unit conditional of a Restricted Boltzmann Machine, p(h=1|v) = σ(Wv + c), is exactly a linear layer followed by a sigmoid, and I think that's where it clicks.
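To make that last comparison concrete, here's a minimal PyTorch sketch (dimensions and weights are made up for illustration): the RBM's hidden conditional and a sigmoid linear layer with the same weights produce identical outputs. The RBM is the more general object, since it also defines p(v|h) and a joint energy, which the feed-forward layer drops.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy dimensions, just for illustration.
n_visible, n_hidden = 6, 4
W = torch.randn(n_hidden, n_visible)  # RBM weight matrix
c = torch.randn(n_hidden)             # RBM hidden bias

# A random binary visible configuration.
v = torch.bernoulli(torch.full((n_visible,), 0.5))

# RBM: probability of each hidden unit being on given the visible units,
# p(h_j = 1 | v) = sigmoid(W v + c)_j
p_h_given_v = torch.sigmoid(W @ v + c)

# Feed-forward: a linear layer with a sigmoid activation, same parameters.
linear = nn.Linear(n_visible, n_hidden)
with torch.no_grad():
    linear.weight.copy_(W)
    linear.bias.copy_(c)
ff_out = torch.sigmoid(linear(v))

# Same numbers: the forward pass of a sigmoid linear layer is exactly
# the RBM's hidden conditional.
print(torch.allclose(p_h_given_v, ff_out))  # True
```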
Either way, these models had a lot of influence on the early work, and that influence permeates the modern stuff. There's this belief that all the old stuff is useless, and I just think that's wrong. There's a lot of hand-engineered machinery we don't need anymore, but much of the theory and the underlying principles are still important.