
Technological advances tend to happen in jumps. We are no doubt approaching a local optimum right now, but it won't be long before another major advance propels things forward again. We've seen the same pattern in ML for decades.


Please name one technological advance of major import in the fundamental transformer kernel space from the last decade that has any bearing at all on today's LLMs.

I will wait.


The very idea of the Transformer architecture. Surely you've heard of "Attention Is All You Need".
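
For anyone who hasn't read the paper: its core contribution is scaled dot-product attention, where each position attends to every other position via a softmax over query-key similarities. A minimal NumPy sketch of that operation (shapes and names are illustrative, not the paper's reference code):

  import numpy as np

  def scaled_dot_product_attention(Q, K, V):
      # Q, K, V: (seq_len, d_k) arrays, per "Attention Is All You Need"
      # (Vaswani et al., 2017). Scaling by sqrt(d_k) keeps the logits
      # from saturating the softmax as dimensionality grows.
      d_k = Q.shape[-1]
      scores = Q @ K.T / np.sqrt(d_k)                  # pairwise similarities
      scores -= scores.max(axis=-1, keepdims=True)     # numerical stability
      weights = np.exp(scores)
      weights /= weights.sum(axis=-1, keepdims=True)   # softmax over keys
      return weights @ V                               # weighted sum of values

The full architecture wraps this in multi-head projections, residual connections, and feed-forward layers, but the snippet above is the mechanism the paper's title refers to.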



