
So, a generative model is a source of _some_ information, which is then refined by gates (classification models) conditioned on the generated information?

The analogy to transistors and logic gates falls a bit flat when you consider that voltage is a rather simple univariate signal, while generated text is complex and multivariate. But I understand that the main point is the composability and filtering.
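
Roughly, the composition I'm picturing looks like this. The generator and the classifier gate below are trivial stand-ins I made up, just to show the generate-then-filter structure, not real models:

    import random

    def generate_candidates(prompt: str, n: int = 8) -> list[str]:
        """Stand-in generative model: emits n noisy variations of the prompt."""
        suffixes = ["", "!", "?", " maybe", " indeed", " (unverified)"]
        return [prompt + random.choice(suffixes) for _ in range(n)]

    def gate(text: str) -> bool:
        """Stand-in classifier acting as a gate: pass only outputs it accepts."""
        return "(unverified)" not in text

    def generate_then_filter(prompt: str) -> list[str]:
        # The generator is the information source; the classifier gates it,
        # conditioned on what was actually generated.
        return [c for c in generate_candidates(prompt) if gate(c)]

    print(generate_then_filter("The sky is blue"))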



Think of it as information, not voltage. An XOR produces information. A lot of XORs with ANDs make a calculator, which opens up an entire vector space of mathematical information.
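
To make the calculator point concrete, here's the ordinary textbook construction (nothing ML-specific, just the composability): a 1-bit full adder built from XOR/AND/OR, chained four times into a 4-bit adder.

    # Composing simple gates into a larger unit: a full adder from XOR/AND/OR.
    def XOR(a: int, b: int) -> int: return a ^ b
    def AND(a: int, b: int) -> int: return a & b
    def OR(a: int, b: int) -> int:  return a | b

    def full_adder(a: int, b: int, carry_in: int) -> tuple[int, int]:
        s1 = XOR(a, b)
        total = XOR(s1, carry_in)
        carry_out = OR(AND(a, b), AND(s1, carry_in))
        return total, carry_out

    def add_4bit(x: int, y: int) -> int:
        """Chain four full adders: gates composed into (part of) a calculator."""
        carry, result = 0, 0
        for i in range(4):
            bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
            result |= bit << i
        return result | (carry << 4)

    assert add_4bit(9, 7) == 16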


I try to. One similar thing comes to mind: generative adversarial networks (GANs). If I'm not mistaken, this is along the lines of your idea of composing single ML models into bigger information-processing units.
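
For concreteness, here's a rough GAN skeleton in PyTorch. The toy 1-D data and hyperparameters are arbitrary choices of mine, but the discriminator is literally a learned gate on the generator's output:

    import torch
    import torch.nn as nn

    # Generator proposes samples; discriminator "gates" them.
    G = nn.Sequential(nn.Linear(4, 16), nn.ReLU(), nn.Linear(16, 1))
    D = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1), nn.Sigmoid())
    opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
    opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
    bce = nn.BCELoss()

    for step in range(200):
        real = torch.randn(64, 1) * 0.5 + 2.0   # "real" data: N(2, 0.5)
        fake = G(torch.randn(64, 4))            # generated samples

        # Discriminator step: learn to tell real from generated.
        opt_d.zero_grad()
        loss_d = bce(D(real), torch.ones(64, 1)) + bce(D(fake.detach()), torch.zeros(64, 1))
        loss_d.backward()
        opt_d.step()

        # Generator step: try to pass the discriminator's gate.
        opt_g.zero_grad()
        loss_g = bce(D(fake), torch.ones(64, 1))
        loss_g.backward()
        opt_g.step()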

Do you, by any chance, have links or recommendations for material on architectures that treat ML models as composable gates?


No materials; it's something I thought up back in March.



