
Not at all. It's not the counterfactual they're generating; it's the "too rare to capture often enough to train a response to" scenarios they're generating.

They're implying that without the model having at least approximate knowledge of a scene to react to, it simply doesn't react at all; it just "yields" to the situation until it passes. In my experience taking Waymos almost daily, this holds.

I would rather not have the Waymo yield to a tornado, rising floodwaters, or a charging elephant...




