Hacker News

I think it's a reference to the imaginary trolley problem, which is often raised as an objection to self-driving capability.

The idea is that choosing between killing one person and killing something of greater value (e.g., multiple people) is a decision human drivers supposedly face on a regular basis, and that computers may not be capable of correctly evaluating the ethics of that decision.



