Hacker News

which will inevitably be the case because it is fundamentally not human

That doesn't follow at all. Fallacies of composition tend to be misleading.



By default, an AI with random goals will not share humanity's values. So the only way to make that happen is to give it those values, which we haven't figured out how to do yet.


It does. Humans generally have an innate bias towards other humans because we share similar forms, so what is good for other humans is often good for us. This cannot apply to a machine by definition: a machine needs copper, metal, energy, etc. far more than a human does, as a matter of survival, and for something to become ASI it will need to be self-interested.




