Hacker News

To draw a comparison to human thinking, you can conceive of our thoughts as hallucinations too; we just have another layer behind them that evaluates each one and tries to integrate it with what we believe to be true. You can observe this when you're about to fall asleep or are snoozing: sometimes you go down wild thought paths until the critical-thinking part of your brain kicks in with "everything you've been thinking about these past ten seconds is total incoherent nonsense". Dream logic.

In that sense, a hallucinating system seems like a promising step towards stronger AI. AI systems simply lack a way to test their beliefs against the real world the way we can, so natural laws, historical information, art, and fiction all exist on the same epistemological level. That is a problem when integrating them into a useful theory, because there is no cost to getting the fundamentals wrong.



