
I think you'll find that humans have also demonstrated that they will misrepresent their own reasoning.

That does not mean that they cannot reason.

In fact, coming up with a plausible explanation of behaviour, accurate or not, itself requires reasoning as I understand it. LLMs seem to be quite good at rationalising, which is essentially a logic puzzle: manufacturing the missing piece between the facts that have been established and the conclusion they want to reach.
