
Who, exactly, said that reasoning requires introspection? The proof of reasoning is in the result. If you don't understand the math, you won't come anywhere near the correct answer.

That's kind of the idea behind math: you can't bullshit your way through a math exam. Therefore, it is nonsensical to continue to insist that LLMs are incapable of genuine understanding. They understand math well enough to solve novel math problems without cheating, even if they can't tell you how they understand it. That part will presumably happen soon enough.

Edit: for values of "soon enough" equal to "right now": https://chatgpt.com/share/680fcdd0-d7ec-800b-b8f5-83ed8c0d0f... All that the paper you cited proves is that if you ask a crappy model, you get a crappy answer.



A simple program on a calculator can produce the correct math answer, hence I conclude that my Casio can "reason" and "understand" maths.

You have redefined the words "reason" and "understand" to include a lot of states that most of the population wouldn't call either reasoning or understanding. Under those arbitrary definitions, yes, you are right. I just disagree that producing a correct math answer counts as reasoning in any way, especially given how LLMs function.


> A simple program on a calculator can produce the correct math answer, hence I conclude that my Casio can "reason" and "understand" maths.

Cool, we're done here.



