
Of course an LLM doesn't experience or feel anything. To experience or feel something requires a subject, and an LLM is just a tool, a thing, an object.

It's just a statistical machine that excels at unrolling coherent sentences, but it doesn't "know" what the words mean in a human-like, experienced sense. It merely mimics human language patterns, prioritising plausible-sounding, statistically likely text over factual truth, which is apparently enough to fool some people into believing it's a sentient being.

You are being awfully confident about something that we cannot assess or even consistently define. They probably do not have subjective experience... but how do we know for sure?
