
Hmm. I think you may be confusing sycophancy with simply following directions.

Sycophancy is a behavior. Your complaint seems more about social dynamics and whether LLMs have some kind of internal world.

Even "simply following directions" is something the chatbot will do, that a real human would not -- and that interaction with that real human is important for human development.