I guess training LLMs on works of fiction/sci-fi would not be a net benefit: the model makes no distinction between reality and perceived reality, and LLMs have a hallucination problem as it is.
