Hallucinations in LLMs will severely limit their use in scenarios where such errors are completely unacceptable - and there are many such scenarios. This is a good thing, because it means human intelligence and oversight will continue to be needed.

