Hacker Newsnew | past | comments | ask | show | jobs | submitlogin

How can they have high confidence is the actual prompt, rather than a hallucunation? Is it related to how robust the output is to multiple prompt injections?


Guidelines | FAQ | Lists | API | Security | Legal | Apply to YC | Contact

Search: