
The issue starts when an LLM transfers knowledge between languages, even though that knowledge is not correct in the target language. I have seen this with ChatGPT answers regarding laws, for example: when asked in German, it refers to US laws, which are obviously not relevant.


> The issue starts when an LLM transfers knowledge between languages, even though that knowledge is not correct in the target language. I have seen this with ChatGPT answers regarding laws, for example: when asked in German, it refers to US laws, which are obviously not relevant.

There is no necessary correlation between language and the correct set of laws to reference. The language of the question (or of the answer, if for some reason they differ) is orthogonal to the intended jurisdiction. There is no reason US laws couldn't be relevant to a question asked in German (and, conversely, no reason US laws couldn't be wrong for a question asked in English, even if it were specifically and distinguishably US English).


When you ask an LLM a question in German without further clarifying your location, I expect it to refer to German (or Austrian/Swiss) law.

For most questions it does this pretty well (e.g. asking for the legal drinking age). However, once the answer becomes more complex, it starts to hallucinate very quickly. The fact that some of the hallucinations are just translated US laws makes me think that the knowledge transfer between languages is probably not helping in cases like this.
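You can probe this pretty easily: ask the same German legal question once with no jurisdiction hint and once with the jurisdiction pinned in a system message, then check whether the unhinted answer drifts toward US law. A minimal untested sketch, assuming the OpenAI Python SDK and an OPENAI_API_KEY in the environment; the model name and the sample question are just illustrative choices, not anything from this thread:

    # Compare a German legal question with and without an explicit jurisdiction.
    # Assumes the OpenAI Python SDK (pip install openai) and OPENAI_API_KEY set.
    from openai import OpenAI

    client = OpenAI()

    # Example question: maximum length of a probationary period in an
    # employment contract (the German and US answers differ sharply).
    QUESTION_DE = ("Wie lange darf die Probezeit in einem "
                   "Arbeitsvertrag höchstens dauern?")

    def ask(question: str, system: str | None = None) -> str:
        """Send one question, optionally preceded by a system message."""
        messages = []
        if system:
            messages.append({"role": "system", "content": system})
        messages.append({"role": "user", "content": question})
        response = client.chat.completions.create(
            model="gpt-4o",  # illustrative model name
            messages=messages,
        )
        return response.choices[0].message.content

    baseline = ask(QUESTION_DE)
    pinned = ask(QUESTION_DE,
                 system="Answer strictly under German law and cite "
                        "the relevant statute.")

    print("Without jurisdiction hint:\n", baseline)
    print("\nWith jurisdiction pinned to Germany:\n", pinned)

If the unhinted answer talks about at-will employment instead of the German statutory six-month limit (§ 622 BGB), that's exactly the translated-US-law hallucination described above.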




