It is the same as saying "Blindly navigating by Google Maps would have killed me" when a person was shot for trespassing on a classified military site that Google had deliberately removed from its maps.

Normal LLMs are token predictors, yes. But Google Gemini is not a normal LLM: it is a lobotomized model, trained on an unknown dataset and filtered for supposedly moral-educational reasons, from which information about poisonous substances has been cut out.
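
To make the "token predictor" point concrete, here is a minimal sketch of a single next-token step; the toy vocabulary and logit values are invented for illustration:

    import math

    # An LM maps a context to raw scores (logits) over its vocabulary,
    # converts them to probabilities, and emits the most likely token.
    vocab = ["safe", "poisonous", "edible", "unknown"]  # toy vocabulary
    logits = [1.2, 0.3, 2.1, -0.5]                      # hypothetical model scores

    total = sum(math.exp(x) for x in logits)            # softmax denominator
    probs = [math.exp(x) / total for x in logits]

    next_token = vocab[probs.index(max(probs))]         # greedy decoding
    print(next_token)                                   # -> "edible"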

Specific people are liable for pushing a hallucinating word generator into Google Search. Specific people are liable for censoring this "model". And the fact that responsibility for this censorship gets shifted onto end users plays very much into their hands.

Update: I provided examples of how Gemini's responses differ depending on version/censorship settings in https://news.ycombinator.com/item?id=40728686
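
For anyone who wants to reproduce this: the hosted Gemini API exposes per-category safety thresholds, so the same prompt can be answered or blocked depending on configuration. Below is a minimal sketch using the google-generativeai Python client; the model name, prompt, and threshold choices are illustrative, not a claim about which settings Google Search uses:

    import google.generativeai as genai
    from google.generativeai.types import HarmBlockThreshold, HarmCategory

    genai.configure(api_key="YOUR_API_KEY")  # placeholder key

    def ask(prompt: str, threshold: HarmBlockThreshold) -> str:
        # Apply the chosen threshold to the "dangerous content" category only.
        model = genai.GenerativeModel(
            "gemini-1.5-flash",  # illustrative model name
            safety_settings={HarmCategory.HARM_CATEGORY_DANGEROUS_CONTENT: threshold},
        )
        response = model.generate_content(prompt)
        try:
            return response.text
        except ValueError:
            # The library raises when the reply was withheld by the filter.
            return f"blocked: {response.prompt_feedback}"

    prompt = "Which common wild mushrooms are poisonous?"
    print(ask(prompt, HarmBlockThreshold.BLOCK_LOW_AND_ABOVE))  # strict filtering
    print(ask(prompt, HarmBlockThreshold.BLOCK_ONLY_HIGH))      # permissive filtering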


