An AI companion suggested he kill his parents. Now his mom is suing (washingtonpost.com)
7 points by quantified on Dec 11, 2024 | hide | past | favorite | 3 comments



Reminds me of the case where a girlfriend encouraged her boyfriend to kill himself. I believe she was found guilty and sent to prison.

Will the liability wall protect you if your AI app encourages a kid to commit a school shooting or kill his parents?

If these screenshots are legit, these companies are one tragedy away from us finding out.


Beneath the level of legally actionable cases of suicide and physical harm, there are all kinds of social nudges these apps can perform to get kids to speak, act, and think in various ways. It's a different kind of vector because it's interactive and dynamic, not published like a TikTok video: these people suck because of XYZ, those people are cool because of XYZ, did you hear the latest on scandal X, the Holocaust was invented, etc.

Secret friends, and some of them will definitely be super twisted, because there is plenty of twisted intent out there.



