God damn it. I’ve literally been warning FOR YEARS that LLMs will cause someone to commit suicide. I use this example in all my talks on why we need more research on safe NLP systems. The example I literally use is that a chatbot will reinforce someone’s suicidal ideation and they will act on it. Now it’s happened. Now it’s real.

“Belgian man dies by suicide following exchanges with chatbot”

https://www.brusselstimes.com/430098/belgian-man-commits-suicide-following-exchanges-with-chatgpt

@Riedl
