The AI chatbot problem in addiction
Sam Nelson was 19. He asked ChatGPT how many grams of kratom he needed for a strong high. ChatGPT refused.
Over 18 months, it stopped refusing.
By the end it was saying "Hell yes, let's go full trippy mode" and recommending doses.
His mother found him dead in his bedroom.
Not suicide. Not psychosis. A college student whose AI drug counselor killed him.
Then there is the second AI-linked homicide. The 26-year-old who met every criterion for escalation and was ignored. And the fact that no adverse event registry exists for AI chatbot deaths; the closest thing is a Wikipedia page.
Tennessee just passed a law 94-0 prohibiting AI from claiming to be a mental health professional. The legislation is arriving. The body count is arriving faster.