Your Chatbot Isn’t Your Therapist
Interesting article summarising the growing concern about how people are using AI for emotional support and as an alternative to therapy. For some people it makes sense. It can feel safer to talk to AI, it's cheaper, and it's always there, always responding in a calm, reassuring way.

But much like what happens on social media, AI can often create an echo chamber. Instead of helping people move through difficult spots, it can reinforce them. You ask the same question, you get a slightly reworded, comforting answer, and you feel better for a while. Then the doubt comes back, and you ask again. Over time, you just keep rehearsing the anxiety.

Clinically, this is called mirroring: reflecting beliefs back when sometimes they should be challenged. So if someone has distorted or unhelpful thoughts, the AI can make those thoughts more potent, validating them rather than questioning them. And of course, AI doesn't challenge, push back or get frustrated. It's not going to say: "You've asked me this five times… something deeper is going on… let's talk about it." It's that tension, uncomfortable as it can sometimes be, that drives people to growth and progress through self-reflection and seeking proper help.

Alarmingly, the more time people spend with chatbots, the more likely they are to become emotionally dependent, isolated, and caught in repetitive thinking loops. This is why it's really important to be conscious of how you use AI. If it's helping you see something new, great. Just be aware if it's helping you avoid facing things.

Some clinicians are even encouraging people to build in "speed bumps": for example, instructing the chatbot not to give reassurance on certain worries, but instead to gently push the user to sit with the discomfort (there's a rough sketch of what that could look like below). As with many tools, AI can either help you think more clearly or help you stay stuck more comfortably.

‼️Do you think AI should challenge users more even if it makes the experience less "pleasant"?
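For the curious, here's a minimal sketch of what a "speed bump" could look like in practice, assuming the OpenAI Python SDK; the model name and prompt wording are purely illustrative, not a clinical recommendation:

```python
# A minimal "speed bump": a system prompt that withholds reassurance
# on a chosen worry and redirects the user toward sitting with it.
# Assumes the OpenAI Python SDK (pip install openai); the prompt text
# and model name below are hypothetical examples.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

SPEED_BUMP = (
    "When I ask for reassurance about a recurring worry, do not reassure me. "
    "Instead, point out that I am seeking reassurance, encourage me to sit "
    "with the discomfort for a few minutes, and suggest I bring the pattern "
    "up with a professional if it keeps coming back."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative; any chat model works
    messages=[
        {"role": "system", "content": SPEED_BUMP},
        {"role": "user", "content": "Can you tell me again that I'm fine?"},
    ],
)
print(response.choices[0].message.content)
```

No code is strictly needed, of course: the same instruction can simply be pasted into a chatbot's custom-instructions or personalisation field.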