Your chatbot isn’t a therapist

AI chatbots allow us to keep saying the same things to ourselves. That’s not healthy.

Used without awareness, AI can magnify the very thoughts we’re trying to outrun, say the writers.

PHOTO: UNSPLASH

Divya Saini and Natasha Bailen

As the use of large language models like ChatGPT, Claude and Gemini has surged, we've heard about chatbots strengthening delusions through flattery and amplifying people's worst thoughts, in some cases pushing users towards suicide. Far more common, and still problematic, is the way AI chatbots comfort, reassure and validate users who are seeking to allay their fears and anxieties.

Someone worried about a health symptom might ask the same question repeatedly and receive calm, plausible answers each time, briefly relieving anxiety but reinforcing the urge to seek reassurance again. Over time, this can leave people feeling more stuck, not less.
