If you or someone you know may be considering suicide or be in crisis, call or text 988 to reach the 988 Suicide & Crisis Lifeline.
Roughly two-thirds of teenagers turn to AI chatbots, like ChatGPT, when they need someone to talk to. People are drawn to AI for mental health support because it's free and accessible, but experts stress it is no replacement for the empathy of real people.
Dr. Colin Depp, a professor of psychology at the University of California, San Diego, said research shows chatbots are surprisingly good at identifying emotions and developing a shared understanding of what a person is feeling, which is part of why people like them.
“I think at their current state, it's more like a sounding board as opposed to something that might necessarily lead to a particular change in an underlying condition," said Depp. "I think that that also could change, too, because we could imagine that in, you know, several years, these things could become more effective and more aligned.”
Aura, an online safety organization, found in its 2025 State of the Youth report that 19% of 16-year-olds used chatbots for emotionally supportive conversations.
Parents, teachers and therapists should guide users to see AI as a tool rather than a therapist, more akin to writing in a journal or letters to no one in particular. Depp said a chatbot is a machine capable of identifying people's emotions, but it doesn't do well at setting an agenda for making real change in a person's mental health, as a human therapist would.
“I do believe there are some ways that people who are, I think, super users have tried to blend the two, [work on] some particular skill that they're wanting to develop or interpersonal challenge or something like that. And they're using the ChatGPT to help kind of prepare for sessions or get more out of it,” Depp said.
The Idaho Crisis and Suicide Hotline said it is getting increased reports of people relying on AI for crisis support and that there have been some bad outcomes and even deaths linked to using AI nationwide. Lee Flinn, the director of the Idaho hotline, said whenever youth reach out, they will always be connected with a real human on the other side.
“We want to let everybody know that Idaho Crisis and Suicide Hotline, we support the entire state of Idaho, and we are a member of the 988 lifeline," said Flinn. "And there is no use of AI in supporting, you know, directly supporting a person through a phone call, a text or a chat.”
Flinn also said crisis counselors are concerned that spending time developing a relationship with a chatbot could interfere with a person’s willingness to seek out traditional therapy, something that Depp agreed with.
Depp also said he hopes AI's capabilities can be expanded to support people waiting for appointments, or those in rural areas who may not have accessible mental health care.