
Teenagers relying on AI chatbots for mental health care, emotional support

If you or someone you know may be considering suicide or be in crisis, call or text 988 to reach the 988 Suicide & Crisis Lifeline.

Roughly two-thirds of teenagers are using AI chatbots, like ChatGPT, when they need someone to talk to. People are drawn to AI for mental health support because it is free and accessible, but experts stress it does not replace the empathy of real people.

Dr. Colin Depp, a professor of psychology at the University of California, San Diego, said research shows chatbots are surprisingly good at identifying emotions and developing a shared understanding of what a person is feeling, which is part of why people like them.

“I think at their current state, it's more like a sounding board as opposed to something that might necessarily lead to a particular change in an underlying condition," said Depp. "I think that also could change, too, because we could imagine that in, you know, several years, these things could become more effective and more aligned.”

Aura, an online safety organization, found in its 2025 State of the Youth report that 19% of 16-year-olds used chatbots for emotionally supportive conversations.

Parents, teachers and therapists should guide users to see AI as a tool rather than a therapist, similar to writing in a journal or letters addressed to no one in particular. Depp said a chatbot is a machine capable of identifying people's emotions, but it does not do well at setting an agenda for real change in a person's mental health, the way a human therapist can.

“I do believe there are some ways that people who are, I think, super users have tried to blend the two, [working on] some particular skill that they're wanting to develop or interpersonal challenge or something like that," said Depp. "And they're using the ChatGPT to help kind of prepare for sessions or get more out of it.”

The Idaho Crisis and Suicide Hotline said it is receiving more reports of people relying on AI for crisis support, and that nationwide there have been harmful outcomes, and even deaths, linked to AI use. Lee Flinn, the director of the Idaho hotline, said whenever youth reach out, they will always be connected with a real human on the other side.

“We want to let everybody know that at the Idaho Crisis and Suicide Hotline, we support the entire state of Idaho, and we are a member of the 988 Lifeline," said Flinn. "And there is no use of AI in directly supporting a person through a phone call, a text or a chat.”

Flinn also said crisis counselors are concerned that spending time developing a relationship with a chatbot could interfere with a person's willingness to seek out traditional therapy, a concern Depp shares.

Depp also said he hopes AI's reach can be expanded to support people waiting for appointments and people in rural areas who may not have accessible mental health care.
