Is AI the Next Remedy for the Teen Depression Crisis?

Imagine a world where a distressed teen wakes up in the middle of the night and opens an app on her phone to talk about the pain of not fitting in at school or missing her friends after moving to a new city.

This is the new Big Tech solution for the youth mental health crisis: therapy with a chatbot available at every child’s fingertips via a smartphone. With artificial intelligence (AI) infiltrating mental health care, a simple Google search is all it takes to summon the illusion of connection.

Yet some experts argue that the promise of having a therapist in every teen’s pocket is the opposite of what kids really need: genuine human connection.

“Kids these days need to learn how to unplug and be outside and away from their devices,” said Ms. Garfield-Jaeger, who is skeptical of “more tech” coming to their rescue.

She added that many people come to therapy because of issues with relationships, and “they need an empathetic relationship with another human.”

According to Woebot Health’s website, more than 80 percent of the messages from teens to Woebot are received outside of typical provider hours “when no other care is available.” Many of these texts come in between 3:00 a.m. and 5:00 a.m.

Psychologist Alison Darcy, the founder and president of Woebot Health, created the AI chatbot in 2017 with youth in mind. She recently told 60 Minutes that Woebot employs cognitive behavioral therapy (CBT) techniques to help users reframe thoughts.

According to Ms. Darcy, the app delivers the same kinds of statements a real therapist would and can hold therapeutic conversations. It cannot, however, feel empathy or understand a person’s unique situation.

“Teenagers have been our first cohort,” said Jo Aggarwal, co-founder of Wysa, another AI mental health chatbot. About 30 percent of Wysa’s users are between the ages of 13 and 25, she said.

“But most worryingly,” according to Wysa’s website, “young people aren’t getting the help that they need. More than half (55%) who scored 3 or more on GAD2 and PHQ2 screening questionnaires for anxiety and depression haven’t spoken to a relevant professional about it.”

The two-item Generalized Anxiety Disorder questionnaire (GAD-2) and the two-item Patient Health Questionnaire (PHQ-2) are brief screening tools used to quickly assess anxiety and depression, respectively. Each asks how often the respondent has been bothered by two core symptoms over the past two weeks, with answers scored from 0 (“not at all”) to 3 (“nearly every day”); a total of 3 or more on either tool is the commonly used cutoff for a positive screen.
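
To make the arithmetic behind that 55 percent figure concrete, here is a minimal sketch in Python of how such a screen is scored. The `screen` function is a hypothetical illustration, not Wysa’s code, and assumes the standard 0–3 item scale and the common cutoff of 3:

```python
# Minimal sketch of GAD-2 / PHQ-2 scoring (hypothetical helper, not Wysa's code).
# Each instrument asks how often two symptoms occurred over the past two weeks:
#   0 = not at all, 1 = several days,
#   2 = more than half the days, 3 = nearly every day.
# A total of 3 or more is the commonly used cutoff for a positive screen.

def screen(item1: int, item2: int, cutoff: int = 3) -> bool:
    """Return True if the two-item total meets the positive-screen cutoff."""
    for item in (item1, item2):
        if not 0 <= item <= 3:
            raise ValueError("items are scored 0-3")
    return item1 + item2 >= cutoff

# Example: "several days" (1) + "more than half the days" (2) = 3 -> positive
print(screen(1, 2))  # True
```

Note how low the bar is: answering “several days” to one question and “more than half the days” to the other already crosses the threshold, which is the looseness Ms. Garfield-Jaeger objects to below.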

“Depending on how the person interprets those questions, everybody could be depressed,” said Ms. Garfield-Jaeger. “Real depression assessment used to take a couple of sessions with a mental health professional or at least several hours.”

“I think a lot of people are being diagnosed with depression, anxiety, who do not have it,” she said. “I do think that more people are anxious and depressed because of social isolation and other reasons.”

Author Abigail Shrier examined this issue in her book, “Bad Therapy,” and concluded that diagnoses are being made carelessly and treatments that “affirm all feelings” are also being dispensed freely.

“We used to call that validating,” said Ms. Garfield-Jaeger, who worked for over a decade at a partial hospitalization program with young people struggling with severe mental illness.

“So we would say, we validate your feeling, but that doesn’t mean your feeling is reality. It means you feel it, and now we need to investigate and do some — what they call — reality testing,” she added.

Ms. Shrier’s investigation also concluded that kids are often treated for what were once considered “normal growing pains.”

But for those who are lonely, hungry for connection, and struggling (the very people the chatbots’ creators believe stand to benefit most), “I think it is bad to not have humans involved,” said Ms. Garfield-Jaeger, “because once you start dealing with feelings, there needs to be a human to de-escalate you, or help if you do start having suicidal ideations. That’s dangerous.”
