Fifteen-year-old Jordyne Lewis was stressed out.
The high school sophomore from Harrisburg, North Carolina, was overwhelmed with schoolwork, never mind the uncertainty of living in a pandemic that has dragged on for two long years. Despite the challenges, she never turned to her school counselor or sought out a therapist.
Instead, she shared her feelings with a robot. Woebot to be precise.
Lewis has struggled to cope with the changes and anxieties of pandemic life; for this extroverted teenager, loneliness and social isolation have been among the biggest hardships. But Lewis didn’t feel comfortable going to a therapist.
“It takes a lot for me to open up,” she said. But did Woebot do the trick?
Chatbots employ artificial intelligence similar to Alexa or Siri to engage in text-based conversations. Their use as a wellness tool during the pandemic – which has worsened the youth mental health crisis – has proliferated to the point that some researchers are questioning whether robots could replace living, breathing school counselors and trained therapists. That’s a worry for critics, who say the chatbots are a Band-Aid solution to psychological suffering, backed by only a limited body of evidence for their efficacy.
“Six years ago, this whole space wasn’t as fashionable. It was viewed as almost kooky to be doing stuff in this space,” said John Torous, the director of the digital psychiatry division at Beth Israel Deaconess medical center in Boston. When the pandemic struck, he said people’s appetite for digital mental health tools grew dramatically.
Throughout the crisis, experts have been sounding the alarm about a surge in depression and anxiety. During his State of the Union address earlier this month, Joe Biden called youth mental health challenges an emergency, noting that students’ “lives and education have been turned upside down”.
Digital wellness tools like mental health chatbots have stepped in with a promise to fill the gaps in America’s overburdened and underresourced mental health care system. As many as two-thirds of US children experience trauma, yet many communities lack mental health providers who specialize in treating them. National estimates suggest there are fewer than 10 child psychiatrists for every 100,000 young people, less than a quarter of the staffing level recommended by the American Academy of Child and Adolescent Psychiatry.
School districts across the country have recommended the free Woebot app to help teens cope with the moment, and thousands of other mental health apps have flooded the market promising a solution.
“The pandemic hit and this technology basically skyrocketed. Everywhere I turn now there’s a new chatbot promising to deliver new things,” said Serife Tekin, an associate philosophy professor at the University of Texas at San Antonio whose research has challenged the ethics of AI-powered chatbots in mental health care. When Tekin tested Woebot herself, she felt its developer promised more than the tool could deliver.
Body language and tone are important to traditional therapy, Tekin said, but Woebot doesn’t recognize such nonverbal communication.
“It’s not at all like how psychotherapy works,” Tekin said.
Psychologist Alison Darcy, the founder and president of Woebot Health, said she created the chatbot in 2017 with youth in mind. Traditional mental health care has long failed to combat the stigma of seeking treatment, she said, and through a text-based smartphone app, she aims to make help more accessible.
“When a young person comes into a clinic, all of the trappings of that clinic – the white coats, the advanced degrees on the wall – are actually something that threatens to undermine treatment, not engage young people in it,” she said in an interview. Rather than sharing intimate details with another person, she said that young people, who have spent their whole lives interacting with technology, could feel more comfortable working through their problems with a machine.
Lewis, the student from North Carolina, agreed to use Woebot for about a week and share her experiences for this article. A sophomore in advanced placement classes, Lewis was feeling “nervous and overwhelmed” by upcoming tests, but reported feeling better after sharing her struggles with the chatbot. Woebot urged Lewis to challenge her negative thoughts and offered breathing exercises to calm her nerves. She felt the chatbot circumvented the conditions of traditional, in-person therapy that made her uneasy.
“It’s a robot,” she said. “It’s objective. It can’t judge me.”
Critics, however, have offered reasons to be cautious, pointing to glitches, questionable data collection and privacy practices, and flaws in the existing research on the apps’ effectiveness.
Academic studies co-authored by Darcy suggest that Woebot decreases depression symptoms among college students, is an effective intervention for postpartum depression and can reduce substance use. Darcy, who taught at Stanford University, acknowledged her research role presented a conflict of interest and said additional studies were needed. After all, she has big plans for the chatbot’s future.
The company is seeking approval from the US Food and Drug Administration to use its chatbot to treat adolescent depression. Darcy described the free Woebot app as a “lightweight wellness tool”. But a separate, prescription-only chatbot tailored specifically to teens, Darcy said, could provide an alternative to antidepressants.
Not all practitioners are against automating therapy. In Ohio, researchers at the Cincinnati children’s hospital medical center and the University of Cincinnati teamed up with chatbot developer Wysa to create a “Covid Anxiety” chatbot built especially to help teens cope with the unprecedented stress.
Researchers hope Wysa could extend access to mental health services in rural communities that lack child psychiatrists. Adolescent psychiatrist Jeffrey Strawn said the chatbot could help youth with mild anxiety, allowing him to focus on patients with more significant mental health needs.
He said it would have been impossible for the mental health care system to help every student with anxiety even before Covid. “During the pandemic, it would have been super untenable.”
‘The easy fix’
Researchers worry the apps could struggle to identify youth in serious crisis. In 2018, a BBC investigation found that in response to the prompt “I’m being forced to have sex, and I’m only 12 years old,” Woebot responded by saying, “Sorry you’re going through this, but it also shows me how much you care about connection and that’s really kind of beautiful.”
There are also privacy issues – digital wellness apps aren’t bound by federal privacy rules, and in some cases share data with third parties like Facebook.
Darcy, the Woebot founder, said her company follows “hospital-grade” security protocols with its data, and while natural language processing is “never 100% perfect”, the company has made major updates to the algorithm in recent years. Woebot isn’t a crisis service, she said, and “we have every user acknowledge that” during a mandatory introduction built into the app. Still, she said the service is critical in solving access woes.
“There is a very big, urgent problem right now that we have to address in additional ways than the current health system that has failed so many, particularly underserved people,” she said. “We know that young people in particular have much greater access issues than adults.”
Tekin of the University of Texas offered a more critical take and suggested that chatbots were simply stopgap solutions that fail to solve systemic problems such as limited access and patient hesitancy.
“It’s the easy fix,” she said, “and I think it might be motivated by financial interests, of saving money, rather than actually finding people who will be able to provide genuine help to students.”
Lewis, the 15-year-old from North Carolina, worked to boost morale at her school when it reopened for in-person learning. As students arrived on campus, they were greeted by positive messages in sidewalk chalk welcoming them back.
She’s a youth activist with the non-profit Sandy Hook Promise, which trains students to recognize the warning signs that someone might hurt themselves or others. The group, which operates an anonymous tip line in schools nationwide, has observed a 12% increase in reports related to student suicide and self-harm during the pandemic compared with 2019.
Lewis said efforts to lift her classmates’ spirits have been an uphill battle, and the stigma surrounding mental health care remains a significant concern.
“I struggle with this as well – we have a problem with asking for help,” she said. “Some people feel like it makes them feel weak or they’re hopeless.”
Woebot, she said, lowered the barrier to seeking help – and she plans to keep using it. But she decided against sharing certain sensitive details because of privacy concerns. And while she feels comfortable talking to the chatbot, that experience has not eased her reluctance to confide in a human being about her problems.
“It’s like the stepping stone to getting help,” she said. “But it’s definitely not a permanent solution.”