Chances are at some point over the past few months you have interacted with a chatbot in some context, be it through a customer service application on a website or just in a futile argument on Twitter. Bots are everywhere, and they're getting smarter by the day, but how far can they go toward replacing all forms of human interaction? A San Francisco-based start-up has just launched an AI therapist, or "mental health chatbot," called Woebot.
Currently accessible only through Facebook Messenger, Woebot monitors your moods and offers automated conversation drawing on a Cognitive Behavioral Therapy (CBT) framework. Woebot is currently available for free through a two-week trial; after that you pay a subscription fee to continue your daily "treatments." At US$12 per week, $39 per month or $312 per year, Woebot is certainly cheaper than seeing a real therapist, but is it any replacement for real mental health treatment?
After playing with the system for a little while, we found it frustratingly prescriptive. The tool is less "artificial intelligence" and more an assortment of self-help aphorisms triggered by certain keywords in whatever you type. The chatbot then spits out prescribed bits of rationalization or inspirational videos to improve your mood.
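To give a sense of the pattern, here is a minimal sketch in Python of a keyword-matching responder of the kind our testing suggested. Everything in it, including the keywords, replies and the respond function, is invented for illustration; Woebot's actual implementation is not public.

```python
# Illustrative only: a bare-bones keyword-matching chatbot.
# The keywords and canned replies below are invented for demonstration
# and bear no relation to Woebot's real code or content.

CANNED_RESPONSES = {
    "anxious": "Anxiety often comes from distorted thinking. Can you name the thought behind it?",
    "sad": "I'm sorry to hear that. Here's a short video that might lift your mood: <link>",
    "tired": "Rest matters! Small routines, like a regular bedtime, can really help.",
}

DEFAULT_REPLY = "Tell me more about how you're feeling today."

def respond(message: str) -> str:
    """Return the first canned reply whose keyword appears in the message."""
    lowered = message.lower()
    for keyword, reply in CANNED_RESPONSES.items():
        if keyword in lowered:
            return reply
    # No keyword matched, so fall back to a generic prompt.
    return DEFAULT_REPLY

print(respond("I've been feeling really anxious about exams"))
```

A system like this has no model of the conversation at all, which is consistent with the bot losing the thread the moment a reply doesn't contain a word it recognizes.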
Interestingly, the chatbot was put through its paces as part of a research study undertaken by Woebot's CEO, Dr Alison Darcy, and two independent co-authors from Stanford University. The peer-reviewed study, recently published in JMIR Mental Health, showed the bot to be surprisingly effective in reducing participants' depressive symptoms after two weeks of use.
In the study, 70 students aged between 18 and 28 were randomly assigned to either the Woebot agent or an information control group, which was offered the National Institute of Mental Health's ebook on managing depression among college students. After two weeks, the participants communicating with Woebot reported significantly lower symptoms of anxiety and depression relative to the information control group.
Woebot is certainly a fascinating concept, but outside of being an automated self-help cheerleader, it doesn't seem to offer anything like the interactivity of a real therapist. When we used the system, it got tripped up very easily at the first sign of a complex response. Darcy argues that Woebot is designed to make something like therapy more accessible to everyone.
"Barriers, like cost of treatment and social stigmas, have prevented people from getting the help that they need," says Dr Darcy. "Woebot represents a new era in digital therapy. We built Woebot to give people a customized therapeutic experience and the functional tools they need to manage something incredibly personal."
The company is in no way claiming that Woebot is a replacement for professional mental health help, but the launch press release does say the system "offers accessible mental health care to those who need it, all at a fraction of the cost of traditional therapy."
Woebot certainly isn't the first chatbot to tackle the psychotherapy domain. In 2016 a startup called X2AI created a chatbot named Karim. Initially designed to support psychologists working with refugees in war-torn areas of the world, Karim was trialled with a group of Syrian refugees at a camp in Lebanon.
X2AI also has a more sophisticated version of its system named Tess. This bot can reportedly undertake more comprehensive CBT sessions, but the company is careful to refer to the system only as a "therapeutic assistant."
The ethical and legal implications of a robot delivering a service in place of a trained medical professional are still being ironed out, and in the field of mental health they are especially sensitive. In the case of X2AI, several triggers have been programmed into its systems to flag the need for human intervention in instances of self-harm or unlawful intent.
With Woebot, while the system explains up front that it is a robot and not a substitute for professional help, it suggests that the safe word "SOS" will trigger it to send you some "resources." When we repeatedly replied "SOS" to its queries, though, it didn't seem to understand what was going on.
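In principle, an escalation trigger of this kind is simple to wire up: scan every incoming message for crisis phrases before any ordinary reply logic runs. The sketch below is hypothetical; the trigger phrases, resource text and handle_message function are all invented for illustration and are not how Woebot or X2AI actually implement their checks.

```python
# Hypothetical sketch of a crisis "safe word" check: escalation triggers
# are tested before any normal small-talk handling, so a distressed user
# is never routed into a canned reply. All strings here are invented.

CRISIS_TRIGGERS = {"sos", "hurt myself", "want to die"}

CRISIS_RESOURCES = (
    "It sounds like you might need more support than a bot can offer. "
    "Here are some resources: <crisis line details would go here>"
)

def handle_message(message: str) -> str:
    """Check for escalation triggers before any ordinary reply logic."""
    lowered = message.lower()
    if any(trigger in lowered for trigger in CRISIS_TRIGGERS):
        return CRISIS_RESOURCES
    # No trigger found: hand off to the normal conversational flow.
    return "Tell me more about how you're feeling today."

print(handle_message("SOS"))  # should surface resources, not small talk
```

If Woebot's check worked anything like this, a bare "SOS" should have surfaced resources every time, which makes its failure to do so in our testing all the more puzzling.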
As a low-level self-help tool, chatbots like Woebot are undoubtedly useful, especially given how younger generations use social media and digital technology. But the technology is still nowhere near capable of recreating the experience of sitting in a room with a trained psychologist. For an anxious, overwhelmed college student, Woebot may offer the enthusiastic aphorisms they need to get through a rough few days, but when that mild depression turns into something more dangerous, an automated chatbot is probably not the solution. At least not yet …
Source: Woebot