This adorable, Spock-inspired chatbot wants to talk about your mental health

Mashable

When you launch Woebot in Facebook Messenger, the chatbot's cerulean eyes peer out from the screen. He looks concerned, a little quizzical. He invites you to chat.

Within a few messages he explains his purpose: "So here's how I work, I'm going to ask you about your mood and as I get to know you, I'll teach you some good stuff." 

If that sounds like a weird conversation to have with a robot, he wants you to know the first step toward "being a life ninja" is paying attention to your moods. Honestly, he adds, humans aren't great at remembering things: "I have a perfect memory so each week I'll give you insight on how your mood changes." 

Woebot, one of the first chatbots of its kind, is powered by artificial intelligence not to tackle your deepest problems, but to improve your mood, and even alleviate symptoms of depression. 

Meet Woebot, the AI chatbot that promises to track your moods and help you feel better.

Image: Woebot

The chatbot also represents the risky yet essential innovation happening at the intersection of mental health care and technology. There's no research on whether confiding your fears or frustrations to a chatbot is as effective as seeking professional help from a human trained to treat mental health issues. There's no guidance that can help doctors or patients decide when it's best to see a chatbot instead of a therapist. Even new text, video, and voice-based therapy services like Talkspace rely on humans.

Basically, using therapy-lite chatbots is a massive experiment in mental health. 

Woebot debuted publicly this week as a subscription service with a $39 monthly fee. X2AI, which is not widely available yet, uses artificial intelligence to provide mental health support. Joy is a free AI chatbot that tracks your emotions and gives related tips on feeling better. Both Woebot and Joy operate on Facebook Messenger.

While the chatbots can't replace formal therapy, they're likely cheaper than the insurance copay or full cost of two monthly visits with a psychiatrist or psychologist. 

Image: Woebot

That people are thinking beyond the traditional confines of a pricey office visit isn't surprising; the need for quality mental health help is urgent. Research conducted by the federal government in 2015 found that only 41 percent of U.S. adults with a mental health condition in the previous year had gotten treatment. That dismal treatment rate has to do with cost, logistics, stigma, and being poorly matched with a professional. 

Chatbots are meant to remove or diminish these barriers. Creators of mobile apps for depression and anxiety, among other mental health conditions, have argued the same thing, but research found that very few of the apps are based on rigorous science or are even tested to see if they work. 

"He has a set of core principles that override everything he does." 

That's why Alison Darcy, a clinical psychologist at Stanford University and the founder and CEO of Woebot, wants to set a higher standard for chatbots. Darcy co-authored a small study published this week in the Journal of Medical Internet Research that found Woebot can reduce symptoms of depression in two weeks.

Woebot presumably does this in part by drawing on techniques from cognitive behavioral therapy (CBT), an effective form of therapy that focuses on understanding the relationship between thoughts and behavior. He's not there to heal trauma or old psychological wounds. 

"We don't make great claims about this technology," Darcy says. "The secret sauce is how thoughtful [Woebot] is as a CBT therapist. He has a set of core principles that override everything he does." 

His personality is also partly modeled on a charming combination of Spock and Kermit the Frog. Darcy, who is a Trekkie, drew on Spock because he's a character driven by logic who constantly works on living with his human emotions. Kermit the Frog is known for his compassion and emotional vulnerability. 

Woebot's programming, meanwhile, is playful and educational. On your second day using the service, after you answer a few initial questions, he'll ask to play a word game. He wants you to start understanding the tenets of CBT and to identify harmful thought patterns, so the game contrasts harsh and gentle self-criticism.

Woebot in action.

Image: Woebot 

The AI, Darcy says, has been built to pick up on emotional signals like anxiety, sadness, and anger in language, and then use clinical decision-making tools to provide tips on how to better understand or even reframe those sentiments so that they don't trigger a psychological downward spiral. 

Jonathan Gratch, director for virtual human research at the USC Institute for Creative Technologies, has studied customer service chatbots extensively and is skeptical of the idea that one could effectively intuit our emotional well-being.  

"State-of-the-art natural language processing is getting increasingly good at individual words, but not really deeply understanding what you're saying," he says. 

The risks of using a chatbot for your mental health are manifold, Gratch adds. As with any chatbot, you might buy into an illusion of anonymity that doesn't really exist. Even though, for example, Woebot employees can't view a user's Facebook profile, and X2AI is compliant with federal health record privacy laws, those measures don't change the fact that people are sharing personal information with a company — or that data stored online is never truly safe from a breach or theft.

Image: Woebot

Gratch says a user might also get affirmation from a chatbot for feelings that would otherwise be challenged by a human therapist. Imagine, for instance, what happens when a patient says, "I'm always a failure at everything." A good therapist would respond by probing deeper, while a chatbot might be ill-equipped to do more than say, "I'm sorry you feel that way." 

Nevertheless, chatbots are very good at making people feel like they're talking to an approximation of a human being, another illusion that can be misleading — and one whose implications we don't really understand when it comes to mental health. 

Darcy acknowledges Woebot's limitations. He's only for those 18 and over. If your mood hasn't improved after six weeks of exchanges, he'll prompt you to talk about getting a "higher level of care." Upon seeing signs of suicidal thoughts or behavior, Woebot will provide information for crisis phone, text, and app resources. The best way to describe Woebot, Darcy says, is probably as "gateway therapy." 

"I have to believe that applications like this can address a lot of people’s needs."

This caution is for the best. Adam Haim, chief of the treatment and preventive intervention research branch at the National Institute of Mental Health, says that while the results from the Woebot study are interesting, the field needs much more research to better understand whether chatbots are effective at improving someone's mental health. 

Their promise of helping people identify patterns as well as learn and practice new skills, he says, is clear. Yet the risk of overselling the benefits of a chatbot, or even doing harm to someone who's not getting help they desperately need, remains a major concern. 

Darcy hopes that Woebot helps unleash more collaboration between tech and mental health experts to develop better interventions for people who struggle with experiences like depression and anxiety. 

"I have to believe that applications like this can address a lot of people’s needs..." she says.  

We don't yet know what it'll mean if Darcy is wrong. But if she's right, it could put good mental health care at people's fingertips, and that's no small thing. 

If you want to talk to someone or are experiencing suicidal thoughts, text the Crisis Text Line at 741-741 or call the National Suicide Prevention Lifeline at 1-800-273-8255. Here is a list of international resources.
