Can AI Bots Help Your Mental Health?

Let’s admit it: when the pressure of today’s life amps up, all of us have moments when we need someone to turn to for advice, reassurance, or just a friendly ear. The good news is that AI is making inroads into the world of mental health.

According to Forbes, AI is getting increasingly sophisticated at doing what humans do, but more efficiently, more quickly, and at a lower cost. To get a sense of how chatbots measure up to in-person therapy, Healthline.com did a test run of two leading mental health chatbots and asked users to provide feedback. Here’s what they found.

Woebot

Woebot is a “fully automated conversational agent” developed by Woebot Labs in San Francisco. Woebot is very user-friendly and begins with a short survey to see what areas the user wants to work on. It also reviews confidentiality and gives instructions on what to do in an emergency.

Woebot has a sense of humor, and it has skills: in no time, it identified the user’s mood (with emoji support), pinpointed three thoughts underlying that mood, and helped the user see that these thoughts were “distortions” to be replaced with more helpful ones. In other words, Woebot does cognitive behavioral therapy (CBT), an evidence-based approach to treatment.


Over time, Woebot charts emoji responses to help visualize trends and then shares that chart with the user. This allows the user to understand why he/she should be checking in daily. Though Woebot isn’t meant to replace a real therapist, it’s a great tool to use outside of therapy to keep you on track with your internal work.

Wysa

Wysa is a playful AI penguin that operates on iPhone and Android platforms. After introductions, Wysa informs the user that conversations are confidential, private, and encrypted. The user then tells Wysa what he/she struggles with (in this test, mild depression and anxiety) and completes a brief questionnaire.

Based on the responses, Wysa builds a “toolkit” with a variety of exercises “for better focus if I’m overwhelmed, to manage conflict, and to relax.” Some of these exercises are based on mindfulness meditation. Every evening, Wysa checks in with the user to monitor progress. Like Woebot, Wysa has skills in CBT and restructuring thoughts, and it is very user-friendly and attractive.

The second user found it so friendly that, at times, it felt as though there was a human on the other end. The bot had a great sense of humor, could really lighten the mood, was personable, and understood what the user was saying. However, Wysa cannot replace a real therapist; it is best used alongside other forms of therapy.

Tess

“Built by clinical psychologists, Tess is a Mental Health Chatbot that coaches people through tough times to build resilience, by having text message conversations – similar to texting with a friend or coach.”

If you’re experiencing a panic attack in the middle of the day or want to vent before going to sleep, you can connect with Tess through Facebook Messenger and receive an immediate reply.

Final Thoughts

Healthline.com found that chatbots are a viable and effective option for mental health support. The most obvious benefits are convenience and price. However, industry experts urge caution. “Robots should not be used in lieu of therapists,” said Şerife Tekin, who works at the intersection of philosophy, biomedicine, and artificial intelligence at the University of Texas at San Antonio (UTSA).

In a face-to-face interaction with a therapist, a patient may feel a high level of trust and discuss his/her condition openly. In the digital realm, a patient may self-censor for fear of data breaches. Even if the developer issues a written privacy policy, Tekin argues that there are currently no regulations to protect the privacy and security of personal health information.

The Wall Street Journal also voiced concerns and concluded, “The efficacy of some products is questionable, a problem only made worse by the fact that private companies don’t always share information about how their AI works. Problems about accuracy raise concerns about amplifying bad advice to people who may be vulnerable or incapable of critical thinking, as well as fears of perpetuating racial or cultural biases. Concerns also persist about private information being shared in unexpected ways or with unintended parties.”
