AI Chatbots to Support Those Struggling with Mental Health

Earkick, a mental health chatbot, greets users with a friendly-looking panda that would fit right in on a children’s show.

When users write about anxiety, the panda responds with the kind of reassuring remarks a licensed mental health practitioner, or therapist, might offer. It can then suggest breathing exercises or tips for managing stress.

Earkick is one of hundreds of free chatbots designed to address the youth mental health crisis. However, Karin Andrea Stephan, a co-founder of Earkick, says that she and the other developers do not “feel comfortable” calling their chatbots therapeutic tools.

Whether these chatbots and apps provide basic self-help or actual mental health services matters to the expanding digital health industry. Because the apps make no claims to diagnose or treat medical conditions, they do not require approval from the Food and Drug Administration (FDA).

Chatbots Powered by AI

Recent advances in chatbots powered by artificial intelligence (AI) have brought closer scrutiny of the industry. These systems draw on vast amounts of data to mimic human speech.

The chatbots’ benefits are clear: they are free to use, available around the clock, and let users connect privately.

The drawbacks: there is little evidence that chatbots improve mental health, and the FDA has not approved them to treat conditions such as depression.

Vaile Wright is a psychologist and technology director at the American Psychological Association. Users “have no way to know whether these chatbots are actually effective,” she said.

Wright added that chatbots are not the same as traditional mental health care. However, she said, they could help some people with less serious mental and emotional problems.

Earkick’s website states that the app does not “provide any form of medical care, medical opinion, diagnosis, or treatment.” Some health lawyers say such disclaimers are not enough.

Harvard Law School professor Glenn Cohen said, “You want a disclaimer that’s more direct if you’re really worried about people using your app for mental health services.” He suggested wording such as: “This is just for fun.”

Nevertheless, with a continuing shortage of mental health specialists, chatbots are already in use.

Lack of Mental Health Specialists

Britain’s National Health Service has begun offering young people a chatbot named Wysa to help with stress, anxiety, and depression.

The service is also available to people waiting for a therapy appointment. Some hospitals, universities, and health insurance companies in the United States offer similar programs.

Dr. Angela Skrzynski is a family physician in the American state of New Jersey. She says most of her patients are quite willing to try a chatbot once she tells them how long the wait to see a therapist will be. Her employer, Virtua Health, offers Woebot to some adult patients.

Founded in 2017 by a Stanford-trained psychologist, Woebot does not use generative AI. Instead, the chatbot draws on thousands of structured scripts written by the company’s researchers and staff.

According to Woebot founder Alison Darcy, this rules-based approach is safer to use in health care settings. The company has experimented with generative AI models, but Darcy says there have been problems with them.

“We couldn’t stop the large language models from… telling someone how they should be thinking, instead of facilitating the person’s process,” Darcy said.

Woebot’s findings were included in a report on AI chatbot studies published last year in the journal Digital Medicine.

The authors concluded that chatbots could temporarily ease depression. However, their long-term effects on mental health could not be examined.

Ross Koppel is a health information technology researcher at the University of Pennsylvania. He worries that these chatbots could replace treatment and medication. Koppel and others want the FDA to investigate, and perhaps regulate, these chatbots.

Dr. Doug Opel works at Seattle Children’s Hospital. “There’s a whole host of questions we need to understand about this technology so we can ultimately do what we’re all here to do: improve kids’ mental and physical health,” he said.
