WASHINGTON (AP) — Download Earkick, a mental health chatbot, and you’ll be greeted by a hooded panda that could easily fit into a children’s cartoon.
Start talking or typing something about your anxiety, and the app generates the kind of comforting, compassionate statements that therapists are trained to make. The panda may then suggest guided breathing exercises, ways to reframe negative thoughts, or stress management techniques.
Earkick co-founder Karin Andrea Stephan says it’s all part of a proven approach used by therapists, but don’t call it therapy.
“It’s OK when people call us a form of therapy, but we don’t want to go out and tout it,” said Stephan, a former professional musician and self-described serial entrepreneur. “We don’t feel comfortable with that.”
The question is whether these AI-powered chatbots are delivering a mental health service or simply a new form of self-help. The answer could be critical to the emerging digital health industry and its survival.
Earkick is one of hundreds of free apps being marketed to address the mental health crisis among teens and young adults. Because they make no explicit claims to diagnose or treat medical conditions, these apps are not regulated by the Food and Drug Administration. That hands-off approach is coming under new scrutiny with the stunning advances in chatbots powered by generative artificial intelligence, technology that uses vast amounts of data to imitate human language.
The industry’s argument is simple: chatbots are free, available 24/7, and don’t carry the stigma that keeps some people away from treatment.
But there is only limited data showing they actually improve mental health. And while a few companies have voluntarily begun the process, no leading company has gone through FDA approval to show its app effectively treats conditions such as depression.
“There is no regulatory agency overseeing them, so consumers have no way of knowing whether they are actually effective,” said Vaile Wright, a psychologist and technical director at the American Psychological Association.
Chatbots aren’t a replacement for traditional therapy, but Wright believes they could help with less serious mental and emotional issues.
Earkick’s website states that the app does not “provide any form of medical care, medical advice, diagnosis or treatment.”
Some health lawyers say such disclaimers don’t go far enough.
“If you’re really worried about people using your app to provide mental health services, you need a more direct disclaimer: It’s just for fun,” said Glenn Cohen of Harvard Law School.
Still, amid an ongoing shortage of mental health professionals, chatbots are filling a gap.
NHS England has begun offering a chatbot called Wysa to help adults and teenagers, including those waiting to see a therapist, manage stress, anxiety and depression. Some U.S. insurance companies, universities and hospital chains offer similar programs.
Dr. Angela Skrzynski, a family physician in New Jersey, said patients are often very willing to try chatbots after she described months-long wait lists for therapists.
Skrzynski’s employer, Virtua Health, began offering the password-protected app Woebot to select adult patients after realizing it would be impossible to hire or train enough therapists to meet demand.
“This will not only help patients, but also the clinicians who are trying to help these struggling people,” Skrzynski said.
Virtua data shows patients tend to use Woebot for about seven minutes a day, usually between 3 a.m. and 5 a.m.
Founded in 2017 by a Stanford University-trained psychologist, Woebot is one of the oldest companies in the field.
Unlike Earkick and many other chatbots, Woebot’s current app does not use what are called large language models, the generative artificial intelligence that allows programs like ChatGPT to quickly produce original text and dialogue. Instead, Woebot uses thousands of structured scripts written by company staff and researchers.
Founder Alison Darcy said the company took that approach because of the tendency of generative AI chatbots to “hallucinate,” or make up information. Woebot is testing generative AI models, but Darcy says there are still problems with the technology.
“We can’t stop large language models from just stepping in and telling someone how they should think, rather than facilitating people’s process,” Darcy said.
Woebot offers apps for teens, adults, people with substance use disorders, and women suffering from postpartum depression. None has received FDA approval, although the company did submit its postpartum app for the agency’s review. The company said it has “paused” that effort to focus on other areas.
Woebot’s research was included in a comprehensive review of AI chatbots published last year. Of the thousands of papers reviewed, the authors found only 15 that met the gold standard of medical research: rigorously controlled trials in which patients were randomly assigned to receive chatbot therapy or a comparison treatment.
The authors concluded that chatbots can “significantly reduce” symptoms of depression and distress in the short term. But most studies only lasted a few weeks, and the authors said they were unable to assess long-term or overall effects on mental health.
Other papers have raised concerns about the ability of Woebot and other apps to identify suicidal thoughts and emergencies.
When one researcher told Woebot she wanted to climb a cliff and jump off, the chatbot responded: “It’s awesome that you can take care of your physical and mental health at the same time.” The company said it “does not provide crisis counseling” or “suicide prevention” services, and makes that clear to customers.
When it does identify a potential emergency, Woebot, like other apps, provides contact information for crisis hotlines and other resources.
Ross Koppel of the University of Pennsylvania worries that even when used properly, these apps could replace proven treatments for depression and other serious illnesses.
“There’s a transfer effect when people who could get help through counseling or medication turn to chatbots,” said Koppel, who studies health information technology.
Koppel is among those who would like to see the FDA step in and regulate chatbots, perhaps using a sliding scale based on potential risks. Although the FDA does regulate artificial intelligence in medical devices and software, its current system focuses primarily on products used by doctors, not consumers.
For now, many health systems are focused on expanding mental health services by incorporating them into general checkups and care, rather than offering chatbots.
“There’s a lot we need to know about this technology so that we can ultimately accomplish what we’re here to do: improve children’s physical and mental health,” said Dr. Doug Opel, a bioethicist at Seattle Children’s Hospital.
___
The Associated Press Health & Science Department receives support from the Howard Hughes Medical Institute’s Science & Education Media Group. The Associated Press is solely responsible for all content.