
Chatbots Are Not Medical Devices



A smartphone with an AI mental health chatbot covered in red tape bearing the FDA logo. (Illustration: Eddie Marshall | Midjourney)

In November, the Food and Drug Administration (FDA) held a Digital Health Advisory Committee meeting where it considered treating artificial intelligence mental health chatbots as medical devices. As the FDA more formally describes it, the agency “intends to apply its regulatory oversight” to software functions that it considers “medical devices” in cases where poor “functionality could pose a risk to a patient’s safety.”

The agency clarified that its intended "approach applies to generative AI-enabled products as well." That's formal language for an approach that threatens to rope many AI chatbots into the FDA's broad regulatory oversight, even though they operate as useful wellness applications, not as medical devices by any reasonable definition. It would be a mistake for the agency to apply medical device regulations to such wellness chatbots.

Registering a medical device with the FDA is extremely costly. To start with, there is the $11,423 annual registration fee. From then on, the company is burdened with stringent government red tape that layers on costs and makes regulated products less accessible to consumers.

For medical devices, the FDA requests premarket performance paperwork, risk management designs, and postmarket reports to assess reliability and effectiveness, all of which add further expense for companies. Perhaps these costs would be justified if all of the potentially affected mental health care chatbots were actually medical devices—but they are not.

The FDA labels a product or service a medical device if it is intended to diagnose, cure, mitigate, treat, or prevent a specific disease, including mental health conditions. General wellness chatbots do none of this.

What mental health chatbots do is offer general coping skills, mindfulness exercises, and cognitive behavioral therapy–style reframing meant to support users' well-being. They neither claim nor are intended to treat any specific medical condition. Since they do not evaluate users before or after interactions—and do not tailor specific medical interventions—mental health chatbots clearly fall outside FDA medical device requirements.

However, the FDA does regulate mental health technologies that are explicitly marketed as treatments. For instance, Rejoyn and DaylightRx, two digital apps explicitly intended to treat and cure previously diagnosed mental health conditions, have been labeled medical devices. Both apps demand accuracy, and therefore accountability, because they are marketed as treatments for conditions such as depression. It makes sense that they are held to a higher standard than tools that make no such claims.

AI mental health care chatbots are different because they do not claim to diagnose, treat, or cure any medical condition. They are better characterized as "wellness apps" that, at their best, help people understand or feel better about themselves.

Nonetheless, AI mental health chatbots can be therapeutic without delivering what the FDA considers treatment.

As psychologists and users have pointed out, these chatbots respond to questions and complaints, provide summaries of conversations, and suggest topics to think about. These are forms of general wellness support, not clinical care. The companies behind these tools are explicit about this.

In public comments submitted to the FDA's digital mental health committee, Slingshot AI, the company that developed Ash (a popular AI mental health chatbot), specifies that it aims to provide "general wellness by making mental health support more accessible," not treatment or diagnosis of mental health issues. Another AI mental health chatbot, Wysa, which listens and responds to users' emotions and thoughts, does not diagnose or attempt to treat any condition.

But these companies are offering a low-cost answer to some people's felt need for communication, one available at all hours, day or night, amid a shortage of mental health care providers that affects millions of Americans.

Therabot, a mental health chatbot, showed it could reduce depressive symptoms by 51 percent and downgrade moderate anxiety to mild in many of those who interacted with it for a couple of weeks. The developers of Ash carried out a 10-week study that found 72 percent of people using their app reported a decrease in loneliness, 75 percent reported an increase in perceived social support, and four out of five reported greater hope and greater engagement with their lives. These products are helping people, and the FDA ought not to make access to them more expensive or complicated with new regulatory efforts.

Treating mental health care chatbots as medical devices misses the point: These tools are not professional therapy. In fact, they are more like educational chatbots developed by licensed professionals than like medical advisers. Their advice involves neither a clinical relationship nor a personalized diagnosis.

Some worry that, without being designated and regulated as medical devices, AI mental health chatbots will become decidedly unsafe. But companies in the field are already setting higher standards to prevent such risks. For instance, OpenAI incorporated input from mental health professionals into ChatGPT so the model can recognize signs of distress, de-escalate conversations, avoid affirming ungrounded beliefs, and guide people toward in-person mental health care. Anthropic, the company behind the AI model Claude, is likewise placing safeguards in its model by partnering with ThroughLine, a global crisis support service whose mental health care professionals are helping shape how the model handles sensitive conversations.

Unlike general-purpose chatbots such as Claude and ChatGPT, AI mental health chatbots are specifically designed to handle sensitive conversations. Ash, for example, relies on experts’ input and scientific evidence to improve user interactions. This is the case for many other AI mental health chatbots, such as Earkick, Elomia, and Wysa.

Labeling AI mental health chatbots as medical devices would stymie progress in helping people in need with simple tools that offer support, not medical advice. Imposing costly regulations on a technology that provides significant benefits will harm Americans who are seeking help. Any FDA decision to treat AI mental health chatbots as medical devices would be a mistake.



Source: https://reason.com/2025/12/03/chatbots-are-not-medical-devices/



