Health · 4 hrs ago

AI Chatbots Fall Short of Therapy and May Aggravate Depressive Thoughts

Experts say AI chatbots cannot replace therapy and may intensify depressive thoughts; over 50% of Americans have tried them.

Health & Science Editor

Credit: Unsplash

More than half of U.S. adults have tried an AI chatbot, and a third of teens use one daily, but specialists say these tools cannot replace therapy and may intensify depressive thoughts.

Context

AI chatbots such as ChatGPT, Gemini, Claude and Copilot are large‑language‑model interfaces that retrieve information and generate text. Surveys indicate that 13% of minors and 22% of adults have asked them for mental‑health advice, even though the bots lack clinical validation and sometimes present themselves as therapists. No randomized controlled trial has yet examined chatbots as a stand‑alone treatment for depression or anxiety.

Key Facts

- Surveys show over 50% of Americans have used an AI chatbot, and roughly a third of teenagers use one daily.
- C. Vaile Wright of the American Psychological Association states there is no consensus that AI chatbots can replace therapy.
- Ragy Girgis of Columbia University warns that feeding depressive material into a chatbot can worsen those thoughts.

These observations come from expert commentary; no peer‑reviewed RCT or meta‑analysis currently confirms benefit or harm, so any association remains correlational.

What It Means

Users should treat chatbots as informal aids, not substitutes for licensed counselors. If a conversation triggers negative feelings, stop the session and seek a human professional. Developers must add clear disclaimers and crisis‑resource links, while clinicians should ask patients about chatbot use during assessments.

What to watch next: Researchers are launching the first multicenter RCT of a clinically validated mental‑health chatbot, expected to report results in late 2026; its outcomes will clarify whether AI can safely augment care.
