
Therapists Turn to AI Tools as Adults and Teens Seek Digital Support

Therapists integrate AI chatbots into practice while 16% of adults and 66% of teens turn to them for mental health support.

Health & Science Editor

Source: Psychology Today

Therapists are adopting AI chatbots as supplemental tools while a growing share of adults (16%) and teens (66%) turn to them for mental‑health support.

Context

AI‑driven conversational agents are moving beyond novelty to become part of everyday emotional coping. Clinicians report patients mentioning chatbots during sessions, prompting professionals to explore how the technology can augment care.

Key Facts

- A recent KFF poll shows 16% of U.S. adults used AI tools or chatbots for mental‑health purposes in the past year; usage spikes among younger adults.
- Pew Research data reveal that 66% of teenagers have interacted with a chatbot, and 28% send daily messages to one.
- Dr. Christine Crawford, a Boston psychiatrist, says AI helps her process difficult cases and regain confidence when she feels stuck, likening it to consulting a supervisor.
- Therapists such as clinical social worker Dan Sutelman estimate that about 20% of their clients raise AI use, and many now ask every patient about it.
- A JAMA Psychiatry paper urges providers to discuss AI use with curiosity rather than judgment, emphasizing the need to understand patients' digital habits.

What It Means

For patients, AI chatbots offer immediate, low‑cost outlets for expressing feelings, rehearsing conversations, or summarizing emotional patterns between appointments. The technology can surface recurring themes, like feeling "unseen," that therapists can address in person.

For clinicians, AI serves as a reflective mirror and a brainstorming partner. Crawford uses a chatbot to reframe traumatic disclosures, receiving empathetic language that helps her maintain therapeutic presence. However, she stresses strict data privacy, never sharing client details with the AI.

The trend also creates a new clinical responsibility: therapists must assess the accuracy of AI‑generated advice and steer patients toward evidence‑based resources. Because chatbot responses come from large language models, they can echo misinformation or reinforce unhelpful thought patterns.

Practical takeaways for readers:

1. If you already chat with an AI about stress or anxiety, bring the transcript to your next therapy session; it can serve as a concrete discussion starter.
2. Verify any coping strategies suggested by a chatbot with a qualified professional before acting on them.
3. Ask your therapist whether they incorporate AI tools into treatment; a collaborative approach can enhance insight without compromising confidentiality.

Looking ahead, watch for emerging guidelines from professional bodies on AI‑assisted therapy and for research measuring outcomes of combined human‑AI interventions in mental‑health care.
