PSYREFLECT
INDUSTRY · January 5, 2026 · 2 min read

AI Is Entering the Therapy Room — But Regulation Hasn't Arrived Yet

Key Findings
  • APA's 2025 Practitioner Pulse Survey shows rising adoption of AI tools among psychologists for note-taking, treatment planning, and practice management
  • Therabot (Dartmouth) — a fully generative AI chatbot — produced significant symptom improvements in a clinical trial for MDD, GAD, and eating disorder risk
  • The FDA's Digital Health Advisory Committee convened in late 2025 to discuss regulatory frameworks for patient-facing mental health AI, but no binding guidance has emerged
  • Practitioners report persistent concerns about patient privacy, clinical accuracy, and ethical boundaries of AI in therapeutic contexts

AI is no longer a hypothetical in mental health practice. Psychologists are using it for clinical documentation, treatment planning, and administrative tasks. A fully generative chatbot has now shown symptom improvement in a controlled trial. Yet the regulatory framework remains a grey zone — and the gap between adoption speed and oversight speed is widening.

The adoption curve

The APA's 2025 Practitioner Pulse Survey captures a profession in transition. AI adoption for clinical documentation — session notes, treatment plans, progress summaries — has moved from novelty to routine for early adopters. The appeal is straightforward: these tasks consume 30–40% of a clinician's working hours, and AI handles them faster.

But the survey also reveals a sharp split. Clinicians using AI for back-office tasks (scheduling, billing, note formatting) report high satisfaction. Those considering patient-facing applications express deep reservations. The concern is not abstract: what happens when a depressed patient interacts with a chatbot that generates a clinically inappropriate response at 2 AM?

The Therabot signal

The first clinical trial results for Therabot — a fully generative AI chatbot developed at Dartmouth — showed significant symptom improvements for major depressive disorder, generalised anxiety, and eating disorder risk. This is not a rule-based system following a decision tree. It is a large language model generating therapeutic responses in real time.

The results demand attention, but also caution. A symptom improvement signal in a controlled trial is not the same as clinical safety at scale. The chatbot operated under research conditions with oversight. The unregulated market offers no such guardrails.

The regulatory vacuum

The FDA's Digital Health Advisory Committee met in late 2025 to discuss patient-facing mental health AI. No binding guidance emerged. Most AI mental health tools currently fall outside existing FDA and FTC oversight frameworks — they are not medical devices (no diagnosis, no treatment), not drugs, not therapy. They exist in a regulatory gap.

For practitioners, the practical question is not whether AI will enter your practice — it is entering. The question is: which tools, under what conditions, with what liability? The answers do not exist yet.

AI has moved from hypothetical to routine in clinical practice — but the regulatory framework remains a blank page, leaving practitioners to navigate adoption without guardrails.

Limitations

The APA survey reflects US-centric practice patterns; AI adoption and regulation vary significantly across jurisdictions. Therabot trial details (sample size, control conditions) were not fully disclosed in the APA Monitor article.

Source
APA Monitor on Psychology
AI in the Therapist's Office: Uptake Increases, Caution Persists
2026-03-01
Tags
AI-in-therapy · regulation · clinical-documentation · chatbots · APA