The Ethics of AI in Your Practice: What the APA Code Actually Says (and What It Doesn't)
- APA's Ethics Code addresses competence (Standard 2.01), informed consent (Standard 3.10), and record-keeping (Standard 6.01) — all directly relevant to AI tool use in clinical practice
- Key principle: AI tools (for note-taking, treatment planning, or client communication) are held to the same competence standard as any other clinical tool — clinicians must understand what they are using and its limitations
- Informed consent must address AI: if you use an AI tool that processes client information, the client has a right to know — even for "back-office" tools that do not interact with the client directly
- The Code contains no AI-specific standards yet — but its existing principles create clear obligations that most clinicians using AI tools are not meeting
You are probably already using AI in your practice. ChatGPT for session notes. Copilot for treatment plan drafts. A transcription tool that processes therapy recordings. The APA Ethics Code does not mention AI by name — but its existing principles create obligations that most clinicians have not considered.
Competence applies to tools, not just techniques
Standard 2.01 requires psychologists to provide services within the boundaries of their competence. This extends to tools. If you use an AI transcription service to generate session summaries, you need to understand: what happens to the data, whether the AI's output is accurate, what errors it introduces, and whether the service complies with HIPAA or local data protection requirements.
"I didn't know the AI tool stored recordings on overseas servers" is not a defence under competence standards. The obligation to understand your tools is yours.
Informed consent now includes AI
Standard 3.10 requires informed consent that covers "the nature of such services." If an AI tool processes any client-related information — voice recordings, session notes, intake data — the client should know. This is not hypothetical: the HIPAA Privacy Rule already requires a business associate agreement with any vendor that handles protected health information on your behalf, and most AI tool vendors that process client data fit that definition.
The practical question: have you updated your informed consent form to mention AI tools? Most clinicians have not.
What the Code does not address
The APA Ethics Code does not specifically regulate: AI-generated clinical recommendations, chatbot-based interventions used between sessions, or AI tools that interact directly with clients. These are gaps — and they are being filled piecemeal by state boards, federal regulators, and case law rather than by a unified ethical framework.
For your practice
Audit your AI tool use. For each tool: (1) do you understand what it does with client data? (2) is it disclosed in your informed consent? (3) are you competent to evaluate its output? If any answer is no, you have an ethics gap to close — and the APA Code already requires you to close it.
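For readers who track their tool inventory in a spreadsheet or script, the three-question audit above can be sketched programmatically. The tool names, field names, and answers below are hypothetical placeholders, not recommendations:

```python
# Sketch of the three-question AI tool audit described above.
# All tool names and answers are illustrative.

AUDIT_QUESTIONS = (
    "data_handling_understood",  # (1) do you understand what it does with client data?
    "disclosed_in_consent",      # (2) is it disclosed in your informed consent?
    "competent_to_evaluate",     # (3) are you competent to evaluate its output?
)

def audit(tools):
    """Return each tool with at least one 'no' answer, mapped to its gaps."""
    return {
        name: [q for q in AUDIT_QUESTIONS if not answers.get(q, False)]
        for name, answers in tools.items()
        if not all(answers.get(q, False) for q in AUDIT_QUESTIONS)
    }

# Hypothetical practice inventory
tools = {
    "transcription service": {
        "data_handling_understood": True,
        "disclosed_in_consent": False,
        "competent_to_evaluate": True,
    },
    "note-drafting assistant": {
        "data_handling_understood": True,
        "disclosed_in_consent": True,
        "competent_to_evaluate": True,
    },
}

for name, gaps in audit(tools).items():
    print(f"{name}: ethics gap(s) -> {', '.join(gaps)}")
```

Any tool that appears in the output has an ethics gap to close before it touches client data again.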
The APA Ethics Code does not mention AI — but its competence and informed consent standards already require clinicians to understand, disclose, and evaluate every AI tool they use in practice.
The APA Code is US-centric, and interpretations may vary by state licensing board. AI-specific regulation is evolving rapidly, and the Code's principles are broad; specific guidance for common AI use cases is still emerging.