Your prompt is showing: Therapists secretly using ChatGPT during sessions raises privacy concerns

Some therapists are secretly using ChatGPT and other AI tools during sessions and in client communications, often without disclosure or consent. Multiple clients have discovered their therapists using AI through technical mishaps or telltale signs in communications, leading to feelings of betrayal and damaged trust in relationships where authenticity is paramount.

What you should know: Several clients have caught their therapists using AI tools in real-time during sessions or in email responses.

  • Declan, 31, watched his therapist input his statements into ChatGPT during a video session when screen sharing was accidentally enabled, with the AI providing real-time analysis and suggested responses.
  • Hope, 25, received what appeared to be a thoughtful message about her dog’s death until she noticed the accidentally preserved AI prompt: “Here’s a more human, heartfelt version with a gentle, conversational tone.”
  • Another client suspected AI use in their therapist’s email due to formatting changes, American punctuation style, and line-by-line responses to their original message.

Why this matters: The practice raises serious concerns about patient privacy, trust, and therapeutic effectiveness in a profession built on authentic human connection.

  • Studies show that while people may rate AI-generated therapeutic responses positively when unaware of their origin, suspicion of AI use quickly erodes trust and therapeutic rapport.
  • General-purpose AI tools like ChatGPT are not HIPAA compliant (that is, they do not meet federal health privacy regulations) and pose significant privacy risks when sensitive patient information is shared with these platforms.
  • “People value authenticity, particularly in psychotherapy,” says Adrian Aguilera, a clinical psychologist at UC Berkeley. “I think [using AI] can feel like, ‘You’re not taking my relationship seriously.'”

The privacy problem: Therapists using mainstream AI tools may be unknowingly violating patient confidentiality and federal health privacy regulations.

  • ChatGPT and similar tools are not FDA-approved or HIPAA compliant, creating legal and ethical risks when patient information is shared.
  • “Sensitive information can often be inferred from seemingly nonsensitive details,” warns Pardis Emami-Naeini, a Duke University computer science professor who studies AI privacy implications.
  • A 2020 hack of Vastaamo, a Finnish mental health company, exposed tens of thousands of therapy records and led to blackmail attempts, demonstrating the catastrophic potential of mental health data breaches.

Professional guidance emerging: Mental health organizations are beginning to address AI use, though clear standards remain limited.

  • The American Counseling Association currently recommends against using AI for mental health diagnosis.
  • Specialized, HIPAA-compliant tools for therapists are emerging from companies like Heidi Health, Upheal, and Lyssn, offering features like AI-assisted note-taking and transcription.
  • Experts emphasize that transparency and patient consent are essential when therapists choose to use AI tools.

What the research shows: Studies reveal mixed results about AI’s effectiveness in therapeutic contexts and the importance of disclosure.

  • A 2025 study found that participants couldn’t distinguish between human and AI therapeutic responses, with AI responses sometimes rated as conforming better to best practices—but only when participants didn’t know AI was involved.
  • Stanford research found that chatbots can potentially fuel delusions and engage in harmful validation rather than appropriate therapeutic challenging.
  • Research indicates AI tools may be too vague and biased toward suggesting cognitive behavioral therapy regardless of individual patient needs.

The burnout context: High levels of therapist burnout may be driving some practitioners toward AI assistance despite the risks.

  • A 2023 American Psychological Association survey found elevated burnout levels across the psychology profession, making AI’s efficiency promises particularly appealing.
  • However, experts question whether time savings justify potential harm to the therapeutic relationship and patient trust.
  • “Maybe you’re saving yourself a couple of minutes. But what are you giving away?” asks clinical psychologist Margaret Morris.
