Can AI replace therapists?
I think about this question a lot in my line of work: I'm a licensed Psychologist, a Data Scientist, and a consultant for companies in the digital health space. Ever since ChatGPT and Generative AI exploded onto the scene, everyone wants to know whether, when, and in what capacities chatbots can replace therapists.
According to this NBC News article (link here), when chatbots co-wrote emotionally supportive texts with therapists, users actually preferred the result — but only when those users did not know that a chatbot was involved. However, if the users were told that a chatbot helped write the message, the benefits disappeared!
Now, I just want to be clear that we should take this report with a grain of salt — after all, these data were never subjected to the higher level of scrutiny of a peer-reviewed publication. So, it's possible that these results aren't entirely accurate. Still, if these findings are real, they are provocative, with important implications. What might it mean for the future of AI in mental health if generating language fluently is not enough?
This report got me thinking about a lightbulb moment I had when I was in graduate school, training to be a Clinical Psychologist. I had one particularly excellent clinical supervisor who told me that the most rewarding thing I could do for my patients was really pay attention to them.
Back then I was working at a Veterans Affairs hospital, training to provide Cognitive Behavioral Therapy (CBT). One of the critical ingredients of CBT is to assign "homework," such as using a Thought Record to track so-called maladaptive thoughts and generate more adaptive thoughts. Catching these maladaptive thoughts is sort of like noticing that the check engine light in your car has turned on. If you go deeper and look under the hood, you'll usually find that automatic negative core beliefs or "scripts" have been triggered — such as "I'm not good enough" or "I'm a failure." A good CBT therapist helps you recognize and re-engineer these patterns of thinking so you can lead a life worth living.
The sticky part is this: people really have to practice these skills (catch it, check it, change it) throughout the week in order to make true change! Back in the early days of my training, the problem I encountered was that most patients did not complete their homework. And so, I arrived in my clinical supervisor's office at the VA on one sweaty San Diego afternoon for our training session, collapsed into his oversized leather chair, and complained that my patients just weren't doing their homework. He told me that the simple act of giving my deep attention and complete focus to people's work is powerfully positively reinforcing — and he was right.
I began to start every session by asking how the homework had gone. I focused my full attention on every detail of my patients' experiences, without judgment. If someone hadn't been able to complete the homework, there was no shame, blame, or admonishment. I simply wanted to understand what had unfolded, and what got in the way. I learned to attune deeply in these conversations: to the smallest microexpression, the strained tone of voice, the unusual word repeated. And under the light of attunement, I watched my patients improve in leaps and bounds as we created real behavior change together. A man with chronic pain who overdid his opioid prescription began to sleep better, take brief walks in the morning sunlight, and cut back on his pain medication. It was an amazing experience.
Fast-forward from my graduate days to 20 years later: I've transitioned from the clinic to the digital realm, where the average time spent on a website is 53 seconds. We seem to be careening toward an age of uber-for-therapists, while the new frontier of generative AI is just emerging from the gray fog along the horizon.
Somewhere along the way, we have lost touch with the unexpected power of deep attunement to others, in a non-judgmental, curious and caring way that gives rise to healing. As humans, we love capturing the attention of others. It makes us feel like the center of the universe. It makes us feel… important.
Really good therapists know how to hold the space around you like an aura that vibrates with stillness. Chatbots wield words. But as a therapist, the more mastery I gained, the better I became at wielding silence. To replace this jedi-like skill with a bot would destroy the magic of attunement with all its scarcity, status, and belonging.
In fact, pricing psychology suggests that providing low-cost therapy using AI may make therapy feel less valuable. After all, humans sometimes infer that because something costs more, it must be worth more. Will the attempt to meet demand end up devaluing therapy itself? Perhaps the right question is not how to make bots write like therapists, but how to make clients covet a bot's attention as they might a celebrity therapist's. When you generate AI art, you can select a style, like "oil painting" or "hyper realistic." Along similar lines, will companies discover that they can capitalize by fine-tuning large language models to generate a conversational style that maximizes perceived social status and engagement, rather than empathy and efficacy?
We can already see that after a few freebies, higher quality AI images can cost the user $15. Even if the original premise was to make mental health cost-effective and accessible, once a business can charge, it will. What if your “therapist credits” run out just as you begin reliving a deeply traumatic experience? Remember that licensed Clinical Psychologists are bound by a strict code of ethics that protects clients. Businesses and bots are not.
One of the critical advantages of digital mental health technology is providing touchpoints between weekly psychotherapy sessions, which can reinforce learning — like helping you practice your CBT homework. However, without any code of ethics or effective regulatory body governing digital mental health, this technology could have a dark side. Have you seen the young couple on a date, both staring intently at their phones? Now imagine teenagers becoming overly dependent on the "therapist in their pocket" to cope with life's challenges. This could undermine their ability to navigate difficult conversations on their own.
Technology itself is not the problem; the problem is how technology is used, and by whom. This is why we need better guardrails against the knife edge of capitalism in digital mental health.
What does look promising is that the AI toolbox can assist therapists, making their services more accessible, scalable, and cost-effective. For example, generative AI might check in with clients between weekly therapy appointments, offering a smidge of empathy, a CBT homework prompt, and a reminder about their next session. I support assistive technologies that can help address the chasm of unmet need for mental health services. Such technology would not replace the therapist; it would add an assistant.
In the coming age of AI, let us remember to honor what humans do uniquely well. Therapy is not merely a recipe for delivering the right words to the right patient at the right moment. Even if we don’t yet know how to quantify it, the effervescent quality of humans attuning to one another does leap across a space… and this very human quality is the crux of healing.
Originally posted on Medium.