Of all the possible applications of artificial intelligence (AI), one of the most controversial is its use in therapy. AI chatbots – tools that allow a human user to talk to an AI stand-in for a real therapist – have proliferated, promising to increase access to mental health care for people who otherwise would not be able or willing to seek out a human professional. Unfortunately, so have the potential harms associated with AI in this highly sensitive space.
What’s the truth about AI therapy? Do AI chatbot therapists actually work? We asked a real-life human – Wes Williams, PhD, WellPower’s vice president and chief information officer – for his thoughts. While we were expecting a definitive and emphatic “NO,” what Williams actually said surprised us. Read on for his take on AI therapy.
AI Therapy Came Up Fast
Part of what makes this question feel so disorienting is how suddenly it materialized. “AI has come on really quickly and there’s a lot of evidence that single interactions with AI can produce helpful suggestions for people,” said Dr. Williams. “And the capabilities of chatbots and LLMs [large language models] do seem to hold a lot of promise. We’ve been talking about a therapist shortage for years, and how AI could be a force multiplier, and here we have tools that can do some of that.”
With 23% of adults in the US – about 60 million people – experiencing a mental health concern in a given year, anything that expands access to care could improve, and even save, lives. It’s also important, though, that AI solutions don’t do more harm than good.
Do AI Therapists Work?
When we talk about the effectiveness of AI therapy bots, it’s important to start by asking: what makes therapy helpful in the first place?
Dr. Williams explained the current thinking around three main aspects of therapy that may contribute to its effectiveness.
- First, the protocols. Standardized, “manualized” treatment tools that have been validated by research allow therapists to follow a widely accepted roadmap. Approaches like cognitive behavioral therapy (CBT), dialectical behavior therapy (DBT) and others you may have heard of fall into this category of evidence-based treatments that work best when followed closely.
“If the approach is manualized – follow these standardized steps to treat concerns – a bot could be even more impactful than a human because of its closer adherence to the method,” Dr. Williams said. “If you think that’s why therapy works, then a chatbot should be good.” One caveat: today’s AI chatbots show a lot of room for improvement, but with continued development, they could someday be highly effective at this more prescriptive approach to therapy.
- Second, the human element. Two people connecting in a safe place can be powerful – and highly effective when working through sensitive topics. From this perspective, the empathetic, understanding dialogue is the most important part of therapy, and is exactly what makes it work so well. This is especially true for those who see potential value in many different protocols (see above): if they all work, the human part may be the point.
The fundamental premise of AI chatbots, though, is that they remove the human altogether – that’s the point. “If that’s the case, I’m not sure a chatbot would work as well,” said Dr. Williams. “There are more unanswered questions about how it would work.”
- Third, the confidentiality. With the promise that what’s said in a therapy session stays between you and your therapist, you’re willing to say things you wouldn’t say otherwise. Talking to friends and family can be helpful in working through many problems; for highly sensitive or potentially embarrassing subjects, though, it’s hard to beat talking with someone who adheres to strict professional ethics codes around confidentiality. Even so, some people find it hard to feel comfortable talking to another human at all.
“In that case, maybe a chatbot is even better because you’re not talking to a person,” Dr. Williams offered. “But then that raises issues of trust – talking to computers is less anonymous than people believe.”
“All that being said, I think it’s our responsibility as a field to explore this, because there is a big opportunity,” said Dr. Williams. “If the reason therapy works is the human connection, it might be more effective to use AI to make those people more efficient than to take the people out.”
So how is WellPower, Colorado’s largest community mental health center, approaching AI for therapy?
AI and Therapy at WellPower
Here at WellPower, our top priority is safety for the people we serve and our staff. This applies to AI as well – we are focused on exploring and adopting AI tools that we know we can deliver safely. “We’re rolling out AI for phone transcriptions, clinical documentation, revenue cycle management, meeting minutes, improved search for internal policies and documents and measurement-based care – things that are very safe,” Dr. Williams explained. “We’re enthusiastically embracing use cases that seem to be safe and have good protections.”
Organizations like WellPower follow strict rules, such as the Health Insurance Portability and Accountability Act (HIPAA) and 42 CFR Part 2, which govern patient rights around their health information. All of the tools – AI and otherwise – we use at WellPower adhere to those rules and more. We take privacy and security very seriously.
One of the most fundamental parts of our official WellPower AI policy is that any AI tool needs to be what’s known as an “enterprise version,” which comes with more advanced safety and data security features than the typical off-the-shelf consumer version. For example, our version of Microsoft’s Copilot, which we use for internal efficiency, has multiple layers: OpenAI’s GPT models (the technology behind ChatGPT) form the base layer, which Microsoft licenses for Copilot, which WellPower uses in a specific version that draws on internal organizational data.
“When we’ve done our ethics reviews around products and vendors – they either agree not to use corporate data at all, or allow people to opt out,” Dr. Williams explained. “People have rights to their data and how it’s being handled in a model.” The most ethical companies have thought this through and make it possible for people to change their minds at any point and have their data removed.
How to Try AI Therapy Safely
“If you’ve found a free therapy bot online, you can assume that they’re using your data,” Dr. Williams said. Even when you’re paying for AI therapy, the company behind the tool is most likely using your data to improve the quality of the tool itself. This raises a range of data privacy and security issues. “That’s why it’s important to go to a trusted organization like WellPower and have them provide an AI model that’s been negotiated in a way that protects safety.”
But for people who want to take an AI therapy bot for a spin on their own? Dr. Williams recommends keeping it brief and focused. “I think there’s relatively little risk in asking AI a single-session question – ‘I’m stuck, how do I proceed?’ That’s pretty safe. If you’re using it as something you go to regularly for a year, that’s probably not a good idea.”
It’s also important to keep some key terminology in mind as you explore options. According to Dr. Williams, “If you’re receiving ‘therapy’ from a human, there are rules around that designation. ‘Coaching’ doesn’t have the same rules.” A similar distinction can apply to AI tools: a product explicitly packaged as a “therapy bot” may be more likely to be built on actual therapeutic approaches and guardrails than a “wellness” or “coaching” app. But even when a tool is marketed as an AI therapist, take all claims with a grain of salt – especially as we continue to learn about the limitations of AI in the therapeutic world.
How to Access Expert Care from Humans
If you or someone you know needs urgent support, call, text or chat 988, the Suicide & Crisis Lifeline. In Colorado, you can also visit a walk-in center for immediate, in-person help in a crisis. Denver’s walk-in center (operated by WellPower) is at 4353 E. Colfax Ave. Find the location closest to you here.