Artificial Intelligence (AI) can be defined as ‘the ability of a system to understand external data, learn from it and accomplish specific goals through adaptation’.
ChatGPT is an AI-based chatbot that simulates human conversation. Asking ChatGPT to define AI produces a much more digestible answer than the one I offered above. ChatGPT describes AI as ‘a simulation of human intelligence in machines that allows them to perform tasks that would typically need human cognition’. ChatGPT, and AI more generally, is being used more and more in mental healthcare, primarily in the detection of mental health difficulties and increasingly in assisting with psychotherapy. As AI evolves at pace, it raises the question: if it can outperform me in kicking off this blog, where else could it wield its power?
A recent systematic review looked at how good the evidence is for AI in mental healthcare, exploring diagnosis, monitoring and intervention. In the area of treatment, this review found that AI chatbots showed variable performance in treating mental health conditions including depression and anxiety.
Studies that showed positive effects included a chatbot rooted in CBT principles (CBT itself has a proven evidence base in treating a range of mental health conditions) and several that were more effective than treatment as usual in supporting people experiencing suicidal, depressive and anxiety symptoms. One study showed an improvement in anxiety and depression (although no better than CBT delivered by a person) and, for high-frequency users, a reduction in loneliness. Other studies found only a small improvement in symptoms, or that the best effect came from a combination of AI and human support.
This tiny snapshot of studies offers a fascinating glimpse into the beginnings of AI in mental health treatment. For instance, the idea that a chatbot is better than treatment as usual for someone in crisis in an A&E department is potentially hard to understand and merits a further look at that study. This study showed that the chatbot helped people cope with thoughts of suicide and also helped them feel calmer and less agitated.
In digging into this study, you can see that the AI app is based on the CAMS approach and also includes aspects of DBT, both of which are proven to be effective in helping people cope with thoughts of suicide. So perhaps the finding that this app/chatbot helps people cope with thoughts of suicide begins to look more understandable. It is also important to note that treatment as usual in this case is waiting to see a clinician (both groups saw a clinician for their mental health after a wait).
So, what about the finding that saw a reduction in agitation and distress?
The group of people who got the app (only 14 people in total) also had someone sit in with them for two hours as they were using it. The impact of having another person present while waiting alone in the emergency department was not factored into this trial, so we can’t be sure whether this, too, helped reduce the experience of agitation and distress, a limitation which the authors also mention. Also, a ‘placebo tablet’ was not offered, so it can’t be said definitively whether the improvement came from having something interactive to do, from having someone in the room, or from the app itself. Nonetheless, the authors wondered whether people in crisis would find this an acceptable or tolerable intervention in the emergency department, and they did.
I offer this review to show that perhaps the devil is in the detail of potentially headline-grabbing research that posits the takeover of AI in mental health.
The systematic review also notes some of the downsides of AI therapy, including observations from an author in 2018 who saw the limitations as the following: AI not being able to read body language cues and tone of voice, the interaction sometimes feeling unnatural, and the chatbot’s replies being irrelevant or not interactive enough. In the nearly seven years since these limitations of AI were written, I would argue ChatGPT has evolved significantly. Take, for instance, this very sophisticated and human-sounding response to my question of whether AI will take over psychotherapy:
“AI can be a helpful tool in psychotherapy, but it won’t fully replace human therapists. When it comes to real healing, feeling truly seen and heard and understood, humans can still do that better.” Italics are mine, as I wonder what the answer might be if I ask the same question in another seven years.
Drawing from my own clinical experience of working alongside people for nearly 20 years, I have very rarely seen someone find their way in a recovery journey through one thing alone, whatever that is. Invariably recovery or healing, whatever word we use, involves relationship, finding meaning, connection, inward reflection and taking small steps for big changes. And continuing to do this. Perhaps AI will help with this, maybe in ways we don’t yet know.
For now, however, we know that human connection and support can help when life feels overwhelming. At Aware, we are proud to offer the Solace Café, which is based on a recovery-orientated approach for people experiencing mental health crisis. We also have a range of skills-based mental health education programmes and mindfulness groups to empower people to better understand mental health and cope with anxiety and depression. You can find more information here.
This blog is by Dr Susan Brannick, Clinical Director at Aware, as part of a monthly blog series.