
Could AI Chatbots Be Future Therapists? A Clinical Psychology Podcast Episode.


Considering the popularity of Artificial Intelligence in recent years, following the release of ChatGPT and other large language models, we need to ask whether these tools have any implications for psychologists. To explain simply, a large language model is a piece of artificial intelligence trained on millions, if not billions, of pieces of conversational text, so the AI knows what to say and how to respond at a given moment depending on the language input, or prompt, a user gives it. That is an oversimplified explanation of the language models these chatbots run on. Therefore, in today’s episode we’ll be looking at the pros and cons of how these chatbots could be used as future therapists. If you enjoy learning about psychotherapy, clinical psychology and the future of psychology then you’ll love today’s episode.

Today’s episode has been sponsored by Abnormal Psychology: The Causes And Treatments Of Depression, Anxiety and More. Available from all major eBook retailers and you can order the paperback and hardback copies from Amazon, your local bookstore and local library, if you request it. Also, you can buy the eBook directly from me at

Are There Any Signs Chatbots Are Useful In Therapy?

Before we actually dive into today’s episode, I want to be honest and upfront with all of you that I do have massive problems with how these Artificial Intelligence models are built. I don’t have a problem with artificial intelligence itself, because every technology can be used for good and for bad. Just look at the great benefits the internet has given us as a species, but also the damage the internet can do at the same time.

My problem is a more specific issue that affects me as an author rather than as a psychologist, and I won’t explore it in depth here. My problem is how these Artificial Intelligence programmes are taking copyrighted works and training their models on them without permission, which is illegal. If you’re interested in finding out more, you can look up the lawsuits currently going on, especially those surrounding AI art.

And this is why I’ve put off doing this topic for so long.

Anyway, whilst I’ve learnt a lot about mobile mental health apps in recent years, including how chatbots are used in those apps, I wanted to focus a little more on the research side, because there are early signs that chatbots can be used as psychotherapists.

Research shows chatbots are promising for certain types of therapy that are structured, skills-based and concrete (Abd-Alrazaq et al., 2019), making chatbots potentially effective for cognitive behavioural therapy, health coaching and dialectical behaviour therapy.

Chatbots are also effective for helping people stop smoking (Whittaker et al., 2022), and chatbots are being used in almost every single industry on the planet. Especially as whenever we go on Amazon, Google, YouTube and so on, we are training artificial intelligence on our preferences.

Moreover, and this is something I learnt when I was investigating mobile mental health apps during my academic placement year, there are apps, like Woebot, that use chatbots “based” on cognitive behavioural therapy to help people’s mental health. I say “based” in air quotes because the problem with the literature in this area is that it hasn’t really been empirically validated, but that is definitely beyond the scope of this podcast episode.

Anyway, Chatbots are already being used on mobile mental health apps as therapists with some effectiveness.

What Could The Benefits Of Chatbot Therapists Be?

Personally, I think this is a really interesting question because a lot of people don’t think there would be any but let’s really think about it.

Firstly, chatbots would be absolutely amazing for public health services because they’re cheap, accessible to everyone with a phone and scalable. If chatbots are used correctly, they could bring mental health services to more people, in the comfort of their own homes and in their own time.

Secondly, a very interesting idea is that chatbots could be good for personalising therapy. It’s important to note that ChatGPT generates conversations and answers based on what the person inputs, making it more likely to respond personally to the client compared to older, less effective chatbots.

Thirdly, there is an argument that chatbots could help connect clients to more psychoeducation resources. The current “problem” is that if a therapist wants to give a client a particular resource to use outside of the therapy session, the human therapist needs to remember to do so, find the link or reference, send it to them and know exactly what resource would be good for that particular client in that particular moment.

Chatbots could connect clients to a particular website, book or online tool instantly by giving a link the moment they need it. Therefore, clients might be able to get more psychoeducation through chatbots.

Finally, and I think this is an important one because clinical psychology is a science, chatbots allow therapies to be uniform, standardised and trackable. This is important because chatbots can deliver a more standardised and predictable set of responses, allowing researchers to review and analyse these interactions later on.

However, I personally think that chatbots will only ever be able to augment psychotherapy alongside a human therapist because you need that human interaction too.

Or do you?

What Could The Limitations and Challenges of Chatbot Therapists Be?

Firstly, the biggest problem, and this is something I kept finding when I was researching these mobile mental health apps, was retention rates. People are more likely to show up and be accountable to human therapists than to chatbots, and user engagement with mental health apps is very, very questionable. Kaveladze et al. (2022) found that only 4% of users continued to use a mental health app after 15 days, and only 3% continued after 30 days. When we consider that CBT typically takes 3 months of weekly sessions to bring about therapeutic change, this is extremely worrying.

Secondly, another major problem I found when I was researching mobile mental health apps was the need for improved data security, privacy and transparency. These are very questionable and potentially unethical uses of very sensitive data, because users have no idea how their data and discussions about extremely personal topics are being used by these massive companies. This is even more alarming when only 2% of mental health apps have research to back up their claims about effectiveness and user experience (Wei, 2022).

Thirdly, I am strongly against artificial intelligence being used in high-risk cases, because AI augmentation with human oversight is safer than AI replacement in these kinds of situations. High-risk situations include suicide assessment, crisis management and other mental health difficulties that would typically be seen by Tier 4 Child and Adolescent Mental Health Services (CAMHS) in the UK. This is even worse when we consider the open legal and ethical questions surrounding who is liable in cases of faulty AI. We have no idea who is responsible if a chatbot therapist fails to assess or manage a mental health crisis, including suicidality. We also don’t know whether a chatbot therapist will alert a human therapist, or at least flag the client to them, if a client is self-harming, suicidal or poses a risk to others.

These questions are important to saving a person’s life and the lives of others and until these questions are ethically and legally answered then I will always be opposed to AI chatbots being therapists to high-risk clients.

Finally, and I feel this is the most important limitation of all: a chatbot cannot have the level of empathy required in certain therapeutic situations. Research shows that even if a chatbot offers a person empathic language and writes the right words, this isn’t always enough. You still need human-to-human interaction in certain emotional situations, like venting to someone or expressing anger. This might have been shown best by Tsai et al. (2021), who found that when clients were angry they were less comfortable and satisfied with a chatbot compared to a human.

People also don’t always feel heard or even understood when they don’t have a human at the other end of a conversation. The therapeutic alliance might depend on the human-to-human connection between the therapist and the client, because the client might want another human to witness their difficulties and suffering. An AI replacement will likely never work in all these situations.

Clinical Psychology Conclusion

Personally, setting aside my concerns about copyright and artificial intelligence, I am not against artificial intelligence being used in therapeutic settings within psychology and mental health. I think there is a place for it, but chatbots can never and should never replace human therapists, because humans are a social species and we need that social connection in therapeutic settings.

I think chatbots will only ever be able to augment psychotherapy alongside a human therapist, and that’s relatively okay, I don’t mind that.

I just don’t think chatbots will ever be able to replace human therapists. When we consider that depression is one of the most common mental health conditions, and the extremely close relationship between depression and suicide, I don’t think chatbots should ever be allowed to replace therapists, in case suicidal and other high-risk clients slip through the cracks.

I really hope you enjoyed today’s clinical psychology podcast episode.

If you want to learn more, please check out:

Abnormal Psychology: The Causes And Treatments Of Depression, Anxiety and More. Available from all major eBook retailers and you can order the paperback and hardback copies from Amazon, your local bookstore and local library, if you request it. Also, you can buy the eBook directly from me at

Have a great day.

Clinical Psychology and Cyberpsychology References

Whittaker, R., Dobson, R., & Garner, K. (2022). Chatbots for Smoking Cessation: Scoping Review. Journal of medical Internet research, 24(9), e35556.

Abd-Alrazaq, A. A., Alajlani, M., Alalwan, A. A., Bewick, B. M., Gardner, P., & Househ, M. (2019). An overview of the features of chatbots in mental health: A scoping review. International journal of medical informatics, 132, 103978.

Kaveladze, B. T., Wasil, A. R., Bunyi, J. B., Ramirez, V., & Schueller, S. M. (2022). User Experience, Engagement, and Popularity in Mental Health Apps: Secondary Analysis of App Analytics and Expert App Reviews. JMIR human factors, 9(1), e30766.

Camacho, E., Cohen, A., & Torous, J. (2022). Assessment of mental health services available through smartphone apps. JAMA Network Open, 5(12), e2248784.

Goldberg, S. B., Lam, S. U., Simonsson, O., Torous, J., & Sun, S. (2022). Mobile phone-based interventions for mental health: A systematic meta-review of 14 meta-analyses of randomized controlled trials. PLOS Digital Health, 1(1), e0000002.

Garland, A. F., Jenveja, A. K., & Patterson, J. E. (2021). Psyberguide: A useful resource for mental health apps in primary care and beyond. Families, Systems, & Health, 39(1), 155–157.

Tsai, W. S., Lun, D., Carcioppolo, N., & Chuan, C. H. (2021). Human versus chatbot: Understanding the role of emotion in health marketing communication for vaccines. Psychology & marketing, 38(12), 2377–2392.

CBT at your fingertips: A review of mHealth Apps and their ability to deliver CBT to users. (Under Submission)

I truly hope that you’ve enjoyed this blog post and if you feel like supporting the blog on an ongoing basis and get lots of rewards, then please head to my Patreon page.

However, if you want to show one-time support and appreciation, the place to do that is PayPal. If you do, please include your email address in the notes section so I can say thank you.

Which I am going to say right now. Thank you!

Click for a one-time bit of support.


