The IMMERSE blog


Follow our Twitter account @Immerse_Project to stay up to date.

Jeroen Weermeijer

Postdoctoral Researcher

KU Leuven

27/08/2024

RoboTherapists: Coming to a Couch Near You, or Still Stuck in the Cloud?


In about two months, I will mark six years of studying the digitalization of mental health care, and what a ride it has been. When I started this journey, I envisioned digital technology as a trusty sidekick: an enhancement to how we understand and address mental health issues. Six years ago, I never would have thought AI could actually 'replace' a therapist. Yet here I am, and the idea is, to say the least, both fascinating and a little concerning. Below, I discuss three advancements that may pave the way for a 'RoboTherapist', and I end by answering whether I would personally seek advice from a RoboTherapist should it ever (or when it does?) come to exist.


Advancement 1: AI's Human-Like Voice


If you’ve watched sci-fi movies, you’ve probably noticed that AI often has a voice that’s distinctly non-human. If it does sound human, it’s usually devoid of emotional nuance. For quite some time, this was also the case in the real world. Take, for example, the 2016 Sophia robot from Hanson Robotics: her voice and manner of communication, while advanced, were still notably robotic and emotionless. Recent advances in technology have changed the game, however. With just 90 seconds of audio, software can now clone a voice almost perfectly, capturing emotional nuances like intonation and fluctuations in speaking pace. Some AI voices are so human-like that you might think you’re chatting with a real person; GPT-4o, for instance, demonstrates this progress with remarkably lifelike interactions.


Advancement 2: AI’s Ability to Understand Psychological Meaning


A common argument against a RoboTherapist is that AI can’t grasp the deeper psychological meaning behind what people say. Yet, this argument is becoming less valid as AI technology advances. Modern AI systems are designed to analyze text in ways that are increasingly similar to human understanding. They can pick up on emotional and psychological details by detecting patterns and sentiments, allowing them to interpret not just the literal meaning but also the underlying emotions. For example, AI can now perform tasks like mood detection in text. While AI might not fully match human emotional insight yet, it’s making impressive strides.
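To make the idea of mood detection concrete, here is a deliberately simplified, lexicon-based sketch. Real systems use large language models trained on vast corpora rather than hand-picked word lists; the word lists and examples below are invented purely for illustration.

```python
# Toy illustration of mood detection in text: count emotionally loaded
# words and classify the overall tone. This is a stand-in for the far
# subtler pattern and sentiment analysis that modern AI performs.
POSITIVE = {"calm", "hopeful", "better", "relieved", "happy"}
NEGATIVE = {"anxious", "worried", "hopeless", "tired", "sad"}

def detect_mood(text: str) -> str:
    """Label a message 'positive', 'negative', or 'neutral'."""
    words = [w.strip(".,!?") for w in text.lower().split()]
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(detect_mood("I felt anxious and worried all week."))    # negative
print(detect_mood("Today I finally felt calm and hopeful."))  # positive
```

A word-counting heuristic like this obviously misses sarcasm, negation, and context; the point is only to show the shape of the task, which neural models handle with much greater nuance.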


Advancement 3: AI’s Pattern Recognition


While AI can process data and generate responses, it doesn’t actually "feel" or understand in the human sense. This limitation highlights the fundamental difference between human and artificial intelligence: AI’s responses come from patterns and algorithms, not from genuine empathy or lived experience. However, detecting patterns, and making patients aware of them, is a central element of what clinicians do in therapy. This raises an intriguing question: if you recorded hundreds of hours of therapy sessions and fed that data into an AI, could it develop the pattern recognition necessary to provide its own form of therapy?
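To illustrate what cross-session pattern recognition might look like in its most basic form, here is a toy sketch that surfaces themes recurring across transcripts, much as a clinician might notice a topic a patient keeps returning to. The theme list and transcripts are invented for illustration; a real system would learn themes from data rather than match keywords.

```python
from collections import Counter

# Hypothetical theme vocabulary and mock session transcripts.
THEMES = {"work", "sleep", "family", "mother", "deadline"}

sessions = [
    "We talked about work again, and the deadline stress at work.",
    "Sleep has been poor; work keeps me up at night.",
    "Another difficult week at work, mostly around the deadline.",
]

def recurring_themes(transcripts, min_sessions=2):
    """Return themes mentioned in at least `min_sessions` transcripts."""
    counts = Counter()
    for t in transcripts:
        words = {w.strip(".,;!?") for w in t.lower().split()}
        counts.update(words & THEMES)  # count each theme once per session
    return {theme for theme, n in counts.items() if n >= min_sessions}

print(sorted(recurring_themes(sessions)))  # ['deadline', 'work']
```

Keyword counting is of course nothing like clinical insight; the sketch only shows that "noticing what keeps coming up" is, at bottom, a pattern-detection task, which is exactly the kind of task machine learning is built for.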


Why I Would Still Choose Human Therapy


To summarize, modern AI can now clone voices with emotional nuance, approximate psychological meaning by analyzing text, and, despite lacking true empathy, could perhaps deliver a form of therapy through advanced pattern recognition. However, the interaction is not genuinely human, and I am not sure whether I personally could share my deepest thoughts and feelings with something rather than someone. It’s a bit like the red pill/blue pill dilemma from The Matrix: taking the blue pill means enjoying a comforting illusion, while the red pill means facing a potentially unsettling reality. Opting for an AI therapist is like choosing the blue pill: you enjoy the illusion of human interaction without confronting the fact that it’s still an artificial construct. AI might offer a convincing facade of empathy and understanding, but in my opinion it fundamentally lacks the lived experience, emotional depth, and genuine human connection that real therapists provide. Just as the red pill reveals the truth behind the illusion, knowing that an AI isn’t truly human can be a stark reminder of its limitations.

After nearly six years in this field, I remain hopeful that technology will keep supporting therapy, as we aim to do with our DMMH tool, rather than replace the deeply human touch that is essential for true therapeutic success. Still, I cannot help but wonder what others would do if given the choice: would you visit a RoboTherapist?