Can a chatbot heal emotions? Picture: iStock
Ever since the Tamagotchi virtual pet was launched in the late 90s, humankind’s relationship with machines and technology has slowly ramped up to where we are today. In a world where binary code controls almost every action and reaction, the way we communicate has changed. We either talk to one another through machines or cut out people completely and chinwag with chatbots. And it’s everywhere.
Mental health support has joined the autotune queue. Generative artificial intelligence tools programmed in the therapeutic space deliver quick access, affordability and machine empathy on demand. Virtual assistants like Woebot and Wysa reach out their virtual hands of measurement and method. These platforms track moods, prompt reflective moments and dish out neatly packaged advice drawn from deep within their code. Their appeal is obvious, said medical doctor and psychologist Dr Jonathan Redelinghuys. “They’re anonymous, instant and never overbooked.”
AI-based chatbots significantly reduced symptoms of depression
A review published in 2023 considered more than 7 000 academic records, narrowed them down to 35 studies and came to interesting conclusions. It found that AI-based chatbots significantly reduced symptoms of depression and distress, especially when embedded into instant messaging apps. While results were promising for clinically diagnosed patients and elderly users who may teeter on the edge of mental wellness, the same review noted that the technology didn’t significantly improve broader psychological well-being.
Relief, yes. Recovery, not so much, said Dr Redelinghuys. “The usefulness of technology should not be confused with therapeutic depth,” he said. “There’s value in having something to turn to in moments of need, but that doesn’t make it therapy. Therapy is relational. It’s anchored in nuance and emotional feedback, which a machine just doesn’t have.” Emotional intelligence is still a human trait and while a computer or an app can pretend to understand, it does not and cannot process grief, shame or longing. “It can’t notice when someone’s about to cry but doesn’t. It won’t pause, adjust tone or sit in silence when silence says more than words,” said Dr Redelinghuys.
AI can’t notice when someone’s about to cry
A 2019 review from the University of California explored how AI could predict and classify mental health issues using everything from electronic health records and brain imaging to smartphone data and social media activity. The findings showed strong predictive capabilities, but limitations in scale and applicability. Most of the underlying studies were small, and there is a risk of overgeneralisation when mental health is, well, unique to each individual.
Human therapists adapt on the go based on patient input, said Dr Redelinghuys. “Humans pick up what’s not being said, read body language and know when to sit back or take note. A machine can’t go beyond what it was programmed to do. It can learn language, it can talk back, but it can’t feel you. “Therapy is a process that involves building a relationship with someone who gets to know you over time. Support isn’t always about saying the right thing because it or you are hardwired to do so. Sometimes it’s about sitting with someone in discomfort until they find their own way through.”
Healing is not plug-and-play
Remember, said Dr Redelinghuys, healing is not a plug-and-play process. The role of AI can be supportive and even provide a measure of comfort, he said. “But it cannot replace humanness.”
Online, opinions vary on channels like Reddit. Some users report positive outcomes with chatbots, especially in managing day-to-day anxiety or spirals. Others use them for mood tracking, diary prompts and even crisis moments. But those dealing with trauma, identity confusion or challenging emotional issues often find AI support limited and, as one user called it, emotionally sterile. “Uncoded, or human, therapists come with ethical standards, formal training and legal responsibilities. They are accountable,” said Dr Redelinghuys. “Chatbots and their programmers are not held to answer. Confidentiality might be implied, but there are no professional boards or licensing bodies governing a chatbot’s conduct. Data privacy is a real concern.”