Technosolutionism and the empathetic medical chatbot
Recently, a number of studies have shown that chatbots outperform healthcare professionals when it comes to empathy (Ayers et al., 2023; Lenharo, 2024). This is remarkable for at least two reasons. First, because empathy is broadly recognized as a core value of good healthcare (Kim et al., 2004). Second, because empathy has typically been considered an essentially human quality; indeed, the promise of technology in healthcare, most recently AI, has been to free up healthcare professionals to do what they are good at: providing empathetic care (Topol, 2019).
Therapy Bots and Emotional Complexity: Do Therapy Bots Really Empathise?
“Youper: an empathetic, safe, and clinically validated chatbot for mental healthcare.” (Youper, n.d.) This slogan is used in the marketing campaign for the therapy bot Youper, a chatbot that mimics psychotherapy, or at least draws on therapeutic methods to improve users’ mental health (Fulmer et al., 2018). Other examples are Woebot (Woebot, n.d.) and Wysa (Wysa, n.d.). Most therapy bots are based on the theory and practice of Cognitive Behavioural Therapy (CBT), and their marketing campaigns claim that they have “empathy”.
Preserving Autonomy: The “Dos” and “Don’ts” of Mental Health Chatbot Personalization
Large language models used for basic talk therapy, often referred to as mental health chatbots, are frequently personalized based on user interactions or other input. While personalization could improve the patient’s experience, it could also pose a risk to their autonomy, for example through the inappropriate use of personalized nudges.