Presenters
Kris Goffin
Katleen Gabriels
Kind of session / presentation

Therapy Bots and Emotional Complexity: Do Therapy Bots Really Empathise?

“Youper: an empathetic, safe, and clinically validated chatbot for mental healthcare.” (Youper, n.d.) This slogan is used in the marketing campaign of the therapy bot Youper, a chatbot that mimics psychotherapy, or at least draws on therapeutic methods to improve users’ mental health (Fulmer et al., 2018). Other examples are Woebot (Woebot, n.d.) and Wysa (Wysa, n.d.). Most therapy bots are based on the theory and practice of Cognitive Behavioural Therapy (CBT). Marketing campaigns for these therapy bots claim that they have “empathy”. Moreover, preliminary research at Google Research (Tu et al., 2024) suggests that Large Language Models perform well in empathetic communication and even achieved greater diagnostic accuracy than (human) primary care physicians.

This presentation seeks to analyse the concept ‘empathy’ in the context of therapy bots. Empathy researchers distinguish between two kinds of empathy (e.g., Healey & Grossman, 2018): imagining what a person feels in their inner world (cognitive empathy) and having an emotional or caring response to a person’s inner world (affective empathy). No one claims that therapy bots have affective empathy, as they are not (yet?) capable of affective empathetic responses. It is debatable whether this is an advantage or a disadvantage. We will argue that, following the guidelines of CBT practice, affective empathy is not necessary for good therapy practices.

When companies emphasise that their bots have empathy, they refer to cognitive empathy: the bot can predict which output is relevant to give, given a person’s emotional state. We will argue that cognitive empathy is necessary for good therapy practices. However, therapy bots lack a sophisticated emotional understanding. Drawing on emotion theory and emotion science (see Goffin, 2022), we will argue that therapy bots currently rely on a far too simplistic system of emotional categorisation. It is therefore incorrect to attribute ‘empathy’ to these therapy bots, even when ‘empathy’ is defined in a liberal way (i.e., as the kind of cognitive empathy that is relevant for good practices in CBT). This also raises compelling ethical questions. Besides improving the emotional abilities of their bots, the companies behind therapy bots need to be more transparent about the current capabilities of the bots’ ‘empathetic’ skills, to prevent misleading people.