“I Know How You Feel”: A Quantitative Study on Factors Affecting Empathy with Artificial Conversational Chatbots
Abstract
Introduction
There is a worldwide shortage of mental health workers, with a median of 9 workers per 100,000 people and waiting periods for therapy of up to two years. ‘Chatbots’, artificial intelligence-based conversational agents (CAs), are often presented as a promising solution because they offer support that is available at any time. However, because adherence to chatbot-delivered interventions remains unclear, CAs currently play only a supplementary role to human therapists. Empathy is critical for developing a robust working alliance, which in turn is essential for sustaining a successful intervention. Consequently, the present project aims to identify the core features of virtual CAs responsible for clients’ perceived empathy.
Factors Affecting Empathy
Based on Rogers’ person-centered approach [1], the ultimate criterion for determining empathy in the therapeutic rapport is the client’s perception of the therapist’s attitude. Previous research suggested four factors affecting closeness with CAs: physical characteristics, conversational ability, rapport-building behaviours, and program errors [2]. Additionally, ten conversational cues that affect empathy in text-based conversations with CAs have been identified [3].
Methods
Each participant conversed virtually with two CAs that varied in their physical characteristics and conversational abilities. Participants then self-reported their perceived empathy and shared their overall empathic perceptions in a semi-structured debriefing interview. A directed content analysis was used to code the conversations with the CAs according to the ten empathic cues [3], and the correlations between these cues and the self-reported perceived empathy were analysed.
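As a minimal illustration of this last analysis step, the sketch below (hypothetical Python; the cue names, counts, and empathy scores are placeholders, not study data) computes Spearman rank correlations between the coded frequency of each empathic cue and participants’ perceived-empathy scores.

    # Illustrative sketch only: cue names and values are hypothetical placeholders.
    from scipy.stats import spearmanr

    # One value per conversation: coded frequency of each empathic cue [3]
    # and the participant's self-reported perceived empathy for that conversation.
    cue_counts = {
        "acknowledging_feelings": [2, 0, 3, 1],
        "encouraging":            [1, 2, 0, 2],
        # ... the remaining cues would follow the same pattern
    }
    perceived_empathy = [4.5, 3.0, 5.0, 3.5]  # e.g., questionnaire mean scores

    for cue, counts in cue_counts.items():
        rho, p = spearmanr(counts, perceived_empathy)
        print(f"{cue}: rho = {rho:.2f}, p = {p:.3f}")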
Preliminary Results and Discussion
Despite its importance, empathy with therapeutic CAs has so far been assessed only secondarily, through qualitative findings. The present study is the first to analyse clients’ perceived empathy with experimental quantitative measures. It is expected to clarify whether empathy is present in virtual conversations with CAs and to identify methodological features for the future improvement of empathic CAs and of the field of artificial psychotherapy.
References
[1] C. R. Rogers, Counseling and Psychotherapy. Boston, MA: Houghton Mifflin, 1942.
[2] K. Loveys, C. Hiko, M. Sagar, X. Zhang, and E. Broadbent, “‘I felt her company’: A qualitative study on factors affecting closeness and emotional support seeking with an embodied conversational agent,” International Journal of Human-Computer Studies, vol. 160, p. 102771, 2022.
[3] D. Richards, A. A. Bilgin, and H. Ranjbartabar, “Users’ perceptions of empathic dialogue cues: A data-driven approach to provide tailored empathy,” in Proceedings of the 18th International Conference on Intelligent Virtual Agents (IVA ’18), 2018, pp. 35–42.