Real-time Facial Emotion Recognition for Human-Robot Interaction

Authors

  • Jelena Epifanic, Comenius University Bratislava

Abstract

Introduction

Emotions guide our decisions, affect our behaviour, and play a crucial role in human-human interaction. Given that machines, especially robots, collaborate with and assist humans in diverse domains, it is important that they also possess the ability to engage affectively with people. Researchers have been studying the importance of human-robot interaction (HRI) and working on the development of "affective computing" solutions for more than three decades [1], with the aim of endowing machines with the skills to handle affective data, interpret emotions, and respond appropriately. Such systems should be able to interact with people in a more natural, spontaneous, and efficient manner.

Although emotional states are conveyed through different modalities, the majority of emotion recognition solutions have focused on facial emotion recognition [2]. With recent advances in artificial intelligence, especially in deep learning [3], and in hardware technologies, both the number and the accuracy of these applications have increased rapidly.

Our project focuses on the development and implementation of a facial emotion recognition system, together with appropriate facial responses from the robot. It will serve as the basis for the future development of more advanced designs that also consider context and other modalities of emotion recognition. The goal is to enrich HRI and enable further studies of the social and psychological aspects of this interaction.

Methods

With the objective of enhancing HRI, we will develop a system for real-time facial emotion recognition and appropriate robot facial responses. To achieve improved accuracy on the emotion classification task, we will fine-tune the pre-trained VGG-16 model using the AffectNet dataset. The classifier will identify six emotions (sadness, happiness, anger, fear, disgust, and surprise) plus a seventh, neutral class. Additionally, we will construct a lightweight deep learning network that detects facial feature points to map head pose and facial expressions. In the final stage of the project, a robot mirroring response will be generated in real time from 68 detected facial feature points, enabling the robot to mimic human facial expressions effectively.
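To make the classification step concrete, the sketch below fine-tunes a pre-trained VGG-16 for the seven emotion classes in PyTorch. The hyperparameters, the data directory layout, and the use of an ImageFolder wrapper for AffectNet are illustrative assumptions rather than the project's actual configuration.

# Sketch: fine-tuning a pre-trained VGG-16 for 7-class facial emotion
# recognition. Hyperparameters and the AffectNet folder layout are
# illustrative assumptions, not the project's actual configuration.
import torch
import torch.nn as nn
from torchvision import models, transforms, datasets
from torch.utils.data import DataLoader

NUM_CLASSES = 7  # sadness, happiness, anger, fear, disgust, surprise, neutral

# Load VGG-16 with ImageNet weights and replace the final classifier layer.
model = models.vgg16(weights=models.VGG16_Weights.IMAGENET1K_V1)
for param in model.features.parameters():
    param.requires_grad = False  # freeze the convolutional backbone
model.classifier[6] = nn.Linear(4096, NUM_CLASSES)

# Standard ImageNet-style preprocessing for 224x224 face crops.
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# Hypothetical layout: AffectNet face crops arranged in one folder per emotion.
train_set = datasets.ImageFolder("affectnet/train", transform=preprocess)
train_loader = DataLoader(train_set, batch_size=64, shuffle=True, num_workers=4)

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = model.to(device)
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(
    filter(lambda p: p.requires_grad, model.parameters()), lr=1e-4)

# Minimal training loop (one epoch shown; a real run would add validation,
# class balancing, and checkpointing).
model.train()
for images, labels in train_loader:
    images, labels = images.to(device), labels.to(device)
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()

The second sketch illustrates the real-time 68-point landmark stage. Here dlib's off-the-shelf 68-point shape predictor stands in for the lightweight network we plan to build, and send_to_robot is a hypothetical placeholder for the interface that maps landmarks to the robot's facial actuators.

# Sketch: real-time 68-point facial landmark capture. dlib's pre-trained
# predictor is a stand-in for the planned lightweight network; send_to_robot
# is a hypothetical placeholder for the robot-side mirroring interface.
import cv2
import dlib

detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

def send_to_robot(points):
    """Placeholder: map the 68 (x, y) landmarks to robot facial actuators."""
    pass

cap = cv2.VideoCapture(0)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for face in detector(gray):
        shape = predictor(gray, face)
        points = [(shape.part(i).x, shape.part(i).y) for i in range(68)]
        send_to_robot(points)
cap.release()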

References

[1] R. W. Picard, Affective Computing, Cambridge: The MIT Press, 2000.

[2] N. Rawal and R. M. Stock-Homburg, “Facial emotion expressions in human–robot interaction: A survey,” International Journal of Social Robotics, vol. 14, no. 7, pp. 1583–1604, 2022. doi:10.1007/s12369-022-00867-0

[3] S. Li and W. Deng, “Deep facial expression recognition: A survey,” IEEE Transactions on Affective Computing, vol. 13, no. 3, pp. 1195–1215, 2022. doi:10.1109/TAFFC.2020.2981446

Published

2023-06-05