Framework for the Planning of Affordance-Based Explanations in Robot Path Planning

Authors

  • Amar Halilovic, Ulm University

Abstract

Introduction

Offering explanations for robot actions has favorable effects on both human trust and understanding, and an explainable robot is also perceived as more socially adept [1]. Our focus is robot path planning (navigation) in indoor environments and explaining it to humans. We introduce a framework for generating affordance-based explanations in robot path planning. Leveraging the affordance concept from ecological psychology [2], our framework integrates affordances into the process of explanation generation, which in turn is embedded in the automated planning of robot actions.

Methodology

To contribute to compliance between humans and autonomous robots, we present HiXRoN (Hierarchical eXplainable Robot Navigation), a hierarchical framework for explaining a robot's path-planning choices during navigation. We follow an idea rooted in affordance theory from ecological psychology [2], which holds that objects have inherent affordances. Affordances tell the robot which actions an object supports, e.g., that a chair can be moved. To interact with objects and exploit their affordances, a robot must know about them. This knowledge can come from various sources (sensing, manual definition, learning); we assume the robot holds it in an ontology (a symbolic database). To stay aware of its surroundings, the robot must continuously update its knowledge about the spatial attributes of the objects in the environment. We therefore equip the robot with spatial reasoning abilities based on different spatial coordinate systems, which give it a sense of direction, i.e., left, right, etc.
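
As an illustration, the following Python sketch shows how object affordances could be stored symbolically and how a qualitative direction such as "left" or "right" could be derived from the robot's coordinate frame. The class and function names, as well as the tolerance value, are assumptions made for this sketch and do not reproduce HiXRoN's actual interfaces.

    import math
    from dataclasses import dataclass, field

    @dataclass
    class SceneObject:
        """One entry of the symbolic knowledge base (illustrative structure)."""
        name: str
        x: float          # position in the map frame (metres)
        y: float
        affordances: set = field(default_factory=set)   # e.g. {"movable", "sittable"}

    def qualitative_direction(robot_x, robot_y, robot_yaw, obj):
        """Classify an object as 'left', 'right', or 'ahead' in the robot's frame."""
        # Express the object's position in the robot's coordinate frame
        # (x forward, y to the left, following ROS conventions).
        dx, dy = obj.x - robot_x, obj.y - robot_y
        lateral = -math.sin(robot_yaw) * dx + math.cos(robot_yaw) * dy
        if abs(lateral) < 0.2:            # assumed tolerance for "ahead"
            return "ahead"
        return "left" if lateral > 0 else "right"

    chair = SceneObject("chair_1", x=2.0, y=1.0, affordances={"movable", "sittable"})
    print("movable" in chair.affordances)               # -> True
    print(qualitative_direction(0.0, 0.0, 0.0, chair))  # -> left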

Besides providing explanations of robot navigation, our framework also covers qualitative variables, such as explanation representation, and temporal variables, such as explanation duration and timing, for conveying explanations. We model explanation generation for robot motion planning as an AI planning problem over these explanation attributes, i.e., representation, timing, and duration. The planning approach is implemented in a Robot Operating System (ROS)-based environment that integrates HiXRoN with the ROS navigation stack and the ROS planning framework ROSPlan.
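
As a rough illustration of such a planning encoding, the sketch below defines a PDDL-style durative action for explaining, parameterised by representation, timing, and duration, of the kind that could be added to a ROSPlan domain file. The predicate, type, and fluent names are assumptions for illustration and are not HiXRoN's actual planning domain.

    # Illustrative PDDL-style operator (assumed names, not HiXRoN's real domain)
    # that could be appended to a ROSPlan domain file before planning.
    EXPLAIN_ACTION = """
    (:durative-action explain
      :parameters (?r - robot ?rep - representation ?t - timing ?d - duration)
      :duration (= ?duration (duration-value ?d))
      :condition (and (at start (explanation-needed ?r))
                      (at start (suitable ?rep ?t ?d)))
      :effect (and (at end (explained ?r))
                   (at end (not (explanation-needed ?r)))))
    """

    print(EXPLAIN_ACTION)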

Results

We provide visual and textual explanations of a robot's navigation that describe how nearby objects influence the robot's path planning. Our robot chooses different explanation attributes (representation, timing, and duration) based on external factors (human presence) and internal factors (robot extroversion). We have shown that explanation generation can be planned deterministically together with other robot actions, such as grasping or moving. Our empirical findings show that people prefer combined visual-textual explanations over purely visual or purely textual ones.
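
A minimal sketch of such an attribute selection is given below; the concrete mapping from human presence and extroversion to attribute values is an assumption made for illustration, not the policy evaluated in our experiments.

    def choose_explanation_attributes(human_present: bool, extroversion: float):
        """Map external (human presence) and internal (extroversion) factors to
        explanation attributes; thresholds and values are illustrative only."""
        representation = "visual-textual" if human_present else "textual"
        timing = "immediate" if extroversion > 0.5 else "on-request"
        duration = "long" if extroversion > 0.5 else "short"
        return representation, timing, duration

    print(choose_explanation_attributes(True, 0.8))
    # -> ('visual-textual', 'immediate', 'long')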

Conclusion

To establish explaining as a constituent attribute of social robots, we are working towards incorporating human explanation preferences and a wide spectrum of robot personality attributes into the probabilistic planning of explanations.

References

[1] Ambsdorf, Jakob, et al. "Explain Yourself! Effects of Explanations in Human-Robot Interaction." 2022 31st IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), IEEE, 2022.

[2] Gibson, James J. "The Theory of Affordances." Perceiving, Acting, and Knowing: Toward an Ecological Psychology, edited by Robert Shaw and John Bransford, Lawrence Erlbaum, Hillsdale, NJ, 1977, pp. 67-82.

Published

2024-06-10