DANCR: An Improvisational Tool for Contemporary Dance


  • Patrick Burns, University of Vienna


The integration of robots into the artistic space of dance has been a topic of increasing research over the past decade. Previous research has focused on the ability of a humanoid robot to embody different dance positions and create aesthetic performances; however, few studies have examined the ability of a humanoid to improvise with a human dance partner. Improvisation in the dance world is an act of communication, connection, and expression that leads to novelty in movement [1]. It is elusive and subjective to the creative's experience, which opens different avenues of exploration within the research field. One past study explored improvisation with a non-anthropomorphic robot coded with preset motions [2], while other work has focused on a virtual avatar rather than a physical robot [3]. Much of this research relies on technical metrics or audience-based methods to evaluate performance; few studies evaluate the interaction between a human dancer and the robot from the dancer's perspective. The goal of DANCR is to create a system that serves as an improvisational tool that contemporary dancers can use to explore new ideas and movements by interacting with the system in real time. The system works as follows: a dancer is positioned in front of three motion sensors that capture their joint positions; this information is fed into a machine learning model trained on datasets from four expert dancers; the model then returns a set of joint angles that can be embodied by the robot, in our case the Pepper robot manufactured by SoftBank Robotics. The result is a movement that corresponds to the input from the human dancer and has the potential to inspire novel movements. Our specific task in this project is to evaluate the performance of the system as it pertains to creating moments of improvisation.
To approach this issue, we have created a qualitative framework for judging the quality of the system from the experience of the dancer.
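The capture-to-embodiment pipeline described above can be sketched in a minimal form. The sketch below is illustrative only: `stub_model` stands in for the trained model (its real architecture is not specified here), the joint names and limit values are hypothetical placeholders rather than Pepper's documented ranges, and sending the angles to the robot is left out.

```python
import math

# Hypothetical joint-angle limits in radians (placeholder values, not
# taken from SoftBank Robotics' documentation).
JOINT_LIMITS = {
    "LShoulderPitch": (-2.0, 2.0),
    "RShoulderPitch": (-2.0, 2.0),
    "LElbowRoll": (-1.5, -0.01),
}

def stub_model(joint_positions):
    """Placeholder for the trained model: maps captured 3D joint
    positions (x, y, z per joint) to a dict of robot joint angles.
    Here we simply derive a shoulder pitch from the vertical
    component of each wrist and fix the elbow roll."""
    return {
        "LShoulderPitch": math.atan2(-joint_positions["LWrist"][1], 1.0),
        "RShoulderPitch": math.atan2(-joint_positions["RWrist"][1], 1.0),
        "LElbowRoll": -0.5,
    }

def clamp_to_limits(angles):
    """Clip the model output to the robot's physical joint range
    before it is sent to the motion controller."""
    clamped = {}
    for name, value in angles.items():
        lo, hi = JOINT_LIMITS[name]
        clamped[name] = min(hi, max(lo, value))
    return clamped

# One step of the loop: sensor frame in, safe joint angles out.
frame = {"LWrist": (0.2, -0.9, 0.1), "RWrist": (0.1, 0.5, 0.3)}
command = clamp_to_limits(stub_model(frame))
```

In the running system this loop repeats per captured frame, with the clamped angles forwarded to the robot's motion API so the embodiment always stays within its mechanical range.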


[1] L. A. Blom and L. T. Chaplin, The Moment of Movement: Dance Improvisation. Pittsburgh, PA: University of Pittsburgh Press, 1988, pp. 6-7.

[2] E. Jochum and J. Derks, “Tonight we improvise!,” Proceedings of the 6th International Conference on Movement and Computing, 2019. doi:10.1145/3347122.3347129 

[3] A. Berman and V. James, “Kinetic Imaginations: Exploring the Possibilities of Combining AI and Dance,” Proceedings of the Twenty-Fourth International Joint Conference on Artificial Intelligence, 2015.