Title :
Robots that express emotion elicit better human teaching
Author :
Leyzberg, Dan ; Avrunin, Eleanor ; Liu, Jenny ; Scassellati, Brian
Author_Institution :
Dept. of Comput. Sci., Yale Univ., New Haven, CT, USA
Abstract :
Does the emotional content of a robot's speech affect how people teach it? In this experiment, participants were asked to demonstrate several “dances” for a robot to learn. Participants moved their bodies in response to instructions displayed on a screen behind the robot. Meanwhile, the robot faced the participant and appeared to emulate the participant's movements. After each demonstration, the robot received an accuracy score and the participant chose whether or not to demonstrate that dance again. Regardless of the participant's input, however, the robot's dancing and the scores it received were arranged in advance and held constant across all participants. The only variation between groups in this study was what the robot said in response to its scores. Participants saw one of three conditions: appropriate emotional responses, often-inappropriate emotional responses, or apathetic responses. Participants who taught the robot with appropriate emotional responses demonstrated the dances, on average, significantly more frequently and significantly more accurately than participants in the other two conditions.
Keywords :
computer aided instruction; human-robot interaction; psychology; apathetic response; dancing task; emotional content; human teaching; often-inappropriate emotional response; robot speech; Accuracy; Computers; Educational institutions; Humans; Robots; Speech; affect; emotion; human teacher; robot
Conference_Title :
Human-Robot Interaction (HRI), 2011 6th ACM/IEEE International Conference on
Conference_Location :
Lausanne
Print_ISBN :
978-1-4673-4393-0
Electronic_ISSN :
2167-2121