DocumentCode :
133840
Title :
Gestural and facial communication with smart phone based robot partner using emotional model
Author :
Botzheim, Janos ; Woo, Jinseok ; Wi, Noel Tay Nuo ; Kubota, Naoyuki ; Yamaguchi, Toru
Author_Institution :
Grad. Sch. of Syst. Design, Tokyo Metropolitan Univ., Hino, Japan
fYear :
2014
fDate :
3-7 Aug. 2014
Firstpage :
644
Lastpage :
649
Abstract :
To conduct natural communication, a robot partner should not only perform verbal communication but also understand non-verbal communication such as facial and gestural information. For the robot, "understanding" means grasping the meaning of the gesture itself. In this paper we propose a smart-phone-based system in which an emotional model connects the facial and gestural communication of a human and a robot partner. The input of the emotional model is based on face classification and gesture recognition on the human side. Based on the emotional model, the robot's output actions, such as gestural and facial expressions, are calculated.
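A minimal sketch of the data flow described above, not the authors' implementation: the labels, effect tables, and the pleasure/arousal state representation are hypothetical stand-ins for the paper's face classification, gesture recognition, and emotional model, and serve only to illustrate how recognized human cues could drive the robot's output expressions.

from dataclasses import dataclass

# Hypothetical mapping from recognized human cues to (pleasure, arousal) increments.
FACE_EFFECT = {"smile": (0.4, 0.2), "frown": (-0.4, 0.1), "neutral": (0.0, 0.0)}
GESTURE_EFFECT = {"wave": (0.2, 0.3), "arms_crossed": (-0.2, -0.1), "none": (0.0, 0.0)}

@dataclass
class EmotionalModel:
    pleasure: float = 0.0
    arousal: float = 0.0
    decay: float = 0.9  # emotional state slowly relaxes toward neutral

    def update(self, face_label: str, gesture_label: str) -> None:
        # Combine face-classification and gesture-recognition results into the state.
        fp, fa = FACE_EFFECT.get(face_label, (0.0, 0.0))
        gp, ga = GESTURE_EFFECT.get(gesture_label, (0.0, 0.0))
        self.pleasure = max(-1.0, min(1.0, self.decay * self.pleasure + fp + gp))
        self.arousal = max(-1.0, min(1.0, self.decay * self.arousal + fa + ga))

    def robot_output(self) -> tuple:
        # Select facial and gestural expressions for the robot from the current state.
        face = ("happy_face" if self.pleasure > 0.2
                else "sad_face" if self.pleasure < -0.2
                else "neutral_face")
        gesture = "wave_back" if self.arousal > 0.2 else "idle"
        return face, gesture

if __name__ == "__main__":
    model = EmotionalModel()
    model.update(face_label="smile", gesture_label="wave")
    print(model.robot_output())  # e.g. ('happy_face', 'wave_back')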
Keywords :
face recognition; gesture recognition; human-robot interaction; image classification; smart phones; emotional model; face classification; facial communication; facial expression; facial information; gestural communication; gestural expression; gestural information; gesture recognition; natural communication; robot partner; smart phone; verbal communication; Face; Gesture recognition; Mood; Neural networks; Neurons; Robots; Training;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
World Automation Congress (WAC), 2014
Conference_Location :
Waikoloa, HI
Type :
conf
DOI :
10.1109/WAC.2014.6936076
Filename :
6936076