DocumentCode :
3763097
Title :
Human behavior modeling for multimodal interaction with robot partner
Author :
Takenori Obo;Loo Chu Kiong;Naoyuki Kubota
Author_Institution :
Faculty of Computer Science and Information Technology, University of Malaya, Kuala Lumpur 50603, Malaysia
fYear :
2015
Firstpage :
1
Lastpage :
7
Abstract :
If a robot partner can provide multimodal interaction with a person, it may offer smoother and more natural human-robot communication. This paper proposes a method for modeling the relationship between behavioral and emotional expression through human-robot interaction. First, we applied a neuro-fuzzy system to the classification of human posture. Next, we used facial expression recognition algorithms based on a Constrained Local Model and a two-stage fuzzy reasoning model. Furthermore, we utilized a spiking neural network for relation modeling. Finally, we show an experimental example to examine the proposed method.
Keywords :
"Cognition","Adaptation models"
Publisher :
ieee
Conference_Titel :
2015 International Symposium on Micro-NanoMechatronics and Human Science (MHS)
Type :
conf
DOI :
10.1109/MHS.2015.7438251
Filename :
7438251