DocumentCode :
241109
Title :
Perceiving intimacy from both robot view and first-person view in dyadic human interaction
Author :
Ting-Sheng Chu ; Yi-Shiu Chiang ; Chung Dial Lim ; Tung-Yen Wu ; Shih-Huan Tseng ; Li-Chen Fu
Author_Institution :
Dept. of Comput. Sci. & Inf. Eng., Nat. Taiwan Univ., Taipei, Taiwan
fYear :
2014
fDate :
11-13 Sept. 2014
Firstpage :
26
Lastpage :
31
Abstract :
In this paper, we propose a framework to perceive the level of intimacy in dyadic human interactions from both the robot's perspective and a first-person perspective. First, to gain insight into how people interact with partners at three different degrees of intimacy, namely normal, familiar, and close, we conducted a preliminary user study of social interaction. Next, drawing on the social science literature and our study, we design four types of social interaction features, consisting of proxemic, non-verbal, verbal, and temporal features, to categorize the intimacy level from observations; using three of these feature types yields the best result. Finally, to validate our work, several experiments were conducted, and the results show that the framework perceives the aforementioned intimacy levels with an average accuracy of up to 86.11%.
Keywords :
human-robot interaction; social sciences; dyadic human interaction; first-person view; intimacy levels; non-verbal features; proxemic features; robot view; social interaction features; social science; temporal features; verbal features; Accuracy; Cameras; Feature extraction; Robot vision systems; Social network services; Speech
fLanguage :
English
Publisher :
IEEE
Conference_Title :
Advanced Robotics and its Social Impacts (ARSO), 2014 IEEE Workshop on
Conference_Location :
Evanston, IL
Type :
conf
DOI :
10.1109/ARSO.2014.7020975
Filename :
7020975