DocumentCode :
3575960
Title :
Robotic gesture generation based on a cognitive basis for non-verbal communication
Author :
Jeong-Yean Yang ; Dong-Soo Kwon
Author_Institution :
Human-Robot Interaction Res. Center, Korea Adv. Inst. of Sci. & Technol., Daejeon, South Korea
fYear :
2014
Firstpage :
683
Lastpage :
687
Abstract :
This paper introduces a semantic synthesis method that enables robots to generate human-like gestures by recognizing cognitive and emotional behaviors in a given situation. Assuming that the human cognitive process can be represented as a series of associated events, we propose a virtually touchable space associated with robotic hands. In a humanoid robot, the motions of the two arms are considered a crucial non-verbal communication channel because large spatial changes capture the attention of a human agent; accordingly, virtual spaces related to particular events are described by the robotic hands. The concept of virtual spaces is tested by expressing the robot's cognitive process through a combination of predefined motion sets.
Keywords :
cognitive systems; gesture recognition; human-robot interaction; humanoid robots; motion control; cognitive behavior recognition; emotional behavior recognition; human-robot interaction; humanoid robot; motion set; nonverbal communication; robot cognitive process; robotic gesture generation; robotic hand; semantic synthesis method; virtual space; Humanoid robots; Joints; Robot sensing systems; Shape; Thumb; Semantic representation; cognitive and emotional behavior; gesture generation; non-verbal communication;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
2014 11th International Conference on Ubiquitous Robots and Ambient Intelligence (URAI)
Type :
conf
DOI :
10.1109/URAI.2014.7057497
Filename :
7057497
Link To Document :