Title :
Object search framework based on gaze interaction
Author :
Photchara Ratsamee;Yasushi Mae;Kazuto Kamiyama;Mitsuhiro Horade;Masaru Kojima;Kiyoshi Kiyokawa;Tomohiro Mashita;Yoshihiro Kuroda;Haruo Takemura;Tatsuo Arai
Author_Institution :
Cyber Media Center, Osaka University, Japan
Abstract :
In this research, we present an object search framework based on robot-gaze interaction that supports patients with motor paralysis. A patient gives a command by gazing at the target object, and the robot then searches for it autonomously. Rather than relying on many gaze interactions, our approach uses only a few to specify a location clue and an object clue, and integrates RGB-D sensing to segment unknown objects from the environment. Based on hypotheses derived from the gaze information, we apply a multiregion Graph Cuts method together with an analysis of depth information. Furthermore, our search algorithm allows the robot to find a main observation point, i.e., the point from which the user can clearly observe the target object. If the user is not satisfied with the first segmentation, the robot can adapt its pose to obtain different views of the object. The approach has been implemented and tested on the humanoid robot ENON. With only a few gaze interactions, segmentation of unknown objects achieved a success rate of 85%. The experimental results confirm the applicability of the approach to a wide variety of objects, even when the target object is occluded by another object.
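Illustrative sketch (not the paper's implementation): the abstract describes gaze-seeded, graph-cuts-based segmentation of an RGB-D frame combined with depth analysis. The minimal Python example below uses OpenCV's GrabCut (a graph-cuts segmenter) as a stand-in for the paper's multiregion Graph Cuts, and a simple depth gate around the gazed pixel as a stand-in for its depth analysis. The image, depth map, and gaze coordinates are assumed inputs; the function name and parameters are hypothetical.

import cv2
import numpy as np

def segment_at_gaze(rgb, depth, gaze_xy, depth_tol=0.15, iters=5):
    """Segment the object around gaze_xy (pixel coords) in an RGB-D frame.

    rgb   : HxWx3 uint8 color image
    depth : HxW float32 depth map in meters (0 where invalid)
    """
    h, w = depth.shape
    gx, gy = gaze_xy
    gaze_depth = depth[gy, gx]

    # Depth gate: pixels near the gazed depth are probable foreground,
    # everything else is probable background.
    mask = np.full((h, w), cv2.GC_PR_BGD, dtype=np.uint8)
    mask[np.abs(depth - gaze_depth) < depth_tol] = cv2.GC_PR_FGD
    mask[gy, gx] = cv2.GC_FGD  # the gazed pixel is definite foreground

    # Graph-cuts refinement of the depth-seeded mask using color statistics.
    bgd_model = np.zeros((1, 65), np.float64)
    fgd_model = np.zeros((1, 65), np.float64)
    cv2.grabCut(rgb, mask, None, bgd_model, fgd_model, iters,
                cv2.GC_INIT_WITH_MASK)

    # Collapse to a binary object mask (255 = object, 0 = background).
    fg = (mask == cv2.GC_FGD) | (mask == cv2.GC_PR_FGD)
    return np.where(fg, 255, 0).astype(np.uint8)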
Keywords :
"Search problems","Object segmentation","Navigation","Robot sensing systems","Three-dimensional displays","Gaze tracking"
Conference_Title :
2015 IEEE International Conference on Robotics and Biomimetics (ROBIO)
DOI :
10.1109/ROBIO.2015.7419066