Title :
An object-based visual attention model for robots
Author :
Yu, Yuanlong ; Mann, George K I ; Gosine, Raymond G.
Author_Institution :
Fac. of Eng. & Appl. Sci., Memorial Univ. of Newfoundland, St. John's, NL
Abstract :
In this paper an object-based visual attention model for robots, extending Duncan's integrated competition hypothesis, is presented. Based on Gestalt rules, the model segments the visual field into primitive groupings by evaluating both edge continuity and color similarity. An object representation is also built in long-term memory using contour and color features. Depending on the task and the object representation, top-down modulation is applied to pre-attentive features, followed by bottom-up competition. The object-based salience is then evaluated by combining the pixel-wise salience within each pre-attentive grouping. The attended object is finally refined to an accurate representation in working memory. The model has been applied to two mobile robot tasks: task-specific detection of still objects and of moving objects. Experimental results in cluttered scenes are presented to validate the model.
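The abstract's final stage, evaluating object-based salience by combining pixel-wise salience within each pre-attentive grouping, can be illustrated with the short sketch below. This is not the authors' implementation; the function name, the averaging rule, and the label-map input are assumptions chosen only to make the idea concrete.

```python
import numpy as np

def object_based_salience(pixel_salience, grouping_labels):
    """Illustrative sketch (not the paper's code): combine pixel-wise
    salience into one score per pre-attentive grouping by averaging
    the salience of the pixels belonging to each grouping."""
    scores = {}
    for label in np.unique(grouping_labels):
        mask = grouping_labels == label
        scores[int(label)] = float(pixel_salience[mask].mean())
    return scores

# Hypothetical usage: pixel_salience is an HxW map obtained after
# top-down modulation and bottom-up competition; grouping_labels
# assigns each pixel to a primitive grouping from the Gestalt-based
# segmentation. The most salient grouping wins attention.
pixel_salience = np.random.rand(120, 160)
grouping_labels = (np.random.rand(120, 160) * 5).astype(int)
scores = object_based_salience(pixel_salience, grouping_labels)
winner = max(scores, key=scores.get)
```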
Keywords :
edge detection; image colour analysis; image motion analysis; image representation; mobile robots; robot vision; Gestalt rules; color similarity evaluation; edge continuity evaluation; mobile robots; moving object detection; object representation; object-based salience; object-based visual attention model; task-specific still object detection; top-down modulation; Computational modeling; Feature extraction; Layout; Mobile robots; Object detection; Robotics and automation; Shape; Sun; USA Councils;
Conference_Titel :
Robotics and Automation, 2008. ICRA 2008. IEEE International Conference on
Conference_Location :
Pasadena, CA
Print_ISBN :
978-1-4244-1646-2
ISSN :
1050-4729
DOI :
10.1109/ROBOT.2008.4543326