DocumentCode :
3098095
Title :
Homing in scale space
Author :
Churchill, David ; Vardy, Andrew
Author_Institution :
Dept. of Comput. Sci., Memorial Univ. of Newfoundland, St. John's, NL
fYear :
2008
fDate :
22-26 Sept. 2008
Firstpage :
1307
Lastpage :
1312
Abstract :
Local visual homing is the process of determining the direction of movement required to return an agent to a goal location by comparing the current image with an image taken at the goal, known as the snapshot image. One way of accomplishing visual homing is to compute correspondences between features and then analyze the resulting flow field to determine the correct direction of motion. Typically, some strong assumptions must be posited in order to compute the home direction from the flow field. For example, it is difficult to locally distinguish translation from rotation, so many authors assume rotation to be computable by other means (e.g. a magnetic compass). In this paper we present a novel approach to visual homing that uses scale change information from the Scale Invariant Feature Transform (SIFT), which we also use to compute landmark correspondences. The method described here is able to determine the direction of the goal in the robot's frame of reference, irrespective of the robot's relative 3D orientation with respect to the goal.
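The core intuition of scale-space homing can be sketched as follows: if a matched landmark appears at a larger scale in the current image than in the snapshot, the robot has moved closer to that landmark, so the goal lies away from it; if the scale has shrunk, the goal lies toward it. Summing unit vectors over all correspondences yields a goal bearing. The sketch below illustrates this idea only; the function name, input format, and weighting are hypothetical and are not the authors' exact algorithm.

```python
import math

def homing_direction(correspondences):
    """Estimate the bearing of the goal in the robot's current frame.

    correspondences: list of (bearing, scale_snapshot, scale_current)
    tuples, where bearing is the landmark's angle in radians in the
    robot's frame, and the scales are the SIFT feature scales of the
    matched landmark in the snapshot and current images.

    Heuristic sketch (an assumption, not the paper's exact method):
    a landmark whose scale grew pushes the home vector away from it;
    one whose scale shrank pulls the home vector toward it.
    """
    vx = vy = 0.0
    for bearing, s_snap, s_cur in correspondences:
        # toward the landmark if it now looks smaller, away if larger
        sign = 1.0 if s_cur < s_snap else -1.0
        vx += sign * math.cos(bearing)
        vy += sign * math.sin(bearing)
    return math.atan2(vy, vx)  # goal bearing, radians

# A single landmark dead ahead whose scale shrank: the goal is ahead.
print(homing_direction([(0.0, 2.0, 1.0)]))  # -> 0.0
```

A real implementation would extract SIFT keypoints from both images, match their descriptors, and read each keypoint's scale from the detector output before applying a rule of this kind.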
Keywords :
mobile robots; multi-robot systems; robot vision; goal location; local visual homing; correct direction of motion; scale invariant feature transforms; snapshot image; Databases; Distance measurement; Feature extraction; Pixel; Robots; Transforms; Visualization;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Intelligent Robots and Systems, 2008. IROS 2008. IEEE/RSJ International Conference on
Conference_Location :
Nice
Print_ISBN :
978-1-4244-2057-5
Type :
conf
DOI :
10.1109/IROS.2008.4651166
Filename :
4651166