DocumentCode
383141
Title
Vision-based urban navigation procedures for verbally instructed robots
Author
Kyriacou, Theocharis; Bugmann, Guido; Lauria, Stanislao
Author_Institution
Robotic Intelligence Lab., Univ. of Plymouth, UK
Volume
2
fYear
2002
fDate
2002
Firstpage
1326
Abstract
Humans who explain a task to a robot, or to another human, use chunks of actions that are often complex procedures for robots. An instructible robot needs to be able to map such chunks to existing preprogrammed primitives. We investigate the nature of these chunks in an urban visual navigation context and describe the implementation of one of the primitives: "take the nth turn right/left". This implementation requires the use of a "short-lived" internal map that is updated as the robot moves along. The recognition and localisation of intersections are performed using task-guided template matching. This approach takes advantage of the content of human instructions to save computation time and improve robustness.
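Illustrative_Sketch
The abstract outlines a "take the nth turn right/left" primitive that counts intersections found by task-guided template matching while a short-lived internal map is updated as the robot moves. The following minimal Python sketch illustrates that idea only; the function names, the exhaustive normalised cross-correlation matcher, the running-count "map", and the threshold parameter are assumptions for illustration, not the authors' implementation.

import numpy as np

def match_template(image, template):
    # Exhaustive normalised cross-correlation; returns the best score and offset.
    ih, iw = image.shape
    th, tw = template.shape
    t = (template - template.mean()) / (template.std() + 1e-9)
    best_score, best_pos = -1.0, None
    for y in range(ih - th + 1):
        for x in range(iw - tw + 1):
            patch = image[y:y + th, x:x + tw]
            p = (patch - patch.mean()) / (patch.std() + 1e-9)
            score = float((p * t).mean())
            if score > best_score:
                best_score, best_pos = score, (y, x)
    return best_score, best_pos

def take_nth_turn(frames, intersection_template, n, threshold=0.6):
    # Task-guided search: because the instruction mentions a turn, only an
    # intersection template is matched against incoming camera frames.  The
    # "short-lived" state kept here is just the count of intersections passed.
    seen = 0
    for i, frame in enumerate(frames):
        score, pos = match_template(frame, intersection_template)
        if score > threshold:
            seen += 1
            if seen == n:
                return i, pos   # frame index and image location of the nth turn
    return None                 # nth intersection not observed

A real system would also have to avoid counting the same intersection across consecutive frames and track its position in the short-lived map as the robot moves; the sketch deliberately omits both.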
Keywords
computerised navigation; interactive systems; learning systems; mobile robots; pattern matching; robot vision; speech recognition; human instructions; instruction-based learning; mobile robots; task-guided template matching; urban navigation; verbal instructions; verbally instructed robots; vision-based navigation; Cities and towns; Computer aided instruction; Humans; Intelligent robots; Natural languages; Navigation; Roads; Robot vision systems; Robustness; World Wide Web;
fLanguage
English
Publisher
IEEE
Conference_Titel
IEEE/RSJ International Conference on Intelligent Robots and Systems, 2002
Print_ISBN
0-7803-7398-7
Type
conf
DOI
10.1109/IRDS.2002.1043938
Filename
1043938