DocumentCode
565657
Title
Robots as interfaces to haptic and locomotor spaces
Author
Kulyukin, Vladimir ; Gharpure, Chaitanya ; Pentico, Cassidy
Author_Institution
Dept. of Comput. Sci., Utah State Univ., Logan, UT, USA
fYear
2007
fDate
9-11 March 2007
Firstpage
325
Lastpage
331
Abstract
Research on spatial cognition and navigation of the visually impaired suggests that vision may be a primary sensory modality that enables humans to align the egocentric (self to object) and allocentric (object to object) frames of reference in space. In the absence of vision, the frames align best in the haptic space. In the locomotor space, as the haptic space translates with the body, lack of vision causes the frames to misalign, which negatively affects action reliability. In this paper, we argue that robots can function as interfaces to the haptic and locomotor spaces in supermarkets. In the locomotor space, the robot eliminates the necessity of frame alignment and, in or near the haptic space, it cues the shopper to the salient features of the environment sufficient for product retrieval. We present a trichotomous ontology of spaces in a supermarket induced by the presence of a robotic shopping assistant and analyze the results of robot-assisted shopping experiments with ten visually impaired participants conducted in a real supermarket.
Keywords
handicapped aids; haptic interfaces; service robots; allocentric frames; egocentric frames; frame alignment; haptic spaces; locomotor spaces; navigation; primary sensory modality; product retrieval; robot-assisted shopping experiments; spatial cognition; supermarket; trichotomous ontology; visually impaired; Abstracts; Laboratories; Reliability; Robots; USA Councils; assistive robotics; haptic and locomotor interfaces; spatial cognition
fLanguage
English
Publisher
ieee
Conference_Title
2007 2nd ACM/IEEE International Conference on Human-Robot Interaction (HRI)
Conference_Location
Arlington, VA
ISSN
2167-2121
Print_ISBN
978-1-59593-617-2
Type
conf
Filename
6251707