DocumentCode :
1636546
Title :
Spatially unconstrained, gesture-based human-robot interaction
Author :
Doisy, Guillaume ; Jevtic, Aleksandar ; Bodiroza, S.
Author_Institution :
Dept. of IEM, Ben-Gurion Univ. of the Negev, Beer-Sheva, Israel
fYear :
2013
Firstpage :
117
Lastpage :
118
Abstract :
For human-robot interaction to take place, a robot needs to perceive humans. The space in which a robot can perceive humans is restricted by the limitations of the robot's sensors. These restrictions can be circumvented by the use of external sensors, as in intelligent environments; otherwise, humans have to ensure that they can be perceived. With the robotic platform presented here, the roles are reversed: the robot autonomously ensures that the human remains within the area it perceives. This is achieved by a combination of hardware and algorithms capable of autonomously tracking a person, estimating their position and following them, while recognizing their gestures and navigating through the environment.
Keywords :
gesture recognition; human-robot interaction; object tracking; robot vision; gesture recognition; gesture-based human-robot interaction; person tracking; position estimation; robot sensors; Gesture recognition; Intelligent sensors; Robot kinematics; Robot sensing systems; Tracking; Gesture Recognition; Human-Robot Interaction; Person Following; Person Tracking;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Human-Robot Interaction (HRI), 2013 8th ACM/IEEE International Conference on
Conference_Location :
Tokyo
ISSN :
2167-2121
Print_ISBN :
978-1-4673-3099-2
Electronic_ISBN :
2167-2121
Type :
conf
DOI :
10.1109/HRI.2013.6483529
Filename :
6483529