DocumentCode :
3694941
Title :
Visual pointing gestures for bi-directional human robot interaction in a pick-and-place task
Author :
Camilo Perez Quintero;Romeo Tatsambon;Mona Gridseth;Martin Jägersand
Author_Institution :
Department of Computing Science, University of Alberta, Edmonton, AB, T6G2E8, Canada
Year :
2015
Firstpage :
349
Lastpage :
354
Abstract :
This paper explores visual pointing gestures for two-way non-verbal communication when interacting with a robot arm. Such non-verbal instruction is common when humans communicate spatial directions and actions while collaboratively performing manipulation tasks. Using 3D RGB-D sensing, we compare human-human and human-robot interaction in solving a pick-and-place task. In the human-human interaction we study pointing as well as other types of gestures performed by humans in a collaborative task. For the human-robot interaction we design a system that allows the user to interact with a 7-DOF robot arm, using gestures to select, pick, and drop objects at different locations. Bi-directional confirmation gestures allow the robot (or human) to verify that the right object is selected. We perform experiments in which 8 human subjects collaborate with the robot to manipulate ordinary household objects on a tabletop. Without confirmation feedback, selection accuracy was 70–90% for both humans and the robot. With feedback through confirmation gestures, both humans and our vision-robotic system performed the task accurately every time (100%). Finally, to illustrate our gesture interface in a real application, we let a human instruct our robot to make a pizza by selecting different ingredients.
Keywords :
"Robot sensing systems","Robot kinematics","Human-robot interaction","Three-dimensional displays","Service robots","Containers"
Publisher :
ieee
Conference_Title :
Robot and Human Interactive Communication (RO-MAN), 2015 24th IEEE International Symposium on
Type :
conf
DOI :
10.1109/ROMAN.2015.7333604
Filename :
7333604