Title :
Learning object-specific grasp affordance densities
Author :
R. Detry;E. Baseski;M. Popovic;Y. Touati;N. Kruger;O. Kroemer;J. Peters;J. Piater
Author_Institution :
University of Liège, Belgium
Date :
6/1/2009 12:00:00 AM
Abstract :
This paper addresses the problem of learning and representing object grasp affordances, i.e. object-gripper relative configurations that lead to successful grasps. The purpose of grasp affordances is to organize and store all the knowledge that an agent has about grasping an object, in order to facilitate reasoning about grasping solutions and their achievability. The affordance representation consists of a continuous probability density function defined on the 6D gripper pose space (3D position and orientation), within an object-relative reference frame. Grasp affordances are initially learned from various sources, e.g. from imitation or from visual cues, yielding grasp hypothesis densities. Grasp densities are attached to a learned 3D visual object model, and pose estimation of the visual model allows a robotic agent to execute samples from a grasp hypothesis density under various object poses. Grasp outcomes are then used to learn grasp empirical densities, i.e. densities of grasps that have been confirmed through experience. We show results of learning grasp hypothesis densities from both imitation and visual cues, and present grasp empirical densities learned from physical experience by a robot.
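The abstract's core idea, a grasp hypothesis density as a continuous distribution over gripper poses that can both be evaluated and sampled for execution, can be sketched with a simple kernel density estimate. This is an illustrative simplification, not the paper's method: it uses an isotropic Gaussian kernel over a 6D Euler-angle pose parametrization rather than proper kernels on the pose space SE(3), and the demonstration poses, bandwidth, and function names are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical demonstrated grasps: each row is a gripper pose
# (x, y, z, roll, pitch, yaw) in an object-relative frame.
# Values are made up for illustration.
demonstrations = np.array([
    [0.10,  0.00, 0.05,  0.0, 1.57, 0.0],
    [0.11,  0.01, 0.05,  0.1, 1.50, 0.0],
    [0.09, -0.01, 0.06, -0.1, 1.60, 0.1],
])

bandwidth = 0.05  # kernel width; an assumed value

def hypothesis_density(pose):
    """Kernel density estimate over demonstrated grasp poses.

    Stand-in for the paper's pose-space kernels: an isotropic
    Gaussian in a 6D Euler-angle parametrization.
    """
    diffs = demonstrations - pose
    sq = np.sum(diffs ** 2, axis=1) / (2.0 * bandwidth ** 2)
    norm = (2.0 * np.pi) ** 3 * bandwidth ** 6
    return np.mean(np.exp(-sq)) / norm

def sample_grasp():
    """Draw one grasp to try: pick a demonstration, perturb it."""
    i = rng.integers(len(demonstrations))
    return demonstrations[i] + rng.normal(0.0, bandwidth, size=6)

# The density is highest near demonstrated grasps, so sampling
# concentrates execution attempts on promising configurations.
print(hypothesis_density(demonstrations[0]))
print(sample_grasp())
```

In the paper's pipeline, outcomes of executing such samples would then be used to fit a second, empirical density over the grasps that actually succeeded.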
Keywords :
"Robots","Humans","Biological system modeling","Grippers","Probability density function","Solid modeling","Autonomous agents","Encoding","Density functional theory","Kernel"
Conference_Titel :
2009 IEEE 8th International Conference on Development and Learning (ICDL 2009)
Print_ISBN :
978-1-4244-4117-4
Electronic_ISBN :
2161-9476
DOI :
10.1109/DEVLRN.2009.5175520