Title :
Going beyond the perception of affordances: Learning how to actualize them through behavioral parameters
Author :
Emre Ugur;Erhan Oztop;Erol Şahin
Author_Institution :
Biological ICT, National Institute of Information and Communications Technology, Kyoto, Japan
Date :
5/1/2011
Abstract :
In this paper, we propose a method that enables a robot to learn, in an unsupervised way, not only the existence of affordances provided by objects, but also the behavioral parameters required to actualize them and the effects they generate on the objects. In a previous study, it was shown that, through self-interaction and self-observation analogous to an infant's, an anthropomorphic robot can learn object affordances in a completely unsupervised way and use this knowledge to make plans in its perceptual space. This paper extends the affordance model proposed in that study by using parametric behaviors and incorporating the behavior parameters into affordance learning and goal-oriented plan generation. Furthermore, to handle complex behaviors and complex objects (such as executing a precision grasp on a mug), the perceptual processing is improved by combining local and global features. Finally, a hierarchical clustering algorithm is used to discover affordances in the non-homogeneous feature space. In short, object affordances relevant to manipulation are discovered, together with the behavior parameters that actualize them, based on the monitored effects.
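The abstract outlines a two-step pipeline: observed effects are clustered hierarchically to discover affordance (effect) categories, and a predictor (the keywords mention support vector machines) maps object features plus behavior parameters to those categories for goal-oriented planning. The sketch below is an illustrative reconstruction of that pipeline, not the authors' implementation; all feature dimensions, data, and variable names are hypothetical placeholders.

```python
# Illustrative sketch only (not the paper's code): discover effect
# categories by hierarchical clustering, then learn to predict them
# from object features and behavior parameters. All data here is
# random placeholder data; real inputs would come from robot
# self-interaction experiments.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Hypothetical interaction samples: pre-interaction object features,
# the behavior parameters used, and the observed perceptual change.
n = 200
object_features = rng.normal(size=(n, 8))         # e.g. global shape features
behavior_params = rng.uniform(0, 1, size=(n, 2))  # e.g. approach angle, aperture
effects = rng.normal(size=(n, 8))                 # perceptual change after acting

# Step 1: discover effect categories (candidate affordances) by
# hierarchical clustering in the effect space (Ward linkage assumed).
tree = linkage(effects, method="ward")
effect_labels = fcluster(tree, t=4, criterion="maxclust")

# Step 2: learn to predict the effect category from object features
# plus behavior parameters, so the robot can select parameters that
# actualize a desired affordance.
X = np.hstack([object_features, behavior_params])
predictor = SVC(kernel="rbf").fit(X, effect_labels)

# Planning-time use: score candidate parameters for a new object and
# keep those predicted to yield the desired effect category (here,
# category 1 is an arbitrary stand-in for the goal effect).
candidates = rng.uniform(0, 1, size=(50, 2))
new_object = rng.normal(size=(1, 8))
queries = np.hstack([np.repeat(new_object, 50, axis=0), candidates])
good_params = candidates[predictor.predict(queries) == 1]
```

Ward-linkage clustering and an RBF-kernel SVM are plausible stand-ins here; the paper's actual perceptual features, clustering criteria, and predictor details differ.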
Keywords :
"Cameras","Robot vision systems","Prototypes","Accuracy","Support vector machines"
Conference_Title :
2011 IEEE International Conference on Robotics and Automation (ICRA)
Print_ISBN :
978-1-61284-386-5
DOI :
10.1109/ICRA.2011.5980299