DocumentCode :
3643021
Title :
Going beyond the perception of affordances: Learning how to actualize them through behavioral parameters
Author :
Emre Ugur;Erhan Oztop;Erol Şahin
Author_Institution :
Biological ICT, National Institute of Information and Communications Technology, Kyoto, Japan
fYear :
2011
fDate :
5/1/2011
Firstpage :
4768
Lastpage :
4773
Abstract :
In this paper, we propose a method that enables a robot to learn, in an unsupervised way, not only the existence of affordances provided by objects, but also the behavioral parameters required to actualize them and the prediction of the effects generated on the objects. In a previous study, it was shown that, through self-interaction and self-observation analogous to an infant's, an anthropomorphic robot can learn object affordances in a completely unsupervised way and use this knowledge to make plans in its perceptual space. This paper extends the affordance model proposed in that study by using parametric behaviors and incorporating the behavior parameters into affordance learning and goal-oriented plan generation. Furthermore, to handle complex behaviors and complex objects (such as executing a precision grasp on a mug), the perceptual processing is improved by using a combination of local and global features. Finally, a hierarchical clustering algorithm is used to discover the affordances in a non-homogeneous feature space. In short, object affordances for manipulation are discovered together with behavior parameters, based on the monitored effects.
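The paper itself does not include code; as a rough illustration of the clustering step described in the abstract, a minimal sketch using SciPy's hierarchical clustering on hypothetical effect-feature vectors might look like the following. All variable names, the feature layout, and the distance threshold are assumptions for illustration, not taken from the paper.

```python
# Hypothetical sketch: effect vectors recorded after behavior executions
# are grouped by agglomerative (hierarchical) clustering, and each
# resulting cluster is treated as one discovered effect category.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)

# Assume each row is the perceptual change measured on an object after
# executing a parametric behavior, e.g. [dx, dy, dz, d_visibility, ...].
effects = rng.normal(size=(60, 6))

# The feature space is non-homogeneous (positions, angles, visibility),
# so standardize each channel before computing distances.
effects_std = (effects - effects.mean(axis=0)) / effects.std(axis=0)

# Hierarchical clustering with Ward linkage over the standardized effects.
Z = linkage(effects_std, method="ward")

# Cut the dendrogram at a chosen distance threshold; each label then
# stands for one effect category discovered without supervision.
labels = fcluster(Z, t=5.0, criterion="distance")
print("discovered effect categories:", np.unique(labels))
```

Each discovered effect category could then be associated with the object features and behavior parameters that produced it, for instance by training a classifier to predict the category from object features and behavior parameters (the paper's keywords mention support vector machines), which would support the goal-oriented plan generation described above.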
Keywords :
"Cameras","Robot vision systems","Prototypes","Accuracy","Support vector machines"
Publisher :
ieee
Conference_Titel :
2011 IEEE International Conference on Robotics and Automation (ICRA)
ISSN :
1050-4729
Print_ISBN :
978-1-61284-386-5
Type :
conf
DOI :
10.1109/ICRA.2011.5980299
Filename :
5980299