DocumentCode :
1247816
Title :
Learning viewpoint invariant perceptual representations from cluttered images
Author :
Spratling, Michael W.
Author_Institution :
Div. of Eng., King's Coll., London, UK
Volume :
27
Issue :
5
fYear :
2005
fDate :
5/1/2005
Firstpage :
753
Lastpage :
761
Abstract :
In order to perform object recognition, it is necessary to form perceptual representations that are sufficiently specific to distinguish between objects, but that are also sufficiently flexible to generalize across changes in location, rotation, and scale. A standard method for learning perceptual representations that are invariant to viewpoint is to form temporal associations across image sequences showing object transformations. However, this method requires that individual stimuli be presented in isolation and is therefore unlikely to succeed in real-world applications where multiple objects can co-occur in the visual input. This paper proposes a simple modification to the learning method that can overcome this limitation and results in more robust learning of invariant representations.
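The "standard method" named in the abstract is trace-rule (temporal association) learning over image sequences. The sketch below is only an illustrative, Foldiak-style version of that general idea, not the paper's algorithm or its proposed modification; the network size, winner-take-all competition, and all parameter values are assumptions for demonstration.

```python
# Minimal sketch of a trace-rule (temporal association) learner.
# Illustrative only: NOT the paper's method; parameters are assumed.
import numpy as np

rng = np.random.default_rng(0)

n_inputs = 100        # dimensionality of each (flattened) image frame
n_outputs = 10        # output neurons learning invariant responses
eta_trace = 0.2       # how quickly the activity trace tracks current output
alpha = 0.01          # learning rate

W = rng.random((n_outputs, n_inputs))
W /= W.sum(axis=1, keepdims=True)      # normalise each neuron's weights
trace = np.zeros(n_outputs)            # temporally smoothed output activity

def step(x):
    """One learning step on a single frame x of an image sequence."""
    global trace, W
    y = W @ x                          # feedforward responses
    winner = np.argmax(y)              # simple winner-take-all competition
    y_wta = np.zeros(n_outputs)
    y_wta[winner] = y[winner]
    # The trace mixes the current output with recent history, so frames
    # that follow each other in time tend to drive the same neurons.
    trace = (1.0 - eta_trace) * trace + eta_trace * y_wta
    # Hebbian update gated by the trace rather than the instantaneous output.
    W += alpha * np.outer(trace, x)
    W /= W.sum(axis=1, keepdims=True)  # keep weights bounded
    return winner

# Usage: present frames of a transforming object in temporal order, e.g.
#   for x in sequence_of_frames: step(x)
```

Because the trace gates the weight update, this kind of rule assumes each object is presented in isolation; the abstract's point is that such learning degrades when multiple objects co-occur in cluttered input.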
Keywords :
computer vision; image representation; image sequences; learning (artificial intelligence); object recognition; cluttered images; invariant perceptual representation; learning method; object transformations; temporal associations; brain modeling; learning systems; neural networks; neurons; robustness; temporal lobe; computational models of vision; neural nets; algorithms; artificial intelligence; biomimetics; computer simulation; humans; image enhancement; image interpretation, computer-assisted; imaging, three-dimensional; information storage and retrieval; models, biological; pattern recognition, automated; visual perception
fLanguage :
English
Journal_Title :
IEEE Transactions on Pattern Analysis and Machine Intelligence
Publisher :
IEEE
ISSN :
0162-8828
Type :
Journal
DOI :
10.1109/TPAMI.2005.105
Filename :
1407878