DocumentCode :
3285929
Title :
Adaptive network for optimal linear feature extraction
Author :
Földiák, Peter
Author_Institution :
Physiol. Lab., Cambridge, UK
fYear :
1989
fDate :
0-0 1989
Firstpage :
401
Abstract :
A network of highly interconnected linear neuron-like processing units and a simple, local, unsupervised rule for the modification of connection strengths between these units are proposed. After training the network on a high-dimensional (m) distribution of input vectors, the lower-dimensional (n) output will be a projection onto the subspace of the n largest principal components (the subspace spanned by the n eigenvectors corresponding to the largest eigenvalues of the input covariance matrix) and will maximize the mutual information between the input and the output, in the same way as principal component analysis does. The purely local nature of the synaptic modification rule (simple Hebbian and anti-Hebbian) makes the implementation of the network easier, faster, and biologically more plausible than rules depending on error propagation.
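The abstract describes the learning scheme only at a high level. The following is a minimal, illustrative sketch of a linear Hebbian/anti-Hebbian subspace learner in that spirit, written in Python/NumPy; the Oja-style weight normalization, the learning rates, the toy input distribution, and the settling of the lateral dynamics by a direct matrix solve are assumptions made for the sketch, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

m, n = 8, 3                    # input and output dimensionality (m > n)
alpha, beta = 0.005, 0.005     # illustrative learning rates (assumed, not from the paper)

variances = np.array([9.0, 7.0, 5.0, 1.0, 1.0, 1.0, 1.0, 1.0])
cov = np.diag(variances)       # toy input covariance with three dominant directions

Q = 0.1 * rng.standard_normal((n, m))   # feedforward weights, Hebbian (Oja-style) updates
W = np.zeros((n, n))                    # lateral weights, anti-Hebbian updates, zero diagonal

for _ in range(20000):
    x = np.sqrt(variances) * rng.standard_normal(m)   # sample a zero-mean input vector
    # Settle the recurrent lateral dynamics y = Q x + W y  =>  y = (I - W)^(-1) Q x
    y = np.linalg.solve(np.eye(n) - W, Q @ x)
    # Hebbian feedforward update with Oja-style normalization to keep the weights bounded
    Q += alpha * (np.outer(y, x) - (y ** 2)[:, None] * Q)
    # Anti-Hebbian lateral update: drives the output units toward decorrelation
    dW = -beta * np.outer(y, y)
    np.fill_diagonal(dW, 0.0)
    W += dW

# Compare the learned subspace with the top-n principal subspace of the input covariance.
effective = np.linalg.solve(np.eye(n) - W, Q)    # effective input-to-output mapping
eigvals, eigvecs = np.linalg.eigh(cov)
top = eigvecs[:, -n:]                            # eigenvectors of the n largest eigenvalues
proj = top @ top.T                               # projector onto that principal subspace
mismatch = np.linalg.norm(effective - effective @ proj) / np.linalg.norm(effective)
print(f"relative subspace mismatch: {mismatch:.3f}")   # small when the subspaces agree
```

At convergence the rows of the effective mapping (I - W)^(-1) Q should approximately span the same subspace as the n leading eigenvectors of the input covariance, which is the property claimed in the abstract; the individual rows need not equal individual eigenvectors.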
Keywords :
adaptive systems; eigenvalues and eigenfunctions; matrix algebra; neural nets; pattern recognition; virtual machines; adaptive network; anti-Hebbian; connection strengths; eigenvalues; eigenvectors; highly interconnected linear neuron-like processing; input covariance matrix; mutual information; optimal linear feature extraction; synaptic modification rule; Matrices; Neural networks; Virtual computers;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
International Joint Conference on Neural Networks (IJCNN), 1989
Conference_Location :
Washington, DC, USA
Type :
conf
DOI :
10.1109/IJCNN.1989.118615
Filename :
118615