Title :
Self-organization by delta rule
Abstract :
In a certain two-layer network architecture, the delta learning rule leads to self-organization of the connection weights. A formal analysis of this learning rule shows that the weight vectors of the n second-layer nodes converge to a rotation of the first n principal components. Delta-rule-based self-organization therefore performs optimal encoding and decoding of the data in the sense of principal component analysis. This property of the delta learning rule has been verified by a series of computational experiments, which also showed good convergence stability of the rule. On data compression tasks, it performs substantially better than a three-layer autoassociative perceptron with linear or nonlinear hidden units.
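The following is a minimal sketch of the idea described in the abstract, not the paper's exact formulation: it assumes a linear two-layer network with tied weights, encoding y = W x and decoding x_hat = W^T y, trained by the delta rule (stochastic gradient descent on the squared reconstruction error). After training, the subspace spanned by the rows of W is compared with the span of the first n principal components; the two should coincide up to a rotation. The data generation, learning rate, and epoch count are illustrative choices.

import numpy as np

rng = np.random.default_rng(0)

# Synthetic zero-mean data with a pronounced n-dimensional principal subspace.
d, n, samples = 8, 3, 2000
basis = np.linalg.qr(rng.normal(size=(d, d)))[0]
scales = np.array([2.0, 1.5, 1.0, 0.1, 0.1, 0.05, 0.05, 0.05])
X = (rng.normal(size=(samples, d)) * scales) @ basis.T

# Delta-rule training of the tied-weight two-layer network.
# W has n rows (one per second-layer node) and d columns.
W = 0.1 * rng.normal(size=(n, d))
eta = 0.01
for epoch in range(100):
    for x in X:
        y = W @ x                  # second-layer activations
        e = x - W.T @ y            # reconstruction error at the input layer
        # Gradient step on ||e||^2 for the tied-weight case
        # (the constant factor 2 is absorbed into eta).
        W += eta * (np.outer(y, e) + np.outer(W @ e, x))

# First n principal components from the sample covariance.
_, _, Vt = np.linalg.svd(X - X.mean(axis=0), full_matrices=False)
pcs = Vt[:n]

def projector(A):
    """Orthogonal projector onto the row space of A."""
    Q, _ = np.linalg.qr(A.T)
    return Q @ Q.T

# A small mismatch means the rows of W span (a rotation of)
# the first n principal components.
print("subspace mismatch:", np.linalg.norm(projector(W) - projector(pcs)))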
Keywords :
data compression; learning systems; neural nets; computational experiments; connection weights; convergence stability; data compression tasks; delta rule; learning rule; optimal encoding; principal component analysis; self-organization; two-layer network architecture; weight vectors
Conference_Title :
1990 IJCNN International Joint Conference on Neural Networks
Conference_Location :
San Diego, CA, USA
DOI :
10.1109/IJCNN.1990.137731