DocumentCode :
276633
Title :
Error back propagation with minimum-entropy weights: a technique for better generalization of 2-D shift-invariant NNs
Author :
Zhang, Wei ; Hasegawa, Akio ; Itoh, Kazuyoshi ; Ichioka, Yoshiki
Author_Institution :
Dept. of Appl. Phys., Osaka Univ., Japan
Volume :
i
fYear :
1991
fDate :
8-14 Jul 1991
Firstpage :
645
Abstract :
For better generalization of shift-invariant neural networks, the authors propose a modified backpropagation learning rule that reduces the complexity of the neural network as a whole instead of removing particular hidden units. The complexity of the network is measured as the entropy of its connectivity pattern, and learning minimizes this measure together with the output error. An example of line-feature detection using a shift-invariant neural network is presented. Simulation results show that, after training at some discrete angles, the modified learning rule generalizes the network's response function to be independent of line angle.
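The abstract describes the learning rule only at a high level. Below is a minimal sketch of one plausible reading, assuming the complexity measure is the Shannon entropy of the normalized absolute weights and that it is added to the squared output error with a penalty weight lam; the function names, the single-layer setup, and the penalty weight are illustrative assumptions, not the paper's exact formulation.

import numpy as np

def weight_entropy(w, eps=1e-12):
    # Shannon entropy of the normalized absolute weights
    # (assumed reading of the paper's "entropy of the connectivity pattern").
    a = np.abs(w).ravel() + eps
    p = a / a.sum()
    return -np.sum(p * np.log(p))

def weight_entropy_grad(w, eps=1e-12):
    # Analytic gradient of weight_entropy with respect to w:
    # dH/d|w_k| = -(log p_k + H) / sum_j |w_j|.
    a = np.abs(w).ravel() + eps
    s = a.sum()
    p = a / s
    h = -np.sum(p * np.log(p))
    g = -(np.log(p) + h) / s
    return (np.sign(w).ravel() * g).reshape(w.shape)

# Toy single-layer example: gradient descent on MSE + lam * entropy.
rng = np.random.default_rng(0)
X = rng.normal(size=(32, 8))
y = X @ rng.normal(size=(8,))      # synthetic linear target
w = rng.normal(size=(8,))
lam, lr = 0.01, 0.05               # lam is an assumed penalty weight

for step in range(200):
    err = X @ w - y
    grad_mse = 2 * X.T @ err / len(y)
    w -= lr * (grad_mse + lam * weight_entropy_grad(w))

print(f"final entropy: {weight_entropy(w):.3f}, "
      f"final mse: {np.mean((X @ w - y) ** 2):.4f}")

Driving the entropy term down pushes the weight distribution toward a few dominant connections, which is one way the "complexity of the network as a whole" could be reduced without pruning specific hidden units.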
Keywords :
computerised pattern recognition; entropy; learning systems; neural nets; 2D shift invariant neural nets; connectivity pattern; discrete angles; error backpropagation learning rule; generalization; line-feature detection; minimum-entropy weights; network complexity reduction; response function; simulation; training; Entropy; Feedforward neural networks; Measurement units; Neural networks; Physics;
fLanguage :
English
Publisher :
ieee
Conference_Title :
IJCNN-91-Seattle: International Joint Conference on Neural Networks, 1991
Conference_Location :
Seattle, WA
Print_ISBN :
0-7803-0164-1
Type :
conf
DOI :
10.1109/IJCNN.1991.155255
Filename :
155255