DocumentCode
768372
Title
Benefits of gain: speeded learning and minimal hidden layers in back-propagation networks
Author
Kruschke, John K.; Movellan, Javier R.
Author_Institution
Dept. of Psychology, Univ. of California, Berkeley, CA, USA
Volume
21
Issue
1
fYear
1991
Firstpage
273
Lastpage
280
Abstract
The gain of a node in a connectionist network is a multiplicative constant that amplifies or attenuates the net input to the node. The benefits of adaptive gains in back-propagation networks are explored. It is shown that gradient descent with respect to gain greatly increases learning speed by amplifying those directions in weight space that are successfully chosen by gradient descent on weights. Adaptive gains also allow normalization of weight vectors without loss of computational capacity, and the authors suggest a simple modification of the learning rule that automatically achieves weight normalization. A method for creating small hidden layers by making hidden node gains compete according to similarities between nodes, in an effort to improve generalization performance, is described. Simulations show that this competition method is more effective than the special case of gain decay.
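To make the gain mechanism concrete, the following is a minimal Python sketch, not the authors' code: a single sigmoid unit whose activation is sigmoid(g * (w @ x)), trained by gradient descent on both the weights w and the gain g, with the weight vector renormalized after each step so that the gain carries the vector's length, as the abstract describes. The function name train_step, the learning rates, and the toy data are illustrative assumptions.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_step(w, g, x, t, lr_w=0.1, lr_g=0.1):
    # One gradient-descent step on squared error E = 0.5 * (t - a)**2,
    # where the node's activation is a = sigmoid(g * (w @ x)).
    net = w @ x                      # net input, before the gain
    a = sigmoid(g * net)             # the gain multiplies the net input
    delta = (t - a) * a * (1.0 - a)  # -dE/d(g*net); sigmoid'(z) = a*(1 - a)
    w = w + lr_w * delta * g * x     # weight gradient is scaled by the gain
    g = g + lr_g * delta * net       # gradient descent with respect to gain
    # Fold the weight vector's length into the gain: the activation
    # sigmoid((g*|w|) * ((w/|w|) @ x)) is unchanged, so normalizing the
    # weights costs no computational capacity.
    norm = np.linalg.norm(w)
    return w / norm, g * norm

rng = np.random.default_rng(0)
w, g = rng.normal(size=3), 1.0
x, t = np.array([1.0, 0.5, -0.5]), 1.0
for _ in range(100):
    w, g = train_step(w, g, x, t)
print(w, g, sigmoid(g * (w @ x)))    # activation approaches the target t

In this toy run the gain grows along the direction that gradient descent on the weights is already exploiting, which is the source of the speed-up the abstract reports.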
Keywords
learning systems; neural nets; adaptive gains; back-propagation networks; connectionist network; gradient descent; minimal hidden layers; speeded learning; Computational modeling; Feedforward systems; Intelligent networks; Performance gain; Psychology
fLanguage
English
Journal_Title
IEEE Transactions on Systems, Man, and Cybernetics
Publisher
IEEE
ISSN
0018-9472
Type
jour
DOI
10.1109/21.101159
Filename
101159