DocumentCode
288352
Title
Introducing invariance: a principled approach to weight sharing
Author
Shawe-Taylor, John
Author_Institution
Dept. of Comput. Sci., London Univ., UK
Volume
1
fYear
1994
fDate
27 Jun-2 Jul 1994
Firstpage
345
Abstract
The paper describes a framework for addressing the training problem of multi-layer perceptrons through a principled introduction of weight sharing. The technique not only reduces the size of the class from which the learning algorithm must select its hypothesis, but also reduces the number of examples required for a given level of generalization. The question of assessing the functionality of the weight-sharing network is also addressed, with a view to ensuring that the introduced weight constraints have not excluded the target functions of the learning task.
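A minimal sketch (not the paper's construction) of why weight sharing shrinks the hypothesis class: tying one small kernel across input positions, as in a 1-D convolution encoding translation invariance, leaves far fewer free parameters than an unconstrained dense layer. The layer sizes and kernel width below are illustrative assumptions.

```python
# Illustrative sketch of weight sharing, assuming a translation invariance;
# not the method proposed in the paper.
import numpy as np

rng = np.random.default_rng(0)

n_in, n_out, k = 16, 14, 3          # input size, output size, shared-kernel width (assumed values)

# Unconstrained dense layer: every connection carries its own weight.
W_dense = rng.normal(size=(n_out, n_in))
dense_params = W_dense.size         # 14 * 16 = 224 free parameters

# Weight-shared layer: the same k weights are reused at every output position,
# so the invariance constraint leaves only k free parameters.
w_shared = rng.normal(size=k)
shared_params = w_shared.size       # 3 free parameters

def shared_layer(x, w):
    """Apply the shared kernel at each position (valid 1-D convolution)."""
    width = len(w)
    return np.array([x[i:i + width] @ w for i in range(len(x) - width + 1)])

x = rng.normal(size=n_in)
y = shared_layer(x, w_shared)       # output length n_in - k + 1 = 14

print(f"dense parameters:  {dense_params}")
print(f"shared parameters: {shared_params}")
```

The smaller parameter count is the sense in which the constrained class is easier to search and needs fewer examples for a given level of generalization, provided the constraint has not excluded the target function.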
Keywords
learning (artificial intelligence); multilayer perceptrons; neural nets; functionality; invariance; learning algorithm; principled approach; target functions; training problem; weight constraints; weight sharing; Computer science; Feedforward neural networks; Handwriting recognition; Large-scale systems; Multilayer perceptrons; Neural networks
fLanguage
English
Publisher
ieee
Conference_Title
1994 IEEE International Conference on Neural Networks (ICNN'94), IEEE World Congress on Computational Intelligence
Conference_Location
Orlando, FL
Print_ISBN
0-7803-1901-X
Type
conf
DOI
10.1109/ICNN.1994.374187
Filename
374187