DocumentCode :
1169237
Title :
Two original weight pruning methods based on statistical tests and rounding techniques
Author :
Ledoux, C. ; Grandin, J.F.
Author_Institution :
INRETS-MAIA, Arcueil, France
Volume :
141
Issue :
4
fYear :
1994
fDate :
8/1/1994
Firstpage :
230
Lastpage :
237
Abstract :
The authors focus on the use of neural networks to approximate continuous decision functions. In this context, the parameters to be estimated are the synaptic weights of the network. The number of such parameters and the quantity of data (information) available for training greatly influence the quality of the solution obtained. A previous study analysed the influence and interaction of these two features. To reach the network architecture that best fits the training data, two original pruning techniques are proposed. The evolution of the neural network performance, in terms of training and test rates, as the number of pruned synaptic weights increases is shown experimentally. Two kinds of synaptic weights emerge: irrelevant synaptic weights, which can be suppressed from the model, and relevant synaptic weights, which cannot be removed. In the test problem, it is possible to reduce the size of the network by up to 42%, and a 4% improvement in generalisation performance is observed.
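The two pruning families named in the title can be illustrated with a minimal sketch. This is a generic stand-in, not the paper's actual procedures: the significance test here is a simple z-like threshold against an assumed per-weight standard-error estimate `sigma_hat`, and the rounding rule snaps weights to a grid so that small weights fall to zero; both `sigma_hat` and `step` are hypothetical parameters introduced for illustration.

```python
import numpy as np

def prune_by_test(w, sigma_hat, z=1.96):
    """Statistical-test-style pruning (illustrative, not the paper's test):
    keep w_i only if |w_i| exceeds z times its estimated standard error,
    i.e. only if the weight is distinguishable from zero."""
    return np.where(np.abs(w) > z * sigma_hat, w, 0.0)

def prune_by_rounding(w, step=0.05):
    """Rounding-style pruning: snap each weight to the nearest multiple
    of `step`; weights with |w| < step/2 round to zero and are pruned."""
    return np.round(w / step) * step

# Example: small weights (0.01, 0.02) are removed by both rules.
w = np.array([0.8, 0.01, -0.5, 0.02])
sigma_hat = np.full(4, 0.05)
print(prune_by_test(w, sigma_hat))   # small weights zeroed
print(prune_by_rounding(w))          # small weights zeroed
```

After either pass, the surviving nonzero weights define the reduced architecture; retraining the smaller network and comparing test rates mirrors the experimental protocol the abstract describes.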
Keywords :
decision theory; learning (artificial intelligence); neural nets; parameter estimation; statistical analysis; best fit; continuous decision functions approximation; neural networks; performance; quantity of data; rounding techniques; statistical tests; synaptic weights; test problem; test rates; training; weight pruning methods;
fLanguage :
English
Journal_Title :
IEE Proceedings - Vision, Image and Signal Processing
Publisher :
IET
ISSN :
1350-245X
Type :
jour
DOI :
10.1049/ip-vis:19941328
Filename :
318025