DocumentCode :
286756
Title :
Comparing parameters selection methods and weights rounding techniques to optimize the learning in neural networks
Author :
Grandin, J.F. ; Braban, B. ; Ledoux, C. ; Halioua, A.
Author_Institution :
Thomson-CSF, RCM, Paris, France
fYear :
1993
fDate :
25-27 May 1993
Firstpage :
46
Lastpage :
50
Abstract :
Neural network techniques can be used for the approximation of decision functions. In such a case, the function's parameters to be estimated are the synaptic weights of the network. A small amount of data available for training limits the number of parameters that can be correctly estimated. Furthermore, not all weights are necessarily significant. An interesting point is to be able to decide which weights are useful in the network. Selecting the useful weights and suppressing the others can improve the quality of generalization. Another approach consists in evaluating the precision that has been reached for each weight through the learning phase. Weight rounding techniques can then be applied to remove the noise introduced by nonsignificant bits. This may be seen as a refinement of weight selection and suppression. This paper proposes a comparison of selection methods and a weight rounding technique to optimize learning in the case of function approximation.
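The abstract combines two ideas: suppressing weights judged not useful, and rounding the remaining weights so that bits below the precision actually reached during learning are discarded. The paper's exact selection criterion and rounding rule are not given in this record; the sketch below is a minimal, assumed illustration using magnitude-based selection and uniform fixed-point rounding.

```python
import numpy as np

def select_weights(weights, threshold):
    """Suppress weights whose magnitude falls below a threshold.

    Returns the pruned weights and a mask marking the retained (useful) entries.
    (Assumed magnitude criterion; the paper may use a different selection method.)
    """
    mask = np.abs(weights) >= threshold
    return weights * mask, mask

def round_weights(weights, n_bits, w_max=None):
    """Round weights to n_bits of precision, removing nonsignificant bits.

    Uses a uniform fixed-point grid over [-w_max, w_max] (an assumption).
    """
    if w_max is None:
        w_max = np.max(np.abs(weights))
    step = 2.0 * w_max / (2 ** n_bits - 1)  # quantisation step size
    return np.round(weights / step) * step

# Example: prune a small weight matrix, then round the survivors to 4 bits.
w = np.array([[0.73, -0.02, 0.41],
              [-0.005, 0.58, -0.19]])
pruned, kept = select_weights(w, threshold=0.05)
quantised = round_weights(pruned, n_bits=4)
print(kept)        # which weights were judged significant
print(quantised)   # remaining weights rounded to 4-bit precision
```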
Keywords :
function approximation; learning (artificial intelligence); neural nets; decision function approximation; learning; neural networks; parameter reduction; parameters selection methods; weight suppression; weights rounding;
fLanguage :
English
Publisher :
IET
Conference_Titel :
Third International Conference on Artificial Neural Networks, 1993
Conference_Location :
Brighton
Print_ISBN :
0-85296-573-7
Type :
conf
Filename :
263259