DocumentCode :
350997
Title :
Experimental study on the precision requirements of RBF, RPROP and BPTT training
Author :
Vollmer, Urs ; Strey, Alfred
Author_Institution :
Dept. of Neural Inf. Process., Ulm Univ., Germany
Volume :
1
fYear :
1999
fDate :
1999
Firstpage :
239
Abstract :
Most neurocomputer architectures support only fixed-point arithmetic, which allows a higher degree of VLSI integration but limits the range and precision of all variables. Up to now, the effect of this limitation on neural network training algorithms has been studied only for standard models such as SOM or BP. This paper presents the results of an experimental study in which the precision requirements of three other learning algorithms (RBF, RPROP and BPTT) were investigated on exemplary tasks. While the key variables of the RBF and BPTT algorithms required more than 16 bits during training to solve the selected problems, the RPROP algorithm achieved good results with far fewer than 16 bits.
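(Illustration, not from the paper: the study simulates limited fixed-point precision in training variables. A minimal sketch of such a simulation is shown below; the bit split between integer and fractional parts, the RPROP factors, and all variable names are assumptions chosen for illustration only.)

```python
import numpy as np

def to_fixed_point(x, int_bits=4, frac_bits=11):
    """Quantize a float array onto a signed fixed-point grid.

    Hypothetical format: 1 sign bit + int_bits + frac_bits
    (16 bits total with the defaults). The paper does not specify
    the exact format, so this split is purely illustrative.
    """
    scale = 2.0 ** frac_bits
    max_val = 2.0 ** int_bits - 1.0 / scale      # largest representable value
    min_val = -(2.0 ** int_bits)                 # smallest representable value
    q = np.round(x * scale) / scale              # round to the nearest grid point
    return np.clip(q, min_val, max_val)          # saturate on overflow

# Example: a limited-precision RPROP-style step-size update, where every
# intermediate result is re-quantized to mimic a fixed-point datapath.
rng = np.random.default_rng(0)
delta = to_fixed_point(rng.uniform(0.01, 0.1, size=8))   # per-weight step sizes
sign_change = rng.choice([True, False], size=8)          # gradient sign flips (dummy)
eta_plus, eta_minus = 1.2, 0.5                           # standard RPROP factors
delta = to_fixed_point(np.where(sign_change, delta * eta_minus, delta * eta_plus))
print(delta)
```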
Keywords :
radial basis function networks; BPTT; RBF; RPROP; VLSI integration; backpropagation through time; neural network training algorithms; neurocomputer architectures; precision requirements; resilient propagation;
fLanguage :
English
Publisher :
iet
Conference_Titel :
Artificial Neural Networks, 1999. ICANN 99. Ninth International Conference on (Conf. Publ. No. 470)
Conference_Location :
Edinburgh
ISSN :
0537-9989
Print_ISBN :
0-85296-721-7
Type :
conf
DOI :
10.1049/cp:19991115
Filename :
819727