Title :
Experimental study on the precision requirements of RBF, RPROP and BPTT training
Author :
Vollmer, Urs ; Strey, Alfred
Author_Institution :
Dept. of Neural Inf. Process., Ulm Univ., Germany
Abstract :
Most neurocomputer architectures support only fixed-point arithmetic, which allows a higher degree of VLSI integration but limits the range and precision of all variables. Up to now, the effect of this limitation on neural network training algorithms has been studied only for standard models like SOM or BP. This paper presents the results of an experimental study in which the precision requirements of three other learning algorithms (RBF, RPROP and BPTT) were investigated on exemplary tasks. While the key variables of RBF and BPTT training required more than 16 bits to solve the selected problems, the RPROP algorithm showed good results with far fewer than 16 bits.
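The following minimal sketch (not taken from the paper) illustrates the fixed-point limitation the abstract refers to: when a variable is stored with a fixed number of fractional bits, small weight updates can be rounded away entirely. The bit widths and helper function are hypothetical examples for illustration only, not the paper's experimental settings.

```python
import numpy as np

def to_fixed_point(x, total_bits=16, frac_bits=8):
    """Quantize x to a signed fixed-point value with the given widths."""
    scale = 2 ** frac_bits
    # Representable integer range for a signed value of total_bits width.
    q_min = -(2 ** (total_bits - 1))
    q_max = 2 ** (total_bits - 1) - 1
    q = np.clip(np.round(x * scale), q_min, q_max)
    return q / scale  # back to a real value, now with limited precision

# Example: a small weight update vanishes when the fractional width is too narrow.
w = 0.5
update = 1e-4
print(to_fixed_point(w + update, total_bits=16, frac_bits=8))   # update lost (0.5)
print(to_fixed_point(w + update, total_bits=16, frac_bits=14))  # update retained
```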
Keywords :
radial basis function networks; BPTT; RBF; RPROP; VLSI integration; backpropagation through time; neural network training algorithms; neurocomputer architectures; precision requirements; resilient propagation;
Conference_Titel :
Artificial Neural Networks, 1999. ICANN 99. Ninth International Conference on (Conf. Publ. No. 470)
Conference_Location :
Edinburgh
Print_ISBN :
0-85296-721-7
DOI :
10.1049/cp:19991115