DocumentCode :
820899
Title :
Learning with limited numerical precision using the cascade-correlation algorithm
Author :
Hoehfeld, Markus ; Fahlman, Scott E.
Author_Institution :
Sch. of Comput. Sci., Carnegie Mellon Univ., Pittsburgh, PA, USA
Volume :
3
Issue :
4
fYear :
1992
fDate :
July 1992
Firstpage :
602
Lastpage :
611
Abstract :
A key question in the design of specialized hardware for simulating neural networks is whether fixed-point arithmetic of limited numerical precision can be used with existing learning algorithms. An empirical study of the effects of limited precision in cascade-correlation networks on three different learning problems is presented. It is shown that learning can fail abruptly when the precision of network weights or weight-update calculations is reduced below a certain level, typically about 13 bits including the sign. Techniques for dynamic rescaling and probabilistic rounding are introduced that allow reliable convergence down to 7 bits of precision or less, with only a small and gradual reduction in the quality of the solutions.
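The probabilistic rounding mentioned in the abstract can be illustrated with a short sketch. The idea (often called stochastic rounding) is to round a value to one of its two neighboring fixed-point grid points with probability proportional to proximity, so that rounding is unbiased in expectation and small weight updates are not systematically lost. The function name and interface below are illustrative, not taken from the paper:

```python
import random

def stochastic_round(x, frac_bits):
    """Round x onto a fixed-point grid with `frac_bits` fractional bits.

    Rounds up or down probabilistically, with probability proportional
    to the distance from each neighboring grid point, so the expected
    value of the result equals x (unbiased rounding).
    """
    scale = 1 << frac_bits            # grid spacing is 1/scale
    scaled = x * scale
    lower = int(scaled // 1)          # lower grid point, in scaled units
    frac = scaled - lower             # distance above lower point, in [0, 1)
    if random.random() < frac:        # round up with probability `frac`
        lower += 1
    return lower / scale
```

For example, with 3 fractional bits the value 0.3 lands between the grid points 0.25 and 0.375 and is rounded to 0.375 roughly 40% of the time; averaged over many updates, the quantized weight tracks the true value even when individual updates are smaller than the grid spacing.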
Keywords :
correlation methods; learning systems; neural nets; cascade-correlation algorithm; dynamic rescaling; fixed-point arithmetic; learning algorithms; limited numerical precision; neural networks; probabilistic rounding; weight-update calculations; Algorithm design and analysis; Arithmetic; Computational modeling; Computer architecture; Computer networks; Computer science; Computer simulation; Neural network hardware; Neural networks; Predictive models;
fLanguage :
English
Journal_Title :
IEEE Transactions on Neural Networks
Publisher :
IEEE
ISSN :
1045-9227
Type :
jour
DOI :
10.1109/72.143374
Filename :
143374