DocumentCode :
2703413
Title :
Back propagation simulations using limited precision calculations
Author :
Holt, Jordan L. ; Baker, Thomas E.
Author_Institution :
University of Washington, Seattle, WA, USA
fYear :
1991
fDate :
8-14 Jul 1991
Firstpage :
121
Abstract :
The precision required for neural net algorithms is an important question facing hardware architects. The authors present simulation results that compare floating point and limited precision integer back-propagation simulators. Data sets from the neural network benchmark suite maintained by Carnegie Mellon University were used to compare integer and floating point implementations. The simulation results indicate that integer computation works quite well for the back-propagation algorithm. In all cases except one, the limited precision integer simulations performed as well as the floating point simulations. The effect of reducing the precision of the trained weights is also reported.
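The core idea in the abstract, replacing floating point arithmetic with limited-precision fixed-point values and checking how much accuracy survives, can be illustrated with a minimal sketch. The `quantize` helper, the bit widths, and the toy single-layer network below are illustrative assumptions, not the authors' simulator or benchmark setup.

```python
import numpy as np

def quantize(w, frac_bits=8):
    """Snap values to a fixed-point grid with `frac_bits` fractional bits,
    approximating limited-precision integer storage of the weights."""
    scale = 1 << frac_bits
    return np.round(w * scale) / scale

# Hypothetical trained weights and input for a toy single-layer network.
rng = np.random.default_rng(0)
W = rng.normal(size=(4, 3))
x = rng.normal(size=3)

# Compare the full-precision forward pass against one whose weights
# were reduced to a small number of fractional bits.
full = np.tanh(W @ x)
low = np.tanh(quantize(W, frac_bits=4) @ x)
print("max abs output difference:", np.max(np.abs(full - low)))
```

Sweeping `frac_bits` downward in a loop like this is one simple way to probe the effect of reducing the precision of trained weights that the abstract mentions.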
Keywords :
digital arithmetic; learning systems; neural nets; virtual machines; back-propagation simulators; floating point arithmetic; integer computation; limited precision calculations; neural net algorithms; Algorithm design and analysis; Artificial neural networks; Backpropagation algorithms; Computational modeling; Computer networks; Convergence; Databases; Neural network hardware; Neural networks; Testing
fLanguage :
English
Publisher :
ieee
Conference_Titel :
IJCNN-91-Seattle International Joint Conference on Neural Networks, 1991
Conference_Location :
Seattle, WA
Print_ISBN :
0-7803-0164-1
Type :
conf
DOI :
10.1109/IJCNN.1991.155324
Filename :
155324