DocumentCode :
3147032
Title :
Finite precision error analysis for neural network learning
Author :
Holt, Jordan L. ; Hwang, Jenq-Neng
Author_Institution :
Dept. of Electrical Engineering, University of Washington, Seattle, WA, USA
fYear :
1991
fDate :
23-26 Jul 1991
Firstpage :
237
Lastpage :
241
Abstract :
The high speed desired in the implementation of many neural network algorithms, such as backpropagation learning in a multilayer perceptron (MLP), may be attained through the use of finite-precision hardware. Finite-precision hardware, however, is prone to errors. A method of theoretically deriving and statistically evaluating this error is presented; it can serve as a guide to the details of hardware design and algorithm implementation. The paper covers the derivation of the techniques involved as well as the details of the backpropagation example. The intent is to provide a general framework by which most neural network algorithms, under any set of hardware constraints, may be evaluated.
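To make the abstract's idea concrete, here is a minimal sketch of an empirical finite-precision comparison: a full-precision forward pass of a small MLP against one whose weights and intermediate results are quantized to b-bit fixed point, with simple error statistics per bit width. This is an illustration only; the quantization model, network sizes, and all names are assumptions, not the authors' derivation.

import numpy as np

rng = np.random.default_rng(0)

def quantize(x, bits=8, clip=1.0):
    """Round x onto a b-bit fixed-point grid covering [-clip, clip]."""
    step = 2.0 * clip / (2 ** bits)            # quantization step size
    return np.clip(np.round(x / step) * step, -clip, clip)

def forward(x, W1, W2, sigma=np.tanh, q=lambda v: v):
    """One-hidden-layer MLP; q() quantizes each intermediate result."""
    h = q(sigma(q(W1 @ x)))                    # hidden activations
    return q(sigma(q(W2 @ h)))                 # output activations

# Random network and inputs (illustrative sizes).
W1 = rng.uniform(-0.5, 0.5, (16, 8))
W2 = rng.uniform(-0.5, 0.5, (4, 16))
X = rng.uniform(-1, 1, (8, 1000))

for bits in (4, 8, 12, 16):
    q = lambda v, b=bits: quantize(v, bits=b)
    Wq1, Wq2 = quantize(W1, bits), quantize(W2, bits)
    err = forward(X, W1, W2) - forward(X, Wq1, Wq2, q=q)
    print(f"{bits:2d} bits: mean |error| = {np.abs(err).mean():.2e}, "
          f"std = {err.std():.2e}")

Running the sketch shows the output error shrinking as the bit width grows, which is the kind of precision-versus-error trade-off the paper's statistical analysis is meant to predict analytically rather than by simulation.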
Keywords :
backpropagation; error analysis; neural nets; AI; algorithms; design; finite precision hardware; learning; multilayer perceptron; neural network learning; Computer networks; Error correction; Information processing; Laboratories; Multi-layer neural network; Multilayer perceptrons; Neural network hardware; Neural networks; Neurons
fLanguage :
English
Publisher :
ieee
Conference_Title :
Proceedings of the First International Forum on Applications of Neural Networks to Power Systems, 1991
Conference_Location :
Seattle, WA
Print_ISBN :
0-7803-0065-3
Type :
conf
DOI :
10.1109/ANN.1991.213471
Filename :
213471