Title :
Effects of limited precision weight values on the accuracy of feedforward networks
Abstract :
Summary form only given. In any implementation of an artificial neural network (ANN), it is important to assess the impact of quantization error on the system's accuracy and to establish that the network performs satisfactorily over a range of quantization levels. An ANN's ability to work with noisy data and damaged network components suggests that a network will possess a high tolerance to degradation, even with a very limited number of quantization levels. The effects of weight quantization on the output of a number of digital implementations of feedforward networks were investigated over a range of quantization levels. The results were obtained through a software simulation developed to determine how network accuracy depends on the number of bits used to represent interconnection weights, the training strategy, and the details of the implementation architecture.
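As a rough illustration of the effect studied in the abstract (this is a minimal sketch, not the authors' simulator), the snippet below uniformly quantizes the weights of a small feedforward network to a given number of bits and measures how the output drifts from the full-precision result. The network shapes, weight range, and random data are illustrative assumptions.

```python
import numpy as np

def quantize(w, bits, w_max=1.0):
    # Uniformly quantize weights to 2**bits levels spanning [-w_max, w_max].
    levels = 2 ** bits
    step = 2.0 * w_max / (levels - 1)
    return np.clip(np.round(w / step) * step, -w_max, w_max)

def forward(x, W1, W2):
    # Two-layer feedforward network with tanh activations.
    h = np.tanh(x @ W1)
    return np.tanh(h @ W2)

rng = np.random.default_rng(0)
W1 = rng.uniform(-1, 1, (8, 4))   # hidden-layer weights (illustrative sizes)
W2 = rng.uniform(-1, 1, (4, 1))   # output-layer weights
x = rng.uniform(-1, 1, (1, 8))    # a single input pattern

y_full = forward(x, W1, W2)
for bits in (2, 4, 8, 16):
    y_q = forward(x, quantize(W1, bits), quantize(W2, bits))
    err = float(np.abs(y_q - y_full).max())
    print(f"{bits:2d} bits: max output deviation = {err:.6f}")
```

Running this shows the output deviation shrinking as the bit width grows, which is the kind of accuracy-versus-quantization-level curve the paper's simulation sweeps.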
Keywords :
neural nets; feedforward networks; interconnection weight representation; network accuracy; neural network; quantization error; training strategy; weight quantization; Artificial neural networks; Computer architecture; Degradation; Noise level; Quantization;
Conference_Titel :
IJCNN-91-Seattle: International Joint Conference on Neural Networks, 1991
Conference_Location :
Seattle, WA
Print_ISBN :
0-7803-0164-1
DOI :
10.1109/IJCNN.1991.155644