DocumentCode :
3394736
Title :
Quantization noise improvement in a distributed-neuron architecture
Author :
Djahanshahi, H. ; MacLean, B. ; Ahmadi, M. ; Jullien, G.A. ; Miller, W.C.
Author_Institution :
Dept. of Electr. Eng., Windsor Univ., Ont., Canada
Volume :
2
Year :
1997
Date :
3-6 Aug. 1997
Firstpage :
1282
Abstract :
In conventional sigmoidal neural networks with lumped neurons, the effect of weight quantization becomes more apparent at the output as the network becomes larger. It is shown here, however, using a statistical approach, that the self-scaling property of a special hardware architecture with distributed neurons reduces the effect of quantization noise as the number of neuron inputs increases.
Keywords :
neural chips; neural net architecture; quantisation (signal); statistical analysis; distributed-neuron architecture; hardware architecture; neural networks; neuron inputs; quantization noise improvement; self-scaling property; statistical approach; Dynamic range; Intelligent networks; Multi-layer neural network; Neural network hardware; Neural networks; Neurons; Noise reduction; Quantization; Signal to noise ratio; Statistical analysis;
Language :
English
Publisher :
IEEE
Conference_Title :
Proceedings of the 40th Midwest Symposium on Circuits and Systems, 1997
Print_ISBN :
0-7803-3694-1
Type :
conf
DOI :
10.1109/MWSCAS.1997.662315
Filename :
662315