DocumentCode :
1611716
Title :
Characterization of artificial neural network algorithms
Author :
Baker, Tom ; Hammerstrom, Dan
Author_Institution :
Dept. of Comput. Sci. & Eng., Oregon Grad. Center, Beaverton, OR, USA
fYear :
1989
Firstpage :
78
Abstract :
Tradeoffs must be made when artificial neural network models are implemented efficiently. One popular model, the back-propagation algorithm, promises to be a powerful and flexible learning method. The effects on its performance when the model is modified for efficient hardware implementation are discussed. The modifications examined concern limited-precision architectures, sign/threshold propagation, accumulated weight changes, and the addition of noise. It is found that reduced-precision computation can be used successfully with the back-propagation algorithm, that inter-processor communication can be reduced when propagating the weights, that accumulating the weight changes can improve the execution time of the algorithm, and that noise can have a positive effect on learning.
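A minimal sketch of the four modifications named in the abstract is given below. It is not the authors' implementation: the task (XOR), network size, bit width, learning rate, and noise level are all illustrative assumptions, and sign-only propagation of the error signal stands in here for the paper's sign/threshold propagation.

# Hypothetical sketch (Python/NumPy), not from the paper: a tiny
# back-propagation network combining limited-precision weights,
# sign-only error propagation, batch-accumulated weight changes,
# and additive noise.
import numpy as np

rng = np.random.default_rng(0)

def quantize(x, bits=8, scale=4.0):
    # Limited precision: snap values onto a fixed-point grid,
    # mimicking reduced-precision hardware arithmetic.
    step = scale / 2 ** (bits - 1)
    return np.clip(np.round(x / step) * step, -scale, scale - step)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# XOR as a stand-in task (an assumption, not from the paper).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = quantize(rng.normal(0.0, 0.5, (2, 4)))
W2 = quantize(rng.normal(0.0, 0.5, (4, 1)))
lr = 0.3  # arbitrary choice for the sketch

for epoch in range(5000):
    # Forward pass with quantized weights.
    h = sigmoid(X @ W1)
    out = sigmoid(h @ W2)

    # Backward pass: propagate only the SIGN of the output error,
    # a cheap stand-in for reduced inter-processor communication.
    err = y - out
    d_out = np.sign(err) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Accumulate the weight changes over the whole batch and apply
    # them once, with a little additive noise on each update
    # (noise level is an arbitrary assumption).
    dW2 = h.T @ d_out
    dW1 = X.T @ d_h
    W2 = quantize(W2 + lr * dW2 + rng.normal(0.0, 0.01, W2.shape))
    W1 = quantize(W1 + lr * dW1 + rng.normal(0.0, 0.01, W1.shape))

print(np.round(out, 2))  # network outputs after training (targets: 0, 1, 1, 0)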
Keywords :
learning systems; neural nets; parallel architectures; addition of noise; artificial neural network algorithms; artificial neural network models; back-propagation algorithm; communication between processors; design tradeoffs; execution time; hardware implementation; learning model; limited precision architectures; modifications; performance; reduced precision computation; sign propagation; sum weight changes; threshold propagation; weights propagation; Application software; Artificial neural networks; Biological system modeling; Biology computing; Computational modeling; Computer architecture; Computer science; Hardware; Neurons; Semiconductor device modeling;
fLanguage :
English
Publisher :
IEEE
Conference_Title :
1989 IEEE International Symposium on Circuits and Systems
Conference_Location :
Portland, OR
Type :
conf
DOI :
10.1109/ISCAS.1989.100296
Filename :
100296