DocumentCode
2260331
Title
Backpropagation algorithm for logic oriented neural networks
Author
Kamio, Takeshi; Tanaka, Shinichiro; Morisue, Mititada
Author_Institution
Hiroshima City Univ., Hiroshima, Japan
Volume
2
fYear
2000
fDate
2000
Firstpage
123
Abstract
The multilayer feedforward neural network (MFNN) trained by the backpropagation (BP) algorithm is one of the most significant models in artificial neural networks. Although MFNNs have been implemented as analog, mixed analog-digital, and fully digital VLSI circuits, it remains difficult to realize a hardware implementation that includes the BP learning function. This paper describes a BP algorithm for the logic oriented neural network (LOGO-NN), which we have proposed as a kind of MFNN with quantized weights and multilevel threshold neurons. Since both weights and neuron outputs are quantized to integer values in LOGO-NNs, LOGO-NNs with BP learning are expected to be implemented more effectively in hardware than common MFNNs. Finally, simulations show that the proposed BP algorithm achieves good performance for LOGO-NNs.
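The abstract states that LOGO-NNs quantize both weights and neuron outputs to integers, but this record does not give the paper's actual update rule. Below is a minimal sketch of one common way to backpropagate through such quantization: real-valued "shadow" weights that are rounded to integers on the forward pass, with a straight-through surrogate gradient through the rounding. This is not the authors' algorithm; the weight range, number of output levels, and all function names are illustrative assumptions.

```python
# Hedged sketch: BP through quantized weights and multilevel threshold neurons.
# Assumptions (not from the paper): weights are integers in [-7, 7], neuron
# outputs take 8 integer levels {0, ..., 7}, and gradients pass through the
# quantizers via a straight-through estimator (STE).
import numpy as np

rng = np.random.default_rng(0)

W_MAX = 7      # assumed integer weight range [-W_MAX, W_MAX]
N_LEVELS = 8   # assumed number of neuron output levels {0, ..., N_LEVELS-1}

def quantize_weights(w):
    """Round real-valued shadow weights to integers in [-W_MAX, W_MAX]."""
    return np.clip(np.rint(w), -W_MAX, W_MAX)

def multilevel_threshold(x):
    """Multilevel threshold neuron: round activation to an integer level."""
    return np.clip(np.rint(x), 0, N_LEVELS - 1)

def ste_grad(x):
    """STE derivative of clip(rint(x)): 1 inside the unclipped range, else 0."""
    return ((x > -0.5) & (x < N_LEVELS - 0.5)).astype(float)

# Toy task: XOR, with inputs and targets already on the integer grid.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

# Real-valued shadow weights; only their quantized copies are used forward.
w1 = rng.normal(0.0, 2.0, (2, 4))
w2 = rng.normal(0.0, 2.0, (4, 1))
lr = 0.05

for _ in range(5000):
    q1, q2 = quantize_weights(w1), quantize_weights(w2)
    a1 = X @ q1                   # hidden pre-activations
    h = multilevel_threshold(a1)  # integer hidden outputs
    a2 = h @ q2
    y = multilevel_threshold(a2)  # integer network output

    # Backprop with straight-through gradients through both quantizers.
    d2 = (y - T) * ste_grad(a2)
    d1 = (d2 @ q2.T) * ste_grad(a1)
    w2 -= lr * (h.T @ d2)         # update the shadow weights, not q1/q2
    w1 -= lr * (X.T @ d1)

print("quantized weights:", quantize_weights(w1), quantize_weights(w2), sep="\n")
print("outputs:", multilevel_threshold(
    multilevel_threshold(X @ quantize_weights(w1)) @ quantize_weights(w2)).ravel())
```

After training, the integer-valued weights and activations are the only quantities a forward pass needs, which is the hardware-efficiency argument the abstract makes; the shadow weights exist only during learning.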
Keywords
backpropagation; convergence; feedforward neural nets; formal logic; pattern recognition; feedforward neural network; learning function; logic oriented neural networks; multilevel threshold neurons; quantized weights; Analog-digital conversion; Artificial neural networks; Backpropagation algorithms; Circuits; Feedforward neural networks; Logic; Multi-layer neural network; Neural networks; Neurons; Very large scale integration
fLanguage
English
Publisher
ieee
Conference_Title
Proceedings of the IEEE-INNS-ENNS International Joint Conference on Neural Networks (IJCNN 2000)
Conference_Location
Como, Italy
ISSN
1098-7576
Print_ISBN
0-7695-0619-4
Type
conf
DOI
10.1109/IJCNN.2000.857885
Filename
857885