DocumentCode :
2831789
Title :
A training algorithm for discrete multilayer perceptrons
Author :
Park, Sungkwon ; Kim, Jung H. ; Chung, Ho-Sun
Author_Institution :
Dept. of Electr. Eng., Tennessee Technol. Univ., Cookeville, TN, USA
fYear :
1991
fDate :
11-14 Jun 1991
Firstpage :
1493
Abstract :
A learning algorithm for discrete multilayer perceptrons for binary patterns which guarantees convergence is introduced. Only two layers (one hidden layer) are required for binary patterns. Neurons in the hidden layer are developed as necessary through unsupervised learning. The computational cost is much lower than that of the backpropagation algorithm. The networks use neurons with hard-limiter activation functions and integer weights and thresholds. Hence, accurate hardware implementation of trained networks can be easily realized using readily available VLSI technology.
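Illustrative sketch (not the paper's training algorithm, which is not detailed in this record): a forward pass through a two-layer discrete perceptron with integer weights, integer thresholds, and hard-limiter activations, as the abstract describes. The hidden-layer size and the hand-chosen XOR weights below are assumptions for illustration only.

    import numpy as np

    def hard_limiter(x):
        # Hard-limiter (step) activation: 1 if the net input is non-negative, else 0.
        return (x >= 0).astype(int)

    def discrete_mlp_forward(x, W_hidden, t_hidden, w_out, t_out):
        # x: binary input vector; all weights and thresholds are integers.
        h = hard_limiter(W_hidden @ x - t_hidden)   # hidden layer of hard-limiter neurons
        y = hard_limiter(w_out @ h - t_out)         # single binary output neuron
        return y

    # Example: XOR realized with two hidden neurons and integer parameters (hypothetical).
    W_hidden = np.array([[1, 1], [1, 1]])
    t_hidden = np.array([1, 2])          # h1 fires on OR, h2 fires on AND
    w_out    = np.array([1, -2])
    t_out    = 1
    for x in ([0, 0], [0, 1], [1, 0], [1, 1]):
        print(x, discrete_mlp_forward(np.array(x), W_hidden, t_hidden, w_out, t_out))

Because every parameter is an integer and the activation is a hard limiter, such a trained network maps directly onto digital hardware, which is the implementation advantage the abstract claims.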
Keywords :
convergence; learning systems; neural nets; VLSI technology; activation functions; binary patterns; convergence; discrete multilayer perceptrons; hard limiters; hardware implementation; hidden layer; integer weights; learning algorithm; trained networks; training algorithm; Algorithm design and analysis; Equations; Hardware; Hypercubes; Multilayer perceptrons; Neurons; Nonhomogeneous media; Pattern analysis; Very large scale integration;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Circuits and Systems, 1991. IEEE International Symposium on
Print_ISBN :
0-7803-0050-5
Type :
conf
DOI :
10.1109/ISCAS.1991.176658
Filename :
176658