DocumentCode :
2831511
Title :
Parallel distributed processing with multiple one-output back-propagation neural networks
Author :
Jou, I-Chang ; Tsay, Yuh-Jiuan ; Tsay, Shuh-Chuan ; Wu, Quen-Zong ; Yu, Shih-Shien
Author_Institution :
Telecommun. Lab., MOC, Chung-Li, Taiwan
fYear :
1991
fDate :
11-14 Jun 1991
Firstpage :
1408
Abstract :
A novel neural-network architecture with a distributed structure is presented, in which each class in the application is assigned its own one-output backpropagation subnetwork. This one-net-one-class architecture overcomes a drawback of conventional backpropagation architectures, which must be completely retrained whenever a class is added. The architecture features fully parallel distributed processing: the network comprises subnetworks, each a single-output two-layer backpropagation network that can be trained and retrieved in parallel and independently. The proposed architecture also converges rapidly in both the training phase and the retrieving phase.
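The abstract's one-net-one-class idea can be sketched as follows: one single-output, two-layer backpropagation subnetwork per class, each trained independently on a 1-versus-rest target, so adding a class trains only one new subnetwork. This is an illustrative sketch only; all class names, hyperparameters, and the toy data are assumptions, not the authors' implementation.

```python
import numpy as np

class OneOutputSubnet:
    """A single-output network with one hidden layer, trained by backprop.

    Illustrative stand-in for the paper's one-output subnetwork; the
    layer sizes and learning rate are arbitrary assumptions."""
    def __init__(self, n_in, n_hidden=8, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(0, 0.5, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0, 0.5, n_hidden)
        self.b2 = 0.0

    @staticmethod
    def _sig(x):
        return 1.0 / (1.0 + np.exp(-x))

    def forward(self, X):
        self.h = self._sig(X @ self.W1 + self.b1)  # hidden activations
        return self._sig(self.h @ self.W2 + self.b2)  # single output in (0, 1)

    def train(self, X, t, lr=0.5, epochs=3000):
        """Full-batch backprop on squared error; t is 1 for this
        subnetwork's class, 0 for every other class."""
        for _ in range(epochs):
            y = self.forward(X)
            d_out = (y - t) * y * (1 - y)                      # output-layer delta
            d_hid = np.outer(d_out, self.W2) * self.h * (1 - self.h)  # hidden deltas
            self.W2 -= lr * self.h.T @ d_out
            self.b2 -= lr * d_out.sum()
            self.W1 -= lr * X.T @ d_hid
            self.b1 -= lr * d_hid.sum(axis=0)

class OneNetOneClass:
    """Ensemble of independently trained one-output subnetworks."""
    def __init__(self, n_in):
        self.n_in = n_in
        self.subnets = {}

    def add_class(self, label, X, y, seed=0):
        # Adding a class trains only its own subnetwork; the existing
        # subnetworks are untouched, so no global retraining is needed.
        net = OneOutputSubnet(self.n_in, seed=seed)
        net.train(X, (y == label).astype(float))
        self.subnets[label] = net

    def predict(self, X):
        # Retrieval: query every subnetwork (conceptually in parallel)
        # and return the label whose subnetwork responds most strongly.
        labels = list(self.subnets)
        scores = np.stack([self.subnets[l].forward(X) for l in labels])
        return [labels[i] for i in scores.argmax(axis=0)]
```

A usage sketch on toy 2-D clusters: train subnetworks for classes 0 and 1, then add class 2 later without touching the first two.

```python
rng = np.random.default_rng(1)
centers = [(0.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
X = np.vstack([rng.normal(c, 0.05, (10, 2)) for c in centers])
y = np.repeat([0, 1, 2], 10)

model = OneNetOneClass(n_in=2)
model.add_class(0, X, y, seed=0)
model.add_class(1, X, y, seed=1)
model.add_class(2, X, y, seed=2)  # incremental: only this subnet is trained
preds = model.predict(X)
```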
Keywords :
learning systems; neural nets; parallel architectures; parallel processing; convergence; multiple one-output back-propagation neural networks; one-net-one-class architecture; parallel distributed processing; retrieving phase; subnetworks; training phase; Artificial neural networks; Computer architecture; Convergence; Distributed processing; Laboratories; Neural networks; Neurons; Pattern classification; Pattern recognition;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Circuits and Systems, 1991, IEEE International Symposium on
Print_ISBN :
0-7803-0050-5
Type :
conf
DOI :
10.1109/ISCAS.1991.176636
Filename :
176636