DocumentCode :
3283260
Title :
On computational limitations of neural network architectures
Author :
Hoffmann, Achim G.
Author_Institution :
Dept. of Comput. Sci., Tech. Univ. Berlin, Germany
fYear :
1990
fDate :
9-13 Dec 1990
Firstpage :
818
Lastpage :
825
Abstract :
Recently, there have been many attempts to develop neural network architectures that are capable of intelligent behavior. Thorough analyses of two-layer nets have been provided in the past. However, analyses of the computational abilities of multi-layer nets have so far suffered from the inherent complexity of their interactions. The paper introduces a powerful method, based on algorithmic information theory, for analyzing the computational abilities of neural nets. The method shows that the idea of many interacting computing units, as present in neural networks, does not essentially facilitate the task of constructing intelligent systems. The same holds for building powerful learning systems.
Keywords :
computational complexity; information theory; learning systems; neural nets; algorithmic information theory; intelligent behavior; intelligent systems; learning systems; multi-layer nets; neural network architectures; two-layer nets; Algorithm design and analysis; Buildings; Computational intelligence; Computer architecture; Computer networks; Information analysis; Information theory; Intelligent networks; Intelligent systems; Neural networks;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Proceedings of the Second IEEE Symposium on Parallel and Distributed Processing, 1990
Conference_Location :
Dallas, TX
Print_ISBN :
0-8186-2087-0
Type :
conf
DOI :
10.1109/SPDP.1990.143652
Filename :
143652