Title :
An information theoretic methodology for prestructuring neural networks
Author :
Chambless, Bjorn ; Lendaris, George G. ; Zwick, Martin
Author_Institution :
NW Comput. Intelligence Lab., Portland State Univ., OR, USA
Abstract :
Absence of a priori knowledge about a problem domain typically forces the use of overly complex neural network structures. An information-theoretic method based on calculating information transmission is applied to training data to obtain a priori knowledge useful for prestructuring (reducing the complexity of) neural networks. The method is applied to a continuous system, and it is shown that such prestructuring reduces training time and enhances generalization capability.
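As an illustrative sketch only (not the authors' implementation): in reconstructability-analysis terms, the information transmission among a set of variables is the sum of their marginal entropies minus their joint entropy, reducing to mutual information for two variables. The Python sketch below, with hypothetical names (entropy, transmission) and an assumed equal-width discretization of the continuous training data, shows one way such a quantity could be estimated to compare candidate input groupings.

import numpy as np

def entropy(counts):
    # Shannon entropy (bits) of a frequency table.
    p = counts / counts.sum()
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

def transmission(*columns, bins=8):
    # Transmission T = sum of marginal entropies minus joint entropy.
    # Each column is a 1-D array of samples; continuous values are
    # discretized into equal-width bins (an assumption of this sketch).
    coded = [np.digitize(c, np.histogram_bin_edges(c, bins=bins)[1:-1])
             for c in columns]
    marginal = sum(entropy(np.bincount(c)) for c in coded)
    # Joint entropy from counts over tuples of bin indices.
    joint_counts = np.unique(np.stack(coded, axis=1), axis=0,
                             return_counts=True)[1]
    return marginal - entropy(joint_counts)

Under these assumptions, input variables (or subsets) whose transmission with the target is low could be decoupled in the network structure, which is the kind of a priori knowledge the abstract refers to.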
Keywords :
feedforward neural nets; information theory; learning (artificial intelligence); complexity reduction; continuous system; generalization capability; information theoretic methodology; information transmission; neural networks; prestructuring; training data; training time; Artificial neural networks; Computational intelligence; Continuous time systems; Data analysis; Equations; Function approximation; Information analysis; Laboratories; Neural networks; Performance analysis;
Conference_Title :
Proceedings of the International Joint Conference on Neural Networks (IJCNN '01), 2001
Conference_Location :
Washington, DC
Print_ISBN :
0-7803-7044-9
DOI :
10.1109/IJCNN.2001.939047