Title :
Training MLPs layer-by-layer with the information potential
Author :
Xu, Dongxin ; Principe, Jose C.
Author_Institution :
Computational NeuroEngineering Laboratory, University of Florida, Gainesville, FL, USA
Abstract :
In the area of information processing, one fundamental issue is how to measure the relationship between two variables based only on their samples. In a previous paper, the idea of the information potential, formulated from the so-called quadratic mutual information, was introduced and successfully applied to problems such as blind source separation and pose estimation of SAR (synthetic aperture radar) images. This paper shows how the information potential can be used to train an MLP (multilayer perceptron) layer-by-layer, which provides evidence that the hidden layer of an MLP serves as an "information filter" that tries to best represent the desired output at that layer in the statistical sense of mutual information.
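The paper itself does not include code; as a rough illustration of the quantities named in the abstract, the sketch below estimates the quadratic information potential and a Cauchy-Schwarz form of quadratic mutual information from samples using Gaussian Parzen kernels, which is the general information-theoretic learning formulation rather than the authors' exact procedure. The function names, the kernel width sigma, and the NumPy implementation are assumptions for illustration only.

```python
import numpy as np

def gaussian_kernel_matrix(a, b, sigma=1.0):
    """Pairwise Gaussian kernel values between sample sets a (N, d) and b (M, d)."""
    sq_dist = np.sum((a[:, None, :] - b[None, :, :]) ** 2, axis=-1)
    return np.exp(-sq_dist / (2.0 * sigma ** 2))

def information_potential(x, sigma=1.0):
    """Parzen-window estimate of the quadratic information potential V(X) = E[p(X)]."""
    return gaussian_kernel_matrix(x, x, sigma).mean()

def cauchy_schwarz_qmi(x, y, sigma=1.0):
    """Sample estimate of the Cauchy-Schwarz quadratic mutual information between X and Y."""
    kx = gaussian_kernel_matrix(x, x, sigma)
    ky = gaussian_kernel_matrix(y, y, sigma)
    v_joint = (kx * ky).mean()                              # joint information potential
    v_marginal = kx.mean() * ky.mean()                      # product of marginal potentials
    v_cross = (kx.mean(axis=1) * ky.mean(axis=1)).mean()    # cross-information potential
    return np.log(v_joint) + np.log(v_marginal) - 2.0 * np.log(v_cross)

# Hypothetical usage: hidden-layer outputs h and desired targets d, each with one sample per row.
h = np.random.randn(100, 5)
d = np.random.randn(100, 2)
print(information_potential(h), cauchy_schwarz_qmi(h, d))
```

In a layer-by-layer scheme of the kind the abstract describes, a criterion such as this mutual information estimate between a layer's outputs and the desired signal could be maximized with respect to that layer's weights before moving to the next layer; the specific optimization details are those of the paper, not of this sketch.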
Keywords :
entropy; learning (artificial intelligence); multilayer perceptrons; signal sampling; MLP; hidden layer; information filter; information potential; information processing; layer-by-layer training; multilayer perceptron; quadratic mutual information; Area measurement; Blind source separation; Electric variables measurement; Gain measurement; Information entropy; Information processing; Laboratories; Mutual information; Neural engineering; Synthetic aperture radar;
Conference_Titel :
Proceedings of the 1999 IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP 1999)
Conference_Location :
Phoenix, AZ
Print_ISBN :
0-7803-5041-3
DOI :
10.1109/ICASSP.1999.759922