DocumentCode :
1842587
Title :
Training MLPs layer-by-layer with the information potential
Author :
Xu, Dongxin ; Principe, Jose C.
Author_Institution :
Dept. of Electr. & Comput. Eng., Florida Univ., Gainesville, FL, USA
Volume :
3
fYear :
1999
fDate :
1999
Firstpage :
1716
Abstract :
In the area of information processing, one fundamental issue is how to measure the statistical relationship between two variables based only on their samples. The authors previously (1998) presented the idea of the information potential, which was formulated from the quadratic mutual information, and successfully applied it to problems such as blind source separation and pose estimation of SAR images. This paper shows how the information potential can be used to train an MLP (multilayer perceptron) layer by layer, which provides evidence that the hidden layer of an MLP serves as an “information filter” that tries to best represent the desired output in that layer in the statistical sense of mutual information.
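The abstract describes a sample-based criterion rather than a closed-form one, so the following is a minimal sketch of how such an "information potential" and a quadratic-mutual-information score could be estimated from samples. It assumes Gaussian Parzen-window density estimates and the Cauchy-Schwarz form of the quadratic mutual information, which are common in information-theoretic learning but are not spelled out in this record; the function names and the kernel width sigma are illustrative, not the authors' exact formulation.

```python
import numpy as np

def pairwise_gaussian(a, b, sigma):
    """Gaussian kernel G(a_i - b_j; 2*sigma^2) for all sample pairs.
    a has shape (n, d), b has shape (m, d); returns an (n, m) matrix."""
    d = a.shape[1]
    sq = np.sum((a[:, None, :] - b[None, :, :]) ** 2, axis=-1)
    two_var = 2.0 * sigma ** 2
    return (2.0 * np.pi * two_var) ** (-d / 2.0) * np.exp(-sq / (2.0 * two_var))

def information_potential(x, sigma=1.0):
    """V(X) = (1/N^2) * sum_{i,j} G(x_i - x_j; 2*sigma^2).
    The quadratic Renyi entropy estimate is then H2(X) = -log V(X)."""
    return pairwise_gaussian(x, x, sigma).mean()

def cs_quadratic_mutual_information(x, y, sigma=1.0):
    """Cauchy-Schwarz quadratic mutual information estimated from joint
    samples (x_i, y_i): I_CS = log(V_J * V_M / V_C^2), built from the
    joint, marginal, and cross information potentials."""
    gx = pairwise_gaussian(x, x, sigma)            # (N, N) kernel matrix on X
    gy = pairwise_gaussian(y, y, sigma)            # (N, N) kernel matrix on Y
    v_joint = (gx * gy).mean()                     # joint information potential
    v_marg = gx.mean() * gy.mean()                 # product of marginal potentials
    v_cross = (gx.mean(axis=1) * gy.mean(axis=1)).mean()  # cross potential
    return np.log(v_joint * v_marg / v_cross ** 2)
```

Under these assumptions, layer-by-layer training would amount to adjusting one layer's weights to maximize the estimated quadratic mutual information between that layer's outputs and the desired outputs, for example by gradient ascent on this criterion.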
Keywords :
filtering theory; learning (artificial intelligence); multilayer perceptrons; signal processing; statistical analysis; MLP layer-by-layer training; SAR images; blind source separation; hidden layer; information filter; information potential; information processing; multilayer perceptron; mutual information; pose estimation; quadratic mutual information; statistical relationship; Area measurement; Blind source separation; Electric variables measurement; Gain measurement; Information entropy; Information processing; Laboratories; Mutual information; Neural engineering; Synthetic aperture radar;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Neural Networks, 1999. IJCNN '99. International Joint Conference on
Conference_Location :
Washington, DC
ISSN :
1098-7576
Print_ISBN :
0-7803-5529-6
Type :
conf
DOI :
10.1109/IJCNN.1999.832634
Filename :
832634