DocumentCode :
814318
Title :
Generalized information potential criterion for adaptive system training
Author :
Erdogmus, Deniz ; Principe, Jose C.
Author_Institution :
Computational Neuroengineering Lab., Florida Univ., Gainesville, FL, USA
Volume :
13
Issue :
5
fYear :
2002
fDate :
9/1/2002 12:00:00 AM
Firstpage :
1035
Lastpage :
1044
Abstract :
We have previously proposed the quadratic Rényi's error entropy as an alternative cost function for supervised adaptive system training. An entropy criterion prescribes minimizing the average information content of the error signal rather than merely minimizing its energy. In this paper, we propose a generalization of the error entropy criterion that enables the use of any order of Rényi's entropy and any suitable kernel function in density estimation. It is shown that the proposed entropy estimator preserves the global minimum of the actual entropy. The equivalence between global optimization by convolution smoothing and the convolution by the kernel in Parzen windowing is also discussed. Simulation results are presented for time-series prediction and classification, experimentally demonstrating all the theoretical concepts.
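The abstract's generalized criterion can be illustrated with a short sketch: a Gaussian Parzen window estimates the error density at each sample, and the order-α information potential yields the Rényi entropy estimate. This is a minimal illustration under assumed parameter names (`alpha`, `sigma`), not the paper's exact implementation.

```python
import numpy as np

def renyi_entropy_estimate(errors, alpha=2.0, sigma=0.5):
    """Nonparametric order-alpha Renyi entropy estimate of error samples,
    via a Gaussian Parzen window (illustrative sketch; kernel width sigma
    and entropy order alpha are free parameters)."""
    e = np.asarray(errors, dtype=float)
    # Pairwise differences e_i - e_j for evaluating the Parzen density
    # estimate at each sample point.
    diffs = e[:, None] - e[None, :]
    kernel = np.exp(-diffs**2 / (2.0 * sigma**2)) / (np.sqrt(2.0 * np.pi) * sigma)
    # Parzen density at each sample: p(e_i) = (1/N) sum_j k(e_i - e_j)
    p_at_samples = kernel.mean(axis=1)
    # Order-alpha information potential: V_alpha = (1/N) sum_i p(e_i)^(alpha - 1)
    v_alpha = np.mean(p_at_samples ** (alpha - 1.0))
    # Renyi entropy: H_alpha = log(V_alpha) / (1 - alpha)
    return np.log(v_alpha) / (1.0 - alpha)
```

Minimizing this quantity over the adaptive system's weights concentrates the error distribution, which is the training principle the abstract describes; for alpha = 2 it reduces to the quadratic case.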
Keywords :
adaptive systems; entropy; learning (artificial intelligence); probability; smoothing methods; Parzen windowing; adaptive system training; classification; convolution smoothing; cost function; density estimation; entropy criterion; generalized information potential criterion; global minimum; global optimization; kernel function; quadratic Renyi error entropy; time-series prediction; Adaptive systems; Chaos; Convolution; Cost function; Entropy; Feature extraction; Kernel; Mutual information; Signal processing; Source separation;
fLanguage :
English
Journal_Title :
IEEE Transactions on Neural Networks
Publisher :
IEEE
ISSN :
1045-9227
Type :
jour
DOI :
10.1109/TNN.2002.1031936
Filename :
1031936