Title :
An error-entropy minimization algorithm for supervised training of nonlinear adaptive systems
Author :
Erdogmus, Deniz ; Principe, Jose C.
Author_Institution :
Computational NeuroEngineering Lab., Florida Univ., Gainesville, FL, USA
Date :
7/1/2002
Abstract :
The paper investigates error-entropy minimization in adaptive systems training. We prove the equivalence between minimization of the error's Renyi (1970) entropy of order α and minimization of a Csiszar (1981) distance measure between the densities of the desired and system outputs. A nonparametric estimator for Renyi's entropy is presented, and it is shown that the global minimum of this estimator is the same as the actual entropy. The performance of the error-entropy-minimization criterion is compared with mean-square-error minimization in the short-term prediction of a chaotic time series and in nonlinear system identification.
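As a rough illustration of the kind of nonparametric entropy estimator the abstract refers to, the sketch below computes a Parzen-window estimate of Renyi's quadratic (order α = 2) entropy of an error sample using Gaussian kernels, i.e. the negative log of the pairwise "information potential". This is only a minimal sketch under assumed settings; the function name, kernel width sigma, and the toy desired/output signals are illustrative and not taken from the paper.

```python
import numpy as np

def renyi_quadratic_entropy(errors, sigma=0.1):
    """Parzen-window (Gaussian-kernel) estimate of Renyi's order-2 entropy:
    H2 = -log( (1/N^2) * sum_i sum_j G(e_i - e_j) ),
    where G is a Gaussian; the double sum is the 'information potential'."""
    e = np.asarray(errors, dtype=float).ravel()
    n = e.size
    diffs = e[:, None] - e[None, :]  # all pairwise error differences
    # Convolving two Gaussian kernels of width sigma gives a Gaussian of
    # variance 2*sigma^2 evaluated at the pairwise differences.
    kernel = np.exp(-diffs**2 / (4.0 * sigma**2)) / np.sqrt(4.0 * np.pi * sigma**2)
    information_potential = kernel.sum() / n**2
    return -np.log(information_potential)

# Toy usage: entropy of the error between a desired signal and a noisy output.
rng = np.random.default_rng(0)
desired = np.sin(np.linspace(0.0, 2.0 * np.pi, 200))
output = desired + 0.1 * rng.standard_normal(200)
print(renyi_quadratic_entropy(desired - output, sigma=0.1))
```

Since the estimate is smooth in the error samples, its gradient with respect to adaptive-system weights can be back-propagated, which is how an entropy criterion of this form can replace MSE in supervised training.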
Keywords :
adaptive systems; backpropagation; chaos; identification; minimisation; minimum entropy methods; multilayer perceptrons; nonlinear systems; prediction theory; probability; time series; Csiszar distance measure minimization; MLP; MSE minimization; PDF matching; TDNN; adaptive systems training; backpropagation algorithm; chaotic time series; error Renyi entropy minimization; error-entropy minimization algorithm; error-entropy-minimization criterion; global minimum; mean-square-error minimization; multilayer perceptrons; nonlinear adaptive systems; nonlinear system identification; nonparametric estimator; probability density matching; short-term prediction; supervised training; time delay neural networks; Adaptive systems; Entropy; Gaussian processes; Minimization methods; Nonlinear dynamical systems; Nonlinear systems; Probability density function; Probability distribution; Statistics; Supervised learning;
Journal_Title :
Signal Processing, IEEE Transactions on
DOI :
10.1109/TSP.2002.1011217