Title :
Fast subspace tracking and neural network learning by a novel information criterion
Author :
Miao, Yongfeng ; Hua, Yingbo
Author_Institution :
Dept. of Electr. & Electron. Eng., Melbourne Univ., Parkville, Vic., Australia
Date :
7/1/1998 12:00:00 AM
Abstract :
We introduce a novel information criterion (NIC) for searching for the optimum weights of a two-layer linear neural network (NN). The NIC exhibits a single global maximum, attained if and only if the weights span the (desired) principal subspace of a covariance matrix; all other stationary points of the NIC are (unstable) saddle points. We develop an adaptive algorithm based on the NIC for estimating and tracking the principal subspace of a vector sequence. The NIC algorithm enables fast online learning of the optimum weights for the two-layer linear NN. We establish the connections between the NIC algorithm and the conventional mean-square-error (MSE) based algorithms such as Oja's algorithm (Oja 1989), LMSER, PAST, APEX, and GHA. The NIC algorithm has several key advantages, such as faster convergence, which is illustrated through analysis and simulation.
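The subspace-tracking update described in the abstract can be sketched as gradient ascent on the NIC, whose stationary-point condition gives the step W ← W + η (C W (WᵀC W)⁻¹ − W). The following is a minimal NumPy sketch under assumed parameters (dimension n = 8, subspace rank r = 3, step size η = 0.5, and a synthetic covariance with an explicit eigengap); it is an illustration of the update form, not the paper's exact online implementation, which replaces C with a recursively estimated sample covariance.

```python
import numpy as np

rng = np.random.default_rng(0)

n, r, eta = 8, 3, 0.5  # assumed dimension, subspace rank, step size

# Hypothetical covariance with a clear eigengap (7.0 vs 1.0) so the
# r-dimensional principal subspace is well defined.
eigvals = np.array([9.0, 8.0, 7.0, 1.0, 0.8, 0.6, 0.4, 0.2])
Q_rand, _ = np.linalg.qr(rng.standard_normal((n, n)))
C = Q_rand @ np.diag(eigvals) @ Q_rand.T

# Random initial weights of the two-layer linear NN.
W = 0.1 * rng.standard_normal((n, r))

for _ in range(500):
    # NIC gradient-ascent step: W <- W + eta * (C W (W^T C W)^{-1} - W).
    # At convergence W is an orthonormal basis of the principal subspace.
    W = W + eta * (C @ W @ np.linalg.inv(W.T @ C @ W) - W)

# Compare the learned subspace with the true principal subspace via
# orthogonal projectors (invariant to the choice of basis within W).
U = Q_rand[:, :r]                 # true principal eigenvectors
Qw, _ = np.linalg.qr(W)           # orthonormal basis of span(W)
err = np.linalg.norm(U @ U.T - Qw @ Qw.T)
print(f"projection error: {err:.2e}")
```

With η = 1 this update reduces to a PAST-like step, one of the connections the abstract mentions; smaller η trades speed for stability.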
Keywords :
adaptive estimation; adaptive signal processing; convergence of numerical methods; covariance matrices; information theory; learning (artificial intelligence); multilayer perceptrons; search problems; sequences; tracking; APEX; GHA; LMSER; NIC algorithm; Oja's algorithm; PAST; adaptive algorithm; convergence; covariance matrix; estimation; fast subspace tracking; global maximum; information criterion; mean-square-error; neural network learning; on-line learning; optimum weights; principal subspace; stationary points; two-layer linear neural network; unstable saddle points; vector sequence; Adaptive algorithm; Algorithm design and analysis; Analytical models; Approximation algorithms; Convergence; Covariance matrix; Neural networks; Principal component analysis; Signal processing algorithms; Vectors;
Journal_Title :
Signal Processing, IEEE Transactions on