Title :
Differential learning and random walk model
Abstract :
This paper presents a learning algorithm for differential decorrelation, the goal of which is to find a linear transform that minimizes the concurrent change of the associated output nodes. First, the algorithm is derived by minimizing an objective function that measures differential correlation. We then show that the differential decorrelation learning algorithm can also be derived in the framework of maximum likelihood estimation of a linear generative model, assuming a random walk model for the latent variables. The algorithm derivation and a local stability analysis are given, together with a simple numerical example.
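As a rough illustration of the idea summarized above, the sketch below applies a natural-gradient-style decorrelation update to the temporal differences of the outputs of a linear transform. It is a minimal toy, not the paper's exact algorithm: the function name `differential_decorrelation`, the step size `eta`, the epoch count, and the random-walk toy data are all illustrative assumptions.

```python
import numpy as np

def differential_decorrelation(X, eta=0.01, n_epochs=50, seed=0):
    """Minimal sketch of a differential decorrelation learning rule.

    X : array of shape (n_samples, n_dims), observed time series x(t).
    Learns a square linear transform W so that the temporal differences
    dy(t) = y(t) - y(t-1) of the outputs y(t) = W x(t) become mutually
    decorrelated (their covariance is pushed toward the identity).

    NOTE: illustrative natural-gradient-style update, not the exact
    algorithm of the paper; `eta` and `n_epochs` are arbitrary choices.
    """
    rng = np.random.default_rng(seed)
    n_samples, n_dims = X.shape
    W = np.eye(n_dims) + 0.01 * rng.standard_normal((n_dims, n_dims))

    for _ in range(n_epochs):
        Y = X @ W.T                      # outputs y(t) = W x(t)
        dY = np.diff(Y, axis=0)          # temporal differences dy(t)
        for dy in dY:
            dy = dy[:, None]
            # Push E[dy dy^T] toward I (decorrelated output changes).
            W += eta * (np.eye(n_dims) - dy @ dy.T) @ W
    return W

if __name__ == "__main__":
    # Toy data: two latent random walks mixed by a fixed matrix.
    rng = np.random.default_rng(1)
    s = np.cumsum(rng.standard_normal((2000, 2)) * [1.0, 0.3], axis=0)
    A = np.array([[1.0, 0.6], [0.4, 1.0]])
    X = s @ A.T
    W = differential_decorrelation(X)
    dY = np.diff(X @ W.T, axis=0)
    print("covariance of output differences:\n", np.cov(dY.T))
```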
Keywords :
Hebbian learning; decorrelation; independent component analysis; maximum likelihood estimation; minimisation; random processes; recurrent neural nets; stability; Hebbian rule; adaptive differential decorrelation; decorrelation learning algorithm; differential ICA algorithm; differential decorrelation; differential learning; latent variables; linear feedforward network; linear generative model; linear transform; local stability analysis; natural gradient algorithm; objective function minimization; output nodes; random walk model; Computer science; Minimization methods; Neurofeedback; Neurons; Output feedback; Stability analysis; Vectors
Conference_Titel :
Proceedings of the 2003 IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP '03)
Print_ISBN :
0-7803-7663-3
DOI :
10.1109/ICASSP.2003.1202468