DocumentCode :
1686662
Title :
Optimization transfer for computational learning: a hierarchy from f-ICA and alpha-EM to their offsprings
Author :
Matsuyama, Yasuo ; Imahara, Shuichiro ; Katsumata, Naoto
Author_Institution :
Dept. of Electr., Electron. & Comput. Eng., Waseda Univ., Tokyo, Japan
Volume :
2
fYear :
2002
Firstpage :
1883
Lastpage :
1888
Abstract :
Likelihood optimization methods for learning algorithms are generalized and faster algorithms are provided. The idea is to transfer the optimization to a general class of convex divergences between two probability density functions. The first part explains why such optimization transfer is significant. The second part contains the derivation of the generalized independent component analysis (ICA). Experiments on brain fMRI maps are reported. The third part discusses this optimization transfer in the generalized expectation-maximization (EM) algorithm. Hierarchical descendants of this algorithm, such as vector quantization and self-organization, are also explained.
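For context, the baseline that the paper's alpha-EM generalizes is the standard log-likelihood EM algorithm; in Matsuyama's formulation, choosing alpha = -1 in the alpha-divergence surrogate recovers it. The following is a minimal illustrative sketch of that baseline (a two-component 1-D Gaussian mixture), not code from the paper; all function and variable names are invented for illustration:

```python
import math
import random

def em_gmm_1d(data, iters=50):
    """Standard log-likelihood EM for a two-component 1-D Gaussian
    mixture. alpha-EM replaces the logarithm in the E-step surrogate
    with a convex alpha-divergence; alpha = -1 recovers this update."""
    mu = [min(data), max(data)]   # crude but well-separated init
    var = [1.0, 1.0]
    pi = [0.5, 0.5]
    for _ in range(iters):
        # E-step: posterior responsibilities of each component
        resp = []
        for x in data:
            w = [pi[k] / math.sqrt(2 * math.pi * var[k])
                 * math.exp(-(x - mu[k]) ** 2 / (2 * var[k]))
                 for k in range(2)]
            s = sum(w)
            resp.append([wk / s for wk in w])
        # M-step: weighted maximum-likelihood parameter updates
        for k in range(2):
            nk = sum(r[k] for r in resp)
            pi[k] = nk / len(data)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var[k] = sum(r[k] * (x - mu[k]) ** 2
                         for r, x in zip(resp, data)) / nk + 1e-9
    return pi, mu, var

random.seed(0)
data = ([random.gauss(-2.0, 0.5) for _ in range(200)]
        + [random.gauss(3.0, 0.7) for _ in range(200)])
pi, mu, var = em_gmm_1d(data)
```

The paper's speed-up comes from transferring this optimization to a wider class of convex divergences, which admits faster-converging updates of the same hierarchical form.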
Keywords :
biomedical MRI; brain; learning (artificial intelligence); optimisation; principal component analysis; probability; vector quantisation; EM algorithm; brain fMRI maps; convex divergences; independent component analysis; learning algorithms; likelihood optimization; probability density functions; self organization; vector quantization; Algorithm design and analysis; Convergence; Cost function; Independent component analysis; Learning systems; Optimization methods; Scattering; Speech; Stochastic processes; Vector quantization;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Proceedings of the 2002 International Joint Conference on Neural Networks (IJCNN '02)
Conference_Location :
Honolulu, HI
ISSN :
1098-7576
Print_ISBN :
0-7803-7278-6
Type :
conf
DOI :
10.1109/IJCNN.2002.1007806
Filename :
1007806