Title :
Classification using localized mixture of experts
Author_Institution :
IDIAP, Martigny, Switzerland
Abstract :
A mixture of experts consists of a gating network that learns to partition the input space and of expert networks assigned to the resulting regions. This paper focuses on the choice of the gating network. First, a localized gating network based on a mixture of linear latent variable models is proposed, extending the gating network introduced by Xu et al. (1995), which is based on Gaussian mixture models. It is shown that this localized mixture of experts model can be trained with the expectation-maximization (EM) algorithm. The localized model is then compared, on a set of classification problems, with mixtures of experts whose gating network is a single-layer or multilayer perceptron. It is found that the standard mixture of experts with a feedforward network as gate often outperforms the other models.
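To make the gating construction concrete, the following is a minimal sketch of a localized gate in the Gaussian-mixture style of Xu et al. (1995), not the paper's own code or its linear latent variable extension. All function names, the toy experts, and the parameter values are illustrative assumptions.

```python
import numpy as np

def gaussian_pdf(x, mean, cov):
    """Multivariate normal density N(x; mean, cov)."""
    d = mean.shape[0]
    diff = x - mean
    norm = np.sqrt(((2 * np.pi) ** d) * np.linalg.det(cov))
    return np.exp(-0.5 * diff @ np.linalg.inv(cov) @ diff) / norm

def gate_posteriors(x, priors, means, covs):
    """Localized gate: g_j(x) = a_j N(x; mu_j, S_j) / sum_k a_k N(x; mu_k, S_k)."""
    dens = np.array([a * gaussian_pdf(x, m, c)
                     for a, m, c in zip(priors, means, covs)])
    return dens / dens.sum()

def moe_predict(x, priors, means, covs, experts):
    """Mixture output: y(x) = sum_j g_j(x) f_j(x), where f_j are the experts."""
    g = gate_posteriors(x, priors, means, covs)
    return sum(gj * f(x) for gj, f in zip(g, experts))

# Toy usage: two hypothetical linear experts on a 2-D input.
priors = np.array([0.5, 0.5])
means = [np.array([-1.0, 0.0]), np.array([1.0, 0.0])]
covs = [np.eye(2), np.eye(2)]
experts = [lambda x: x @ np.array([1.0, 0.0]),   # expert for the left region
           lambda x: x @ np.array([0.0, 1.0])]   # expert for the right region
print(moe_predict(np.array([0.3, -0.2]), priors, means, covs, experts))
```

Because the gate is a normalized mixture of input densities, its parameters (priors, means, covariances) admit closed-form EM updates, which is what allows the whole localized model to be trained with EM rather than gradient descent on a perceptron gate.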
Keywords :
feedforward neural nets; EM algorithm; Gaussian mixture models; expectation maximization algorithm; expert networks; feedforward neural networks; gating network; learning; mixture of experts; pattern classification;
Conference_Titel :
Ninth International Conference on Artificial Neural Networks (ICANN 99), 1999 (Conf. Publ. No. 470)
Conference_Location :
Edinburgh, UK
Print_ISBN :
0-85296-721-7
DOI :
10.1049/cp:19991216