Title :
A modified mixtures of experts architecture for classification with diverse features
Author :
Chen, Ke ; Chi, Huisheng
Author_Institution :
Nat. Lab. of Machine Perception, Peking Univ., Beijing, China
Abstract :
A modular neural architecture, MME, is proposed as an alternative to the standard mixtures of experts architecture for classification with diverse features. Unlike the standard mixtures of experts architecture, the proposed architecture introduces a gate-bank consisting of multiple gating networks; the gating networks in the gate-bank receive different input vectors, and the expert networks may also receive different input vectors. As a result, a classification task with diverse features can be learned by the architecture through the simultaneous use of the different features. Learning in the proposed architecture is treated as a maximum likelihood problem, and an EM algorithm is presented for adjusting the parameters of the architecture. Comparative simulation results are presented for a real-world problem, text-dependent speaker identification.
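The forward pass described in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation: it assumes linear experts and gating networks with softmax outputs, and it combines the gate-bank's gating networks by simple averaging, which may differ from the paper's actual combination rule.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax.
    e = np.exp(z - z.max())
    return e / e.sum()

def mme_output(expert_inputs, gate_inputs, expert_params, gate_params):
    """Hypothetical forward pass for a modified mixtures-of-experts model.

    Each expert i sees its own feature vector expert_inputs[i]; each
    gating network j in the gate-bank sees its own feature vector
    gate_inputs[j]. Linear models + softmax and averaging over the
    gate-bank are illustrative assumptions, not the paper's method.
    """
    # Expert outputs: class posteriors, one row per expert.
    expert_outs = np.array(
        [softmax(W @ x) for W, x in zip(expert_params, expert_inputs)]
    )
    # Gate-bank: each gating network proposes mixing coefficients
    # over the experts, from its own feature vector.
    gate_outs = np.array(
        [softmax(V @ g) for V, g in zip(gate_params, gate_inputs)]
    )
    # Combine the gate-bank into one set of mixing coefficients
    # (simple average across gating networks, as an assumption).
    mix = gate_outs.mean(axis=0)
    # Mixture output: gate-weighted sum of expert posteriors.
    return mix @ expert_outs
```

Because each expert and each gating network receives its own input vector, features of different dimensionality (e.g. distinct feature sets extracted from the same utterance) can contribute to one classification decision simultaneously, which is the point of the gate-bank design.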
Keywords :
maximum likelihood estimation; neural net architecture; pattern classification; EM algorithm; MME; classification; diverse features; gate-bank; maximum likelihood problem; modified mixtures of experts architecture; modular neural architecture; multiple gating networks; parameter adjustment; text-dependent speaker identification; Application software; Cognitive science; Data mining; Feature extraction; Information science; Pattern classification; Pattern recognition; Training data;
Conference_Titel :
International Conference on Neural Networks, 1997
Print_ISBN :
0-7803-4122-8
DOI :
10.1109/ICNN.1997.611667