Title :
Advances in using hierarchical mixture of experts for signal classification
Author :
Ramamurti, Viswanath ; Ghosh, Joydeep
Author_Institution :
Dept. of Electr. & Comput. Eng., Texas Univ., Austin, TX, USA
Abstract :
The hierarchical mixture of experts (HME) architecture is a powerful tree-structured architecture for supervised learning. An efficient one-pass algorithm for solving the M-step of the EM iterations when training the HME network on classification tasks is first described; it substantially reduces training time compared with using the IRLS method to solve the M-step. Further, a pre-processing stage consisting of radial basis function kernels is proposed to reduce the height of the HME tree. Alternatively, a localized form of gating network can be employed to reduce the tree height. Shorter HME trees, with far fewer network parameters, are significantly faster to train. Simulation results are presented on a real-life data set.
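To illustrate the two gating styles the abstract contrasts, here is a minimal sketch of a standard softmax gating network and a localized (normalized Gaussian kernel) gate at a single HME node. All names and the parameter shapes are illustrative assumptions, not the paper's implementation; the localized form ties each gate output to a kernel centered in input space, which is what allows flatter trees.

```python
import numpy as np

def softmax_gate(x, V):
    # Standard HME gating: softmax over linear scores v_j^T x,
    # producing mixing weights that sum to 1 across experts.
    s = V @ x
    e = np.exp(s - s.max())          # shift for numerical stability
    return e / e.sum()

def localized_gate(x, centers, widths, priors):
    # Localized gating: normalized Gaussian kernels, one per expert.
    # Each expert's weight depends on the distance of x to its center,
    # so a single node can carve the input space into local regions.
    d2 = ((x - centers) ** 2).sum(axis=1)        # squared distances
    k = priors * np.exp(-d2 / (2.0 * widths ** 2))
    return k / k.sum()                            # normalize to a simplex
```

In a full HME, these gate outputs would weight the (recursively gated) expert predictions; EM training alternates between computing posterior responsibilities (E-step) and updating gate and expert parameters (M-step).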
Keywords :
feedforward neural nets; iterative methods; learning (artificial intelligence); multilayer perceptrons; neural net architecture; signal processing; EM iterations; IRLS method; classification; expectation maximization; hierarchical mixture of experts architecture; localized gating network; multilayer perceptrons; network parameters; one-pass algorithm; preprocessing stage; radial basis function kernels; real life data set; signal classification; simulation results; supervised learning; training time reduction; tree height reduction; tree structured architecture; Classification tree analysis; Computational efficiency; Computer architecture; Employment; Iterative algorithms; Jacobian matrices; Kernel; Least squares methods; Pattern classification; Tree data structures;
Conference_Titel :
1996 IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP-96), Conference Proceedings
Conference_Location :
Atlanta, GA
Print_ISBN :
0-7803-3192-3
DOI :
10.1109/ICASSP.1996.550800