Title :
A recurrent log-linearized Gaussian mixture network
Author :
Tsuji, Toshio ; Bu, Nan ; Fukuda, Osamu ; Kaneko, Makoto
Author_Institution :
Dept. of Artificial Complex Syst. Eng., Hiroshima Univ., Higashi-Hiroshima, Japan
Date :
3/1/2003 12:00:00 AM
Abstract :
Context in time series is one of the most useful and interesting characteristics for machine learning, and in some cases the dynamic characteristics of a signal are the only basis on which classification is possible. This paper proposes a novel neural network for time-series classification, named the recurrent log-linearized Gaussian mixture network (R-LLGMN). The structure of this network is based on the hidden Markov model (HMM), which has been well developed in the area of speech recognition. R-LLGMN can also be interpreted as an extension of a probabilistic neural network based on a log-linearized Gaussian mixture model, into which recurrent connections have been incorporated to make use of temporal information. Simulation experiments are carried out to compare R-LLGMN with a traditional HMM-based classifier, and pattern classification experiments on EEG signals are then conducted. These experiments indicate that R-LLGMN can successfully classify not only artificial data but also real biological data such as EEG signals.
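Since the abstract states that R-LLGMN's structure is based on the HMM, the temporal recursion it generalizes is the standard HMM forward algorithm. The sketch below is illustrative only, not the authors' network: a plain scaled forward recursion for a discrete-emission HMM, which is the kind of likelihood computation an HMM-based classifier would compare across classes (all variable names here are assumptions for illustration).

```python
import numpy as np

def hmm_forward_loglik(pi, A, B, obs):
    """Log-likelihood of an observation sequence under a discrete HMM,
    computed with the (scaled) forward algorithm.

    pi  : (K,)   initial state probabilities
    A   : (K, K) transition matrix, A[i, j] = P(next state j | state i)
    B   : (K, M) emission matrix,   B[k, o] = P(observation o | state k)
    obs : sequence of observation indices in 0..M-1
    """
    alpha = pi * B[:, obs[0]]            # alpha_1(k) = pi_k * b_k(o_1)
    loglik = 0.0
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]    # propagate through transitions, then emit
        s = alpha.sum()                  # rescale each step to avoid underflow
        loglik += np.log(s)
        alpha /= s
    loglik += np.log(alpha.sum())        # remaining (unnormalized) mass
    return loglik
```

For classification, one such model would be trained per class and a test sequence assigned to the class whose model gives the highest log-likelihood; R-LLGMN instead folds this recursion into a trainable recurrent network.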
Keywords :
hidden Markov models; learning (artificial intelligence); pattern classification; recurrent neural nets; speech recognition; time series; machine learning; neural networks; pattern classification; recurrent neural networks; time series; Backpropagation; Bayesian methods; Biological system modeling; Brain modeling; Electroencephalography; Hidden Markov models; Neural networks; Pattern classification; Recurrent neural networks; Space technology;
Journal_Title :
Neural Networks, IEEE Transactions on
DOI :
10.1109/TNN.2003.809403