DocumentCode :
3601382
Title :
A Recurrent Probabilistic Neural Network with Dimensionality Reduction Based on Time-series Discriminant Component Analysis
Author :
Hayashi, Hideaki ; Shibanoki, Taro ; Shima, Keisuke ; Kurita, Yuichi ; Tsuji, Toshio
Author_Institution :
Grad. Sch. of Eng., Hiroshima Univ., Higashi-Hiroshima, Japan
Volume :
26
Issue :
12
fYear :
2015
Firstpage :
3021
Lastpage :
3033
Abstract :
This paper proposes a probabilistic neural network (NN) developed on the basis of time-series discriminant component analysis (TSDCA) that can be used to classify high-dimensional time-series patterns. TSDCA involves the compression of high-dimensional time series into a lower dimensional space using a set of orthogonal transformations and the calculation of posterior probabilities based on a continuous-density hidden Markov model with a Gaussian mixture model expressed in the reduced-dimensional space. The analysis can be incorporated into an NN, named a time-series discriminant component network (TSDCN), so that the parameters of dimensionality reduction and classification are obtained simultaneously as network coefficients using a backpropagation-through-time-based learning algorithm with the Lagrange multiplier method. The TSDCN is considered to enable high-accuracy classification of high-dimensional time-series patterns and to reduce the computation time required for network training. The validity of the TSDCN is demonstrated in experiments on high-dimensional artificial data and electroencephalogram signals.
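The sketch below is a minimal illustration of the general structure described in the abstract: an orthogonal projection compresses each high-dimensional frame into a low-dimensional space, and per-class continuous-density HMM-GMM forward recursions in that space yield class posterior probabilities. It is not the authors' TSDCN; the backpropagation-through-time training with Lagrange multipliers is omitted, and all dimensions, parameter names, and random initializations are illustrative assumptions.

```python
# Minimal sketch (not the authors' implementation) of reduced-dimension
# HMM-GMM classification in the spirit of TSDCA. Shapes and values are assumed.
import numpy as np

rng = np.random.default_rng(0)

D, M = 50, 3          # original and reduced dimensionality (assumed)
C, K, G = 2, 4, 2     # classes, HMM states per class, Gaussians per state (assumed)

# Orthogonal projection W (M x D): rows orthonormal, here taken from a QR decomposition.
W = np.linalg.qr(rng.standard_normal((D, M)))[0].T        # shape (M, D)

def gmm_likelihood(z, means, covs, weights):
    """Likelihood of a reduced-dimensional vector z under a diagonal-covariance GMM."""
    p = 0.0
    for w, mu, var in zip(weights, means, covs):
        diff = z - mu
        p += w * np.exp(-0.5 * np.sum(diff**2 / var)) / np.sqrt(
            (2 * np.pi) ** len(z) * np.prod(var))
    return p

def class_posteriors(X, models, priors):
    """Posterior P(class | x_1..x_T) via the HMM forward recursion in reduced space."""
    Z = X @ W.T                                            # compress each frame: (T, M)
    log_like = np.zeros(len(models))
    for c, m in enumerate(models):
        alpha = m["pi"] * np.array([gmm_likelihood(Z[0], *m["gmm"][k]) for k in range(K)])
        for t in range(1, len(Z)):
            b = np.array([gmm_likelihood(Z[t], *m["gmm"][k]) for k in range(K)])
            alpha = (alpha @ m["A"]) * b                   # forward recursion step
        log_like[c] = np.log(alpha.sum() + 1e-300)
    log_post = np.log(priors) + log_like
    log_post -= log_post.max()                             # normalize in log domain
    post = np.exp(log_post)
    return post / post.sum()

# Toy usage with randomly initialized per-class HMM-GMM parameters (assumptions).
models = []
for _ in range(C):
    gmm = [(rng.standard_normal((G, M)), np.ones((G, M)), np.full(G, 1.0 / G))
           for _ in range(K)]
    models.append({"pi": np.full(K, 1.0 / K), "A": np.full((K, K), 1.0 / K), "gmm": gmm})

X = rng.standard_normal((30, D))                           # one high-dimensional time series
print(class_posteriors(X, models, priors=np.full(C, 1.0 / C)))
```

In the paper these quantities are instead realized as layers of a recurrent probabilistic NN, so that the projection and the HMM-GMM parameters are trained jointly as network coefficients rather than fixed as above.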
Keywords :
backpropagation; hidden Markov models; pattern classification; probability; recurrent neural nets; time series; Gaussian mixture model; Lagrange multiplier method; TSDCA; TSDCN; backpropagation-through-time-based learning algorithm; continuous-density hidden Markov model; dimensionality reduction; electroencephalogram signals; high-accuracy classification; high-dimensional artificial data; high-dimensional time-series pattern classification; lower dimensional space; network coefficients; orthogonal transformations; posterior probabilities; recurrent probabilistic neural network; reduced-dimensional space; time-series discriminant component analysis; time-series discriminant component network; Artificial neural networks; Data models; Hidden Markov models; Probabilistic logic; Probability; Vectors; Dimensionality reduction; Gaussian mixture model (GMM); hidden Markov model (HMM); neural network (NN); pattern classification
fLanguage :
English
Journal_Title :
IEEE Transactions on Neural Networks and Learning Systems
Publisher :
IEEE
ISSN :
2162-237X
Type :
jour
DOI :
10.1109/TNNLS.2015.2400448
Filename :
7045517