DocumentCode :
3422488
Title :
Linear Sequence Discriminant Analysis: A Model-Based Dimensionality Reduction Method for Vector Sequences
Author :
Bing Su ; Xiaoqing Ding
Author_Institution :
Dept. of Electron. Eng., Tsinghua Univ., Beijing, China
fYear :
2013
fDate :
1-8 Dec. 2013
Firstpage :
889
Lastpage :
896
Abstract :
Dimensionality reduction for vectors in sequences is challenging since labels are attached to sequences as a whole. This paper presents a model-based dimensionality reduction method for vector sequences, namely linear sequence discriminant analysis (LSDA), which attempts to find a subspace in which sequences of the same class are projected together while those of different classes are projected as far apart as possible. For each sequence class, an HMM is built, and statistics are extracted from its states. The state means are linked in order to form a mean sequence, and the variance of the sequence class is defined as the sum of the variances of its component states. LSDA then learns a transformation by maximizing the separability between sequence classes while minimizing the within-sequence-class scatter. The DTW distance between mean sequences is used to measure the separability between sequence classes. We show that the optimization problem can be approximately transformed into an eigendecomposition problem. LDA can be seen as a special case of LSDA by treating non-sequential vectors as sequences of length one. The effectiveness of the proposed LSDA is demonstrated on two individual sequence datasets from the UCI machine learning repository as well as on two concatenated-sequence datasets: the APTI Arabic printed text database and the IFN/ENIT Arabic handwriting database.
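The pipeline the abstract describes can be sketched in a toy form. The sketch below is not the authors' implementation: it replaces HMM training with uniform segmentation as a crude stand-in for state alignment, builds a within-class scatter from per-state variances and a between-class scatter from DTW-aligned mean-sequence differences, and projects via the eigenvectors of inv(Sw) @ Sb. All function names (`dtw_path`, `class_model`, `lsda`) are illustrative assumptions.

```python
import numpy as np

def dtw_path(A, B):
    """Plain DTW between two mean sequences; returns the alignment path."""
    n, m = len(A), len(B)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.sum((A[i - 1] - B[j - 1]) ** 2)
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    path, i, j = [], n, m
    while i > 0 and j > 0:          # backtrack from the end of both sequences
        path.append((i - 1, j - 1))
        step = int(np.argmin([D[i - 1, j - 1], D[i - 1, j], D[i, j - 1]]))
        if step == 0:
            i, j = i - 1, j - 1
        elif step == 1:
            i -= 1
        else:
            j -= 1
    return path[::-1]

def class_model(seqs, n_states):
    """Uniform segmentation stand-in for HMM training: per-state means plus
    the class's within-state scatter (sum over states of state scatters)."""
    d = seqs[0].shape[1]
    buckets = [[] for _ in range(n_states)]
    for seq in seqs:
        bounds = np.linspace(0, len(seq), n_states + 1).astype(int)
        for s in range(n_states):
            buckets[s].extend(seq[bounds[s]:bounds[s + 1]])
    means = np.array([np.mean(b, axis=0) for b in buckets])
    Sw = np.zeros((d, d))
    for s, b in enumerate(buckets):
        X = np.array(b) - means[s]  # deviations of frames from their state mean
        Sw += X.T @ X
    return means, Sw

def lsda(class_seqs, n_states=3, out_dim=2):
    """Toy LSDA: maximize DTW-based between-class separability over
    within-class scatter by eigendecomposition of inv(Sw) @ Sb."""
    models = [class_model(seqs, n_states) for seqs in class_seqs]
    d = models[0][0].shape[1]
    Sb, Sw = np.zeros((d, d)), np.zeros((d, d))
    for _, Sw_c in models:
        Sw += Sw_c
    for a in range(len(models)):
        for b in range(a + 1, len(models)):
            for i, j in dtw_path(models[a][0], models[b][0]):
                diff = (models[a][0][i] - models[b][0][j])[:, None]
                Sb += diff @ diff.T
    Sw += 1e-6 * np.eye(d)          # regularize so Sw is invertible
    vals, vecs = np.linalg.eig(np.linalg.inv(Sw) @ Sb)
    order = np.argsort(-vals.real)
    return vecs.real[:, order[:out_dim]]

# Usage on synthetic data: two classes of 4-D vector sequences.
rng = np.random.default_rng(0)
cls0 = [rng.normal(0.0, 0.1, (10, 4)) for _ in range(5)]
cls1 = [rng.normal(1.0, 0.1, (12, 4)) for _ in range(5)]
W = lsda([cls0, cls1], n_states=3, out_dim=2)
print(W.shape)
```

Setting the number of states to one and using singleton sequences reduces this to ordinary LDA, mirroring the special-case claim in the abstract.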
Keywords :
handwriting recognition; handwritten character recognition; hidden Markov models; image sequences; learning (artificial intelligence); text analysis; APTI Arabic printed text database; DTW distance; HMM; IFN/ENIT Arabic handwriting database; LSDA; UCI machine learning repository; concatenated sequence dataset; eigendecomposition problem; linear sequence discriminant analysis; mean sequence; model-based dimensionality reduction method; nonsequential vectors; separability maximization; sequence class; sequence class variance; statistics; vector sequences; within-sequence class scatter minimization; Analytical models; Databases; Hidden Markov models; Optimization; Time series analysis; Training; Vectors;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
2013 IEEE International Conference on Computer Vision (ICCV)
Conference_Location :
Sydney, NSW
ISSN :
1550-5499
Type :
conf
DOI :
10.1109/ICCV.2013.115
Filename :
6751220