Title :
KLDA - An Iterative Approach to Fisher Discriminant Analysis
Author :
Lu, Fangfang ; Li, Hongdong
Author_Institution :
Australian Nat. Univ., Acton
Date :
16-19 Sept. 2007
Abstract :
In this paper, we present an iterative approach to Fisher discriminant analysis, called Kullback-Leibler discriminant analysis (KLDA), for both linear and nonlinear feature extraction. We cast the conventional problem of discriminative feature extraction as a function optimization problem and recover the feature transformation matrix by maximizing the objective function. The proposed objective function is defined over the distances between all pairs of classes, with the Kullback-Leibler divergence adopted to measure the disparity between the distributions of each pair. The proposed algorithm extends naturally to nonlinear data via the kernel trick. Experimental results on real-world databases demonstrate the effectiveness of both the linear and kernel versions of our algorithm.
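The abstract describes the method only at a high level. Below is a minimal sketch of how a KL-divergence-based discriminant objective of this kind could be set up and maximized numerically, assuming Gaussian class-conditional densities in the projected space. The function names, the covariance regularization term, and the use of scipy.optimize.minimize with numerical gradients are illustrative assumptions, not the authors' implementation or the paper's specific iterative update.

```python
# Sketch: maximize the sum of symmetric pairwise KL divergences between
# projected class distributions (assumed Gaussian) over a d x k projection W.
import numpy as np
from scipy.optimize import minimize

def gaussian_kl(m0, S0, m1, S1):
    """Closed-form KL( N(m0,S0) || N(m1,S1) ) for k-dimensional Gaussians."""
    k = m0.shape[0]
    S1_inv = np.linalg.inv(S1)
    diff = m1 - m0
    return 0.5 * (np.trace(S1_inv @ S0) + diff @ S1_inv @ diff - k
                  + np.log(np.linalg.det(S1) / np.linalg.det(S0)))

def klda_objective(W_flat, X, y, k, reg=1e-6):
    """Sum of symmetric KL divergences over all pairs of projected classes."""
    d = X.shape[1]
    W = W_flat.reshape(d, k)
    Z = X @ W                                  # project data to k dimensions
    stats = []
    for c in np.unique(y):
        Zc = Z[y == c]
        cov = np.atleast_2d(np.cov(Zc, rowvar=False)) + reg * np.eye(k)
        stats.append((Zc.mean(axis=0), cov))
    J = 0.0
    for i in range(len(stats)):
        for j in range(i + 1, len(stats)):
            mi, Si = stats[i]
            mj, Sj = stats[j]
            J += gaussian_kl(mi, Si, mj, Sj) + gaussian_kl(mj, Sj, mi, Si)
    return J

def fit_klda(X, y, k, seed=0):
    """Recover a d x k transformation by numerically maximizing the objective."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    W0 = rng.standard_normal((d, k)).ravel()
    res = minimize(lambda w: -klda_objective(w, X, y, k), W0, method="L-BFGS-B")
    return res.x.reshape(d, k)
```

A kernel version would follow the same pattern but parameterize the projection as a combination of kernel evaluations against the training set; that extension is not shown here.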
Keywords :
feature extraction; iterative methods; matrix algebra; Fisher discriminant analysis; KLDA iterative approach; Kullback-Leibler discriminant analysis; function optimization; linear feature extraction; maximization; nonlinear feature extraction; transformation matrix; Algorithm design and analysis; Australia; Covariance matrix; Information analysis; Iterative methods; Kernel; Linear discriminant analysis; Matrix decomposition; Pattern analysis; Scattering; Kernel Fisher Discriminant Analysis; Kullback-Leibler Divergence; Linear Discriminant Analysis; Optimization;
Conference_Titel :
2007 IEEE International Conference on Image Processing (ICIP 2007)
Conference_Location :
San Antonio, TX
Print_ISBN :
978-1-4244-1437-6
Electronic_ISSN :
1522-4880
DOI :
10.1109/ICIP.2007.4379127