Authors:
Foley, Donald H. ; Sammon, John W., Jr.
Abstract:
A new method for the extraction of features in a two-class pattern recognition problem is derived. The main advantage is that the method for selecting features is based entirely upon discrimination or separability as opposed to the more common approach of fitting. The classical example of fitting is the use of the eigenvectors of the lumped covariance matrix corresponding to the largest eigenvalues. In an analogous manner, the new technique selects discriminant vectors (or features) corresponding to the largest "discrim-values." The new method is compared to some of the more popular alternative techniques via both data-dependent and mathematical examples. In addition, a recursive method for obtaining the discriminant vectors is given.
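The contrast the abstract draws, fitting via eigenvectors of the lumped covariance matrix versus discrimination via discriminant vectors, can be illustrated with a small sketch. The example below is not the authors' recursive procedure; it uses the classical two-class Fisher direction (which coincides with the first discriminant vector in the two-class case) on synthetic data chosen so that the direction of largest variance is not the most separating one. All data and variable names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic two-class data (illustrative, not from the paper): large
# variance along x, but the classes are separated only along y.
X1 = rng.normal(loc=[0.0, 0.0], scale=[3.0, 0.3], size=(200, 2))
X2 = rng.normal(loc=[0.0, 2.0], scale=[3.0, 0.3], size=(200, 2))

m1, m2 = X1.mean(axis=0), X2.mean(axis=0)

# "Fitting": first eigenvector of the lumped (pooled) covariance matrix,
# i.e. the direction of largest overall variance.
lumped = np.cov(np.vstack([X1, X2]).T)
eigvals, eigvecs = np.linalg.eigh(lumped)
pca_dir = eigvecs[:, np.argmax(eigvals)]

# "Discrimination": the Fisher direction S_w^{-1} (m1 - m2), normalized.
# In the two-class setting this is the first discriminant vector.
Sw = np.cov(X1.T) + np.cov(X2.T)  # within-class scatter (up to scale)
disc_dir = np.linalg.solve(Sw, m1 - m2)
disc_dir /= np.linalg.norm(disc_dir)

# The variance-fitting direction follows the large but non-discriminating
# x-spread; the discriminant direction aligns with the class separation (y).
print("Lumped-covariance eigenvector:", np.round(pca_dir, 3))
print("Discriminant direction:       ", np.round(disc_dir, 3))
```

Projecting the data onto `disc_dir` separates the two classes, while projecting onto `pca_dir` mixes them, which is the essence of the "discrimination versus fitting" distinction.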
Keywords:
Dimensionality reduction, discriminants, eigenvectors, feature extraction, feature ranking, feature selection, Karhunen-Loeve expansions, multivariate data analysis, pattern classification, pattern recognition, covariance matrix, eigenvalues and eigenfunctions, Karhunen-Loeve transforms, logic design, pattern analysis, piecewise linear techniques, vectors.