Abstract:
Distance measures and information functions for feature selection are compared. The comparison is based on the available tight upper and lower bounds on the probability of misrecognition, the rates of change of that probability, the effectiveness of a feature subset, and the computational complexity of each criterion.
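As a concrete illustration of the error bounds the abstract refers to, the following sketch computes the Bhattacharyya coefficient of two discrete class-conditional distributions and the classical upper bound on the two-class probability of misrecognition, P_e &lt;= sqrt(P1 P2) * BC. The example distributions `p` and `q` are illustrative, not taken from the paper.

```python
import math

def bhattacharyya_coefficient(p, q):
    """Bhattacharyya coefficient of two discrete distributions p and q.

    BC = sum_i sqrt(p_i * q_i); it equals 1 when the distributions
    coincide and approaches 0 as they separate.
    """
    return sum(math.sqrt(pi * qi) for pi, qi in zip(p, q))

def error_upper_bound(p, q, prior1=0.5, prior2=0.5):
    """Upper bound on the two-class Bayes probability of misrecognition:
    P_e <= sqrt(prior1 * prior2) * BC(p, q)."""
    return math.sqrt(prior1 * prior2) * bhattacharyya_coefficient(p, q)

# Illustrative class-conditional distributions over three feature values.
p = [0.6, 0.3, 0.1]
q = [0.1, 0.3, 0.6]
bc = bhattacharyya_coefficient(p, q)
bound = error_upper_bound(p, q)
```

In a feature-selection setting, a subset yielding a smaller coefficient (and hence a smaller error bound) is preferred, which is one basis of the comparison described above.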
Keywords:
Bhattacharyya coefficients; computational complexity; entropy criterion; error bounds; feature extraction; feature subsets; Gaussian distribution; pattern recognition; probability of misrecognition; upper bound.