Abstract:
A feature-selection procedure is proposed for the class of distribution-free pattern classifiers [1], [2]. The procedure can be carried out readily on fixed (large) training samples using matrix inversion; if direct matrix inversion is to be avoided, the approximation method of [4] or the stochastic-approximation procedure of [2] can be applied to the training samples instead. Besides its statistical interpretation, the proposed procedure admits a mapping interpretation. It has the unique property of designing a pattern classifier under a single performance criterion, instead of the conventional division into receptor and categorizer, and it enables the system to come as close as possible to the minimum-risk ideal classifier. In particular, for two-class problems with normal distributions having equal covariance matrices, equal costs of misrecognition, and equal a priori probabilities, the procedure yields the optimum Bayes classifier without knowledge of the class distributions. Furthermore, the proposed feature-selection procedure coincides with that of the divergence computation. Experimental results, which are considered satisfactory, are presented.
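The matrix-inversion training step described above can be illustrated with a minimal sketch. The sketch below is an assumption, not the paper's exact procedure: it fits a minimum mean-square-error linear classifier on fixed training samples by solving the normal equations directly, and for two Gaussian classes with equal covariance its decision rule approaches the Bayes-optimal linear discriminant mentioned in the abstract. The data dimensions, class means, and labels are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-class problem: equal (identity) covariance,
# different means, equal a priori probabilities.
n = 200
X0 = rng.normal(loc=-1.0, scale=1.0, size=(n, 2))  # class -1 samples
X1 = rng.normal(loc=+1.0, scale=1.0, size=(n, 2))  # class +1 samples
X = np.vstack([X0, X1])
X = np.hstack([X, np.ones((2 * n, 1))])            # augment with a bias term
y = np.concatenate([-np.ones(n), np.ones(n)])      # desired outputs -1 / +1

# Direct matrix inversion on the fixed training samples:
# solve the normal equations (X^T X) w = X^T y.
w = np.linalg.inv(X.T @ X) @ (X.T @ y)

# Classify by the sign of the resulting linear discriminant.
pred = np.sign(X @ w)
accuracy = np.mean(pred == y)
```

When direct inversion is undesirable (e.g., for ill-conditioned or streaming data), the same weight vector can be approached iteratively, which corresponds to the stochastic-approximation alternative the abstract cites.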