The edited k-nearest neighbor rule (k-NNR) consists of 1) eliminating those samples from the data which are not classified correctly by the k-NNR and the remainder of the data, and 2) using the NNR with the samples which remain from 1) to classify new observations. Wilson has shown that this rule has an asymptotic probability of error which is better than that of the k-NNR. A key step in his development is showing the convergence of the edited nearest neighbor. His lengthy argument is replaced here by a somewhat simpler one which uses an intuitive fact about the editing procedure.
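For concreteness, the following is a minimal sketch of the two-step procedure described above: step 1 edits the data by discarding every sample that the k-NNR, applied with the remaining samples, misclassifies; step 2 classifies a new observation with the 1-NNR on the edited set. The Euclidean metric, majority vote, function names, and toy data are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def knn_labels(X, y, x, k):
    """Labels of the k training points nearest to x (Euclidean distance)."""
    d = np.linalg.norm(X - x, axis=1)
    return y[np.argsort(d)[:k]]

def edit(X, y, k=3):
    """Step 1: keep only the samples that the k-NNR, using the remainder
    of the data (leave-one-out), classifies correctly."""
    keep = []
    for i in range(len(X)):
        rest = np.arange(len(X)) != i
        votes = knn_labels(X[rest], y[rest], X[i], k)
        # Majority vote among the k neighbors (ties broken by smallest label).
        vals, counts = np.unique(votes, return_counts=True)
        if vals[np.argmax(counts)] == y[i]:
            keep.append(i)
    return X[keep], y[keep]

def classify(X_edited, y_edited, x):
    """Step 2: 1-NNR using the edited samples."""
    return y_edited[np.argmin(np.linalg.norm(X_edited - x, axis=1))]

# Toy usage with synthetic two-class data.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(2, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
X_e, y_e = edit(X, y, k=3)
print(classify(X_e, y_e, np.array([1.0, 1.0])))
```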