The rate of convergence of the nearest neighbor (NN) rule is investigated when independent identically distributed samples take values in a d-dimensional Euclidean space. The common distribution of the sample points need not be absolutely continuous. An upper bound consisting of two exponential terms is given for the probability of large deviations of the error probability from the asymptotic error found by Cover and Hart. The asymptotically dominant first term of this bound is distribution-free, and its negative exponent goes to infinity approximately as fast as the square root of the number of preclassified samples. The second term depends on the underlying distributions, but its exponent is proportional to the sample size. The main term is given explicitly and depends only very weakly on the dimension of the space.
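
Schematically, a bound of the kind described above has the following shape; the symbols here are illustrative placeholders for the structure stated in the abstract, not the paper's explicit expressions:

\[
\Pr\bigl(\,|R_n - R_\infty| > \varepsilon\,\bigr) \;\le\; c_1(\varepsilon)\, e^{-a(\varepsilon)\sqrt{n}} \;+\; c_2(\varepsilon)\, e^{-b(\varepsilon)\, n},
\]

where R_n denotes the error probability of the NN rule designed from n preclassified samples, R_\infty the Cover–Hart asymptotic error, the first term is distribution-free and asymptotically dominant, and the constants of the second term depend on the underlying distributions.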
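
As background, the decision rule under study is simple to state in code. The following is a minimal sketch of the 1-NN rule in Python with NumPy; the function name nn_classify and the toy data are illustrative, not part of the paper:

    import numpy as np

    def nn_classify(train_x, train_y, query):
        """Return the label of the training sample nearest to `query`
        in Euclidean distance (the 1-NN rule of Cover and Hart)."""
        # Squared Euclidean distances from the query to each preclassified sample.
        dists = np.sum((train_x - query) ** 2, axis=1)
        # Ties are broken by argmin's first-occurrence convention.
        return train_y[np.argmin(dists)]

    # Toy usage in the plane (d = 2): two preclassified samples.
    train_x = np.array([[0.0, 0.0], [1.0, 1.0]])
    train_y = np.array([0, 1])
    print(nn_classify(train_x, train_y, np.array([0.9, 0.8])))  # prints 1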