## Rate of Convergence of $k$-Nearest-Neighbor Classification Rule

*Maik Döring, László Györfi, Harro Walk*; 18(227):1−16, 2018.

### Abstract

A binary classification problem is considered. The excess error
probability of the $k$-nearest-neighbor classification rule over
the error probability of the Bayes decision is revisited via a
decomposition of the excess error probability into approximation
and estimation errors. Under a weak margin condition combined with
either a modified Lipschitz condition or a local Lipschitz
condition, tight upper bounds are presented that avoid the
assumption that the feature vector is bounded. The modified
Lipschitz condition is also applied to discrete distributions. As
a consequence of both concepts, we present the rate of convergence
of the $L_2$ error for the corresponding nearest-neighbor
regression estimate.
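The rule analyzed here is the standard $k$-nearest-neighbor classifier: a majority vote over the labels of the $k$ training points closest to the query. A minimal illustrative sketch (all identifiers are hypothetical, not from the paper):

```python
import math

def knn_classify(train, query, k):
    """k-NN rule: majority vote of the labels of the k training
    points closest (in Euclidean distance) to the query point."""
    # train: list of (feature_vector, label) pairs with label in {0, 1}
    neighbors = sorted(train, key=lambda p: math.dist(p[0], query))[:k]
    votes = sum(label for _, label in neighbors)
    return 1 if 2 * votes > k else 0  # ties resolved in favor of label 0

# Toy data: two well-separated clusters
train = [((0.0, 0.0), 0), ((0.1, 0.2), 0), ((0.2, 0.1), 0),
         ((1.0, 1.0), 1), ((0.9, 1.1), 1), ((1.1, 0.9), 1)]
print(knn_classify(train, (0.05, 0.1), k=3))  # → 0
print(knn_classify(train, (1.05, 1.0), k=3))  # → 1
```

The paper's excess error probability is the gap between this rule's error probability and that of the Bayes decision, which thresholds the true regression function at $1/2$.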
