Abstract
A novel pattern classification method, called the nearest feature line (NFL), has recently been proposed by one of the authors. The NFL provides a better alternative to the popular nearest neighbor (NN) classifier when multiple prototypes per class are available, and it has been shown to achieve consistently lower error rates than the NN on both simulated data and real application data. This paper gives a theoretical justification of the NFL. The main result is a proof that the NFL can achieve a lower probabilistic error than the NN when the number of available prototypes for each class is finite and the dimension of the feature space is high. A simulation experiment shows that the NFL produces a considerably lower error rate than the NN.
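To make the classifier under discussion concrete, the following is a minimal sketch of the standard NFL decision rule: a query point is assigned to the class whose feature line (the line through a pair of same-class prototypes) it is closest to. The function name and the dictionary-based prototype layout are illustrative choices, not part of the original paper.

```python
import numpy as np

def nfl_classify(x, prototypes):
    """Classify x by the nearest feature line (NFL) rule.

    prototypes: dict mapping class label -> array of shape (n, d),
    the n prototype feature vectors of that class.
    """
    best_label, best_dist = None, np.inf
    for label, protos in prototypes.items():
        n = len(protos)
        # consider every pair of prototypes within the class
        for i in range(n):
            for j in range(i + 1, n):
                xi, xj = protos[i], protos[j]
                d = xj - xi
                # orthogonally project x onto the line through xi and xj
                t = np.dot(x - xi, d) / np.dot(d, d)
                p = xi + t * d
                dist = np.linalg.norm(x - p)
                if dist < best_dist:
                    best_dist, best_label = dist, label
    return best_label

# example: two classes with two prototypes each
protos = {
    0: np.array([[0.0, 0.0], [1.0, 0.0]]),  # feature line y = 0
    1: np.array([[0.0, 2.0], [1.0, 2.0]]),  # feature line y = 2
}
print(nfl_classify(np.array([0.5, 0.4]), protos))  # closer to the y = 0 line
```

Because the projection point may lie outside the segment joining the two prototypes, the feature line interpolates and extrapolates between prototypes, which is precisely how the NFL makes fuller use of a finite prototype set than the NN rule.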