Abstract
It is well known that editing techniques can be applied to (large) sets of prototypes in order to bring the error rate of the Nearest Neighbor classifier close to the optimal Bayes risk. In practice, however, these techniques tend to behave much worse than the asymptotic predictions suggest. A novel editing technique is introduced here which explicitly aims at obtaining a good editing rule for each given prototype set. This is achieved by first learning an adequate weight for each prototype and then pruning out those prototypes with large weights. Experiments are presented which clearly show the superiority of this new method, especially for small data sets and/or large dimensions.
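The abstract does not specify how the prototype weights are learned, so the following is only a minimal sketch of the weight-then-prune idea, using a hypothetical leave-one-out-style heuristic (ratio of nearest same-class to nearest different-class distance) and an illustrative threshold; the paper's actual weight-learning procedure may differ.

```python
import numpy as np

def learn_prototype_weights(X, y):
    """Assign each prototype a weight (illustrative heuristic, not the paper's rule):
    distance to its nearest same-class prototype divided by distance to its nearest
    different-class prototype, so borderline or noisy prototypes get large weights."""
    n = len(X)
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)          # a prototype is never its own neighbor
    weights = np.empty(n)
    for i in range(n):
        same = d[i][y == y[i]]           # self excluded via the inf diagonal
        other = d[i][y != y[i]]
        weights[i] = same.min() / other.min()
    return weights

def edit_prototypes(X, y, threshold=1.0):
    """Prune prototypes whose learned weight exceeds a (hypothetical) threshold."""
    w = learn_prototype_weights(X, y)
    keep = w <= threshold
    return X[keep], y[keep]

def nn_classify(X_edit, y_edit, x):
    """Plain 1-NN classification on the edited prototype set."""
    dists = np.linalg.norm(X_edit - x, axis=1)
    return y_edit[np.argmin(dists)]
```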