Pattern Recognition, International Conference on

Abstract

In the pattern recognition literature, it is well known that a finite number of training samples causes practical difficulties in designing a classifier. Moreover, the generalization error of a classifier tends to increase as the number of features grows. In this paper, we study the generalization error of several classifiers (MLPNN, RBFNN, K-NN) in high-dimensional spaces under a practical condition: the ratio of the training sample size to the dimensionality is small. Experimental results show that the generalization error of the neural classifiers decreases as a function of dimensionality, while it increases for the statistical classifiers.
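The effect described for the statistical classifier can be illustrated with a small sketch. The experiment below is a hypothetical illustration, not the paper's actual protocol: it trains a brute-force 1-NN classifier (implemented here in plain NumPy) on a fixed, small training set drawn from two Gaussian classes whose separation lives in a single coordinate, then raises the dimensionality so that the training-sample-to-dimensionality ratio shrinks. The class means, sample sizes, and seed are all assumptions chosen for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def knn_error(d, n_train=20, n_test=500):
    """Estimate the 1-NN test error on two Gaussian classes in d dimensions.

    The class signal lives only in the first coordinate; the remaining
    d-1 coordinates are pure noise, so increasing d dilutes the signal
    while the training sample size stays fixed (small n/d ratio).
    """
    def sample(n):
        X = rng.standard_normal((n, d))
        y = rng.integers(0, 2, n)
        X[:, 0] += 2.0 * y  # shift class 1 along the first axis
        return X, y

    Xtr, ytr = sample(n_train)
    Xte, yte = sample(n_test)

    # Brute-force nearest neighbour: squared distances test x train.
    d2 = ((Xte[:, None, :] - Xtr[None, :, :]) ** 2).sum(axis=-1)
    yhat = ytr[np.argmin(d2, axis=1)]
    return float((yhat != yte).mean())

# Error as dimensionality grows with a fixed training sample.
errors = {d: knn_error(d) for d in (2, 10, 50, 200)}
print(errors)
```

With the signal confined to one coordinate and the training set held at 20 points, the estimated error typically drifts toward chance level as d grows, consistent with the degradation the abstract reports for the statistical classifiers under a small sample-to-dimensionality ratio.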