Abstract
The small sample size (SSS) problem is an essential issue in high-dimensional data classification because collecting samples is generally expensive and difficult. The SSS problem often leads to unsatisfactory classification results. Feature extraction (or selection) and classifier enhancement are two commonly used approaches for overcoming the SSS problem and achieving better classification results. The former reduces the data dimensionality and then uses the reduced-dimensionality data set to train a classifier; the latter modifies the classifier design to suit the SSS problem. Recent literature shows that a classifier performs better on a data set transformed by nonparametric feature extraction than on one transformed by the well-known parametric method, linear discriminant analysis (LDA). In this paper, we propose a novel K-nearest neighbor (KNN) classifier, namely the adaptive KNN (AKNN), a KNN-type classifier that embeds the merits of a well-performing nonparametric feature extraction method, NLDA. In AKNN, the distance metrics are formed from NLDA features. In the training phase of AKNN, a metric is estimated and assigned to each training sample; in the classification phase, a weighted metric for the test sample is computed from the metrics of its K nearest training samples. Based on this weighted metric, the test sample is assigned to the majority class among its K nearest neighbors. A remotely sensed hyperspectral benchmark image is used to investigate the effectiveness of AKNN. Experimental results demonstrate that the proposed AKNN outperforms the classic KNN and support vector machine (SVM) classifiers.
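The abstract describes the AKNN procedure only at a high level, so the following Python sketch is one plausible reading of it rather than the paper's exact algorithm. It assumes each training sample carries a positive semidefinite metric matrix (estimated, per the abstract, from NLDA features) and that the test sample's weighted metric combines its neighbors' metrics by inverse-distance weighting, a scheme the abstract does not specify. The function names (aknn_predict, mahalanobis_sq) are hypothetical.

    import numpy as np
    from collections import Counter

    def mahalanobis_sq(x, Y, M):
        """Squared distance (x - y)^T M (x - y) for each row y of Y."""
        D = Y - x
        return np.einsum("ij,jk,ik->i", D, M, D)

    def aknn_predict(x, X_train, y_train, metrics, k=5, eps=1e-8):
        """Minimal AKNN sketch (assumed form, not the paper's exact method).

        metrics: array of shape (n, d, d), one PSD metric matrix per
                 training sample, assumed estimated from NLDA features.
        """
        # Step 1: locate the k nearest training samples under Euclidean distance.
        d_eucl = np.linalg.norm(X_train - x, axis=1)
        nn = np.argsort(d_eucl)[:k]

        # Step 2: combine the neighbors' metrics into a weighted metric for x,
        # using inverse-distance weights (an assumed weighting scheme).
        w = 1.0 / (d_eucl[nn] + eps)
        w /= w.sum()
        M_x = np.tensordot(w, metrics[nn], axes=1)  # (d, d) weighted metric

        # Step 3: re-rank the training samples under the weighted metric and
        # assign x to the majority class among the new k nearest neighbors.
        d_adapt = mahalanobis_sq(x, X_train, M_x)
        nn_adapt = np.argsort(d_adapt)[:k]
        return Counter(y_train[nn_adapt]).most_common(1)[0][0]

In this reading, the per-sample metrics make the distance function locally adaptive: a test point falling near a class boundary is measured with a metric blended from nearby training samples, rather than with one global metric as in the classic KNN.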