IKNN: Informative K-Nearest Neighbor Classification

Yang Song, Jian Huang, Ding Zhou, Hongyuan Zha, and C. Lee Giles


The K-nearest neighbor (KNN) decision rule has been a ubiquitous classification tool with good scalability. Past experience has shown that the optimal choice of K depends upon the data, making it laborious to tune the parameter for different applications. We introduce a new metric that measures the informativeness of objects to be classified. By applying it as a query-based distance metric to measure the closeness between objects, we propose two novel KNN procedures, Locally Informative-KNN (LI-KNN) and Globally Informative-KNN (GI-KNN). By selecting a subset of the most informative objects from neighborhoods, our methods exhibit stability with respect to changes in the input parameters, the number of neighbors (K) and the number of informative points (I). Experiments on UCI benchmark data and diverse real-world data sets indicate that our approaches are application-independent and can generally outperform several popular KNN extensions, as well as SVM and Boosting methods.
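The core idea, selecting the I most informative of the K nearest neighbors before voting, can be illustrated with a minimal sketch. Note that the paper's actual informativeness metric is probabilistic and not given in this abstract; the score below is a hypothetical stand-in that rewards neighbors which are both close to the query and consistent with their own local neighborhood.

```python
import numpy as np

def li_knn_predict(X_train, y_train, x_query, k=7, i=3):
    """Classify x_query by majority vote over the i most 'informative'
    of its k nearest neighbors.

    Informativeness here is a stand-in heuristic, not the paper's
    definition: a neighbor scores higher when it is close to the query
    AND its label agrees with the labels of the points around it.
    """
    X_train = np.asarray(X_train, dtype=float)
    y_train = np.asarray(y_train)

    # distances from the query to every training point
    d_query = np.linalg.norm(X_train - x_query, axis=1)
    knn_idx = np.argsort(d_query)[:k]  # indices of the k nearest neighbors

    scores = []
    for j in knn_idx:
        # local label agreement: fraction of j's own k nearest training
        # points (excluding j itself) that share j's label
        d_j = np.linalg.norm(X_train - X_train[j], axis=1)
        near_j = np.argsort(d_j)[1:k + 1]
        agreement = np.mean(y_train[near_j] == y_train[j])
        # closeness to the query, mapped into (0, 1]
        closeness = 1.0 / (1.0 + d_query[j])
        scores.append(agreement * closeness)

    # keep only the i most informative neighbors and take a majority vote
    top = knn_idx[np.argsort(scores)[::-1][:i]]
    labels, counts = np.unique(y_train[top], return_counts=True)
    return labels[np.argmax(counts)]
```

Restricting the vote to the informative subset is what gives the method its claimed insensitivity to the exact choice of K: uninformative neighbors admitted by a too-large K are filtered out before they can sway the vote.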


Publication type: Inproceedings
Published in: PKDD 2007
Publisher: Springer-Verlag