An Empirical Comparison of Weighting Functions for the Multi-Label Distance-Weighted K-Nearest Neighbour Method

Authors

Jianhua Xu, Nanjing Normal University, China

Abstract

Multi-label classification is an extension of the classical multi-class setting, in which any instance can be associated with several classes simultaneously, so the classes are no longer mutually exclusive. It has been shown experimentally that the distance-weighted k-nearest neighbour (DWkNN) algorithm is superior to the original kNN rule for multi-class learning. However, it has not been investigated whether the distance-weighted strategy is also valid for multi-label learning, nor which weighting function performs well. In this paper, we provide a concise multi-label DWkNN form (MLC-DWkNN). Furthermore, four weighting functions are collected and investigated through detailed experiments on three benchmark data sets with the Manhattan distance: Dudani's linear function varying from 1 to 0, Macleod's linear function ranging from 1 to 1/2, Dudani's inverse distance function, and Zavrel's exponential function. Our study demonstrates that Dudani's linear function and Zavrel's exponential function work well, and moreover that MLC-DWkNN equipped with these two functions outperforms the existing kNN-based multi-label classifier ML-kNN.
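
The paper itself defines the precise MLC-DWkNN formulation; as a rough illustration of the ingredients named in the abstract, the Python sketch below implements Manhattan-distance neighbour search, the four weighting functions, and a weighted per-label vote. The exact parameterisations of Macleod's and Zavrel's functions, the function names, and the 0.5 voting threshold are assumptions made for illustration, not the authors' definitions.

    import numpy as np

    def manhattan_distances(x, X):
        """L1 (Manhattan) distances from a query x to each row of X."""
        return np.abs(X - x).sum(axis=1)

    def dudani_linear(d):
        """Dudani's linear weights: 1 for the nearest neighbour, 0 for the k-th."""
        d1, dk = d[0], d[-1]
        if dk == d1:
            return np.ones_like(d)
        return (dk - d) / (dk - d1)

    def macleod_linear(d):
        """Linear weights rescaled to the range [1/2, 1] (assumed form)."""
        return 0.5 + 0.5 * dudani_linear(d)

    def inverse_distance(d, eps=1e-12):
        """Dudani's inverse-distance weights, guarded against zero distance."""
        return 1.0 / (d + eps)

    def zavrel_exponential(d, alpha=1.0):
        """Zavrel's exponential-decay weights (alpha is an assumed hyperparameter)."""
        return np.exp(-alpha * d)

    def mlc_dwknn_predict(x, X_train, Y_train, k=10,
                          weight_fn=dudani_linear, threshold=0.5):
        """Predict a label set for query x by weighted voting over the k nearest neighbours.

        Y_train is an (n_samples, n_labels) binary indicator matrix. A label is
        assigned when its weighted vote reaches `threshold` of the total weight.
        """
        d = manhattan_distances(x, X_train)
        idx = np.argsort(d)[:k]        # indices of the k nearest neighbours
        w = weight_fn(d[idx])          # weights ordered from nearest to farthest
        scores = w @ Y_train[idx]      # weighted vote per label
        return (scores >= threshold * w.sum()).astype(int)

Swapping `weight_fn` among the four functions reproduces the kind of comparison the abstract describes, with `dudani_linear` and `zavrel_exponential` reported as the strongest choices.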

Keywords

Multi-label Classification, k-Nearest Neighbours, Weighting Function, Manhattan Distance

Volume 1, Number 3