We apply our method to the UIUC textures, Oxford flowers, and Caltech 101 and 256 object categorisation databases. Since we would like to test how general the technique is, we assume that no prior knowledge is available and that no descriptor is a priori preferable to any other. We therefore set σk to be constant for all k and do not make use of the constraints Ad ≥ p (unless otherwise stated). The only parameters left to set are C, the misclassification penalty, and the kernel parameters γk. These parameters are not tweaked: C is set to 1000 for all classifiers and databases, and γk is set to one over the mean of the kth distances over the training set for the given pairwise classification task. Note that the kernel parameters could instead have been learnt by creating many base kernels, each with a different value of γk, and seeing which ones get selected. It is also possible to learn 1/C analogously in an l2 SVM setting.
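The γk heuristic above can be sketched in a few lines. This is an illustrative reconstruction, not the authors' code: the function name and the toy distance matrix are hypothetical, and we assume each descriptor k supplies a symmetric matrix of pairwise training distances from which the base kernel is formed as exp(−γk d).

```python
import numpy as np

def base_kernel(dist_k):
    """Build a base kernel from an (n, n) pairwise-distance matrix for
    descriptor k, with gamma_k set to one over the mean training distance
    (the parameter-free heuristic described in the text)."""
    gamma_k = 1.0 / dist_k.mean()      # no tuning of gamma_k
    return np.exp(-gamma_k * dist_k)   # generalised RBF-style kernel

# Toy example (hypothetical data): one descriptor's symmetric distances.
rng = np.random.default_rng(0)
d = rng.random((5, 5))
d = (d + d.T) / 2.0
np.fill_diagonal(d, 0.0)
K = base_kernel(d)
```

The resulting K is symmetric with a unit diagonal; building one such kernel per descriptor (or several per descriptor, one for each candidate γk) yields the bank of base kernels among which the learner selects.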
We compare our algorithm to the Multiple Kernel Learning block l1-regularisation method of [Bach et al. NIPS 2004], for which code is publicly available. All experimental results are calculated over 20 random train/test splits of the data, except for 1-vs-All results, which are calculated over 3 splits.
Please note that the results reported on the Caltech databases are not valid. Some of the kernel matrices on which the results were based have errors. Thus, please disregard the Caltech results and do not cite the paper in reference to these datasets. The results on the other two datasets, textures and flowers, are still valid.