Kernel matrix completion by semidefinite programming

We consider the problem of missing data in kernel-based learning algorithms. We explain how semidefinite programming can be used to perform an approximate weighted completion of the kernel matrix that ensures positive semidefiniteness and hence Mercer's condition. In numerical experiments we apply a support vector machine to the XOR classification task, based on randomly sparsified kernel matrices derived from a polynomial kernel of degree 2. The approximate completion algorithm leads to better generalisation and fewer support vectors than a simple spectral truncation method, at the cost of considerably longer runtime. We argue that semidefinite programming provides an interesting convex optimisation framework for machine learning in general and for kernel machines in particular.
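
The following Python sketch (not the authors' code) illustrates the general idea: a partially observed kernel matrix is completed by a semidefinite programme that enforces positive semidefiniteness, and compared against a simple spectral truncation baseline. The weighted least-squares objective, the cvxpy dependency, and all function and variable names are assumptions for illustration only, not the paper's exact formulation.

```python
import numpy as np
import cvxpy as cp


def sdp_completion(K_obs, mask):
    """Find a PSD matrix K matching the observed entries of K_obs (mask == 1)
    in a least-squares sense; the PSD constraint enforces Mercer's condition."""
    n = K_obs.shape[0]
    K = cp.Variable((n, n), PSD=True)
    objective = cp.Minimize(cp.sum_squares(cp.multiply(mask, K - K_obs)))
    cp.Problem(objective).solve()
    return K.value


def spectral_truncation(K_obs):
    """Baseline: treat missing entries as zeros and project onto the PSD cone
    by clipping negative eigenvalues."""
    eigvals, eigvecs = np.linalg.eigh((K_obs + K_obs.T) / 2)
    eigvals = np.clip(eigvals, 0.0, None)
    return eigvecs @ np.diag(eigvals) @ eigvecs.T


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # XOR-style inputs and a degree-2 polynomial kernel, as in the experiments.
    X = rng.uniform(-1, 1, size=(20, 2))
    K_full = (X @ X.T + 1.0) ** 2
    # Randomly sparsify: hide roughly 30% of the off-diagonal entries.
    mask = np.triu((rng.random(K_full.shape) > 0.3).astype(float), 1)
    mask = mask + mask.T + np.eye(len(X))
    K_obs = K_full * mask

    print("SDP completion error:      ",
          np.linalg.norm(sdp_completion(K_obs, mask) - K_full))
    print("Spectral truncation error: ",
          np.linalg.norm(spectral_truncation(K_obs) - K_full))
```

In this sketch the SDP typically recovers the hidden entries more faithfully than spectral truncation, at the price of solving a convex programme rather than a single eigendecomposition, which mirrors the accuracy/runtime trade-off reported in the abstract.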

File  graepel02.ps.gz

In Proceedings of the International Conference on Artificial Neural Networks, ICANN 2002

Publisher  Springer

Details

Type  Inproceedings
Pages  694–699
Number  2415
Series  Lecture Notes in Computer Science