Kernel matrix completion by semidefinite programming

Thore Graepel


We consider the problem of missing data in kernel-based learning algorithms. We explain how semidefinite programming can be used to perform an approximate weighted completion of the kernel matrix that ensures positive semidefiniteness and hence satisfies Mercer's condition. In numerical experiments we apply a support vector machine to the XOR classification task based on randomly sparsified kernel matrices obtained from a polynomial kernel of degree 2. The approximate completion algorithm leads to better generalisation and fewer support vectors than a simple spectral truncation method, at the cost of considerably longer runtime. We argue that semidefinite programming provides an interesting convex optimisation framework for machine learning in general and for kernel machines in particular.
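The abstract describes a weighted, approximate completion of a partially observed kernel matrix under a positive-semidefiniteness constraint, compared against a spectral truncation baseline. The sketch below is not the paper's code; it is a minimal illustration of the idea, assuming the cvxpy modelling library for the semidefinite programme and hypothetical function names, mask, and weights.

import numpy as np
import cvxpy as cp

def complete_kernel_sdp(K_obs, mask, weights=None):
    # Find a positive semidefinite matrix K whose observed entries stay close
    # to those of K_obs in a weighted least-squares sense.
    n = K_obs.shape[0]
    if weights is None:
        weights = np.ones((n, n))
    K = cp.Variable((n, n), PSD=True)                  # PSD constraint -> valid Mercer kernel
    residual = cp.multiply(weights * mask, K - K_obs)  # penalise observed entries only
    problem = cp.Problem(cp.Minimize(cp.sum_squares(residual)))
    problem.solve()
    return K.value

def complete_kernel_spectral(K_obs, mask):
    # Simple spectral truncation baseline: fill missing entries with zero,
    # symmetrise, and clip negative eigenvalues.
    K = np.where(mask, K_obs, 0.0)
    K = (K + K.T) / 2
    w, V = np.linalg.eigh(K)
    return (V * np.clip(w, 0.0, None)) @ V.T

# Toy usage on the XOR points with a degree-2 polynomial kernel,
# mirroring the experimental setup sketched in the abstract.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
K_full = (X @ X.T + 1.0) ** 2                  # polynomial kernel of degree 2
rng = np.random.default_rng(0)
mask = rng.random(K_full.shape) > 0.3          # randomly sparsified observations
mask = mask & mask.T                           # keep the mask symmetric
np.fill_diagonal(mask, True)
K_obs = np.where(mask, K_full, 0.0)
K_completed = complete_kernel_sdp(K_obs, mask)

The completed matrix could then be supplied to a support vector machine as a precomputed kernel, which is how a comparison between the two completion strategies would be set up.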


Publication type: Inproceedings
Published in: Proceedings of the International Conference on Artificial Neural Networks, ICANN 2002
Series: Lecture Notes in Computer Science