Multi-Task Learning: Multiple Kernels for Multiple Tasks

Wei Wu, Hang Li, Yunhua Hu, and Rong Jin

Abstract

Many kernel-based methods for multi-task learning have been proposed, which leverage relations among tasks to enhance overall learning accuracy. Most of these methods assume that all the learning tasks share the same kernel [e.g., 13], which can limit their applications, because in practice different tasks may need different kernels. In this paper, we consider utilizing multiple kernels for multiple tasks. The main challenge of introducing multiple kernels into multiple tasks is that functions from different Reproducing Kernel Hilbert Spaces (RKHSs) are not comparable, making it difficult to exploit relations among tasks. This paper addresses the challenge by defining the problem in the Square Integrable Space (SIS). Specifically, it proposes a kernel-based method that makes use of a regularization term defined in the SIS to represent task relations. We prove a new representer theorem for the proposed approach in SIS. We further derive a practical method for solving the learning problem and conduct a consistency analysis of the method. We discuss the relation between our method and an existing method by showing the inequality relation between the regularization terms of the two methods. We also give an SVM-based implementation of our method for multi-label classification. Experiments on an artificial example and three real-world data sets show significant improvements of the proposed method over existing methods.
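To make the central idea concrete, the following is a minimal, hypothetical sketch (not the paper's exact algorithm) of the setting the abstract describes: two tasks learned with *different* kernels, coupled through a regularizer that compares the two functions not in their (incomparable) RKHSs but via their values in a square-integrable space, approximated here by evaluations on a shared grid `Z`. All names, kernel choices, and hyperparameters below are illustrative assumptions; the coupled systems are solved by plain alternating minimization with NumPy.

```python
import numpy as np

# Illustrative sketch: task 1 uses an RBF kernel, task 2 a polynomial kernel.
# The cross-task regularizer mu * ||f1 - f2||^2 is approximated in L2 by
# comparing the functions' values on a shared grid Z (a crude SIS surrogate).

def rbf_kernel(A, B, gamma=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def poly_kernel(A, B, degree=3, c=1.0):
    return (A @ B.T + c) ** degree

rng = np.random.default_rng(0)
X1 = rng.uniform(-2, 2, size=(30, 1)); y1 = np.sin(X1[:, 0])        # task 1
X2 = rng.uniform(-2, 2, size=(30, 1)); y2 = np.sin(X2[:, 0]) + 0.1  # task 2 (related)
Z = np.linspace(-2, 2, 50)[:, None]   # shared grid approximating the L2 inner product

K1, K2 = rbf_kernel(X1, X1), poly_kernel(X2, X2)   # task-specific Gram matrices
G1, G2 = rbf_kernel(Z, X1), poly_kernel(Z, X2)     # evaluations of f_t on Z

lam, mu = 1e-2, 1e-1   # per-task RKHS penalty, cross-task L2 penalty
a1, a2 = np.zeros(len(X1)), np.zeros(len(X2))

def solve_task(K, G, y, other_vals):
    # Closed-form minimizer of ||y - K a||^2 + lam a'K a + mu ||G a - other_vals||^2
    A = K @ K + lam * K + mu * G.T @ G + 1e-8 * np.eye(len(y))
    b = K @ y + mu * G.T @ other_vals
    return np.linalg.solve(A, b)

for _ in range(10):   # alternate: fix one task's function, re-solve the other
    a1 = solve_task(K1, G1, y1, G2 @ a2)
    a2 = solve_task(K2, G2, y2, G1 @ a1)

mse1 = np.mean((K1 @ a1 - y1) ** 2)
mse2 = np.mean((K2 @ a2 - y2) ** 2)
```

The point of the sketch is only that `G1 @ a1` and `G2 @ a2` live in the same space (function values on `Z`), so the two tasks can be coupled even though their RKHSs differ; the paper's actual formulation, representer theorem, and SVM-based multi-label implementation are developed in the report itself.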

Details

Publication type: TechReport
Number: MSR-TR-2010-87