Shike Mei, Bin Cao, and Jian-Tao Sun
Multi-task learning (MTL) aims to improve the performance of each task by leveraging knowledge learned from other related tasks. Identifying the underlying structures among tasks is crucial for MTL to capture how the tasks relate. In this paper, we propose a novel multi-task learning model that simultaneously considers low-rank structure and sparse structure. Combining these two types of structures not only improves the learner's performance, but also makes the learned structures easier to interpret. The standard subgradient optimization method for solving this problem achieves only a convergence rate of O(1/sqrt(k)). We propose a novel optimization method that combines the Moreau approximation with an accelerated proximal method to achieve a convergence rate of O(1/k). We conduct experiments on several real-world data sets, and the results show the gains of our model over state-of-the-art baselines.
Publisher: Microsoft Technical Report